My SSIS projects tend to run queries that need changes as they move between environments; for example, the table schema or a value in the WHERE clause might change. I've always either put my SQL into a Project Parameter, which is hard to edit since formatting is lost, or put it directly into the Execute SQL Task/Data Flow source and then manually edited it between migrations, which is also not ideal.
I was wondering, though: if I add my SQL scripts as files within the project, can they be read back in? For example, if I put a query like this:
select id, name from %schema%.tablename
I'd like to read this into a variable; then it's easy to use an expression, as I do with Project Parameters, to replace %schema% with the appropriate value. The .sql files within the project could then be edited with little effort, or even tested through an Execute SQL Task that's disabled/removed before the project goes into the deployment flow. But I've been unable to find out how to read in a file using a relative path within the project, and I'm not even sure these files get deployed to the SSIS server.
Thanks for any insight.
I've added a text file query.sql to an SSIS (SQL 2017) project in Visual Studio, but I've found no way to pull the contents of query.sql into a variable.
Native tooling approach
For an Execute SQL Task, there's an option to source your query directly from a file.
Set your SQLSourceType to File Connection and then specify a file connection manager in the FileConnection section.
Do be aware that while this is handy, it's also ripe for someone escalating their permissions. If I had access to the file the SSIS package is looking for, I could add a DROP DATABASE, create a new user and give them SA rights, etc. Anything the account that runs the SSIS package can do, a nefarious person could exploit.
Roll your own approach
If you're adamant about reading the file yourself, add two Variables to your SSIS package and supply values like the following
User::QueryPath -> String -> C:\path\to\file.sql
User::QueryActual -> String -> SELECT 1;
Add a Script Task to the package. Specify User::QueryPath as a ReadOnly variable and User::QueryActual as a ReadWrite variable.
Within the Main you'd need code like the following
string filePath = this.Dts.Variables["User::QueryPath"].Value.ToString();
this.Dts.Variables["User::QueryActual"].Value = System.IO.File.ReadAllText(filePath);
The meat of the matter is System.IO.File.ReadAllText. Note that this doesn't handle checking whether the file exists, whether you have permission to access it, etc. It's just a bare-bones read of a file (and also open to the same injection concerns as the above method; just this way you own maintaining it versus the fine engineers at Microsoft).
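If you want something slightly more defensive, here is a minimal C# sketch of the whole Main method, assuming the same two variables described above (the error handling shown is just one way to do it; ScriptResults is the enum from the default Script Task template):

public void Main()
{
    string filePath = this.Dts.Variables["User::QueryPath"].Value.ToString();

    if (!System.IO.File.Exists(filePath))
    {
        // Fail the task with a meaningful message instead of letting ReadAllText throw
        this.Dts.Events.FireError(0, "Read query file",
            string.Format("Query file not found: {0}", filePath), string.Empty, 0);
        this.Dts.TaskResult = (int)ScriptResults.Failure;
        return;
    }

    this.Dts.Variables["User::QueryActual"].Value = System.IO.File.ReadAllText(filePath);
    this.Dts.TaskResult = (int)ScriptResults.Success;
}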
You can build your query by using both a Variable and a Parameter.
For example:
Parameter A: dbo
Build your Variable A (String type) with an expression such as: "SELECT * FROM server.DB." + @[$Project::ParameterA] + ".Table"
So if you need to change the schema, just changing Parameter A will give you the corresponding query in Variable A.
Related
This is one that has me stumped, and I've been doing this a long while.
Migrating to SQL Server 2016, a large number of ETL packages. Easy enough.
One of the ETL packages has a simple Script Task that takes a table of files and runs a file-exists check in a foreach loop.
It uses a project parameter to create the UNC path (\\servername\share) and then binds that to the file name in the Script Task.
It uses an environment config set up in SSISDB.
Executing in SSDT works fine; deploy to the catalog and it can't see the file. I know you'll say permissions, but I've granted the Everyone group access to the share and the drive in case it's that. SSISDB execution means it should be running under my security context, and I'm domain admin, local admin and creator owner of the share.
Even stranger, I have created a simple package to grab the contents of one of the files and import it into a dump table, in case the permissions or pathway were duff (even though they work in SSDT, it might be the environment config in SSISDB). This works fine, therefore it can't be the environment setup in SSISDB being referenced.
Please note this is not running from an Agent job yet, so it won't be due to an Agent service account issue. I need to get it running from SSISDB first, then I'll create an Agent job.
So: a Script Task can't see a UNC share built from two variables, yet it works in SSDT and it's running under the same credentials...
For what it's worth, the Script Task code is
Dts.Variables("BolFileExists").Value = File.Exists(Dts.Variables("StrLoadFileLocation").Value.ToString & Dts.Variables("StrCurrentFile").Value.ToString)
This is a slightly different answer, as it shows a different approach and removes the Script Task. I use a Foreach Loop to check whether the file exists, using the GUI tools provided by SSIS.
Well I found the answer and I deserve to punch myself in the face.
I tried everything. It was a file variable and a path variable being pulled together in the Script Task, so I tried concatenating them before the Script Task and pumped the result into a table to make sure the value was being built correctly.
Literally everything was fine, and it still didn't work.
The issue....
Building it as a 2017 package and deploying it onto a 2016 SQL Server.
I've not found what was missing DLL-wise, but it must have been one of those that meant the Script Task couldn't find the files. Weird that it didn't break and just said the files weren't there!
Thanks all for input, I’m going to go put my head in the door and slam it
I have just set up an SSDT project which I want to use to create local databases on the SQL server hosted locally on my machine.
I want to add some pre- and post- deployment SQL scripts for initialization and cleanups.
Since the server and the database name can change, I have defined two build variables using the project properties, one each for the target server and the target database.
However, I can't seem to access them inside the post deployment scripts.
The syntax below won't build the project -
use [$(TargetDatabaseName)]
This builds, but then fails while publishing -
use ['$(TargetDatabaseName)']
and the error says that ''myTargetDB'' doesn't exist (myTargetDB was passed as the value at the time of publishing).
This might be a trivial thing, but I am just not able to get around it. I am on SQL Server 2016, if that matters.
Make sure that you put both scripts in SQLCMD mode.
Once your target variable is defined in the project's SQLCMD Variables, it can be safely used in the PostDeployment script.
If you have any questions, feel free to contact me.
There is a predefined variable, $(DatabaseName), for the name of the target database. You don't need to create your own; and even if you do, you would need to set the same value to both of them.
Not sure about the target server. In most cases, SQL scripts are generated with the assumption that connection to the correct server is already established. Sure, you can change the current server using something like :connect $(TargetServerName), but I think it will only lead to confusion (and I'm not sure it will work, actually).
The only exception I can think of is that you can't use SQLCMD variables to parameterise the logical/physical names of the database files - these have to be hardcoded.
All other variables, if declared in the project properties, should be accessible everywhere. Below is a fragment of a post-deploy script from one of my projects:
use [master];
go
print 'Switching database ownership to sa...';
GO
alter authorization on database::[$(DatabaseName)] to [sa];
go
use [$(DatabaseName)];
go
print 'Creating database master key...';
go
-- Create database master key
create master key encryption by password = '$(DMK_Key)';
go
print 'Running database setup...';
go
exec dbo.init_database;
go
It is possible, however, that you are trying to reference another database, located on a different server. If that's the case, you need to follow a different approach, namely: build a project for that remote database and add its DACPAC to the list of project references, using the Add database reference... menu. There, you will be able to specify variables for both the (linked) server and the database name.
I'm learning to develop ETL using Pentaho Spoon; I'm still pretty much a noob.
Instead of storing SQL operations inside the transformation file, I'd like to have them in their own .sql files. That makes it easier to track changes in Subversion, and if need be I can just open the .sql file in a DB manager and execute it directly.
How could I do that? I suppose I could use some component to read a text file into a variable, and another component to take that variable and execute it against the DB.
What's the simplest way to achieve that?
In the standard Table input step, you can define the query to be a parameter, ${my_query}, and this parameter has to be defined (without the ${...} decoration) in the transformation properties: right-click anywhere, select Properties in the popup menu, then go to the Parameters tab.
Each time you run the transformation, you'll be presented with the list of parameters, among them my_query, which you can overwrite.
To automate this, follow the example that shipped with the installation zip. In the same directory as your spoon.bat/spoon.sh, there is a folder named sample, in which you will find a job to read_all_files or read all_tables. Basically this job lists the files in a directory, and for each one puts its contents in a variable and uses it as a parameter to run the transformation. Much easier to do than to explain.
I'm looking to create a text file which will contain SQL code to create a database and its tables, and later on to modify the same database.
The text file will be read by an application the user installs; when it runs, it should read the text file and create the database, or modify it if any changes have been applied.
The SQL text file should, of course, be somewhat validated in order not to duplicate tables and such.
I'm not asking for any code, just a specific pathway I should follow in order to make this happen.
Thanks for your input.
I'd do database creation via a SQL script which checks for the existence of tables/views/SPs/etc. before creating them, then I'd execute it in the VB application via ADO.NET. I'd ship it with the application in a subdirectory. It's not a big deal to read text files, or to execute a SQL string via ADO.NET.
I'd have a VERSION table in the database that identifies what DB schema version is installed, and when I shipped upgrade scripts which modified the DB, I would have them update the VERSION table. The first version you ship is 1.0, increment as appropriate thereafter.
All the SQL object creation/detection/versioning logic would be in SQL. That's by far the simplest way to do it on the client, it's the simplest thing to develop and to test before shipping (MS SQL Management Studio is a godsend), it's the simplest thing to diff against the previous version, store in source control, etc.
Incidentally, I would also have my application interact with the database strictly via stored procedures, and I would absolutely never, ever feed SQL any concatenated strings. All parameters going to SQL should be delivered via ADO.NET's SqlParameter mechanism, which is very cool because it makes for clean, readable code, and sanitizes all of your parameters for you. Ever use a DB application that crashed on apostrophes? They didn't sanitize their parameters.
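As an illustration of the SqlParameter point, here is a minimal C# sketch of calling a stored procedure with a parameter (the procedure name, parameter and connection string are made up for the example, not taken from the post):

using System.Data;
using System.Data.SqlClient;

// Hypothetical example: calls a stored procedure dbo.GetCustomer with one parameter.
static DataTable GetCustomer(string connectionString, string customerName)
{
    using (var con = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("dbo.GetCustomer", con))
    {
        cmd.CommandType = CommandType.StoredProcedure;

        // The value travels as a parameter rather than being concatenated into SQL,
        // so apostrophes and other special characters in it are harmless.
        cmd.Parameters.Add("@CustomerName", SqlDbType.NVarChar, 100).Value = customerName;

        var table = new DataTable();
        con.Open();
        using (var reader = cmd.ExecuteReader())
        {
            table.Load(reader);
        }
        return table;
    }
}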
If what you are asking is "How do I read a text file and make its contents execute in SQL":
I would use a StreamReader to read the text file into a string variable.
Once you have read it in, go ahead and open a connection to the database and do an ExecuteNonQuery with the value of the string variable.
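A minimal C# sketch of that flow (the file path and connection string are placeholders):

using System.Data.SqlClient;
using System.IO;

// Read the whole script file into a string, then run it against the database.
static void RunScript(string connectionString, string scriptPath)
{
    string sql;
    using (var reader = new StreamReader(scriptPath))
    {
        sql = reader.ReadToEnd();
    }

    using (var con = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, con))
    {
        con.Open();
        cmd.ExecuteNonQuery();
    }
}

One caveat: GO is a batch separator understood by SSMS/sqlcmd, not by the server, so a script that contains GO would have to be split into batches before being run this way.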
I would post in the comments but I can't. I think this may be what you are looking for.
Is it possible to execute a text file from SQL query?
Use MS SQL Server Management Studio to perfect your scripts:
http://technet.microsoft.com/en-us/library/ms174173.aspx
SSMS comes with the server installs and is available for the SQL Express versions. (It isn't needed on the client PCs, but it may be useful for debugging.)
This will most likely be a low-security environment and each user will have full control of the DB.
From there it is pretty straightforward to read the text file and run it against the DB. Just get a connection and send the script:
' Assumes connectionString and SQL (the script text read from the file) are already defined
Dim con As New Data.SqlClient.SqlConnection(connectionString)
Dim cmd As New Data.SqlClient.SqlCommand
con.Open()
cmd.CommandText = SQL
cmd.Connection = con
cmd.ExecuteNonQuery()
You might want to use a virtual machine on your development PC as it will allow you to quickly do testing of your scripts and code, and return to baseline state.
I'm really bad at SQL and couldn't find anything near what I really need. I'm trying to create a Stored Procedure that should run each night to check if records in my database have an equivalent file on a server with all our data.
Example: a record for an mp4 has [Spotnumber] -> 0000001. My procedure should then check (not locally) whether a file with this number exists on the other server.
Also, the place where it should look could be, for example (not locally), C:/Spots. In this directory there'll be subdirectories like 2013, 2012, 2011. It should check each of these directories if the file isn't found.
For this I was thinking of making something like this: Single check. But that one searches locally and already has the URL in a table field. That won't be possible for mine.
So my question is: is it even possible to do this with just a SQL procedure? If yes, how should I make it check all the files on another server (what path should I use?), and how can I make it check each record against each subdirectory?
I would suggest another approach.
Instead of using SQL Server to check whether the file exists and then update the DB, why don't you use a PowerShell script that checks whether the file exists and then updates the database? With a little searching on Google you can find functions on Microsoft blogs explaining how to check whether a file exists and how to update the database.
Another solution: you could create an assembly in your database with a .NET language (SQLCLR) and work with that; see the sketch at the end of this answer.
Last possibility: I think it could also be possible with SSRS.
If you really need to do that with T-SQL, you would have to enable xp_cmdshell on your server and then use it... but that means everybody could use xp_cmdshell. It's not designed for this and it's not recommended :)
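For the CLR assembly option mentioned above, a minimal C# sketch might look like this (the function name is made up for the example, and the assembly would need EXTERNAL_ACCESS permission so it can reach the other server's file share):

using System.Data.SqlTypes;
using System.IO;
using Microsoft.SqlServer.Server;

public partial class UserDefinedFunctions
{
    // Hypothetical SQLCLR scalar function: returns true if the given path exists.
    // Once the assembly is deployed, it can be called from T-SQL like any scalar function.
    [SqlFunction]
    public static SqlBoolean FileExists(SqlString path)
    {
        if (path.IsNull)
        {
            return SqlBoolean.False;
        }

        return File.Exists(path.Value);
    }
}

The nightly procedure could then call this function for each record and each year subdirectory, building the candidate path from [Spotnumber] and the folder name.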