I'm writing a SQL script which is executed as part of a series of SQL scripts. I cannot access the other scripts nor do I have control over the "series-execution"-logic.
I want to change the database within my script (USE someDB); however, I want to make sure that after my script has run, the previous DB is the current DB again. Is there some kind of pushd/popd for database usage, or an alternative, e.g. somehow writing the current DB into a temporary variable?
I don't know about any pushd/popd functionality, but the way to get the current database name is to use the DB_NAME() function, like this:
SELECT DB_NAME()
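A minimal sketch of the capture idea (someDB is the placeholder from the question). One caveat: a USE issued inside EXEC() reverts as soon as that inner batch ends, so in pure T-SQL the switch back has to be a literal USE statement (or handled in SQLCMD mode); the captured name is still useful for logging and verification:
declare @previousDb sysname = db_name()  -- remember where we started
use someDB                               -- switch for the body of the script
-- ... script work here ...
print 'Script started in: ' + @previousDb
-- exec (N'use ' + quotename(@previousDb))  -- would NOT persist past the EXEC scope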
Using SQL Server Management Studio 18.4 on SQL Server 2019 servers.
Is there an easier way to allow an end user with NO access to anything SQL-related to fire off some SQL commands that:
1.) create and update a SQL table
2.) then create a file from that table (a CSV, in my case) in a folder share they have access to?
Currently I do this using xp_cmdshell with bcp commands in a cloud-hosted environment, hence I am not in control of ANY permissions or access, etc. For example:
declare @bcpCommandIH varchar(200)
-- build the bcp command: query out to a CSV with a trusted connection (-T),
-- character data (-c), and a comma field terminator (-t,)
set @bcpCommandIH = 'bcp "SELECT * from mydb.dbo.mysqltable order by 1 desc" queryout E:\DATA\SHARE\test\testfile.csv -S MYSERVERNAME -T -c -t, '
exec master..xp_cmdshell @bcpCommandIH
So how I achieve this now is allowing the end users to run a Crystal report which fires a SQL STORED PROCEDURE that runs some code to create and update a SQL table, and then creates a csv file that the end user can access. Creating and updating the table is easy. Getting the table into the hands of the end user is nothing but trouble in this hosted environment.
We always end up with permission or other folder-share issues and it's a complete waste of time. The cloud service Admins tell me "this is a huge security issue and you need to start and stop xp_cmdshell with some commands every time you want to generate this file, to be safe".
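As far as I can tell, the enable/disable sequence they mean is the standard sp_configure toggle, wrapped around the bcp call above (a sketch; whether you even have ALTER SETTINGS permission in a hosted environment is another matter):
exec sp_configure 'show advanced options', 1; reconfigure;
exec sp_configure 'xp_cmdshell', 1; reconfigure;  -- enable just before the export
exec master..xp_cmdshell @bcpCommandIH            -- the bcp export from above
exec sp_configure 'xp_cmdshell', 0; reconfigure;  -- disable again right after
exec sp_configure 'show advanced options', 0; reconfigure;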
Well, this is nonsense to me. I don't want to have to touch any of this, and it needs to be AUTOMATED for the end user from start to finish.
Is there some easier way to AUTOMATE a process for an END USER to create and update a SQL table and simply get the contents of that table exported to a CSV file without all the administration trouble?
Are there other, simpler options than xp_cmdshell and bcp to achieve this?
Thanks,
MP
Since the environment allows you to run a Crystal Report, you can use the report to create a table via ODBC Export. There are 3rd-party tools that allow that to happen even if the table already exists (giving you an option to replace or append records to an existing target table).
But it's not clear why you can't get the data directly into the Crystal report and simply export to csv.
There are free/inexpensive tools that allow you to automate/schedule the exporting/emailing/printing of a Crystal Report. See list here.
My SSIS projects tend to run queries that require changes as they move between environments - the table schema might change, or a value in the WHERE clause. I've always either put my SQL into a Project Parameter, which is hard to edit since formatting is lost, or put it directly into the Execute SQL Task/Data Flow Source and then manually edited it between migrations, which is also not ideal.
I was wondering, though: if I added my SQL scripts as files within the project, can these be read back in? For example, if I put in a query like this:
select id, name from %schema%.tablename
I'd like to read this into a variable; then it's easy to use an expression, as I do with Project Parameters, to replace %schema% with the appropriate value. The .sql files within the project could then be edited with little effort, or even tested through an Execute SQL Task that's disabled/removed before the project goes into the deployment flow. But I've been unable to find out how to read in a file using a relative path within the project, and I'm not even sure these files get deployed to the SSIS server.
Thanks for any insight.
I've added a text file query.sql to an SSIS (SQL 2017) project in Visual Studio, but I've found no way to pull the contents of query.sql into a variable.
Native tooling approach
For an Execute SQL Task, there's an option to source your query directly from a file.
Set your SQLSourceType to File Connection and then specify a file connection manager in the FileConnection section.
Do be aware that while this is handy, it's also ripe for someone escalating their permissions. If I had access to the file the SSIS package is looking for, I could add a DROP DATABASE, create a new user and give them SA rights, etc. - anything the account that runs the SSIS package can do, a nefarious person could exploit.
Roll your own approach
If you're adamant about reading the file yourself, add two Variables to your SSIS package and supply values like the following
User::QueryPath -> String -> C:\path\to\file.sql
User::QueryActual -> String -> SELECT 1;
Add a Script Task to the package. Specify as a ReadOnly variable User::QueryPath and specify as a ReadWrite variable User::QueryActual
Within Main, you'd need code like the following:
// read the file path from the read-only variable
string filePath = this.Dts.Variables["User::QueryPath"].Value.ToString();
// load the file's contents into the read-write variable for downstream tasks to use
this.Dts.Variables["User::QueryActual"].Value = System.IO.File.ReadAllText(filePath);
The meat of the matter is System.IO.File.ReadAllText. Note that this doesn't check whether the file exists, whether you have permission to access it, etc. It's just a bare-bones read of a file (and it's open to the same injection challenges as the above method - just this way, you own maintaining it versus the fine engineers at Microsoft).
You can build your query by using both a Variable and a Parameter.
For example:
Parameter A: dbo
Build your Variable A (string type) with the expression: "SELECT * FROM Server.DB." + @[$Project::ParameterA] + ".Table"
So if you need to change the schema, just changing Parameter A will give you the corresponding query in Variable A.
I have a collection of .sql files containing ddl for various database objects.
For example:
User.sql
Group.sql
GroupUser.sql
Is there a way I can use a stored procedure to easily/elegantly load/execute these files in sequence? For example, GroupUser.sql depends on the existence of User and Group, so I need to execute the .sql files in the order listed above.
I know I could concatenate the contents of all of the .sql scripts above into a stored procedure, but I would prefer to take a more modular approach. I could also put each script into its own stored procedure but I'd rather not clutter the stored procedure collection in my app database with DDL setup scripts.
From SSMS, go to the "Query" menu, and select "SQLCMD Mode". Once there, you can run commands like this. I script out stuff like this all the time.
use test
go
:r "D:\SomeDataDirectory\SomeSQLFile.sql"
EDIT: I didn't see that you wanted to do this within a stored procedure. That's a bit of a dicey proposition. Assuming you have permission to execute it, you could put the same SQLCMD code in calls to xp_cmdshell, but in many circumstances that won't be an option for you unless you've got admin-like permissions on the server.
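A minimal sketch of that route, assuming xp_cmdshell is enabled and the SQL Server service account can read the directory (server name and paths are illustrative):
exec master..xp_cmdshell 'sqlcmd -S MYSERVER -E -d test -i "D:\SomeDataDirectory\User.sql"'
exec master..xp_cmdshell 'sqlcmd -S MYSERVER -E -d test -i "D:\SomeDataDirectory\Group.sql"'
exec master..xp_cmdshell 'sqlcmd -S MYSERVER -E -d test -i "D:\SomeDataDirectory\GroupUser.sql"'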
Is it possible to programmatically find the database context from a T-SQL script, i.e. the context that changes when you issue a USE <database>? I ask because I am not using a USE, and would like to find the name of the database the script is running against.
select db_name()
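And if you want it in a variable for later use in the script, a trivial variant:
declare @currentDb sysname = db_name()  -- capture the current database context
print @currentDb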
I have approximately 100 SQL views that are a variation of this:
select * from RTC.dbo.MyTable
...now I find I need to change the name of the RTC database to something else. Rather than edit one view at a time, is there a way to script out all their drop/create statements to a text file so that I can do a global replacement?
In SSMS, right-click the database, go to Tasks and select 'Generate Scripts...'. Select 'Views', select the views you want exported, and export.
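If you'd rather locate the affected definitions in T-SQL first, the catalog views can list every view whose definition references RTC. (a sketch; run it in the database that owns the views):
select v.name, m.definition
from sys.sql_modules as m
join sys.views as v on v.object_id = m.object_id
where m.definition like '%RTC.%'  -- views referencing the RTC database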
I'd use PowerShell. If you're not using SQL 2008 Client Tools, install them. Then get the PowerShell client, add the registered snapins (plenty of information out there on how to do that), and then use the directory structure to get to the folder representing your Views.
Then script them using something like:
Get-ChildItem | % {$_.Script()}
Use ScriptOptions to tell it to use an Alter script.
And replace "RTC." with the new database name... and run them using sqlcmd.
PowerShell actually becomes a really nice deployment option too.