I have a large database creation query the beginning of which is as follows:
USE Master
GO
IF EXISTS(SELECT * FROM sys.sysdatabases where name = 'MyDatabase')
DROP DATABASE MyDatabase
GO
CREATE DATABASE MyDatabase
GO
USE MyDatabase
GO
I want to declare a variable at the beginning like this:
DECLARE @MainDB VARCHAR(30) = NULL
USE Master
GO
IF EXISTS(SELECT * FROM sys.sysdatabases where name = @MainDB)
DROP DATABASE @MainDB
GO
CREATE DATABASE @MainDB
GO
USE @MainDB
GO
I would execute this script from the command line, with the new database name assigned using the sqlcmd tool. However, SQL Server is telling me that the variable @MainDB is not declared. Is this something I can do? If not, how would you recommend I work around this problem?
This question is kind of a rehash.
How to use a variable for the database name in T-SQL?
The short answer is that you can't have a variable database name. You have to put your T-SQL into a string (VARCHAR(MAX) or NVARCHAR(MAX)), do a find and replace on the database name, and then execute it.
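For example, a minimal sketch of that approach (the database name and placeholder token are illustrative):
DECLARE @dbName sysname = 'MyDatabase';            -- illustrative name
DECLARE @sql NVARCHAR(MAX) = N'
IF EXISTS (SELECT * FROM sys.databases WHERE name = ''{{db}}'')
    DROP DATABASE [{{db}}];
CREATE DATABASE [{{db}}];';
SET @sql = REPLACE(@sql, '{{db}}', @dbName);       -- the find-and-replace step
EXEC (@sql);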
Sorry, but you cannot do it like this, because T-SQL variables only have batch scope, and those GO commands mark the boundaries of your batches. Thus, every time you pass a GO, all of your variables get wiped out (they are no longer even declared).
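A quick illustration of that batch scoping (the variable name is illustrative):
DECLARE @x INT = 1;
PRINT @x;   -- works: same batch
GO
PRINT @x;   -- fails with an error along the lines of: Must declare the scalar variable "@x"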
There are several ways around this:
Get rid of the GOs: this is what we do when we need a stored procedure (which cannot contain any GOs) instead of a script, but it is a fairly complicated series of tricks that have to be wrapped around each other in order to pull it off. You have to be pretty adept at T-SQL to use it.
Use a #temp table to hold your values instead. This works because #temp tables have session scope rather than just batch scope (see the sketch at the end of this answer).
I believe that you can also use and manipulate NT/DOS environment variables from SQLCMD as well, though I am not familiar with exactly how to do it.
If you want to pursue any of these, let me know which one and I can explain in more detail with examples.
Oops, I missed the "variable Database name" part of this. That can be done as well, but you have to add dynamic SQL to the mix.
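For the #temp table option above, combined with the dynamic SQL just mentioned, a minimal sketch (names are illustrative):
-- batch 1: stash the name in a #temp table (session scope, so it survives GO)
CREATE TABLE #params (MainDB sysname);
INSERT INTO #params (MainDB) VALUES ('MyDatabase');
GO
-- batch 2: read it back and feed it to dynamic SQL
DECLARE @MainDB sysname;
SELECT @MainDB = MainDB FROM #params;
IF EXISTS (SELECT * FROM sys.databases WHERE name = @MainDB)
    EXECUTE ('DROP DATABASE [' + @MainDB + ']');
GO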
You have multiple issues...
GO ends the current batch, so your variable is not visible anymore; you need to move the declaration. Also, you cannot DROP DATABASE @MainDB directly, you have to use dynamic SQL. For example:
GO
DECLARE @MainDB VARCHAR(30) = 'MyDB';
IF EXISTS(SELECT * FROM sys.sysdatabases where name = @MainDB)
BEGIN
    EXECUTE ('DROP DATABASE ' + @MainDB);
END;
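One note on that pattern: if the database name could ever contain anything unusual, wrapping it with QUOTENAME before executing is a cheap safeguard, for example:
DECLARE @sql NVARCHAR(200) = 'DROP DATABASE ' + QUOTENAME(@MainDB);
EXECUTE (@sql);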
You say you need to run this from the command line, so if you have a lot of code which depends on the structure you have already built you can do the following:
Create a temporary copy of the script
Search and replace @MainDB with the DB name you pass in as the parameter in the temporary script
Run the temporary script using the sqlcmd tool
Obviously, remove the DECLARE @MainDB varchar(30) = NULL from your script if you like this option.
If you choose this approach, you can implement your 3 steps using a variety of different technologies (powershell, python, batch file, VBScript ...).
VBScript file approach:
Const ForReading = 1, ForWriting = 2

Set obj = CreateObject("Scriptlet.TypeLib")
tempsqlfile = Left(obj.GUID, 38) & ".sql"                              'get a new GUID-based name for the temporary sql file
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFile = objFSO.OpenTextFile(WScript.Arguments(0), ForReading)    'open the template file
strSQLText = objFile.ReadAll
objFile.Close
strNewSQLText = Replace(strSQLText, "@MainDB", WScript.Arguments(1))   'replace the db name placeholder
Set objFile = objFSO.OpenTextFile(tempsqlfile, ForWriting, True)       'write the new temporary file
objFile.WriteLine strNewSQLText
objFile.Close
Set Shell = WScript.CreateObject("WScript.Shell")
commandLine = "osql -E -i " & tempsqlfile & " -o " & tempsqlfile & ".rpt"
Set oExec = Shell.Exec(commandLine)
Apologies for the variable names - I cut and pasted bits and pieces from various places but you should get the gist.
(Also - apologies for choosing VBScript out of all those options and be aware that there is no error checking for missing parameters)
As it stands above, if you save that script as 'runmystuff.vbs' then you can do:
runmystuff.vbs sqlfile.sql MagicNewDB
This will replace @MainDB with MagicNewDB everywhere inside the script and then run it using osql.
Found a way to make command-line variables map to T-SQL variables:
USE master
GO
DECLARE @Mydb VARCHAR(30) = '$(mydb)'
IF EXISTS(SELECT * FROM sys.sysdatabases where name = @Mydb)
    print @Mydb
Execute('create database ' + @Mydb)
The batch file I run from looks like this.
sqlcmd -S %1 -i CreateDatabases.sql -v mydb="%2"
I can now run from sqlcmd and enter my server for %1 and desired DB name for %2.
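For completeness, here is roughly how the original drop-and-create script might look with this approach (the dynamic SQL is needed because DROP/CREATE DATABASE cannot take a variable directly, while the USE line relies on the purely textual $(mydb) substitution):
USE master
GO
DECLARE @Mydb VARCHAR(30) = '$(mydb)';
IF EXISTS (SELECT * FROM sys.sysdatabases WHERE name = @Mydb)
    EXECUTE ('DROP DATABASE [' + @Mydb + ']');
EXECUTE ('CREATE DATABASE [' + @Mydb + ']');
GO
USE [$(mydb)]
GO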
Thanks everyone for the replies; they all helped me find the right solution.
Related
How can I execute the following SQL inside a single command (single execution) through ADO.NET?
ALTER TABLE [MyTable]
ADD NewCol INT
GO
UPDATE [MyTable]
SET [NewCol] = 1
The batch separator GO is not supported, and without it the second statement fails.
Are there any solutions to this other than using multiple command executions?
The GO keyword is not T-SQL, but a SQL Server Management Studio artifact that allows you to separate the execution of a script file into multiple batches. I.e., when you run a T-SQL script file in SSMS, the statements are run in batches separated by the GO keyword. More details can be found here: https://msdn.microsoft.com/en-us/library/ms188037.aspx
If you read that, you'll see that sqlcmd and osql also support GO.
SQL Server itself doesn't understand the GO keyword, so if you need an equivalent, you need to separate and run the batches individually on your own.
Remove the GO:
string sql = "ALTER TABLE [MyTable] ADD NewCol INT;";
var cmd = new SqlCommand(sql, conn);
cmd.ExecuteNonQuery();

sql = "UPDATE [MyTable] SET [NewCol] = 1";
cmd = new SqlCommand(sql, conn);
cmd.ExecuteNonQuery();
It seems that you can use the Server class for that. Here is an article:
C#: Executing batch T-SQL Scripts containing GO statements
In SSMS (SQL Server Management Studio), you can run GO after any query, but there's a catch. You can't have the semicolon and the GO on the same line. Go figure.
This works:
SELECT 'This Works';
GO
This works too:
SELECT 'This Too'
;
GO
But this doesn't:
SELECT 'This Doesn''t Work'
;GO
This can also happen when your batch separator has been changed in your settings. In SSMS click on Tools --> Options and go to Query Execution/SQL Server/General to check that batch separator.
I've just had this fail with a script that didn't have CR LF line endings. Closing and reopening the script seems to prompt a fix. Just another thing to check for!
Came across this while trying to determine why my query was not working in SSRS. You don't use GO in SSRS; instead, use semicolons between your different statements.
I placed a semicolon ; after the GO, which was the cause of my error.
You will also get this error if you have used IF statements and closed them incorrectly.
Remember that you must use BEGIN/END if your IF statement is longer than one line.
This works:
IF @@ROWCOUNT = 0
PRINT 'Row count is zero.'
But if you have two lines, it should look like this:
IF @@ROWCOUNT = 0
BEGIN
PRINT 'Row count is zero.'
PRINT 'You should probably do something about that.'
END
I got this error message when I placed the 'GO' keyword after a sql query in the same line, like this:
insert into fruits (Name) values ('Apple'); GO
Writing this on two separate lines runs fine. Maybe this will help someone...
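For clarity, the working two-line form is simply:
insert into fruits (Name) values ('Apple');
GO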
I first tried to remove GO statements by pattern matching on the (?:\s|\r?\n)+GO(?:\s|\r?\n)+ regex, but found more issues with our SQL scripts that were not compatible with SqlCommand execution.
However, thanks to @tim-schmelter's answer, I ended up using the Microsoft.SqlServer.SqlManagementObjects package.
// illustrative path; any script text containing GO separators will do
string sqlText = System.IO.File.ReadAllText("script.sql");
string connectionString = @"Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=FOO;Integrated Security=True;";
var sqlConnection = new System.Data.SqlClient.SqlConnection(connectionString);
// SMO's ServerConnection/ConnectionContext understands GO and splits the script into batches
var serverConnection = new Microsoft.SqlServer.Management.Common.ServerConnection(sqlConnection);
var server = new Microsoft.SqlServer.Management.Smo.Server(serverConnection);
int result = server.ConnectionContext.ExecuteNonQuery(sqlText);
I am working with a Visual Studio 2017 Database Project (dacpac), and I have some SQLCMD variables in the publish profile (an XML file), like below:
<SqlCmdVariable Include="ClientDBName">
<Value>Client_1</Value>
</SqlCmdVariable>
My problem is that we have multiple clients and we deploy the database changes via dacpac for multiple clients at once. So if I assign a static value to my SQLCMD variable "ClientDBName" as in the example above, it will take the same value (the same db name, "Client_1") for all the clients.
To fix that I am using a pre-deployment script, in which I am trying to assign a dynamic value (the db name) to the SQLCMD variable "ClientDBName", like below:
DECLARE @dbName varchar(50)
SET @dbName = 'xyz'
:setvar ClientDBName @dbName
But this is not working; I explored it and found it would not work. Another way I am trying is to assign the db name value by calling a script, like below:
:setvar ClientDBName "C:\GetDatabaseName.sql"
But this is also not working.
So can anyone help me out on this: how can we assign dynamic values to a SQLCMD variable?
The sqlpackage example command below specifies SQLCMD values with the /Variables: argument. These values are used instead of those in the publish profile.
SqlPackage.exe /Action:Publish /SourceFile:"YourDatabase.dacpac" /TargetDatabaseName:YourDatabaseName /TargetServerName:"." /Variables:"ClientDBName=YourValue"
If your actual need is the published database name, you could use the built-in DatabaseName SQLCMD variable instead of a user-defined SQLCMD variable, which will be the /TargetDatabaseName value.
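For example, a pre- or post-deployment script can reference that predefined variable directly; a minimal sketch (the LIKE pattern is illustrative):
PRINT 'Deploying to database: $(DatabaseName)';
IF '$(DatabaseName)' LIKE 'Client%'
    PRINT 'Client database detected.';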
I'm new to Powershell. I have a bunch of stored procedures that are exactly the same except they reside in different schemas and target different names. Very simply, they look like this (real procs are much longer)
CREATE PROC [a].[ProcessTable] AS UPDATE a.Table_a
CREATE PROC [b].[ProcessTable] AS UPDATE b.Table_b
CREATE PROC [c].[ProcessTable] as UPDATE c.Table_c
I wanted to create just one file (call it 'FileWithStoredProc.sql') with the base stored procedure and then use an outer powershell script to execute it multiple times:
loop through an array of {a, b, c, ... n}
use Invoke-SQLcmd -inputfile "FileWithStoredProc.sql" -Variable $args
Then in my target file, have something like
CREATE PROC [$(arg1)].[ProcessTable] AS UPDATE $(arg1).Table_$(arg1)
I can now see that it's fine as long as I want to use $(arg1) as a string, say
print 'Creating stored proc ' + $(arg1) + '.ProcessTable'
But I want the target file to interpret my variable as part of a SQL object name.
How do I do this? I cannot redesign/rename my SQL objects, and since the procedure is quite long, with a lot of these schema swaps, I don't want to use the -Query parameter in my PowerShell wrapper. Is what I want possible without changing everything into a giant 'exec sql' string? I'd like to keep these target SQL files relatively easy to read and modify.
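For reference, SQLCMD/Invoke-Sqlcmd variable substitution is purely textual and happens before the batch is parsed, so $(arg1) can appear inside identifiers too; a hedged sketch of what the template might look like (the column assignment is just a placeholder):
-- FileWithStoredProc.sql
CREATE PROC [$(arg1)].[ProcessTable]
AS
    UPDATE [$(arg1)].[Table_$(arg1)]
    SET SomeColumn = 1;   -- placeholder body
GO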
(Submitting on behalf of a Snowflake client)
.........................
I would like to flexibly name tables I create.
For example
Set name = April
and then
Create table customer_data_$name as
I've found two recommended options thus far:
1 - Using Snowsql:
snowsql -c myconn -w trainingwh --variable NAME=April -f test.sql -o variable_substitution=True
script test.sql:
create table mytab_&NAME as
select current_timestamp ts;
2 - Using JavaScript Stored procedures:
create or replace procedure Proc_CT(NAME varchar)
RETURNS varchar(22)
LANGUAGE JAVASCRIPT
Execute as OWNER
as
$$
var ct_qry = `create or replace table mytab_`+NAME+`(i int);`
var ct_stmt = snowflake.createStatement({ sqlText: ct_qry });
ct_stmt.execute();
return 'Done.';
$$
;
CALL Proc_CT('April');
Two Questions:
A. Out of these two recommendations, is there any reason to leverage one more than the other?
B. Are there any other recommended options that can be leveraged in this situation?
.........................
Any advice or additional recommendations would be GREATLY APPRECIATED. Thank you!
Of the 2 options, I'd go with the stored procedure over Snowsql, because it's a more portable solution. Snowsql needs to be executed from a host machine, while stored procedures can be executed from anywhere, since they run inside Snowflake. This way, if you want to do this within an ELT/ETL process using a third-party tool, python, java, etc., you could simply call the SP to create your table.
As a note, I'd probably create a SP that renames a table for me, rather than doing the full CTAS statement. Your process could then create a table without an SP being involved, and then you could pass the table name + $name values into the SP and have it rename it for you. Either way works, but that's how I'd do it.
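As a rough illustration of that rename idea (the table name and suffix are illustrative):
-- the process creates a fixed-name table without any SP involved...
create or replace table customer_data as select current_timestamp ts;
-- ...and then the SP (or any caller) renames it with the desired suffix
alter table customer_data rename to customer_data_April;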
I have a data flow working just fine; it is composed of a source that is evaluated by a lookup component and then does an upsert. The diagram is shown here:
Now, on BanqueCIBI (the OLE DB source), I have a SQL command text where I would like to receive a parameter from another component to use as the ValueDate. This is the query right now:
SELECT [IdTransactionType]
,[IdBank]
,[IdBanqueDetailHistoryRef]
,[IdBanqueDetail]
,[IdBanqueHeader]
,[CCI]
,[ValueDate]
,[Text]
,[Reference]
,[Amount]
,[Sign]
,[IdCurrency]
,[OrigBranch]
,[dtCreatedOrModified]
,[oldText]
,[oldReference]
,[IdAccount]
,[IdSubAccount]
,[Date]
,[IdRD]
,[Flag]
,[History]
,[DtDate]
,[iTIB]
,[iSAP]
FROM [dbCibi3].[dbo].[BanqueDetailHistoryRef]
WHERE [ValueDate] = '2015-31-01'
So, the diagram would look something like this:
Right now, that new OLE DB Command looks like this:
And this is the usp_GetDateParamsSSIS invoked in the source above:
USE [dbMODIFE]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[usp_GetDateParamSSIS]
@name VARCHAR(50) = NULL,
@value DATETIME OUTPUT
AS
BEGIN
SELECT TOP 1 @value = valueDate FROM helperPARAMS_SSIS WHERE name = @name;
END
So, how could I use that @value OUTPUT on the BanqueCIBI component? Thank you so much! (Please notice that BanqueCIBI and the new component are querying different servers, and a linked server is not an option because of company policies.)
Ok, since you are passing a hard-coded Name parameter to your stored procedure, I am assuming that this is a stored procedure you only need to call once for each execution of the package, and not something you're calling once for every row in your data source.
In that case, do NOT call the stored proc with an OLE DB Command in the data flow.
Instead, call it with an Execute SQL Task that you put BEFORE the Data Flow Task in the Control Flow. Direct the return value of the proc to a package-level variable.
Then in the Source of your dataflow (BanqueCIBI), map that variable to the first parameter of your SELECT query.
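With that in place, the hard-coded date in the source query simply becomes a parameter marker, which you map to the package variable in the OLE DB Source's Parameters dialog:
SELECT [IdTransactionType]
      ,[IdBank]
      -- (remaining columns as in the original query)
      ,[iSAP]
FROM [dbCibi3].[dbo].[BanqueDetailHistoryRef]
WHERE [ValueDate] = ?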
There are examples of all of these techniques easily available on the internet. But if you find one you are having trouble following, feel free to edit your question with the details, or create a new question if it is sufficiently different in scope from this one.