Using UDFs in Excel SQL Server DB query [duplicate] - sql-server

How can I execute the following SQL inside a single command (single execution) through ADO.NET?
ALTER TABLE [MyTable]
ADD NewCol INT
GO
UPDATE [MyTable]
SET [NewCol] = 1
The batch separator GO is not supported, and without it the second statement fails.
Are there any solutions to this other than using multiple command executions?

The GO keyword is not T-SQL, but a SQL Server Management Studio artifact that allows you to separate the execution of a script file into multiple batches. I.e. when you run a T-SQL script file in SSMS, the statements are run in batches separated by the GO keyword. More details can be found here: https://msdn.microsoft.com/en-us/library/ms188037.aspx
If you read that, you'll see that sqlcmd and osql also support GO.
SQL Server itself doesn't understand the GO keyword, so if you need an equivalent, you need to split the script and run the batches individually on your own.
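For example, the script from the question can be split on lines that contain only GO, with each batch executed separately. A rough sketch, assuming an open SqlConnection named conn; the regex-based split is a simplification and will also trip over a GO that happens to appear inside a string literal or comment:
using System.Data.SqlClient;
using System.Text.RegularExpressions;

string sql = "ALTER TABLE [MyTable] ADD NewCol INT\r\nGO\r\nUPDATE [MyTable] SET [NewCol] = 1";

// Split on lines that contain nothing but the GO separator (case-insensitive).
string[] batches = Regex.Split(sql, @"^\s*GO\s*$", RegexOptions.Multiline | RegexOptions.IgnoreCase);

foreach (string batch in batches)
{
    // Skip the empty fragments left between consecutive GO lines.
    if (string.IsNullOrWhiteSpace(batch))
        continue;

    using (var cmd = new SqlCommand(batch, conn))
    {
        cmd.ExecuteNonQuery();   // each batch runs as its own command
    }
}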

Remove the GO:
String sql = "ALTER TABLE [MyTable] ADD NewCol INT;";
cmd = new SqlCommand(sql, conn);
cmd.ExecuteNonQuery();
sql = "UPDATE [MyTable] SET [NewCol] = 1";
cmd = new SqlCommand(sql, conn);
cmd.ExecuteNonQuery();
It seems that you can use the Server class for that. Here is an article:
C#: Executing batch T-SQL Scripts containing GO statements

In SSMS (SQL Server Management Studio), you can run GO after any query, but there's a catch. You can't have the semicolon and the GO on the same line. Go figure.
This works:
SELECT 'This Works';
GO
This works too:
SELECT 'This Too'
;
GO
But this doesn't:
SELECT 'This Doesn''t Work'
;GO

This can also happen when your batch separator has been changed in your settings. In SSMS, click Tools --> Options and go to Query Execution/SQL Server/General to check the batch separator.
I've just had this fail with a script that didn't have CR LF line endings. Closing and reopening the script seemed to fix it. Just another thing to check for!

Came across this trying to determine why my query was not working in SSRS. You don't use GO in SSRS; instead, use semicolons between your different statements.

I placed a semicolon ; after the GO, which was the cause of my error.

You will also get this error if you have used IF statements and closed them incorrectly.
Remember that you must use BEGIN/END if your IF statement is longer than one line.
This works:
IF @@ROWCOUNT = 0
PRINT 'Row count is zero.'
But if you have two lines, it should look like this:
IF @@ROWCOUNT = 0
BEGIN
PRINT 'Row count is zero.'
PRINT 'You should probably do something about that.'
END

I got this error message when I placed the GO keyword on the same line as a SQL query, like this:
insert into fruits (Name) values ('Apple'); GO
Writing this on two separate lines runs fine. Maybe this will help someone...

I first tried to remove the GO statements by pattern matching with the regex (?:\s|\r?\n)+GO(?:\s|\r?\n)+, but found more issues with our SQL scripts that were not compatible with SqlCommand execution.
However, thanks to @tim-schmelter's answer, I ended up using the Microsoft.SqlServer.SqlManagementObjects package.
string sqlText;   // the T-SQL script to execute; GO separators are handled by SMO
string connectionString = @"Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=FOO;Integrated Security=True;";
var sqlConnection = new System.Data.SqlClient.SqlConnection(connectionString);
var serverConnection = new Microsoft.SqlServer.Management.Common.ServerConnection(sqlConnection);
var server = new Microsoft.SqlServer.Management.Smo.Server(serverConnection);
int result = server.ConnectionContext.ExecuteNonQuery(sqlText);

Related

Fast way to transfer table to a remote server as import/export does

I have a stored procedure with statements similar to this:
DELETE FROM [LinkedServer].[DB1].[dbo].[Table1]
DELETE FROM [LinkedServer].[DB1].[dbo].[Table2]
DELETE FROM [LinkedServer].[DB1].[dbo].[Table3]
INSERT INTO [LinkedServer].[DB1].[dbo].[Table3]
SELECT * FROM [DB1].[dbo].[Table3]
INSERT INTO [LinkedServer].[DB1].[dbo].[Table2]
SELECT * FROM [DB1].[dbo].[Table2]
INSERT INTO [LinkedServer].[DB1].[dbo].[Table1]
SELECT * FROM [DB1].[dbo].[Table1]
This is extremely slow, but my goal is as simple as that.
I cannot use replication; I just need a method to empty the tables and fill them again.
If I do the same action using the import/export functionality from SSMS, it empties the remote tables and fills them up very quickly.
Is there a way to simulate what import/export is doing, using T-SQL commands?
It would not be a problem to disable restrictions while copying data.
Perhaps there is some kind of BULK INSERT I could use in this scenario? After searching for information about this option, it seems useful for transferring data from SQL to a file or from a file to SQL, but I can't find examples of transferring from table to table.
Is there a way to simulate what import/export is doing, using Transact SQL commands?
Not quite. But try running the INSERT from the other end.
INSERT INTO [DB1].[dbo].[Table3] SELECT * FROM [LinkedServer].[DB1].[dbo].[Table3]
Or install an instance of SQL Server 2008 and backup/restore to upgrade the database; then backup/restore that to the target version.
I finally opted for @Jeroen Mostert's proposal, since no other working solutions have turned up.
I have created a small command line tool that receives the parameters for the source and destination connections. I've tested it with a table and it goes just as fast as the import/export does.
Using sourceConn As New SqlConnection(sourceConnStr)
    Using destinationConn As New SqlConnection(destinationConnStr)
        Dim cM As New SqlCommand("SELECT * FROM " & sourceTable, sourceConn)
        sourceConn.Open()
        Using dR As SqlDataReader = cM.ExecuteReader
            Dim bC As New SqlClient.SqlBulkCopy(destinationConn)
            destinationConn.Open()
            'Truncate
            cM = New SqlCommand("TRUNCATE TABLE " & destinationTable, destinationConn)
            cM.ExecuteNonQuery()
            'BulkCopy
            bC.DestinationTableName = destinationTable
            bC.WriteToServer(dR)
            'Close connections
            dR.Close()
            sourceConn.Close()
            destinationConn.Close()
        End Using
    End Using
End Using

Variables declaration in Create Database Script

I have a large database creation query the beginning of which is as follows:
USE Master
GO
IF EXISTS(SELECT * FROM sys.sysdatabases where name = 'MyDatabase')
DROP DATABASE MyDatabase
GO
CREATE DATABASE MyDatabase
GO
USE MyDatabase
GO
I want to declare a variable at the beginning like this:
DECLARE @MainDB VARCHAR(30) = NULL
USE Master
GO
IF EXISTS(SELECT * FROM sys.sysdatabases where name = @MainDB)
DROP DATABASE @MainDB
GO
CREATE DATABASE @MainDB
GO
USE @MainDB
GO
I would execute this query from the command line, with the new database name being assigned using the sqlcmd tool. However, SQL Server tells me that the variable @MainDB is not declared. Is this something I can do? If not, how would you recommend I work around this problem?
This question is kind of a rehash.
How to use a variable for the database name in T-SQL?
The short answer is you can't have a variable database name. You have to put your T-SQL into a string/VARCHAR(MAX), do a find and replace, then execute it.
Sorry, but you cannot do it like this, because SQL variables only have batch scope and those GO commands indicate the boundaries of your batches. Thus, every time you pass a GO, all of your variables get wiped out (they're not even declared any more).
There are several ways around this:
Get rid of the GOs: this is what we do when we need a stored procedure (which cannot have any GOs in it) instead of a script, but it is a fairly complicated series of sophisticated tricks that have to be wrapped around each other in order to pull it off. You have to be pretty T-SQL adept to use it.
Use a #temp table to hold your values instead. This works because #temp tables have session scope instead of just batch scope.
I believe that you can also use and manipulate NT/DOS environment variables from SQLCMD as well, though I am not familiar with exactly how to do it.
If you want to pursue any of these, let me know which one and I can explain in more detail with examples.
Oops, I missed the "variable Database name" part of this. That can be done as well, but you have to add dynamic SQL to the mix.
You have multiple issues...
GO ends the current batch, so your variable is not visible anymore. You need to move the declaration. Also, you cannot DROP DATABASE @MainDB; you have to use dynamic queries. For example,
GO
DECLARE @MainDB VARCHAR(30) = 'MyDB';
IF EXISTS(SELECT * FROM sys.sysdatabases where name = @MainDB)
BEGIN
    EXECUTE ('DROP DATABASE ' + @MainDB);
END;
You say you need to run this from the command line, so if you have a lot of code which depends on the structure you have already built you can do the following:
Create a temporary copy of the script
Search and replace @MainDB with the DB name you pass in as the parameter in the temporary script
Run the temporary script using the sqlcmd tool
Obviously, remove the DECLARE @MainDB varchar(30) = NULL from your script if you like this option.
If you choose this approach, you can implement your 3 steps using a variety of different technologies (powershell, python, batch file, VBScript ...).
VBScript file approach:
Const ForReading = 1, ForWriting = 2
Set obj = CreateObject("Scriptlet.TypeLib")
tempSqlFile = Left(obj.GUID, 38) & ".sql" 'get a new name for the temporary sql file
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFile = objFSO.OpenTextFile(WScript.Arguments(0), ForReading) 'open the template script
strSQLText = objFile.ReadAll
objFile.Close
strNewSQLText = Replace(strSQLText, "@MainDB", WScript.Arguments(1)) 'replace the db name
Set objFile = objFSO.OpenTextFile(tempSqlFile, ForWriting, True) 'write the temporary copy
objFile.WriteLine strNewSQLText
objFile.Close
Set Shell = WScript.CreateObject("WScript.Shell")
commandLine = "osql -E -i " & tempSqlFile & " -o " & tempSqlFile & ".rpt" 'run the temporary script
Set oExec = Shell.Exec(commandLine)
Apologies for the variable names - I cut and pasted bits and pieces from various places but you should get the gist.
(Also - apologies for choosing VBScript out of all those options and be aware that there is no error checking for missing parameters)
As it stands above, if you save that script as 'runmystuff.vbs' then you can do:
runmystuff.vbs sqlfile.sql MagicNewDB
This will replace @MainDB with MagicNewDB everywhere inside the script and then run it using osql.
I found a way to map command-line variables to T-SQL variables:
USE master
GO
DECLARE @Mydb VARCHAR(30) = '$(mydb)'
IF EXISTS(SELECT * FROM sys.sysdatabases where name = @Mydb)
print @Mydb
Execute('create database ' + @Mydb)
The batch file I run from looks like this.
sqlcmd -S %1 -i CreateDatabases.sql -v mydb="%2"
I can now run from sqlcmd and enter my server for %1 and desired DB name for %2.
Thanks everyone for the replies; they all helped me find the right solution.

PostgreSQL 9: UPDATE not working

I am using PostgreSQL 9.
When trying to do this update, the table row does not get updated.
$cmd = "UPDATE table1 SET field1 = '$value1' WHERE key_field = '$key_value'; ";
table1 has privileges for PUBLIC to INSERT and UPDATE.
When using the pgAdmin III SQL console, it does the job perfectly.
Don't use variable parsing (or string concatenation) to build SQL queries; use parameterized queries instead.
What does "using PgAdminIII sql console it does perfectly the job" mean? You have pasted the same query in pgAdmin3 and it worked? I very much doubt pgAdmin3 understands PHP and does PHP-style variable parsing as a consequence.
If it was not exactly the same query (most probably it was one with the PHP variables replaced with literals) what was the query you tested in pgAdmin3?
Most probably the reason the update is ineffective is that there are no rows that satisfy your WHERE clause.
$cmd = "UPDATE table1 SET field1 = '$value1' WHERE key_field = '$key_value'";
Now try it; there was an extra ; at the end of the original query string.
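More generally, the safe pattern is to bind the values as query parameters rather than interpolating them into the SQL string (in PHP that would be pg_query_params). A minimal sketch of the same UPDATE done with parameters from .NET, assuming the Npgsql ADO.NET provider; the connection string and values are placeholders:
using Npgsql;

string value1 = "new value";    // placeholder values for illustration
string keyValue = "some key";

using (var conn = new NpgsqlConnection("Host=localhost;Database=mydb;Username=me;Password=secret"))
{
    conn.Open();
    using (var cmd = new NpgsqlCommand(
        "UPDATE table1 SET field1 = @value1 WHERE key_field = @key_value", conn))
    {
        // The values travel as parameters, so quoting and escaping are handled for us.
        cmd.Parameters.AddWithValue("value1", value1);
        cmd.Parameters.AddWithValue("key_value", keyValue);

        // ExecuteNonQuery returns the number of rows affected;
        // 0 means no row matched the WHERE clause.
        int affected = cmd.ExecuteNonQuery();
    }
}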

How to get SQL Server CE TableAdapter to commit to database?

VS2008 project. Created local database (SSCE). Created dataset with tableadapter. Dragged dataset and tableadapter to the form so I can reference it in code.
New records successfully add to the dataset but will not commit back to the database. Gives no error or clue why it won't work.
The TableAdapter insert statement was created automatically and is parameterized (@p1, @p2, etc.), but I am trying to avoid those Parameters.Add statements, as I want to be able to use a field-by-field format without having to essentially repeat the schema of my database in code. A Command object with INSERT statements works fine, but then you always have to construct an INSERT statement -- a pain if they're complicated. I want something as simple as working with an ADO recordset, but in .NET.
I know the last statement with the update is wrong without the parameters, but I am looking for an alternative method.
What can I do to accomplish the following without parameter.add statements?
DocsTableAdapter1.Fill(Documents1.Docs)
Debug.Print("Starting Row Count is: " & Documents1.Docs.Count.ToString)
Dim dr As DataRow = Documents1.Docs.NewRow
dr("Name") = "John Smith"
dr("Reference") = "My new reference code"
Documents1.Docs.Rows.Add(dr)
Debug.Print("New Row Count is: " & Documents1.Docs.Count.ToString)
DocsTableAdapter1.Update(Documents1.Docs)

Changing the MySQL query delimiter through the C API

How can I change the MySQL query delimiter using the C API? I tried sending DELIMITER | as a query, and it complained about "...the right syntax to use near 'delimiter' at line 1...".
Tried DELIMITER |; too, no luck. Tried DELIMITER |; SELECT 1|, no luck. :(
The reason I ask is that I need to create a trigger through the C API such as the following:
create trigger increase_count after insert on torrent_clients for each row
begin
IF NEW.BytesLeft = 0 THEN
UPDATE torrents SET Seeders = Seeders + 1 WHERE torrents.InfoHash = NEW.InfoHash;
ELSE
UPDATE torrents SET Leechers = Leechers + 1 WHERE torrents.InfoHash = NEW.InfoHash;
END IF;
end|
but I don't think there is a way to do it without changing the delimiter, is there?
Thanks!
EDIT: Note that I do need to explicitly write the delimiter at the end, as I'm running multiple queries with only one API call (I have multi statements on)
EDIT2: the mysql client, which uses the C API, does this, so there must be a way...
Changing the delimiter is only needed when using the mysql client program (because it is mysql that interprets the semicolon as a statement delimiter). You don't need to change the delimiter when using the C API:
Normally, you can execute only a single SQL command with mysql_query(). A semicolon can appear in such a command only when it is syntactically allowed, and that is normally not the case! In particular, SQL does not allow for a command to end in a semicolon. (The C API will raise an error if you attempt to do so.)
The commands CREATE PROCEDURE, CREATE FUNCTION, CREATE TRIGGER, and the like are exceptions when they define a stored procedure or trigger: In such commands the semicolon serves as a separator between the SQL instructions that are part of the stored procedure or trigger. Such commands can be executed without problem.
P.S. You probably should set multi statements off again.
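To make the point concrete: the whole CREATE TRIGGER statement, including the semicolons inside its body, can be sent as one single statement, because DELIMITER is a feature of the command-line client rather than the server. A rough illustration of the same idea from C# using the MySql.Data ADO.NET provider (the provider choice and connection string are assumptions for illustration; in the original C API scenario, mysql_query(conn, sql) takes the identical string):
using MySql.Data.MySqlClient;

// The trigger from the question, sent as one statement: no DELIMITER handling needed.
const string createTrigger = @"
create trigger increase_count after insert on torrent_clients for each row
begin
    IF NEW.BytesLeft = 0 THEN
        UPDATE torrents SET Seeders = Seeders + 1 WHERE torrents.InfoHash = NEW.InfoHash;
    ELSE
        UPDATE torrents SET Leechers = Leechers + 1 WHERE torrents.InfoHash = NEW.InfoHash;
    END IF;
end";

using (var conn = new MySqlConnection("Server=localhost;Database=tracker;Uid=me;Pwd=secret"))
{
    conn.Open();
    using (var cmd = new MySqlCommand(createTrigger, conn))
    {
        cmd.ExecuteNonQuery();   // the server parses the compound trigger body itself
    }
}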
You can execute multiple commands using mysql_query. You will have to set some parameters though while establishing the connection.
e.g.,
unsigned long opt_flags = CLIENT_FOUND_ROWS | CLIENT_MULTI_STATEMENTS | CLIENT_MULTI_RESULTS ;
if (0 == mysql_real_connect(mConn,mHostName,mUserName,mUserPass,mDbName,mPort,0,opt_flags))
...
/* execute multiple statements */
status = mysql_query(mConn,
"DROP TABLE IF EXISTS test_table;\
CREATE TABLE test_table(id INT);\
INSERT INTO test_table VALUES(10);\
UPDATE test_table SET id=20 WHERE id=10;\
SELECT * FROM test_table;\
DROP TABLE test_table");
Did you try
DELIMITER //
Without a ";" at the end? It might be not possible to change the delimiter within a query that has multiple queries in it.
