SQL WHERE greater than in Groovy script - sql-server

The following Groovy script does nothing:
def cmd = /sqlcmd -S 127.0.0.1\MSSQLSERVER -d LocalDevelop10DB -Q "DELETE FROM T_TimeRegistration WHERE TimeRegLineNr > 36"/
cmd.execute()
While this Groovy script works perfectly:
def cmd = /sqlcmd -S 127.0.0.1\MSSQLSERVER -d LocalDevelop10DB -Q "DELETE FROM T_TimeRegistration WHERE TimeRegLineNr = 37"/
cmd.execute()
I want to use the (effects of the) first script. It seems the '>' character is somehow not supported; I tried escaping it, but no joy. What am I missing? Can someone help?
Thanks

I don't know Groovy at all, but if you want to avoid the greater-than symbol, you could use BETWEEN:
DELETE FROM T_TimeRegistration
WHERE TimeRegLineNr between 37 and 2147483647
2147483647 is the maximum int value.

Do you get any error when you run the script?
Try running the query (DELETE FROM T_TimeRegistration WHERE TimeRegLineNr > 36) from SSMS and see if it works, or see why it doesn't.
Maybe you have some FK restrictions and one of the rows is referenced in another table.
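One more thing that may be worth trying (a sketch, untested): when the command is built as a single Groovy string, Groovy splits it on whitespace before launching the process, which can mangle the quoted -Q argument. Passing the command as a list keeps the whole query, '>' included, as one argument to sqlcmd:
// Sketch: pass the command as a list so the -Q query reaches sqlcmd as a single argument
def cmd = ['sqlcmd', '-S', '127.0.0.1\\MSSQLSERVER', '-d', 'LocalDevelop10DB',
           '-Q', 'DELETE FROM T_TimeRegistration WHERE TimeRegLineNr > 36']
def proc = cmd.execute()
proc.waitForProcessOutput(System.out, System.err)   // print sqlcmd's output and errors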

Related

psql Batch File - Escaping "Not Equal" Operator

I'm working on a batch file that will import data into the PostgreSQL database I use for testing. The batch file drops all of the databases, then recreates/reloads them from a previous dump file made from our production database. However, I sometimes run into a problem if I've accidentally left a connection open to that server/database. The "drop" portion fails because there are still users connected (me).
I've been trying to "tweak" my batch file with a command to disconnect all users from the database(s) prior to issuing the command to drop them, but I can't get that part (disconnection) to work. I've taken the disconnect code from another SO question How to drop a PostgreSQL database if there are active connections to it?, and I've been looking at other questions like How to execute postgres' sql queries from batch file? for help with the syntax.
I've also seen the "alternate" syntax for a not equal operator on the 9.2. Comparison Functions and Operators page of the official PostgreSQL documentation, but that seems to also be using "special" characters that would require escaping, so I'm not sure how to proceed.
At this point, the batch file looks like this:
@Echo OFF
SET PGPASSWORD=PASSWORD
cd /D "C:\PostgreSQL\bin"
psql.exe -h localhost -p 5432 -d postgres -U username -c 'SELECT pg_terminate_backend(pg_stat_activity.pid) FROM pg_stat_activity WHERE pg_stat_activity.datname = ''betadb'' AND pid \<\> pg_backend_pid();'
dropdb.exe -h localhost -p 5432 -U username betadb
psql.exe -h localhost -p 5432 -d postgres -U username < "C:\PostgresSQL\prodserverdump.sql"
Everything else works except for the pg_terminate_backend query. Every time I run that, I get strange errors indicating a problem with a path, or a file, or something else like that. I believe I've narrowed the problem down to the "not equal" operator (<>) in the query, but I can't seem to find the correct way to escape this so it doesn't try to pipe in data from a file that's not being defined.
I've tried using single backslashes (\) and double backslashes (\\), in front of one or both of the characters in the operator, but that doesn't appear to work. Is there a special way to escape the "greater than" and "less than" characters for the -c command line option in psql?
Using a combination of suggestions and "trial & error", I believe I found the correct syntax for executing this particular SQL command through a batch file.
Trying the "alternative" not equal operator (!=), I was still getting errors. They were different errors (it was giving me some nonsense about too many parameters), but it still wouldn't execute.
Using @Compo's suggestion from the comments, I then tried to enclose the entire SELECT statement in double quotes instead of single quotes. Still not quite there.
Finally, I removed the "extra" single quotes I was using around the database names from before. The query appears to have executed properly.
The final result looks like this:
@Echo OFF
SET PGPASSWORD=PASSWORD
cd /D "C:\PostgreSQL\bin"
psql.exe -h localhost -p 5432 -d postgres -U username -c "SELECT pg_terminate_backend(pg_stat_activity.pid) FROM pg_stat_activity WHERE pg_stat_activity.datname = 'betadb' AND pid != pg_backend_pid();"
dropdb.exe -h localhost -p 5432 -U username betadb
psql.exe -h localhost -p 5432 -d postgres -U username < "C:\PostgresSQL\prodserverdump.sql"
I suppose I had assumed that, because all of the examples I had found were using single quotes to surround the SQL statement, that's what I had to use. Apparently, that assumption was incorrect.
Regardless, it all seems to be working correctly now. Hope this helps someone else who's looking to accomplish something similar.
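For what it's worth, this lines up with how cmd.exe parses the command line: single quotes mean nothing to cmd, so an unquoted < or > is taken as file redirection (which would explain the errors about paths and files), while characters inside double quotes are handed to the program untouched. Outside quotes they can also be escaped with ^:
REM Unquoted < and > are redirection operators; escaped with ^ they are literal text:
echo 1 ^<^> 2
REM Inside double quotes no escaping is needed, which is why the double-quoted -c argument works.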

SQL - Automatic results to CSV or Text File

I was wondering if anyone can help.
I have a number of queries in SQL (all in separate *.sql files). I wanted to know if there is a way to run these queries automatically, or mass-run them and save the results to either a CSV or TXT file?
Also, I have some variables within these queries which will need to be amended on a weekly basis before the queries are run.
Thanks.
KJ
Could you please provide some additional help in relation to the variables? Previously I would declare and set variables as:
DECLARE @TW_FROM DATETIME
DECLARE @TW_TO DATETIME
SET @TW_FROM = '2015-11-16 00:00:00';
SET @TW_TO = '2015-11-22 23:00:00';
How do I do this using sqlcmd?
Yes, you can use sqlcmd to do this.
First of all - variables. You can refer to your variables in the .sql files using $(variablename) wherever you want to substitute the variable. For example,
use $(dbname);
select $(columnname) from table1 where column= '$(var1)'
You then call sqlcmd with the following command (note the -v argument, which supplies the variables):
sqlcmd -S servername -d database -i "yoursqlfile.sql" -v dbname="database" columnname="column" var1="Fred"
To output this to a file, add > filename.txt to the end:
sqlcmd -S servername -d database -i "yoursqlfile.sql" -v dbname="database" columnname="column" var1="Fred" > filename.txt
If you want to output to a CSV, you can also specify the delimiter using the -s argument (note the difference from the capital -S used for the server). So now we have:
sqlcmd -S servername -d database -s "," -i "yoursqlfile.sql" -v dbname="database" columnname="column" var1="Fred" > filename.csv
If you want to output several commands to the same CSV or TXT file, use >> instead of >, as it appends to the bottom of the file rather than replacing it.
sqlcmd -S servername -d database -s "," -i "yoursqlfile.sql" -v dbname="database" columnname="column" var1="Fred" >> filename.csv
To run this for several scripts, you can put the statements in a batch file, and then change the variables every week.
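Tying that back to the DECLARE/SET example above, a weekly batch file might look something like this (file, column, and server names are placeholders; the .sql file would reference $(TW_FROM) and $(TW_TO) instead of declaring them):
REM weekly_report.sql would contain e.g.:  WHERE SomeDate BETWEEN '$(TW_FROM)' AND '$(TW_TO)'
sqlcmd -S servername -d database -s "," -i "weekly_report.sql" -v TW_FROM="2015-11-16 00:00:00" TW_TO="2015-11-22 23:00:00" > weekly_report.csv
Each week you only need to edit the two -v values.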
You could write a batch file that uses sqlcmd:
MSDN sqlcmd
That will allow you to call script files in a loop and output the results to a file.
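For example, a loop along these lines (folder and server names are placeholders) would run every .sql file in a folder and append all results to one file:
for %%f in (C:\queries\*.sql) do sqlcmd -S servername -d database -s "," -i "%%f" >> C:\queries\all_results.csv
(Use a single % instead of %% if you run it directly at the command prompt rather than from a .bat file.)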
Convert your current scripts to a stored procedure.
You can then pass your variables to that and run the query.
If you have SQL Server agent available (SQL standard or better) you can use this to automate the running of the stored procedures.
Otherwise the same can be achieved with Task Scheduler in windows.
As for exporting to CSV this will be useful.
It depends on where your SQL Server is actually running. It might be quite tricky to write anything to the location you want.
You could read about BCP.
My suggestion is:
Create a UDF (ideally an inline UDF!) for each of your queries within your database. Then call them from Excel or any other fitting product. You might want to set up an Excel workbook where all your queries are filled in automatically, one on each sheet.
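As a rough illustration of that suggestion (object and column names are invented), an inline table-valued UDF wrapping one of the weekly queries could look like:
-- Hypothetical inline table-valued function; Excel (or any client) can then SELECT from it
CREATE FUNCTION dbo.fn_WeeklyReport (@TW_FROM DATETIME, @TW_TO DATETIME)
RETURNS TABLE
AS
RETURN
(
    SELECT SomeColumn, SomeDate
    FROM dbo.SomeTable
    WHERE SomeDate BETWEEN @TW_FROM AND @TW_TO
);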

Variable defined in SQL readable by .cmd (pass a variable from SQL to cmd)

I know how to set up a variable in a cmd file that passes the variable to SQL via sqlcmd.
Example:
sqlcmd -Usa -Ppass -d MASTER -v num="%num%" -i C:\scriptfile
My question is how I can define a variable in SQL that can be read outside of SQL. I know how to declare and define @variables in a SQL script, but those are not recognized outside of the SQL script's execution.
My question is: how do you pass a variable from SQL back to cmd?
Is there any way to accomplish this?
Thank you
You can do this using a scripting variable and the -v option of the SQLCMD utility. A small example from the MSDN documentation:
Consider that the script file name is testscript.sql and Col1 is a scripting variable; your SQL script would look like
USE test;
SELECT x.$(Col1) FROM Student x WHERE marks < 5;
You can then specify the name of the column that you want returned by using the -v option like
sqlcmd -v Col1 = "FirstName" -i c:\testscript.sql
This will resolve to the query below:
SELECT x.FirstName FROM Student x WHERE marks < 5;
EDIT:
If you just want to capture the output from your script file, then you can use the -o parameter and specify an output file, like
sqlcmd -v Col1 = "FirstName" -i c:\testscript.sql -o output.txt
Thanks Rahul, you inadvertently answered my question. You can output your script results to a file via the -o option for SQLCMD.
Thinking about that, I realized I could use SQL PRINT statements to create (via -o) a .cmd file that contains the .cmd syntax to define a variable. Then, in my .cmd file, I tell it to run the SQL-generated .cmd file, and the variable gets defined in the .cmd environment.
Kind of a roundabout way, but it works!!
Thanks!
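A minimal sketch of that round trip (file names and the query are made up for illustration):
REM setvar.sql (hypothetical) would contain something like:
REM   DECLARE @n INT; SELECT @n = COUNT(*) FROM SomeTable;
REM   PRINT 'SET MYVALUE=' + CONVERT(VARCHAR(20), @n);
sqlcmd -Usa -Ppass -d MASTER -i C:\setvar.sql -o C:\setvar.cmd
call C:\setvar.cmd
echo Value passed back from SQL is %MYVALUE%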
If you just run a simple SELECT to get your value, or an EXEC spname that returns just the value you are after, you can use the following:
for /f "tokens=*" %a in ('sqlcmd -Usa -Ppass -W -h -1 -d MASTER -Q "select Column from table"') do set ResultVariable=%a
Remember to use %%a if you are putting this in a .bat file.

csv output from windows batch + sqlcmd only returns first column

I have looked all over the internet and can't seem to find a solution to this problem.
I am trying to output query results as a CSV using a combination of sqlcmd and Windows batch. Here is what I have so far:
sqlcmd.exe -S %DBSERVER% -U %DBUSER% -P %DBPASS% -d %USERPREFIX% -Q "SELECT Username, UserDOB, UserGender FROM TABLE" -o %USERDATA%\%USERPREFIX%\FACT_BP.CSV -h-1 -s","
Is there something I'm missing here? Some setting that only looks at the first column of the query results?
Any advice at all would be a huge help - I'm lost.
Here is the reference page from MSDN on SQLCMD.
http://technet.microsoft.com/en-us/library/ms162773.aspx
I placed this command in a batch file in C:\temp as go.bat.
sqlcmd -S(local) -E -dmaster ^
  -Q"select cast(name as varchar(16)), str(database_id,1,0), create_date from sys.databases" ^
  -oc:\temp\sys.databases.csv -h-1 -s,
Notice I hard coded the file name and removed the "" around the field delimiter.
I get the expected output below.
Either the command does not like the system variables or something else is wrong. Please try my code as a baseline test. It works for SQL 2012.
Also, the "(rows affected)" count line is always dumped to the file, and you must clear it out of the file afterwards. That is why I do not use SQLCMD for ETL.
Why not use BCP instead?
I have written several articles on my website:
http://craftydba.com/?p=1584
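If BCP is an option, a rough equivalent of the query from the question would look something like this (untested sketch, reusing the question's placeholders; note that bcp does not write column headers):
bcp "SELECT Username, UserDOB, UserGender FROM %USERPREFIX%.dbo.TABLE" queryout "%USERDATA%\%USERPREFIX%\FACT_BP.CSV" -c -t, -S %DBSERVER% -U %DBUSER% -P %DBPASS%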

using isql across multiple ksh scripts

I'm very new to ksh scripting, but I have 2 ksh scripts, each of them calling a Sybase stored procedure via isql. The issue I'm seeing is that when I execute the first script the stored procedure runs fine, but when I execute the second it fails with an error of (isql -b -S value -U value -P value: not found). Here are code snippets:
Values for $SERVER, $DBO_USER and $DBO_PASSWORD are set earlier in the script.
test1.ksh:
ISQL_CMD="isql -b -S ${SERVER} -U ${DBO_USER} -P ${DBO_PASSWORD}"
VAR=`${ISQL_CMD} << EOF
set nocount on
go
set proc_return_status off
go
declare @var_id int
,@rtnval int
exec @rtnval = DB_NAME..MY_STORED_PROC_1
@parameter1 = ${VAR_IN}
,@parameter_output = @var_id output
go
EOF`
This executes fine and I get a value in the VAR variable.
test2.ksh (VAR variable gets passed in from test1.ksh):
ISQL_CMD="isql -b -S ${DSQUERY} -U ${DBO_USER} -P ${DBO_PASSWORD}"
RETURN_VALUE=`${ISQL_CMD} << EOF
set nocount on
go
declare @rtnval int
exec @rtnval = DB_NAME..MY_STORED_PROC_2
@var_id = '${VAR}'
go
EOF`
I get the following error:
isql -b -S value -U value -P value: not found
These scripts can be run independently of each other, so there is no guarantee that isql has been called before test2.ksh; that is why I set the ISQL_CMD variable in each script.
test1.ksh runs properly but test2.ksh does not, whether called from test1.ksh or run on its own. I tried running the scripts in debug mode, but it didn't really provide any further information.
OK, I have figured this out after a full day of head scratching. test2.ksh performs some file processing and sets the IFS value several lines prior to the isql command. I had to reset or unset IFS once I was done, and the isql command worked fine! The command to unset the IFS value is:
unset IFS
Thanks to those who were trying to help!!
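For context, my reading of why the IFS change breaks it: ${ISQL_CMD} is expanded unquoted, so the shell relies on the default IFS to split it into isql plus its arguments; with IFS altered, the whole string is looked up as one command name, which matches the "isql -b -S value -U value -P value: not found" error. An alternative to unset is to scope the change, roughly:
# Save and restore IFS around the file-processing block instead of leaving it changed
OLD_IFS="$IFS"
IFS=','                    # or whatever the file processing needs
# ... file processing that relies on the custom IFS ...
IFS="$OLD_IFS"             # restore before ${ISQL_CMD} is expanded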

Resources