I have a script with dynamic query. I want to execute the query and output its result to a file. I can't seem to figure out how to output result of an "execute" statement.
Sample code below.
declare @sql_text varchar(300)
select @sql_text = 'select 1'
exec (@sql_text) > output.txt
To give more context. My actual script would be looping through the dynamic query and output to different files (dynamic filename as well).
You set the output file via the -o parameter to the isql client to execute the SQL. This will send the output to a file from any SQL be that normal or dynamic SQL.
So put the SQL in an input file and then run
isql -U user -P password -S servername -i input_filename -o output.txt
You can't write directly to an operating system file from within ASE itself without enabling xp_cmdshell, which is a potential security issue (as it allows O/S commands to be run as the user running the Sybase dataserver) and is therefore prohibited at most sites.
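If the loop over queries and filenames is driven from the client side, a rough batch-file sketch of that idea (assuming a Windows client; the table, server, and login names below are placeholders, not anything from the question) could be:
@echo off
rem Build one input file and one output file per table, then run isql for each.
for %%t in (table_a table_b table_c) do (
    echo select * from %%t > input_%%t.sql
    echo go >> input_%%t.sql
    isql -U user -P password -S SERVERNAME -i input_%%t.sql -o output_%%t.txt
)
The same pattern works from a Unix shell; the key point is that the filenames vary in the calling script, not inside ASE.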
In SQL Server 2016, I am executing a SQL script through SQLCMD like this:
SQLCMD -H XXXXXX,1433 -U username -P password -d mydatabase
-v varMDF="testing" -i "Script.sql" -o "DATA.txt"
and in Script.sql, I want to echo some text to the console, just to see the progress. I have a while loop in the script and am executing the command
echo I am in sql script
as shown here:
OPEN tab_cursor
FETCH NEXT FROM tab_cursor INTO @tablename
WHILE @@FETCH_STATUS = 0
BEGIN
!!echo i am in sql script
PRINT @tablename
FETCH NEXT FROM tab_cursor INTO @tablename
END
CLOSE tab_cursor
DEALLOCATE tab_cursor
The problem is that it displays the line "i am in sql script" only once in the console, but I can see many entries for tablename in my output file. Please help me solve this issue, or suggest another way to do this.
Thanks
I would try the following solutions in order:
1) Look into BCP; it might let you see what you are doing much more effectively, and depending on the size of your output file it may be significantly faster. (1b: look into SSIS, even though it's a huge pain.)
2) Put a SQLCMD execution inside Script.sql that does the data push to the file, and let the PRINT statements work as normal without -o. (NOTE: if this is a Complicated Stored Procedure, why aren't you writing a Complicated Stored Procedure?)
3) Monkey with server monitoring and Profiler. This would be for debugging purposes only, if that's why you need the output.
Generally, it sounds to me like the source of your problem is that you're using the wrong tool for the job. If you want lots of output from SQLCMD on process status, you're probably using it where you should be using BCP, which is designed for doing exports programmatically. SQLCMD isn't all that great an interface for running complicated scripts, in my experience; it works best fire-and-forget.
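For option 1, a minimal bcp sketch (the table, columns, server, and output path below are placeholders, not taken from the question):
bcp "SELECT Col1, Col2 FROM mydatabase.dbo.MyTable" queryout "C:\out\MyTable.csv" -c -t, -S servername -T
Here -c writes character data, -t, sets a comma field terminator, and -T uses a trusted connection; swap -T for -U/-P if you connect with SQL authentication.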
I have created a BCP utility and I have wrapped it in a bat file. I have then created a daily task using Task Scheduler in Windows Server 2012.
The function of the BCP utility is to rename a file called 'myfile.csv' (located in C:) by adding a date stamp to it and updating the file with the result of a SQL query.
The codes currently stand as follows:
cd:\Program Files\Microsoft SQL Server\Client SDK\ODBC\110\Tools\Binn
set vardate=%DATE:~4,10%
set varDateWithoutSlashes=%vardate:/=-%
ren C:\myfile.csv myfile_%varDateWithoutSlashes%.csv
bcp "SELECT TOP 100 ReservationStayID,NameTitle,FirstName,LastName,ArrivalDate,DepartureDate FROM MyDatabase.dbo.GuestNameInfo" queryout C:\myfile.csv -t, -c -S [ipaddress] -U sa -P 1234
My problem is that when the task runs, it renames the file correctly with the date stamp, but it seems that the SELECT query does not run, as the file is empty (except for the headers, which have been pre-loaded, by the way).
What is wrong with my code?
I should also add the following:
Are the double quotes in the select statement above correct? Or should they be single quotes?
Should the ipaddress in my codes above be in square brackets or should I remove them?
I have left the "Location" field 'as is' in the Task Scheduler. Should that be filled in? If yes, with what?
Thanks for helping out!
I was wondering if anyone can help.
I have a number of queries in SQL (all in separate *.sql files). I wanted to know if there is a way to run these queries automatically, or mass-run them, saving the output to either a csv or txt file?
Also, I have some variables within these queries which will need to be amended on a weekly basis before the queries are run.
Thanks.
KJ
Could you please provide some additional help in relation to the variables? Previously I would declare and set variables as:
DECLARE @TW_FROM DATETIME
DECLARE @TW_TO DATETIME
SET @TW_FROM = '2015-11-16 00:00:00';
SET @TW_TO = '2015-11-22 23:00:00';
How do I do this using sqlcmd?
Yes, you can use sqlcmd to do this.
First of all - variables. You can refer to your variables in the .sql files using $(variablename) wherever you want to substitute the variable. For example,
use $(dbname);
select $(columnname) from table1 where column= '$(var1)'
You then call sqlcmd with the following command (note the argument -v variables)
sqlcmd -S servername -d database -i "yoursqlfile.sql" -v dbname="database" columnname="column" var1="Fred"
In order to output this to a file, you tag > filename.txt on the end
sqlcmd -S servername -d database -i "yoursqlfile.sql" -v dbname="database" columnname="column" var1="Fred" > filename.txt
If you want to output to a csv, you can also specify the delimiter using the argument -s (note the difference from the capital S for server). So now we have
sqlcmd -S servername -d database -s "," -i "yoursqlfile.sql" -v dbname="database" columnname="column" var1="Fred" > filename.csv
If you want to output several commands to the same csv or txt file, use >> instead of > as it adds to the bottom of the file rather than replacing it.
sqlcmd -S servername -d database -s "," -i "yoursqlfile.sql" -v dbname="database" columnname="column" var1="Fred" >> filename.csv
To run this for several scripts, you can put the statements in a batch file, and then change the variables every week.
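A minimal batch-file sketch of that last step, reusing the TW_FROM/TW_TO dates from the question and some made-up script names; edit the two set lines each week:
@echo off
set TW_FROM=2015-11-16 00:00:00
set TW_TO=2015-11-22 23:00:00
for %%f in (query1.sql query2.sql query3.sql) do (
    sqlcmd -S servername -d database -s "," -i "%%f" -v TW_FROM="%TW_FROM%" TW_TO="%TW_TO%" >> results.csv
)
Inside each .sql file, refer to the dates as '$(TW_FROM)' and '$(TW_TO)'.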
You could write a batch file that uses sqlcmd:
MSDN sqlcmd
That will allow you to call script files in a loop and output the results to a file.
Convert your current scripts to a stored procedure.
You can then pass your variables to that and run the query.
If you have SQL Server Agent available (SQL Server Standard or better), you can use it to automate running the stored procedures.
Otherwise the same can be achieved with Task Scheduler in Windows.
As for exporting to CSV, this will be useful.
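A rough sketch of the stored-procedure route, reusing the TW_FROM/TW_TO variables from the earlier question; the table and column names are placeholders:
CREATE PROCEDURE dbo.usp_WeeklyExtract
    @TW_FROM DATETIME,
    @TW_TO   DATETIME
AS
BEGIN
    SET NOCOUNT ON;
    -- Replace this SELECT with the body of your existing script
    SELECT Col1, Col2
    FROM dbo.MyTable
    WHERE EventDate >= @TW_FROM
      AND EventDate <= @TW_TO;
END
The scheduled job then only needs one line, with the dates changed weekly (or passed in by the batch file):
EXEC dbo.usp_WeeklyExtract @TW_FROM = '2015-11-16 00:00:00', @TW_TO = '2015-11-22 23:00:00';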
It depends on where your SQL Server is actually running. It might be quite tricky to write anything to the location you want.
You could read about BCP.
My suggestion is:
Create a UDF (an inline UDF is best!) from each of your queries within your database. Then call them from Excel or any other fitting product. You might want to set up an Excel workbook where all your queries are filled in automatically, one on each sheet.
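A minimal inline table-valued function sketch of that suggestion (table, column, and parameter names are placeholders):
CREATE FUNCTION dbo.fn_WeeklyReport (@TW_FROM DATETIME, @TW_TO DATETIME)
RETURNS TABLE
AS
RETURN
(
    SELECT Col1, Col2
    FROM dbo.MyTable
    WHERE EventDate >= @TW_FROM
      AND EventDate <= @TW_TO
);
Excel can then pull SELECT * FROM dbo.fn_WeeklyReport('2015-11-16', '2015-11-23') as an external data query, one query per sheet.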
I know how to set up a variable in a cmd file that passes the variable to SQL via sqlcmd.
Example:
sqlcmd -Usa -Ppass -d MASTER -v num="%num%" -i C:\scriptfile
My question is how I can define a variable in SQL that can be read outside of SQL. I know how to declare and define @variables in a SQL script, but those are not recognized once the SQL script has finished running.
In other words, how do you pass a variable from SQL back to cmd?
Is there anyway to accomplish this?
Thank you
You can do this using a scripting variable and the -v option of the SQLCMD utility. Here is a small example based on the MSDN documentation.
Consider that the script file name is testscript.sql and Col1 is a scripting variable; your SQL script would look like
USE test;
SELECT x.$(Col1) FROM Student x WHERE marks < 5;
You can then specify the name of the column that you want returned by using the -v option like
sqlcmd -v Col1 = "FirstName" -i c:\testscript.sql
which will result in the query below:
SELECT x.FirstName FROM Student x WHERE marks < 5;
EDIT:
If you just want to capture the output from your script file, then you can use the -o parameter and specify an output file like
sqlcmd -v Col1 = "FirstName" -i c:\testscript.sql -o output.txt
Thanks Rahul, you inadvertently answered my question. You can output your script results to a file via the -o option for SQLCMD.
Thinking about that, I realized I could use PRINT in the SQL to write out (via -o) a .cmd file that contains the .cmd syntax to define a variable. Then in the outer .cmd file I call the SQL-generated .cmd file, and the variable gets defined in the .cmd environment.
Kind of a roundabout way, but it works!
Thanks!
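For anyone wanting to copy the workaround, a rough sketch with made-up file names and query:
-- setvar.sql : PRINT emits a line of .cmd syntax
SET NOCOUNT ON;
DECLARE @cnt varchar(20);
SELECT @cnt = CAST(COUNT(*) AS varchar(20)) FROM dbo.MyTable;
PRINT 'set MYRESULT=' + @cnt;
And the outer .cmd file:
sqlcmd -S servername -d mydatabase -E -i setvar.sql -o setvar.cmd
call setvar.cmd
echo Value passed back from SQL: %MYRESULT%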
If you just run a simple select to get your value or an exec spname that returns just the value you are after, you can use the following.
for /f "tokens=*" %a in ('sqlcmd -Usa -Ppass -W -h -1 -d MASTER -Q "select Column from table"') do set ResultVariable=%a
Remember to use %%a if putting this in a bat file
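For example, inside a .bat file the same approach might look like this (the query and variable name are only illustrative):
for /f "tokens=*" %%a in ('sqlcmd -Usa -Ppass -W -h -1 -d MASTER -Q "select name from sys.databases where database_id = 1"') do set ResultVariable=%%a
echo %ResultVariable%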
I have a situation where I have a data table in a SQL Server database. Now I want it to be exported to an XML file through SQL commands. How can I do it?
You can execute the query from the command shell with the -o parameter, or run the same command via xp_cmdshell. In the SELECT statement you can use the FOR XML option, and the results must be printed with the PRINT command.
for example:
sqlcmd -S "(local)\sqlexpress" -E -d "EFEx"
-q "declare @i nvarchar(max)
set @i = (select * from [Group] for xml auto, root('groups'))
print @i"
-o "C:\Projs\results.xml"
Don't forget the security rights for creating the file, and to enable xp_cmdshell (it is an advanced option, so 'show advanced options' has to be turned on first):
exec sp_configure 'show advanced options', 1
reconfigure;
exec sp_configure 'xp_cmdshell', 1
reconfigure;
I hope this helps.
The short answer is no, you can't output to a file using pure T-SQL. Whatever client you are using to connect to the SQL server needs to pipe the output to a file.
You have a few choices:
- BCP utility (recommended, read the XML format file section)
- SQLCMD utility
- Build an SSIS package to do the job (sounds like overkill for your case)
That said, you can execute these commands from a SQL session using xp_cmdshell.
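A rough sketch of that last point, calling bcp through xp_cmdshell (the database, table, server, and path are placeholders, and xp_cmdshell must already be enabled):
EXEC xp_cmdshell 'bcp "SELECT * FROM MyDatabase.dbo.MyTable FOR XML AUTO, ROOT(''rows'')" queryout "C:\Projs\results.xml" -c -T -S servername';
For large XML documents you would normally switch from -c to a format file, as the XML format file section mentioned above describes.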