xp_cmdshell command not executing last command when run as job - sql-server

First off, before everybody shouts at me - I'm bug fixing in legacy code and a re-write is off the cards for now - I have to try to find a fix using the xp_cmdshell command.
I have a proc which is executed via a scheduled job. The proc is full of TSQL like the below to dump data to a log file.
SELECT *
INTO Temp
FROM MyView

DECLARE @cmd1 varchar(8000), @cmd2 varchar(8000)
SET @cmd1 = 'bcp "SELECT * FROM [myDatabase].dbo.Temp" queryout "C:\temp.txt" -T -c -t" "'
SET @cmd2 = 'type "C:\temp.txt" >> "C:\output.txt"'
EXEC master..xp_cmdshell @cmd1
EXEC master..xp_cmdshell @cmd2

DROP TABLE Temp
The problem is that the last of these commands in the proc doesn't appear to run. I can see the result in the temp.txt file, but not in the output.txt file. All of the preceding commands work fine, and the whole thing works fine when I run it on its own.
Can anyone suggest why this might happen or suggest an alternative way to achieve this?
Thanks

I think that BCP, as an external process, may run asynchronously. So it could be that your file is not yet fully written at the moment you try to copy its content.
Suggestion 1: Include an appropriate wait time (see the sketch below)
Suggestion 2: Call your first command a second time with a changed target file name
Suggestion 3: Use copy rather than type
To test this, you might create a file C:\temp.txt with just "hello world" in it. Try to type it into one file before the BCP and into another file after the BCP.
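A minimal sketch of Suggestion 1, based on the proc from the question above; the 10-second delay is an arbitrary value you would need to tune:

DECLARE @cmd1 varchar(8000), @cmd2 varchar(8000)

SET @cmd1 = 'bcp "SELECT * FROM [myDatabase].dbo.Temp" queryout "C:\temp.txt" -T -c -t" "'
SET @cmd2 = 'type "C:\temp.txt" >> "C:\output.txt"'

EXEC master..xp_cmdshell @cmd1

-- Give the external bcp process time to flush and close the file
-- before the next command tries to read it (value is a guess; tune it)
WAITFOR DELAY '00:00:10'

EXEC master..xp_cmdshell @cmd2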

Related

BCP Command to Export Query Results to File with Pipe Delimiter

I have a requirement to create a SQL Job that exports a query to a pipe / vertical bar delimited file (|) and saves it on a network drive in either *.txt or *.csv format. Right now, I am just trying to get this to work inside SSMS, calling the BCP command and exporting the stored procedure's output in the proper format to the network location, but I am not able to get it working.
I have been researching this and there are two methods:
Use the Export Data wizard to create a job and schedule it to run. But with this method, if we need to make changes, I believe we cannot edit the SSIS package that is created, so we lose flexibility.
Use the BCP command to export the file.
I greatly prefer option #2, the BCP command, but I am having problems. I just cannot seem to get the syntax correct and am hoping someone can show me what I am missing:
This is my command:
Exec master..xp_cmdshell 'bcp EXEC [dbo].[usp_Report_1123] ''7786'' -t| out \\networkDrive\Reports\REPORT_1123\report1123.csv -T'
But I get the following messages:
output
'out' is not recognized as an internal or external command,
operable program or batch file.
NULL
The stored procedure does work and returns data, and the network path resolves if I enter it on my computer. But I am not sure what I am missing and am hoping someone can help.
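The error output gives a hint: because the | after -t is not wrapped in quotes, cmd.exe treats it as a pipe operator and tries to run out as a separate command. A sketch of a corrected call, assuming a trusted connection to the default instance; the query must be quoted and sent through queryout (it is a query, not a table), and you may also need -S YourServer (placeholder) or a database-qualified procedure name if the login's default database is not the right one:

EXEC master..xp_cmdshell 'bcp "EXEC [dbo].[usp_Report_1123] ''7786''" queryout "\\networkDrive\Reports\REPORT_1123\report1123.csv" -T -c -t"|"'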

Execute stored procedure by passing the Script File(.sql) as Parameter

I have a stored procedure in which I'm passing a script file (.sql file) as a parameter.
I want to know how the .sql file can be executed through a command (not the command prompt).
exec my_sp VersionNumber, SqlDeltaScript.sql (the second argument is a file)
I want my stored procedure to execute SqlDeltaScript.sql
Can anyone please help regarding this ...
Thanks in advance ...
This does not sound like an ideal situation, but if you have to do this then you could use xp_cmdshell to run the sqlcmd utility against your script file.
The xp_cmdshell SP must be enabled in order to use it - see Enable 'xp_cmdshell' SQL Server.
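A minimal sketch of that approach, assuming a trusted connection to the local default instance and that enabling xp_cmdshell is acceptable for your environment; the parameter names and procedure shape are placeholders, not the poster's actual code:

-- Enable xp_cmdshell (requires appropriate permissions)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
GO

-- Run the .sql file via sqlcmd (-E = trusted connection, -i = input file)
CREATE PROCEDURE my_sp
    @VersionNumber varchar(50),   -- kept to match the poster's call signature; not used here
    @ScriptFile    varchar(260)
AS
BEGIN
    DECLARE @cmd varchar(1000);
    SET @cmd = 'sqlcmd -S . -E -i "' + @ScriptFile + '"';
    EXEC master..xp_cmdshell @cmd;
END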

In SSMS, are commands separated with GO guaranteed to be synchronous?

If I execute the following script:
EXECUTE LongRunningSP1
GO
EXECUTE LongRunningSP2
GO
Assuming both procedures take several minutes, will the GO batching cause any concurrency to happen or is LongRunningSP1 guaranteed to finish before LongRunningSP2 starts?
The GO will just split your code into batches, but it won't cause any concurrency: all batches are executed one at a time, in the order they appear in the code.
LongRunningSP1 is guaranteed to finish before LongRunningSP2 with or without the GO in between; GO is a batch separator for the command processor.
It's easier to see what it does when using the command line utility SQLCMD.
SQLCMD
1> exec LongRunningSP1
-- nothing happens
2> exec LongRunningSP2
-- nothing happens
3> GO
-- both procs are run, first SP1, then SP2
Yes!! GO will actually split it into batches to be executed.
So it's LongRunningSP1 that completes first, ALWAYS!
GO is not a Transact-SQL statement; it is a command recognized by the sqlcmd and osql utilities and the SQL Server Management Studio Code editor. It is a batch terminator; it will not change the order in which your queries run. You can, however, change it to whatever you want under Options.
Here are a set of very simple, easy steps to customize the batch separator in SSMS:
Launch SSMS
Go to Tools -> Options
Click on the “Query Execution” node
Notice that we have an option to change the Batch Separator
Change the batch separator
Click “OK”

Periodically check if a folder has been deleted, using MSSQL SP + Command Line?

I want to write an MSSQL Stored Procedure that periodically checks the filesystem it lives on to see if any folders have been deleted. It should work in an XP or Windows 7 environment.
I was thinking I might use the Windows Command Line or PowerShell (or VBScript) for this. I'll just call the script from the SQL Stored Procedure, it will check the filesystem, and then if a folder has been deleted it will alert the users.
My gut tells me there is a dead-simple solution for this somewhere. I know that matching directories is already a common task.
I've been playing with command line DIR and TREE, but so far they give me too much text. I really just want a simple list of folders that I can put into a small table in SQL. (I know that's overkill but it's what was requested.)
CREATE TABLE [dbo].[TABLEYOUCREATE] (
[dir] varchar(1000)
, [diroutput] varchar(1000)
)
GO
DECLARE @cmd varchar(8000)
-- @path is assumed to hold the folder to check (e.g. a procedure parameter)
SELECT @cmd = 'dir "' + @path + '"'
INSERT INTO TABLEYOUCREATE (diroutput) EXEC master..xp_cmdshell @cmd
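To cut the dir output down to just folder names, a sketch using the /b (bare format) and /ad (directories only) switches; TABLEYOUCREATE is the table from the snippet above and the path is a placeholder:

DECLARE @path varchar(1000), @cmd varchar(8000)
SET @path = 'C:\FoldersToWatch'   -- placeholder: the directory you want to monitor
-- /b = bare output (names only), /ad = list directories only
SET @cmd = 'dir "' + @path + '" /b /ad'
INSERT INTO TABLEYOUCREATE (diroutput) EXEC master..xp_cmdshell @cmd

-- xp_cmdshell may return NULL rows (e.g. for blank output lines); drop them
DELETE FROM TABLEYOUCREATE WHERE diroutput IS NULL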

Executing a bat file inside a Stored Procedure using SQL server 2005

When I try to execute a bat file using xp_cmdshell, I get a 'not recognized' error message.
Following is the command I executed:
EXEC master..xp_CMDShell 'C:\Documents and Settings\adcxqcv\Desktop\PQA\sample.bat'
I got a message as follows:
'C:\Documents' is not recognized as an internal or external command,
operable program or batch file.
NULL
Any suggestions? Let me know how to execute a bat file inside a stored procedure.
I am new to SQL Server.
Thanks,
Vinu
Put the path inside double quotes ("")
EXEC master..xp_CMDShell '"C:\Documents and Settings\adcxqcv\Desktop\PQA\sample.bat"'
xp_cmdshell can be a bit picky about long file names. You are using quotes and it isn't playing ball; double quotes can sometimes work, but if it still doesn't want to play ball then try using the older 8.3 file name instead.
exec master..xp_cmdshell 'c:\docume~1\adcxqcv\Desktop\PQA\sample.bat'
Without parameters:
exec('xp_cmdshell ''C:\script\test.bat'''); -- your bat file location (path)
With parameters:
exec('xp_cmdshell ''C:\script\test.bat ' + @ecistate + ' ' + @stateid + ' ' + @pcno + ''''); -- parameters are appended as command-line arguments
Execute and enjoy the solution:)
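For the parameterised form to run, the three variables must already exist as string variables in the calling batch; a minimal sketch with placeholder names and values (illustrative only, kept to SQL Server 2005 syntax):

DECLARE @ecistate varchar(50), @stateid varchar(50), @pcno varchar(50)
SET @ecistate = 'NY'   -- placeholder values; use your real ones
SET @stateid  = '36'
SET @pcno     = '7'

-- Builds and runs: xp_cmdshell 'C:\script\test.bat NY 36 7'
exec('xp_cmdshell ''C:\script\test.bat ' + @ecistate + ' ' + @stateid + ' ' + @pcno + '''');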
