I'm trying to save an XML file from SQL Server. I'm doing something like:
DECLARE @FileName VARCHAR(100)
DECLARE @db VARCHAR(100)
DECLARE @SQLCmd VARCHAR(1000)
SELECT @FileName = 'D:\ProductsList.xml'
SELECT @db = 'MyBase.dbo.'
SELECT @SQLCmd = 'bcp "SELECT ' + @db + 'ArticleName FROM ' + @db + 'ARTICLES' +
' FOR XML PATH(''Product''), ELEMENTS, ROOT(''Products''), TYPE"' +
' queryout ' + @FileName + ' -w -T -S' + @@SERVERNAME
EXECUTE master..xp_cmdshell @SQLCmd
It works fine on one SQL Server but fails on another with the message "Error = [Microsoft][SQL Server Native Client 11.0]Unable to open BCP host data-file".
bcp, queryout and xp_cmdshell are unfortunately all new to me.
I've been reading a lot of websites, and they all have some interesting information, but I really don't know where to begin.
They say - add permissions for the user running the script. But who is this user if not me? ;) Is there some "automatic" Windows user created for running SQL Server or what? Where can I see this user and configure its rights so it can save my file?
Or maybe there are some other issues with bcp and xp_cmdshell that I should know about?
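One sanity check I've seen suggested (just a sketch, assuming xp_cmdshell is already enabled on the instance) is to ask xp_cmdshell itself which account it runs under, and whether that account can reach the target folder:
-- Show the Windows account xp_cmdshell commands run under, then test access to D:\
EXEC master..xp_cmdshell 'whoami';
EXEC master..xp_cmdshell 'dir D:\';
From what I've read, for members of the sysadmin role this is the SQL Server service account; for everyone else it is the proxy account configured with sp_xp_cmdshell_proxy_account, if one exists.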
Big thanks for any help. I'm really stuck :(
Related
Is there any way I can write a file to Azure storage from Azure SQL?
Please note: I have created a secure connection and am able to read a file from storage into SQL Server, but I am not able to write back to the file.
Any suggestions? The code below works for writing a file on-premises, but I'm not sure about Azure SQL:
declare @fn varchar(500) = 'E:/c/logs/' + @filename + '_' + @fileTimeStamp + '.' + @fileExtension;
declare @cmd varchar(8000) = concat('echo ', @var, ' > "', @fn, '"');
print @cmd
exec xp_cmdshell @cmd, no_output
set @cmd = concat('type "', @fn, '"');
print @cmd
exec xp_cmdshell @cmd;
Depending on the architecture of your database, the only solution I'm aware of is running CETAS statements when in a SQL pool (using Synapse).
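For reference, a CETAS export looks roughly like this (a sketch only: the external data source and file format must already exist, and every name below is a placeholder rather than something from the question):
-- Sketch of CETAS in a Synapse SQL pool; MyAzureStorage and MyParquetFormat
-- stand for an existing EXTERNAL DATA SOURCE and EXTERNAL FILE FORMAT
CREATE EXTERNAL TABLE dbo.LogExport
WITH (
    LOCATION = '/exports/logs/',
    DATA_SOURCE = MyAzureStorage,
    FILE_FORMAT = MyParquetFormat
)
AS
SELECT *
FROM dbo.SomeLogTable;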
Otherwise, you are looking at Data Factory or SSIS with the Azure toolkit.
Migrating from a hosted SQL Server to AWS RDS, which is a managed service (SaaS), we couldn't enable xp_cmdshell.
Our Java app currently uses a stored procedure that relies on xp_cmdshell to execute BCP.exe (the Bulk Copy Program, a command-line tool used to import or export data against a Microsoft SQL Server) to export large data sets to CSV files.
And since xp_cmdshell cannot be enabled on RDS, we cannot use it to execute BCP.
Stored procedure code:
-- @header_select, @sql_export and @pathAndFileName are parameters of the stored procedure
declare @bcp_header varchar(8000)
declare @bcp varchar(8000)
declare @bcpCopy varchar(8000)
declare @deleteFile varchar(300)
select @bcp_header = 'BCP "' + @header_select + '" queryout ' + @pathAndFileName + '.csv -c -C 1252 -t; -T '
select @bcp = 'BCP "' + @sql_export + '" queryout ' + @pathAndFileName + '_data.csv -c -C 1252 -t; -T '
select @bcpCopy = 'TYPE ' + @pathAndFileName + '_data.csv >> ' + @pathAndFileName + '.csv'
select @deleteFile = 'DEL ' + @pathAndFileName + '_data.csv'
exec master..xp_cmdshell @bcp_header
exec master..xp_cmdshell @bcp
exec master..xp_cmdshell @bcpCopy
exec master..xp_cmdshell @deleteFile
Can we use a T-SQL command to export the query result to a CSV file and host it on S3 ?
Can we use a T-SQL command to export the query result to a CSV file and host it on S3 ?
Nope. There are T-SQL alternatives to the BCP command, but ONLY for import.
Your best bet is to write a program that reads the data from SQL via some sort of SELECT statement and writes out the file yourself - basically a homebrew replacement for BCP.
There is nothing out of the box that can act as a BCP replacement, as all non-BCP bulk functionality is either for programmatic use OR import only.
The answer is to use BCP.exe inside an SSIS package, using the SQL Server that the SSIS instance uses.
I have an SSIS package which logs into my client's SFTP site and uploads a file using commands sent to WinSCP.
This package fails quite often and is difficult to work with. Visual Studio crashes so frequently that my team refuses to use it any more. We are migrating all automated tasks to Stored Procedures and SQL Agent Jobs, as they work a lot better than SSIS does.
However, this leaves us with no good solution for automating FTP uploads of report files produced on SQL Server.
I have been doing it using PSFTP and batch files called from VB scripts. The script checks a particular folder to see if a file is there and uploads it if it is. This works, but is inelegant and I have to schedule these scripts to look for a file in the folder every 15 minutes, rather than just call an "upload" command from my stored procedure on SQL Server. There's potential for things to go horribly wrong if someone renames a file in that folder, or puts two files in there, etc.
I can easily create a .CSV file as part of a stored procedure into a particular folder using BCP, e.g.:
--Create Client Update .CSV File if there are any rows in the CLIENT_UPDATE table
IF EXISTS(SELECT 1 FROM CLIENT_UPDATE)
BEGIN
-- @FileName, @DataPath and @BCPCommand are declared earlier in the procedure
DECLARE @ColumnHeader VARCHAR(8000)
SET @FileName = @DataPath + '\Output File.csv'
SELECT @ColumnHeader = COALESCE(@ColumnHeader + ',', '') + '''' + column_name + '''' FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = 'CLIENT_UPDATE'
SET @BCPCommand = 'bcp "SELECT ' + @ColumnHeader + ' UNION ALL SELECT * FROM CLIENT_UPDATE" queryout "' + @FileName + '" -c -t , -r \n -S . -T'
EXEC master..xp_cmdshell @BCPCommand
END
I'd like the next step in that stored procedure to be "Upload .CSV file to client SFTP site". How would I accomplish this?
Success!
I created a stored proc on my database named sp_UploadFileToSFTP, which uses PSFTP.EXE to send files.
-- ================================================================================================================================
-- Description: Upload File to SFTP
-- - Creates "SFTPUploadScript.bat" and "SFTPUploadScript.txt" scripting files to control psftp.exe, then executes the batch file
-- - Requires a local @ScriptFolder on the SQL Server to create files inside, and which contains psftp.exe
-- - "SFTPUploadScript.bat" needs to be run once manually from the SQL Server desktop to capture the keypair of the SFTP site
-- After this, the script will work automatically
-- ================================================================================================================================
CREATE PROCEDURE sp_UploadFileToSFTP
@FilePathToUpload varchar(1000),
@SFTPSiteAddress varchar(1000),
@SFTPLogin varchar(1000),
@SFTPPassword varchar(1000),
@SFTPRemoteFolder varchar(1000)
AS
BEGIN
SET NOCOUNT ON;
--Declare Variables
DECLARE @ScriptFolder varchar(1000)
SET @ScriptFolder = 'C:\SFTPBatchFiles\'
DECLARE @CommandString AS varchar(8000)
--Delete Existing files if they exist
SET @CommandString = 'del "' + @ScriptFolder + 'SFTPUploadScript.txt"'
EXEC master..xp_cmdshell @CommandString
SET @CommandString = 'del "' + @ScriptFolder + 'SFTPUploadScript.bat"'
EXEC master..xp_cmdshell @CommandString
--Create Batch file with login credentials
SET @CommandString = 'echo "' + @ScriptFolder + 'psftp.exe" ' + @SFTPSiteAddress + ' -l ' + @SFTPLogin + ' -pw ' + @SFTPPassword + ' -b "' + @ScriptFolder + 'SFTPUploadScript.txt" > "' + @ScriptFolder + 'SFTPUploadScript.bat"'
EXEC master..xp_cmdshell @CommandString
--Create SFTP upload script file
SET @CommandString = 'echo cd "' + @SFTPRemoteFolder + '" > "' + @ScriptFolder + 'SFTPUploadScript.txt"'
EXEC master..xp_cmdshell @CommandString
SET @CommandString = 'echo put "' + @FilePathToUpload + '" >> "' + @ScriptFolder + 'SFTPUploadScript.txt"'
EXEC master..xp_cmdshell @CommandString
--Run the Batch File
SET @CommandString = @ScriptFolder + 'SFTPUploadScript.bat'
EXEC master..xp_cmdshell @CommandString
END
GO
I can then call it from inside another stored proc like so:
sp_UploadFileToSFTP 'C:\FileToUpload\Test.txt','sftp.mysite.com','Login@domain.com','Password1234','UploadFolder/In here/'
All works perfectly, takes about a quarter of a second to run, a very neat and stable solution.
Compared to what I was doing before which was using SQL Server to drop a file into a folder, then checking that folder for new files every 15 minutes with my Visual Basic SFTP upload script, it's miles better :)
If you want to try this yourself, all you need is a folder on the C:\ drive of your SQL Server named "SFTPBatchFiles", and in there you put psftp.exe, which is part of PuTTY.
You will also need a trusting server admin who will allow you to run psftp commands and your own batch files using xp_cmdshell, and they'll probably need to open a route in your firewall so you can connect to your remote site directly from the SQL Server.
Finally, you will also need to be able to remote desktop into your SQL Server, since you'll need to run the batch file it creates manually once and press "Y" when psftp asks you to accept the new keypair. After that, it's all plain sailing.
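One possible refinement, assuming a reasonably recent PuTTY build: psftp accepts a -hostkey argument, so the generated batch file can pin the server's key fingerprint and skip that one-off interactive prompt. A sketch of the alternative echo line (the fingerprint is a placeholder):
--Hypothetical variant of the batch-file line in the procedure above, with the
--host key pinned (replace aa:bb:cc:dd:... with your server's fingerprint)
SET @CommandString = 'echo "' + @ScriptFolder + 'psftp.exe" ' + @SFTPSiteAddress + ' -l ' + @SFTPLogin + ' -pw ' + @SFTPPassword + ' -hostkey aa:bb:cc:dd:... -b "' + @ScriptFolder + 'SFTPUploadScript.txt" > "' + @ScriptFolder + 'SFTPUploadScript.bat"'
EXEC master..xp_cmdshell @CommandString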
No SSIS, no WINSCP, no Visual Studio, no hours upon hours of crashes and debugging!
The only issue right now is that if the FTP transfer fails for whatever reason, the stored procedure still reports success; I'm not bringing the psftp error messages back into SQL Server.
My next task will be to trap the status and error messages as they get generated from xp_cmdshell and give the user some detailed feedback and generate errors if the process fails.
UPDATE:
Trapping errors and feedback from the remote FTP site
--Run the Batch File
DROP TABLE IF EXISTS #CommandOutput
CREATE TABLE #CommandOutput (ReadLine varchar(1000))
SET @CommandString = @ScriptFolder + 'SFTPUploadScript.bat'
INSERT #CommandOutput
EXEC master..xp_cmdshell @CommandString
--Check for Invalid Password error
IF EXISTS (SELECT ReadLine
FROM #CommandOutput
WHERE ReadLine LIKE '%Invalid Password%')
BEGIN
PRINT 'Invalid Password Error!'
END
All of the output from the batch file (what would normally be shown on the screen if you were to run it manually) is now inserted line-by-line into a temporary table named #CommandOutput as the batch file runs.
After the batch file runs, you can then query #CommandOutput to see what's happened with your transfer.
I've added a simple check for the words "Invalid Password" anywhere in the output, as this is a common error you might have. You could add as many of these checks as you liked, or you could import the entire table into an email which gets sent to you every time the job runs, log the table to an FTP event logging table, or any number of other things.
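For example, a minimal sketch of the logging idea (the dbo.FTPUploadLog table here is hypothetical; you'd create something like it once in your database):
--Hypothetical log table:
--CREATE TABLE dbo.FTPUploadLog (LogID int IDENTITY PRIMARY KEY, LoggedAt datetime DEFAULT GETDATE(), ReadLine varchar(1000))
INSERT INTO dbo.FTPUploadLog (ReadLine)
SELECT ReadLine
FROM #CommandOutput
WHERE ReadLine IS NOT NULL;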
I am facing an issue while executing the BCP command on a remote server.
I have the following command in a batch file, which executes a script file.
The script file populates a temporary table and writes the records from the temporary table to a file.
SQLCMD -SQA_Server236 -dtestdb -Usa -PPassword1 -i"D:\script\Writing to Files\Write to CSV.sql" -o"D:\script\script_logs.log"
--script file contains...
declare @table NVARCHAR(255)
declare @filename VARCHAR(100)
set @filename = 'C:\TextFile\abcd.csv'
set @table = '##Indexes_Add'
IF OBJECT_ID('tempdb..##Indexes_Add') IS NOT NULL
BEGIN
DROP TABLE ##Indexes_Add
END
CREATE TABLE ##Indexes_Add
(
id int IDENTITY(1,1) PRIMARY KEY,
alter_command NVARCHAR(MAX),
successfully_readded BIT NULL
)
insert into ##Indexes_Add select 'a',0
insert into ##Indexes_Add select 'a',0
SET NOCOUNT ON;
IF OBJECT_ID('tempdb..' + @table) IS NOT NULL
BEGIN
DECLARE
@sql NVARCHAR(MAX),
@cols NVARCHAR(MAX) = N'';
SELECT @cols += ',' + name
FROM tempdb.sys.columns
WHERE [object_id] = OBJECT_ID('tempdb..' + @table)
ORDER BY column_id;
SELECT @cols = STUFF(@cols, 1, 1, '');
SET @sql = N'EXEC master..xp_cmdshell ''bcp "SELECT '''''
+ REPLACE(@cols, ',', ''''',''''') + ''''' UNION ALL SELECT '
+ 'RTRIM(' + REPLACE(@cols, ',', '),RTRIM(') + ') FROM '
+ 'tempdb.dbo.' + @table + '" queryout "' + @filename + '" -c -t, -SQA_Server236 -Usa -PPassword1''';
EXEC sp_executesql @sql;
print @sql
END
My problem:
When I run the above command in a batch file and give my local server name (anything other than "QA_Server236"), the file "abcd.csv" is created on my system. But when I give the server name as "QA_Server236", the file is created on the remote machine, i.e. QA_Server236. I want the file to be created on my system even when the given server is a remote one such as "QA_Server236".
Can anyone help me with this issue? I haven't found any way to do this.
If I'm right, BCP does not allow saving result sets and/or logs to remote machines: since bcp here runs via xp_cmdshell on the server, the output file lands on the server's file system. You could try to mount a folder on the remote PC to a shared folder on your local machine and set the output location there, or maybe try using a network path like \\[YOURPC]\...
As I'm not sure that works (actually, I think it won't), here's the only solution I can think of: add a line to your batch file which moves the file(s) from the remote machine to your PC (xcopy or similar) after BCP has finished executing. In the batch file, do this as "call bcp.exe [params]" instead of just "bcp.exe ...".
Hope this helps!
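If the share idea from the first paragraph is available to you, here is a sketch of how the tail end of "Write to CSV.sql" could push the file back (\\YOURPC\share is a placeholder, and the account xp_cmdshell runs under on QA_Server236 would need write access to it):
--Hypothetical final step for "Write to CSV.sql": copy the exported file from
--the server back to a share on your own machine
EXEC master..xp_cmdshell 'xcopy "C:\TextFile\abcd.csv" "\\YOURPC\share\" /Y';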
I want to write a SQL 2005 script to create a new login that uses Windows authentication. The Windows user is a local account (not a domain one). A local account with the same name exists on many SQL Server machines and I want to run the same script on all of them.
It seemed simple enough:
CREATE LOGIN [MyUser]
FROM WINDOWS
However, that doesn't work! SQL returns an error saying "Give the complete name: <domain\username>".
Of course, I can do that for one machine and it works, but the same script will not work on other machines.
Looks like sp_executesql is the answer, as beach posted. I'll post mine as well, because @@SERVERNAME doesn't work correctly if you use named SQL instances, as we do.
DECLARE @loginName SYSNAME
SET @loginName = CAST(SERVERPROPERTY('MachineName') AS SYSNAME) + '\MyUser'
IF NOT EXISTS (SELECT * FROM sys.server_principals WHERE [name] = @loginName)
BEGIN
DECLARE @sql NVARCHAR(1000)
SET @sql = 'CREATE LOGIN [' + @loginName + '] FROM WINDOWS'
EXEC sp_executesql @sql
END
This just worked for me:
DECLARE @sql nvarchar(1000)
SET @sql = 'CREATE LOGIN [' + @@SERVERNAME + '\MyUser] FROM WINDOWS'
EXEC sp_executesql @sql
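Note the caveat from the earlier answer, though: on a named instance @@SERVERNAME comes back as MACHINE\INSTANCE, which would make the concatenated login name invalid. A quick check you can run (a sketch):
-- On a default instance both values match; on a named instance only
-- SERVERPROPERTY('MachineName') is safe for building MACHINE\MyUser
SELECT @@SERVERNAME AS ServerName,
       SERVERPROPERTY('MachineName') AS MachineName;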