I am not able to delete a .csv file using xp_cmdshell - sql-server

I am getting an "Access is denied" error when I try to delete the CSV file from a folder using xp_cmdshell. However, I can delete the .csv.gpg file successfully from the same location using xp_cmdshell.
My query is as follows:
--delete the csv file from local folder
SELECT @Delete2 = 'del ' + 'C:\Akshay\files\testfile.csv'
EXEC master..xp_cmdshell @Delete2

I would say you need to use DECLARE and SET, assuming you know what the file name is. I just tried the statement you have and it erred because of the SELECT.
Here is my solution with the information I have:
DECLARE @delete VARCHAR(50)
SET @delete = 'del B:\test.txt'
EXEC xp_cmdshell @delete
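If it still returns "Access is denied" after switching to DECLARE and SET, it helps to capture the output of xp_cmdshell so you can see the exact message; for a sysadmin caller the command runs as the SQL Server service account, which therefore needs delete permission on that folder. A minimal sketch, reusing the path from the question (the quotes around the path are my addition):
DECLARE @delete VARCHAR(200)
SET @delete = 'del "C:\Akshay\files\testfile.csv"'
DECLARE @output TABLE (line VARCHAR(1000))
INSERT INTO @output
EXEC master..xp_cmdshell @delete
--Show whatever cmd.exe printed, e.g. "Access is denied."
SELECT line FROM @output WHERE line IS NOT NULL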

Related

Azcopy move or remove specific files from SQL

I need to move files from one blob to another. Not copy. I need to move, meaning that when I move from A to B, the files go from A to B and nothing is left in A. This is basic, but not directly possible in Azure Blob Storage. Please let me know if it's possible. I am using it from SQL Server with AzCopy version 10.3.2.
Because of this, I need to copy the files from A to B and then remove them from A. There are 2 problems.
1) I only want certain files to go from A to B.
DECLARE @Program varchar(200) = 'C:\azcopy.exe'
DECLARE @Source varchar(max) = '"https://myblob.blob.core.windows.net/test/myfolder/*?SAS"'
DECLARE @Destination varchar(max) = '"https://myblob.blob.core.windows.net/test/archive?SAS"'
DECLARE @Cmd varchar(5000)
SELECT @Cmd = @Program +' cp '+ @Source +' '+ @Destination + ' --recursive'
PRINT @Cmd
EXECUTE master..xp_cmdshell @Cmd
So when I type myfolder/*, it takes all the files. When I try myfolder/*.pdf, it says:
failed to parse user input due to error: cannot use wildcards in the path section of the URL except in trailing "/*". If you wish to use * in your URL, manually encode it to %2A
When I try myfolder/%2A.pdf or myfolder/%2Apdf, it still gives the error.
2) When the copy does run, I get this message:
INFO: Failed to create one or more destination container(s). Your transfers may still succeed if the container already exists.
But the destination folder is already there, and in the log file it says:
RESPONSE Status: 403 This request is not authorized to perform this operation.
For azcopy version 10.3.2:
1. To copy specific files, like only copying .pdf files, add --include-pattern "*.pdf" to your command. Also remember to remove the wildcard * from the @Source variable, so @Source should be '"https://myblob.blob.core.windows.net/test/myfolder?SAS"'.
The completed command looks like this (please adapt it to your SQL cmd):
azcopy cp "https://xx.blob.core.windows.net/test1/folder1?sas" "https://xx.blob.core.windows.net/test1/archive1?sas" --include-pattern "*.pdf" --recursive=true
2. To delete specific blobs, like only deleting .pdf files, add --include-pattern "*.pdf" to your azcopy rm command in the same way.
Also, there is no move command in azcopy; you have to copy first, then delete. You can achieve this with the two commands above, as in the sketch below.
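Put together in T-SQL, a hedged sketch of the copy-then-delete "move", reusing the variables from the question (in practice you would want to check the output of the copy step before running the delete):
DECLARE @Program varchar(200) = 'C:\azcopy.exe'
DECLARE @Source varchar(max) = '"https://myblob.blob.core.windows.net/test/myfolder?SAS"'
DECLARE @Destination varchar(max) = '"https://myblob.blob.core.windows.net/test/archive?SAS"'
DECLARE @Cmd varchar(5000)
--Step 1: copy only the .pdf blobs (note: no wildcard in @Source)
SET @Cmd = @Program + ' cp ' + @Source + ' ' + @Destination + ' --include-pattern "*.pdf" --recursive=true'
EXECUTE master..xp_cmdshell @Cmd
--Step 2: remove the same blobs from the source to complete the "move"
SET @Cmd = @Program + ' rm ' + @Source + ' --include-pattern "*.pdf" --recursive=true'
EXECUTE master..xp_cmdshell @Cmd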

Is it possible to get SQL Server to log into an SFTP and upload a file just by using T-SQL commands, no SSIS?

I have an SSIS package which logs into my client's SFTP site and uploads a file using commands sent to WinSCP.
This package fails quite often and is difficult to work with. Visual Studio crashes so frequently that my team refuses to use it any more. We are migrating all automated tasks to Stored Procedures and SQL Agent Jobs, as they work a lot better than SSIS does.
However, this leaves us with no good solution for automating FTP uploads of report files produced on SQL Server.
I have been doing it using PSFTP and batch files called from VB scripts. The script checks a particular folder to see if a file is there and uploads it if it is. This works, but is inelegant and I have to schedule these scripts to look for a file in the folder every 15 minutes, rather than just call an "upload" command from my stored procedure on SQL Server. There's potential for things to go horribly wrong if someone renames a file in that folder, or puts two files in there, etc.
I can easily create a .CSV file in a particular folder as part of a stored procedure using BCP, e.g.:
--Create Client Update .CSV file if there are any rows in the CLIENT_UPDATE table
IF EXISTS(SELECT 1 FROM CLIENT_UPDATE)
BEGIN
DECLARE @ColumnHeader VARCHAR(8000)
DECLARE @FileName VARCHAR(500), @BCPCommand VARCHAR(8000)
--(@DataPath is assumed to be declared and set earlier in the procedure)
SET @FileName = @DataPath + '\Output File.csv'
SELECT @ColumnHeader = COALESCE(@ColumnHeader+',' ,'')+''''+column_name+'''' FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME='CLIENT_UPDATE'
SET @BCPCommand = 'bcp "SELECT '+ @ColumnHeader +' UNION ALL SELECT * FROM CLIENT_UPDATE" queryout "' + @FileName + '" -c -t , -r \n -S . -T'
EXEC master..xp_cmdshell @BCPCommand
END
I'd like the next step in that stored procedure to be "Upload .CSV file to client SFTP site". How would I accomplish this?
Success!
I created a stored proc on my database named sp_UploadFileToSFTP, which uses PSFTP.EXE to send files.
-- ================================================================================================================================
-- Description: Upload File to SFTP
-- - Creates "SFTPUploadScript.bat" and "SFTPUploadScript.txt" scripting files to control psftp.exe, then executes the batch file
-- - Requires a local @ScriptFolder on the SQL Server to create files inside, and which contains psftp.exe
-- - "SFTPUploadScript.bat" needs to be run once manually from the SQL Server desktop to capture the keypair of the SFTP site
-- After this, the script will work automatically
-- ================================================================================================================================
CREATE PROCEDURE sp_UploadFileToSFTP
@FilePathToUpload varchar(1000),
@SFTPSiteAddress varchar(1000),
@SFTPLogin varchar(1000),
@SFTPPassword varchar(1000),
@SFTPRemoteFolder varchar(1000)
AS
BEGIN
SET NOCOUNT ON;
--Declare Variables
DECLARE @ScriptFolder varchar(1000)
SET @ScriptFolder = 'C:\SFTPBatchFiles\'
DECLARE @CommandString AS varchar(8000)
--Delete existing files if they exist
SET @CommandString = 'del "'+ @ScriptFolder + 'SFTPUploadScript.txt"'
EXEC master..xp_cmdshell @CommandString
SET @CommandString = 'del "'+ @ScriptFolder + 'SFTPUploadScript.bat"'
EXEC master..xp_cmdshell @CommandString
--Create batch file with login credentials
SET @CommandString = 'echo "'+ @ScriptFolder +'psftp.exe" ' + @SFTPSiteAddress + ' -l ' + @SFTPLogin + ' -pw ' + @SFTPPassword + ' -b "' + @ScriptFolder + 'SFTPUploadScript.txt" > "' + @ScriptFolder + 'SFTPUploadScript.bat"'
EXEC master..xp_cmdshell @CommandString
--Create SFTP upload script file
SET @CommandString = 'echo cd "' + @SFTPRemoteFolder + '" > "' + @ScriptFolder + 'SFTPUploadScript.txt"'
EXEC master..xp_cmdshell @CommandString
SET @CommandString = 'echo put "' + @FilePathToUpload + '" >> "' + @ScriptFolder + 'SFTPUploadScript.txt"'
EXEC master..xp_cmdshell @CommandString
--Run the batch file
SET @CommandString = @ScriptFolder + 'SFTPUploadScript.bat'
EXEC master..xp_cmdshell @CommandString
END
GO
I can then call it from inside another stored proc like so:
sp_UploadFileToSFTP 'C:\FileToUpload\Test.txt','sftp.mysite.com','Login@domain.com','Password1234','UploadFolder/In here/'
It all works perfectly and takes about a quarter of a second to run; a very neat and stable solution.
Compared to what I was doing before, which was using SQL Server to drop a file into a folder and then checking that folder for new files every 15 minutes with my Visual Basic SFTP upload script, it's miles better :)
If you want to try this yourself, all you need is a folder on the C:\ drive of your SQL Server named "SFTPBatchFiles", and in there you put psftp.exe, which is part of PuTTY.
You will also need a trusting server admin who will allow you to run psftp commands and your own batch files using xp_cmdshell, and they'll probably need to open a route in your firewall so you can connect to your remote site directly from the SQL Server.
Finally, you will also need to be able to remote desktop into your SQL Server, since you'll need to run the batch file it creates manually once and press "Y" when psftp asks you to accept the new keypair. After that, it's all plain sailing.
No SSIS, no WINSCP, no Visual Studio, no hours upon hours of crashes and debugging!
The only issue right now is that if the FTP transfer fails for whatever reason, the stored procedure still reports success, since I'm not bringing the psftp error messages back into SQL Server.
My next task will be to trap the status and error messages as they get generated from xp_cmdshell and give the user some detailed feedback and generate errors if the process fails.
UPDATE:
Trapping errors and feedback from the remote FTP site
--Run the Batch File
DROP TABLE IF EXISTS #CommandOutput
CREATE TABLE #CommandOutput (ReadLine varchar(1000))
SET @CommandString = @ScriptFolder + 'SFTPUploadScript.bat'
INSERT #CommandOutput
EXEC master..xp_cmdshell @CommandString
--Check for Invalid Password error
IF EXISTS (SELECT ReadLine
FROM #CommandOutput
WHERE ReadLine LIKE '%Invalid Password%')
BEGIN
PRINT 'Invalid Password Error!'
END
All of the output from the batch file (what would normally be shown on the screen if you were to run it manually) is now inserted line-by-line into a temporary table named #CommandOutput as the batch file runs.
After the batch file runs, you can then query #CommandOutput to see what's happened with your transfer.
I've added a simple check for the words "Invalid Password" anywhere in the output, as this is a common error you might hit. You could add as many of these checks as you like, or you could import the entire table into an email which gets sent to you every time the job runs, log the table to an FTP event-logging table, or any number of other things.
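If you want the calling procedure or SQL Agent job to actually fail rather than just print a message, one option is to raise a real error when the output contains a known failure string. A minimal sketch, assuming SQL Server 2012 or later for THROW:
--Raise an error if the output shows the login was rejected
IF EXISTS (SELECT 1 FROM #CommandOutput WHERE ReadLine LIKE '%Invalid Password%')
BEGIN
DECLARE @ErrorMessage nvarchar(2048)
SELECT TOP (1) @ErrorMessage = 'SFTP upload failed: ' + ReadLine
FROM #CommandOutput
WHERE ReadLine LIKE '%Invalid Password%';
THROW 50001, @ErrorMessage, 1;
END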

Schedule importing flat files with different names into SQL Server 2014

I am a beginner in SQL Server and my scripting is not very polished yet, so I need suggestions on the issue below.
I receive files from a remote server to my machine (around 700/day), named as follows:
ABCD.100.1601310200
ABCD.101.1601310210
ABCD.102.1601310215
Naming Convention:
The first part 'ABCD' remains the same, the middle part is a sequence ID that increments for every file, and the last part is a timestamp.
File structure
The files do not have any specific extension but can be opened with Notepad/Excel, so they can be treated as flat files. Each file consists of 95 columns and 20,000 rows, with some garbage values in the top 4 and bottom 4 rows of column 1.
Now I need to build a database in SQL Server into which I can import the data from these flat files using a scheduler. Suggestions needed.
There are probably other ways of doing this, but this is one way:
Create a format file for your tables. You only need to create it once. Use this file in the import script in step 2.
Create an import script based on OPENROWSET(BULK '<file_name>', FORMATFILE='<format_file>')
Schedule the script from step 2 in SQL Server Agent to run against the database you want the data imported into (see the scheduling sketch after the import script)
Create the format file
The following script creates a format file in C:\Temp\imp.fmt, based on an existing table, to be used in the next step (replace TEST_TT with the database you are importing to). It uses a comma as the field separator; if the files are tab-separated, remove the -t, switch.
DECLARE @cmd VARCHAR(8000);
SET @cmd='BCP TEST_TT.dbo.[ABCD.100.1601310200] format nul -f "C:\Temp\imp.fmt" -c -t, -T -S ' + (SELECT @@SERVERNAME);
EXEC master..xp_cmdshell @cmd;
Before executing this, you will need to reconfigure SQL Server to allow the xp_cmdshell stored procedure. You only need to do this once.
EXEC sp_configure 'show advanced options', 1
GO
RECONFIGURE
GO
EXEC sp_configure 'xp_cmdshell', 1
GO
RECONFIGURE
GO
Import script
This script assumes:
The files need to be imported to separate tables
The files are located in C:\Temp
The format file is C:\Temp\imp.fmt (generated in the previous step)
SET NOCOUNT ON;
DECLARE @store_path VARCHAR(256)='C:\Temp';
DECLARE @files TABLE(fn NVARCHAR(256));
DECLARE @list_cmd VARCHAR(256)='DIR ' + @store_path + '\ABCD.* /B';
INSERT INTO @files EXEC master..xp_cmdshell @list_cmd;
DECLARE @fullcmd NVARCHAR(MAX);
SET @fullcmd=(
SELECT
'IF OBJECT_ID('''+QUOTENAME(fn)+''',''U'') IS NOT NULL DROP TABLE '+QUOTENAME(fn)+';'+
'SELECT * INTO '+QUOTENAME(fn)+' '+
'FROM OPENROWSET(BULK '''+@store_path+'\'+fn+''',FORMATFILE=''C:\Temp\imp.fmt'') AS tt;'
FROM
@files
WHERE
fn IS NOT NULL
FOR XML PATH('')
);
EXEC sp_executesql @fullcmd;
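For step 3, a minimal scheduling sketch using SQL Server Agent. This assumes you have wrapped the import script above in a stored procedure; the procedure name dbo.usp_ImportFlatFiles and the hourly schedule are my assumptions, so adjust both to suit:
USE msdb;
GO
EXEC dbo.sp_add_job @job_name = N'ImportFlatFiles';
EXEC dbo.sp_add_jobstep @job_name = N'ImportFlatFiles',
    @step_name = N'Run import script',
    @subsystem = N'TSQL',
    @database_name = N'TEST_TT',
    @command = N'EXEC dbo.usp_ImportFlatFiles;';
--freq_type 4 = daily, freq_subday_type 8 = hours: run every 1 hour
EXEC dbo.sp_add_schedule @schedule_name = N'Hourly',
    @freq_type = 4, @freq_interval = 1,
    @freq_subday_type = 8, @freq_subday_interval = 1;
EXEC dbo.sp_attach_schedule @job_name = N'ImportFlatFiles', @schedule_name = N'Hourly';
EXEC dbo.sp_add_jobserver @job_name = N'ImportFlatFiles';
GO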

Can't save CSV file to folder using the BCP job

I can't get my script to save my file to a specified folder that I have permissions to write to. Here is the script:
DECLARE @sql VARCHAR (8000)
SET @sql = 'bcp SELECT * FROM LMP_DAILY.dbo.vw_DirDailyActual ORDER by Date, DirCode queryout
T:\Shared information\SD3\dirdailyactual.csv -T -c';
PRINT @sql
EXEC master..xp_cmdshell @sql;
I have sqladmin privileges on the DB and the server. I also have read/write permissions on the path where I'm trying to send the file.
The job runs and completes, but no file is being created at the specified path.

SQL Server 2012 -- xp_fileexist returns 0 when checking for file on local machine

The SQL Server instance is on the same machine I am physically logged into, and xp_fileexist fails to recognize any files on the D: drive, which is not a network drive. I have already configured xp_cmdshell and restarted the SQL Server instance. Any other ideas?
Yup, we had the same problem. On SQL Server 2008, our legacy xp_fileexist code worked fine, but on SQL Server 2012... nope.
It would work if we ran the xp_fileexist command as ourselves (with Admin rights) but not when we ran it as a SQL Server user, who didn't exist as an Active Directory user. Even if we changed the security on that folder to give Everyone full permissions, the xp_fileexist would fail, always returning a 0, as if a file within that folder didn't exist.
However, what did work was to use dir from within a Stored Procedure, and test if the file existed that way. (Yeah, I know... I'm rolling my eyes myself... this is dodgy..)
Here's the stored procedure I wrote, based on suggestions on this site:
CREATE PROCEDURE [dbo].[DoesFileExist]
(
@directoryName NVARCHAR(500),
@filename NVARCHAR(500)
)
AS
BEGIN
-- Does a file exist in a particular folder?
--
-- EXEC [dbo].[DoesFileExist] 'D:\MyFiles', 'SomeExcelFile.xls'
--
DECLARE @bFileExists INT
DECLARE @cmd nvarchar(300);
SELECT @cmd = 'dir ' + @directoryName + '\' + @filename;
DECLARE @dir TABLE ([output] varchar( 2000 ))
INSERT INTO @dir
EXEC master.dbo.xp_cmdshell @cmd;
-- Uncomment the following line, if you want to see what
-- a "dir" looks like from SQL Server !!
-- SELECT * FROM @dir
IF EXISTS(SELECT * FROM @dir WHERE [output] LIKE '%' + @filename + '%' )
BEGIN
-- File *was* found in this folder
SET @bFileExists = 1
END
ELSE
BEGIN
-- File was NOT found in this folder
SET @bFileExists = 0
END
SELECT @bFileExists
END
You can call this SP simply by passing it a folder name and a filename:
EXEC [dbo].[DoesFileExist] 'D:\MyFiles', 'SomeExcelFile.xls'
And yes, strangely, it does seem to work, for the SQL Server users who can't use xp_fileexist.
Remember that, to use this code, your SQL Server user must have permission to use xp_cmdshell:
GRANT EXECUTE ON xp_cmdshell TO YourSQLServerUser
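Note that for a login that isn't a sysadmin, granting EXECUTE is only part of the story: xp_cmdshell also needs a proxy account to run under for non-sysadmin callers, configured with sp_xp_cmdshell_proxy_account. A minimal sketch (the Windows account name and password are placeholders):
EXEC sp_xp_cmdshell_proxy_account 'DOMAIN\WindowsAccount', 'StrongPasswordHere';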