I was wondering if it is possible for SQL Server to check a directory for files and run a stored procedure. I did some research and found this, but I am wondering if there is a way to do what I want WITHOUT SSIS.
EDIT: After reading my post, I realized I should have been more specific. Is there a way to AUTOMATICALLY or set SQL Server to check for files in a directory and run a stored procedure?
You can use xp_cmdshell to run file-related commands. To get a directory listing:
exec xp_cmdshell 'dir *.csv';
You can also use bulk insert to load a file from disk into a table and take actions based on the loaded contents.
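A minimal sketch of that idea, assuming hypothetical paths, table names, and a procedure called dbo.usp_ProcessOrders (none of these are from the original post):

-- Capture the directory listing into a table variable (all names are hypothetical).
DECLARE @files TABLE (fname nvarchar(260));
INSERT INTO @files (fname)
EXEC xp_cmdshell 'dir /b C:\Inbox\*.csv';

-- If the expected file is there, bulk load it and react to its contents.
IF EXISTS (SELECT 1 FROM @files WHERE fname = 'orders.csv')
BEGIN
    BULK INSERT dbo.OrdersStaging
    FROM 'C:\Inbox\orders.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

    EXEC dbo.usp_ProcessOrders;  -- placeholder for the stored procedure you want to run
END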
Normally you'd use the File Watcher Task with SSIS. But you can also use SQL Server Agent to schedule a task for periodic execution, schedule a task with Windows Task Scheduler, or use sp_procoption to configure a stored procedure that runs at startup and pauses (using WAITFOR) between processing passes.
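A rough sketch of the startup-procedure variant (the procedure names and the five-minute delay are assumptions, not anything from the original answer):

-- Hypothetical polling procedure: loops forever, pausing between passes.
-- Note: startup procedures registered with sp_procoption must live in master.
CREATE PROCEDURE dbo.usp_PollInbox
AS
BEGIN
    WHILE 1 = 1
    BEGIN
        EXEC dbo.usp_CheckInboxAndProcess;   -- placeholder for the directory check + processing
        WAITFOR DELAY '00:05:00';            -- pause five minutes between passes
    END
END;
GO

EXEC sp_procoption @ProcName = N'dbo.usp_PollInbox',
                   @OptionName = 'startup',
                   @OptionValue = 'on';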
Related
Curious if this is feasible. I am currently in the process of building a number of SQL Server Agent jobs. These jobs are just SFTP jobs that pass files between the servers of two different clients, making a pit stop at my local server for some pre- and post-processing. Yes, from one standpoint this setup is unnecessarily complicated, but it is necessary from a security standpoint. All of these jobs have an identical structure:
1. SFTP a file from client1server to the local server.
2. Run an executable on the file.
3. SFTP the processed file to client2server.
4. Wait a predetermined amount of time so that client2 can perform their query on the input.
5. SFTP the response file from client2server to the local server.
6. Run a second executable on the file.
7. SFTP the processed response file back to client1server.
Pretty straightforward.
There are only a handful of values that change between each job:
- Input/output file path on client1server
- Input/output file path on client2server
- Directory on local server
These jobs are not complicated, so if necessary I can just create them all by hand. It seems like an unnecessary amount of work, though. I had the thought that maybe I could create a stored procedure that generates the SQL script that creates the job, and that stored procedure could simply accept the variables that change from job to job. Is this feasible?
Broadly, here's what I'm thinking:
CREATE PROCEDURE create_ftp_interface_job
    @client1input_fp  nvarchar(100),
    @client1output_fp nvarchar(100),
    @client2input_fp  nvarchar(100),
    @client2output_fp nvarchar(100),
    -- etc...
AS
    <SQL script for creating the SQL Server Agent job, with parameters inserted>
GO
I've tried an early version of this, and I seem to be having trouble referencing the variables I declare in the stored procedure definition inside the SQL Server Agent job script. I came here to ask whether what I'm attempting is feasible and I just have a run-of-the-mill reference error, or whether what I'm doing is not allowed.
You can use msdb.dbo.sp_add_job, msdb.dbo.sp_add_jobstep, msdb.dbo.sp_update_job, msdb.dbo.sp_add_jobschedule and msdb.dbo.sp_add_jobserver to create and manage SQL Agent jobs programmatically.
In fact, there are even more stored procedures here that relate to managing SQL Agent jobs.
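As a rough sketch of how the procedure described in the question could be wired up (the job name, step, and SFTP command line below are illustrative assumptions, not a complete solution):

CREATE PROCEDURE dbo.create_ftp_interface_job
    @job_name        sysname,
    @client1input_fp nvarchar(100),
    @client2input_fp nvarchar(100)
AS
BEGIN
    -- Parameters cannot be referenced inside a literal command string, so
    -- build the step command by concatenating them into a variable first.
    DECLARE @cmd nvarchar(4000) =
        N'sftp_transfer.exe "' + @client1input_fp + N'" "' + @client2input_fp + N'"';  -- hypothetical executable

    EXEC msdb.dbo.sp_add_job       @job_name = @job_name;
    EXEC msdb.dbo.sp_add_jobstep   @job_name  = @job_name,
                                   @step_name = N'Transfer file',
                                   @subsystem = N'CmdExec',
                                   @command   = @cmd;
    EXEC msdb.dbo.sp_add_jobserver @job_name = @job_name;   -- target the local server
END;
GO

If your early version embedded the whole job-creation script in one big literal string, that would explain the reference error: the procedure's parameters have to be concatenated into the @command text (or passed through sp_executesql) rather than referenced inside the literal.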
System Error: Cannot bulk load because the file "XYZ.txt" could not be opened. Operating system error code 1311(There are currently no logon servers available to service the logon request.)
I have a stored procedure in SQL Server 2008 R2 that uses the BULK INSERT command to load data from txt files into SQL Server tables. These files are in a shared folder on a drive located on a different domain. I have full access to the drive. I tried copying files to a different directory on that drive, moving files, deleting files, and everything works.
When I execute the stored procedure from an SSMS session on my local computer, it works like a champ. It is able to open the files on the shared drive, read them, and load the data into the SQL Server tables without any issue. When I call the stored procedure from a SQL Server Agent job, it throws the error above.
SQL Server Agent is using an account that is very powerful, with a lot more permissions than mine, but the job fails.
To find a workaround, I created an SSIS package that calls the stored procedure from an "Execute SQL Task". It uses Windows authentication to connect to the database. I tried executing the package and it ran successfully; it is able to upload the data from the txt file to the table.
So then I created a SQL Server user with my account details, and used those credentials to create a proxy with the SSIS subsystem. I then scheduled the job to execute the step with the newly created proxy to see if it could upload the data, but it failed with the same error.
I am confused about what I am doing wrong. I even added myself to the bulkadmin role and ran the job again with no success.
I'd appreciate it if someone can help.
Thanks.
Just out of curiosity, I tried replacing the BULK INSERT command with BCP, and for some reason BCP worked. It is able to open the files on the network drive and read through them to insert the data into the SQL Server tables. I can even call the same stored proc from the SQL Agent job and it works perfectly fine. I didn't need the SSIS package to solve this.
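For anyone curious, a sketch of what such a BCP call could look like (the server, database, table, and share names here are placeholders, not the poster's actual values):

-- Hypothetical bcp import of the network text file, run through xp_cmdshell.
-- -T uses a trusted (Windows) connection, -c character mode, -t sets the field terminator.
EXEC xp_cmdshell 'bcp MyDb.dbo.StagingTable in "\\otherdomain\share\XYZ.txt" -S MYSERVER -T -c -t"\t"';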
I want to run a batch file from a SQL Agent job without using exec xp_cmdshell.
Any idea?
Thanks
You could use a SQL Server Agent job with a CmdExec job step; otherwise I cannot think of a way to do it without xp_cmdshell.
Take a look at this
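For example, a minimal sketch of a CmdExec job step (the job name and batch file path are made-up placeholders):

-- Add a CmdExec step to an existing Agent job; no xp_cmdshell involved.
EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'RunMyBatch',             -- hypothetical job
     @step_name = N'Run batch file',
     @subsystem = N'CmdExec',
     @command   = N'D:\Scripts\MyTask.bat';  -- hypothetical path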
Worth mentioning that you can also leverage SQLCLR.
Example: CLR Stored procedure to execute command
Some other googlable threads:
How to execute a DOS command when xp_cmdshell is disabled in SQL Server
Executing an external process() in SQLCLR Project
Such an approach introduces severe risks, such as memory leaks and crashes of the hosting .NET AppDomain.
Therefore another link: Security in the CLR World Inside SQL Server
Instead of running a batch file, I created a PowerShell script and ran it from a SQL Agent job. It satisfied my requirement and resolved my issue.
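In case it helps, the same idea can be scripted as a PowerShell job step in T-SQL (the job name and script path below are just placeholders):

-- Add a PowerShell step to an Agent job instead of calling xp_cmdshell.
EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'RunMyScript',                -- hypothetical job
     @step_name = N'Run PowerShell script',
     @subsystem = N'PowerShell',
     @command   = N'& "D:\Scripts\MyTask.ps1"';  -- hypothetical path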
The batch file should be on a drive other than the C drive, to avoid trouble.
Add execute, read and write permissions on that folder for the user that runs the batch file. To get the username, run this query: EXEC master..xp_cmdshell 'whoami' and take the name after the \ sign, for example "nt service\mssqlserver". Then grant the permissions to that user (mssqlserver in this example).
Finally, make sure you put the batch file on the same server where you execute your job.
I have an ogr2ogr batch file that reprojects SQL data into a new SQL Server table.
It works fine when I run the bat file manually, but it fails if I run the bat file via a SQL Server stored procedure. I have given the GDAL folders permissions for the SQL Server service account, and xp_cmdshell is also enabled. I'm using
EXECUTE xp_CMDShell 'blah'
in the T-SQL script.
For some reason the ogr_MSSQLSpatial.dll causes it to fail.
ERROR 1: Can't load requested DLL: Z:\BroadSpectrumSQLTreeExtract\ogr2ogr\gdalplugins\ogr_MSSQLSpatial.dll
If I remove this DLL, the script runs via SQL, but it means I need to add extra commands for things the DLL would normally take care of, such as setting the source coordinate system. I haven't managed to get it working 100%; the furthest I got was producing the reprojected table, but the geometry field is empty.
The DLL does contain SQL commands that query the system tables. Could this be a SQL Server security issue stopping it from working?
I ran into this problem again with another ogr2ogr bat file executed via SQL. If I put the bat file in the same folder as the DLLs, it works fine.
Ideally I would like to run something from a SQL query or SQL Agent job to FTP-upload a file to an external site, but I cannot use xp_cmdshell.
Yes. You need to split your work into two separate tasks:
How to run an executable or a batch program from within SQL Server without resorting to xp_cmdshell.
An example of how to do it can be found in:
https://www.mssqltips.com/sqlservertip/2014/replace-xpcmdshell-command-line-use-with-sql-server-agent/.
You should modify this example to suit your particular needs. The suggested stored procedure would (a condensed sketch follows the note below):
- run the command passed as a parameter in a SQL Agent job created on the fly (using the CmdExec subsystem)
- wait for the job to complete (query msdb.dbo.sysjobactivity), or kill the job if a predefined timeout has been reached
- return the results of the job execution (query msdb.dbo.sysjobhistory)
- delete the job
Note: the full version of SQL Server is required (SQL Server Agent is not available in Express). If you are using the Express edition, you would have to define a Windows scheduled task manually instead.
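A condensed sketch of that pattern, with hypothetical names and a hard-coded command for brevity (the linked mssqltips article has the full, parameterised version):

-- Create a throwaway job that runs one CmdExec step, start it, then let it clean itself up.
DECLARE @job sysname = N'OneOffCmd_' + CONVERT(nvarchar(36), NEWID());

EXEC msdb.dbo.sp_add_job       @job_name = @job, @delete_level = 3;           -- 3 = delete the job when it finishes
EXEC msdb.dbo.sp_add_jobstep   @job_name  = @job,
                               @step_name = N'Run command',
                               @subsystem = N'CmdExec',
                               @command   = N'D:\Scripts\upload.bat';         -- hypothetical batch file
EXEC msdb.dbo.sp_add_jobserver @job_name = @job;
EXEC msdb.dbo.sp_start_job     @job_name = @job;

-- Poll msdb.dbo.sysjobactivity / msdb.dbo.sysjobhistory here to wait for completion,
-- pick up the outcome, and enforce your timeout.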
How to send a file via ftp using a batch program.
Please see:
How to ftp with a batch file?