I am using the Flyway tool to run SQL scripts. I have a requirement where I need to return the SPID at run time during the Flyway run. Are there any config parameters or variables which can return the SPID?
Ideally I would like to run something from a SQL query or a SQL Agent job to FTP-upload a file to an external site, but I cannot use xp_cmdshell.
Yes. You need to split your work into two separate tasks:
How to run an executable or a batch program from within SQL Server without resorting to xp_cmdshell.
An example of how to do it can be found in:
https://www.mssqltips.com/sqlservertip/2014/replace-xpcmdshell-command-line-use-with-sql-server-agent/.
You should modify this example to suit your particular needs. The suggested stored procedure (see the sketch after this list) would:
run the command passed as a parameter in a SQL Agent job created on the fly (using the CmdExec subsystem)
wait for the job to complete (query msdb.dbo.sysjobactivity) or kill the job if a predefined timeout has been reached
return results of job execution (query msdb.dbo.sysjobhistory)
delete the job
Note: A full version of SQL Server is required. If you are using the Express edition, you would have to manually define a Windows scheduled task instead.
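A minimal T-SQL sketch of such a procedure, assuming SQL Agent runs locally; the procedure name, polling interval and parameter names are illustrative and not taken from the linked article:

-- Illustrative sketch only: create a throwaway job with a CmdExec step,
-- wait for it (or stop it on timeout), return its history, then drop it.
CREATE PROCEDURE dbo.usp_RunOsCommand
    @Command        nvarchar(3000),
    @TimeoutSeconds int = 300
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @JobName sysname;
    DECLARE @JobId   uniqueidentifier;
    DECLARE @Waited  int;
    SET @JobName = N'AdHocCmd_' + CONVERT(nvarchar(36), NEWID());
    SET @Waited  = 0;

    -- 1. Create the job on the fly with a single CmdExec step and start it
    EXEC msdb.dbo.sp_add_job       @job_name = @JobName, @job_id = @JobId OUTPUT;
    EXEC msdb.dbo.sp_add_jobstep   @job_id = @JobId, @step_name = N'run',
                                   @subsystem = N'CmdExec', @command = @Command;
    EXEC msdb.dbo.sp_add_jobserver @job_id = @JobId, @server_name = N'(LOCAL)';
    EXEC msdb.dbo.sp_start_job     @job_id = @JobId;

    -- Give the agent a moment to register the job activity
    WAITFOR DELAY '00:00:02';

    -- 2. Wait for completion or stop the job once the timeout is reached
    --    (a robust version should also handle the short delay before the
    --    agent writes the activity row for a freshly started job)
    WHILE @Waited < @TimeoutSeconds
          AND EXISTS (SELECT 1 FROM msdb.dbo.sysjobactivity
                      WHERE job_id = @JobId AND stop_execution_date IS NULL)
    BEGIN
        WAITFOR DELAY '00:00:05';
        SET @Waited = @Waited + 5;
    END;

    IF @Waited >= @TimeoutSeconds
        EXEC msdb.dbo.sp_stop_job @job_id = @JobId;

    -- 3. Return the results of the job execution (run_status 1 = succeeded)
    SELECT step_id, run_status, message
    FROM msdb.dbo.sysjobhistory
    WHERE job_id = @JobId;

    -- 4. Delete the job
    EXEC msdb.dbo.sp_delete_job @job_id = @JobId;
END;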
How to send a file via ftp using a batch program.
Please see:
How to ftp with a batch file?
I am trying to execute a perl script from xp_cmdshell.
The output of the perl script is a csv file, but when I run
EXEC master..xp_cmdshell N'perl G:\script\perl.pl';
I can't find the CSV file that was supposed to be created, even though the xp_cmdshell command seems to run fine and its output is the name of the file that should be created.
I am running this xp_cmdshell call from a SQL Agent job step to execute the perl script.
Any help would be appreciated.
Since you're running this via a SQL Agent job, it'll be much safer to disable the use of xp_cmdshell via sp_configure (ref1 | ref2) and use a CmdExec job step instead.
When configuring the job step, be sure to go to the advanced page and enable job step logging to a table.
This will allow you to better troubleshoot the issues you're having with the perl job in general, as the issue could be related to something entirely outside the context of the database engine.
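As a rough illustration of both suggestions (the step name is a placeholder, and "Log to table" still has to be ticked on the job step's Advanced page):

-- Disable xp_cmdshell server-wide via sp_configure
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 0;
RECONFIGURE;

-- Once the CmdExec step logs to a table, its output can be read back like this
SELECT s.step_name, l.date_created, l.[log]
FROM msdb.dbo.sysjobsteps s
JOIN msdb.dbo.sysjobstepslogs l ON l.step_uid = s.step_uid
WHERE s.step_name = N'Run perl export';   -- placeholder step name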
Does anyone know how to trigger the execution of a SSIS 2008 package while running a DTS 2000 package?
Actually, my DTS 2000 package has to be run as-is and cannot be converted into an SSIS 2008 package.
So is it possible to execute a shell command (DTUtil, etc.) to run this SSIS 2008 package?
Thanks for any feedback
There are two ways I can think of doing this.
Make the SSIS package execution a step in a SQL Agent job, and start that job from the DTS 2000 package by running the stored procedure sp_start_job
Run by executing xp_cmdshell:
EXEC xp_cmdshell 'dtexec /f "C:\Package.dtsx"'
Option two requires xp_cmdshell to be enabled first. xp_cmdshell allows you to issue operating system commands directly to the Windows command shell via T-SQL code, something I'm not entirely comfortable with, so I would go for option 1.
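For option 1, the call from the DTS 2000 package (e.g. from an Execute SQL task) could be as simple as this, where the job name is a placeholder and the job's single step runs the SSIS 2008 package:

EXEC msdb.dbo.sp_start_job @job_name = N'Run SSIS 2008 package';

Bear in mind that sp_start_job returns as soon as the job has been requested to start, so the DTS package will not wait for the SSIS package to finish.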
Some helpful links:
sp_start_job
Execute SSIS Package using the Stored Procedure in T-SQL.
Executing all SSIS packages in a folder: three methods
We are in the process of migrating from SQL 2000 to SQL 2005. We have hundreds of DTS packages that the development team is reluctant to redevelop using SSIS.
When migrating these packages to SSIS, I am faced with a problem - many of these packages read from Excel files.
Given that my production Box is 64 bit, I am forced to use CmdExec sub-system to call the 32 bit runtime to execute these packages.
My question here is : What are the security risks involved with using CmdExec subsystem to schedule these SSIS packages as SQL agent jobs?
Thanks,
Raj
Whatever account is running the job will potentially have access to run commands from the command line, so you need to think about how the job will be running and what permissions that account will have.
For example, if a user could create a job that would run under the context of your SQL Agent account and that account was overprivileged (e.g. had rights to change security), she could grant herself elevated privileges or damage your machine.
SQL 2008 introduced a switch for DTExec that allows you to run packages in 32-bit mode using the native SQL Agent job step type for SSIS. On the Execution options tab of the job step properties there is a check box for the 32-bit runtime, which translates to the "/X86" switch when looking at the command line view.
If you are stuck using SQL 2005 then the CMDEXEC option is the only one I know of.
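For what it's worth, a CmdExec step for the SQL 2005 case might look roughly like the following; the job name, package path and the 32-bit DTExec location are assumptions that depend on your installation:

EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Load Excel files',    -- placeholder job name
    @step_name = N'Run package via 32-bit DTExec',
    @subsystem = N'CmdExec',
    @command   = N'"C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\DTExec.exe" /f "D:\Packages\LoadExcel.dtsx"';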
xp_cmdshell is the biggest security risk in SQL Server because it allows a compromised SQL Server box to elevate the attack to the host operating system itself, and from there to the entire network.
The typical vector of attack is web site HTTP form -> SQL injection -> xp_cmdshell -> take over SQL hosting machine -> take over domain. If xp_cmdshell is shut down then the attacker has to find other means to elevate its attack from SQL to the host.
Other scenarios exist, like insiders using it to elevate their privileges, or using the cmdshell for other purposes, e.g. to steal a database. All are based on the fact that xp_cmdshell allows arbitrary commands to be executed on the host, and in some cases the commands executed also inherit the SQL Server service account privileges.
There are other commands and extended procedures that can be used by an attacker if xp_cmdshell is blocked, but they are far less known. The xp_cmdshell vector is in every SQL injection cheat sheet and forum discussion, so it is known by everyone and their grandma.
What is the best method for executing FTP commands from a SQL Server stored procedure? We currently use something like this:
EXEC master..xp_cmdshell 'ftp -n -s:d:\ftp\ftpscript.xmt 172.1.1.1'
The problem is that the command seems to succeed even if the FTP ended in error. Also, the use of xp_cmdshell requires special permissions and may leave room for security issues.
If you're running SQL 2005 you could do this in a CLR integration assembly and use the FTP classes in the System.Net namespace to build a simple FTP client.
You'd benefit from being able to trap and handle exceptions and reduce the security risk of having to use xp_cmdshell.
Just some thoughts.
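To give an idea of the T-SQL side of the CLR suggestion (the assembly, class and procedure names below are hypothetical; the actual FTP logic would live in the .NET code):

-- 'clr enabled' must be switched on, and the assembly needs EXTERNAL_ACCESS
-- (the database must be TRUSTWORTHY or the assembly signed) for outbound FTP
EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;

CREATE ASSEMBLY FtpClient
FROM 'C:\assemblies\FtpClient.dll'          -- hypothetical path
WITH PERMISSION_SET = EXTERNAL_ACCESS;

CREATE PROCEDURE dbo.usp_FtpUpload
    @Host       nvarchar(200),
    @User       nvarchar(100),
    @Password   nvarchar(100),
    @LocalFile  nvarchar(260),
    @RemotePath nvarchar(260)
AS EXTERNAL NAME FtpClient.[StoredProcedures].FtpUpload;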
Another possibility is to use DTS or Integration Services (DTS for SQL Server 7 or 2000, SSIS for 2005 or higher). Both are from Microsoft, included in the SQL Server installation (in Standard edition at least), have an FTP task, and are designed for import/export jobs from SQL Server.
If you need to do FTP from within the database, then I would go with a .NET assembly as Kevin suggested. That would provide the most control over the process, plus you would be able to log meaningful error messages to a table for reporting.
Another option would be to write a command-line app that reads the database for commands to run. You could then define a scheduled task to call that command-line app every minute, or whatever the polling period needs to be. That would be more secure than enabling CLR support on the database server.
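A sketch of what the polling contract for that command-line app could look like (table and column names are made up for illustration):

CREATE TABLE dbo.FtpCommandQueue (
    CommandId   int IDENTITY(1,1) PRIMARY KEY,
    LocalFile   nvarchar(260) NOT NULL,
    RemotePath  nvarchar(260) NOT NULL,
    QueuedAt    datetime      NOT NULL DEFAULT GETDATE(),
    ProcessedAt datetime      NULL,
    ResultCode  int           NULL      -- the app writes the FTP outcome back here
);

The app would pick up unprocessed rows on each run and write the real FTP result back, which also addresses the problem of xp_cmdshell reporting success even when the FTP transfer failed.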