Is there any way to have a package (which will be a wrapper) run every minute in SQL Server Agent, even if it is already running from a previous execution? It seems SQL Server Agent does not launch a job that is already running. Is there a way to override this behaviour?
I wanted to do something like this:
Wrapper.dtsx
--> read from table of packages to run, and select next in line
--> execute package task with the package dynamically set from the previous selection
--> exit
i.e. the table has the following packages (assume some ranking will exist eventually):
a.dtsx (say runs for 5 mins)
b.dtsx (say runs for 4 mins)
c.dtsx (say runs for 6 mins)
12:01 am a.dtsx is executed
12:02 am b.dtsx is executed
12:03 am c.dtsx is executed
At the moment I can only get the following to occur:
12:01 am a.dtsx is executed
12:06 am b.dtsx is executed
12:10 am c.dtsx is executed
Hm, this SQL Agent job behavior is standard for MS SQL Server and cannot be altered. For your situation, if you are on SQL Server 2012 or higher, you can use the SSIS Catalog with asynchronous execution. With this, your job will start the package execution and quit, so you are free to start it again a minute later. The disadvantage is that the job status will only show whether the package was started and nothing about its outcome; you have to monitor execution yourself.
Switching to async package execution requires SSIS 2012+, setting up the SSIS Catalog database, and switching your packages to the project deployment model. Once that is done, create a SQL Agent job to start the package, specify all needed connections and parameters, and save it. Then, from the job's context menu, select Script Job as -> DROP and CREATE to -> New Query Editor Window. In the query text, locate the substring
/Par "\"$ServerOption::SYNCHRONIZED(Boolean)\"";True
and switch it to
/Par "\"$ServerOption::SYNCHRONIZED(Boolean)\"";False
Then run the script to update your job.
This script manipulation is needed because, by default, a SQL Agent job executes the package synchronously and does not expose the async option in the user interface.
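For reference, the same asynchronous start can also be issued directly from T-SQL against the catalog. A minimal sketch, where the folder, project and package names (MyFolder, MyProject, Wrapper.dtsx) are placeholders for your own:

DECLARE @execution_id BIGINT;

-- Create an execution for the catalog package
EXEC SSISDB.catalog.create_execution
    @folder_name = N'MyFolder',
    @project_name = N'MyProject',
    @package_name = N'Wrapper.dtsx',
    @use32bitruntime = 0,
    @execution_id = @execution_id OUTPUT;

-- SYNCHRONIZED = 0 makes start_execution return immediately (asynchronous)
EXEC SSISDB.catalog.set_execution_parameter_value
    @execution_id,
    @object_type = 50,            -- system parameter
    @parameter_name = N'SYNCHRONIZED',
    @parameter_value = 0;

EXEC SSISDB.catalog.start_execution @execution_id;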
Related
I have two SSIS jobs that are stuck in the 'Created' execution status. They cannot run since the package version has changed (I get Error Msg 27150: The version of the project has changed since the instance of the execution has been created. Create a new execution instance and try again.) I do not want to run these anymore, just delete them.
How can I remove these from the execution log? catalog.stop_operation does not work since there is no active operation for this job.
Note: the job does not appear in Active Operations, since it never started.
SSISDB keeps track of all operations that are currently active/executing. To retrieve a list of all active operations, right-click SSISDB and choose Active Operations.
You can then click the Stop button located at the bottom right of the window.
It's also possible to do the same via T-SQL. You can stop a package by calling the stored procedure catalog.stop_operation, passing the operation ID as a parameter.
Use this query to retrieve all currently running packages in the SSIS Catalog and their IDs:
SELECT * FROM SSISDB.catalog.executions WHERE end_time IS NULL
The statement below stops the execution of the SSIS package with operation_id=65
EXEC SSISDB.catalog.stop_operation @operation_id = 65
I am dealing with this problem on several Windows Server 2019 (Core) machines, each running one SQL Server 2019 CU4 instance.
What we try to do
We are currently building a data warehouse with distributed databases. The individual layers of the DWH are located on one database server each. The data exchange between the layers/servers takes place via SSIS ETLs, which use linked servers to reach the other layers and pull data across. Each layer also has its own SSIS instance and executes the corresponding SSIS packages.
The SSIS packages are called by SQL Server Agent jobs. We have a job that executes the SSIS packages (#1), which in turn calls another job (#2) as its last step; after a short wait, job #2 starts the calling job (#1) again. Thus, controlled by schedules, a loop is created and data is continuously transferred by the ETLs.
I hope this was not too much unnecessary background.
The error
Basically the job runs and there are numerous successful executions. However, we are observing interruptions of job #1 without helpful information about the error. The job history log refers to the SSIS log, which in turn only contains an "unexpected termination". In the SSIS log, we only see behavior indicating that the ETL package active at that time stopped after validation. Depending on the log level, nothing is logged at all, not even the execution of individual packages of the project. The package where this error occurs differs each time and is not limited to a specific one.
What I have already tried
Re-created the jobs and SSIS environments by hand (scripted before)
Used the 32-bit runtime
Upgraded the SSIS project/package version to 2019
Increased the log level to "verbose"
Patched the SQL Server to CU4
Looked for SSIS dump files (couldn't find them or they weren't created)
Searched the Windows and SQL Server log files
Does anyone have suggestions or ideas on how to obtain more error-specific information?
Thank you very much and take care :)
UPDATE: We have an error message (OLE DB 0xC0202009 and 0x80004005)!
In order to exclude the use of environments as a cause, I manually set the parameters in the SSIS job step instead of overwriting them by selecting an environment.
Long story short: Today it turns out that the parameter for an OLE DB Connection String is not passed correctly.
The following is specified as a parameter in the job step:
However, the following connection string is specified in the context of the error message:
Please note that some arguments are added twice to the parameter (red).
What could have caused that?
I have an SSIS package which imports data into a SQL Server 2008 database. I have set up a scheduled job in SQL Server Agent to run that package. When I check the history, I can only see whether the job ran successfully or not; I cannot see any other messages.
I would like to know how many records are imported whenever the job is executed. How can I monitor that? Should I use additional components in the SSIS package or change some configuration in the SQL Server Agent job setup?
I found some logging facilities in the SQL Server Agent job setup, but I am not sure whether they can fulfill my requirements.
If you are just interested in knowing the rows processed and do not need the info for further use, one possible option is making use of the SSIS logging feature. Here is how it works for data flow tasks.
Click on the SSIS package.
On the menus, select SSIS --> Logging...
On the Configure SSIS Logs: dialog, select the provider type and click Add. I have chosen SQL Server for this example. Check the Name checkbox and provide the data source under Configuration column. Here SQLServer is the name of the connection manager. SSIS will create a table named dbo.sysssislog and stored procedure dbo.sp_ssis_addlogentry in the database that you selected. Refer screenshot #1 below.
If you need the rows processed, select the checkbox OnInformation. Here in the example, the package executed successfully so the log records were found under OnInformation. You may need to fine tune this event selection according to your requirements. Refer screenshot #2 below.
Here is a sample package execution within data flow task. Refer screenshot #3 below.
Here is a sample output of the log table dbo.sysssislog. I have only displayed the columns id and message. There are many other columns in the table. In the query, I am filtering the output only for the package named 'Package1' and the event 'OnInformation'. You can notice that records with ids 7, 14 and 15 contain the rows processed. Refer screenshot #4 below.
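For reference, a query of the shape described above might look like the sketch below ('Package1' is the sample package name, and dbo.sysssislog lives in the database chosen for the SQL Server log provider):

SELECT id, message
FROM dbo.sysssislog
WHERE source = 'Package1'
  AND event = 'OnInformation'
ORDER BY id;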
Hope that helps.
Screenshot #1:
Screenshot #2:
Screenshot #3:
Screenshot #4:
Use the procedure below to get SSIS errors for a given execution ID:
CREATE PROCEDURE [dbo].[get_ssis_status] @EXECUTION_ID INT
AS
BEGIN
SELECT o.operation_id EXECUTION_ID
,convert(datetimeoffset,OM.message_time,109) TIME
,D.message_source_desc ERROR_SOURCE
,OM.message ERROR_MESSAGE
,CASE ex.STATUS
WHEN 4 THEN 'Package Failed'
WHEN 7 THEN CASE EM.message_type
WHEN 120 THEN 'package failed'
WHEN 130 THEN 'package failed' ELSE 'Package Succeeded' END
END AS STATUS
FROM SSISDB.CATALOG.operation_messages AS OM
INNER JOIN SSISDB.CATALOG.operations AS O ON O.operation_id = OM.operation_id
INNER JOIN SSISDB.CATALOG.executions AS EX ON o.operation_id = ex.execution_id
INNER JOIN (VALUES (- 1,'Unknown'),(120,'Error'),(110,'Warning'),(130,'TaskFailed')) EM(message_type, message_desc) ON EM.message_type = OM.message_type
INNER JOIN (VALUES
(10,'Entry APIs, such as T-SQL and CLR Stored procedures')
,(20,'External process used to run package (ISServerExec.exe)')
,(30,'Package-level objects')
,(40,'Control Flow tasks')
,(50,'Control Flow containers')
,(60,'Data Flow task')
) D(message_source_type, message_source_desc) ON D.message_source_type = OM.message_source_type
WHERE ex.execution_id = @EXECUTION_ID
AND OM.message_type IN (120,130,-1);
END
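A sample call, where the execution ID is just a placeholder (use the execution_id reported by the SSIS Catalog):

EXEC dbo.get_ssis_status @EXECUTION_ID = 12345;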
Here's another approach for when SQL Server job history is not showing output from SSIS packages: use DTEXEC command lines.
(Upside: this approach puts the job's output where anyone else supporting it would expect to find it: in job history.
Downside for big packages: if you have a long SSIS package, with lots of tasks or components, and lots of output, then the job history will split the package output into many lines of job history, making the approach in the previous answer (logging to a table) easier to read.)
To show SSIS package output in the job's View History:
(1) Change the job steps from type "SQL Server Integration Services Package" to "Operating system (CmdExec)".
(2) Use DTEXEC command lines to execute the packages.
Example of command line:
DTExec /DTS "\MSDB\myPkgName" /DECRYPT pkgPass /MAXCONCURRENT " -1 " /CHECKPOINTING OFF
Note that if the SSIS package requires 32-BIT execution (true for exporting to Excel, for example), then use the DTEXEC utility in "Program Files (x86)" by fully qualifying it. Example, where the SQL Server application was installed on an "E:" drive, and where SQL Server 2014 is being used:
"E:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn\DTExec.exe" /DTS "\MSDB\myPkgName" /DECRYPT pkgPass /MAXCONCURRENT " -1 " /CHECKPOINTING OFF
If your SSIS packages are in the file system (as ".dtsx" files), then replace "/DTS" with "/FILE".
If your SSIS packages were placed in SSISDB (using the "project deployment model", which is available starting with SQL Server 2012, instead of the older "package deployment model"), then replace "/DTS" with "/ISSERVER".
Next, go into the job step's "Advanced" page, and make sure that the box is checked for "Include step output in history".
Lastly, consider the job step's "Run as" account: if your job steps of type "SQL Server Integration Services Package" were already set to run as a proxy, then that proxy is already active for the "SQL Server Integration Services Package" subsystem. To run command lines like the above, check the proxy's properties and make sure it is also active for the "Operating system (CmdExec)" subsystem.
MSDN reference: SSIS Output on Sql Agent history
If you have deployed the package to the database's Integration Services Catalog (rather than loading it from the file system), you can easily get detailed reporting.
Open the catalog node in SQL Server Management Studio, right click the Package name, select Reports | Standard Reports | All Executions and see details about every step of the job and its subcomponents, including records imported.
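If you prefer T-SQL over the built-in reports, the catalog exposes the same information through views. A hedged sketch that lists rows sent down each data flow path (the execution ID is a placeholder):

SELECT package_name,
       task_name,
       dataflow_path_name,
       source_component_name,
       destination_component_name,
       rows_sent
FROM SSISDB.catalog.execution_data_statistics
WHERE execution_id = 12345;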
In a scripting step in a scheduled task in SQL Server Agent 2005, I need to trigger a webscript that is running on a different server. I'm doing this:
Dim ie
Set ie = CreateObject( "InternetExplorer.Application" )
ie.navigate "...to my dreamscript"
' Wait till IE is ready
Do While ie.Busy
(1)
Loop
ie.Quit
set ie = Nothing
At (1) I would like to "sleep" (e.g. WScript.sleep(..)), but WScript is not available in this environment. Is there another way to "sleep" for a while?
If you're only trying to have a SQL Server Agent task wait for a time period, use a T-SQL task with the script
WAITFOR DELAY '01:00:00' -- wait for an hour
and change the time to the duration that you'd like to wait.
HTH
Andy
You can write a console application and execute it in a SQL Agent job.
You could execute the wscript by using a SQL Server Agent task with a type of "Operating system (CmdExec)". This requires that xp_cmdshell be enabled, and it is often disabled (by default) due to security concerns. However, it does allow you to initiate programs, such as wscript, that are run at the command prompt.
You could move the code into SQLCLR where you can write a stored procedure in C# or VB.Net. The VB.Net SQLCLR Code would be pretty similar to your original wscript.
Is it possible to somehow set up Microsoft SQL Server to run a stored procedure on a regular basis?
Yes, in MS SQL Server, you can create scheduled jobs. In SQL Server Management Studio, navigate to the server, then expand the SQL Server Agent item, and finally the Jobs folder to view, edit, and add scheduled jobs.
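If you prefer to script the job instead of clicking through the UI, the same thing can be done with the msdb scheduling procedures. A minimal sketch, where the job, database, and procedure names are placeholders:

USE msdb;
GO
-- Create the job and a single T-SQL step that calls the procedure
EXEC dbo.sp_add_job @job_name = N'Run MyStoredProcedure nightly';
EXEC dbo.sp_add_jobstep
    @job_name = N'Run MyStoredProcedure nightly',
    @step_name = N'Execute procedure',
    @subsystem = N'TSQL',
    @database_name = N'MyDatabase',
    @command = N'EXEC dbo.MyStoredProcedure;';
-- Schedule it daily at 03:00 and bind the job to the local server
EXEC dbo.sp_add_schedule
    @schedule_name = N'Daily at 03:00',
    @freq_type = 4,               -- daily
    @freq_interval = 1,           -- every day
    @active_start_time = 030000;  -- 03:00:00 (HHMMSS)
EXEC dbo.sp_attach_schedule
    @job_name = N'Run MyStoredProcedure nightly',
    @schedule_name = N'Daily at 03:00';
EXEC dbo.sp_add_jobserver @job_name = N'Run MyStoredProcedure nightly';
GO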
If MS SQL Server Express Edition is being used, then SQL Server Agent is not available. I found the following works for all editions:
USE Master
GO
IF EXISTS( SELECT *
FROM sys.objects
WHERE object_id = OBJECT_ID(N'[dbo].[MyBackgroundTask]')
AND type in (N'P', N'PC'))
DROP PROCEDURE [dbo].[MyBackgroundTask]
GO
CREATE PROCEDURE MyBackgroundTask
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;
-- The interval between cleanup attempts
declare @timeToRun nvarchar(50)
set @timeToRun = '03:33:33'
while 1 = 1
begin
waitfor time @timeToRun
begin
execute [MyDatabaseName].[dbo].[MyDatabaseStoredProcedure];
end
end
END
GO
-- Run the procedure when the master database starts.
sp_procoption @ProcName = 'MyBackgroundTask',
    @OptionName = 'startup',
    @OptionValue = 'on'
GO
Some notes:
It is worth writing an audit entry somewhere so that you can see that the query actually ran (a sketch follows these notes).
The server needs rebooting once to ensure that the script runs the first time.
A related question is: How to run a stored procedure every day in SQL Server Express Edition?
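A minimal sketch of such an audit entry, assuming a hypothetical dbo.BackgroundTaskAudit table; the INSERT would sit inside the WHILE loop, right after the call to the real procedure:

-- Hypothetical audit table; adjust names to your environment
CREATE TABLE dbo.BackgroundTaskAudit
(
    RunAt    DATETIME      NOT NULL DEFAULT GETDATE(),
    TaskName NVARCHAR(128) NOT NULL
);
GO
-- Inside the WHILE loop of MyBackgroundTask, after the EXECUTE:
INSERT INTO dbo.BackgroundTaskAudit (TaskName)
VALUES (N'MyBackgroundTask');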
Yes, if you use the SQL Server Agent.
Open your Enterprise Manager, and go to the Management folder under the SQL Server instance you are interested in. There you will see the SQL Server Agent, and underneath that you will see a Jobs section.
Here you can create a new job and you will see a list of steps you will need to create. When you create a new step, you can specify the step to actually run a stored procedure (type TSQL Script). Choose the database, and then for the command section put in something like:
exec MyStoredProcedure
That's the overview, post back here if you need any further advice.
[I actually thought I might get in first on this one, boy was I wrong :)]
Probably not the answer you are looking for, but I find it more useful to simply use the Windows Server Task Scheduler.
You can directly use the command sqlcmd.exe -S "." -d YourDataBase -Q "exec SP_YourJob"
Or even create a .bat file, so you can double-click it to run the task on demand.
This has also been discussed HERE.
I'll add one thing: where I'm at we used to have a bunch of batch jobs that ran every night. However, we're moving away from that to using a client application scheduled in Windows Scheduled Tasks that kicks off each job. There are (at least) three reasons for this:
We have some console programs that need to run every night as well. This way all scheduled tasks can be in one place. Of course, this creates a single point of failure, but if the console jobs don't run we're gonna lose a day's work the next day anyway.
The program that kicks off the jobs captures print messages and errors from the server and writes them to a common application log for all our batch processes. It makes logging from within the SQL jobs much simpler.
If we ever need to upgrade the server (and we are hoping to do this soon) we don't need to worry about moving the jobs over. Just re-point the application once.
It's a real short VB.Net app; I can post the code if anyone is interested.
You could use SQL Server Service Broker to create a custom-made scheduling mechanism.
Idea (simplified):
Write a stored procedure/trigger that begins a conversation (BEGIN DIALOG) as a loopback (FROM my_service TO my_service) and get the conversation handle
DECLARE @dialog UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @dialog
FROM SERVICE [name]
TO SERVICE 'name'
...;
Start the conversation timer
DECLARE @time INT;
BEGIN CONVERSATION TIMER (@dialog) TIMEOUT = @time;
After the specified number of seconds, a message will be sent to the service. It will be enqueued on the associated queue.
CREATE QUEUE queue_name WITH STATUS = ON, RETENTION = OFF
, ACTIVATION (STATUS = ON, PROCEDURE_NAME = <procedure_name>
, MAX_QUEUE_READERS = 20, EXECUTE AS N'dbo')
, POISON_MESSAGE_HANDLING (STATUS = ON)
The procedure will execute the specific code and re-enable the timer to fire again.
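A minimal sketch of what such an activated procedure could look like (the queue name, procedure names, and the 60-second interval are placeholders; the timer message type name is the built-in Service Broker one):

CREATE PROCEDURE dbo.usp_scheduler_activated
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @dialog UNIQUEIDENTIFIER, @message_type NVARCHAR(256);

    -- Pick up the timer message from the activated queue
    WAITFOR (
        RECEIVE TOP (1)
            @dialog = conversation_handle,
            @message_type = message_type_name
        FROM queue_name
    ), TIMEOUT 5000;

    IF @message_type = N'http://schemas.microsoft.com/SQL/ServiceBroker/DialogTimer'
    BEGIN
        -- Run the scheduled work, then re-arm the timer to fire again in 60 seconds
        EXEC dbo.MyScheduledProcedure;
        BEGIN CONVERSATION TIMER (@dialog) TIMEOUT = 60;
    END
END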
You can find a fully-baked T-SQL solution written by Michał Gołoś, called Task Scheduler.
Key points from blog:
Pros:
Supported on every edition (from Express to Enterprise). SQL Server Agent jobs are not available in SQL Server Express.
Scoped to the database level. You can easily move a database together with its associated tasks (especially when you have to move around 100 jobs from one environment to another).
Lower privileges are needed to see/manipulate tasks (database level).
Proposed distinction:
SQL Server Agent (maintenance):
backups
index/statistics rebuilds
replication
Task Scheduler (business processes):
removing old data
preaggregations/cyclic recalculations
denormalization
How to set it up:
Get the source code from the section "Do pobrania" (To download)
(enabling the broker / setting up the tsks schema / configuration table + triggers + stored procedure / setting up the broker objects)
Set up the configuration table [tsks].[tsksx_task_scheduler] to add new tasks (column names are self-descriptive; a sample task is included)
Warning: the blog is written in Polish, but the associated source code is in English and easy to follow.
Warning 2: before you use it, please make sure you have tested it in a non-production environment.
Using Management Studio, you may create a job (under SQL Server Agent).
One job may include several steps, from T-SQL scripts up to SSIS packages.
Jeb was faster ;)
You should look at a job scheduled using the SQL Server Agent.