Transaction in SSIS Package is hanging

I have an SSIS package (built w/Visual Studio 2012) with various Execute SQL Tasks in it. It's running as a job on SQL Server 2012. In the package properties I have TransactionOption set to Required; all of the tasks' TransactionOption settings are Supported. IsolationLevel is set to Serializable on everything. Suddenly yesterday, one of the tasks started hanging. If I run this task individually, it runs fine. If I disable it and run the rest of the package, there's no problem. However if this task is enabled and I try to run the package, it hangs at this task, and blocks all sessions behind it. The only way I can get the entire package to run properly is to change the package's TransactionOption setting to Supported (i.e., turn transactions off). Here's what the offending task does:
update timekeep
set tkemail = u.EMail,
tkmoddate = getdate(),
tkmoduser = 'appmgr'
from ALLUSERDATA.AllUserData.dbo.[User] u
where u.Login COLLATE SQL_Latin1_General_CP1_CI_AI = tkinit
and tkemail <> u.EMail COLLATE SQL_Latin1_General_CP1_CI_AI
I've tried putting WITH (ROWLOCK) after the UPDATE statement, but that doesn't help. The timekeep table is used by multiple people throughout the company all day, but the SSIS package is scheduled to run at 3:45 am, and it's still hanging. Again, this just started yesterday morning. The DBA says nothing on the server has changed. Any ideas on what could be causing this?
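One way to narrow this down (not part of the original question) is to capture what the hung session is waiting on while the package is stuck. A minimal diagnostic sketch using the standard blocking DMVs:
-- Run on the server while the task is hanging; shows blocked sessions,
-- what they are waiting on, and the statement being executed.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_resource,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;
Since the task only misbehaves when TransactionOption is Required and it touches a linked server (ALLUSERDATA), a DTC-promoted distributed transaction that never completes would typically show up here as a long-lived wait.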

Related

Is there any way to rollback transactions in SSIS for SQL Server 2012?

Cannot successfully execute an SSIS package with BEGIN TRAN functionality.
I'm at a loss with an SSIS package I inherited. It contains:
1 Script Task
3 Execute SQL tasks
5 Data flow tasks (each contains a number of merges, lookups, data inserts and other transformations)
1 File System task.
All of these are encapsulated in a Foreach loop container. I've been tasked with modifying the package so that if any of the steps within the control/data flow fails, the entire thing is rolled back. Now I've tried two different approaches to accomplish this:
I. Using Distributed Transactions.
I ensured that:
MSDTC was running on the target server and on the executing client (screenshot enclosed)
msdtc.exe was added as an exception to server and client firewall
Inbound and outbound rules were set for both server and client to allow DTC connections.
Foreach Loop Container TransactionOption: Required
All other tasks TransactionOption: Supported
My OLEDB Connection has RetainSameConnection set to TRUE and I'm using SQL Server Authentication with Save Password checked
When I execute the package, it fails right after the script task (first step).
II. Using explicit T-SQL transactions.
After spending an entire week trying to figure out a workaround, I decided to try to accomplish my goal using 3 Execute SQL Tasks:
BEGIN TRAN before the ForeachLoop Container
COMMIT TRAN after the ForeachLoop Container with a Success Constraint
ROLLBACK TRAN after the ForeachLoop Container with a Failure constraint
In this case, the Foreach Loop container and all other tasks have the TransactionOption property set to Supported. Now the problem is that the package executes up to the fourth data flow task and hangs there forever. After logging into SQL Server and checking the running sessions, I noticed sys.sp_describe_first_result_set;1 as the head-blocker session.
Doing some research, I found it could be related to a few TRUNCATE statements in some of my data flow tasks, which could cause a schema lock. I went ahead and changed the ValidateExternalMetadata property to False for all tasks within my data flows and changed my TRUNCATE statements to DELETE statements instead. I re-ran the package and it still hangs in the same spot with the same head blocker.
As an alternative, I tried creating a second OLEDB connection to the same database, assigned that new OLEDB connection to my BEGIN, ROLLBACK and COMMIT SQL tasks with RetainSameConnection set to TRUE, and changed RetainSameConnection to FALSE (and tried it with TRUE as well) on the original OLEDB connection (the one used by the data flow tasks). This worked in the sense that the package appeared to execute (it ran and the COMMIT TRAN task executed fine). I then ran it again with a forced error to make it fail, and the ROLLBACK TRAN task executed successfully; however, when I queried the affected tables, the transaction had not been rolled back: all new records were inserted and old ones were updated (the BEGIN TRAN was clearly started on a different connection and hence didn't affect the package's workflow).
I'm not sure what else to try at this point. Any help would be truly appreciated, I'm about to go nuts with this!
P.S. All objects have DelayValidation set to True, and the SQL Server version is 2012.
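For reference, a minimal sketch of the explicit-transaction pattern described above, assuming all three Execute SQL Tasks and the data flows share a single connection manager with RetainSameConnection = TRUE so that every statement runs on the same session (this code is not from the original post):
-- Execute SQL Task before the Foreach Loop:
BEGIN TRANSACTION;
-- ...the Foreach Loop and its data flows then run their DML on the same connection...
-- Execute SQL Task on the Success constraint:
IF @@TRANCOUNT > 0 COMMIT TRANSACTION;
-- Execute SQL Task on the Failure constraint:
IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
If the BEGIN TRAN runs on a different connection than the data flows (as in the two-connection attempt above), the data flows' work is simply outside that transaction, which is why the rollback appeared to succeed but changed nothing.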

SSIS SQL Server agent launch job already running

Is there any way to have a package (which will be a wrapper) run every minute via a SQL Server Agent job, even if a previous execution is still running? It seems SQL Server Agent will not launch a job that is already running. Is there a way to override this behaviour?
I wanted to do something like this:
Wrapper.dtsx
--> read from a table of packages to run, and select the next in line
--> execute a package task, with the package dynamically set from the previous selection
--> exit
i.e.
the table has the following packages (assume some ranking will exist eventually)
a.dtsx (say runs for 5 mins)
b.dtsx (say runs for 4 mins)
c.dtsx (say runs for 6 mins)
12:01 am a.dtsx is executed
12:02 am b.dtsx is executed
12:03 am c.dtsx is executed
at the moment I can only get the following to occur
12:01 am a.dtsx is executed
12:06 am b.dtsx is executed
12:10 am c.dtsx is executed
Hm, this SQL Agent job behavior is standard for MS SQL Server and cannot be altered. For your situation, if you are on SQL Server 2012 or higher, you can use the SSIS Catalog with asynchronous execution. With this, your job will start the package execution and quit, so you are free to start it again a minute later. The disadvantage is that the job status will only show whether the package was started and nothing about its outcome; you have to do the execution monitoring yourself.
Switching to async package execution requires SSIS 2012+, setting up the SSIS Catalog database, and switching your packages to the project deployment model. After that, create a SQL Agent job to start the package, specify all needed connections and parameters, and save it. Then, from the job's context menu, select Script Job as -> DROP and CREATE to -> New Query Editor Window. In the query text, locate the substring
/Par "\"$ServerOption::SYNCHRONIZED(Boolean)\"";True
and switch it to
/Par "\"$ServerOption::SYNCHRONIZED(Boolean)\"";False
Then run the script to update your job.
This script manipulation is needed because, by default, a SQL Agent job executes the package synchronously and does not expose the async option in the user interface.
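For completeness, a hedged T-SQL equivalent of that script tweak: start the package asynchronously straight from the SSIS Catalog. The folder, project and package names below are placeholders for your own deployment.
DECLARE @execution_id BIGINT;

EXEC SSISDB.catalog.create_execution
     @folder_name     = N'MyFolder',
     @project_name    = N'MyProject',
     @package_name    = N'Wrapper.dtsx',
     @use32bitruntime = 0,
     @reference_id    = NULL,
     @execution_id    = @execution_id OUTPUT;

-- SYNCHRONIZED = 0 makes the call return immediately (the async behaviour described above)
EXEC SSISDB.catalog.set_execution_parameter_value
     @execution_id    = @execution_id,
     @object_type     = 50,
     @parameter_name  = N'SYNCHRONIZED',
     @parameter_value = 0;

EXEC SSISDB.catalog.start_execution @execution_id = @execution_id;
This is essentially what the Agent-generated script does, minus the SYNCHRONIZED=True override that the job UI adds.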

SQL Server Agent - SSIS Package - Error 0x80131904 - Timeout expired

There's been a string of random occurrences of the following error in the SQL Server Agent scheduled jobs lately that I have been unable to find a solution to.
The error occurs infrequently, usually about once a week for a daily scheduled job, but in any number of different jobs and not always the same one. What the jobs have in common is that each executes an SSIS package from the same server that is running the job. Each failure also runs for almost exactly 30 seconds of elapsed time, which I guess is the timeout threshold. I'm not sure why it would time out if the server is just connecting to its own SSIS catalog. Also of note is that it never actually gets to the point where it executes the SSIS package, and this occurs regardless of which package is being executed.
During my research I came across many people suggesting that simply updating SQL Server 2012 to the latest cumulative update or SP2 would solve the problem. However, upgrading the server to SP2 has not.
One solution tried (which admittedly was ugly) was to simply have a single retry upon failure of the job step, which actually did solve the problem in about 30% of the cases.
I would welcome anyone who has experience with this error, or anyone who has any suggestions.
The error message is as follows:
Date 16/07/2014 6:00:11 AM
Log Job History ({$jobname})
Step ID 1
Server {$productionserver}
Job Name {$jobname}
Step Name {$stepname}
Duration 00:00:31
Sql Severity 0
Sql Message ID 0
Operator Emailed
Operator Net sent
Operator Paged
Retries Attempted 0
Message
Executed as user: {$user}.
Microsoft (R) SQL Server Execute Package Utility Version 11.0.5058.0 for 64-bit Copyright (C) Microsoft Corporation. All rights reserved.
Started: 6:00:11 AM Failed to execute IS server package because of error 0x80131904.
Server: {$productionserver},
Package path: {$packagepath},
Environment reference Id: NULL.
Description: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
Source: .Net SqlClient Data Provider
Started: 6:00:11 AM Finished: 6:00:42 AM
Elapsed: 31.122 seconds. The package execution failed. The step failed.
Try this:
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding
And this
https://connect.microsoft.com/SQLServer/feedback/details/783291/ssis-package-fails-to-start-application-lock-timeout-in-ssisdb-catalog-create-execution
It looks like it's a known bug.
Check what else is/was running on the instance at the time of the package failures (e.g. a database integrity check or similarly intensive operation).
The SQL Agent is timing out talking to its own SSIS catalog (a 30-second timeout). It's not actually executing the packages, so it's nothing to do with the packages themselves and everything to do with how busy the instance is at the time of the execution.
(Answering this question since it comes up in a Google search)
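If you want to check the "what else was running" angle, a hedged sketch against the Agent history tables (the date literal is the failure date from the log above, formatted yyyymmdd):
SELECT j.name,
       h.run_date,
       h.run_time,
       h.run_duration        -- HHMMSS packed into an integer
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
WHERE h.step_id = 0           -- job-outcome rows only
  AND h.run_date = 20140716   -- yyyymmdd of the failure
ORDER BY h.run_time;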
I know this is an older question, but I'm having the same problem and it doesn't have an accepted answer.
The job fails in 1.5 seconds so I believe it is NOT a timeout issue.
I can confirm 0x80131904 is (or can be) a permissions issue. I had my SSIS package running under a SQL Agent job just fine with sysadmin and network admin privileges. When I switched it to an account with fewer permissions, I got this error.
For me, the problem was that I was not assigning permissions in all the correct places. I had already set Read/Execute permissions in the Project Properties. Then (this is the step I didn't do) I had to assign Read permissions on the folder containing the Projects and Environments.
Hope this helps someone.
We have experienced this error when attempting to start several SSIS packages at the same instant. Service packs were supposed to fix it, but have not. We have implemented a staggered schedule for SSIS packages so only one package is starting at any given moment.
We also experienced the same bug. As a workaround, we created the following stored procedure. If you put it into a job that runs, for example, every 10 minutes, it makes sure that if there are random failures, the job gets restarted continuously until it completes a run without a timeout failure.
USE [msdb]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[usp_StartTimedOutJob]
AS
DECLARE @jobid NVARCHAR(100)
    , @jobname NVARCHAR(250)
    , @stepname NVARCHAR(250)
    , @varMail VARCHAR(MAX)

DECLARE cJobs CURSOR FOR
-- CTE selects the history of all jobs that are currently not running, ordered by most recent run
WITH CTE_NotRunning AS (
    SELECT S.job_id
        , S.step_name
        , S.[message]
        , rownum = ROW_NUMBER() OVER (PARTITION BY S.job_id ORDER BY S.run_date DESC, S.run_time DESC)
    FROM msdb.dbo.sysjobhistory AS S
    LEFT OUTER JOIN (SELECT DISTINCT ja.job_id
                     FROM msdb.dbo.sysjobactivity ja
                     LEFT JOIN msdb.dbo.sysjobhistory jh ON ja.job_history_id = jh.instance_id
                     JOIN msdb.dbo.sysjobs j ON ja.job_id = j.job_id
                     JOIN msdb.dbo.sysjobsteps js
                         ON ja.job_id = js.job_id
                         AND ISNULL(ja.last_executed_step_id, 0) + 1 = js.step_id
                     WHERE ja.session_id = (
                               SELECT TOP 1 session_id FROM msdb.dbo.syssessions ORDER BY agent_start_date DESC
                           )
                         AND ja.start_execution_date IS NOT NULL
                         AND ja.stop_execution_date IS NULL) AS R
        ON S.job_id = R.job_id
    WHERE R.job_id IS NULL)
-- only select the jobs into the cursor set whose most recent run had a timeout issue
SELECT job_id
    , step_name
FROM CTE_NotRunning
WHERE [message] LIKE '%0x80131904%time%out%' -- error message that corresponds to timed-out jobs, error code 0x80131904
    AND rownum = 1

OPEN cJobs
FETCH NEXT FROM cJobs INTO @jobid, @stepname
WHILE @@FETCH_STATUS = 0
BEGIN
    -- for each of the timed-out jobs in the cursor, start the job again from the step that caused the timeout
    SET @jobname = (SELECT [name] FROM msdb.dbo.sysjobs WHERE job_id = @jobid)
    EXECUTE dbo.sp_start_job @job_id = @jobid, @step_name = @stepname
    FETCH NEXT FROM cJobs INTO @jobid, @stepname
END
CLOSE cJobs
DEALLOCATE cJobs
GO
I had this exact same issue. SQL Agent was running SSIS jobs perfectly fine, then suddenly I came across this error. I spent about an hour looking for a fix online, then found out the server admin had installed new Windows updates.
I simply restarted the Server (which hosts the SSIS catalog and SQL Server/Agent). After server restart jobs ran fine again.
Hope server restart works for the next person that goes through this.
Sometimes this kind of error occurs when the package has been deployed twice under SQL Server Integration Services Catalogs. You may also have changed the package name, but other related auto-generated configuration values, such as the environment reference Id, are unique to each deployment.
So if you have a scheduled job, you will need to create a new one and point it to the newly deployed package.
Good luck
I had the same problem and error message on SQL Server 2017.
My problem was with the SSISDB database, which had grown too big and needed maintenance (there was no more space available). After cleaning up the SSISDB database, the jobs ran fine again on this server.
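A hedged sketch of that kind of clean-up, using the standard SSISDB catalog objects: check the retention settings, lower the retention window, then run the built-in cleanup. Test it on a non-production server first; on a large SSISDB it can take a while.
USE SSISDB;
GO
-- See the current retention settings
SELECT property_name, property_value
FROM catalog.catalog_properties
WHERE property_name IN (N'RETENTION_WINDOW', N'OPERATION_CLEANUP_ENABLED');

-- Keep 30 days of operational history instead of the default 365
EXEC catalog.configure_catalog @property_name = N'RETENTION_WINDOW', @property_value = 30;

-- Same procedure the "SSIS Server Maintenance Job" runs
EXEC internal.cleanup_server_retention_window;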

How to check the SSIS package job results after it has completed its execution?

I have an SSIS package which imports data into a SQL Server 2008 database. I have set up a scheduled job in SQL Server Agent to run that package. When I check the history, I can only see whether the job ran successfully or not; I cannot see any other messages.
I would like to know how many records are imported whenever the job is executed. How can I monitor that? Should I use the additional components in SSIS package or set some configurations in SQL Server Agent Job Setup?
I found some logging facilities in the SQL Server Agent job setup, but I am not sure whether they can fulfill my requirements.
If you are just interested in knowing the rows being processed and don't need the information for further use, one possible option is to make use of the SSIS logging feature. Here is how it works for data flow tasks.
Click on the SSIS package.
On the menus, select SSIS --> Logging...
On the Configure SSIS Logs: dialog, select the provider type and click Add. I have chosen SQL Server for this example. Check the Name checkbox and provide the data source under Configuration column. Here SQLServer is the name of the connection manager. SSIS will create a table named dbo.sysssislog and stored procedure dbo.sp_ssis_addlogentry in the database that you selected. Refer screenshot #1 below.
If you need the rows processed, select the checkbox OnInformation. Here in the example, the package executed successfully so the log records were found under OnInformation. You may need to fine tune this event selection according to your requirements. Refer screenshot #2 below.
Here is a sample package execution within data flow task. Refer screenshot #3 below.
Here is a sample output of the log table dbo.sysssislog. I have only displayed the columns id and message. There are many other columns in the table. In the query, I am filtering the output only for the package named 'Package1' and the event 'OnInformation'. You can notice that records with ids 7, 14 and 15 contain the rows processed. Refer screenshot #4 below.
Hope that helps.
Screenshot #1:
Screenshot #2:
Screenshot #3:
Screenshot #4:
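Once logging is configured as above, you can read the row counts back with a simple query. A hedged example against the log table described in this answer ('Package1' is the sample package name used above):
SELECT id,
       starttime,
       message
FROM dbo.sysssislog
WHERE event = 'OnInformation'
  AND source = 'Package1'
  AND message LIKE '%rows%'      -- the "wrote N rows" style messages
ORDER BY id;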
Use the procedure below to get SSIS errors and package status by execution id:
CREATE PROCEDURE [dbo].[get_ssis_status] @EXECUTION_ID INT
AS
BEGIN
    SELECT o.operation_id EXECUTION_ID
        , CONVERT(DATETIMEOFFSET, OM.message_time, 109) [TIME]
        , D.message_source_desc ERROR_SOURCE
        , OM.message ERROR_MESSAGE
        , CASE ex.STATUS
              WHEN 4 THEN 'Package Failed'
              WHEN 7 THEN CASE EM.message_type
                              WHEN 120 THEN 'Package Failed'
                              WHEN 130 THEN 'Package Failed'
                              ELSE 'Package Succeeded'
                          END
          END AS STATUS
    FROM SSISDB.catalog.operation_messages AS OM
    INNER JOIN SSISDB.catalog.operations AS O ON O.operation_id = OM.operation_id
    INNER JOIN SSISDB.catalog.executions AS EX ON O.operation_id = EX.execution_id
    INNER JOIN (VALUES (-1, 'Unknown'), (120, 'Error'), (110, 'Warning'), (130, 'TaskFailed')) EM(message_type, message_desc)
        ON EM.message_type = OM.message_type
    INNER JOIN (VALUES
          (10, 'Entry APIs, such as T-SQL and CLR Stored procedures')
        , (20, 'External process used to run package (ISServerExec.exe)')
        , (30, 'Package-level objects')
        , (40, 'Control Flow tasks')
        , (50, 'Control Flow containers')
        , (60, 'Data Flow task')
        ) D(message_source_type, message_source_desc)
        ON D.message_source_type = OM.message_source_type
    WHERE EX.execution_id = @EXECUTION_ID
        AND OM.message_type IN (120, 130, -1);
END
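Example call, assuming 12345 is an execution_id taken from SSISDB.catalog.executions:
EXEC dbo.get_ssis_status @EXECUTION_ID = 12345;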
Here's another approach for when SQL Server job history is not showing output from SSIS packages: use DTEXEC command lines.
(Upside: this approach puts the job's output where anyone else supporting it would expect to find it: in job history.
Downside for big packages: if you have a long SSIS package with lots of tasks or components, and lots of output, then the job history will split the package output across many lines of job history, making the approach in the previous answer, logging to a table, easier to read.)
To show SSIS package output in the job's View History:
(1) Change the job steps from type "SQL Server Integration Services Package", to "Operating system (CmdExec)",
(2) Use DTEXEC command lines, to execute the packages.
Example of command line:
DTExec /DTS "\MSDB\myPkgName" /DECRYPT pkgPass /MAXCONCURRENT " -1 " /CHECKPOINTING OFF
Note that if the SSIS package requires 32-BIT execution (true for exporting to Excel, for example), then use the DTEXEC utility in "Program Files (x86)" by fully qualifying it. Example, where the SQL Server application was installed on an "E:" drive, and where SQL Server 2014 is being used:
"E:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn\DTExec.exe" /DTS "\MSDB\myPkgName" /DECRYPT pkgPass /MAXCONCURRENT " -1 " /CHECKPOINTING OFF
If your SSIS packages are in the file system (as ".dtsx" files), then replace "/DTS" with "/FILE".
If your SSIS packages were placed in SSISDB (using the "project deployment model", which is available starting with SQL Server 2012, instead of the older "package deployment model"), then replace "/DTS" with "/ISSERVER".
Next, go into the job step's "Advanced" page, and make sure that the box is checked for "Include step output in history".
Lastly, consider the job step's "Run as" account: if your job steps were already set to run as a proxy on steps of type "SQL Server Integration Services Package", then that proxy is already active for the "SQL Server Integration Services Package" subsystem. To run command lines like the above, check the proxy's properties and make sure it is also active for the "Operating system (CmdExec)" subsystem.
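If you manage proxies by script rather than through the UI, a hedged sketch of that last step ('MySsisProxy' is a placeholder for your existing proxy's name):
-- Allow the existing proxy to run job steps of type Operating system (CmdExec)
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
     @proxy_name     = N'MySsisProxy',
     @subsystem_name = N'CmdExec';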
MSDN reference: SSIS Output on Sql Agent history
If you have deployed the package to the server's Integration Services catalog (rather than loading it from the file system), you can easily get detailed reporting.
Open the catalog node in SQL Server Management Studio, right click the Package name, select Reports | Standard Reports | All Executions and see details about every step of the job and its subcomponents, including records imported.
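If you prefer T-SQL to the report UI, a hedged sketch against the catalog views gives similar information; note that row counts appear in execution_data_statistics only when the execution's logging level is detailed enough (e.g. Verbose):
SELECT e.execution_id,
       e.package_name,
       e.status,                        -- 7 = succeeded, 4 = failed
       s.source_component_name,
       s.destination_component_name,
       s.rows_sent
FROM SSISDB.catalog.executions AS e
LEFT JOIN SSISDB.catalog.execution_data_statistics AS s
       ON s.execution_id = e.execution_id
ORDER BY e.execution_id DESC;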

Scheduled run of stored procedure on SQL server

Is it possible to somehow set up Microsoft SQL Server to run a stored procedure on a regular basis?
Yes, in MS SQL Server you can create scheduled jobs. In SQL Server Management Studio, navigate to the server, expand the SQL Server Agent item, and finally the Jobs folder to view, edit, and add scheduled jobs.
If MS SQL Server Express Edition is being used then SQL Server Agent is not available. I found the following worked for all editions:
USE Master
GO
IF EXISTS( SELECT *
           FROM sys.objects
           WHERE object_id = OBJECT_ID(N'[dbo].[MyBackgroundTask]')
             AND type IN (N'P', N'PC'))
    DROP PROCEDURE [dbo].[MyBackgroundTask]
GO
CREATE PROCEDURE MyBackgroundTask
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    -- The time of day at which the task runs
    DECLARE @timeToRun NVARCHAR(50)
    SET @timeToRun = '03:33:33'

    WHILE 1 = 1
    BEGIN
        WAITFOR TIME @timeToRun
        BEGIN
            EXECUTE [MyDatabaseName].[dbo].[MyDatabaseStoredProcedure];
        END
    END
END
GO
-- Register the procedure to run automatically when SQL Server starts.
EXEC sp_procoption @ProcName = 'MyBackgroundTask',
                   @OptionName = 'startup',
                   @OptionValue = 'on'
GO
Some notes:
It is worth writing an audit entry somewhere so that you can see that the query actually ran (see the sketch after these notes).
SQL Server needs to be restarted once so that the procedure starts running for the first time (startup procedures only kick in at service startup).
A related question is: How to run a stored procedure every day in SQL Server Express Edition?
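A minimal sketch of the audit entry suggested in the notes above (dbo.TaskAuditLog is a hypothetical table, not part of the original answer):
CREATE TABLE dbo.TaskAuditLog (
    TaskName SYSNAME      NOT NULL,
    RunAt    DATETIME2(0) NOT NULL DEFAULT SYSDATETIME()
);
GO
-- Inside MyBackgroundTask, right after the WAITFOR TIME:
INSERT INTO dbo.TaskAuditLog (TaskName) VALUES (N'MyBackgroundTask');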
Yes, if you use the SQL Server Agent.
Open your Enterprise Manager, and go to the Management folder under the SQL Server instance you are interested in. There you will see the SQL Server Agent, and underneath that you will see a Jobs section.
Here you can create a new job and you will see a list of steps you will need to create. When you create a new step, you can specify the step to actually run a stored procedure (type TSQL Script). Choose the database, and then for the command section put in something like:
exec MyStoredProcedure
That's the overview, post back here if you need any further advice.
[I actually thought I might get in first on this one, boy was I wrong :)]
Probably not the answer you are looking for, but I find it more useful to simply use the Windows Task Scheduler.
You can directly use the command sqlcmd.exe -S "." -d YourDataBase -Q "exec SP_YourJob"
Or even create a .bat file, so you can double-click the task to run it on demand.
This approach has also been discussed HERE
I'll add one thing: where I'm at we used to have a bunch of batch jobs that ran every night. However, we're moving away from that to using a client application scheduled in windows scheduled tasks that kicks off each job. There are (at least) three reasons for this:
We have some console programs that need to run every night as well. This way all scheduled tasks can be in one place. Of course, this creates a single point of failure, but if the console jobs don't run we're gonna lose a day's work the next day anyway.
The program that kicks off the jobs captures print messages and errors from the server and writes them to a common application log for all our batch processes. It makes logging from within the SQL jobs much simpler.
If we ever need to upgrade the server (and we are hoping to do this soon) we don't need to worry about moving the jobs over. Just re-point the application once.
It's a real short VB.Net app: I can post code if any one is interested.
You could use SQL Server Service Broker to create a custom-made scheduling mechanism.
Idea (simplified):
Write a stored procedure/trigger that begins a conversation (BEGIN DIALOG) as a loopback (FROM my_service TO my_service) and get the conversation handle
DECLARE @dialog UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @dialog
FROM SERVICE [name]
TO SERVICE 'name'
...;
Start the conversation timer
DECLARE @time INT;
BEGIN CONVERSATION TIMER (@dialog) TIMEOUT = @time;
After the specified number of seconds, a message will be sent to the service. It will be enqueued in the associated queue.
CREATE QUEUE queue_name WITH STATUS = ON, RETENTION = OFF
, ACTIVATION (STATUS = ON, PROCEDURE_NAME = <procedure_name>
, MAX_QUEUE_READERS = 20, EXECUTE AS N'dbo')
, POISON_MESSAGE_HANDLING (STATUS = ON)
The activated procedure will execute your specific code and re-enable the timer so it fires again.
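A hedged sketch of such an activated procedure: receive the timer message, run the work, and restart the timer so it fires again (dbo.MyScheduledWork is a hypothetical procedure; the fully worked solution is linked below):
CREATE PROCEDURE dbo.usp_OnTimer
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @dialog  UNIQUEIDENTIFIER,
            @msgtype SYSNAME;

    -- Pick up the message that activated this procedure
    RECEIVE TOP (1)
            @dialog  = conversation_handle,
            @msgtype = message_type_name
    FROM dbo.queue_name;

    IF @msgtype = N'http://schemas.microsoft.com/SQL/ServiceBroker/DialogTimer'
    BEGIN
        EXEC dbo.MyScheduledWork;                          -- the actual business logic
        BEGIN CONVERSATION TIMER (@dialog) TIMEOUT = 600;  -- fire again in 10 minutes
    END
END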
You can find a fully-baked T-SQL solution written by Michał Gołoś, called Task Scheduler.
Key points from blog:
Pros:
Supported on every edition (from Express to Enterprise); SQL Server Agent jobs are not available for SQL Server Express
Scoped to the database level. You can easily move a database together with its associated tasks (especially when you have to move around 100 jobs from one environment to another)
Lower privileges are needed to see/manipulate tasks (database level)
Proposed distinction:
SQL Server Agent (maintenance):
backups
index/statistics rebuilds
replication
Task Scheduler (business processes):
removing old data
preaggregations/cyclic recalculations
denormalization
How to set it up:
get the source code from the "Do pobrania" (Downloads) section
(it enables Service Broker and sets up the tsks schema, the configuration table, triggers, stored procedures, and the broker objects)
set up the configuration table [tsks].[tsksx_task_scheduler] to add new tasks (column names are self-descriptive, and a sample task is included)
Warning: the blog is written in Polish, but the associated source code is in English and easy to follow.
Warning 2: before you use it, please make sure you have tested it in a non-production environment.
Using Management Studio you may create a Job (under SQL Server Agent)
One Job may include several Steps
from T-SQL scripts up to SSIS Packages
Jeb was faster ;)
You should look at a job scheduled using the SQL Server Agent.
