How to check the SSIS package job results after it has completed its execution? - sql-server

I have an SSIS package which imports data into a SQL Server 2008 database. I have set up a scheduled job in SQL Server Agent to run that package. When I check the history, I can only see whether the job ran successfully or not; I cannot see any other messages apart from that.
I would like to know how many records are imported whenever the job is executed. How can I monitor that? Should I use additional components in the SSIS package, or set some configuration in the SQL Server Agent job setup?
I found some logging facilities in the SQL Server Agent job setup, but I am not sure whether they can fulfill my requirements.

If you are just interested in knowing the rows being processed and do not need the info for further use, one possible option is making use of the SSIS logging feature. Here is how it works for data flow tasks.
Click on the SSIS package.
On the menus, select SSIS --> Logging...
On the Configure SSIS Logs: dialog, select the provider type and click Add. I have chosen SQL Server for this example. Check the Name checkbox and provide the data source under the Configuration column. Here SQLServer is the name of the connection manager. SSIS will create a table named dbo.sysssislog and a stored procedure dbo.sp_ssis_addlogentry in the database that you selected. Refer to screenshot #1 below.
If you need the rows processed, select the checkbox OnInformation. In this example the package executed successfully, so the log records were found under OnInformation. You may need to fine-tune this event selection according to your requirements. Refer to screenshot #2 below.
Here is a sample package execution within the data flow task. Refer to screenshot #3 below.
Here is a sample output of the log table dbo.sysssislog. I have only displayed the columns id and message; there are many other columns in the table. In the query, I am filtering the output only for the package named 'Package1' and the event 'OnInformation'. You can notice that records with ids 7, 14 and 15 contain the rows processed. Refer to screenshot #4 below.
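A minimal sketch of that query (assuming the default log table name dbo.sysssislog and the example package name 'Package1'; adjust both for your environment):

SELECT id, message
FROM dbo.sysssislog
WHERE source = 'Package1'        -- package name used in this example
  AND event = 'OnInformation'    -- event that carries the "rows written" messages
ORDER BY id;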
Hope that helps.
Screenshot #1:
Screenshot #2:
Screenshot #3:
Screenshot #4:

Use the procedure below to get SSIS errors for a given execution ID (this assumes the packages are deployed to the SSISDB catalog):
CREATE PROCEDURE [dbo].[get_ssis_status] @EXECUTION_ID INT
AS
BEGIN
    SELECT O.operation_id AS EXECUTION_ID
        ,CONVERT(datetimeoffset, OM.message_time, 109) AS [TIME]
        ,D.message_source_desc AS ERROR_SOURCE
        ,OM.message AS ERROR_MESSAGE
        ,CASE EX.[status]
            WHEN 4 THEN 'Package Failed'
            WHEN 7 THEN CASE EM.message_type
                            WHEN 120 THEN 'Package Failed'
                            WHEN 130 THEN 'Package Failed'
                            ELSE 'Package Succeeded'
                        END
         END AS [STATUS]
    FROM SSISDB.catalog.operation_messages AS OM
    INNER JOIN SSISDB.catalog.operations AS O ON O.operation_id = OM.operation_id
    INNER JOIN SSISDB.catalog.executions AS EX ON O.operation_id = EX.execution_id
    -- Message types: -1 = Unknown, 120 = Error, 110 = Warning, 130 = TaskFailed
    INNER JOIN (VALUES (-1, 'Unknown'), (120, 'Error'), (110, 'Warning'), (130, 'TaskFailed')
               ) EM (message_type, message_desc) ON EM.message_type = OM.message_type
    -- Message source types describe where the message originated
    INNER JOIN (VALUES
         (10, 'Entry APIs, such as T-SQL and CLR Stored procedures')
        ,(20, 'External process used to run package (ISServerExec.exe)')
        ,(30, 'Package-level objects')
        ,(40, 'Control Flow tasks')
        ,(50, 'Control Flow containers')
        ,(60, 'Data Flow task')
               ) D (message_source_type, message_source_desc) ON D.message_source_type = OM.message_source_type
    WHERE EX.execution_id = @EXECUTION_ID
        AND OM.message_type IN (120, 130, -1);
END
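Example call (the execution ID is the one shown for the run in the SSISDB catalog / All Executions report; 12345 below is just a placeholder):

EXEC dbo.get_ssis_status @EXECUTION_ID = 12345;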

Here's another approach for when SQL Server job history is not showing output from SSIS packages: use DTEXEC command lines.
(Upside: this approach puts the job's output where anyone else supporting it would expect to find it: in job history.
Downside for big packages: if you have a long SSIS package with lots of tasks or components and lots of output, the job history will split the package output into many lines of job history, which makes the approach in the previous answer, logging to a table, easier to read.)
To show SSIS package output in the job's View History:
(1) Change the job steps from type "SQL Server Integration Services Package" to "Operating system (CmdExec)".
(2) Use DTEXEC command lines to execute the packages.
Example of command line:
DTExec /DTS "\MSDB\myPkgName" /DECRYPT pkgPass /MAXCONCURRENT " -1 " /CHECKPOINTING OFF
Note that if the SSIS package requires 32-bit execution (true for exporting to Excel, for example), then use the DTEXEC utility in "Program Files (x86)" by fully qualifying it. Example, where the SQL Server application was installed on an "E:" drive and SQL Server 2014 is being used:
"E:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn\DTExec.exe" /DTS "\MSDB\myPkgName" /DECRYPT pkgPass /MAXCONCURRENT " -1 " /CHECKPOINTING OFF
If your SSIS packages are in the file system (as ".dtsx" files), then replace "/DTS" with "/FILE".
If your SSIS packages were placed in SSISDB (using the "project deployment model", which is available starting with SQL Server 2012, instead of the older "package deployment model"), then replace "/DTS" with "/ISSERVER".
Next, go into the job step's "Advanced" page, and make sure that the box is checked for "Include step output in history".
Lastly, consider your job step's "Run as": if the "Run as" of your job steps was already set to a proxy on job steps of type "SQL Server Integration Services Package", then you have already made that proxy active for the subsystem "SQL Server Integration Services Package". Now, to run command lines like the above, check the proxy's properties and make sure it is also active for the subsystem "Operating system (CmdExec)". A scripted version of that change is sketched below.
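If you prefer to script the proxy change rather than use the proxy properties dialog, it is roughly the following (a sketch; 'MySsisProxy' is a placeholder for your existing proxy name):

-- Grant the existing proxy access to the CmdExec subsystem as well
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
     @proxy_name     = N'MySsisProxy',   -- placeholder: your existing SSIS proxy
     @subsystem_name = N'CmdExec';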
MSDN reference: SSIS Output on Sql Agent history

If you have deployed the package to the database's Integration Services Catalog (rather than loading it from the file system), you can easily get detailed reporting.
Open the catalog node in SQL Server Management Studio, right-click the package name, select Reports | Standard Reports | All Executions, and see details about every step of the job and its subcomponents, including records imported.
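If you prefer T-SQL over the report, roughly the same information can be read from the standard SSISDB catalog views (SQL Server 2012+). A sketch, with the caveat that, as far as I know, execution_data_statistics is only populated when the catalog logging level is Verbose, and one execution can contain several data flow paths:

SELECT e.execution_id,
       e.package_name,
       e.start_time,
       e.status,                        -- 7 = succeeded, 4 = failed
       ds.source_component_name,
       ds.destination_component_name,
       ds.rows_sent                     -- rows that flowed down this data flow path
FROM SSISDB.catalog.executions AS e
LEFT JOIN SSISDB.catalog.execution_data_statistics AS ds
       ON ds.execution_id = e.execution_id
ORDER BY e.start_time DESC;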

Related

SSRS Report fails when trying to run Oracle Stored Procedure

I have an issue whereby I can execute an SSRS report which calls an Oracle Stored Procedure in VS2017, but when I deploy to the SSRS Server and run, it returns the following message:-
• An error has occurred during report processing. (rsProcessingAborted)
o Query execution failed for dataset 'spTestSubDet'. (rsErrorExecutingCommand)
For more information about this error navigate to the report server on the local server machine, or enable remote errors
The dataset 'spTestSubDet' is the Oracle Stored Proc.
Some configuration details:-
Oracle Database 19c Standard Edition 2 Release 19.0.0.0.0 - Production
SSRS version is 15.0.19528.0.
SQL Server version is 2014.
I can execute SQL code and Views against the Oracle server with the same DSN from the deployed report (without the oracle stored proc being present), so I know the DSN configuration is not the issue.
I have also checked the "Use single transaction when processing the queries" box in the DS properties.
I’m guessing that it might be some form of “Execute” permissions issue on Oracle, rather than the Report Server, where the Stored Proc is concerned.
As a developer, I don’t have any DBA permissions to interrogate how the SSRS Server is set up, or the Oracle DB, so any suggestions will have to be passed on to my ICT dept.
I also can't enable "remote errors" on the Report Server, but have requested that with the ICT dept.
Any help greatly appreciated.
Seems I got lucky with enabling “Remote Errors” on the report server and not personally having to restart the service.
I now have a more explicit error message from the SSRS report:-
“ORA-06550: line 1, column 7: PLS-00306: wrong number or types of arguments in call to 'SPTESTSUBDET' ORA-06550: line 1, column 7: PL/SQL: Statement ignored”
As mentioned in my original post, the report works fine locally from VS2017, so I don’t know why it’s telling me when deployed and run from the server that there seems to be a problem with the SQL code:-
create or replace PROCEDURE SPTESTSUBDET (s1 OUT SYS_REFCURSOR)
IS
BEGIN
    OPEN s1 FOR
        SELECT *
        FROM onemain.sbceysubmitted sbceysub
        WHERE sbceysub.STUD_ID = 167071;
END SPTESTSUBDET;
It’s as simple a test as I can put together and doesn’t use any parameters to complicate things.
I’m wondering if it might be a driver issue, though why it works locally and not on the server is baffling me.
I have Oracle Developer tools “ODAC v18.3.0” installed for VS2017.
The user in the referenced post below had what looked like to be the same problem, but it's not clear what version of the ODAC tools has been used to resolve the issue:-
https://stackoverflow.com/a/60569788/2053847
Any thoughts/help greatly appreciated.
The easiest thing to do is check the log files. I bet this is a SQL exception and it is related to something wrong with the way you are calling the stored procedure, or within the stored procedure itself. The log files reside on the SSRS instance under SQL SERVER INSTALL DIR\MSSQL.15 (or the directory for your SSRS version)\Reporting Services\LogFiles. Log files for the SSRS manager and SSRS service are saved here. Open the log for the SSRS service after you encounter the error, search for "spTestSubDet", and you should see the detail of the exception that is causing your problems.

SQL Server Agent Job stops SSIS Step with "unexpected error" and without any error information

I am dealing with this problem on several Windows Server 2019 (Core) machines, each running one SQL Server 2019 CU4 instance.
What we try to do
We are currently building a data warehouse with distributed databases. The individual layers of the DWH are each located on their own database server. The data exchange between the layers/servers takes place via SSIS ETLs, which use linked servers to reach the other layers and pull data across. Each layer also has its own SSIS service instance and executes the corresponding SSIS packages.
The SSIS packages are called by SQL Server Agent jobs. We have a job that executes the SSIS packages (#1), which in turn calls another job (#2) as the last step, which after a short wait time executes the calling job (#1) again. Thus, controlled by schedules, a loop is created and data is continuously transferred with ETLs.
I hope this was not too much unnecessary background.
The error
Basically the job is running and there are numerous successful executions. However, we are observing interruptions of job #1 without helpful information regarding the error. The job history log refers to the SSIS log, which in turn only contains an "unexpected termination". In the SSIS log, we only see behavior that indicates that the ETL package active at that time stopped after validation. Depending on the log level, nothing is logged at all, not even the execution of single packages of the project. The package where this error occurs varies and is not limited to a specific one.
What I have already tried
Re-create the jobs and SSIS environments by hand (scripted before)
Using the 32Bit Runtime
Upgrade the SSIS project/package version to 2019
Increase the log level to "verbose"
Patching the SQL Server to CU4
Save ssis dump files (couldn't find them or they weren't created)
Search Windows and SQL Server Logfiles
Does anyone have suggestions or ideas on how to get more specific error information?
Thank you very much and take care :)
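For reference, the verbose SSISDB log can be narrowed down to just the error rows with something along these lines (a sketch against the standard catalog.event_messages view; the operation ID 12345 is a placeholder for the execution in question):

SELECT em.message_time,
       em.message_source_name,
       em.subcomponent_name,
       em.message
FROM SSISDB.catalog.event_messages AS em
WHERE em.operation_id = 12345            -- placeholder: operation/execution id of the failed run
  AND em.message_type IN (120, 130)      -- 120 = Error, 130 = TaskFailed
ORDER BY em.message_time;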
UPDATE: We have an error message (OLE DB 0xC0202009 and 0x80004005)!
In order to exclude the use of environments as a cause, I manually set the parameters in the SSIS job step instead of overwriting them by selecting an environment.
Long story short: Today it turns out that the parameter for an OLE DB Connection String is not passed correctly.
The following is specified as a parameter in the job step:
However, the following connection string is specified in the context of the error message:
Please note that some arguments are added twice to the parameter (marked in red in the screenshot).
What could have caused that?

ssis moving data between sql and access databases

In SQL Server Data Tools 2015, I would like to move data from a SQL Server 2012 database into a new Access database (2005) and need to create the Access table as part of the process. Can this all be done in one Execute SQL Task under the control flow? This will be part of a loop to run through a list of tables that need to be dynamically created and loaded into an empty Access db.
I have created a connection manager for the Access database, set it in the Connection field, and put the code into the SQLStatement field under the General tab of the Execute SQL Task component.
Both databases are on my local machine.
"SELECT a.* into providers from OPENROWSET('SQLNCLI', 'Server=localhost;Trusted_Connection=yes;', 'SELECT * FROM newResults.dbo.providers') as a"
I get the following error:
SSIS package "C:\Users\chris\source\repos\Integration Services
Project5\Integration Services Project5\Package1.dtsx" starting. Error:
0xC002F210 at Execute SQL Task 2, Execute SQL Task: Executing the
query "SELECT a.* into providers from OPENQUERYSET('SQLN..." failed
with the following error: "Syntax error in FROM clause.". Possible
failure reasons: Problems with the query, "ResultSet" property not set
correctly, parameters not set correctly, or connection not established
correctly. Task failed: Execute SQL Task 2 SSIS package
"C:\Users\chris\source\repos\Integration Services Project5\Integration
Services Project5\Package1.dtsx" finished: Success.
The SQL contained in the Execute SQL Task is executed in the destination's context. The SELECT INTO FROM OPENQUERYSET statement is being passed to Access. Access doesn't have the OPENQUERYSET function and even if it did, your source is SQL Server, which Access doesn't know about unless you have made a connection to SQL Server in Access. Copy your SQL statement into Access and try to execute it and you'll see the same or a similar error. That's what the Execute SQL Task is doing.
Dynamic data is one of the more challenging problems in SSIS. The COZYROC tools include a lot of support for handling dynamic scenarios. Check out the videos for their Data Flow Task Plus for some ideas.

SSIS SQL Server agent launch job already running

Is there any way to have a package (which will be a wrapper) run every minute in SQL Server Agent, even if it is already running from a previous execution? It seems SQL Server Agent does not launch a job if it is already running. Is there a way to override this behaviour?
I wanted to do something such as
Wrapper.dtsx
--> read from table of packages to run, and select next in line
--> execute package task with the package dynamically set from the previous selection
--> exit
i.e. the table has the following packages (assume some ranking will exist eventually)
a.dtsx (say runs for 5 mins)
b.dtsx (say runs for 4 mins)
c.dtsx (say runs for 6 mins)
12:01 am a.dtsx is executed
12:02 am b.dtsx is executed
12:03 am c.dtsx is executed
at the moment I can only get the following to occur
12:01 am a.dtsx is executed
12:06 am b.dtsx is executed
12:10 am c.dtsx is executed
Hmm, this SQL Agent job behavior is standard for MS SQL Server and cannot be altered. For your situation, if you are on SQL Server 2012 or higher, you can use the SSIS Catalog with asynchronous execution. With this, your job will start the package execution and quit, so you are free to start it again a minute later. Disadvantage: the job status will only show whether the package was started, and nothing about its outcome; you have to do execution monitoring yourself.
Switching to async package execution requires SSIS 2012+, establishing the SSIS Catalog DB, and switching your packages to the project deployment model. After that, create a SQL Agent job to start the package, specify all needed connections and parameters, and save it. Then, from the context menu, select Script Job as -> DROP and CREATE to -> New Query Editor Window. In the query text, locate the substring
/Par "\"$ServerOption::SYNCHRONIZED(Boolean)\"";True
and switch it to
/Par "\"$ServerOption::SYNCHRONIZED(Boolean)\"";False
Then run script updating your job.
This script manipulation is needed because, by default, a SQL Agent job executes the package synchronously and does not expose the async option in the user interface.
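An alternative to editing the scripted job is to start the package asynchronously yourself from a T-SQL job step via the catalog stored procedures. A sketch (folder, project and package names are placeholders):

DECLARE @execution_id BIGINT;

EXEC SSISDB.catalog.create_execution
     @folder_name  = N'MyFolder',       -- placeholder
     @project_name = N'MyProject',      -- placeholder
     @package_name = N'Wrapper.dtsx',   -- placeholder
     @execution_id = @execution_id OUTPUT;

-- SYNCHRONIZED = 0: the call returns immediately and the package runs asynchronously
EXEC SSISDB.catalog.set_execution_parameter_value
     @execution_id,
     @object_type     = 50,             -- system parameter
     @parameter_name  = N'SYNCHRONIZED',
     @parameter_value = 0;

EXEC SSISDB.catalog.start_execution @execution_id;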

My SSIS Package works, but fails as a SSMS job (Error: 0xC0016016)

I am posting this question and sharing my solution to this issue as my own question because I didn't see the posed problem match the specific issue I encountered and the answers seemed to be scattered in different forum questions.
Background:
I have four SSIS packages on SQL Server 2012 that import a table from SQL Server 2008 R2, 2008, or 2005, depending on the specific package. I use a designated SQL Server login and password for the source database and integrated Windows security for the target database.
Within SSIS I am able to run each package without a problem.
To ensure the packages ran on a schedule, I set up an SSMS job on the same server as the SSIS packages. In the job step properties, I chose: type "SQL Server Integration Services Package", run as "SQL Server Agent Service Account", package source "File system".
The Symptom:
When manually running the job to make sure it worked, I got an error and saw this in the Log File Viewer. It was chronologically the first of several errors, so I looked into this one first.
Error: 2014-10-24 09:52:34.48
Code: 0xC0016016
Source: [Redacted -- the correct name of the table I was importing]
Description: Failed to decrypt protected XML node "DTS:Password" with
error 0x8009000B "Key not valid for use in specified state.". You may
not be authorized to access this information. This error occurs when
there is a cryptographic error. Verify that the correct key is
available.
End Error
I looked up the error code on Google and started looking at resolutions. Rather than retelling how I found the bits and pieces of the resolution, I am presenting an actionable sequence of steps (at least for me and the network infrastructure I'm working with).
In the "properties" panel of the SSIS solution (do this first) and each package in that solution, reset the "ProtectionLevel" attribute to EncryptSensitiveWithPassword and set a password. The package passwords must match the solution password.
Just as a double-check, run your packages in debug mode to make sure that no other new issues arise. In my case, I needed to re-enter the SQL Server password for the source server database.
Rebuild your SSIS solution.
In SSMS, open your job and open the job step properties for the task in question.
Select the "Command Line" tab. A "Package Password" popup appears. Enter the password that you entered in step 1.
Select "Edit the command line manually" and place the same password from step 1 immediately after /DECRYPT.
Repeat steps 4-6 for each job step that runs this type of package.
Press the "OK" button and re-run your job.
I was able to run my job successfully after that.
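If you want to confirm that the /DECRYPT switch really ended up in the stored command line, you can inspect the job step from T-SQL; a sketch (the job name is a placeholder):

SELECT s.step_id,
       s.step_name,
       s.subsystem,
       s.command          -- should contain the /DECRYPT switch followed by the package password
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobsteps AS s
  ON s.job_id = j.job_id
WHERE j.name = N'My SSIS Import Job';   -- placeholder job name

Keep in mind that with CmdExec steps the package password appears in plain text in that command column, so access to job definitions should be restricted accordingly.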
