It seems SSIS has an unexposed value called execution_result in the SSISDB catalog, under internal.executable_statistics.
It's also referred to as DataCode in some places, but it's essentially recorded for each executable (so at the task or package level, etc.).
0 = success, 1 = failure, 2 = complete, 3 = canceled.
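For what it's worth, these codes appear to line up with the DTSExecResult enumeration in the SSIS runtime (Microsoft.SqlServer.Dts.Runtime). A tiny, hypothetical helper for translating a logged code back into a label might look like:

    // Hypothetical helper for custom logging: translates a stored code back
    // into a label. The codes correspond to the DTSExecResult enumeration.
    static class ExecResultCodes
    {
        public static string Describe(int code)
        {
            switch (code)
            {
                case 0: return "Success";
                case 1: return "Failure";
                case 2: return "Completion";
                case 3: return "Canceled";
                default: return "Unknown";
            }
        }
    }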
I'm rolling some custom logging as a simplified, understandable view of my ETL tasks in SSIS, to use in tandem with the more 'kitchen sink' logging in the SSIS catalog's 'internal' schema in SQL Server.
There are no system variables that seem to mimic/capture ExecutionResult for any particular executable. Is there any way I can capture this and write it to a table, or save it in a variable?
I've seen some people try to use it in a Script Task that had to reference the .dtsx file path, etc. That seems pretty complicated; does anyone know an elegant way?
Essentially, I want a custom-written logging level that amounts to: Package XYZ, blah, blah, blah, blah, executionResult: 1.
Again, I'm aware this is in executable_statistics, but not in the form I'd like, and I only have SQL Server 2012 currently, so I can't customize too much either.
In short, I want to know whether a package ultimately "succeeded" or "failed". Errors are on the right track, but not the same thing: a package (or container) can have an error and still succeed in the end.
I suppose I could jerry-rig two separate Execute SQL Tasks depending on whether a package/container "succeeded" or "failed" and go from there. Hmm.
ExecutionResult is available when using event handlers (specifically, in the OnExecStatusChanged event handler):
System Variables for Event Handlers
So, on each task you can add an OnExecStatusChanged event handler, and the @[System::ExecutionResult] variable will be available.
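For example, a Script Task dropped inside that event handler can read the variable and write it to a custom log table. This is just a minimal sketch: the dbo.CustomExecutionLog table and the User::LogConnectionString variable are names I made up, and both System::ExecutionResult and System::SourceName must be added to the task's ReadOnlyVariables list.

    // Script Task inside an OnExecStatusChanged event handler (SSIS 2012, C#).
    // Assumes System::ExecutionResult and System::SourceName are listed in
    // ReadOnlyVariables; dbo.CustomExecutionLog and User::LogConnectionString
    // are hypothetical names for this sketch.
    using System;
    using System.Data.SqlClient;

    public void Main()
    {
        // 0 = success, 1 = failure, 2 = complete, 3 = canceled
        int executionResult = Convert.ToInt32(Dts.Variables["System::ExecutionResult"].Value);
        string sourceName = Dts.Variables["System::SourceName"].Value.ToString();
        string connStr = Dts.Variables["User::LogConnectionString"].Value.ToString();

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(
            "INSERT INTO dbo.CustomExecutionLog (SourceName, ExecutionResult, LoggedAt) " +
            "VALUES (@source, @result, SYSDATETIME())", conn))
        {
            cmd.Parameters.AddWithValue("@source", sourceName);
            cmd.Parameters.AddWithValue("@result", executionResult);
            conn.Open();
            cmd.ExecuteNonQuery();
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }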
If you are new to event handlers, you can refer to the following articles:
SSIS Event Handlers Basics
Event Handlers in SSIS
Integration Services (SSIS) Event Handlers
Update 1
You can also benefit from the ExecValueVariable property of each task:
ExecutionValue and ExecValueVariable in SSIS
Have you used the ExecutionValue and ExecValueVariable properties?
I'm using the CQRS pattern with multiple databases (one for queries and another for search). Should I put the insert inside the repository, like this:
CommunityRepository {
    Add(Community community) {
        Database1.Insert(community);
        Database2.Insert(community);
    }
}
and then:
CommunityCommands {
    Handler(AddCommunityCommand community) {
        communityRepository.Add(community);
    }
}
or should I put this in the commands, like this:
CommunityCommands {
    Handler(AddCommunityCommand community) {
        db1.Insert(community);
        db2.Insert(community);
    }
}
or maybe something like this, using the main repository plus database2:
CommunityCommands {
    Handler(AddCommunityCommand community) {
        communityRepository.Add(community);
        db2.Insert(community);
    }
}
I would do neither of those options, as you'd basically be coupling the Command and Query implementations.
Instead, publish events from the Command side, like OrderPlacedEvent, and subscribe to them from the Query side. This not only allows you to separate the implementations of the Command and Query sides, but it also allows you to implement other side effects of the events without coupling the code of multiple features (e.g. "when an order is placed, send a confirmation email").
You can implement the pub/sub synchronously (in process) or asynchronously (with a messaging system). If you use messaging, note that you'll have to deal with eventual consistency (the read data is slightly behind the write data, but eventually it catches up).
Refreshing the Query Models should be handled in an offline operation. You should do something like this (a sketch follows the list):
process your Command logic (whatever it is)
right before your Command handler returns, send an Event to a message bus
then in a background service you can listen to those Events and update the Query side.
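Here is a minimal in-process sketch of that flow. All the names (OrderPlacedEvent, MessageBus, and so on) are illustrative assumptions, not any particular library's API:

    using System;
    using System.Collections.Generic;

    // Illustrative event and bus types -- assumptions for this sketch.
    public class OrderPlacedEvent
    {
        public Guid OrderId { get; set; }
    }

    public class MessageBus
    {
        private readonly List<Action<OrderPlacedEvent>> subscribers =
            new List<Action<OrderPlacedEvent>>();

        public void Subscribe(Action<OrderPlacedEvent> handler)
        {
            subscribers.Add(handler);
        }

        public void Publish(OrderPlacedEvent e)
        {
            // Synchronous, in-process dispatch; swap this for a messaging
            // system (and accept eventual consistency) to go asynchronous.
            foreach (var handler in subscribers) handler(e);
        }
    }

    public static class Demo
    {
        public static void Main()
        {
            var bus = new MessageBus();

            // Query side: refreshes its own read model when the event arrives.
            bus.Subscribe(e => Console.WriteLine("Update read DB for " + e.OrderId));

            // Another side effect, added without touching the command handler.
            bus.Subscribe(e => Console.WriteLine("Send confirmation email for " + e.OrderId));

            // Command side: after writing to the write DB, publish the event.
            bus.Publish(new OrderPlacedEvent { OrderId = Guid.NewGuid() });
        }
    }

With a real messaging system, Publish would enqueue the message instead of dispatching in-process, and the query side would catch up eventually.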
Bonus tip: you can use the Outbox pattern for more reliability. The idea is to store the event messages in a table in your write DB, in the same transaction as your previous write operation, instead of sending them directly. A background service then checks for "pending" messages and dispatches them. The rest is unchanged.
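A rough sketch of the idea, assuming a hypothetical dbo.Outbox(Payload, DispatchedAt) table on the write database:

    using System.Data.SqlClient;

    // Outbox sketch: the business write and the pending event commit in ONE
    // transaction. A background service polls dbo.Outbox for rows where
    // DispatchedAt IS NULL, publishes them, then marks them dispatched.
    // Table and column names here are assumptions for illustration.
    public class OutboxExample
    {
        public void SaveOrderWithEvent(string connectionString, string eventPayload)
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();
                using (var tx = conn.BeginTransaction())
                {
                    // 1) The normal write operation on the write database.
                    using (var write = new SqlCommand(
                        "INSERT INTO dbo.Orders (CreatedAt) VALUES (SYSDATETIME())",
                        conn, tx))
                    {
                        write.ExecuteNonQuery();
                    }

                    // 2) The event message, stored in the same transaction
                    //    instead of being sent directly.
                    using (var outbox = new SqlCommand(
                        "INSERT INTO dbo.Outbox (Payload, DispatchedAt) VALUES (@payload, NULL)",
                        conn, tx))
                    {
                        outbox.Parameters.AddWithValue("@payload", eventPayload);
                        outbox.ExecuteNonQuery();
                    }

                    tx.Commit(); // both rows commit, or both roll back
                }
            }
        }
    }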
I recently updated an old Delphi project to separate the creation of its data module, and the opening of that module's TDatabase, TTable, and TClientDataSet components, from the creation and showing of the app's form. Now the app can toggle its data components on and off. This new capability isn't critical, of course, but is "nice to have." My tool chain is RAD Studio v21 (marketing version 10.4.1).
Yet there's a problem: after closing and reopening the data components from the UI, data-aware controls show no data. Tests show that the underlying tables are open, can be navigated, and that field values can be retrieved programmatically.
I've also established that resetting a data-aware component's DataSource restores its display of the underlying field data.
NB: the data-aware controls' DataSource properties are set at design time and, at run time after closing and reopening the data tables, are not nil.
I conclude that resetting DataSource has a side effect -- probably tickling the DataLink within.
You can imagine I would like a way to obviate the need to reset a whole bunch of DataSources. The easy hopes -- TDataSet methods like Refresh -- all fail. ;-)
A sketch of the coding inside the app's form looks like this -- to open:
dm:=Tdm.Create(nil);
dm.OpenDatabase;
dm.OpenDataTables;
And closing is the reverse:
dm.CloseDataTables;
dm.CloseDatabase;
dm.Free;
dm:=nil;
In the data module Tdm: at design time, the datasets are not active and the database is not connected. Tdm.OpenDatabase invokes Open on the TDatabase; Tdm.OpenDataTables runs through the needed tables, calling their Open methods. Tdm.CloseDatabase and Tdm.CloseDataTables are symmetrical.
Thank you for any insights.
My project consists of creating multiple subdirectories and copying files to those subdirectories. I developed this part using a File System Task inside a Foreach Loop in SSIS.
The final part is inserting the status of the process into a SQL table. If the file was copied successfully, the Status column should be "Successful" and the reason, in another column, should be "File was copied successfully" or something like that.
Is error flow redirection (the red arrow) available for the File System Task or the Foreach Loop? I have read somewhere that in event handlers you can work with these status messages and insert them into SQL. Could someone please provide a solution or suggest one to solve this problem?
I would steer away from using event handlers. They are like hidden GOTOs: there is no indication in the control flow that they exist, and you have to go to another screen to see what they are doing.
It's much clearer to use the control flow to direct errors. Any arrow from any task or container can be double-clicked and configured. Change the constraint option to value=Failure to make the arrow go red.
I have a package with one container. Does the SSIS package fail if that container fails?
The property
FailPackageOnFailure
is false for the container.
Does that mean the package fails only if this property is set to TRUE? Otherwise only the container's status is "failed", and the package's status is not?
Yes. If the Sequence Container fails, the overall package will fail. Raise the MaximumAllowedErrors property of the Sequence Container to get the behavior you want.
Example
Below we have an example package. The Sequence Container has a task that will never succeed.
Above, the Sequence Container has failed and the Package has failed. Below are the properties of the container above. These are the default values for a new container.
Now let's stop and study. If we compare the package behavior against the property settings, this looks wrong: here we have set FailPackageOnFailure=False, yet a Sequence Container failure is causing a package failure. Why is this? Unintuitive attribute names. See this Microsoft Connect issue; you are not alone in your confusion. The official explanation from Microsoft is this:
Despite some pretty circular previous messages, we believe that the feature is behaving as designed. When you set FailParentOnFailure to false, the parent will not fail until the number of failures in the child exceeds the MaximumAllowedErrors threshold. When you set FailParentOnFailure to true, the parent will fail on the first occurrence of an error, regardless of the MaximumAllowedErrors threshold.
The important piece of information to take away from that quote is that FailPackageOnFailure and MaximumAllowedErrors work as a pair!
So, knowing this, we can achieve the expected behavior by raising the MaximumAllowedErrors count from 1 to 2.
This will allow you to have a Sequence Container that fails but does NOT fail the overall package.
Hope this helps!
It all depends on how the package and containers are set up. You have to open/import the package (in SQL Server Business Intelligence Development Studio) and run it, preferably on test data, to see which one fails. Do the two containers have interdependencies on each other?
I have a Gauge control in MainPage.xaml which needs three values (value, minimum, and maximum). I have written the logic to get these three values from the database in a stored procedure.
Please let me know how I can retrieve these values in a DomainService and bind the gauge control's properties to them.
Below I'll lay out my knowledge dump, as it took me some time to figure this out precisely, and I'd like to contribute it, per SO's documentation guidelines, for anyone other than us:
0) I'm going to assume you have your stored procedures all implemented and are utilizing RIA services (as they're simpler for Silverlight development)
1) In the .Web project, add a new EDM; you'll need to pull in all stored procedures manually via the import function. At this point the autogenerated code should be done for you; go ahead and build your project.
2) Time to add your domain service. Add a new Domain Service; note that this won't automatically generate code to pull in your stored procedures.
3) Next, visit leeontech (http://leeontech.wordpress.com/2010/05/24/ria-services-and-storedprocedures/) for some manual coding based on your stored procedure definitions.
What you're doing here is exposing data from stored procedures.
4) Start using your newly created classes
5) In Silverlight, when you use the Gauge on the front end, make sure you assign the maximum first, using Math.Max(), and similarly the minimum using Math.Min(), in the load-completed event handler. This way you are guaranteed that the asynchronous request has completed and you have values available. It's actually pretty easy to debug that event handler in Visual Studio.
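To make steps 2-5 concrete, here is a hedged sketch of what the domain service and the client call might look like. MyEntities, GetGaugeSettings, GetGaugeSettings_Result, GaugeDomainContext, and the Gauge* column names are all assumptions based on the steps above, and note that RIA Services requires the stored procedure's complex type to have a [Key]-annotated member before it can be exposed:

    using System.Linq;
    using System.ServiceModel.DomainServices.EntityFramework;
    using System.ServiceModel.DomainServices.Hosting;

    // Hypothetical domain service exposing the imported stored procedure.
    [EnableClientAccess]
    public class GaugeDomainService : LinqToEntitiesDomainService<MyEntities>
    {
        public IQueryable<GetGaugeSettings_Result> GetGaugeSettings()
        {
            // GetGaugeSettings() is the function import created in step 1.
            return ObjectContext.GetGaugeSettings().AsQueryable();
        }
    }

And on the Silverlight side, in the load callback (this is where step 5's ordering applies; gauge is the Gauge control instance):

    var ctx = new GaugeDomainContext();
    ctx.Load(ctx.GetGaugeSettingsQuery(), op =>
    {
        if (op.HasError || !op.Entities.Any()) return;
        var row = op.Entities.First();
        gauge.Maximum = row.GaugeMax; // assign the max first, per step 5
        gauge.Minimum = row.GaugeMin;
        gauge.Value   = row.GaugeValue;
    }, null);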
PS: The Visual Studio tooling can recognize and allow you to create complex types based on the stored procedures you implemented in step 1. As a personal best practice, I like to name the returned columns as uniquely as I possibly can, and then use those names the same way in client code.
In the stored procedure's final SELECT, I'll do something like
SELECT actualValueInDatabase as clientSideDataTypeIWant...
if that makes sense.
If any of the above isn't clear, please let me know and I'll try to update with more information.
Good luck!