We currently have a process in which a shell script calls SQLCMD and writes the output of a stored procedure to a text log file. The stored procedure does multiple UPDATEs, INSERTs, and SELECTs, and we capture all the messages and results in the text file, partly so that a SELECT statement can show the table both before and after it is updated. Now we are converting to SSIS and would like to keep capturing both the results and the messages in a text file.
I have two questions. Is there a way to do this in SSIS without calling SQLCMD, perhaps using an Execute SQL Task or a Data Flow Task? If not, what is the best practice for capturing changes? (I see that Change Data Capture requires Enterprise Edition, so that doesn't work for us.)
Edit (more explanation):
I have a stored procedure that does 10 updates in a row. Before each update I want to see what the table looks like for that specific update query, by selecting the data out of it with the same parameters the update uses. Each update does something different, and one may do something to a record that I did not expect; this would let me pinpoint the exact problem. The best idea suggested so far is triggers: they may be slow, but they can be set up to capture the changes I need, as sketched below.
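As an illustration, here is a minimal sketch of such a trigger in T-SQL. All the object and column names (dbo.MyTable, Col1, dbo.MyTable_Audit) are hypothetical placeholders; the point is that the deleted and inserted pseudo-tables give you the before and after image of every row each UPDATE touches, which an SSIS data flow could later export to a text file:

    -- Hypothetical audit table; adjust the columns to match the table being updated.
    CREATE TABLE dbo.MyTable_Audit
    (
        AuditId   INT IDENTITY(1,1) PRIMARY KEY,
        KeyId     INT,
        OldValue  VARCHAR(100),
        NewValue  VARCHAR(100),
        ChangedAt DATETIME DEFAULT GETDATE()
    );
    GO

    -- Fires once per UPDATE statement and snapshots old vs. new values.
    CREATE TRIGGER trg_MyTable_Audit
    ON dbo.MyTable
    AFTER UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;
        INSERT INTO dbo.MyTable_Audit (KeyId, OldValue, NewValue)
        SELECT d.KeyId, d.Col1, i.Col1
        FROM deleted d
        JOIN inserted i ON i.KeyId = d.KeyId;
    END;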
Related
I am trying to execute (call) a SQL Server stored procedure from Informatica Developer. I created a mapping (new mapping from SQL Query) and am trying to pass it runtime variables from the previous mapping task in order to log these to a SQL Server table (the stored procedure does an INSERT). It generated the following T-SQL query:
?RETURN_VALUE? = call usp_TempTestInsertINFARunTimeParams (?Workflow_Name?, ?Instance_Id?, ?StartTime?, ?EndTime?, ?SourceRows?, ?TargetRows?)
However, it does not validate; the validation log states 'the mapping must have a source' and '... must have a target'. I have a feeling I'm doing this completely wrong. And: this is not PowerCenter (no sessions, as far as I can tell).
Any help is appreciated! Thanks
Now with the comments I can confirm and answer your question:
Yes, Source and Target transformations in Informatica are mandatory elements of a mapping. It will not be a valid mapping without them. Let me try to explain a bit more.
The whole concept of an ETL tool is to Extract data from the Source, do all the needed Transformations outside the database, and Load the data into the required Target. It is possible - and quite often necessary - to invoke stored procedures before or after the data load, or sometimes even to use existing stored procedures as part of the data load. From the ETL perspective, however, that is an additional feature. An ETL tool - Informatica being a perfect example - is not meant to be a tool for invoking SPs. This reminds me of the question every T-SQL developer asks with his first PL/SQL query: what in the world is this DUAL? Why do I need 'from dual' if I just want to do some calculation like SELECT 123*456? That is the theory.
Now, in the real world it happens quite often that you NEED to invoke a stored procedure - and that it is the ONLY thing you need to do. Then you do use the DUAL ;) In the PowerCenter world this means you use DUAL as the Source (or actually any table you know exists in the source system), put 1=2 in the Source Filter property (or add a Filter Transformation to the mapping with FALSE as the condition), and link just one port to the target. Next, you put the stored procedure call in the Pre- or Post-SQL property of your source or target - depending on where you actually want it to run.
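For illustration only, here is roughly what that setup might look like; the literal values below are placeholders, and how you feed real runtime values in (e.g. via mapping parameters) depends on your setup:

    -- Sketch of a Pre- or Post-SQL property value (values are hypothetical):
    EXEC usp_TempTestInsertINFARunTimeParams
        'wf_LoadSales', 42, '2016-01-01 00:00:00', '2016-01-01 00:05:00', 1000, 1000;

    -- And the Source Filter property that keeps the mapping from moving real data:
    1 = 2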
Odd? Well - the odd part is that you want to use the ETL tool as a trigger, not as an ETL tool ;)
I am trying to completely automate this process, and I'm wondering if it's viable or efficient to do in VBA.
The report process involves two files: one SQL file and one Excel file.
The SQL file has the algorithm, and the final step is a query whose result is then pasted into the Excel file.
The algorithm is simpler (than what the audience might be used to) but has two SELECT ... INTO statements and several UPDATE statements.
Of the two SELECT ... INTO statements, the first grabs a small portion (constrained to the first and last day of the previous month) of a 500M+ record table. The second joins the first table with an eligibility-type table.
After the second table is created, there is a series of UPDATE statements that change existing data in existing columns.
Then comes a series of ALTER and UPDATE statements that add new columns to the [second] table and update them with the desired data.
The final step is a query whose results are copy-pasted into Excel (as is, no formatting changes necessary).
I'm not too well-versed in VBA/VB.NET or in T-SQL stored procedures and dynamic SQL. If the SQL were a simple pull query with no table creation, I could build something to automate it; but this SQL has two table creations and about a dozen ALTER and UPDATE statements.
Am I stirring up the wrong nest? Should I run it manually as is?
You can definitely automate this. I created a report that ran two stored procedures and built numerous queries with temp tables, including both UPDATE and ALTER statements, then used VBA to execute these and aggregate the data in the final summary sheet.
There is a ton of documentation out there. You can even pass your values to the stored procedure after the user inputs them.
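One way to keep the VBA side simple is to wrap the whole SQL file in a single stored procedure, so the workbook only makes one call and reads back one result set. A rough sketch, with hypothetical object names and a date range standing in for the first/last day of the previous month:

    CREATE PROCEDURE dbo.usp_MonthlyReport  -- hypothetical name
        @StartDate DATE,
        @EndDate   DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- First SELECT ... INTO: last month's slice of the 500M+ row table.
        SELECT *
        INTO #Slice
        FROM dbo.BigTable
        WHERE RecordDate BETWEEN @StartDate AND @EndDate;

        -- Second SELECT ... INTO: join the slice to the eligibility table.
        SELECT s.*, e.EligibilityType
        INTO #Report
        FROM #Slice s
        JOIN dbo.Eligibility e ON e.MemberId = s.MemberId;

        -- The series of UPDATE / ALTER steps would go here, e.g.:
        -- ALTER TABLE #Report ADD NewCol VARCHAR(50);
        -- UPDATE #Report SET NewCol = ...;

        -- Final step: the result set VBA copies into Excel.
        SELECT * FROM #Report;
    END;

From VBA, an ADO connection can then run EXEC dbo.usp_MonthlyReport with the two dates and paste the returned recordset straight into a worksheet with Range.CopyFromRecordset.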
I would add this as a comment but I do not have enough reputation to comment yet (need 50).
I am currently working with a stored procedure that performs some background processes and then returns one of two result tables.
If it works OK, I get a one-column table that says success; if it doesn't, I get a four-column table with various error data.
While this is fine if you just execute the code from .NET, I now need to execute it from within another stored procedure. I don't need the output, but I do need the background processes to take place. I'd usually insert the output into a table, but I can't in this case because the columns in the output vary depending on the result, so I cannot define a table for it to insert into.
The easiest answer would be to rewrite the outputs of the background SP to be consistent, but this isn't an option. I've even tried wrapping it inside a UDF, but a stored procedure can't be called from within a function.
Whatever solution I finally use it must work on versions from SQL Server 2008 R2 up to 2016.
Does anybody have any suggestions?
Many thanks,
Mat.
I would imagine you could create an SP that inserts the result of the inner SP into a temporary table using the hack below.
Insert results of a stored procedure into a temporary table
If that blocks the output, then you can return no data.
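A minimal sketch of that hack, assuming ad hoc distributed queries are enabled and using a hypothetical inner proc name; because SELECT ... INTO builds the temp table from whatever comes back, you never have to predeclare the columns:

    -- One-time setup (requires 'show advanced options'):
    -- sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;

    -- The connection string and proc name below are placeholders.
    SELECT *
    INTO #Discard
    FROM OPENROWSET('SQLNCLI',
                    'Server=(local);Trusted_Connection=yes;',
                    'EXEC dbo.usp_InnerProc') AS r;

    -- The background work has run; the captured rows can simply be dropped.
    DROP TABLE #Discard;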
I have an SSIS package with a data flow task. The OLE DB source has an execute-proc statement. It fails while saving, with the error message below:
An OLE DB record is available... The metadata could not be determined because the statement 'select appname....' in procedure is not compatible with the statement 'select appid....' in procedure
This proc has several SELECT statements and returns the appropriate result set depending on the parameters passed. Any pointers to bypass this error?
So you're saying that the SP will return different metadata depending on the parameter passed? SSIS doesn't like this - it can't update the metadata dynamically at run time. I.e., if you create a package that splits or sorts on a certain column, then you run the SP and it doesn't return that column, or the same column comes back as a different data type, what should SSIS do? It can't automatically work that out.
I suggest you create a data source for each possible result set and conditionally execute each one as required.
In short, SPs that optionally return different result sets are often not a good idea - definitely not from an ETL perspective.
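On SQL Server 2012 and later, each of those per-branch data sources can also pin its expected shape explicitly with WITH RESULT SETS, so SSIS can read the metadata without guessing. A sketch with made-up column names:

    -- SQL command text for one OLE DB source (column list is illustrative):
    EXEC dbo.MyProc @Param = ?
    WITH RESULT SETS
    (
        (appid   INT,
         appname NVARCHAR(128))
    );

Note that this enforces the declared shape at run time, so each source still has to be matched to the branch that actually produces it.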
Here is some code that shows how to create dynamically built output (you could use the same method with just one output), but you'll still face the same problems downstream.
http://www.codeproject.com/Articles/32151/How-to-Use-a-Multi-Result-Set-Stored-Procedure-in
I ran into this issue as well. In my case, the result returned looked identical no matter which branch was executed; the difference was just in how that result was obtained (including different source tables). I simply executed all the cases with a UNION, and each WHERE clause included the conditions for its execution, instead of using IF logic to choose a query.
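A sketch of that restructuring, with hypothetical tables and @Source standing in for the parameter that used to drive the IF logic; every branch contributes the same column list, so the metadata stays stable:

    -- Before: IF @Source = 'A' SELECT ... FROM dbo.TableA ELSE SELECT ... FROM dbo.TableB
    -- After: one statement, one shape, with the branch conditions pushed into WHERE.
    SELECT appid, appname
    FROM dbo.TableA
    WHERE @Source = 'A'

    UNION ALL

    SELECT appid, appname
    FROM dbo.TableB
    WHERE @Source = 'B';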
Recently I noticed that a stored proc we are trying to profile failed to appear in the profiling output.
After adding the SP:StmtStarting and SP:StmtCompleted events, I noticed the TextData reported as
-- Encrypted text
... but the stored procedure is not encrypted.
This has only recently started happening - we used to profile this SP perfectly fine, and I can't figure out what has changed.
Any suggestions would be gratefully received.
UPDATE: The SP is definitely not encrypted. I've created new SPs on the box, and I see the SP:BatchStarting event with the new SPs' names. With the old SP, I don't see the BatchStarting event, but I do see the statements within the SP executing.
However, I need to see the values of the parameters the SP is being called with, as they are table types. Originally I could see the table types being instantiated and populated before the SP was called.
So I figured this out in case anyone finds it useful.
This stored procedure takes table-type parameters, and one of them is passed a lot of data (a C# DataTable with more than 5,000 rows). Without that quantity of data, the stored proc profiled fine.
I guess there must be some cut-off beyond which Profiler no longer shows all of the data being passed in.
Someone has altered the stored procedure and added the WITH ENCRYPTION option, which causes exactly this behavior. Alter the stored procedure, remove that option, and you'll start seeing the text of the proc again.
Also note: if you don't have the original code, you will not be able to decrypt the text of the proc to issue the ALTER statement, so hopefully you have that handy.
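If encryption is the suspect, a quick way to check (and to undo it, provided you still have the source) looks roughly like this; the proc name is a placeholder:

    -- Returns 1 if the module text is encrypted, 0 otherwise.
    SELECT OBJECTPROPERTY(OBJECT_ID('dbo.usp_MyProc'), 'IsEncrypted') AS IsEncrypted;

    -- If you have the original source, redeploy it without the option:
    ALTER PROCEDURE dbo.usp_MyProc
    -- WITH ENCRYPTION   -- remove this line
    AS
    BEGIN
        SET NOCOUNT ON;
        -- original body here
    END;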
Here's a decent run down of this option: Options for hiding SQL Server code
Switching the trace properties from the default of OnlySP(<your database here>)(user,default) to the TSQL or TSQL_Replay template unveiled the SQL being used for me, ... Go to File | Properties... and change the [Use the template:] drop-down combobox.