SQL DMO-based code generator (SQL Server)

I already have a code generator, based on SQL DMO, that writes a C# function for any stored procedure in my SQL Server 2008 database. Currently, however, the code generator can only handle stored procedures that have input and output parameters. For stored procedures that return multiple records, the code generator should return a DataTable whose rows each represent an output record.
Is there a way, using SQL DMO, to determine the fields that would be returned by a stored procedure whose output comes from, say, select * from Member where MemberID=1?
Thanks

My guess is that you cannot do this in DMO, since DMO relies on meta information stored in SQL Server, and the result set description of an SP is not stored in that way (as far as I know).
What you can do, however, is to have your generator execute the stored procedure inside a transaction, and analyze the resulting SqlDataReader. Have a look at the GetName(), GetFieldType() and GetSchemaTable() methods to construct your result class.
After execution, rollback the transaction (in case the SP makes any changes to the database).
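A minimal sketch of that approach, assuming an open SqlConnection and a hypothetical procedure dbo.GetMember; everything runs inside a transaction that is rolled back at the end:

    using System;
    using System.Data;
    using System.Data.SqlClient;

    // Sketch: execute the SP inside a transaction, read the result-set
    // schema from the SqlDataReader, then roll back so any data changes
    // the SP makes are discarded. "dbo.GetMember" and "@MemberID" are
    // hypothetical; substitute the procedure you are generating code for.
    static void DescribeResultSet(SqlConnection conn)
    {
        using (SqlTransaction tx = conn.BeginTransaction())
        using (SqlCommand cmd = new SqlCommand("dbo.GetMember", conn, tx))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@MemberID", 1);

            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                // GetName()/GetFieldType() give you enough to emit one
                // property per column; GetSchemaTable() adds nullability,
                // size, etc. Use reader.NextResult() for multi-set SPs.
                for (int i = 0; i < reader.FieldCount; i++)
                {
                    Console.WriteLine("{0} : {1}",
                        reader.GetName(i), reader.GetFieldType(i));
                }
            }

            tx.Rollback();  // undo any side effects of the SP
        }
    }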
You also might consider upgrading your generator to SMO, as MSDN states that DMO will not be supported in the future.

Related

Number of lines of code per dataset in Report Builder 3.0

How many lines of code does Report Builder 3.0 handle per dataset? Is it 757?
It seems I cannot add more lines of code; what is the workaround?
Instead of keeping the SQL in the dataset, convert it to a stored procedure, then use the Stored Procedure option for the query type; see the documentation for more details. Using a stored procedure brings additional benefits, such as query plan reuse, an extra layer of security, and efficient reuse of the same stored procedure by other reports that use an identical SQL query.
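As a hedged illustration (all names and parameters here are made up), moving the dataset's SQL into a procedure looks like this:

    -- Hypothetical example: the long query that used to live in the
    -- dataset moves into a procedure, and the Report Builder dataset is
    -- pointed at it via the Stored Procedure query type. Report
    -- parameters map onto the procedure's parameters.
    CREATE PROCEDURE dbo.rpt_SalesSummary
        @StartDate date,
        @EndDate   date
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT o.OrderID, o.OrderDate, o.TotalDue   -- the former dataset SQL
        FROM dbo.Orders AS o
        WHERE o.OrderDate >= @StartDate
          AND o.OrderDate <  @EndDate;
    END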

How do I execute a SQL Server stored procedure from Informatica Developer (10.1, not PowerCenter)

I am trying to execute (call) a SQL Server stored procedure from Infa Developer. I created a mapping (new mapping from SQL Query) and am trying to pass it runtime variables from the previous mapping task in order to log these to a SQL Server table (the stored procedure does an INSERT). It generated the following T-SQL query:
?RETURN_VALUE? = call usp_TempTestInsertINFARunTimeParams (?Workflow_Name?, ?Instance_Id?, ?StartTime?, ?EndTime?, ?SourceRows?, ?TargetRows?)
However, it does not validate; the validation log states 'the mapping must have a source' and '... must have a target'. I have a feeling I'm doing this completely wrong. And: this is not PowerCenter (no sessions, as far as I can tell).
Any help is appreciated! Thanks
Now with the comments I can confirm and answer your question:
Yes, Source and Target transformations in Informatica are mandatory elements of the mapping. It will not be a valid mapping without them. Let me try to explain a bit more.
The whole concept of an ETL tool is to Extract data from the Source, do all the needed Transformations outside the database, and Load the data to the required Target. It is possible - and quite often necessary - to invoke Stored Procedures before or after the data load. Sometimes you even use existing Stored Procedures as part of the data load. However, from the ETL perspective, this is an additional feature. An ETL tool - Informatica being a perfect example - is not meant to be a tool for invoking SPs. This reminds me of a question every T-SQL developer asks with their first PL/SQL query: what in the world is this DUAL? Why do I need 'from dual' if I just want to do some calculation like SELECT 123*456? That is the theory.
Now in the real world it happens quite often that you NEED to invoke a stored procedure, and that it is the ONLY thing you need to do. Then you do use the DUAL ;) Which in the PowerCenter world means you use DUAL as the Source (or actually any table you know exists in the source system), put 1=2 in the Source Filter property (or put a Filter Transformation in the mapping with FALSE as the condition), and link just one port to the target. Next, you put the Stored Procedure call as the Pre- or Post-SQL property on your source or target - depending on where you actually want to run it.
Odd? Well - the odd part is where you want to use the ETL tool as a trigger, not the ETL tool ;)
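For reference, the Pre- or Post-SQL value itself is plain T-SQL; a sketch using the procedure from the question (the literal values are placeholders for whatever parameter mechanism your setup supports):

    -- Sketch of a Post-SQL call; substitute real values or parameter
    -- references for the placeholder literals below.
    EXEC dbo.usp_TempTestInsertINFARunTimeParams
        'wf_ExampleWorkflow',          -- Workflow_Name
        42,                            -- Instance_Id
        '2016-01-01 00:00:00',         -- StartTime
        '2016-01-01 00:05:00',         -- EndTime
        100,                           -- SourceRows
        100;                           -- TargetRows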

Suppressing output from a stored procedure in SQL Server 2008 R2 and newer

I am currently working with a stored procedure that performs some background processes and then returns one of two result tables.
If it works OK, I get a one-column table that says success; if it doesn't, I get a four-column table with various error data.
While this is fine if you just execute the code from .NET, I now need to execute it from within another stored procedure. While I don't need the output, I do need the background processes to take place. I'd usually insert the output into a table, but I can't in this case because the columns in the output vary depending on the result, so I cannot define a table for it to insert into.
The easiest answer would be to rewrite the outputs of the background SP to be consistent, but this isn't an option. I've even tried wrapping it inside a UDF, but a stored procedure can't be called from within a function.
Whatever solution I finally use it must work on versions from SQL Server 2008 R2 up to 2016.
Does anybody have any suggestions?
Many thanks,
Mat.
I would imagine you could create an SP that inserts the result of the inner SP into a temporary table using the hack linked below.
Insert results of a stored procedure into a temporary table
If that captures the output, your wrapper can then return no data.
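One variant of that hack uses OPENROWSET, which avoids declaring the columns up front because SELECT ... INTO infers them; a rough sketch, assuming a local server, a placeholder procedure name, and 'Ad Hoc Distributed Queries' enabled:

    -- Rough sketch: capture whatever the inner SP returns without
    -- predeclaring a column list, then simply drop it. The connection
    -- string and procedure name are placeholders. Metadata discovery
    -- can be finicky (e.g. for procedures that build temp tables), and
    -- since the inner SP returns two different shapes, test both the
    -- success and the error path.
    SELECT *
    INTO #discard
    FROM OPENROWSET(
        'SQLNCLI',
        'Server=(local);Trusted_Connection=yes;',
        'EXEC dbo.InnerBackgroundProc') AS result;

    DROP TABLE #discard;   -- the wrapper itself returns no data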

SQL Server - compare the results of two stored procedures that output multiple tables

So, similar to "SQL Server compare results of two queries that should be identical", I need to compare the output of two stored procedures to ensure the new version is generating equivalent output to the old version. The tricky part is that my SP outputs six tables of differing widths.
I started writing a hybrid version of them that would compare each of the tables individually, but it's a pretty complex SP, so I was hoping there was an easier way.
I tried using EXCEPT as in the linked question, but it looks like that will only compare one table to one other table.
Easy option 1: Output the stored procedure results to a text file (one per procedure version) and use a diff tool/editor to make sure they are the same.
Easy option 2: Write the stored procedure results to a table/temp table (one per return table per procedure) and write SQL to compare the results. Just count the rows in each result table and then count the union (not union all) of both tables; repeat for each result table, as in the sketch below.
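A sketch of the count check for one pair of result tables (the temp table names are hypothetical); note it assumes neither table contains duplicate rows, since UNION would collapse them:

    -- #old_rs1 and #new_rs1 hold result set 1 from the old and new
    -- procedure versions. If all three counts below are equal, the two
    -- tables contain the same rows.
    SELECT COUNT(*) AS old_rows FROM #old_rs1;
    SELECT COUNT(*) AS new_rows FROM #new_rs1;
    SELECT COUNT(*) AS union_rows
    FROM (SELECT * FROM #old_rs1
          UNION                 -- UNION (not UNION ALL) removes duplicates
          SELECT * FROM #new_rs1) AS combined;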
You can capture multiple result sets in .NET (C# or VB) quite easily. You can create a DataAdapter and DataSet, and use the DataAdapter.Fill() method to populate the DataSet. Each result set will be stored as a DataTable within that DataSet. Then you just need to loop through the DataTables collection in each DataSet and compare them. You can find more info on this MSDN page: Populating a DataSet from a DataAdapter
This can be done in SQLCLR if you want to run it as a stored procedure or user-defined function, OR it can be a stand-alone console application. Running it as a SQLCLR stored procedure is quite convenient, but given that you will be storing all results for all 6 result sets, and for both stored procedures that you are testing, that might require too much memory. In that case, the console app is the way to go.
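A minimal sketch of the Fill() approach; the connection string and procedure names are placeholders:

    using System.Data;
    using System.Data.SqlClient;

    // Sketch: load every result set of one procedure into a DataSet.
    // Each result set lands in its own DataTable (Table, Table1, ...).
    static DataSet LoadAllResultSets(string connectionString, string procName)
    {
        using (var da = new SqlDataAdapter(procName, connectionString))
        {
            da.SelectCommand.CommandType = CommandType.StoredProcedure;
            var ds = new DataSet();
            da.Fill(ds);
            return ds;
        }
    }

    // Usage: fill one DataSet per procedure version, then walk the
    // Tables collections in parallel and compare schemas and rows.
    // var oldSet = LoadAllResultSets(connStr, "dbo.MyProc_Old");
    // var newSet = LoadAllResultSets(connStr, "dbo.MyProc_New");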
The only thing I can think of is to add an additional parameter to both of your (new/old) stored procedures to control which result set is returned, like:
EXEC usp_proc @var1, @var2, @ResultSet = 1
The above execution should return the first result set; if you pass @ResultSet = 2 it should return the second result set, and so on.
Do this with both stored procedures and then compare the result sets pair by pair (using EXCEPT will do the trick).
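A hedged sketch of what that parameter could look like inside the procedure, and of the EXCEPT comparison (all names and the placeholder queries are made up):

    -- Both procedure versions take @ResultSet and return only the
    -- requested result set, so each set can be captured and compared
    -- in isolation.
    CREATE PROCEDURE dbo.usp_proc
        @var1 int,
        @var2 int,
        @ResultSet int = 1
    AS
    BEGIN
        IF @ResultSet = 1
            SELECT 'placeholder' AS col1;   -- original query for set 1
        ELSE IF @ResultSet = 2
            SELECT 'placeholder' AS col1;   -- original query for set 2
        -- ...and so on for the remaining sets
    END

    -- After loading one set from each version into temp tables:
    -- SELECT * FROM #old_set EXCEPT SELECT * FROM #new_set;  -- should be empty
    -- SELECT * FROM #new_set EXCEPT SELECT * FROM #old_set;  -- should be empty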

Architecture for crystal report

All,
I need to decide on an architecture for a Crystal Reports solution and have two options:
1. A stored procedure with the business logic, displaying its data on the Crystal Report (tightly coupled). Such SPs are designed specifically for the reports, so they are less reusable, but they are precompiled.
2. Views to pull the data, with the business logic added on the report itself to filter the data (loosely coupled). The views are reusable, but what about performance compared to SPs?
Any suggestions are most welcome...
If I understand your question correctly, I would recommend implementing option 1.
By calling a stored procedure, you will be reducing network traffic because you'd be passing only parameter definitions and the procedure name, instead of the entire query string you'd be sending to the DB in option 2.
Using stored procedures also keeps the plan cache tidy by compiling the set of SQL statements within the stored procedure, instead of storing separate plans for each statement within the string you'd be passing to the DB in option 2.
