All,
Architecture for a Crystal Report.
I have two options:
1. Stored procedure with the business logic, displaying the data on the Crystal Report (tightly coupled).
As the SPs are designed specifically for the reports they are less reusable, but they are precompiled.
2. Views to pull the data, with the business logic added on the report itself to filter the data (loosely coupled).
The views are reusable, but what about performance compared to SPs?
Any suggestions are most welcome.
If I understand your question correctly, I would recommend implementing option 1.
By calling a stored procedure, you will be reducing network traffic because you'd be passing only parameter definitions and the procedure name, instead of the entire query string you'd be sending to the DB in option 2.
Using stored procedures also keeps the plan cache tidy by compiling the set of SQL statements within the stored procedure, instead of storing separate plans for each statement within the string you'd be passing to the DB in option 2.
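As a rough illustration (the procedure, table, and column names below are made up for the example), a report-specific procedure keeps the business logic on the database side, and the report sends only the procedure name and parameter values rather than the full query text.
-- Hypothetical report procedure for option 1.
CREATE PROCEDURE dbo.rpt_SalesByRegion
    @RegionId INT,
    @FromDate DATE
AS
BEGIN
    SET NOCOUNT ON;

    -- The business logic lives here, so one cached plan is reused on every call.
    SELECT s.OrderId,
           s.OrderDate,
           s.Amount
    FROM   dbo.Sales AS s
    WHERE  s.RegionId  = @RegionId
      AND  s.OrderDate >= @FromDate;
END;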
I have an SSRS report with 20 different datasets, each with some calculated columns.
I want to take a few fields from all of the datasets, including some calculated columns, and insert them into a SQL table.
I want to do this each month so that I can see the trends over a period. Is there any way to do that without editing the datasets?
Can I reference the fields I need by referring to Textbox4 and insert them into a SQL table? What is the easiest way to do this without touching the datasets?
There is most likely a far better solution than using SSRS to update a SQL database. I am not proposing this as the best solution, but rather as a way to achieve what was asked.
You could create a dataset that runs a stored procedure you pass the data to as parameters. The stored procedure would do the insert into your chosen table, and you can pass in the parameters from your original dataset however you see fit. You could even set up a second report with the stored procedure dataset and call it on demand by putting an action event on an item that calls the report. (I had a subreport embedded in a column of a tablix, configured so it would only update with values from that row, for instance.)
To clarify:
Create a subreport that accepts the data you want to insert as a parameter for each column.
Instead of adding a normal dataset, have it call a stored procedure that performs the insert you require.
Add the subreport to your main report so it is called once the main report runs, and configure the required parameters to be passed through.
There will be better, more efficient, cleaner ways to do this, but I found the above to work for my purposes since I was limited by time and resources. But I would still recommend you seek other solutions if possible.
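To make that concrete, the dataset behind the subreport could call something like the procedure sketched below; the table and parameter names here are invented for the example, and each subreport parameter maps to one procedure parameter.
-- Hypothetical insert procedure for the subreport's dataset.
-- dbo.ReportSnapshot is an assumed table for storing the monthly values.
CREATE PROCEDURE dbo.usp_InsertReportSnapshot
    @ReportMonth DATE,
    @MetricName  VARCHAR(100),
    @MetricValue DECIMAL(18, 2)
AS
BEGIN
    SET NOCOUNT ON;

    -- One row per subreport call, so trends can be queried month over month.
    INSERT INTO dbo.ReportSnapshot (ReportMonth, MetricName, MetricValue)
    VALUES (@ReportMonth, @MetricName, @MetricValue);
END;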
How many lines of code does Report Builder 3.0 handle per dataset? Is it 757?
It seems I cannot add more lines of code. What is the workaround?
Instead of keeping the SQL in the dataset, convert it to a stored procedure, then use the Stored Procedure option for the query type. See the documentation for more details on this. Using a stored procedure will bring additional benefits, such as query plan reuse, an extra layer of security, and efficient reuse of the same stored procedure for other reports that would otherwise use an identical SQL query.
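As a rough sketch (the procedure, table, and parameter names below are placeholders), the existing dataset SQL is simply wrapped in a procedure and the report parameters become procedure parameters; the dataset's query type is then switched to Stored Procedure.
-- Placeholder procedure wrapping the long dataset query.
CREATE PROCEDURE dbo.rpt_LargeDatasetQuery
    @StartDate DATE,
    @EndDate   DATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Paste the existing dataset SQL here; only the report parameters
    -- need to be declared as procedure parameters above.
    SELECT o.OrderId,
           o.CustomerId,
           o.OrderDate,
           o.TotalDue
    FROM   dbo.Orders AS o
    WHERE  o.OrderDate BETWEEN @StartDate AND @EndDate;
END;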
I am trying to execute (call) a SQL Server stored procedure from Infa Developer. I created a mapping (a new mapping from SQL Query). I am trying to pass it runtime variables from the previous mapping task in order to log these to a SQL Server table (the stored procedure does an INSERT). It generated the following T-SQL query:
?RETURN_VALUE? = call usp_TempTestInsertINFARunTimeParams (?Workflow_Name?, ?Instance_Id?, ?StartTime?, ?EndTime?, ?SourceRows?, ?TargetRows?)
However, it does not validate; the validation log states 'the mapping must have a source' and '... must have a target'. I have a feeling I'm doing this completely wrong. And: this is not PowerCenter (no sessions, as far as I can tell).
Any help is appreciated! Thanks
Now with the comments I can confirm and answer your question:
Yes, Source and Target transformations in Informatica are mandatory elements of the mapping. It will not be a valid mapping without them. Let me try to explain a bit more.
The whole concept of an ETL tool is to Extract data from the Source, do all the needed Transformations outside the database, and Load the data to the required Target. It is possible - and quite often necessary - to invoke stored procedures before or after the data load, and sometimes even to use existing stored procedures as part of the data load. From the ETL perspective, however, this is an additional feature. An ETL tool - Informatica being a perfect example - is not meant to be a tool for invoking SPs. This reminds me of a question any T-SQL developer asks with his first PL/SQL query: what in the world is this DUAL? Why do I need 'from dual' if I just want to do some calculation like SELECT 123*456? That is the theory.
Now, in the real world, it happens quite often that you NEED to invoke a stored procedure and that this is the ONLY thing you need to do. Then you do use the DUAL ;) In the PowerCenter world this means you use DUAL as the Source (or actually any table you know exists in the source system), put 1=2 in the Source Filter property (or put a Filter transformation in the mapping with FALSE as the condition), and link just one port to the target. Next, you put the stored procedure call as the Pre- or Post-SQL property on your source or target - depending on where you actually want to run it.
Odd? Well - the odd part is wanting to use the ETL tool as a trigger, not as an ETL tool ;)
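On the SQL Server side, the logging procedure named in the question could look something like the sketch below; the log table name dbo.INFARunTimeLog and the parameter types are assumptions made for the example.
-- Sketch of the logging procedure from the question; dbo.INFARunTimeLog is an assumed table.
CREATE PROCEDURE dbo.usp_TempTestInsertINFARunTimeParams
    @Workflow_Name VARCHAR(200),
    @Instance_Id   INT,
    @StartTime     DATETIME,
    @EndTime       DATETIME,
    @SourceRows    INT,
    @TargetRows    INT
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.INFARunTimeLog
        (Workflow_Name, Instance_Id, StartTime, EndTime, SourceRows, TargetRows)
    VALUES
        (@Workflow_Name, @Instance_Id, @StartTime, @EndTime, @SourceRows, @TargetRows);
END;

-- The Pre- or Post-SQL property would then hold a plain call, for example:
-- EXEC dbo.usp_TempTestInsertINFARunTimeParams 'wf_Example', 1, '2024-01-01 00:00', '2024-01-01 00:05', 100, 100;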
I have a table with about 30 fields. I currently have several stored procedures which access either an (aggregated) view of this table or the table itself. For many of these SPs I would like to ensure that the returned records all have the same fields with the same column names. Is there a way to do this where I don't have to change 20 stored procs if I do need to change the output?
My way around it thus far is to provide clients with lists of IDs, which they then pass to SPs that return the data; however, this seems slow compared with getting the data in one shot. I have also considered calling the formatting stored procs with a cursor from inside the search stored procs, but was unsure whether that would really buy me a whole lot.
The typical way to define a standardised and consistent data access method across multiple stored procedures in SQL Server is to use Views.
Now your problem description seems to suggest that you are already using Views in order to manage your data access. If you are indeed unable to use Views for a specific reason, perhaps you can clarify the nature of your problem further for us.
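As a sketch of that pattern (the view, table, and column names below are invented), the shared column list lives in one view and each stored procedure selects from it, so a change to the output shape is made in one place:
-- One view owns the standard column list.
CREATE VIEW dbo.vw_CustomerStandard
AS
SELECT c.CustomerId,
       c.CustomerName,
       c.Region,
       c.CreatedDate
FROM   dbo.Customer AS c;
GO

-- Each stored procedure returns the same shape by selecting from the view.
CREATE PROCEDURE dbo.usp_GetCustomersByRegion
    @Region VARCHAR(50)
AS
BEGIN
    SET NOCOUNT ON;

    -- The column list is maintained only in the view definition.
    SELECT *
    FROM   dbo.vw_CustomerStandard
    WHERE  Region = @Region;
END;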
When using multivalue parameters in SQL Server Reporting Services, is it more appropriate to implement the list filter using a filter on the dataset itself, on the data region control, or by changing the actual query that drives the dataset?
SSRS will support any of these scenarios, so I ask: is there a reason, beyond the obvious, why this should be done at one level rather than another?
It makes sense to me that modifying the query itself and asking the RDBMS to handle the filtering would be most efficient, but maybe I am missing something with respect to how the SSRS Data Processing Extension handles this scenario?
You are correct. The way to go is to pass the parameters through to the database engine.
Ideally, Reporting Services should only be used to render content. The less data that needs to be passed back to the client web browser, the faster the report will render.
You may find my answer to a similar post regarding the use of multi-value parameters to be of use.
Passing multiple values for a single parameter in Reporting Services
Hope this helps but please feel free to pose any further questions you may have.
Cheers,
John
Using a table-valued UDF is a good approach, but there is still one issue: if the function is called in many places in the query, even inside an inner select, there can be a performance problem. You can resolve this by using a table variable (or a temp table) instead:
DECLARE @Param TABLE (Value INT)
INSERT INTO @Param (Value)
SELECT Param FROM dbo.fn_MVParam(@sParameterString, ',')
...
where someColumn IN (SELECT Value FROM @Param)
so the function will be called only once.
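The splitter itself is not shown in the original answer; one possible shape for it, assuming it simply breaks a delimited string into rows and that you are on SQL Server 2016 or later, is sketched below.
-- Hypothetical implementation of the splitter referenced above.
-- Returns one row per delimited value, in a column named Param (STRING_SPLIT requires SQL Server 2016+).
CREATE FUNCTION dbo.fn_MVParam (@RepParam NVARCHAR(MAX), @Delim CHAR(1))
RETURNS TABLE
AS
RETURN
    SELECT LTRIM(RTRIM(value)) AS Param
    FROM   STRING_SPLIT(@RepParam, @Delim);
With that in place, the pattern above loads @Param once and the main query only reads the table variable.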
Another thing: if you don't use a stored procedure but an embedded SQL query instead, you can just put the multi-value parameter straight into the query:
...
where someColumn IN (@Param)
...
Use the RDBMS to do the main filtering
SSRS provides filtering for the purposes of data-driven display and/or dynamic display, which is especially useful for subreports etc.
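As a minimal illustration (the table and column names are invented), pushing the multi-value filter into the dataset query lets the database return only the matching rows, whereas a dataset or data region filter in SSRS pulls everything back and filters it inside the report.
-- Dataset query: SSRS expands the multi-value @Region parameter into the IN list,
-- so only matching rows ever leave the database.
SELECT s.OrderId,
       s.Region,
       s.Amount
FROM   dbo.SalesOrders AS s
WHERE  s.Region IN (@Region);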