SQL Report Service Subscription with dynamic report parameter values - sql-server

Background
I have a report with three parameters: AccountId, FromDate and ToDate. The report is an invoice layout. The customer wants to view all members; since we have 300 members, the system will generate 300 reports in PDF or Excel format and send them to the customer.
Question
How do I set the member id for this in a subscription? I cannot do it one by one and create 300 subscriptions manually :|
If anything is unclear, please comment below and I will correct it ASAP.
Updated:
The data-driven subscription that Manoj describes requires the Enterprise or Developer edition of SQL Server Reporting Services.
If I don't have one of those, is there a workaround?

If you are stuck with the Standard edition of SQL Server, you'll need to create an SSIS package to generate the reports. I used the article below to set up a package that now creates and emails our invoices, order confirmations, and shipping acknowledgments. I prefer this article over the others I found because it's easy to add more reports to it as you go along without having to create a new package each time.
If you plan to use this for more than one report, I would change the parameter to a PK so that you always pass in a single integer regardless of which report you're calling. Another change I made was to create one table for the report generation piece and one for the email piece. In my case, I only want to send one email that may have multiple attachments, so that was the best way to do it. In the stored proc that builds this data, make sure you include checks that the email address is valid:
update TABLE
set SendStatus = 'Invalid Email Address'
where email not like '%_@__%.__%'                -- valid email format
or patindex('%[ &'',":;!=\/()<>]%', email) > 0   -- invalid characters
or patindex('[-@._]%', email) > 0                -- valid, but cannot be the starting character
or patindex('%[-@._]', email) > 0                -- valid, but cannot be the ending character
or email not like '%@%.%'                        -- must contain at least one @ and one .
or email like '%..%'                             -- cannot have two periods in a row
or email like '%@%@%'                            -- cannot have two @ anywhere
or email like '%.@%' or email like '%@.%'        -- cannot have @ and . next to each other
When I set this up, I had a lot of issues getting the default credentials to work. The SSRS service would crash every time, while the rest of the SQL services kept working. I ended up having our network admin create a new account with a static password just for this. If you do that, change the default credential line to this one:
rs.Credentials = new System.Net.NetworkCredential("acct", "p#ssword", "domain");
If you run into errors or issues, the SSRS logs and Google are your best friends. Or leave a comment below and I'll help.
Here's that article: http://technet.microsoft.com/en-us/library/ff793463(v=sql.105).aspx

Use this link for reference:
http://beyondrelational.com/modules/2/blogs/101/posts/13460/ssrs-60-steps-to-implement-a-data-driven-subscription.aspx
and create your SQL statement accordingly. This will solve your problem.

Related

SSRS pass through Delivery Options

Just curious: I was attempting to create an SSRS subscription, and in the data-driven query I set it so that if the view returns any rows, the emails are sent out based on the delivery options.
At first I came up with this:
if (select count(*) from view.table) > 0
print ('Data Available')
And it works! But it sends multiple emails based on the number of rows returned, which isn't good. Also, if no rows are returned it doesn't send (which is good), but it flags an error in SSRS. My requirement is that if any row is returned, the report is emailed to the specified addresses; if no rows are returned from that view, nothing is sent at all.
I came up with this next query, but I am unsure of the syntax.
if (select count(*) from view.table) > 3
select email:To
I know the SQL above won't work, but I can't figure out the right syntax.
Essentially, I want to know whether it's possible to use the delivery options from the page prior to the data-driven query page in SSRS. If not, would I have to create another view or table that holds the email info? Is there a better way of going about this?
Thank you for your help!
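For what it's worth, one common workaround is to let the data-driven query itself gate the delivery: have it return one row per recipient only when the view has data, so zero rows means nothing is sent. A rough sketch (dbo.ReportRecipients and its columns are hypothetical; adjust to your schema):

```sql
-- Return one row per recipient, but only when the view actually has data.
-- A data-driven subscription sends one email per returned row, so this
-- yields one email per recipient when there is data, and none otherwise.
SELECT r.EmailTo
FROM dbo.ReportRecipients r              -- hypothetical table of addresses
WHERE EXISTS (SELECT 1 FROM view.[table]);
```

Note that SSRS versions differ in how they log a zero-row run, so check whether yours records it as "no data" or as an error before relying on this.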

Using variable DB Connections in different identical Kettle Jobs/Transformations

I've read through many related topics here but don't seem to find a solution. Here's my scenario:
I have multiple identical customer databases
I use ETL to fill special tables within these databases in order to use as a source for PowerBI reports
Instead of copying (and thus maintaining) the ETLs for each customer, I want to pass the DB connection to the Jobs/Transformations dynamically
My plan is to create a text file of DB connections for each Customer:
cust1, HOST_NAME, DATABASE_NAME, USER_NAME, PASSWORD
cust2, HOST_NAME, DATABASE_NAME, USER_NAME, PASSWORD
and so on...
The Host will stay the same always.
The jobs will be started monthly using Pentaho kitchen in a linux box.
So when I run a job for a specific customer, I want to tell the job to use the DB connection for that specific customer, e.g. Cust2, from the connection file.
Any help is much appreciated.
Cheers & Thanks,
Heiko
Use parameters!
When you define a connection, you see a small S sign in a blue diamond to the right of the Database Name input box. It means that, instead of spelling out the name of the database, you can put in a parameter.
The first time you do it, it's a bit challenging, so follow the procedure step by step, even if you are tempted to go straight to launching a ./kitchen.sh that reads a file containing one row per customer.
1. Parametrize your transformation.
Right-click anywhere, select Properties then Parameters, and fill in the table:
Row1 : Col1 (Parameter) = HOST_NAME, Col2 (Default value) = the host name for Cust1
Row2 : Col1 = DATABASE_NAME, Col2 = the database name for Cust1
Row3 : Col1 = PORT_NUMBER, Col2 = the port number for Cust1
Row4 : Col1 = USER_NAME, Col2 = the user name for Cust1
Row5 : Col1 = PASSWORD, Col2 = the password for Cust1
Then go to the database connection definition (on the left panel, View tab) and in the Settings panel:
Host name: ${HOST_NAME} -- the variable name with "${" before and "}" after
Database name: ${DATABASE_NAME} -- do not type the name, press Ctrl+SPACE
Port number: ${PORT_NUMBER}
User name: ${USER_NAME}
Password: ${PASSWORD}
Test the connection. If it is valid, try a test run.
2. Check the parameters.
When you press the Run button, Spoon prompts for some run options (if you checked "Don't show me this anymore" in the past, use the drop-down right next to the Run menu).
Change the parameter values to those of Cust2 and check that it runs for the other customer.
Change them in both the Value column and the Default value column; you'll understand the difference shortly. For the moment, check that it works with both.
3. Check it in command line.
Use pan from the command line.
The syntax should look like :
./pan.sh -file=your_transfo.ktr -param=HOST_NAME:cust3_host -param=DATABASE_NAME:cust3_db....
At this point you may need a bit of trial and error, because the syntax between = and : varies slightly with the OS and the PDI version. But you should get by within 4-6 attempts.
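As a sanity check before wiring everything into a job, the per-customer command lines can be previewed with a small driver script (the connections file name is hypothetical; this only echoes the commands rather than executing pan.sh):

```shell
# Build one pan.sh invocation per customer from the connections file.
# connections.csv format: cust,host,db,user,password (as described above).
cat > /tmp/connections.csv <<'EOF'
cust1,HOST1,DB1,USER1,PW1
cust2,HOST2,DB2,USER2,PW2
EOF

while IFS=, read -r cust host db user pw; do
  # Echo instead of executing, to inspect the exact command line per customer.
  echo "./pan.sh -file=your_transfo.ktr -param=HOST_NAME:$host -param=DATABASE_NAME:$db -param=USER_NAME:$user -param=PASSWORD:$pw"
done < /tmp/connections.csv
```

Once the echoed lines look right, drop the echo and run the real thing.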
4. Make a job
Due to the parallel computing paradigm of PDI, you cannot use the Set Variables step in a single transformation. You need to make a job with two transformations: the first reads the CSV file and defines the variables with the Set Variables step; the second is the transformation you just developed and tested.
Don't expect it to run on the first try. Some versions of PDI are buggy and require, for example, clearing the default values of the parameters in the transformation. The Write to log step, which writes a field to the log of the calling job, will help you here; of course, you will first need to put the parameters/variables into a field with the Get Variables step.
In particular, do not start with the full customer list! Set the system up with 2-3 customers first.
Then write the full list of customers into your CSV and run.
Do a SELECT COUNT(customer) on your final load. This is important: you will probably want to load as many customers as possible and continue the process even in case of failure. That is the default behavior (as far as I remember), so with a large number of customers you might not otherwise notice a failure in the log.
5. Install the job
In principle, it is just a ./kitchen.sh.
However, if you want to automate the load, you will have a hard time checking that nothing went wrong. So open the transformation, use the System date (fixed) option of the Get System Info step, and write the result alongside the customer data. Alternatively, you can get this date in the main job and pass it along with the other variables.
If you have concerns about creating a new column in the database, store the list of customers loaded each day in another table, in a file, or send it to yourself by mail. From my experience, it's the only practical way to answer a user who claims that their biggest customer was not loaded three weeks ago.
I run a similar scenario daily at work. We use batch files with named parameters for each client; this way we have the same KJB/KTR packages running for different clients based entirely on those parameters.
What you want to do is set variables on a master job, that are used throughout the entire execution.
As for your question directly: in the connection creation tab you can use those variables for Host and DB name. Personally, we have the same user/password set on every client DB, so we don't have to change those or pass user/password as variables for each connection; we only send the host name and database name with the named parameters. We also have a fixed schedule that executes the routine for every database, using an "Execute for each input row" type of job.

SQL Server query - how to generate an email of the contents before deleting an entity

I am trying to write a query in SQL Server that will automatically generate an email to the address stored in the table as I delete that row. Is this possible?
What I'm trying to achieve: each row has a notes column where approvers can add their comments. Sometimes I have to restart a specific row, so I wrote a query that sets the overall status back to Draft (regardless of its current status). In doing so, every other column becomes null. What I want is that, as I set this row to Draft, an automatic email is generated that sends the contents of the notes column to the email address stored in that row.
Is this possible?
Thank you in advance.
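One way this kind of thing is commonly handled (a sketch only; the table, column, and profile names below are hypothetical, and it assumes Database Mail is configured) is an AFTER UPDATE trigger that reads the pre-update values from the deleted pseudo-table, before they are nulled out, and mails them via msdb.dbo.sp_send_dbmail:

```sql
-- Hypothetical sketch: fire on the UPDATE that resets the row to Draft.
-- Simplified to assume single-row updates.
CREATE TRIGGER trg_NotifyOnDraft ON dbo.Approvals
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @email NVARCHAR(256), @notes NVARCHAR(MAX);

    -- "deleted" holds the pre-update values, so the notes are still available here
    SELECT @email = d.ApproverEmail, @notes = d.Notes
    FROM deleted d
    JOIN inserted i ON i.RowNumber = d.RowNumber
    WHERE i.Status = 'Draft';

    IF @email IS NOT NULL
        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = 'DefaultProfile',   -- assumed Database Mail profile
            @recipients   = @email,
            @subject      = 'Row reset to draft',
            @body         = @notes;
END;
```

Sending mail synchronously inside a trigger slows the update down; sp_send_dbmail queues asynchronously, which mitigates this, but a Service Broker or job-based approach is safer at scale.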

Get audit history records of any entity record as per CRM view

I want to display all audit history data in the MS CRM format.
I have imported all records from the AuditBase table from CRM into a table on another database server.
I want to get this table's records via a SQL query in the Dynamics CRM format (as per the above image).
What I have done so far:
select
    AB.CreatedOn as [Created On], SUB.FullName [Changed By],
    Value as Event, AB.AttributeMask [Changed Field],
    AB.changeData [Old Value], '' [New Value]
from Auditbase AB
inner join StringMap SM on SM.AttributeValue = AB.Action and SM.AttributeName = 'action'
inner join SystemUserBase SUB on SUB.SystemUserId = AB.UserId
--inner join MetadataSchema.Attribute ar on AB.AttributeMask = ar.ColumnNumber
--inner join MetadataSchema.Entity en on ar.EntityId = en.EntityId and en.ObjectTypeCode = AB.ObjectTypeCode
--inner join Contact C on C.ContactId = AB.ObjectId
where objectid = '00000000-0000-0000-0000-000000000000'
order by AB.CreatedOn desc
My problem is that AttributeMask is a comma-separated value that I need to compare with the MetadataSchema.Attribute table's ColumnNumber field. Also, how do I get the New Value for that entity?
I have already checked this link: Sql query to get data from audit history for opportunity entity, but it's not giving me the [New Value].
NOTE: I cannot use "RetrieveRecordChangeHistoryResponse", because I need to show this data on an external webpage from a SQL table (not the CRM database).
Well, basically Dynamics CRM does not create this Audit View (the way you see it in CRM) using SQL query, so if you succeed in doing it, Microsoft will probably buy it from you as it would be much faster than the way it's currently handled :)
But really, the way it works currently, SQL is used only for obtaining all relevant audit records (without any matching against attribute metadata or the like); all the parsing and matching with metadata is then done in the .NET application. The logic is quite complex, and there are so many different cases to handle that I believe recreating it in SQL would require not just a simple "select" query but a really complex procedure (and even that might not be enough, because not everything in CRM is kept in the database; some things are simply compiled into the application's libraries), plus weeks or maybe even months of work for one person. Of course, that's my opinion; maybe some T-SQL guru will prove me wrong.
So I would do it differently: use RetrieveRecordChangeHistoryRequest (already mentioned in some answers) to get all the audit details (already parsed and ready to use) from some kind of .NET application (probably running periodically, or maybe triggered by a plugin in CRM, etc.) and put them in a database in a user-friendly format. You can then consume this database with whatever external application you want.
Also I don't understand your comment:
I can not use "RetrieveRecordChangeHistoryResponse", because i need to
show these data in external webpage from sql table(Not CRM database)
What kind of application cannot call an external service (you can create a custom service; you don't have to use the CRM service) to get some data, but can access an external database? You should not read from the DB directly; a better approach would be to prepare a web service that returns the audit you want (using the CRM SDK under the hood) and call this service from the external application. Unless, of course, your external app is only capable of reading databases, not calling any custom web services...
It is not possible to reconstruct a complete audit history from the AuditBase tables alone; for the current values you still need the tables that are being audited.
The queries you would need to construct are complex, and writing them can be avoided if RetrieveRecordChangeHistoryRequest is a suitable option as well.
(See also How to get audit record details using FetchXML on SO.)
NOTE
This answer was submitted before the original question was extended stating that the RetrieveRecordChangeHistoryRequest cannot be used.
As I said in the comments, the audit table will have the old value and the new value, but not the current value. The current value is pushed in as the new value when the next update happens.
In your OP query, ab.AttributeMask will return comma (",") separated values and AB.changeData will return tilde ("~") separated values.
I assume you are fine with "~"-separated values in the Old Value column and want to show the current values of the fields in the New Value column. This is not going to work well when multiple fields are enabled for audit. You have to split the AttributeMask field value into CRM fields from the Attribute view using ColumnNumber to get the required result.
I would recommend the reference blog below to start with. Once you get the expected result, you can pull the current field value with an extra query, either in SQL or using C# on the front end. But you should concatenate the values with "~" again to maintain the format.
https://marcuscrast.wordpress.com/2012/01/14/dynamics-crm-2011-audit-report-in-ssrs/
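The AttributeMask splitting described above can be sketched like this (assuming SQL Server 2016+ for STRING_SPLIT, and that the metadata tables were copied over alongside AuditBase; column names such as Name are assumptions to adjust against your copy):

```sql
-- Hypothetical sketch: expand the comma-separated AttributeMask into one row
-- per changed column number, then join to the copied attribute metadata.
SELECT AB.AuditId, A.Name AS ChangedField
FROM Auditbase AB
CROSS APPLY STRING_SPLIT(AB.AttributeMask, ',') s
JOIN MetadataSchema.Attribute A
    ON A.ColumnNumber = TRY_CAST(s.value AS INT)
JOIN MetadataSchema.Entity E
    ON E.EntityId = A.EntityId
   AND E.ObjectTypeCode = AB.ObjectTypeCode
WHERE s.value <> '';   -- AttributeMask carries leading/trailing commas
```

On older SQL Server versions the same expansion needs a custom split function or a numbers-table trick.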
Update:
From the above blog, you can tweak the stored procedure query with your fields, then convert the last select statement into a 'select into' to create a new table for your storage.
Modify the stored procedure to fetch the delta based on the last run. Configure a SQL job and schedule it to run every day or so to populate the table.
Then select and display the data the way you want. I did the same in Power BI in under 3 days.
Pros/cons: obviously this requirement is for reporting purposes. Reporting requirements are generally served by mirroring the database through replication or other means, without interrupting production users and the async server by injecting plugins or making ad hoc on-demand service calls. Moreover, you have access to the database and not CRM Online. Better not to reinvent the wheel; take the available solution forward. This is my humble opinion, based on a Microsoft internal project implementation.

User interface to analyze data of a conventional RDBMS

Currently we create Jasper PDF reports from a single simple database table for our customers. This is achieved programmatically, and it's static: if a user wishes to change the query, he/she creates a change request, which we cannot deliver before the end of the next sprint (Scrum).
The tool/library should be straightforward (e.g. convention over configuration), employable from within a Java EE container, and open source.
Is there a dynamic tool that allows our customers to create the simple queries/reports themselves without knowing SQL? That is, they should be able to see the table, create a query from it, execute it, and print (we could use Jasper Reports for the last step).
E.g., select only data from the year 2014, aggregate it by customer group, and select columns x, y and z.
All of these criteria and the query structure may change, though; it's not just the value, like the year 2014.
Questions:
1) Is there a tool that presents the data in some kind of SAP-style cube or similar, where the user can select the structure and attributes?
2) Can that tool save template queries (queries the user has invoked before)?
thanks
With BIRT you could use parameters in the report. For example, have one report that shows the whole data set or data cube (or at least a bit of all of the fields). Then you could add JavaScript to the report (or do all of the presentation in JavaScript, for that matter) that shows the parameters a user can select from. These parameter values can then either be sent to a new report or update the existing report. Parameters can be put into database queries too.
If that were exposed in JavaScript on a web page, you could save the parameter values to an array and store them in the browser or on the server.
