SSRS - Using ReportServer AddEvent, subscriptions not always processed - sql-server

I'm trying to create a homegrown subscription system for our ReportServer for QA purposes. I'm creating subscriptions on the front end and setting them on a schedule to execute once in the past (1/1/2019 for example) so the only way they would be executed is manually.
I looked at the jobs and they are simply executing a procedure "ReportServer.dbo.AddEvent" with EventType and EventData as parameters. When I execute that procedure, it adds a record to the Event table and then the Notifications table for the RS to pick up and process the subscription. Most of the time it works but sometimes it just hangs. I'm finding this for data-driven subscriptions especially. They are stuck in pending/processing and never execute fully.
Has anyone else had issues with this before? I'm building out a homegrown system because we have 200+ reports that we want to execute on a nightly basis. They are usually pretty quick but I want to only execute a handful at a time and I'm taking care of this through a custom Queue table.
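For reference, the call the SQL Server Agent jobs make can be reproduced directly; a sketch of what I'm executing (the GUIDs are placeholders for the actual SubscriptionID values from ReportServer.dbo.Subscriptions):

```sql
-- Fire a standard timed subscription manually.
-- @EventData is the SubscriptionID from ReportServer.dbo.Subscriptions
-- (shown here as a placeholder value).
EXEC ReportServer.dbo.AddEvent
    @EventType = N'TimedSubscription',
    @EventData = N'00000000-0000-0000-0000-000000000000';

-- Data-driven subscriptions use a different event type.
EXEC ReportServer.dbo.AddEvent
    @EventType = N'DataDrivenSubscription',
    @EventData = N'00000000-0000-0000-0000-000000000000';
```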

Figured it out from the log files... The subscription was data-driven, but each version used the same file name, so it was trying to overwrite a file that was still open.

Related

Using database snapshot vs snapshot isolation level transaction

I maintain an MVC application which incorporates some long-running batch processes for sending newsletters, generating reports etc.
I had previously encountered a lot of issues with deadlocks, where one of these long-running queries might be holding a lock on a row which then needs to be updated by another process.
The solution I originally came up with, was to have a scheduled task, which creates database snapshots, like so...
CREATE DATABASE MyDatabase_snapshot_[yyyyMMddHHmmss]... AS SNAPSHOT OF MyDatabase
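Spelled out in full, that statement needs a sparse-file specification for every data file of the source database; a sketch, with the logical file name and path as placeholders:

```sql
-- Create a named snapshot; each data file of the source database
-- needs a corresponding sparse-file entry (names/paths are placeholders).
CREATE DATABASE MyDatabase_snapshot_20190101120000
ON ( NAME = MyDatabase_Data,   -- logical name of the source data file
     FILENAME = 'C:\Snapshots\MyDatabase_snapshot_20190101120000.ss' )
AS SNAPSHOT OF MyDatabase;
```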
My application then has some logic which will find the latest available snapshot, and use this for the read-only connection for the long-running processes, or anywhere else where a read-only connection is required.
The current setup is perfectly functional and reliable. However, being dependent on that scheduled task doesn't make me happy. I can imagine, at some stage in the future, if someone else is looking after this project, this could be an easy source of confusing issues. If the database were moved to another server, for example, and the snapshot-creation scheduled task wasn't set up correctly.
I've since realised I could achieve a similar result by using snapshot transaction isolation, and avoid all the extra complexity of managing the creation and cleanup of the database snapshots.
However I'm now wondering whether there may be any performance drawbacks for doing this using transactions vs continuing to use the static snapshots.
Consider the following scenario.
The system periodically sends personalised job lists to approximately 20K subscribers. For each of these subscribers it does database lookups to create the matching jobs list.
What it has been doing is looping through the full recipient list, and for each one...
Open a connection to the snapshot db
Run the query to find matching jobs
Close the snapshot db connection
If instead, it does the following...
Open the database connection to the normal (non-snapshot) database
Begin a snapshot-isolation transaction
Run the query to find matching jobs
Close the transaction
Close the database connection
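Assuming snapshot isolation has been enabled on the database, the per-recipient loop body would look roughly like this (table and column names are illustrative placeholders):

```sql
-- One-time setup: enable snapshot isolation on the database.
ALTER DATABASE MyDatabase SET ALLOW_SNAPSHOT_ISOLATION ON;

-- Per recipient, on the normal (non-snapshot) connection:
SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
BEGIN TRANSACTION;
    -- The matching-jobs query reads row versions from tempdb's
    -- version store instead of taking shared locks.
    SELECT JobId, Title
    FROM dbo.Jobs
    WHERE CategoryId = @SubscriberCategoryId;  -- placeholder predicate
COMMIT TRANSACTION;
```

The version store in tempdb is what replaces the static snapshot here, which is the main thing to weigh for the performance question below.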
Does this actually translate to more work for the database server?
Specifically I'm wondering about what's involved at step #2.
Removing complexity from the application is a good thing, but not at the expense of performance, particularly since this process is already quite server-intensive and takes quite a long time to run.

How to run a database procedure after Siebel EIM job

I would like to run a database procedure after a successful EIM job run to collect information and record historical changes.
How can I call this database procedure automatically without manual intervention?
BR
Hani
It depends on the solution approach you take. If everything has to be handled within Siebel, then a workflow that creates the job, monitors its result, and then proceeds with the second step seems feasible to me. An alternative would be to use a CI system like Jenkins/Automic to handle the orchestration part.

Do SSRS report subscriptions that trigger at the same time run concurrently?

If you have multiple report subscriptions that are set to be triggered at the same time, will all of them start at that time and run concurrently? That is, if you have too many reports scheduled for a particular time, do they run the risk of using up too many system resources and failing?
Do shared subscriptions function differently?
I'm interested in the answer to this for any version of SSRS as my organisation manages environments with SSRS 2005, 2008, 2008 R2, and 2012.
I've tried searching MSDN and Google but I haven't been able to find anything definitive. I would imagine that separate schedules all trigger simultaneously and run at once (competing for system resources), because SSRS is designed to run reports on-demand concurrently. But I could imagine scenarios where a shared subscription could queue the reports to run sequentially.
Is there any way to globally limit the number of concurrent report runs that can take place? I know you can limit it on a per-user basis in the RSServerConfig file.
I've found a bit of interesting info that explains how SSRS processes scheduled subscriptions:
When you create a subscription several things are added to the RS
server:
A row is placed in the Subscriptions table…
A row is placed in the Schedule and ReportSchedule tables…
A SQL Server Agent job is created…
When the subscription runs several things happen:
The SQL Server Agent job fires and puts a row in the Event table…
The RS server service has a limited number of threads (2 per CPU) that poll the Event table…
When it finds an event, it puts a row in the Notifications table and starts processing…
This is from: Troubleshooting Subscriptions: Part II...
(If I'm reading this correctly...) The reporting service will concurrently run two times the number of cores in your server when processing subscriptions. (e.g. If you have a dual-core machine RS will run 4 report subscriptions at once.)
I'm going to leave this question open a while longer in case anyone else has more information...
There is a setting in RsReportServer.config called MaxQueueThreads. I often set this to 1 or 2 to avoid flooding the server and crudely save resources for interactive SSRS users. The trade-off is that one or two heavy subscription reports can choke the queue and hold up other subscriptions.
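In RsReportServer.config the setting sits in the Service section; a minimal fragment (surrounding elements elided):

```xml
<Service>
  ...
  <MaxQueueThreads>2</MaxQueueThreads>
  ...
</Service>
```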
It is available in all the versions you listed. Here's the doco:
http://msdn.microsoft.com/en-us/library/ms157273.aspx

Call a stored procedure after a configurable amount of hours after an insert

I am using SQL Server 2008 Express and I cannot buy the full version it is not in the budget.
What I need is after an insert is done to call a stored procedure to send a notification email after a certain amount of hours. This amount of hours is also stored in the database for configurable reasons.
I was thinking about using Windows scheduler and running a stored procedure twice a day, but if the user sets the delay to less than 12 hours, that could lead to the user being notified two or more times. I also certainly don't think it is wise to run it every hour. So neither of these options seems like the best.
I was wondering if there is a way to time something like this, or even schedule it to run at a certain time (using Express, remember).
Any Help would be greatly appreciated and thanks for reading!
Use conversation timers. In the insert you would start a conversation timer armed to fire after the number of hours desired. When the time passes, the system will enqueue a message and you can use internal activation to run the procedure you want, including sending a message. The advantage of this implementation is that it relies only on SQL Express features. It is also reliable: you won't lose notifications if a process shuts down, as would happen with a CLR or WAITFOR based solution.
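A minimal sketch of the timer side, assuming a config table holding the delay; the queue, service, and table names are all illustrative:

```sql
-- One-time setup: a queue and service to receive the timer messages.
CREATE QUEUE dbo.NotificationTimerQueue;
CREATE SERVICE NotificationTimerService
    ON QUEUE dbo.NotificationTimerQueue;
GO

-- In the insert path: open a dialog and arm a timer for the
-- configured number of hours.
DECLARE @hours INT =
    (SELECT NotifyAfterHours FROM dbo.Config);  -- configured delay
DECLARE @dialog UNIQUEIDENTIFIER;

BEGIN DIALOG CONVERSATION @dialog
    FROM SERVICE NotificationTimerService
    TO SERVICE 'NotificationTimerService'
    WITH ENCRYPTION = OFF;

-- TIMEOUT is in seconds; when it elapses, a DialogTimer message is
-- enqueued and the queue's activation procedure can send the email.
BEGIN CONVERSATION TIMER (@dialog) TIMEOUT = @hours * 3600;
```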
See Asynchronous procedure execution for a similar idea, but w/o a timer.
Even if you end up doing a check every hour (or every 5 mins) for pending 'due' notifications, I would still use a Service Broker activation based mechanism to activate the 'check' task. I also recommend reading Using tables as Queues.
You can use a CLR project in Visual Studio to achieve this!
The trigger will listen for any insert done on the table and in turn call a web service that sends the email.
If you can use the SQL Server WAITFOR command it might be helpful.
Maybe a SQL stored procedure that runs every hour, and checks to see if it's configured to run on "this" hour; if so, then send the email.
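That hourly check could look something like this (table and procedure names are placeholders):

```sql
-- Hourly job body: send only when the configured hour matches now
-- (dbo.Config and dbo.SendNotificationEmails are illustrative names).
IF EXISTS (SELECT 1
           FROM dbo.Config
           WHERE RunHour = DATEPART(HOUR, SYSDATETIME()))
BEGIN
    EXEC dbo.SendNotificationEmails;
END
```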
http://msdn.microsoft.com/en-us/library/ms187331.aspx

Sending a summary of SQL Server Agent job failures

SQL Server Agent allows you to create Notifications at the moment a Job succeeds or fails, but I'd like to create a regular notification that sends a summary of the events for those who are one step removed from server maintenance.
Is there a way to setup summary notifications that spell out which jobs failed over the last 24 hours?
There are several system stored procedures that you might be able to use or you could query directly against the system tables. If you have SSRS available then put together a report using those queries and you can have it on-demand or scheduled to be emailed to the necessary people at whatever time is right for you.
Specifically, check out sp_help_jobhistory and sp_help_jobhistory_full in msdb.
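A sketch of a direct query against the msdb tables for a 24-hour failure summary (run_status 0 means the step failed; agent_datetime is a built-in msdb helper that combines the integer date/time columns):

```sql
-- Jobs with failed steps in the last 24 hours.
SELECT j.name AS job_name,
       h.step_name,
       msdb.dbo.agent_datetime(h.run_date, h.run_time) AS run_at,
       h.message
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs       AS j ON j.job_id = h.job_id
WHERE h.run_status = 0
  AND msdb.dbo.agent_datetime(h.run_date, h.run_time)
        >= DATEADD(HOUR, -24, GETDATE())
ORDER BY run_at DESC;
```

Wrapped in an SSRS report, that query gives you the scheduled emailed summary directly.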
I'd be surprised if you couldn't use Google to find a demo of setting this up.