Sending a summary of SQL Server Agent job failures

SQL Server Agent allows you to create notifications at the moment a job succeeds or fails, but I'd like to create a regular notification that sends a summary of the events for those who are one step removed from server maintenance.
Is there a way to set up summary notifications that spell out which jobs failed over the last 24 hours?

There are several system stored procedures that you might be able to use, or you could query the system tables directly. If you have SSRS available, put together a report using those queries; you can then run it on demand or schedule it to be emailed to the necessary people at whatever time suits you.
Specifically, check out sp_help_jobhistory and sp_help_jobhistory_full in msdb.
I'd be surprised if you couldn't use Google to find a demo of setting this up.
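As a starting point, here is a minimal sketch of a direct query against the msdb system tables for failures in the last 24 hours. Note that msdb.dbo.agent_datetime is an undocumented helper function shipped in msdb, so treat its use as an assumption and swap in your own run_date/run_time conversion if you prefer:

    -- Failed job steps in the last 24 hours, from the SQL Server Agent history tables.
    SELECT  j.name            AS job_name,
            h.step_id,
            h.step_name,
            msdb.dbo.agent_datetime(h.run_date, h.run_time) AS run_datetime,
            h.message
    FROM    msdb.dbo.sysjobhistory AS h
    JOIN    msdb.dbo.sysjobs       AS j ON j.job_id = h.job_id
    WHERE   h.run_status = 0  -- 0 = Failed
      AND   msdb.dbo.agent_datetime(h.run_date, h.run_time) >= DATEADD(HOUR, -24, GETDATE())
    ORDER BY run_datetime DESC;

Wrapped in an SSRS report or a scheduled call to msdb.dbo.sp_send_dbmail, that result set becomes your daily summary.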

Related

Triggering Hangfire Jobs using SQL

I have a Hangfire service running on one of my servers. I'm a DBA, and sometimes I'm asked to trigger jobs using the Dashboard, but it takes me a lot of time to connect to the jobs' server due to connectivity and security issues.
To overcome that, I want to trigger those jobs by inserting into Hangfire's tables in the database. I can already query those tables to find which job executed when and whether it failed, succeeded, or is still enqueued. Does anyone know an approach to do this?
I've included a sample of two tables which I think will be used for this trick; their names are Hash and Set respectively:
Hangfire normally exposes a dashboard GUI, similar to Swagger in .NET (http://localhost:5000/hangfire), and there should be an immediate trigger feature there. If not, a second option is changing the cron expression to run every minute, or maybe every 30 seconds.
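For orientation, the recurring-job definitions (including the cron expression mentioned above) live in the Hash table under keys like 'recurring-job:<id>'. A read-only sketch, assuming Hangfire's default SQL Server storage with the HangFire schema name:

    -- Inspect recurring-job metadata in Hangfire's storage (adjust schema name if customized).
    SELECT h.[Key], h.[Field], h.[Value]
    FROM   [HangFire].[Hash] AS h
    WHERE  h.[Key] LIKE 'recurring-job:%'
      AND  h.[Field] IN ('Cron', 'NextExecution', 'LastExecution');

Updating the 'Cron' value here is what "changing the cron expression" amounts to at the storage level, but Hangfire also tracks the next execution time separately, so a direct table edit may not fire the job immediately; treat this as an unverified workaround rather than a supported API.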

"Fire and forget" T-SQL query in SSMS

I have an Azure SQL Database where I sometimes want to execute ad-hoc SQL statements, that may take a long time to complete. For example to create an index, delete records from a huge table or copy data between tables. Due to the amounts of data involved, these operations can take anywhere from 5 minutes to several hours.
I noticed that if a SQL statement is executed in SSMS, then the entire transaction will be automatically rolled back, in case SSMS loses its connection to the server, before the execution is complete. This is problematic for very long running queries, for example in case of local wifi connectivity issues, or if I simply want to shut down my computer to leave the office.
Is there any way to instruct SQL Server or SSMS to execute a SQL statement without requiring an open connection? We cannot use SQL Server Agent jobs, as this is an Azure SQL DB, and we would like to avoid solutions based on other Azure services if possible, as this is just for simple ad-hoc needs.
We tried the "Discard results after execution" option in SSMS, but this still keeps an open connection until the statement finishes executing:
It is not an asynchronous solution I am looking for, as I don't really care about the execution result (I can always check whether the query is still running using, for example, sys.dm_exec_requests). In other words, I want a simple "fire and forget" mechanism for T-SQL queries.
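For reference, checking on the running statement might look something like this (a minimal sketch; filter further as needed):

    -- Show currently executing requests other than this session, with their SQL text.
    SELECT r.session_id, r.status, r.command, r.start_time,
           r.percent_complete, t.text AS sql_text
    FROM   sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    WHERE  r.session_id <> @@SPID;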
While my initial requirements stated that we didn't want to use other Azure services, I have found that using Azure Data Factory seems to be the most cost-efficient and simple way to solve the problem. The other solutions proposed here seem to suffer from either high cost (spinning up VMs) or timeout limitations (Azure Functions, Azure Automation runbooks), none of which apply to ADF when used for this purpose.
The idea is:
Put the long-running SQL statement into a stored procedure (a minimal sketch follows below)
Create a Data Factory pipeline with a Stored Procedure activity to execute the SP on the database. Make sure to set the Timeout and Retry values of the activity to sensible values.
Trigger the pipeline
Since no data movement is taking place in Data Factory, this solution is very cheap, and I have had queries running for 10+ hours using this approach, which worked fine.
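For step 1, a minimal sketch of such a wrapper procedure, assuming a hypothetical long-running index build (the object names are placeholders, not from the original question):

    -- Placeholder names: dbo.BigTable / SomeColumn are illustrative only.
    CREATE OR ALTER PROCEDURE dbo.usp_LongRunningMaintenance
    AS
    BEGIN
        SET NOCOUNT ON;
        -- Replace this with the actual ad-hoc work (index build, mass delete, copy, ...).
        CREATE INDEX IX_BigTable_SomeColumn
            ON dbo.BigTable (SomeColumn)
            WITH (ONLINE = ON);
    END;

The Stored Procedure activity then simply calls dbo.usp_LongRunningMaintenance; because the work runs inside the database engine, the pipeline only has to keep its own connection alive rather than your workstation's.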
If you could put the ad-hoc query in a stored procedure, you could then schedule it to run on the server, assuming you have the necessary privileges.
Note that this may not be a good idea, but it should work.
Unfortunately, I don't think you will be able to complete the query without an open connection in SSMS.
I can suggest the following approaches:
Pass the query to an Azure Function / AWS Lambda to execute on your behalf (perhaps exposed as a service via REST) and have it store or send the results somewhere accessible.
Start up a VM in the cloud and run the query from the VM via RDP. When you are ready, re-establish your RDP connection to the VM and view the outcome of the query.
Use an Azure Automation runbook to execute the query on a scheduled trigger.

Sending emails automatically using SQL Server job

I'm developing a .NET desktop application with SQL Server as the database backend. One of the requirements of the application is that if a record's status, for example, remains inactive for 30 days, a reminder email should be sent to the user associated with that record.
This could be done pretty easily within the application, as long as it is started and running. However, if nobody starts up the application for a certain period of time, the reminder email won't be sent, because nothing/nobody triggers the action.
How about creating a job in SQL Server which can monitor the records and send emails as needed? Has anyone ever done that?
Thanks a lot!
Given the requirements of your task, I suggest that you create a console program (with C# or VB.NET) that checks for the inactive (30-day) row condition and then generates the appropriate email notification message. Then run this program every hour or so (depending on the urgency involved in detecting an inactive row condition) using a SQL Server Agent job.
The following image shows how the SQL Server Agent Jobs are displayed in the Object Explorer for SQL Server 2008 R2.
This SO entry covers some aspects of creating a console program that runs at certain times. SQL Server Agent has several scheduling options that should facilitate your needs.
You might be reluctant to create a console program for this, but you are apt to find that doing so gives you options that are simply not easily implemented with a pure SQL Server-based approach. Plus, you may have future needs that require similar processing to what this approach provides.
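If you do want to see what the pure SQL Server-based approach would look like, here is a hedged sketch of a single Agent job step using Database Mail. The table, column, and profile names are placeholders (not from the original question), and Database Mail is assumed to be already configured:

    -- Placeholder schema: dbo.Records(UserId, Status, LastActivityDate), dbo.Users(UserId, Email).
    DECLARE @email nvarchar(256);

    DECLARE inactive_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT u.Email
        FROM   dbo.Records AS r
        JOIN   dbo.Users   AS u ON u.UserId = r.UserId
        WHERE  r.Status = 'Inactive'
          AND  r.LastActivityDate <= DATEADD(DAY, -30, GETDATE());

    OPEN inactive_cur;
    FETCH NEXT FROM inactive_cur INTO @email;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC msdb.dbo.sp_send_dbmail
             @profile_name = 'MailProfile',   -- assumed Database Mail profile name
             @recipients   = @email,
             @subject      = 'Reminder: your record has been inactive for 30 days',
             @body         = 'Please review your record; it has had no activity for 30 days.';

        FETCH NEXT FROM inactive_cur INTO @email;
    END

    CLOSE inactive_cur;
    DEALLOCATE inactive_cur;

The console-program route is still more flexible (templated emails, logging, retries), which is the point of the answer above; this is just the shape the question's original idea would take.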

Do SSRS report subscriptions that trigger at the same time run concurrently?

If you have multiple report subscriptions that are set to be triggered at the same time, will all of them start at that time and run concurrently? That is, if you have too many reports scheduled for a particular time, do they run the risk of using up too many system resources and failing?
Do shared subscriptions function differently?
I'm interested in the answer to this for any version of SSRS as my organisation manages environments with SSRS 2005, 2008, 2008 R2, and 2012.
I've tried searching MSDN and Google but I haven't been able to find anything definitive. I would imagine that separate schedules all trigger simultaneously and run at once (competing for system resources), because SSRS is designed to run reports on demand concurrently. But I could imagine scenarios where a shared subscription could queue the reports to run sequentially.
Is there any way to globally limit the number of concurrent report runs that can take place? I know you can limit it on a per-user basis in the RSReportServer.config file.
I've found a bit of interesting info that explains how SSRS processes scheduled subscriptions:
When you create a subscription several things are added to the RS
server:
A row is placed in the Subscriptions table…
A row is placed in the Schedule and ReportSchedule tables…
A SQL Server Agent job is created…
When the subscription runs several things happen:
The SQL Server Agent job fires and puts a row in the Event table…
The RS server service has a limited number of threads (2 per CPU) that poll the Event table…
When it finds an event, it puts a row in the Notifications table and starts processing…
This is from: Troubleshooting Subscriptions: Part II...
(If I'm reading this correctly...) The reporting service will concurrently run two times the number of cores in your server when processing subscriptions (e.g. if you have a dual-core machine, RS will run 4 report subscriptions at once).
I'm going to leave this question open a while longer in case anyone else has more information...
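If you want to inspect those pieces yourself, here is a read-only sketch against the ReportServer catalog database showing which subscriptions share a schedule. These are undocumented internal tables, so column names may vary slightly between versions:

    -- Subscriptions joined to their schedules in the ReportServer database (read-only).
    SELECT  c.Path              AS report_path,
            s.SubscriptionID,
            s.LastStatus,
            s.LastRunTime,
            sch.ScheduleID,
            sch.NextRunTime
    FROM    dbo.Subscriptions  AS s
    JOIN    dbo.[Catalog]      AS c   ON c.ItemID = s.Report_OID
    JOIN    dbo.ReportSchedule AS rs  ON rs.SubscriptionID = s.SubscriptionID
    JOIN    dbo.Schedule       AS sch ON sch.ScheduleID = rs.ScheduleID
    ORDER BY sch.NextRunTime;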
There is a setting in RsReportServer.config called MaxQueueThreads. I often set this to 1 or 2 to avoid flooding the server and crudely save resources for interactive SSRS users. The trade-off is that one or two heavy subscription reports can choke the queue and hold up other subscriptions.
It is available in all the versions you listed. Here's the doco:
http://msdn.microsoft.com/en-us/library/ms157273.aspx

Overriding SQL Server Reporting Service Hourly Subscription

Users want to set up SSRS reports to be emailed to them. After a little googling I found this link that shows the subscription interface of Report Manager. It has almost every feature they need, except that the hourly report subscription does not give them enough control. By default, they are able to set up hourly reports and provide the desired start time, but at first glance I don't see how they would specify an end time. What I need is a way to say "Send me a report every hour between 5 and 10."
So I'm looking for one of two answers:
Is there really an end time that I'm just missing?
If not, how can I override the hourly subscription page and get an end time?
Thanks
You can write your own subscription service using the SQL Server Reporting Services web service.
You can schedule a subscription to run hourly and then write your own service that pauses the jobs during the times they do not want the reports.
More info here:
http://msdn.microsoft.com/en-us/library/ms154066(SQL.90).aspx
Alternatively, you could also try editing the SQL Server Agent job parameters. When Reporting Services creates a "subscription", a SQL Server Agent job is created. There are starting and ending time parameters in that interface. I haven't verified that this provides the functionality you are looking for, though.
Also, if that doesn't work and you don't want to code your own subscription service as mentioned, you could try creating 5 different jobs that run daily, spaced an hour apart. I know it's kind of a kludge, but maybe the extra job maintenance is preferable to the extra time for coding your own service.
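To illustrate the job-editing idea, here is a hedged sketch using msdb.dbo.sp_update_schedule to restrict the Agent schedule behind a subscription to a 05:00-10:00 window. The schedule name is a placeholder (SSRS typically names these objects after the subscription's ScheduleID GUID), and bear in mind that editing the subscription in Report Manager may overwrite the change:

    -- Limit the subscription's Agent schedule to fire only between 05:00 and 10:00.
    EXEC msdb.dbo.sp_update_schedule
         @name              = N'<subscription-schedule-guid>',  -- placeholder schedule name
         @active_start_time = 050000,   -- 05:00:00 (HHMMSS)
         @active_end_time   = 100000;   -- 10:00:00 (HHMMSS)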
