"Fire and forget" T-SQL query in SSMS - sql-server

I have an Azure SQL Database where I sometimes want to execute ad-hoc SQL statements that may take a long time to complete, for example creating an index, deleting records from a huge table, or copying data between tables. Due to the amount of data involved, these operations can take anywhere from 5 minutes to several hours.
I noticed that if a SQL statement is executed in SSMS, the entire transaction is automatically rolled back if SSMS loses its connection to the server before execution completes. This is problematic for very long-running queries, for example in case of local Wi-Fi connectivity issues, or if I simply want to shut down my computer and leave the office.
Is there any way to instruct SQL Server or SSMS to execute a SQL statement without requiring an open connection? We cannot use SQL Server Agent jobs, as this is an Azure SQL DB, and we would like to avoid solutions based on other Azure services if possible, as this is just for simple ad-hoc needs.
We tried the "Discard results after execution" option in SSMS, but this still keeps an open connection until the statement finishes executing.
I am not looking for an asynchronous solution as such, as I don't really care about the execution result (I can always check whether the query is still running using, for example, sys.dm_exec_requests). In other words: a simple "fire and forget" mechanism for T-SQL queries.
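For reference, the check against sys.dm_exec_requests can look roughly like this (percent_complete is only populated for some operations):

```sql
-- Check whether the long-running statement is still executing
SELECT r.session_id,
       r.status,
       r.command,
       r.percent_complete,
       r.total_elapsed_time / 1000 AS elapsed_seconds,
       t.text AS sql_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID;  -- exclude the monitoring session itself
```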

While my initial requirements stated that we didn't want to use other Azure services, I have found that Azure Data Factory seems to be the most cost-efficient and simple way to solve the problem. The other solutions proposed here seem to suffer from either high cost (spinning up VMs) or timeout limitations (Azure Functions, Azure Automation runbooks), none of which apply to ADF when used for this purpose.
The idea is:
Put the long-running SQL statement into a Stored Procedure
Create a Data Factory pipeline with a Stored Procedure activity to execute the SP on the database. Make sure to set the Timeout and Retry values of the activity to sensible values.
Trigger the pipeline
Since no data movement takes place in Data Factory, this solution is very cheap, and I have had queries running for 10+ hours using this approach without issue.
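A minimal sketch of the first step; the procedure name and the statement inside it are placeholders for whatever long-running work you need:

```sql
-- Placeholder procedure wrapping the long-running work,
-- to be invoked by the Data Factory Stored Procedure activity.
CREATE PROCEDURE dbo.RunLongMaintenance
AS
BEGIN
    SET NOCOUNT ON;
    -- Example long-running statement; replace with your index build,
    -- bulk delete, or table copy.
    DELETE FROM dbo.HugeTable
    WHERE CreatedAt < DATEADD(YEAR, -1, GETDATE());
END;
```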

If you can put the ad-hoc query in a stored procedure, you can then schedule it to run on the server, assuming you have the necessary privileges.
Note that this may not be a good idea, but it should work.

Unfortunately, I don't think you will be able to complete the query without an open connection in SSMS.
I can suggest the following approaches:
Pass the query to an Azure Function / AWS Lambda to execute on your behalf (perhaps exposed as a service via REST) and have it store or send the results somewhere accessible.
Start a VM in the cloud and run the query from the VM via RDP. When you are ready, re-establish your RDP connection to the VM and view the outcome of the query.
Use an Azure Automation runbook to execute the query on a scheduled trigger.


Triggering Hangfire Jobs using SQL

I have a Hangfire service running on one of my servers. I'm a DBA, and sometimes I'm asked to trigger jobs using the Dashboard, but it takes me a long time to connect to the jobs' server due to connectivity and security issues.
To overcome that, I want to trigger those jobs by inserting rows into Hangfire's tables in the database. I can already query those tables to find out which job executed when, and whether it failed, succeeded, or is still enqueued. Does anyone know an approach to do so?
I've included a sample of the two tables which I think will be used for this trick, named Hash and Set respectively.
Hangfire normally provides a web dashboard, similar to Swagger in .NET (e.g. http://localhost:5000/hangfire), which should have an immediate trigger feature. If not, a second option is changing the job's cron expression to run every minute, or maybe every 30 seconds.

How to deal with querying from multiple servers where some servers may be inactive

I have a stored procedure that loops through a list of servers and queries their master DBs. When one of these servers is down, the stored procedure's query times out. How can I skip the querying of any inactive server? Or, how can I catch the server timeout and continue with the queries on the remaining active servers? I do have a Server table with an IsActive column, but the value is not automatically changed when a server goes down; currently, the list of servers to query is based on this IsActive column. Another solution could be to find a way to automatically update the IsActive column whenever a server goes down, but I wouldn't know how to go about that. Any thoughts?
EDIT: I'm doing this all in SQL Server 2008
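For the catch-and-continue variant, a T-SQL sketch (the Server table and IsActive column are from the question; the ServerName column name and the remote query are assumptions). Note that some linked-server connection errors are raised at compile time, so the remote query has to be wrapped in dynamic SQL for TRY...CATCH to see them:

```sql
DECLARE @name sysname;
DECLARE server_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT ServerName FROM dbo.Server WHERE IsActive = 1;

OPEN server_cursor;
FETCH NEXT FROM server_cursor INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        -- Wrapping in EXEC defers compilation, so a dead server
        -- surfaces as a catchable runtime error instead of aborting the batch.
        EXEC ('SELECT name FROM [' + @name + '].master.sys.databases');
    END TRY
    BEGIN CATCH
        PRINT 'Skipping ' + @name + ': ' + ERROR_MESSAGE();
        -- Optionally flag the server as down:
        -- UPDATE dbo.Server SET IsActive = 0 WHERE ServerName = @name;
    END CATCH;
    FETCH NEXT FROM server_cursor INTO @name;
END;
CLOSE server_cursor;
DEALLOCATE server_cursor;
```

Be aware that this still waits out the timeout serially, once per dead server, which is exactly the drawback the answer below describes.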
Do not do this from inside the engine (linked servers).
Query from an outside process. Launch the queries in parallel, with a connection timeout set. Your 'query' will get all the information at once, and you will only have to wait once for all servers that are down (roughly the timeout you've set), instead of once for each server that is down. I would recommend against 'testing' for connectivity, because attempting to connect is the test. If you were to, say, iterate over the servers and call sp_testlinkedserver for each, you would risk waiting longer in the end, because the tests are still serialized and each takes the same amount of time as attempting to connect (that is what the test does: it attempts to connect).
A much better solution would be to use a reliable transport and asynchronous messaging instead, e.g. Service Broker. Since the programming model is asynchronous but the messaging is reliable, it doesn't matter if a server is down; you will get the result you want later, when it is finally back online.

Permissions required to access remote perfmon counters from SQL CLR

So I'm trying to learn SQLCLR and have chosen to write a table-valued function that essentially gathers some perfmon counters. I'm able to run it successfully and gather counters from the server that hosts the database instance. However, when I try to access counters on another server, I get "access denied". I'm able to get it to work if I add the account that runs the SQL Server to the "Performance Monitor Users" group on the remote server, but I want to know if I can have the function run as a different windows account? That is to say, can I create a Windows account specifically for this task and somehow have SQL Server run the function in that context?
No, you cannot have the SQLCLR function run as a specific user. You may hear about using the LogonUser API to impersonate a user in the SQLCLR function, but that approach is fraught with problems, particularly the issue of password storage. The correct solution is exactly what you did: grant the SQL Server account the needed privileges by adding it to the required security group. BTW, if your SQLCLR function impersonates the current Windows login, you will need to set up constrained delegation.
That being said, using SQLCLR to connect to a remote machine for anything is not a smart thing to do. Tying up precious SQL Server workers waiting on slow network access will grind your server to a halt under load. You can do this as a way to learn, but don't even think about deploying it in production. Have the counter collection done by an external process that saves the counters to the database. In fact, there is already a tool that does exactly that: logman.exe.
And finally: querying performance counters from the C# API is extremely inefficient. You will quickly discover that there is a much faster API, the PDH library. But PDH has no managed equivalent, so you'll be back at square one, namely using the tool that leverages PDH out of the box: logman.exe.

Application Hangs on SQL Server - restart required every time

We have an application which has a SQL Server 2000 Database attached to it. After every couple of days the application hangs, and we have to restart SQL Server service and then it works fine. SQL Server logs show nothing about the problem. Can anyone tell me how to identify this issue? Is it an application problem or a SQL Server problem?
Thanks.
Is it an application problem or a SQL Server problem?
Is it possible to connect to MS SQL Server using Query Analyzer or another instance of your application?
General tips:
Use Activity Monitor to find information about concurrent processes, locks and resource utilization.
Use SQL Server Profiler to trace server and database activity, and to capture and save data to a table or file to analyze later.
You can use Dynamic Management Views (under \Database name\Views\System Views in Management Studio) to get more detailed information about SQL Server internals.
If you have performance problems (not your case), you can use Performance Monitor and Data Collector Sets to gather performance information.
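On SQL Server 2000 specifically, the DMVs mentioned above do not exist (they arrived in 2005); a first check for blocking can be run against sysprocesses instead:

```sql
-- Sessions that are currently blocked, and which spid is blocking them
SELECT spid, blocked, waittime, lastwaittype, cmd, hostname, program_name
FROM master..sysprocesses
WHERE blocked <> 0;
```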
Hard to predict the issue; I suggest you check your application first. Check which operations you are performing against the database, and whether you are taking care of connection pooling; unused open connections can create issues.
Check whether you can get any logs from your application. Without any log information we can hardly suggest anything.
The application may be hanging due to a deadlock.
Check which stored procedures run at that time using Profiler,
check the table manipulation (consider NOLOCK hints),
and check the buffer size; consider segregating the DB into two or three modules.

Sending a summary of SQL Server Agent job failures

SQL Server Agent allows you to create Notifications at the moment a Job succeeds or fails, but I'd like to create a regular notification that sends a summary of the events for those who are one step removed from server maintenance.
Is there a way to setup summary notifications that spell out which jobs failed over the last 24 hours?
There are several system stored procedures that you might be able to use, or you could query directly against the system tables. If you have SSRS available, put together a report using those queries; you can have it on demand, or scheduled to be emailed to the necessary people at whatever time is right for you.
Specifically, check out sp_help_jobhistory and sp_help_jobhistory_full in msdb.
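As a sketch of the direct-query route, something along these lines should list failures from the last 24 hours (run_status = 0 means Failed; msdb.dbo.agent_datetime, available from SQL Server 2005 on, converts the integer run_date/run_time columns):

```sql
SELECT j.name AS job_name,
       h.step_id,
       h.step_name,
       msdb.dbo.agent_datetime(h.run_date, h.run_time) AS run_datetime,
       h.message
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j
    ON j.job_id = h.job_id
WHERE h.run_status = 0  -- 0 = Failed
  AND msdb.dbo.agent_datetime(h.run_date, h.run_time) > DATEADD(HOUR, -24, GETDATE())
ORDER BY run_datetime DESC;
```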
I'd be surprised if you couldn't use Google to find a demo of setting this up.
