How can I get Azure to notify me when a db Copy operation is complete? - sql-server

Our deployment process involves two db copy procedures, one where we copy the production db to our rc site for rc testing, and then one where we copy the production db to our staging deployment slot for rollback purposes. Both of these can take as long as ten minutes, even though our db is very small. Ah, well.
What I'd like to do is have a way to get notified when a db Copy operation is done. Ideally, I could link this to an SMS alert or email.
I know that Azure has a big Push Notification subsystem, but I'm not sure whether it can hook the completion of an arbitrary db copy, or whether there's a lighter-weight solution.

There is some information about copying a database on this page: http://msdn.microsoft.com/en-us/library/azure/ff951631.aspx. If you are using T-SQL, you can check the copy's progress with a query like SELECT name, state, state_desc FROM sys.databases WHERE name = 'DEST_DB'. You can keep running this query and send an SMS once it shows the copy has finished.
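Since there is no built-in completion callback, the polling described above can live in a small client-side watcher script that fires your alert when the state changes. Below is a minimal sketch in Python; wait_for_copy is a hypothetical helper, and the actual state check is passed in as a callable (e.g. a function that runs the SELECT state_desc query over your SQL driver of choice), so the alerting hook stays driver-agnostic:

```python
import time

def wait_for_copy(fetch_state, poll_seconds=30, timeout_seconds=1800):
    """Poll the destination database until it leaves the COPYING state.

    fetch_state: callable returning the current state_desc of DEST_DB,
    e.g. by running:
        SELECT state_desc FROM sys.databases WHERE name = 'DEST_DB'
    Returns the final state_desc ('ONLINE' once the copy succeeds).
    """
    deadline = time.monotonic() + timeout_seconds
    while True:
        state = fetch_state()
        if state != "COPYING":
            # copy finished (or failed) -- hook your SMS/email alert here
            return state
        if time.monotonic() >= deadline:
            raise TimeoutError("database copy did not finish in time")
        time.sleep(poll_seconds)
```

The timeout guards against a copy that hangs, so the watcher itself never runs forever.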


snowflake session_id on query_history, can it be reinitialized to have new id per application execution

I want to use the sessionId in Snowflake's query_history to find all the queries executed in one session. It works fine on the Snowflake end when I have different worksheets, which create different sessions. But from other tools (which appear to reuse the same pooled connection until the connection is recreated), multiple jobs show up under the same session id in Snowflake's query_history. Is there a way to have a new sessionID created on every execution? I am using the Control-M scheduling/job-automation tool to execute multiple jobs, each of which runs a different Snowflake stored proc. I want each execution of a procedure to get a different sessionID in the Snowflake query_history table.
Thanks
Djay
You can change the "idle session timeout". See documentation here
You can set it to as low as 5, which means any queries that are at least 5 minutes apart will need to reauthenticate and will have a new session.
CREATE [OR REPLACE] SESSION POLICY do_not_idle SESSION_IDLE_TIMEOUT_MINS = 5
Though I believe this will affect any applications that use your account, and will make your applications need to reauthenticate every time the session expires.
Another option, if you need a window smaller than 5 minutes, is to get the session id and explicitly run an ABORT_SESSION after your query has finished,
which would look something like this
SELECT SYSTEM$ABORT_SESSION(CURRENT_SESSION())

Databasemail working for live environment but not test environment on the same SQL Server instance

One of our offices has an application that utilizes databasemail to send emails from the application to users listed in the application's user directory.
In their live environment, the emails are sent without issue. On their training environment, the emails are not sent. On the application side, the settings to send emails are the same, and the database on the training side is a copy of the live database from a recent restore.
I've tried checking the databasemail logs, but the only events are event_type of information, mostly "DatabaseMail process is started", usually followed 10-20 minutes later by a "DatabaseMail process is shutting down" message.
I'm at a loss as to why messages from the live database are being sent while messages from the training database aren't, even though both databases are on the same SQL Server instance and both applications live on the same server.
Your live and test environments must have the same permissions to run Database Mail. Check that your test user has the DatabaseMailUserRole role in the msdb database, as well as any other permissions your application may need, e.g. for stored procedures in the test database that query msdb.

AWS DataPipeline insert status with SQLActivity

I am looking for a way to record the status of the pipeline in a DB table. Assuming this is a very common use case.
Is there any way where I can record
status and time of completion of the complete pipeline.
status and time of completion of selected individual activities.
the ID of individual runs/execution.
The only way I found was using SQLActivity that is dependent on an individual activity but even there I cannot access the status or timestamp of the parent/node.
I am using a JDBC connection to connect to a remote SQL Server. The pipeline is for copying S3 files into the SQL Server DB.
Hmmm... I haven't tried this, but I can give you some pointers that may achieve the desired result. You will have to do the research and figure out the actual implementation.
Option 1
Create a ShellCommandActivity whose dependsOn is set to the last activity in your pipeline. The shell script would use the AWS CLI list-runs command to get details of the current run; you can use filters to narrow the results.
Use data staging to move the output of the preceding ShellCommandActivity into a SQLActivity, which eventually inserts it into the destination SQL Server.
Option 2
Use an AWS Lambda function to run aws datapipeline list-runs periodically, with filters, and update the destination table with the latest activities.
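To sketch the bookkeeping half of Option 2: assuming the Lambda fetches run objects through the Data Pipeline API (the boto3 describe_objects call itself is left as a comment), each object carries its attributes as a list of key/stringValue fields, and a small helper can flatten those into rows for the status table. extract_run_status is a hypothetical name, not part of any AWS SDK:

```python
def extract_run_status(pipeline_objects):
    """Flatten Data Pipeline run objects into rows for a status table.

    pipeline_objects: list of dicts shaped like the DescribeObjects API
    response objects, i.e.
      {"id": ..., "name": ..., "fields": [{"key": ..., "stringValue": ...}, ...]}
    """
    rows = []
    for obj in pipeline_objects:
        # collapse the field list into a plain dict for easy lookup
        fields = {f["key"]: f.get("stringValue") for f in obj.get("fields", [])}
        rows.append({
            "run_id": obj["id"],
            "activity": obj.get("name"),
            "status": fields.get("@status"),
            "finished_at": fields.get("@actualEndTime"),
        })
    # the Lambda would then UPSERT these rows into the SQL Server table,
    # e.g. via a MERGE statement over a JDBC/ODBC connection
    return rows
```

This keeps the parsing separate from the AWS and database calls, so it is easy to test without either service.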

SQL Server Log File - Why it increases in size when the database is not in use?

We have an ERP application that uses a SQL Server database. Although the application is not currently in use, its log file grows constantly and is currently at 28GB. I took a full backup a couple of weeks ago to truncate the log file when it was 85GB, and now it is back up to 28GB. I've set the log file size to 20GB with 100MB growth increments and unrestricted growth.
Why is the log file size increasing when the application is not in use? How can I see the transactions causing it to grow, and how can I manage it better? Would setting up a server trace be useful in this case?
Check the output of sys.dm_exec_requests and sys.dm_exec_sessions to find out what queries are running and what/who's connected to your database.
If that doesn't help you, then you could certainly set up a server trace or an extended events session - if you really think no one is connecting and querying, you could verify that using something like this:
IF EXISTS (SELECT * FROM sys.server_event_sessions WHERE name = 'EE_Queries')
DROP EVENT SESSION EE_Queries ON SERVER;
GO
CREATE EVENT SESSION EE_Queries ON SERVER
ADD EVENT sqlserver.sql_statement_completed
(ACTION (sqlserver.sql_text,sqlserver.client_app_name,sqlserver.client_hostname,sqlserver.nt_username, sqlserver.username)
WHERE sqlserver.database_id = /* fill in your DB_ID() here */)
ADD TARGET package0.asynchronous_file_target
(SET FILENAME = N'C:\logs\EE_Queries.xel', METADATAFILE = N'C:\logs\EE_Queries.xem')
WITH (max_dispatch_latency = 1 seconds);
GO
ALTER EVENT SESSION EE_Queries ON SERVER STATE = START;
which would log all queries running against your database - just replace the comment with the DB_ID() of the database you want to track. That will start an extended events session that captures every query, including the text of the query, the app name, the host name, and the username (NT and/or SQL) of the user running it. That should clue you in pretty quickly. Just be careful with that events session in a production environment on a busy database - it will likely degrade performance and may fill up the disk quickly if you have lots of queries.

Transmitting sessions id from SQL Server to web scripts

I have a bunch of stored procs doing the business-logic work in a SQL Server instance behind a web application. When something goes wrong, each of them queues a message in a specific table (let's say 'warnings') recording the error severity and issue description. I want the web application using the stored procs to be able to query the message table at the end of the run, but get only the relevant messages, i.e. the messages created during that specific session or connection. So I am unsure whether to:
have the web application send the db a GUID to INSERT as a column value in the message records (but then I somehow have to keep track of it across several stored procs running at the page level, so I need something "global")
OR
whether I can use some id related to the connection opened by the application - which would definitely be saner. Something like this (pseudo code):
SELECT #sessionid = sessionid FROM sys.something where this = that
Have some hints?
Thanks a lot for your ideas
To retrieve your current SessionId in SQL Server you can simply execute the following:
SELECT @@SPID
See:
http://msdn.microsoft.com/en-us/library/ms189535.aspx
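To illustrate the connection-id approach end to end: the stored procs would INSERT @@SPID into the warnings table (a hypothetical session_id column is assumed here), and the web app then filters on its own @@SPID over the same connection. A minimal Python sketch, with the DB-API cursor passed in so any SQL Server driver works; collect_session_warnings is a made-up name:

```python
def collect_session_warnings(cursor):
    """Return only the warnings logged by this connection's session.

    Assumes the stored procs INSERT @@SPID into warnings.session_id
    (hypothetical column) whenever they log a message, and that this
    cursor uses the same connection that ran the procs.
    """
    cursor.execute("SELECT @@SPID")
    spid = cursor.fetchone()[0]
    cursor.execute(
        "SELECT severity, description FROM warnings WHERE session_id = ?",
        (spid,),
    )
    return cursor.fetchall()
```

Note this only works while the app holds the same pooled connection for the whole page run; if the page uses several connections, the GUID approach from the question is the safer choice.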
