I have a report running against a Data Driven subscription in SSRS. The subscription runs a report and produces PDFs - about 1000 of them. The process takes about 2 minutes to complete.
I have been kicking this off manually using the following SQL:
EXEC msdb.dbo.sp_start_job @job_name = '<job_name>'
This works, but what I would like to know is when the job has finished. According to what I have read so far, I should be able to run:
exec msdb.dbo.sp_help_job
This lists my job, but it always has a status of 4 (Idle), even while I can see that reports are being produced.
How can I tell when the job has completed and all my reports have been produced?
msdb doesn't contain information about the report server. Reporting Services runs separately from the SQL Server Agent job: the job history will only tell you whether the job ran, not what happened inside it. If you have access to the report server database (I don't know how yours is set up, but mine has a Subscriptions table), you can check whether the email was sent and when. If you don't have that, you can open the Report Server web site, find the subscription, and check its status and the date it was last sent.
The only way to get at this information from SQL Server Management Studio is by querying the report server database and its tables, assuming it is set up correctly.
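If what you need is to know when the Agent job itself has finished, you can poll msdb.dbo.sysjobactivity: a running job has a start_execution_date but a NULL stop_execution_date. A minimal sketch (the job name is a placeholder):

SELECT TOP (1)
       ja.start_execution_date,
       ja.stop_execution_date   -- NULL while the job is still running
FROM   msdb.dbo.sysjobactivity AS ja
JOIN   msdb.dbo.sysjobs        AS j ON j.job_id = ja.job_id
WHERE  j.name = '<job_name>'
ORDER  BY ja.session_id DESC;

Note that this only tells you when the Agent job step finished. The data-driven subscription it kicks off is processed asynchronously by Reporting Services, which is why sp_help_job can report Idle while reports are still being produced.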
When I execute a SELECT statement with 4 INNER JOINs and two WHERE conditions, it takes 13-15 seconds in my local SSMS (I have run it 5 times). But when I connect to the same instance from another server's SSMS and run the same query, it took 5 seconds the first time and then 0 seconds! I am using the same login, sa.
Is there any possible explanation for that?
Host instance is SQL 2008 and Remote instance has SQL 2008 R2.
If your query is returning data to display in your local SSMS then this data needs to be transferred from the server to your local SSMS. The time to transfer the data from the server to your local SSMS is included in the execution time. So, the execution time is a combination of executing the script and fetching the data in order to display it.
You might want to enable "Include Client Statistics" and then review the row "Bytes received from server" in the "Client Statistics" tab of the result window.
To verify this assumption, you can alter your SELECT so that it executes without fetching the data.
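One way to do that (a sketch; the table and column names are placeholders) is to assign the result columns to variables, so the query still executes in full on the server but no rows are streamed back to SSMS:

DECLARE @Title varchar(100);

-- the variable assignment forces the whole query to run server-side,
-- but no result set is transferred to the client
SELECT @Title = t.Title
FROM   dbo.SomeTable AS t;   -- substitute your joins and WHERE clause here

If the local timings now match the remote ones, the difference was the time spent transferring and rendering the rows, not executing the query.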
Is there a way/method/tool to monitor, or to find out, which application or service is inserting records into a table in MS SQL Server?
If you installed it as part of your SQL Client Tools installation, you can use the SQL Server Profiler tool to perform a trace of the activity taking place on a specific instance of SQL server. This includes capturing the actual sql batches which are inserting the data into your database.
When you set up the trace, select the SQL:BatchStarting (under the TSQL events) and RPC:Starting (under Stored Procedures) events. For each event, select the following columns to be included in the trace:
TextData - Will contain the actual query being executed. Look in here for your insert queries.
SPID
StartTime
ApplicationName - Will contain the name of the client application, if the client is configured with an application name
ClientProcessID - Will contain the process ID of the client application calling SQL Server
DatabaseID
DatabaseName
HostName - Will contain the name of the computer on which the client is running
LoginName - Will contain the login of the user (either the SQL Server or Windows login)
You can add a filter on either the DatabaseID or DatabaseName fields so the trace only returns events from the database you are interested in tracking down the inserts on.
Additionally, if you have an idea of how the insert is being made (for instance, a specific stored procedure being called to execute it), you can define a filter on the TextData field in the format %stored_procedure_name%, where the % symbols are wildcards and the text between them represents a portion of the query that is inserting the data.
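If you are on SQL Server 2008 or later and prefer a scripted, lightweight alternative to Profiler, an Extended Events session can capture much the same information. A sketch (the session name and database name are placeholders):

CREATE EVENT SESSION trace_inserts ON SERVER
ADD EVENT sqlserver.sql_batch_completed (
    ACTION (sqlserver.client_app_name,     -- application name
            sqlserver.client_hostname,     -- client machine
            sqlserver.client_pid,          -- client process ID
            sqlserver.username)            -- login
    WHERE sqlserver.database_name = N'YourDatabase'
)
ADD TARGET package0.ring_buffer;

ALTER EVENT SESSION trace_inserts ON SERVER STATE = START;

You can then inspect the ring buffer target (or swap in a file target) to find the batches containing your INSERT statements.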
If you install Microsoft SQL Server Management Studio, the "Activity Monitor" will apparently show you the process name of a given connection (and, e.g., what the last executed statement was).
I have a SQL Server job that sends out a notification every time it fails. I run this job every 5 minutes. If something the job needs has its state changed at 10:00 PM at night and the problem isn't discovered until the next morning, there would be over 100 e-mails sent with the same failure notification.
Is there any way to throttle the number of e-mails sent? For example, I would like SQL Server to send out an e-mail notification on the hour if any of the 12 scheduled runs during that hour fails, but no more than one e-mail should be sent (even if there are multiple failures).
You could create a simple table with the job name and a bit column indicating whether an email has been sent successfully, then check that column as part of the alert.
CREATE TABLE dbo.Notifications
(
    ID int IDENTITY (1, 1) NOT NULL PRIMARY KEY,
    JobName varchar(50),
    MailSent bit NOT NULL DEFAULT 0
);
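The check in the job step could then look something like this (a sketch; the profile name, recipient, and job name are placeholders, and you would need a separate step or schedule to reset MailSent once the problem is resolved):

IF NOT EXISTS (SELECT 1 FROM dbo.Notifications
               WHERE JobName = '<job_name>' AND MailSent = 1)
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
         @profile_name = 'AlertsProfile',        -- placeholder mail profile
         @recipients   = 'dba@example.com',      -- placeholder recipient
         @subject      = 'Job <job_name> failed',
         @body         = 'The job has failed; further notifications are suppressed.';

    -- record that a mail has gone out so subsequent failures stay silent
    UPDATE dbo.Notifications SET MailSent = 1 WHERE JobName = '<job_name>';
    IF @@ROWCOUNT = 0
        INSERT dbo.Notifications (JobName, MailSent) VALUES ('<job_name>', 1);
END;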
In SSMS, select Database Mail under Management, right-click, and choose Configure Database Mail. When that's complete, send a test mail using the new profile to confirm that it works.
Next, open the properties of the SQL Server Agent. On the Alert System page, enable the mail profile. Restart the SQL Server Agent for the change to take effect.
Under the SQL Server Agent, create Operators to send messages to.
Finally, in your job, select Notifications and set the Operator to notify when the job completes.
We have two SQL Server databases: a source database populated from an external system, and a destination database on a remote server (used by a web app). An SSIS package maps columns from the source tables to the destination tables (the column names differ) and copies the data across to keep the two in sync.
To verify that both databases are in sync, we have a stored procedure that reports record counts and, for some parent-child relationships, the child count for each parent record (e.g. the item count per brand). Today someone has to log on to both servers, execute the stored procedure on each, and compare the results manually.
Now, to automate this process, we've done the following:
Add the destination server as a "Linked Server"
Use "EXEC msdb.dbo.sp_send_dbmail" along with "#attach_query_result_as_file =1"
Create an SSIS job which will execute the email SP for both the servers
So this is how we get two emails, each with the query results attached. Comparing the two text files then completes the db sync check.
I believe this can be made better now that we're able to access the destination server as a linked server. It's my first time doing this, so I'd ask some experienced folks to share their approach, probably something beyond a join query against the linked server.
Since you have access to the destination server as a linked server, you can run a single query and compare the data directly.
Please check this
You can then modify the SSIS job to send mail based on this query's result.
I'm using the following query (a simplified version), which gives me the differences from both sides:
(SELECT s.Title, s.Description FROM ERPMasterBrand AS s
 EXCEPT
 SELECT d.Title, d.Description FROM MasterBrand AS d)
UNION
(SELECT s.Title, s.Description FROM MasterBrand AS s
 EXCEPT
 SELECT d.Title, d.Description FROM ERPMasterBrand AS d)
Any better suggestions? I've tested it and it gives the desired results. Hope I'm not being misguided by my own solution. :-)
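For what it's worth, the same symmetric-difference check can be run from the source server alone once the destination is reachable as a linked server. A sketch, assuming a linked server named DestServer and a destination database named DestDB (both placeholders):

(SELECT s.Title, s.Description
 FROM   dbo.ERPMasterBrand AS s
 EXCEPT
 SELECT d.Title, d.Description
 FROM   DestServer.DestDB.dbo.MasterBrand AS d)
UNION
(SELECT d.Title, d.Description
 FROM   DestServer.DestDB.dbo.MasterBrand AS d
 EXCEPT
 SELECT s.Title, s.Description
 FROM   dbo.ERPMasterBrand AS s);

A non-empty result means the tables are out of sync, so a single job on the source server could run this and call sp_send_dbmail only when rows come back, instead of emailing two result files to compare by hand.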