Aggregate data from msdb.dbo.sysjobhistory - sql-server

Each time I run an agent job it writes some data to the log. I can access that data via T-SQL like this: SELECT * FROM msdb.dbo.sysjobhistory WHERE step_id = 0. It shows me summary information about the job.
The problem is that if a step failed but its "on failure action" was "go to next step", run_status will still show success.
I've tried to aggregate the data from rows where step_id <> 0, but I don't know how to distinguish each run of a job.
Can you help me with that problem? The best result would be an additional column with the distinct list of statuses that appeared in a job, shown alongside the rows from the first query.
What I want to achieve is a daily report of the jobs that ran yesterday (some of them multiple times during the day).

Try joining your query to:
SELECT * FROM msdb.[dbo].[sysjobsteps]
This table includes the field last_run_outcome.
MSDN link of complete table:
https://msdn.microsoft.com/en-us/library/ms187387.aspx
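For example, a join along these lines might add the per-step outcomes next to the summary row (an untested sketch; note that last_run_outcome in sysjobsteps only reflects the most recent run of each step):
SELECT h.job_id
      ,h.run_date
      ,h.run_time
      ,h.run_status AS job_run_status   -- 1 = succeeded for the job as a whole
      ,s.step_id
      ,s.step_name
      ,s.last_run_outcome               -- 0 = failed, 1 = succeeded
FROM msdb.dbo.sysjobhistory h
JOIN msdb.dbo.sysjobsteps s ON s.job_id = h.job_id
WHERE h.step_id = 0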

Related

Microsoft SQL Server Agent Job: Get Schedule that triggered the Job

I have a SQL Server Agent job on my system that copies data into a table for later evaluation. The job runs on two schedules: every Friday, and on the last day of the month. The target records should also contain a column indicating the schedule that originally triggered the job, but so far I have found no way to receive this information as a parameter or similar. I'm using Microsoft SQL Server 2017.
I did a web search but may have searched for the wrong keywords. I also thought about comparing the current time to the expected runtime per schedule, but that did not seem fault tolerant to me.
I'd like to fill a column "schedule" with values like "End of week" and "End of month".
The msdb system tables are your friend here (see the documentation).
sysjobs has your job information.
sysjobschedules links your job to its schedule.
sysschedules has your schedule info.
SELECT j.*
, s.*
FROM msdb.dbo.sysjobs j
JOIN msdb.dbo.sysjobschedules js ON j.job_id = js.job_id
JOIN msdb.dbo.sysschedules s ON js.schedule_id = s.schedule_id
WHERE j.name = 'your job name here'
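Once the schedules are joined in, you could derive the label the question asks for from the schedule row, for example by matching on the schedule name (the names in the CASE below are hypothetical placeholders for your actual schedule names):
SELECT j.name AS job_name
      ,s.name AS schedule_name
      ,CASE s.name
            WHEN 'Every Friday'      THEN 'End of week'   -- placeholder schedule name
            WHEN 'Last day of month' THEN 'End of month'  -- placeholder schedule name
            ELSE s.name
       END AS [schedule]
FROM msdb.dbo.sysjobs j
JOIN msdb.dbo.sysjobschedules js ON j.job_id = js.job_id
JOIN msdb.dbo.sysschedules s ON js.schedule_id = s.schedule_id
WHERE j.name = 'your job name here'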
After a long search and a lot of analysis I finally found a solution that at least fits my needs:
The undocumented and unsupported stored procedure xp_sqlagent_enum_jobs provides the schedule that triggered a job in the column Request Source ID:
EXEC master.dbo.xp_sqlagent_enum_jobs 1, garbage
See also: https://am2.co/2016/02/xp_sqlagent_enum_jobs_alt/
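To work with that result set in T-SQL you would typically capture it into a temp table first. The column list below follows the layout commonly published for this undocumented procedure, so treat it as an unverified sketch:
-- Capture xp_sqlagent_enum_jobs output so Request Source ID can be queried
CREATE TABLE #agent_jobs (
    job_id                UNIQUEIDENTIFIER NOT NULL,
    last_run_date         INT NOT NULL,
    last_run_time         INT NOT NULL,
    next_run_date         INT NOT NULL,
    next_run_time         INT NOT NULL,
    next_run_schedule_id  INT NOT NULL,
    requested_to_run      INT NOT NULL,
    request_source        INT NOT NULL,
    request_source_id     SYSNAME COLLATE database_default NULL,
    running               INT NOT NULL,
    current_step          INT NOT NULL,
    current_retry_attempt INT NOT NULL,
    job_state             INT NOT NULL
);

INSERT INTO #agent_jobs
EXEC master.dbo.xp_sqlagent_enum_jobs 1, '';  -- job owner argument; the answer above passes an arbitrary string

SELECT job_id, request_source, request_source_id
FROM #agent_jobs;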

How to audit who ran a query on the abcd database in Snowflake?

I have an abcd database. One user has executed a query on the abcd database, another user has executed another query on it, and so on for different users. I need to capture who executed each query, the execution time, etc.
This information is gathered in Access History (https://docs.snowflake.com/en/user-guide/access-history.html). The same details can also be checked in Snowflake under Query History, where the filters let you look up the details by username or any other parameter.
You can use the QUERY_HISTORY view available in the SNOWFLAKE.ACCOUNT_USAGE schema to get complete information about queries from the last 365 days, including user_name, execution time, etc.
Please note that latency for the view may be up to 45 minutes.
https://docs.snowflake.com/en/sql-reference/account-usage/query_history.html#query-history-view
Also, you can use the QUERY_HISTORY table function available in INFORMATION_SCHEMA to retrieve query information from the last 7 days, with no latency.
Please review the below documentation for more information.
https://docs.snowflake.com/en/sql-reference/functions/query_history.html#query-history-query-history-by
https://docs.snowflake.com/en/sql-reference/account-usage.html#differences-between-account-usage-and-information-schema
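For example, the table function could be called along these lines (a sketch; adjust the time range and result limit to your needs):
-- Last 7 days of query history via the table function (no latency)
SELECT query_id, user_name, database_name, start_time, execution_time
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(
        END_TIME_RANGE_START => DATEADD('day', -7, CURRENT_TIMESTAMP()),
        RESULT_LIMIT => 1000))
ORDER BY start_time DESC;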
For the ACCOUNT_USAGE view, you can use the query below (add columns as you need):
SELECT DISTINCT qh.query_id, qh.user_name, qh.database_name, qh.start_time, qh.execution_time
FROM "SNOWFLAKE"."ACCOUNT_USAGE"."QUERY_HISTORY" qh
WHERE
-- qh.query_id = ''             -- if you know the query id, use it here
-- qh.user_name = 'USERNAME'    -- you can filter by user name
qh.database_name = 'DBNAME'     -- you can filter by database name
AND qh.start_time > '2022-06-29 12:45:36.291'  -- you can filter by date
;
If you want to track the IP address and the application from which the query was run, you can use the query below as well:
SELECT DISTINCT qh.query_id, lh.client_ip, qh.user_name, s.client_application_id,
       qh.database_name, qh.start_time, qh.execution_time
FROM "SNOWFLAKE"."ACCOUNT_USAGE"."LOGIN_HISTORY" lh
INNER JOIN "SNOWFLAKE"."ACCOUNT_USAGE"."QUERY_HISTORY" qh
    ON qh.user_name = lh.user_name
INNER JOIN "SNOWFLAKE"."ACCOUNT_USAGE"."SESSIONS" s
    ON s.session_id = qh.session_id
   AND s.login_event_id = lh.event_id
WHERE
-- qh.query_id = ''             -- if you know the query id, use it here
-- qh.user_name = 'USERNAME'    -- if you know the user name, use it here
qh.database_name = 'DBNAME'     -- filter by database name
AND qh.start_time > '2022-06-29 12:45:36.291'  -- filter by date as required
;

How can I find out if someone modified a row in SQL Server on a specific date?

I am just wondering: can I find out whether somebody ran a query that updated a row in a specific table on some date?
I tried this:
SELECT id, name
FROM sys.sysobjects
WHERE NAME = ''
SELECT TOP 1 *
FROM ::fn_dblog(NULL,NULL)
WHERE [Lock Information] LIKE '%TheOutoput%'
It does not show me anything.
Any suggestions?
No, row-level history/change stamps are not built into SQL Server. You need to add that in the table design. If you want an automatic "last modified" column, it would typically be set by a trigger on the table.
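A minimal sketch of that trigger approach, assuming a table dbo.MyTable with an id key column (all object names here are placeholders):
-- Add audit columns and keep them up to date with an AFTER UPDATE trigger
ALTER TABLE dbo.MyTable ADD modified_at DATETIME2 NULL, modified_by SYSNAME NULL;
GO
CREATE TRIGGER dbo.trg_MyTable_SetModified
ON dbo.MyTable
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE t
    SET modified_at = SYSUTCDATETIME(),
        modified_by = SUSER_SNAME()
    FROM dbo.MyTable AS t
    INNER JOIN inserted AS i ON i.id = t.id;  -- assumes an 'id' key column
END;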
There is, however, a way to find out what happened in a forensics scenario, but it is only available if you have the right backup plan. What you can do then is use the database transaction log to find when the modification was done. Note that this is not something an application can or should do at runtime.
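A rough sketch of the log-reading approach with the undocumented fn_dblog function, assuming the affected table is dbo.MyTable and the relevant log records have not yet been truncated:
-- Find row modifications on dbo.MyTable and who started the containing transaction
SELECT b.[Begin Time]
      ,SUSER_SNAME(b.[Transaction SID]) AS modified_by
      ,m.Operation
      ,m.AllocUnitName
FROM fn_dblog(NULL, NULL) m
INNER JOIN fn_dblog(NULL, NULL) b
    ON b.[Transaction ID] = m.[Transaction ID]
   AND b.Operation = 'LOP_BEGIN_XACT'
WHERE m.Operation IN ('LOP_MODIFY_ROW', 'LOP_MODIFY_COLUMNS')
  AND m.AllocUnitName LIKE 'dbo.MyTable%';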

How to get exactly the same list of "Parameters used" shown in the All Executions Overview report by using T-SQL from SSISDB database?

Is it possible to get exactly the same parameters as shown in the All Executions Overview report (see the screenshot below)?
I was trying to use the table [internal].[execution_parameter_values] from SSISDB and filter it by execution_id, but it returns many more parameters than the report does. I have also tried filtering on columns such as "value_set" and "object_type", but it still did not return the same list as in the report.
Reference:
https://learn.microsoft.com/en-us/sql/integration-services/system-views/views-integration-services-catalog?view=sql-server-2017
execution_parameter_values: Displays the actual parameter values that are used by Integration Services packages during an instance of execution.
Whenever the package is executed, records are inserted into that table. You need to determine the execution_id that you want to filter on.
You can get that from [catalog].[executions] in the SSIS DB. Filter based on your project or package and when it was executed.
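For example, a lookup of the most recent execution of a given package could look like this (a sketch; the project and package names are placeholders):
-- Find the execution_id (shown as "Operation ID" in the report) of the latest run
SELECT TOP (1) e.execution_id, e.folder_name, e.project_name, e.package_name, e.start_time
FROM SSISDB.catalog.executions AS e
WHERE e.project_name = 'YourProject'
  AND e.package_name = 'YourPackage.dtsx'
ORDER BY e.start_time DESC;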
You will also see that value in the execution Overview report as the "Operation ID".
You can then filter based on that value:
SELECT * FROM [internal].[execution_parameter_values]
WHERE [execution_id] = 16529
The Overview report in the SSIS catalog shows only the TOP 25 parameters used, sorted by parameter_name ascending.
It is also necessary to filter out the records whose parameter_name contains a "." character (such as connection manager properties), which is what the NOT LIKE below does.
So the resulting T-SQL script would be:
SELECT TOP 25
       [parameter_name]
      ,[parameter_value]
      ,[parameter_data_type]
FROM [SSISDB].[internal].[execution_parameter_values]
WHERE execution_id = #execution_id  -- replace #execution_id with your execution/operation ID
  AND parameter_name NOT LIKE '%.%'
ORDER BY parameter_name

How to monitor an SQL Server Agent Job that runs after being triggered

I'm running SQL Server 2008 and I have 3 agent jobs set up to run one after the other because the success of the second depends on the first, etc.
Every other job I have is independent, and I monitor them using a script that incorporates the MSDB..sysjobhistory run_status field, among others.
I want to know if there is a way to find all jobs that never started on a specific day. I know I could approach the problem the other way around and say that job 2 couldn't possibly run if job 1 failed; however, I'm looking for a more general-purpose solution, so that if I need to create other jobs linked in a similar way I won't have to hard-code more logic into my nightly report.
Any suggestions are welcome!
The MSDB..sysjobservers table holds a single record for every agent job on your server and includes the field last_run_date, which makes it easy to tell which jobs haven't run in the last 24 hours. My query looks something like this:
SELECT A.name AS [Name]
      ,@@SERVERNAME AS [Server]
      ,'Yes' AS [Critical]
FROM MSDB.dbo.sysjobs A
INNER JOIN MSDB.dbo.sysjobservers B ON A.job_id = B.job_id
WHERE A.name LIKE '2%SPRING%'
AND DATEDIFF(DAY, CONVERT(DATETIME, CONVERT(CHAR(8), B.last_run_date)), GETDATE()) > 1
