SQL Server alternative for Oracle's all_scheduler_jobs - sql-server

I am replicating an application developed in Oracle to SQL Server.
I need equivalents for these Oracle sources:
all_scheduler_jobs
all_scheduler_running_jobs
all_scheduler_job_run_details
I am building it from these, but maybe someone has already done it:
msdb.dbo.sysjobactivity
msdb.dbo.sysjobs
msdb.dbo.sysjobhistory
I am mostly interested in timing, scheduling and status information.

You can look at sp_help_job and the underlying objects it ultimately calls (sys.xp_sqlagent_enum_jobs and dbo.sp_get_composite_job_info):
EXEC msdb.dbo.sp_help_job;
EXEC sys.xp_sqlagent_enum_jobs 1, 'sa';
EXEC msdb.dbo.sp_get_composite_job_info
@job_id = 0x08549D549D494D4AA5802BE0A202EA52;
Those are relevant calls to get the information for all jobs (or a single job), but you can also view the source code for the first and last stored procedure, which will help you reverse engineer the queries you'll need to simulate a view.
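For example, here is a rough sketch of something like all_scheduler_job_run_details built only from the msdb history tables (the column mapping and the use of the msdb.dbo.agent_datetime helper are assumptions, not an exact equivalent):
-- Sketch only: approximate Oracle's all_scheduler_job_run_details from msdb
SELECT j.name                                          AS job_name,
       msdb.dbo.agent_datetime(h.run_date, h.run_time) AS actual_start_date,
       h.run_duration                                  AS run_duration_hhmmss, -- integer encoded as HHMMSS
       CASE h.run_status
            WHEN 0 THEN 'FAILED'
            WHEN 1 THEN 'SUCCEEDED'
            WHEN 2 THEN 'RETRY'
            WHEN 3 THEN 'CANCELED'
            ELSE 'OTHER'
       END                                             AS status,
       h.message                                       AS additional_info
FROM   msdb.dbo.sysjobs AS j
JOIN   msdb.dbo.sysjobhistory AS h
       ON h.job_id = j.job_id
WHERE  h.step_id = 0;   -- step 0 is the job outcome row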
Relevant thread, perhaps.

Related

Procedure - 'executed as'

I created the following proc:
ALTER PROCEDURE secret
WITH EXECUTE AS ...
AS
begin
end
I want to find out (by query) which user account the Database Engine uses to validate permissions on this object.
SELECT user_name();
More info here: http://msdn.microsoft.com/en-us/library/ms188014.aspx
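If you want to read that setting from metadata rather than from inside the procedure, here is a sketch against sys.sql_modules (the interpretation of NULL and -2 follows the documented meaning of execute_as_principal_id; the procedure name is the one from the question):
-- Sketch: inspect the EXECUTE AS setting recorded for a module
SELECT OBJECT_NAME(m.object_id) AS module_name,
       CASE
            WHEN m.execute_as_principal_id IS NULL THEN 'CALLER'
            WHEN m.execute_as_principal_id = -2    THEN 'OWNER'
            ELSE USER_NAME(m.execute_as_principal_id)
       END AS executes_as
FROM   sys.sql_modules AS m
WHERE  m.object_id = OBJECT_ID('dbo.secret');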
The short answer is that the information may not be available if you weren't specifically capturing it, and you'd typically capture it with a DDL trigger. If this is something that you think you're going to be interested in going forward, set this up now so you can answer the question next time.
Another option that you have (assuming that the information hasn't rolled off) is to look in the default trace. According to this article, one of the things it captures is "Audit Schema Object GDR events" where "GDR" is "grant/deny/revoke". In other words, exactly what you're looking for. If it happened a while ago, though, you're going to be out of luck.
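Here is a sketch of pulling those events out of the default trace (this assumes the default trace is still enabled, and that filtering on event names containing 'GDR' is good enough):
-- Sketch: pull grant/deny/revoke (GDR) audit rows from the default trace
DECLARE @path nvarchar(260);
SELECT @path = path FROM sys.traces WHERE is_default = 1;

SELECT t.StartTime, t.LoginName, t.DatabaseName, t.ObjectName, t.TextData
FROM   sys.fn_trace_gettable(@path, DEFAULT) AS t
JOIN   sys.trace_events AS e ON e.trace_event_id = t.EventClass
WHERE  e.name LIKE '%GDR%'
ORDER BY t.StartTime DESC;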
However, you can check in the sys.database_permissions system view. Do something like:
select user_name(grantor_principal_id)
from sys.database_permissions
where major_id = object_id('yourProcedure')
and grantee_principal_id = user_id('yourUser')
Most of the time, however, this is going to return 'dbo', which is entirely accurate if the grant was run by, for instance, someone on your DBA team, which is quite likely.

SQL Server 2008: find out which stored procedures write to a certain table

I am trying to hunt down a certain stored procedure that writes to a certain table (it needs to be changed), but going through every single stored procedure is not a route I really want to take. So I was hoping there might be a way to find out which stored procedures INSERT or UPDATE a certain table.
I have tried using this method (pinal_daves_blog), but it is not giving me any results.
NOTICE: The stored procedure might not be in the same DB!
Is there another way, or can I somehow check which procedure/function made the last insert or update to the table?
One brute-force method would be to download an add-in from RedGate called SQL Search (free), then do a stored procedure search for the table name. I'm not affiliated at all with RedGate or anything; this is just a method that I have used to find similar things, and it has served me well.
http://www.red-gate.com/products/sql-development/sql-search/
If you go this route, you just type in the table name, change the 'object types' drop-down selection to 'Procedures' and select 'All databases' in the DB drop-down.
Hope this helps! I know it isn't the most technical solution, but it should work.
There is no built-in way to tell what function, procedure, or executed batch has made the last change to a table. There just isn't. Some databases have this as part of their transaction logging but SQL Server isn't one of them.
I have wondered in the past whether transactional replication might provide that information, if you already have that set up, but I don't know whether that's true.
If you know the change has to be taking place in a stored procedure (as opposed to someone using SSMS or executing lines of SQL via ADO.NET), then @koppinjo's suggestion is a good one, as is this one from Pinal Dave's blog:
USE AdventureWorks
GO
--Searching for Employee table
SELECT Name
FROM sys.procedures
WHERE OBJECT_DEFINITION(OBJECT_ID) LIKE '%Employee%'
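Since the procedure might not live in the same database (as the question notes), here is a hedged sketch that repeats the search in every database using the undocumented sp_MSforeachdb:
-- Sketch: search all databases for procedures whose definition mentions the table
EXEC sp_MSforeachdb N'
    USE [?];
    SELECT DB_NAME() AS database_name, name AS procedure_name
    FROM   sys.procedures
    WHERE  OBJECT_DEFINITION(object_id) LIKE ''%Employee%'';';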
There are also dependency functions, though they can be outdated or incomplete:
select * from sys.dm_sql_referencing_entities( 'dbo.Employee', 'object' )
You could run a trace in Profiler. The procedure would have to write to the table while the trace is running for you to catch it.

Good way to call multiple SQL Server Agent jobs sequentially from one main job?

I've got several SQL Server Agent jobs that should run sequentially. To keep a nice overview of the jobs that should execute, I have created a main job that calls the other jobs with EXEC msdb.dbo.sp_start_job N'TEST1'. sp_start_job returns instantly (Job Step 1), but I want my main job to wait until job TEST1 has finished before calling the next job.
So I have written this small script that starts executing right after the job is called (Job Step 2), and forces the main job to wait until the sub job has finished:
WHILE 1 = 1
BEGIN
WAITFOR DELAY '00:05:00.000';
SELECT *
INTO #jobs
FROM OPENROWSET('SQLNCLI', 'Server=TESTSERVER;Trusted_Connection=yes;',
'EXEC msdb.dbo.sp_help_job @job_name = N''TEST1'',
@execution_status = 0, @job_aspect = N''JOB''');
IF NOT (EXISTS (SELECT top 1 * FROM #jobs))
BEGIN
BREAK
END;
DROP TABLE #jobs;
END;
This works well enough, but I get the feeling that smarter and/or safer (WHILE 1 = 1?) solutions should be possible.
I'm curious about the following things, hope you can provide me with some insights:
What are the problems with this approach?
Can you suggest a better way to do this?
(I posted this question at dba.stackexchange.com as well, to profit from the less-programming-more-dba'ing point of view too.)
If you choose to poll a table, then you'd need to look at msdb.dbo.sysjobhistory and wait until the run_status is not 4. Still gonna be icky though.
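For instance, here is a sketch of that polling done locally against msdb.dbo.sysjobactivity instead of through OPENROWSET (the job name TEST1 and the 'started but not stopped' check are assumptions; stale rows from old Agent sessions may need an extra filter on the latest session_id):
-- Sketch: wait while job TEST1 still has an execution that started but has not stopped
WHILE EXISTS (SELECT 1
              FROM msdb.dbo.sysjobactivity AS a
              JOIN msdb.dbo.sysjobs AS j ON j.job_id = a.job_id
              WHERE j.name = N'TEST1'
                AND a.start_execution_date IS NOT NULL
                AND a.stop_execution_date IS NULL)
BEGIN
    WAITFOR DELAY '00:00:30';
END;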
Perhaps a different approach would be for the last step of each job, on failure or success, to make an entry back on the "master" job server saying that the process has completed, and then you simply look locally. That might also make tracking down "what the heck happened" easier by consolidating starts and stops at a centralized job server.
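A minimal sketch of that idea (the log table and its columns are hypothetical):
-- Hypothetical central log table on the "master" job server
CREATE TABLE dbo.JobCompletionLog (
    job_name    sysname  NOT NULL,
    finished_at datetime NOT NULL DEFAULT GETDATE(),
    succeeded   bit      NOT NULL
);

-- Added as the final step of each sub-job (one variant for success, one for failure)
INSERT INTO dbo.JobCompletionLog (job_name, succeeded)
VALUES (N'TEST1', 1);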
A third and much more robust approach would be to use something like Service Broker to handle communicating and signaling between processes. That'll require much more setup, but it'd be the most robust mechanism for communicating between processes.
No problem with the approach. I was doing something much like your requirement, and I used the sysjobhistory table from msdb to see the run status, for some other reasons.
Coming back to your question: the msdb.dbo.sp_start_job stored procedure uses the same approach, and it is used by the default Microsoft BizTalk job 'MessageBox_Message_ManageRefCountLog_BizTalkMsgBoxDb' to call another dependent default BizTalk job, 'MessageBox_Message_Cleanup_BizTalkMsgBoxDb'. There is also a stored procedure in the BizTalk MessageBox to check the status of a job; see 'int_IsAgentJobRunning' in the BizTalk MessageBox.

SQL Server 2005 SSIS/Agent - Query status of a job

Is there a way to query the current status (executing, idle, etc.), the last result (successful, failed, etc.), and the last run time for a specific job name? The end result I am looking for is being able to display this information in an internal web application for various SSIS packages.
You should be able to find this information in msdb - there are tables sysjobs, sysjobhistory and sysjobsteps which give the information that you are looking for.
exec msdb.dbo.sp_help_job @job_name = 'TheJobName'
gives the information I want. So then I can just use a SqlDataReader to get the information. Note that this stored procedure returns multiple result sets.
The Microsoft documentation on this stored procedure is:
http://msdn.microsoft.com/en-us/library/ms186722(SQL.90).aspx
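If a single result set is easier to feed to a SqlDataReader, here is a sketch built directly on the msdb tables (the status/result mappings and the msdb.dbo.agent_datetime helper are assumptions):
-- Sketch: last run, last result and a rough "is it running now" flag for one job
SELECT j.name,
       lastrun.last_run_time,
       lastrun.last_result,
       CASE WHEN act.start_execution_date IS NOT NULL
             AND act.stop_execution_date IS NULL
            THEN 'Executing' ELSE 'Idle' END AS current_status
FROM   msdb.dbo.sysjobs AS j
OUTER APPLY (SELECT TOP (1)
                    msdb.dbo.agent_datetime(h.run_date, h.run_time) AS last_run_time,
                    CASE h.run_status WHEN 1 THEN 'Succeeded'
                                      WHEN 0 THEN 'Failed'
                                      ELSE 'Other' END              AS last_result
             FROM   msdb.dbo.sysjobhistory AS h
             WHERE  h.job_id = j.job_id AND h.step_id = 0           -- job outcome rows
             ORDER BY h.instance_id DESC) AS lastrun
OUTER APPLY (SELECT TOP (1) a.start_execution_date, a.stop_execution_date
             FROM   msdb.dbo.sysjobactivity AS a
             WHERE  a.job_id = j.job_id
             ORDER BY a.session_id DESC) AS act
WHERE  j.name = N'TheJobName';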
Another solution I have used is to update a reference table with the current status. It's quick to set up and usually makes it very easy to retrieve the values you need.
For example, as soon as a package kicks off, insert a record with date and time, package name, etc.

Oracle: is there a tool to trace queries, like Profiler for SQL Server? [closed]

I work with SQL Server, but I must migrate to an application with an Oracle DB.
To trace my application queries, in SQL Server I use the wonderful Profiler tool. Is there something equivalent for Oracle?
I found an easy solution:
Step 1. Connect to the DB with an admin user using PL/SQL Developer, SQL Developer or any other query interface.
Step 2. Run the script below; in the S.SQL_FULLTEXT column, you will see the executed queries.
SELECT
S.LAST_ACTIVE_TIME,
S.MODULE,
S.SQL_FULLTEXT,
S.SQL_PROFILE,
S.EXECUTIONS,
S.LAST_LOAD_TIME,
S.PARSING_USER_ID,
S.SERVICE
FROM
SYS.V_$SQL S,
SYS.ALL_USERS U
WHERE
S.PARSING_USER_ID=U.USER_ID
AND UPPER(U.USERNAME) IN ('oracle user name here')
ORDER BY TO_DATE(S.LAST_LOAD_TIME, 'YYYY-MM-DD/HH24:MI:SS') desc;
The only issue with this is that I can't find a way to show the input parameter values (for function calls), but at least we can see what is run in Oracle and in what order, without using a specific tool.
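For the bind values, one hedged option on 10g and later is V$SQL_BIND_CAPTURE; bind capture is only sampled periodically, so not every execution's values will be there:
-- Sketch: captured bind values for recently parsed statements of a given user
SELECT s.sql_id,
       s.sql_text,
       b.name          AS bind_name,
       b.datatype_string,
       b.value_string,
       b.last_captured
FROM   v$sql s
JOIN   v$sql_bind_capture b
       ON b.sql_id = s.sql_id
      AND b.child_number = s.child_number
WHERE  s.parsing_schema_name = 'ORACLE_USER_NAME_HERE'
ORDER  BY s.last_active_time DESC, b.position;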
You can use Oracle Enterprise Manager to monitor active sessions, with the query that is being executed, its execution plan, locks, some statistics, and even a progress bar for the longer tasks.
See: http://download.oracle.com/docs/cd/B10501_01/em.920/a96674/db_admin.htm#1013955
Go to Instance -> sessions and watch the SQL Tab of each session.
There are other ways. Enterprise Manager just presents, with pretty colors, what is already available in special views like those documented here:
http://www.oracle.com/pls/db92/db92.catalog_views?remark=homepage
And, of course, you can also use EXPLAIN PLAN FOR, the TRACE tool and tons of other ways of instrumentation. There are reports in Enterprise Manager for the most expensive SQL queries. You can also search recent queries kept in the cache.
alter system set timed_statistics=true
--or
alter session set timed_statistics=true --if you want to trace your own session
-- must be big enough:
select value from v$parameter p
where name='max_dump_file_size'
-- Find out the sid and serial# of the session you are interested in:
select sid, serial# from v$session
where ...your_search_params...
--you can begin tracing with the 10046 event; the fourth parameter sets the trace level (12 is the highest):
begin
sys.dbms_system.set_ev(sid, serial#, 10046, 12, '');
end;
--turn off tracing by setting the level to zero:
begin
sys.dbms_system.set_ev(sid, serial#, 10046, 0, '');
end;
/*possible levels:
0 - turned off
1 - minimal level. Much like set sql_trace=true
4 - bind variables values are added to trace file
8 - waits are added
12 - both bind variable values and wait events are added
*/
--the same if you want to trace your own session at a higher level:
alter session set events '10046 trace name context forever, level 12';
--turn off:
alter session set events '10046 trace name context off';
--the file with the raw trace information will be located in:
select value from v$parameter p
where name='user_dump_dest'
--the name of the file (*.trc) will contain the spid:
select p.spid from v$session s, v$process p
where s.paddr=p.addr
and ...your_search_params...
--you can also set the name yourself:
alter session set tracefile_identifier='UniqueString';
--finally, use TKPROF to make the trace file more readable:
C:\ORACLE\admin\databaseSID\udump>
C:\ORACLE\admin\databaseSID\udump>tkprof my_trace_file.trc output=my_file.prf
TKPROF: Release 9.2.0.1.0 - Production on Wed Sep 22 18:05:00 2004
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
C:\ORACLE\admin\databaseSID\udump>
--to check the state of tracing, use:
set serveroutput on size 30000;
declare
ALevel binary_integer;
begin
SYS.DBMS_SYSTEM.Read_Ev(10046, ALevel);
if ALevel = 0 then
DBMS_OUTPUT.Put_Line('sql_trace is off');
else
DBMS_OUTPUT.Put_Line('sql_trace is on');
end if;
end;
/
This is mostly a translation of http://www.sql.ru/faq/faq_topic.aspx?fid=389. The original is fuller, but this is better than what others posted, IMHO.
GI Oracle Profiler v1.2
It's a tool for Oracle that captures executed queries, similar to SQL Server Profiler.
It is an indispensable tool for the maintenance of applications that use this database server.
You can download it from the official site, iacosoft.com.
Try PL/SQL Developer; it has a nice, user-friendly GUI interface to the profiler. It's pretty nice; give the trial a try. I swear by this tool when working on Oracle databases.
http://www.allroundautomations.com/plsqldev.html?gclid=CM6pz8e04p0CFQjyDAodNXqPDw
Seeing as I've just voted a recent question as a duplicate and pointed it in this direction...
A couple more: in SQL*Plus, SET AUTOTRACE ON will give the explain plan and statistics for each statement executed.
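A minimal SQL*Plus illustration (the statistics portion needs the PLUSTRACE role; the query is just a placeholder):
SET AUTOTRACE ON
-- the plan and statistics are printed after each statement's results
SELECT COUNT(*) FROM dual;
SET AUTOTRACE OFF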
TOAD also allows for client side profiling.
The disadvantage of both of these is that they only tell you the execution plan for the statement, but not how the optimiser arrived at that plan - for that you will need lower level server side tracing.
Another important one to understand is Statspack snapshots - they are a good way for looking at the performance of the database as a whole. Explain plan, etc, are good at finding individual SQL statements that are bottlenecks. Statspack is good at identifying the fact your problem is that a simple statement with a good execution plan is being called 1 million times in a minute.
The catch is to capture all SQL run between two points in time, the way SQL Server Profiler does.
There are situations where it is useful to capture the SQL that a particular user is running in the database. Usually you would simply enable session tracing for that user, but there are two potential problems with that approach.
The first is that many web based applications maintain a pool of persistent database connections which are shared amongst multiple users.
The second is that some applications connect, run some SQL and disconnect very quickly, making it tricky to enable session tracing at all (you could of course use a logon trigger to enable session tracing in this case).
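A sketch of such a logon trigger (the trigger and user names are hypothetical, and the owner needs the ALTER SESSION privilege):
-- Sketch: enable level-12 10046 tracing whenever a particular user logs on
CREATE OR REPLACE TRIGGER trace_app_user_logon
AFTER LOGON ON DATABASE
BEGIN
  IF USER = 'APP_USER' THEN
    EXECUTE IMMEDIATE
      'ALTER SESSION SET EVENTS ''10046 trace name context forever, level 12''';
  END IF;
END;
/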
A quick and dirty solution to the problem is to capture all SQL statements that are run between two points in time.
The following procedure will create two tables, each containing a snapshot of the database at a particular point. The tables will then be queried to produce a list of all SQL run during that period.
If possible, you should do this on a quiet development system - otherwise you risk getting way too much data back.
Take the first snapshot
Run the following SQL to create the first snapshot:
create table sql_exec_before as
select executions,hash_value
from v$sqlarea
/
Get the user to perform their task within the application.
Take the second snapshot.
create table sql_exec_after as
select executions, hash_value
from v$sqlarea
/
Check the results
Now that you have captured the SQL it is time to query the results.
This first query will list all query hashes that have been executed:
select aft.hash_value
from sql_exec_after aft
left outer join sql_exec_before bef
on aft.hash_value = bef.hash_value
where aft.executions > bef.executions
or bef.executions is null
/
This one will display the hash and the SQL itself:
set pages 999 lines 100
break on hash_value
select hash_value, sql_text
from v$sqltext
where hash_value in (
select aft.hash_value
from sql_exec_after aft
left outer join sql_exec_before bef
on aft.hash_value = bef.hash_value
where aft.executions > bef.executions
or bef.executions is null
)
order by
hash_value, piece
/
Tidy up: don't forget to remove the snapshot tables once you've finished:
drop table sql_exec_before
/
drop table sql_exec_after
/
Oracle, along with other databases, analyzes a given query to create an execution plan. This plan is the most efficient way of retrieving the data.
Oracle provides the 'explain plan' statement which analyzes the query but doesn't run it, instead populating a special table that you can query (the plan table).
The syntax (simple version, there are other options such as to mark the rows in the plan table with a special ID, or use a different plan table) is:
explain plan for <sql query>
The analysis of that data is left for another question, or your further research.
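As a minimal sketch of the round trip, you can read the plan back with DBMS_XPLAN (the table in the example query is hypothetical):
EXPLAIN PLAN FOR
  SELECT * FROM employees WHERE employee_id = 100;

-- read back the plan that was just generated
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);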
There is a commercial tool FlexTracer which can be used to trace Oracle SQL queries
This is an Oracle doc explaining how to trace SQL queries, including a couple of tools (SQL Trace and tkprof): link
Apparently there is no small, simple, cheap utility that would help with this task. There are, however, 101 ways to do it in a complicated and inconvenient manner.
The following article describes several. There are probably dozens more...
http://www.petefinnigan.com/ramblings/how_to_set_trace.htm

Resources