When I execute a SELECT statement with 4 INNER JOINs and two WHERE conditions, it takes 13-15 s in local SSMS (I have executed it 5 times). But when I connect to the same instance from another server's SSMS and execute the same query, it takes 5 s the first time and then 0 s! I am using the same user, SA.
Is there any possible explanation for that?
The host instance is SQL Server 2008 and the remote instance has SQL Server 2008 R2.
If your query returns data to display in your local SSMS, that data has to be transferred from the server to your local SSMS, and the transfer time is included in the elapsed time you see. So the reported time is a combination of executing the query and fetching the data in order to display it.
You might want to enable "Include Client Statistics" and then review the row "Bytes received from server" in the "Client Statistics" tab of the results window.
To verify this assumption, you can alter your SELECT so that it only executes on the server without fetching the data to the client, as in the sketch below.
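A minimal sketch of that idea (the table and column names are placeholders, not from the original query): assigning the result column to a variable makes SQL Server do the same join work without streaming the rows back to SSMS.

-- Placeholder tables/columns; substitute your own joins and WHERE conditions.
DECLARE @dummy INT;

SELECT @dummy = t1.SomeColumn          -- nothing is sent to the client except the row count
FROM dbo.Table1 AS t1
INNER JOIN dbo.Table2 AS t2 ON t2.Table1Id = t1.Id
WHERE t1.SomeFilter = 1
  AND t2.OtherFilter = 'x';

If the elapsed time is now similar in both sessions, the difference you saw was the network transfer of the result set.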
I have been given an Access database and I have to try to decipher what it is doing.
As a start I see that there is a Pass Through query with a command like:
Exec RefreshGLTableLatestEntries
@sourceDB = 'DB_NAME' ,
@tablePrefix = 'TableName$' ,
@logFile = 'C:\logDB.txt'
When I run it, I get something like:
Result
Success... 108 rows inserted with a total amount of $0.000000
What I don't understand is where the rows are being copied from or to.
In the MSSQL database I don't see a table, query, stored procedure or function called 'TableName$'. There are quite a few tables & queries called 'TableName$SomethingElse'. Is there a way to see more details on where the data is coming from?
Similarly, how can I see where the rows are being inserted to? I cannot find any file named 'logDB.txt' on my hard disk to check the log. I suspect it would not say much more than '...108 rows inserted...'.
I'm using:
Access 2016 from Office 365, Version 1609
MS SQL Server Management Studio v17.1
Any ideas on how to get more information on what the Pass-Through query does?
A Pass-Through query in Access is equivalent to running its SQL code in SQL Server Management Studio.
(In the database that is designated by the connection string of the Pass-Through query.)
The SQL is sent as-is to MSSQL and run there.
RefreshGLTableLatestEntries is the stored procedure that is executed here. You need to locate and analyze it in SQL Server.
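For example (assuming the procedure lives in the default dbo schema of the database that the Pass-Through query's connection string points to), you can pull its definition straight from SSMS:

-- Either of these shows the procedure's source, which reveals the tables it reads and writes.
-- The dbo schema is an assumption; adjust if the procedure lives elsewhere.
EXEC sp_helptext 'dbo.RefreshGLTableLatestEntries';

SELECT OBJECT_DEFINITION(OBJECT_ID('dbo.RefreshGLTableLatestEntries'));

The body of the procedure will show where the rows come from, where they are inserted, and what it writes to the log file path you pass in.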
I have a report running against a Data Driven subscription in SSRS. The subscription runs a report and produces PDFs - about 1000 of them. The process takes about 2 minutes to complete.
I have been kicking this off manually using the following SQL:
EXEC msdb.dbo.sp_start_job @job_name = '<job_name>'
This works, but what I would like to know is when the job has finished. According to what I have read so far, I should be able to run:
exec msdb.dbo.sp_help_job
This lists my job, but it always has a status of 4 (Idle), even while I can see that reports are being produced.
How can I tell when the job has completed and all my reports have been produced?
msdb shouldn't contain information on the reporting server. The report server is separate from SQL Server Management Studio, which will only tell you whether the job ran or not, not what happened inside the job. If you have access to the ReportServer database (I don't know how you have it set up), I have a Subscriptions table that I can check to see that the email was sent and when it was sent. If you don't have that, you can go onto the Report Server web site, check the subscription and its status, and it should show the date it was last sent.
The only way to access that information from SQL Server Management Studio is by querying the ReportServer database and its tables, assuming it is set up correctly; a sketch follows.
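A minimal sketch of such a query (assuming the default ReportServer database name; the table and column names come from a standard SSRS catalog):

-- Assumes the default ReportServer database name.
SELECT c.Name       AS ReportName,
       s.Description,
       s.LastStatus,
       s.LastRunTime
FROM ReportServer.dbo.Subscriptions AS s
INNER JOIN ReportServer.dbo.[Catalog] AS c ON c.ItemID = s.Report_OID
ORDER BY s.LastRunTime DESC;

For a data-driven subscription, LastStatus typically ends up as something like "Done: 1000 processed of 1000 total; 0 errors" once all the files have been produced, which is usually a better completion signal than the Agent job status.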
I have a stored procedure that I've tested over and over again in SQL Server Management Studio and it works fine, returning results in roughly 3 seconds; however, when I add the stored procedure as the query type for a new dataset in Report Builder and attempt either to run the report or to execute the SP through the built-in Query Designer, the execution call times out. I haven't even used the dataset yet in any area of the report (tablixes or charts).
I've made sure that the data source's credentials are set up properly and even tested the connection to the DB, receiving a successful connection message.
I have the dataset timeout property set to 0, which should mean no timeout. Clearly, the timeout I'm receiving is being raised by SQL Server rather than Report Builder in this particular case.
What would make the stored procedure execute properly and efficiently (speed-wise) when executed on the server, but time out when executed from Report Builder?
I'm running 2008 R2.
Please help! Thanks in advance.
Most time-out errors occur during query processing. If you are encountering time-out errors, try increasing the query time-out value. Make sure to adjust the report execution time-out value so that it is larger than the query time-out. The time period should be sufficient to complete both query and report processing.
Thanks
Venky
I need to manually run a job on more than 150 SQL Server instances (SQL Server 2000, remote) from a SQL Server 2005 instance (the local server). The job is the same on all these instances. The job just calls a stored procedure without parameters, which is also the same across all the instances. These jobs are on a schedule, but now they want me to manually run the job for all the instances, or for specified instances, upon request.
What is the best practice for this? I have tried OPENROWSET to call the remote stored procedure, but each run of the job takes a couple of minutes, so if I use a loop to run all these jobs, they will run one by one and that takes a long time. Ideally, it should be able to run the stored procedure on each instance without waiting for it to finish. Even better, it should run the job on each instance without waiting for it to finish, so that it leaves a record in the job history on each instance.
And the stored procedure is from a third party so it can't be altered.
update:
Since the 'people' want this to be initiated from an SSRS report, using SSRS to call some T-SQL/proc on the server would be most appropriate. The problem I have now is that when calling msdb.dbo.sp_start_job on the remote SQL Server 2000 instances from the local server using OPENQUERY or OPENROWSET, I get the following error:
Cannot process the object "exec msdb.dbo.sp_start_job @job_name = 'xxx' ". The OLE DB provider "SQLNCLI" for linked server "xxx" indicates that either the object has no columns or the current user does not have permissions on that object.
I guess this may be because sp_start_job doesn't return a result set, since I can use OPENQUERY/OPENROWSET to call other remote procs without a problem. Is there any workaround?
update:
I have found it is actually pretty simple in T-SQL.
EXEC [linkedServerName].msdb.dbo.sp_start_job @job_name = 'test2'
So I don't need to use OPENROWSET/OPENQUERY actually, since all the remote SQL Server 2000 instances are already added as remote servers.
Is this actually a SQL Agent job that calls the procedure? If so, you can remotely call sp_start_job, which will kick off the job asynchronously. I assume you're connecting as a sysadmin to do this.
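A minimal sketch of that approach, assuming the 150 instances are already set up as linked servers on the local SQL Server 2005 box and the job has the same name ('test2', from the question's update) everywhere; both of those are assumptions:

DECLARE @server sysname, @sql nvarchar(400);

DECLARE servers CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM sys.servers WHERE is_linked = 1;   -- adjust this filter to your list of instances

OPEN servers;
FETCH NEXT FROM servers INTO @server;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- sp_start_job returns as soon as the job is queued, so the loop
    -- does not wait for each remote job to finish.
    SET @sql = N'EXEC ' + QUOTENAME(@server) + N'.msdb.dbo.sp_start_job @job_name = ''test2'';';
    EXEC (@sql);
    FETCH NEXT FROM servers INTO @server;
END
CLOSE servers;
DEALLOCATE servers;

Note that each linked server needs "RPC Out" enabled for the remote EXEC to succeed.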
If you had the SQL Server 2008 client tools you could use "Local Server Groups". If you register all 150 SQL instances, you can connect to them all at once by right-clicking the entire group and selecting "New Query". Down in the status bar you'll see it connect to 150/150 servers. Then whatever command you run will execute on all 150 at the same time.
The servers don't have to be 2008, only your client tools. I routinely use this feature on SQL 2000 servers.
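With the group query window open, a single batch like the one below (reusing the job name from the question's update) runs on every connected instance at once:

-- Executed once from the server-group query window; SSMS sends it to every connected instance.
EXEC msdb.dbo.sp_start_job @job_name = 'test2';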
I'm running SQL Server 2005 and using IIS for ASP scripts.
I have a problem with SQL: when I request a page that runs a SQL query (e.g. http://[host name] with localhost as [host name]) for the first time (such as right after I start Windows), or after the session has been idle for too long, this error happens:
Login Timeout Expired
but after that, when I refresh the page, everything is OK and it works as it should.
It looks like it takes a long time to run the sproc the "first" time; from then on it takes less than a second to execute.
SQL Server generates an execution plan for the stored procedure the first time it runs, which causes the long duration. On subsequent runs SQL Server can reuse the execution plan, so the duration becomes shorter. However, a duration of 46 seconds is abnormal; you can try creating a clustered index on the table to speed up the query.
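As a rough way to confirm the plan-reuse explanation (a sketch; MyProcedure is a placeholder for the actual procedure name), you can check whether a plan for the procedure is already in the cache and how often it has been reused:

-- MyProcedure is a placeholder; filter on your procedure's name.
SELECT p.usecounts, p.cacheobjtype, p.objtype, t.text
FROM sys.dm_exec_cached_plans AS p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) AS t
WHERE t.text LIKE '%MyProcedure%';

If a row is present and usecounts increases on each page refresh, the plan is being reused as described above.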