I know I can't get a specific answer to my question, but I'd like to know whether there are tools that could help me find it.
We have a SQL Server 2008 database that, for the last four days, has had moments where it becomes unresponsive to specific queries for 5-20 minutes.
For example, the following queries, run simultaneously in different query windows, have these results:
SELECT * FROM Assignment --hangs indefinitely
SELECT * FROM Invoice -- works fine
Many of the tables have non-clustered indexes to help speed up SELECTs
Here's what I know:
1) The same query will either hang indefinitely or run normally.
2) In Activity Monitor in the processes tab there are normally around 80-100 processes running
I think what's happening is:
1) A user updates a table
2) This causes one or more indexes to get updated
3) Another user issues a select while the index is updating
Is there a way I can figure out why at a specific moment in time SQL Server is being unresponsive for a specific query?
The sp_who system stored procedure will tell you if a connection is blocked because of another query.
When a query (such as your examples) hangs, the most common cause (in my experience) is that another query is blocking it. This might be apparent from the symptoms alone.
To diagnose this, I simply use Activity Monitor in SQL Server Management Studio.
Find the process that is being blocked and, through that, you can find what is causing the block. Then you have at least isolated the problem. The next step is to prevent the blocking... and that will depend on the cause.
In my case, the cause has usually been a long-running BEGIN TRAN whose contents hold locks until it commits.
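Beyond Activity Monitor and sp_who, the same information is exposed through DMVs. A minimal sketch (SQL Server 2005 and later) that lists currently blocked requests together with the session blocking them:

```sql
-- Each blocked request, who is blocking it, and what it was running.
-- blocking_session_id = 0 means the request is not blocked.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;
```

Run this while one of your SELECTs is hanging; the blocking_session_id column points at the session holding the locks.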
How can I achieve 100,000 users' concurrent connectivity to a SQL Server database?
I have a scenario where 100,000 users will log in to a website that uses SQL Server as its backend, each performing an insert on the same table. Is it practical? How can it be achieved? How should one design the database?
SQL Server can handle a very large number of users inserting rows. What happens is this: a user connects, their identity and privileges are checked, the insert happens, SQL Server returns a status code indicating success or failure, and the connection is dropped. There is no reason for a user to maintain a connection once the insert has completed.
I have had success with a very large number of users doing this against a single audit table, and I had to make the audit table a heap without any keys. That way SQL Server would just add consecutive rows to consecutive pages, and this happened fast enough for the inserts to succeed. There is no guarantee the inserts will occur in this fashion (though in practice they did), and it handled very high volumes of data. You of course have to test whether your installation can handle the volume you anticipate. It does not matter in what order the data is stored as long as it is successfully saved.
I have never seen an installation that can handle 100,000 active sessions; the number of locks would probably overwhelm any conceivable set of hardware. You may also want to run SELECT @@MAX_CONNECTIONS on the intended machine: the number returned is the maximum number of simultaneous connections the current instance can (theoretically) handle. On both SQL Server 2008 and 2012 Enterprise the number returned is 32,767.
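The keyless heap described above can be sketched as follows; the table and column names are purely illustrative:

```sql
-- An append-only audit table as a heap: no PRIMARY KEY and no clustered
-- index, so inserts simply append rows to the end of the table.
CREATE TABLE dbo.AuditLog
(
    EventTimeUtc datetime2      NOT NULL DEFAULT SYSUTCDATETIME(),
    UserName     sysname        NOT NULL,
    EventText    nvarchar(4000) NULL
);

-- The theoretical connection ceiling for this instance:
SELECT @@MAX_CONNECTIONS AS MaxConnections;  -- 32,767 on SQL Server 2008/2012
```

If you later need to query the audit data, you can add nonclustered indexes, but each one adds cost to every insert.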
I have a program, and:
I create a table named table1. Now [database1].table1 is empty.
I start my program. It calls the ADO Connection object's BeginTrans method.
My program performs a time-consuming series of inserts on [database1].table1 via an AdoQuery.
My program calls the ADO Connection object's CommitTrans method.
If I start the program and then issue a SELECT * FROM table1 query in Management Studio, the query does not return until my program is finished.
I want to see an empty result set, without waiting, when I issue the SELECT query in Management Studio.
Which transaction isolation level should I use, and how can I configure it programmatically with ADO?
Edits:
From what I found by googling, I may be able to use the Repeatable Read, Snapshot, or Serializable isolation level and/or optimistic locking. I will run some tests and answer my own question based on the results.
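For what it's worth, a commonly suggested way to let readers see the last committed state of a table without waiting on a writer's open transaction is row versioning; a minimal sketch on the database side (the database name comes from the question, the rest assumes SQL Server 2005+):

```sql
-- Make the default READ COMMITTED level use row versioning, so readers
-- see the last committed rows instead of blocking behind open writes.
-- (Switching this on needs exclusive access to the database.)
ALTER DATABASE [database1] SET READ_COMMITTED_SNAPSHOT ON;

-- Alternatively, allow sessions to opt in to SNAPSHOT isolation:
ALTER DATABASE [database1] SET ALLOW_SNAPSHOT_ISOLATION ON;
-- ...then, in the reading session:
SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
SELECT * FROM table1;  -- returns the committed (empty) state without waiting
```

Either way, the reading side needs no ADO changes; the isolation behavior is controlled by the database setting or by a SET statement issued on the reader's connection.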
Edit after Doug_Ivison's comment:
In my situation, another closed-source application uses the tables instead of Management Studio, and we are attempting a kind of replication. (Sorry for the missing detail; I was trying to keep the question short.)
Thank you for reading my post.
Regards
Ömür Ölmez.
I am running some update scripts on a server. Normally they run in 20-30 minutes, but today they are taking a very long time, more than 2 hours. Could there be blocking? Can anyone help me identify the processes running on that server and why it is running slowly?
sp_who will show you the processes running on your system, the status of each process, and the spid for any blocking processes (along with other information).
You can also use sp_who 'login_name' to show the processes for a specific user.
If you want more detailed information about what is going on in your server while you are running your query, you can use sp_sysmon to get detailed statistics about the various activities (reads, writes, cache hits, network reads, disk access, etc.).
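A quick sketch of both forms (the login name is illustrative); in the output, a nonzero value in the blk column is the spid of the process doing the blocking:

```sql
-- All current processes; 'blk' shows the spid blocking each one, if any.
EXEC sp_who;

-- Only the processes belonging to one login:
EXEC sp_who 'some_login';
```

If a spid shows up repeatedly in blk, that is the session to investigate first.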
sp_sysmon syntax
Performance and Tuning using sp_sysmon
I accidentally executed a TRUNCATE command on the wrong table and all my data is gone. We do have a backup, but it's 2 weeks old and doesn't have all the latest data.
Any ideas how we can roll back this command and get the data back, if possible? Are there any third-party tools that can do this?
The simple answer is: you can't roll back a transaction once it's committed, but you can do something else to get the data back (or at least some parts of it).
When you execute a TRUNCATE statement your data is still in the MDF file, but it's not visible, because SQL Server now treats those pages as free space (TRUNCATE basically tells SQL Server to deallocate the data pages).
The only way to get the data back is to somehow read the deallocated data pages and convert them into readable data.
Important: you must act fast, because the free space will be overwritten with new data if it hasn't been already. If you can stop your SQL Server instance and make a copy of the MDF and LDF files, that buys you more time.
Try using ApexSQL Recover. From what I know, it's the only available tool that can do this kind of restore. If you're really good with SQL, you can try modifying and executing a very long script like this.
Unless it was wrapped in a transaction, I'm afraid you are out of luck.
Here is a similar post, with more explanation:
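To illustrate: TRUNCATE TABLE is logged (as page deallocations) and can be rolled back as long as the surrounding transaction is still open; a quick sketch with an illustrative table name:

```sql
BEGIN TRAN;
TRUNCATE TABLE dbo.SomeTable;        -- data disappears inside the transaction
SELECT COUNT(*) FROM dbo.SomeTable;  -- 0 at this point
ROLLBACK TRAN;                       -- deallocation is undone
SELECT COUNT(*) FROM dbo.SomeTable;  -- original row count is back
```

Once COMMIT has run instead of ROLLBACK, this door is closed, which is why the recovery-tool route above is the only remaining option.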
http://www.sql-server-performance.com/forum/threads/how-to-rollback-truncate-operation.16968/
I'm looking for a solution to monitor (with graphs) the status of my MSSQL instances over a period of time (day or night) and to get or produce reports from that monitoring.
Does anyone know of a simple solution for doing this?
Thank you for your help.
It depends on what you want to monitor. There are vendor products that will do it: Idera has Diagnostic Manager and Red Gate has SQL Monitor, to name just two. If you don't want to spend any money, SQL Server itself has several monitoring options, including alerts, policy-based management, and data collection. These won't be as pretty and may not cover as much, and you will be writing some code on your own, but they are free (with SQL Server, of course, depending on edition).
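As a starting point for the roll-your-own route, instance-wide performance counters can be read straight from a DMV; a minimal sketch (the counter selection is illustrative):

```sql
-- Snapshot a few common health counters (SQL Server 2005+). Running this
-- from a SQL Agent job and inserting the results into a logging table
-- gives you data you can graph later in SSRS or Excel.
SELECT object_name, counter_name, instance_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Batch Requests/sec',
                       'User Connections',
                       'Page life expectancy');
```

Note that some counters (like Batch Requests/sec) are cumulative, so you need two snapshots and a delta to get a per-second rate.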
Related, though you'll want to edit the scripts to get the information you need, is Ola Hallengren's maintenance solution: http://ola.hallengren.com/. These are scripts for maintenance, but depending on what information you're trying to obtain, you can build on them and use the query results in SSRS to populate graphs, or even in a tool like Excel.
Again, Ola Hallengren provides maintenance solutions, so they will need to be adjusted for monitoring (depending on what you're monitoring), but they are totally free, you can edit the scripts, and you can see where the information comes from, so you can determine what you need to monitor.