I know that .NET provides parallel programming, but I don't know whether it is possible to run queries in parallel in SQL Server. If it is possible, please give me an example of a parallel query or a web link describing the technology.
whether it is possible to run queries in parallel in SQL Server. If it is possible,
What do you mean by parallel?
Multiple queries at the same time? That is how SQL Server handles multiple users: open separate connections and run queries on them.
One query? Let SQL Server parallelize it, as it does automatically and as described in the documentation.
This may help you or not, but opening two instances of SSMS works too.
This has been covered a number of times. Try this.
Parallel execution of multiple SQL Select queries
If you're using .NET 4.5, then using the new async methods would be a cleaner approach.
Using SqlDataReader’s new async methods in .Net 4.5
Remember that doing this will make your client more responsive at the cost of placing more load on your SQL Server. Rather than each client sending a single SQL command at a time, they will be sending several, so to the SQL Server it will appear as though there are many more clients accessing it.
The load on the client will be minimal as it will be using more threads but most of the time these will simply be waiting for results to return from SQL.
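To make the approach above concrete, here is a minimal sketch of firing two independent queries concurrently with the .NET 4.5+ async ADO.NET methods. The connection string, table names, and SQL text are placeholders, not from the original post:

```csharp
// Sketch: two SELECTs running concurrently, each on its own connection.
// Connection string and queries are illustrative assumptions.
using System.Data.SqlClient;
using System.Threading.Tasks;

static class ConcurrentQueries
{
    static async Task<int> CountRowsAsync(string connStr, string sql)
    {
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(sql, conn))
        {
            await conn.OpenAsync();                         // non-blocking open
            object result = await cmd.ExecuteScalarAsync(); // non-blocking execute
            return (int)result;
        }
    }

    public static async Task RunBothAsync(string connStr)
    {
        // Each task opens its own (pooled) connection, so the two
        // SELECTs can genuinely run at the same time on the server.
        Task<int> t1 = CountRowsAsync(connStr, "SELECT COUNT(*) FROM Orders");
        Task<int> t2 = CountRowsAsync(connStr, "SELECT COUNT(*) FROM Customers");
        int[] counts = await Task.WhenAll(t1, t2);
    }
}
```

Note that the concurrency comes from using separate connections; the async methods just keep the client threads from blocking while the server does the work.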
Short answer: yes. Here's a technical link: http://technet.microsoft.com/en-us/library/ms178065(v=sql.105).aspx
Parallelism in SQL Server is baked in to the technology; whenever a query exceeds a certain cost, the optimizer will generate a parallel execution plan.
Related
I'm having issues with my SSRS reports running slowly. Using SQL Profiler, I found that the queries are running one at a time. I did some research and found the suggestion to make sure "Use single transaction when processing the queries" was not checked in my Data Source. It was already unchecked. I am now finding that not only do the datasets not run in parallel, but the Data Sources won't run in parallel either.
Using SQL Profiler, I'm finding that my single .NET client process logs into the first Data Source and sets up properties:
SELECT
DATABASEPROPERTYEX(DB_NAME(), 'Collation'),
COLLATIONPROPERTY(CONVERT(char, DATABASEPROPERTYEX(DB_NAME(), 'collation')),'LCID')
and then runs my SQL statement. After completion, the same ClientProcessID moves on to the next Data Source and does the same thing.
Has anyone run into this problem before? Are there other issues at play?
Thanks
Are you running/testing these on the reporting server, or from your development machine? Because, the dataset queries will not run in parallel in BIDS, but they should on the server. (Posted in comments by R. Richards)
I'm looking at .NET Framework source code that uses the MS SQL methods ExecuteNonQuery and ExecuteNonQueryAsync.
My understanding is that the choice between the async and non-async versions only affects the application (a Windows service) in terms of blocking versus non-blocking calls.
I'm seeing high CPU usage when a delete command is issued; SQL Server profiling showed a CPU value of 178 for a simple delete command. The database indexes look fine when I run the statement as a plain SQL query.
So what I want to know is: how much SQL Server performance improvement would be gained by switching the source code from foo.ExecuteNonQuery() to await foo.ExecuteNonQueryAsync()? Does it make a difference on the SQL Server side or not?
Your knowledge and input will be greatly appreciated.
how much [...] performance improvement will be made if I switch [...] from foo.ExecuteNonQuery() to await foo.ExecuteNonQueryAsync()
None whatsoever. Your local client thread that is executing the query will not block with ExecuteNonQueryAsync, that's all. SQL Server will execute the query just the same.
What you should do is analyze why your query is taking so long: execute it in SQL Server Management Studio with the Actual Execution Plan included and examine the plan.
Perhaps your server is simply very busy, or there is blocking going on due to concurrent transactions. A good tool to analyze this is Adam Machanic's sp_whoisactive: How to Log Activity Using sp_whoisactive in a Loop
I've noticed that every single time I run a query in MS-Access the entire interface becomes unresponsive for the duration of the query run. This very much appears to me to be blocking behavior.
Is there any set of APIs (ODBC, OLE, ADO, ADOX, etc) that allows non-blocking access to an MS-Access database (JET)? I'd even settle for using SQL server.
If it is possible, does it require a specific version of JET to be used?
My guess is that the answer is no, but I thought I'd ask anyway.
Access is single-threaded so your observation is correct.
However, several instances of Access can access the same (backend) database, so that may be an option for you - for example, for a reporting or exporting "engine" that creates a long series of reports or exports.
I want to use a single database connection from multiple threads to read (i.e., execute only SELECT statements) in MS SQL Server simultaneously. Is it possible to execute all these SELECT statements simultaneously from different threads?
I'm using MS SQL Server from C++ in a Linux environment. I need to create separate database connection pools for reading and writing, so I want to know whether it's possible to share the same connection between threads for reading only.
The SELECT statements may return multiple rows or result sets. Will this be a problem?
Yes there will be a problem. Only one command can be executed at a time.
But you'll be fine using multiple connections, connection pooling works great for SQL server.
Don't use the same connection across threads. Only one command can be executed per connection. Create a connection for each thread. I'd suggest making a helper class to make this easier for you.
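A rough sketch of the suggested helper class, assuming a .NET client (the class and connection string here are illustrative, not from the original answer): hand each thread its own connection and let ADO.NET's built-in connection pooling reuse the physical connections underneath.

```csharp
// Sketch: one connection per thread/operation; pooling does the rest.
// Class name and connection string are assumptions for illustration.
using System.Data.SqlClient;

static class Db
{
    private const string ConnStr =
        "Server=myserver;Database=mydb;Integrated Security=true"; // placeholder

    // Each call returns a fresh connection from the pool.
    // Dispose it when done; disposing returns it to the pool.
    public static SqlConnection Open()
    {
        var conn = new SqlConnection(ConnStr);
        conn.Open();
        return conn;
    }
}

// Per-thread usage: open, query, dispose.
static void QueryOnThisThread()
{
    using (var conn = Db.Open())
    using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Orders", conn))
    {
        var count = cmd.ExecuteScalar();
    }
}
```

Because closing a pooled connection only returns it to the pool rather than tearing it down, "one connection per thread" is much cheaper than it sounds.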
I want to use a single database connection from multiple threads to read (execute only SELECT statements) in MS SQL Server simultaneously
You should start by reading the documentation instead of assuming that the way you want things to work is how they actually work.
Yes, you CAN do that (MARS is the topic; read up on it), but one connection can only ever have one transaction context, so it is a good approach for running multiple selects within one transaction (an insert, a couple of upserts, etc.) but a bad one as a generic approach to programming database connections.
Follow the standard recipe: open a connection when you need it, close it when you're done, and don't be afraid to run multiple connections.
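For reference, MARS (Multiple Active Result Sets) is enabled per connection via a connection-string keyword. The server and database names below are placeholders; check your driver's documentation for the exact keyword it expects:

```
ADO.NET / SqlClient:
  Server=myserver;Database=mydb;Integrated Security=true;MultipleActiveResultSets=True

ODBC (SQL Server Native Client):
  Driver={SQL Server Native Client 11.0};Server=myserver;Database=mydb;Trusted_Connection=yes;MARS_Connection=yes
```

Even with MARS enabled, the caveat above stands: all the interleaved statements share one connection and one transaction context.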
If you are concerned about parallel processing in SQL Server, you should consider two options:
Parallelism settings
AlwaysOn setup
The first helps by executing a single query on more than one logical CPU core, while the second gives you load balancing when you have many concurrent connections.
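As a sketch of the first option, the server-wide degree of parallelism can be inspected and changed with sp_configure, or overridden per query with a hint (the table name and the value 4 are examples only):

```sql
-- Server-wide setting: 0 means "use all available logical cores".
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max degree of parallelism', 4;
RECONFIGURE;

-- Per-query override via a hint:
SELECT COUNT(*) FROM Sales.Orders OPTION (MAXDOP 4);
```

The related 'cost threshold for parallelism' setting controls how expensive a query must be before the optimizer considers a parallel plan at all.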
In Maybe Normalizing Isn't Normal Jeff Atwood says, "You're automatically measuring all the queries that flow through your software, right?" I'm not but I'd like to.
Some features of the application in question:
ASP.NET
a data access layer which depends on the MS Enterprise Library Data Access Application Block
MS SQL Server
In addition to Brad's mention of SQL Profiler, if you want to do this in code, then all your database calls need to funnelled through a common library. You insert the timing code there, and voila, you know how long every query in your system takes.
A single point of entry to the database is a fairly standard feature of any ORM or database layer -- or at least it has been in any project I've worked on so far!
SQL Profiler is the tool I use to monitor traffic flowing to my SQL Server. It allows you to gather detailed data about your SQL Server. SQL Profiler has been distributed with SQL Server since at least SQL Server 2000 (but probably before that also).
Highly recommended.
Take a look at this chapter Jeff Atwood and I wrote about performance optimizations for websites. We cover a lot of ground, including database tracing and optimization:
Speed Up Your Site: 8 ASP.NET Performance Tips
The Dropthings project on CodePlex has a class for timing blocks of code.
The class is named TimedLog. It implements IDisposable. You wrap the block of code you wish to time in a using statement.
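The pattern looks roughly like this (a simplified sketch, not the actual Dropthings TimedLog source; the logging target is an assumption):

```csharp
// Sketch of an IDisposable timing block in the style of Dropthings'
// TimedLog: the elapsed time is logged when the using block exits.
using System;
using System.Diagnostics;

public sealed class TimedLog : IDisposable
{
    private readonly string _label;
    private readonly Stopwatch _watch;

    public TimedLog(string label)
    {
        _label = label;
        _watch = Stopwatch.StartNew();
    }

    public void Dispose()
    {
        _watch.Stop();
        // Real code would write to a proper log sink, not the console.
        Console.WriteLine($"{_label}: {_watch.ElapsedMilliseconds} ms");
    }
}

// Usage: wrap the code you want to time in a using statement.
// using (new TimedLog("GetCustomers query"))
// {
//     // ... run the query here ...
// }
```

Because Dispose runs even if the wrapped code throws, the timing is recorded for failed queries too.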
If you use rails it automatically logs all the SQL queries, and the time they took to execute, in your development log file.
I find this very useful because if you see one that's taking a while, it's one step to copy and paste it straight off the screen/log file and put EXPLAIN in front of it in MySQL.
You don't have to go digging through your code and reconstruct what's happening.
Needless to say this doesn't happen in production as it'd run you out of disk space in about an hour.
If you define a factory that creates SqlCommands for you and always call it when you need a new command, you can return a RealProxy to an SqlCommand.
This proxy can then measure how long ExecuteReader / ExecuteScalar etc. take using a Stopwatch and log it somewhere. The advantage of this method over SQL Server Profiler is that you can get full stack traces for each executed piece of SQL.