Continue with next iteration if foreach loop takes too long - sql-server

I have a foreach loop that pulls queries out of a table of queries.
The loop then executes each query and writes the results to a table.
This works like a charm. I even implemented error handling for when a query could not be executed (if, for example, it has syntax errors).
But sometimes executing a query will take forever, for example a query with a cross join without a proper join predicate.
Now I would like to be able to set a maximum duration on the execution of a query, so that a query is stopped after X minutes if it has not finished by then.
The loop should then continue with the next query.
In other words, an iteration should never take longer than X minutes; after that, it should continue with the next iteration.
Any ideas, suggestions?

You can't set a timeout for the query on the server side.
Either create a client app where you can set a timeout on the SqlCommand:
SqlCommand command = new SqlCommand(queryString, connection);
// Setting command timeout to 1 second
command.CommandTimeout = 1;
try
{
    command.ExecuteNonQuery();
}
catch (SqlException ex)
{
    // a timeout (or any execution error) surfaces here; handle it and move on
    Console.WriteLine(ex.Message);
}
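To tie that back to the question, here is a minimal sketch of the whole client-side loop: each query gets its own CommandTimeout, and a timeout simply moves the loop on to the next query. The connection string and the way the list of queries is loaded are assumptions for illustration.

using System;
using System.Collections.Generic;
using System.Data.SqlClient;

class QueryRunner
{
    static void RunAll(string connectionString, List<string> queries)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            foreach (string queryString in queries)
            {
                var command = new SqlCommand(queryString, connection);
                command.CommandTimeout = 300; // give up after 5 minutes

                try
                {
                    command.ExecuteNonQuery();
                }
                catch (SqlException ex) when (ex.Number == -2)
                {
                    // -2 is ADO.NET's error number for a timeout:
                    // log it and fall through to the next query
                    Console.WriteLine("Query timed out, skipping: " + queryString);
                }
                catch (SqlException ex)
                {
                    // your existing error handling for syntax errors etc.
                    Console.WriteLine("Query failed: " + ex.Message);
                }
            }
        }
    }
}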
Or try an external tool to monitor queries and kill those processes when the time runs out, as mentioned on dba.stackexchange.

Related

Foreach container not failing when it returns 0 rows

I have a Foreach container in my SSIS package. Inside it I have a Script Task. The Foreach Loop iterates over an object variable containing the result of an Execute SQL task.
Everything is fine, but the problem is that the Foreach Loop does not fail even when no rows are returned by the previous SQL task.
I want the package to fail when zero rows are returned. What should I do?
I generally don't like to have packages fail just because the conditions for running aren't met. "Fail" can mean a phone call in the middle of the night. If this is something that needs to be addressed immediately, that's fine, but if it's not, I'd suggest a graceful exit instead.
A first thought would be to add another Execute SQL task in front of the one you have. In that task, execute the query you're using, but just return a row count, and pass that count to a variable. Then have two precedence constraints coming away from the new Execute SQL task.
Connect your first constraint to your existing Execute SQL task, but add an expression condition that your row count variable has to be > 0 (e.g. @[User::RowCount] > 0).
Create another constraint with the condition that the row count == 0. Connect that to, for instance, a Send Mail task that will generate an email saying there were no rows to process, and let the package execution end there.
To have your package fail when there are no rows in the object variable, add an OnPostExecute event handler to the Execute SQL Task before the Foreach Loop. In this event handler, create a Script Task that checks for existing rows in the object variable and raises an error if none are found. A DataTable object can be used for this, as outlined below, and of course you can include whatever information you need in the error message. On the Script Task, the FailPackageOnFailure property will need to be set to true, and the object variable must be listed in the task's ReadOnlyVariables so the script can access it.
Raising the error will already fail the package, but this setting also prevents subsequent tasks from being executed. The following outlines this further; a reference to the System.Data.OleDb namespace is necessary for this example.
using System.Data;
using System.Data.OleDb;

// Load the SSIS object variable (an ADO recordset) into a DataTable
OleDbDataAdapter od = new OleDbDataAdapter();
DataTable dt = new DataTable();
od.Fill(dt, Dts.Variables["User::YourObjectVariable"].Value);

if (dt.Rows.Count == 0)
{
    // No rows came back from the upstream Execute SQL Task: fail the package
    Dts.Events.FireError(0, "Error", "No rows were available.", String.Empty, 0);
}

Multi-Threaded VB.net: Update then return row id?

I'm trying to code a multi-threaded program using VB.NET. It's a simple program: go to a database, get a row, process some back-end, then update the row and set it as "processed = true". Because there is so much data, I'm planning to do a multi-threaded program for it.
SELECT... FOR UPDATE doesn't seem to work in a transaction for some odd reason, and so I've decided to pre-emptively mark the row as "being read = TRUE", then process it from there.
Is it possible to update the row, then retrieve the row ID from the same SQL statement?
I've tried using these SQL statements together:
Dim sqlUpdateStatement As String = "SET @uid := 0;UPDATE process_data SET reading = TRUE, idprocess_data = (SELECT @uid := idcrawl_data) WHERE reading IS NOT TRUE AND processed IS NOT TRUE LIMIT 1;SELECT @uid;"
but it tells me that there was a fatal error encountered during command execution.
Any ideas?
EDIT
After some testing, I've come to the conclusion that you can't use MySQL variables when performing updates in VB.Net. Is this true? And if so, is there a workaround?
I eventually took the time to debug the SELECT FOR UPDATE portion of my code to get it to work on a Transaction basis. Thanks everyone for their time!
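For what it's worth, there is a workaround for the "fatal error encountered during command execution": by default, MySQL Connector/NET treats @uid as an unbound .NET command parameter, and adding Allow User Variables=True to the connection string lets user variables pass through to the server. A minimal sketch (C#, assuming Connector/NET; the connection details and the self-assignment of idprocess_data are adapted from the question, not taken verbatim):

using MySql.Data.MySqlClient;

class ClaimRow
{
    static object ClaimNext(string baseConnectionString)
    {
        // Allow User Variables=True is the key part: without it, @uid is
        // treated as a missing command parameter and execution fails.
        var connStr = baseConnectionString + ";Allow User Variables=True";
        using (var conn = new MySqlConnection(connStr))
        {
            conn.Open();
            var sql = "SET @uid := 0;" +
                      "UPDATE process_data SET reading = TRUE, " +
                      "idprocess_data = (SELECT @uid := idprocess_data) " +
                      "WHERE reading IS NOT TRUE AND processed IS NOT TRUE LIMIT 1;" +
                      "SELECT @uid;";
            // ExecuteScalar returns the first value of the first result set,
            // which here is @uid: the id of the claimed row, or 0 if none.
            return new MySqlCommand(sql, conn).ExecuteScalar();
        }
    }
}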

SQL Server on Azure: "Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding."

I'm running a script locally in PowerShell that loops bcp, copying hundreds of Excel files to a database on Azure (I simply entered the network information as arguments to bcp). On each iteration of the loop, I also run a query that updates a column of the table. At the beginning it works fine, but about a minute in it slows to a halt and finally produces this error.
47680 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 18250 Average : (2612.60 rows per sec.)
Exception calling "ExecuteReader" with "0" argument(s): "Execution Timeout
Expired. The timeout period elapsed prior to completion of the operation or the server is not responding."
I'm not sure what's going on here. Here's the query it's timing out on:
$query = "UPDATE $tableName SET jobID = $dir_id where jobID is NULL;"
$sqlCmd3 = $connection.CreateCommand()
$sqlCmd3.Connection = $connection
$sqlCmd3.CommandText = $query
$sqlCmd3.ExecuteReader() *>$null
Have you tried using ExecuteNonQuery instead of ExecuteReader?
ExecuteReader opens a data reader, which you should close when you're done with it. More to the point, an UPDATE is a non-query statement: just use ExecuteNonQuery instead.
The comment from Jakob is the correct answer. It is not enough to set a timeout on your SQL Server connection, as the SQL command itself also has a timeout. Increase both if you have timeout issues.
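As a sketch of both suggestions combined (C# here, but the PowerShell above drives the same SqlClient objects; tableName and dirId mirror the question's variables):

using System.Data.SqlClient;

class JobIdUpdater
{
    static int TagRows(SqlConnection connection, string tableName, int dirId)
    {
        var cmd = connection.CreateCommand();
        // The table name cannot be parameterised, but the value can and should be.
        cmd.CommandText = "UPDATE [" + tableName + "] SET jobID = @dirId WHERE jobID IS NULL;";
        cmd.Parameters.AddWithValue("@dirId", dirId);
        cmd.CommandTimeout = 300; // seconds; the default of 30 is what expired above
        return cmd.ExecuteNonQuery(); // UPDATE is a non-query: no reader left open
    }
}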

Triple inner join with over 10,000 rows and ASP calculations stalls application

My ASP Classic application is fetching over 10,000 rows via a triple inner join.
After this, it goes through each row, calculates some things, and puts the data into a multidimensional array with 17 columns.
All of this is triggered through a jQuery AJAX call when the search button is pushed.
Logically this takes a while, 5+ minutes actually.
It all works fine. BUT... while doing this, the whole system freezes, not only for the user running the calculations, but also for all the other users using the system in other ways.
How can I optimize this?!
A small relevant code snippet:
dim userArray()
sql = "select first.*, second.fullname, second.info, third.inputs from first inner join second on first.userid = second.id inner join third on first.info = third.id where convert(varchar, first.period, 112) between '20020115' and '20120115' order by second.fullname, first.userid"
set rs = conn.execute(sql)
if rs.eof then
response.write("Nothing were found in the period")
else
counter = 0
redim userArray(17, 1000)
do until rs.eof
response.flush
counter = counter + 1
' A LOT of calculations and putting in array...
rs.movenext
loop
for i = 1 to counter
' Doing all the response.writes to the user...
next
end if
Let's analyse this, bearing in mind that the SQL also has an ORDER BY clause:
do until rs.eof
response.flush
counter = counter + 1
' A LOT of calculations and putting in array...
rs.movenext
loop
Note the Response.Flush: the first thing I would do is get rid of that. You will probably need to increase the ASP response buffering limit (in IIS Manager). Flush sends the content generated so far to the client and waits for the client to acknowledge receipt of all the packets sent before it completes. That is where I'd guess 90% of the 5+ minutes is being spent.
Now "A LOT calculations". VBScript is not know for its peformance. This code may well take some time. In some cases some calculations can be done much better by SQL than in script so that is one option. Another would be to build some COM compiled component to do complex work (although some accounting needs to made for marshalling which can wipe out benefits). However it may be unavoidable that you need to do these calcs in VBScript.
Now rs.MoveNext. This loop means you hold the connection and rowset open for pretty much all the time the processing requires: while the server is sending bytes over the network to the client and while VBScript is crunching numbers. A much better approach would be to suck up the whole rowset quickly and disconnect from the DB, then crunch the numbers, and finally dump the buffer to the client.
Consider using a disconnected recordset (you specify a client side static cursor) or even the simple GetRows method of the recordset object that dumps the whole rowset into a 2-dimensional array. This will mean that you maintain locks on the various tables for the smallest time possible.
I see you already use Response.Flush() to flush data to the browser intermittently during the process, but if you're using AJAX, the call must first complete before your AJAX callback function is called, so I think Response.Flush is not going to be of any use there.
You might try calling the AJAX URL directly and putting a Response.Write() in the loop to see what happens (and what the speed is).
To get even more information, you could add a timer before the query, after the query, and inside the loop, and Response.Write the time that has passed since the script started. This will give you a very good idea of where the delay is happening: http://www.codefixer.com/codesnippets/vbscript_timer_function.asp
You say the machine freezes: is that the client PC with the browser, or the server where IIS runs? If a bucketload of data is being sent, I have seen browsers hang and not update until it's done.
Try adding WITH (NOLOCK) after each table name in your query to select without taking shared locks. This might give you some data that was being changed during the execution of your query (a dirty read), but that's usually not a problem.
Also, indexes. Try changing the clustered index to the fields you use in your WHERE clause, or add some regular indexes if you need your clustered index elsewhere. Optimizing your indexes will speed things up considerably if you haven't done so already.
HTH,
Erik
I'd refactor that code like this:
sql = "select first.*, second.fullname, second.info, third.inputs from first inner join second on first.userid = second.id inner join third on first.info = third.id where convert(varchar, first.period, 112) between '20020115' and '20120115' order by second.fullname, first.userid"
Set rs = conn.Execute(sql)
If NOT rs.EOF Then
aRecords = rs.GetRows() ' retrieve your records and assign them to an array '
End If
rs.Close ' record set is now released, so it shouldn't lock up your database anymore
If IsArray(aRecords) Then ' just a small sanity check '
iCount = UBound(aRecords, 2)
For iCounter = 0 To iCount
' Your calculations and your Response.Writes '
Next
Erase aRecords
Else
' no result found '
End If
Update
You can assign the records to variables in your For loop, e.g.
id = aRecords(0, iCounter)
Then you only need to refer to id wherever you need it. You're right, though, that if what you're selecting is dynamic (i.e. column positions can shift), this approach can cause problems when you're assigning records to variables further along the line. To assign fullname from your second table, for instance, you'd have to know how many columns are being retrieved by first.*.

Timeout not being honoured in connection string

I have a long running SQL statement that I want to run, and no matter what I put in the "timeout=" clause of my connection string, it always seems to end after 30 seconds.
I'm just using SqlHelper.ExecuteNonQuery() to execute it, and letting it take care of opening connections, etc.
Is there something else that could be overriding my timeout, or causing SQL Server to ignore it? I have run Profiler over the query, and the trace doesn't look any different when I run it in Management Studio versus in my code.
Management Studio completes the query in roughly a minute, but even with a timeout set to 300, or 30000, my code still times out after 30 seconds.
What are you using to set the timeout in your connection string? From memory, that's "ConnectionTimeout", and it only affects the time it takes to actually connect to the server.
Each individual command has a separate "CommandTimeout", which is what you're looking for. Not sure how SqlHelper implements that, though.
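To illustrate the distinction (a sketch; the connection string keyword is Connect Timeout, and the procedure name is made up):

using System.Data.SqlClient;

// Connect Timeout only bounds how long opening the connection may take...
var conn = new SqlConnection(
    "Server=.;Database=mydb;Integrated Security=true;Connect Timeout=300");

// ...while each command has its own execution timeout (default 30 seconds),
// which is what actually cuts a long-running statement short.
var cmd = new SqlCommand("EXEC dbo.LongRunningProc", conn);
cmd.CommandTimeout = 300;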
In addition to the timeout in the connection string, try using the timeout property of the SQL command. Below is a C# sample using the SqlCommand class; its equivalent should be applicable to whatever you are using.
SqlCommand command = new SqlCommand(sqlQuery, _Database.Connection);
command.CommandTimeout = 0; // 0 means no limit: wait indefinitely
int rows = command.ExecuteNonQuery();
