When I try to create a #temp table, I get this:
A severe error occurred on the current command. The results, if any, should be discarded.
Any ideas how to solve this error? Thank you!
In one of my update queries, I had the same problem. I realized that the problem was caused by memory pressure in tempdb.
Restarting the MSSQL service will flush the tempdb resources and free a large amount of memory. This solved the problem for me.
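Before resorting to a service restart, it may be worth checking how much space tempdb is actually consuming. A minimal sketch using the standard space-usage DMV (available in SQL Server 2005 and later):

```sql
USE tempdb;
GO
-- Summarize free vs. allocated space across tempdb data files
-- (page counts are 8 KB pages, hence the * 8 / 1024 to get MB)
SELECT
    SUM(unallocated_extent_page_count)         * 8 / 1024 AS free_space_mb,
    SUM(internal_object_reserved_page_count)   * 8 / 1024 AS internal_objects_mb,
    SUM(user_object_reserved_page_count)       * 8 / 1024 AS user_objects_mb,
    SUM(version_store_reserved_page_count)     * 8 / 1024 AS version_store_mb
FROM sys.dm_db_file_space_usage;
```

If user objects or the version store dominate, the pressure usually comes from a specific workload rather than a leak, and killing or fixing that session can avoid the restart.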
SQL SERVER – Msg 0, Level 11 – A Severe Error Occurred on the Current Command. The results, if Any, Should be Discarded
CHECKLIST for this error
First and foremost, check database consistency. This can be done by running the command below in SQL Server:
DBCC CHECKDB('database_name');
If you have narrowed it down to a table, check that table's consistency. We can execute the command below:
DBCC CHECKTABLE('table_name');
Check the LOG folder which contains ERRORLOG and look for any file named 'SQLDump*' created at the same time the error was reported. If you find any, you can either contact Microsoft or use the self-service route by getting the dump analyzed using diagnostic preview.
If you are getting this error while using an extended stored procedure, you need to debug by running the exact pieces of code one by one. Here is one such error which had none of the three causes above.
In case you want to see the error yourself, feel free to use the code below:
create table #ErrorLog (column1 char(8000))
go
insert into #ErrorLog
exec xp_readerrorlog 'SQL Error'
drop table #ErrorLog
Reference: Pinal Dave (https://blog.sqlauthority.com)
I have a bulk insert inside a try-catch block:
BEGIN TRY
BULK INSERT dbo.EQUIP_STATUS_CODE
FROM 'filepath\filename.csv'
WITH ( MAXERRORS = 1, FIELDTERMINATOR = ',')
END TRY
BEGIN CATCH
EXECUTE dbo.ERROR_LOG_CSV;
END CATCH
I would like to be able to capture the following error when it occurs:
Bulk load data conversion error (truncation)
But it seems that I can't, even though the severity level is 16, which falls within the try-catch range. I was wondering if there is a way to capture this error when it occurs.
Before I set MAXERRORS to 1, I got this error:
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Since the former error is much more descriptive of the problem, that is the one I'd like to record.
Though my competence is more Oracle than SQL Server, I'll try to help with this issue anyway. I discovered that your situation is already in the SQL Server bug tracker (bug id: 592960) with status "Won't fix" since 2010. You can see the corresponding discussion on connect.microsoft.com yourself (at the present moment the host is unreachable, so I used Google's cache).
Alexander has given you the answer, but you have to read the bug log very carefully and consider what might be going on. SQL Server (bug id: 592960)
You are trying to bulk insert directly from a data file to a data table?
From the article, there is a mismatch in data types or truncation, and the SQL engine has a bug that keeps it from reporting this as an error.
Quote from first person reporting the bug - "Inspite of the severity level being 16 I don't see the error being caught by TRY / CATCH construct. The code doesn't break and proceeds smoothly as if no error has occurred."
Have you investigated which fields may contain bad data?
Here are some suggestions.
1 - COMMA DELIMITED FILES ARE PROBLEMATIC - I have always disliked the comma-delimited format since commas can appear in the data stream. Try using a character like tilde (~) as the delimiter, which occurs far less often. Could the problem be that a text field has a comma in it, thus adding an extra field to the data stream?
2 - USE A STAGING TABLE - It is sometimes better to import the data from the file into a staging table whose columns are all defined as varchar(x). This allows the data to get into a table at all.
Then write a stored procedure to validate the data in the columns before transferring it to the production table. Mark any bad rows as suspect.
Insert the data from the staging table to production leaving behind any bad rows.
Send an email for someone to look at the bad data. If this is a re-occurring data file transfer, you will want to fix it at the source.
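The staging-table approach above can be sketched roughly as follows. All column names, the staging table name, and the validation rules here are hypothetical; the production table's real definition was not shown in the post:

```sql
-- 1 - Stage everything as varchar so the bulk load cannot fail on conversion
CREATE TABLE dbo.STAGE_EQUIP_STATUS_CODE
(
    status_code varchar(50),
    status_desc varchar(255)
);

BULK INSERT dbo.STAGE_EQUIP_STATUS_CODE
FROM 'filepath\filename.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- 2 - Move only rows that pass validation; bad rows stay behind
INSERT INTO dbo.EQUIP_STATUS_CODE (STATUS_CODE, STATUS_DESC)
SELECT CONVERT(int, status_code), status_desc
FROM dbo.STAGE_EQUIP_STATUS_CODE
WHERE ISNUMERIC(status_code) = 1
  AND LEN(status_desc) <= 100;

-- 3 - Anything left over is suspect and can be reported by email
SELECT status_code, status_desc
FROM dbo.STAGE_EQUIP_STATUS_CODE
WHERE ISNUMERIC(status_code) = 0
   OR LEN(status_desc) > 100;
```

On SQL Server 2012 and later, TRY_CONVERT is a cleaner validation test than ISNUMERIC, which accepts some values (such as currency symbols) that CONVERT to int will still reject.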
3 - REWRITE THE PROCESS WITH AN ETL TOOL - Skip writing this stuff in the engine. SQL Server Integration Services (SSIS) is a great Extract, Transform, Load (ETL) tool.
There are options in the connection where you can state that text is quoted (""), which eliminates the extra-comma issue above. You can send rows that fail to import into the production table to a hospital table for review.
In summary, there is a bug in the engine.
However, I would definitely consider changing to a tilde-delimited file and/or using a staging table. Better yet, if you have the time, rewrite the process as an SSIS package!
Sincerely
J
PS: I am giving Alexander points since he did find the bug on SQL Connect. However, I think the format of the file is the root cause.
This will probably catch this error because it catches error Msg 4860:
Q:
TRY doesn't CATCH error in BULK INSERT
BEGIN TRY
DECLARE @cmd varchar(1000)
SET @cmd = 'BULK INSERT [dbo].[tblABC]
FROM ''C:\temp.txt''
WITH (DATAFILETYPE = ''widechar'',FIELDTERMINATOR = '';'',ROWTERMINATOR = ''\n'')'
EXECUTE (@cmd)
END TRY
BEGIN CATCH
select error_message()
END CATCH
I've got the following error in DBCC CHECKDB output from our customer for one table (there were more very similar lines):
Msg 8964, Level 16, State 1, Line 1
Table error: Object ID 212503619, index ID 1, partition ID 72057594046251008, alloc unit ID
72057594048675840 (type LOB data). The off-row data node at page (1:705), slot 0, text ID 328867287793664 is not referenced.
CHECKDB found 0 allocation errors and 49 consistency errors in table 'X' (object ID 2126630619).
This error appeared when running an upgrade of our software (if the customer restores the DB from backup and runs the upgrade again, the same issue reappears).
My question is: how can I possibly create this kind of error from my app? I always thought this kind of error must be caused by some environmental (HDD) problem, but I've seen the same issue on the same table in another environment. I tried the same steps as the customer, but without success.
Thanks!
You're right, this is probably a severe bug in SQL Server. It is not possible to cause corruption using documented and supported T-SQL. To cause corruption you need one of the following:
hardware problems
OS-level file system problems (filter drivers, ...)
Undocumented commands like DBCC WRITEPAGE
A severe bug
Can you single-step through the upgrade script? If not, try tracing it with SQL Profiler. Find the statement that first makes the corruption appear.
Here is a simpler, less noisy command:
DBCC CHECKDB([AdventureWorks2012]) WITH NO_INFOMSGS, ALL_ERRORMSGS
How can I effectively troubleshoot this error?
The query processor ran out of stack space during query optimization.
Please simplify the query.
Msg 8621, Level 17, State 2
I've tried attaching a profiler, but I'm not sure I have the right events selected. I do see the error in there. Generating the Estimated Execution Plan gives this error as well.
The sproc I am calling just does a really simple UPDATE on one table. There is one UPDATE trigger, but I disabled it, yet it still gives me this error. I even took the same UPDATE statement out and manually supplied the values. It doesn't return as fast, and it still gives me the error.
Edit:
OK, my generated script is setting the PK. So if I set the PK and another column, I get this error. Any suggestions along those lines?
There's a Microsoft KB article about this.
Basically it's a bug and you need to update. I'm assuming you are running SQL Server 2005 SP2?
There were a great number of FKs referencing this PK. I changed our code not to update that PK any further.
This error frequently appears when the number of foreign keys referencing a table exceeds Microsoft's recommended maximum of 253.
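If you suspect you are near that limit, you can count the foreign keys that reference a given table from the catalog views. A sketch (substitute your own table name for the hypothetical dbo.YourTable):

```sql
-- Count foreign keys in other tables that point at this table's PK/unique keys
SELECT COUNT(*) AS referencing_fk_count
FROM sys.foreign_keys
WHERE referenced_object_id = OBJECT_ID('dbo.YourTable');
```

A count in the hundreds is a strong hint that updating the referenced key will force the optimizer to consider every one of those constraints.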
You can disable the constraints temporarily with the following code:
EXEC sp_MSforeachtable "ALTER TABLE ? NOCHECK CONSTRAINT all"
YOUR DELETE/UPDATE COMMAND
and after executing your command, enable them again as follows:
EXEC sp_MSforeachtable "ALTER TABLE ? WITH CHECK CHECK CONSTRAINT all"
Hope that helps.
This isn't always a bug! Sounds like Daniel was able to come to the conclusion that the query wasn't as simple as he originally thought.
This article seems to answer a similar question to the one Daniel had. I just ran into the same error for a different (legitimate) reason as well: dynamic SQL run against a database with data no one anticipated resulted in a single SELECT statement referencing hundreds of tables.
I'm troubleshooting a nasty stored procedure and noticed that after running it, and after I have closed my session, lots of temp tables are still left in tempdb. They have names like the following:
#000E262B
#002334C4
#004E1D4D
#00583EEE
#00783A7F
#00832777
#00CD403A
#00E24ED3
#00F75D6C
If I run this code:
if object_id('tempdb..#000E262B') is null
print 'Does NOT exist!'
I get:
Does NOT exist!
If I do:
use tempdb
go
drop TABLE #000E262B
I get an error:
Msg 3701, Level 11, State 5, Line 1
Cannot drop the table '#000E262B', because it does not exist or you do not have permission.
I am connected to SQL Server as sysadmin, using SP3 64-bit. I currently have over 1,100 of these tables in tempdb, and I can't get rid of them. There are no other users on the database server.
Stopping and starting SQL Server is not an option in my case.
Thanks!
http://www.sqlservercentral.com/Forums/Topic456599-149-1.aspx
If temp tables or table variables are frequently used then, instead of dropping them, SQL Server just 'truncates' them, leaving the definition in place. This saves the effort of recreating the table the next time it's needed.
Tables created with the # prefix are only available to the connection that created them. Therefore any new connection you create will not be able to see them, and therefore will not be able to drop them.
How can you tell that they still exist? What command are you running to find this out?
Is the connection that created them closed properly? If not then this may be why they still exist.
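One way to see those leftover definitions without guessing at names is to query tempdb's own catalog rather than using OBJECT_ID from your session. A sketch:

```sql
-- List temp table definitions still registered in tempdb.
-- Cached/orphaned ones typically have short hex names like #000E262B.
SELECT name, object_id, create_date
FROM tempdb.sys.objects
WHERE type = 'U'        -- user tables
  AND name LIKE '#%'
ORDER BY create_date;
```

Note that OBJECT_ID('tempdb..#000E262B') returning NULL from your own session is consistent with the caching explanation above: the name resolution for #-prefixed tables is per-connection, even though the definition still occupies a slot in tempdb's catalog.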
Been working with SQL Server since it was Sybase (early 90s for the greenies) and I'm a bit stumped on this one.
In Oracle and DB2, you can pass a SQL batch or script to a stored procedure to test whether it can be parsed, and then execute conditional logic based on the result, as in this pseudocode example:
if (TrySQLParse(LoadSQLFile(filename)) == 1)
{ execute logic if parse succeeds }
else
{ execute logic if parse fails }
I'm looking for a system proc or similar function in SQL Server 2008 -- not SHOWPLAN or the like -- that can parse a large set of scripts from within a T-SQL procedure, so I can conditionally control exception handling and script execution based on the results. But I can't seem to find a similarly straightforward gizmo in T-SQL.
Any ideas?
The general hacky way to do this in any technology that does a full parse/compile before execution is to prepend the code in question with something that causes execution to stop. For example, to check whether a VBScript passes syntax checking without actually running it, I prepend:
Wscript.exit(1)
This way I see a syntax error if there is one; if there are none, then the first action is to exit the script and ignore the rest of the code.
I think the analog in the SQL world is to raise a high-severity error. If you use severity 20+, it kills the connection, so if there are multiple batches in the script they are all skipped. I can't confirm that there is 100.00000% no way some kind of SQL injection could make it past this prepended error, but I can't see any way that there could be. An example is to stick this at the front of the code block in question:
raiserror ('syntax checking, disregard error', 20, 1) with log
So this errors out with a syntax error:
raiserror ('syntax checking, disregard error', 20, 1) with log
create table t1()
go
create table t2()
go
While this errors out with the runtime error from the RAISERROR (and t1/t2 are not created):
raiserror ('syntax checking, disregard error', 20, 1) with log
create table t1(i int)
go
create table t2( i int)
go
And to round out your options, you could reference the assembly C:\Program Files\Microsoft SQL Server\100\Tools\Binn\VSShell\Common7\IDE\Microsoft.SqlServer.SqlParser.dll in a CLR utility (outside of the db) and do something like:
SqlScript script = Parser.Parse(@"create proc sp1 as select 'abc' as abc1");
You could call exec(), passing in the script as a string, and wrap it in a TRY/CATCH.
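A minimal sketch of that approach (note the caveat: EXEC actually runs the script rather than merely parsing it, so only use this against scripts that are safe to execute; the script text here is just an example):

```sql
DECLARE @script nvarchar(max);
SET @script = N'create proc sp1 as select ''abc'' as abc1';

BEGIN TRY
    -- Runs the batch; a compile-time (syntax) error lands in the CATCH block
    EXEC (@script);
    PRINT 'Script ran without error';
END TRY
BEGIN CATCH
    PRINT 'Script failed: ' + ERROR_MESSAGE();
END CATCH
```

Severity 20+ errors and some batch-aborting conditions still bypass TRY/CATCH and kill the connection, so this is not a complete substitute for a real parser.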
There isn't a mechanism in SQL Server to do this. You might be able to do it with a CLR component and SMO, but it seems like a lot of work for questionable gain.
How about wrapping the script in a try/catch block, and executing the "if fails" code in the catch block?
Potentially very dangerous. Google "SQL injection" and see for yourself.