SSMS Bulk inserts = Error + Which line is it? - sql-server

I'm trying to insert a lot of data with SQL Server Management Studio. This is what I do:
I open my file containing a lot of SQL inserts: data.sql
I execute it (F5)
I get a lot of these:
(1 row(s) affected)
and some of these:
Msg 8152, Level 16, State 13, Line 26
String or binary data would be truncated.
The statement has been terminated.
Question: how do I find the line that actually caused the error? Line 26 doesn't seem to be the correct error line number...

This is something that has annoyed SQL Server developers for years. Finally, with SQL Server 2017 CU12 and trace flag 460 enabled, you get a better error message, like:
Msg 2628, Level 16, State 6, Procedure ProcedureName, Line Linenumber
String or binary data would be truncated in table '%.*ls', column '%.*ls'. Truncated value: '%.*ls'.
A workaround in the meantime is to add a PRINT statement after each insert. Then, when the error appears in the output, the last message printed tells you which insert it belongs to.
...
insert into table1
select...
print 'table1 insert complete'
insert into table2
select...
print 'table2 insert complete'
This won't tell you which column, but it narrows the problem down to the failing insert. You can also add SET NOCOUNT ON at the top of your script if you don't want the rows-affected messages printed out.
Another addition: if you really are using BULK INSERT and weren't just using the term loosely, you can specify an ERRORFILE. This logs the row(s) that caused the error(s) in your BULK INSERT command. It's important to know that, by default, BULK INSERT will complete if there are 10 errors or fewer. You can override this by specifying MAXERRORS in your BULK INSERT command.
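A minimal sketch of what that looks like (the table name and file paths here are placeholders, not from the question):
BULK INSERT dbo.table1
FROM 'C:\data\data.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    ERRORFILE = 'C:\data\data_errors.log',  -- rows that cannot be loaded are written here
    MAXERRORS = 0                           -- tolerate no bad rows instead of the default 10
);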

Related

How to fix a stored procedure that can recover deleted data

I have lately stumbled upon a blog post that talks about a stored procedure called Recover_Deleted_Data_Proc.sql that can apparently recover your deleted data from the .log file.
There is nothing new under the sun; we are going to use fn_dblog.
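For context, fn_dblog is the undocumented function that reads the active portion of the transaction log. A minimal sketch of how deleted rows are typically located with it (the column names are as commonly reported; since the function is undocumented, treat them as assumptions that may vary by version):
SELECT [Current LSN], Operation, [Transaction ID], AllocUnitName
FROM fn_dblog(NULL, NULL)              -- NULL, NULL = no LSN range, read the whole active log
WHERE Operation = 'LOP_DELETE_ROWS'    -- delete operations only
  AND AllocUnitName = 'dbo.Test_Table';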
STEPS TO REPRODUCE
We are first going to create the table:
--Create Table
CREATE TABLE [Test_Table]
(
[Col_image] image,
[Col_text] text,
[Col_uniqueidentifier] uniqueidentifier,
[Col_tinyint] tinyint,
[Col_smallint] smallint,
[Col_int] int,
[Col_smalldatetime] smalldatetime,
[Col_real] real,
[Col_money] money,
[Col_datetime] datetime,
[Col_float] float,
[Col_Int_sql_variant] sql_variant,
[Col_numeric_sql_variant] sql_variant,
[Col_varchar_sql_variant] sql_variant,
[Col_uniqueidentifier_sql_variant] sql_variant,
[Col_Date_sql_variant] sql_variant,
[Col_varbinary_sql_variant] sql_variant,
[Col_ntext] ntext,
[Col_bit] bit,
[Col_decimal] decimal(18,4),
[Col_numeric] numeric(18,4),
[Col_smallmoney] smallmoney,
[Col_bigint] bigint,
[Col_varbinary] varbinary(Max),
[Col_varchar] varchar(Max),
[Col_binary] binary(8),
[Col_char] char,
[Col_timestamp] timestamp,
[Col_nvarchar] nvarchar(Max),
[Col_nchar] nchar,
[Col_xml] xml,
[Col_sysname] sysname
)
And we then insert data into it:
--Insert data into it
INSERT INTO [Test_Table]
([Col_image]
,[Col_text]
,[Col_uniqueidentifier]
,[Col_tinyint]
,[Col_smallint]
,[Col_int]
,[Col_smalldatetime]
,[Col_real]
,[Col_money]
,[Col_datetime]
,[Col_float]
,[Col_Int_sql_variant]
,[Col_numeric_sql_variant]
,[Col_varchar_sql_variant]
,[Col_uniqueidentifier_sql_variant]
,[Col_Date_sql_variant]
,[Col_varbinary_sql_variant]
,[Col_ntext]
,[Col_bit]
,[Col_decimal]
,[Col_numeric]
,[Col_smallmoney]
,[Col_bigint]
,[Col_varbinary]
,[Col_varchar]
,[Col_binary]
,[Col_char]
,[Col_nvarchar]
,[Col_nchar]
,[Col_xml]
,[Col_sysname])
VALUES
(CONVERT(IMAGE,REPLICATE('A',4000))
,REPLICATE('B',8000)
,NEWID()
,10
,20
,3000
,GETDATE()
,4000
,5000
,getdate()+15
,66666.6666
,777777
,88888.8888
,REPLICATE('C',8000)
,newid()
,getdate()+30
,CONVERT(VARBINARY(8000),REPLICATE('D',8000))
,REPLICATE('E',4000)
,1
,99999.9999
,10101.1111
,1100
,123456
,CONVERT(VARBINARY(MAX),REPLICATE('F',8000))
,REPLICATE('G',8000)
,0x4646464
,'H'
,REPLICATE('I',4000)
,'J'
,CONVERT(XML,REPLICATE('K',4000))
,REPLICATE('L',100)
)
GO
We are now going to verify that the data is there:
--Verify the data
SELECT * FROM Test_Table
At this point we need to create the stored procedure. I can't paste it here because it's too long, but you can download it from the same blog post; there is a link to a Box file.
If the script gives you errors like these:
Msg 50000, Level 16, State 1, Procedure Recover_Deleted_Data_Proc, Line 22 [Batch Start Line 700] The compatibility level should be equal to or greater SQL SERVER 2005 (90)
Msg 50000, Level 16, State 1, Procedure Recover_Deleted_Data_Proc, Line 22 [Batch Start Line 705] The compatibility level should be equal to or greater SQL SERVER 2005 (90)
That is because you have to comment out lines 701 to 708.
Cool, let's now delete the data from that table:
--Delete the data
DELETE FROM Test_Table
And confirm that the data were deleted:
--Verify the data
SELECT * FROM Test_Table
And here is the last step: we need to try to recover the data using the freshly installed stored procedure.
The author instructs us to use one of these two commands (don't forget to replace 'test' with the name of your database):
--Recover the deleted data without date range
EXEC Recover_Deleted_Data_Proc 'test', 'dbo.Test_Table'
or
--Recover the deleted data it with date range
EXEC Recover_Deleted_Data_Proc 'test', 'dbo.Test_Table', '2012-06-01', '2012-06-30'
But the problem is that both return this error:
(8 rows affected)
(2 rows affected)
(64 rows affected)
(2 rows affected)
(1 row affected)
(1 row affected)
(1 row affected)
(1 row affected)
(1 row affected)
(1 row affected)
Msg 245, Level 16, State 1, Procedure Recover_Deleted_Data_Proc, Line 485 [Batch Start Line 112]
Conversion failed when converting the varchar value '0x41-->01 ; 0001' to data type int.
If I right-click the stored procedure and click "Modify", I don't see anything particularly fishy at line 485.
Any idea why this stored procedure is not working?
What is the conversion mentioned?
The code is 10 years old and was written with the assumption that a [PAGE ID] would only ever be expressed as a pair of integers, e.g. 0001:00000138. However, as you have learned, sometimes it is expressed differently, like 0x41-->01 ; 0001:00000138.
You can fix that problem by adding this inside the cursor:
IF @ConsolidatedPageID LIKE '0x%-->%;%'
BEGIN
    SET @ConsolidatedPageID = LTRIM(SUBSTRING(@ConsolidatedPageID,
        CHARINDEX(';', @ConsolidatedPageID) + 1, 8000));
END
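To see what that snippet does, here is a standalone illustration (the sample value comes from the error above; the local variable here is only for the demo, not part of the procedure):
DECLARE @ConsolidatedPageID varchar(100) = '0x41-->01 ; 0001:00000138';

IF @ConsolidatedPageID LIKE '0x%-->%;%'
BEGIN
    -- keep only what follows the semicolon and strip the leading space
    SET @ConsolidatedPageID = LTRIM(SUBSTRING(@ConsolidatedPageID,
        CHARINDEX(';', @ConsolidatedPageID) + 1, 8000));
END

SELECT @ConsolidatedPageID;  -- returns 0001:00000138, the pair-of-integers form the procedure expects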
But then your next problem: when you saved the procedure from the Box file, it probably changed '†' to some wacky ? character. When I fixed that (using N'†' of course, since Unicode string literals should always be prefixed with N), I still got these error messages:
Msg 537, Level 16, State 3, Procedure Recover_Deleted_Data_Proc, Line 525
Invalid length parameter passed to the LEFT or SUBSTRING function.
Msg 9420, Level 16, State 1, Procedure Recover_Deleted_Data_Proc, Line 651
XML parsing: line 1, character 2, illegal xml character
After 15 minutes of trying to reverse engineer this spaghetti, I gave up. If you need to recover data you deleted, restore a backup. If you don't have a backup, well, that's why we take backups. The fragile scripts people try to create to compensate for not taking backups are exactly why log recovery vendors charge the big bucks.
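For completeness, the supported path is roughly this (the database, backup, and logical file names below are my assumptions, not from the question): restore the backup side by side and copy the deleted rows back from the restored copy.
RESTORE DATABASE test_copy
FROM DISK = N'C:\Backups\test.bak'
WITH MOVE N'test'     TO N'C:\Data\test_copy.mdf',      -- logical names are assumptions; check them with RESTORE FILELISTONLY
     MOVE N'test_log' TO N'C:\Data\test_copy_log.ldf',
     RECOVERY;
From there you can pull the deleted rows out of test_copy.dbo.Test_Table and insert them back into the original table.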
As an aside, the compatibility level error message is a red herring, totally misleading as the logic is currently written, and completely irrelevant to the problem. But it can be solved if, right before this:
IF ISNULL(@Compatibility_Level, 0) <= 80
BEGIN
    RAISERROR('The compatibility level should ... blah blah', 16, 1)
    RETURN
END
You add this:
IF DB_ID(@Database_Name) IS NULL
BEGIN
    RAISERROR(N'Database %s does not exist.', 11, 1, @Database_Name);
    RETURN;
END
Or simply don't run those two example calls at the end of the script, since they depend on your having a database called test, which you clearly do not.

How to Reduce the Memory Usage in BULK INSERT?

I am using the BULK INSERT command to insert 4,910,496 records, as below:
BULK INSERT [MyTable] FROM 'F:\mydata.dat' WITH (DATAFILETYPE = 'widechar', FORMATFILE = 'F:\myformat.XML', MAXERRORS = 2147483647, ROWS_PER_BATCH = 4910496, TABLOCK);
However, from time to time, the insertion will fail and I will get the following error:
Error: 802, Severity: 17, State: 20. (Params:). The error is printed in terse mode because there was error during formatting. Tracing, ETW, notifications etc are skipped.
I checked online and found https://blog.sqlauthority.com/2018/08/16/sql-server-error-17300-the-error-is-printed-in-terse-mode-because-there-was-error-during-formatting/ , which says this error occurs because SQL Server is out of memory.
So my question is: how do I prevent SQL Server from running out of memory during a bulk insert operation?
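One commonly suggested mitigation (an assumption on my part, not from the original thread) is to let SQL Server commit in smaller batches with BATCHSIZE, rather than treating the whole 4.9-million-row file as one transaction, so less has to be held in the buffer pool at once:
BULK INSERT [MyTable]
FROM 'F:\mydata.dat'
WITH (
    DATAFILETYPE = 'widechar',
    FORMATFILE   = 'F:\myformat.XML',
    BATCHSIZE    = 100000,   -- commit every 100,000 rows instead of one huge transaction
    TABLOCK
);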

SQL Server 2005 Alter table column not being recognized

This is on a replicated table in SQL Server 2005
I used the following command:
ALTER TABLE dbo.apds ALTER COLUMN docket nvarchar(12) NULL
and it executed with no errors, everything looks clean.
Column spec shows it now has 12 (was previously set to 6) on both tables, publisher and subscriber.
But when I try to put more than 6 characters in that column, I get the error:
Msg 8152, Level 16, State 13, Procedure trgapdsupdate, Line 5
String or binary data would be truncated.
I can still only write 6 characters of data to that column, even though the column specification shows 12.
Any ideas?
Thank you in advance.
You say
I get the error: Msg 8152, Level 16, State 13, Procedure trgapdsupdate, Line 5. String or binary data would be truncated. The statement has been terminated.
So what is trgapdsupdate?
From the name it looks like an update trigger on the table apds?
Does that need to be updated to deal with the new column width? For example, if it writes to an audit table, that table's definition also needs to be updated; see the sketch below.
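A purely hypothetical sketch of that scenario (we don't know the real body of trgapdsupdate, so all of this is assumed for illustration):
-- Audit table whose column was never widened when dbo.apds.docket went to 12
CREATE TABLE dbo.apds_audit
(
    docket     nvarchar(6) NULL,
    changed_at datetime NOT NULL DEFAULT GETDATE()
);
GO
CREATE TRIGGER trgapdsupdate ON dbo.apds
AFTER UPDATE
AS
BEGIN
    -- fails with Msg 8152 as soon as docket holds more than 6 characters
    INSERT INTO dbo.apds_audit (docket)
    SELECT docket FROM inserted;
END;
If that's what is happening, widening the audit table's column the same way (ALTER TABLE dbo.apds_audit ALTER COLUMN docket nvarchar(12) NULL) resolves it.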

How to find what caused errors reported in a SQL Server profiler trace?

I was running a trace on a SQL Server 2005 instance using Profiler and need to find out what is causing the reported errors.
I used the "blank" template, and selected all columns of the following events:
Exception
Exchange Spill Event
Execution Warnings
Hash Warnings
Missing Column Statistics
Missing Join Predicate
I noticed a number of these errors in the "TextData" column:
Error: 156, Severity: 16, State: 0
Error: 208, Severity: 16, State: 0
I looked up the errors (Incorrect syntax, Invalid object name), but how can I tell what stored procedure or query is causing them?
Don't worry about the 208 errors. 208 is "Object not found". Profiler picks these up because of what's called 'deferred name resolution'.
Take the following procedure.
CREATE PROCEDURE Demo AS
CREATE TABLE #Temp (ID int)
INSERT INTO #Temp VALUES (1)
SELECT ID FROM #Temp
GO
That proc will run fine without any errors; however, if you have a Profiler trace running, you'll see one or two instances of error 208. That's because the table #Temp doesn't exist when the proc starts, which is when the code is parsed and bound, so the attempt to bind to the underlying object fails.
Once the create table runs, the other statements get recompiled and bound to the correct table and run without error.
The only place you'll see that deferred resolution error is in profiler.
In SQL Server 2005 you can't.
You'll have to run a Profiler trace with the SQL:StmtStarting, SQL:StmtCompleted, User Error Message, and Exception events, including the TextData, TransactionID, EventSequence, and any other columns you need, to get a picture of what's going on.

Is SQL Server Bulk Insert Transactional?

If I run the following query in SQL Server 2000 Query Analyzer:
BULK INSERT OurTable
FROM 'c:\OurTable.txt'
WITH (CODEPAGE = 'RAW', DATAFILETYPE = 'char', FIELDTERMINATOR = '\t', ROWS_PER_BATCH = 10000, TABLOCK)
On a text file that conforms to OurTable's schema for 40 lines, but then changes format for the last 20 lines (let's say the last 20 lines have fewer fields), I receive an error. However, the first 40 lines are committed to the table. Is there something about the way I'm calling BULK INSERT that makes it not transactional, or do I need to do something explicit to force it to roll back on failure?
BULK INSERT acts as a series of individual INSERT statements and thus, if the job fails, it doesn't roll back all of the committed inserts.
It can, however, be placed within a transaction so you could do something like this:
BEGIN TRANSACTION
BEGIN TRY
BULK INSERT OurTable
FROM 'c:\OurTable.txt'
WITH (CODEPAGE = 'RAW', DATAFILETYPE = 'char', FIELDTERMINATOR = '\t',
ROWS_PER_BATCH = 10000, TABLOCK)
COMMIT TRANSACTION
END TRY
BEGIN CATCH
ROLLBACK TRANSACTION
END CATCH
You can roll back the inserts. To do that, we first need to understand two settings.
BATCHSIZE: the number of rows to be inserted per transaction. The default is the entire data file, so the whole file runs in a single transaction.
Say you have a text file with 10 rows, and rows 7 and 8 contain invalid data. When you BULK INSERT the file, with or without specifying a batch size, 8 of the 10 rows get inserted into the table; the invalid rows (7 and 8) fail and are not inserted.
This happens because the default MAXERRORS count is 10 per transaction.
As per MSDN:
MAXERRORS :
Specifies the maximum number of syntax errors allowed in the data
before the bulk-import operation is canceled. Each row that cannot be
imported by the bulk-import operation is ignored and counted as one
error. If max_errors is not specified, the default is 10.
So in order to fail all 10 rows even if only one is invalid, we need to set MAXERRORS = 1 and BATCHSIZE = 1 (sketched below). The BATCHSIZE value also matters here: if you specify BATCHSIZE and the invalid row falls inside a particular batch, only that batch is rolled back, not the entire data set.
So be careful when choosing this option.
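For reference, the option combination described above would look roughly like this, reusing the table and file from the question (terminators are assumptions):
BULK INSERT OurTable
FROM 'c:\OurTable.txt'
WITH (
    CODEPAGE        = 'RAW',
    DATAFILETYPE    = 'char',
    FIELDTERMINATOR = '\t',
    BATCHSIZE = 1,   -- each row is committed in its own transaction, as discussed above
    MAXERRORS = 1    -- low error tolerance instead of the default 10
);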
Hope this solves the issue.
As stated in the BATCHSIZE definition for BULK INSERT in the MSDN Library (http://msdn.microsoft.com/en-us/library/ms188365(v=sql.105).aspx):
"If this fails, SQL Server commits or rolls back the transaction for every batch..."
In conclusion, it is not necessary to add explicit transaction handling to BULK INSERT.
Try putting it inside a user-defined transaction and see what happens. It should actually roll back as you described.
