SQL Server BULK INSERT ROWTERMINATOR failed - sql-server

I have a CSV like this:
"F","003","abc""X","1","50A1","name","Z5AA1A005C","70008","","A1ZZZZ17","","","","","","""X","2","50A1","name","Z5AA1A005C","70007","","A1ZZZZ17","","","","","","""X","3","50A1","name","Z5AA1A005C","70000","","A1ZZZZ17","","","","","",""
I need to bulk insert it into table A, starting from the 2nd row:
BULK INSERT A FROM 'c:\csvtest.csv'
WITH
(
FIELDTERMINATOR='","',
ROWTERMINATOR='0x0a',
FIRSTROW = 2,
DATAFILETYPE = 'widenative'
)
The problem is that the insert fails with this error:
Msg 4866, Level 16, State 8, Line 15
The bulk load failed. The column is too long in the data file for row 1, column 15. Verify that the field terminator and row terminator are specified correctly.
Msg 7301, Level 16, State 2, Line 15
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".
I have tried ROWTERMINATOR values '0x0a', '\n', '\r\n', and 'char(10)', but nothing works.

Although it will only insert data from row 2 onward, row 1 still needs to be in the correct format: SQL Server performs a pre-validation pass against the table schema to ensure the data has a chance of reaching the database. Row 1 fails this pre-validation because it does not provide all the columns required by the table schema.

Try opening the file in Notepad, checking its line structure, and saving it again.
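One way to make that check without eyeballing the file in Notepad (a sketch of my own, not part of the original answer) is a small Python script that reports each line's ending style and a naive field count, so a row with the wrong terminator or a missing column stands out:

```python
def inspect_csv(data: bytes, field_sep: bytes = b","):
    """Return (line number, line-ending style, naive field count) per line.

    The field count is naive: it counts raw separators, so a separator
    inside a quoted field will inflate the number.
    """
    report = []
    for i, line in enumerate(data.splitlines(keepends=True), start=1):
        if line.endswith(b"\r\n"):
            ending, body = "CRLF", line[:-2]
        elif line.endswith(b"\n"):
            ending, body = "LF", line[:-1]
        elif line.endswith(b"\r"):
            ending, body = "CR", line[:-1]
        else:
            ending, body = "none", line
        report.append((i, ending, body.count(field_sep) + 1))
    return report

sample = b'"F","003","abc"\r\n"X","1","50A1"\n'   # tiny stand-in for csvtest.csv
for row in inspect_csv(sample):
    print(row)
# → (1, 'CRLF', 3)
# → (2, 'LF', 3)
```

A row whose ending differs from the rest, or whose field count doesn't match the table schema, is the one BULK INSERT will choke on.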

Related

Bulk Insert failing in SQL Server "Column is too long"

I am trying to run the following command in SQL Server and it's not working:
bulk insert dbo.Sales
from 'C:\Users\sram1\Downloads\SalesFile20161103\SALECOPY.TXT'
with
(
FIRSTROW = 1,
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '0x0a'
)
Here is the error message that is printed:
Msg 4866, Level 16, State 1, Line 131
The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 131
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 131
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
I looked through Stack Overflow and saw that I should change the row terminator, and I have tried both '0x0a' and '\r\n'. My data is tab-separated, but it appears that in some cases the tab is 2 spaces, other times it is more, other times it is less. Is this perhaps the root of the problem? If so, how do I fix it?
My data is tab separated, but it appears that in some cases, the tab is 2 spaces
No, a tab character can't be two spaces. Tab-separated doesn't mean "the data lines up when displayed on the screen"; it means there's an ASCII tab character between each pair of column values.
If this is a one-time thing, you might import your data into Excel and export it as tab-delimited. If it's a regular thing, you'll want to learn how to examine the file for nonprinting characters, change line endings, and fix up delimiters.
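That examination can be scripted; here is a tiny Python sketch of my own that makes tabs and spaces visible, so you can see which one actually separates the fields:

```python
def show_separators(line: bytes) -> str:
    """Make whitespace visible: tabs become <TAB>, spaces become <SP>."""
    return (line.replace(b"\t", b"<TAB>")
                .replace(b" ", b"<SP>")
                .decode("ascii", "replace"))

print(show_separators(b"col1\tcol2  col3"))   # → col1<TAB>col2<SP><SP>col3
```

Run it over the first few lines of the real file; if you see <SP> runs where you expected <TAB>, the file is space-padded rather than tab-delimited, and FIELDTERMINATOR = '\t' will never match.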
HTH.

SQL server 2014 error with openrowset bulk and large text field

I have a CSV file (200,000 rows) with about 20 columns. One of the columns can contain a large amount of text data.
I am trying to read the CSV with OPENROWSET ... BULK using a format file. The large text column is declared as varbinary(max) in the format file.
This is the command I use:
SELECT *
FROM OPENROWSET(BULK 'F:\Source.csv', FORMATFILE='F:\Source.fmt',
ERRORFILE = 'F:\Bulk_log.txt', FIRSTROW = 2) as t1
On one of the rows I get the following error:
Msg 4866, Level 16, State 1, Line 4
The bulk load failed. The column is too long in the data file for row 119426, column 4. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 4
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 4
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
When I look at the row causing the problem, the large text field contains 69,339 bytes of data.
With the ERRORFILE option, the log shows that the problem starts at 66,367 bytes.
Is there a limitation with openrowset and bulk on the maximum bytes it can read from large fields?
The weird thing is: when I copy the problem row into a separate CSV file containing only that single row, everything works fine.
I also tried to read the data with SSIS, and there everything works fine as well.
But I would like to stick to 'basic' SQL server (2014) and not move my load method to SSIS.
How can I solve this problem?
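Since the single problem row imports fine on its own, one way to narrow things down (a Python sketch of my own, not a known fix) is to copy the header plus a window of rows around the reported row into a smaller file and point OPENROWSET at that:

```python
def extract_rows(src_path: str, dst_path: str, first: int, last: int) -> None:
    """Copy the header (row 1) plus rows first..last into a smaller file.

    Caveat: this counts newline-terminated lines, which may not match SQL
    Server's row numbering if the terminators in the file are inconsistent.
    """
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        for row_no, line in enumerate(src, start=1):
            if row_no == 1 or first <= row_no <= last:
                dst.write(line)
```

Shrinking the window around row 119426 shows whether the failure depends on how much data precedes that row, which would point to a buffering limit rather than the row itself.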

Bulk insert breaks when adding DATAFILETYPE='widenative'

I have this little SQL script to import a semicolon-separated file into a specific table of my database:
BULK
INSERT foo_bar
FROM 'C:\Users\JohnDoe\projects\foo\ftp-data-importator\bar.txt'
WITH
(
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n',
FIRSTROW = 2,
MAXERRORS = 100000,
ERRORFILE = 'c:\temp\foobar_bulk_log.txt'
)
GO
And it's working like a charm.
The only problem is that some special unicode characters like ó or é are not being inserted respecting the encoding of the file.
So I added the following line inside the WITH clause's parentheses:
DATAFILETYPE = 'widenative'
Instead of respecting the encoding, it now breaks the whole execution and gives me this error:
Msg 4866, Level 16, State 5, Line 5
The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
Msg 7301, Level 16, State 2, Line 5
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".
Where is the problem?
Instead of DATAFILETYPE, try using CODEPAGE = 1252.
Try specifying widechar instead of widenative. Your original statement uses character mode, not native BCP format, and widenative expects data exported in native (binary) format. Also, ensure the source file is actually Unicode (UTF-16), not UTF-8.
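widechar expects UTF-16 input, so it is worth confirming what the file actually contains. Below is a hedged sketch of my own: it classifies the file's byte-order mark and, if needed, rewrites a UTF-8 file as UTF-16 so that DATAFILETYPE = 'widechar' can read it (the file paths you pass in are up to you):

```python
import codecs

def detect_bom(data: bytes) -> str:
    """Classify a file's leading byte-order mark, if any."""
    if data.startswith(codecs.BOM_UTF16_LE):
        return "utf-16-le"
    if data.startswith(codecs.BOM_UTF16_BE):
        return "utf-16-be"
    if data.startswith(codecs.BOM_UTF8):
        return "utf-8-sig"
    return "unknown"

def to_utf16(src_path: str, dst_path: str, src_encoding: str = "utf-8") -> None:
    """Rewrite a text file as UTF-16 (Python's utf-16 codec emits a BOM)."""
    with open(src_path, "r", encoding=src_encoding, newline="") as src, \
         open(dst_path, "w", encoding="utf-16", newline="") as dst:
        dst.write(src.read())

print(detect_bom(codecs.BOM_UTF16_LE + "a".encode("utf-16-le")))   # → utf-16-le
```

If detect_bom reports "unknown" or "utf-8-sig" on your source file, converting it with to_utf16 before the BULK INSERT is a reasonable first experiment.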

Opening/Importing a 147 GB CSV

I have a CSV file with 350 million records and 350 columns. I have tried to import it into a local MS SQL Server database, but I keep getting the following error:
Msg 4866, Level 16, State 8, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
Msg 7301, Level 16, State 2, Line 1
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".
This is the code I was using to import the data
BULK INSERT TABLE_NAME_HERE
FROM 'C:\PATH_TO_FILE\FILE_NAME.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
Can anyone please help me and point me in the right direction for working with this type of file?
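Before re-running the import, it is worth confirming what the terminators in the file really are, and you do not need to open all 147 GB to do that. Here is a small Python sketch of my own that reads only the first few kilobytes (demonstrated on a tiny stand-in file):

```python
import os
import tempfile

def peek(path: str, size: int = 4096) -> bytes:
    """Read only the first `size` bytes -- cheap even on a 147 GB file."""
    with open(path, "rb") as f:
        return f.read(size)

# Demo on a small stand-in; point peek() at the real CSV instead.
demo = os.path.join(tempfile.mkdtemp(), "sample.csv")
with open(demo, "wb") as f:
    f.write(b"a,b,c\r\nd,e,f\r\n")

head = peek(demo)
print("CRLF" if b"\r\n" in head else "LF" if b"\n" in head else "none")   # → CRLF
```

If the head shows \r\n endings, ROWTERMINATOR = '\n' in character mode usually still works (SQL Server treats it as \r\n), but a stray lone \r, or an unquoted comma inside a field, will not.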

Import text file to SQL Server using Bulk Insert

This is my SQL:
BULK INSERT dbo.Account FROM 'G:\Import\Account3.txt'
WITH
(
FIELDTERMINATOR = '" | "'
)
GO
When I run the SQL, I get this error:
Msg 4866, Level 16, State 1, Line 1
The bulk load failed.
The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Please help me. I have already tried many approaches but still get the same error.
From your example SQL, it seems you are missing a ROWTERMINATOR option specifying how rows are separated from one another.
Your query would then become something like
BULK INSERT dbo.Account FROM 'G:\Import\Account3.txt'
WITH
(
FIELDTERMINATOR = '" | "',
ROWTERMINATOR = '\r\n'
)
GO
Try this
BULK
INSERT dbo.Account
FROM 'G:\Import\Account3.txt'
WITH
(
FIELDTERMINATOR = '|',
ROWTERMINATOR = '\n'
)
GO
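The two answers disagree about the terminator, and which one is right depends on what Account3.txt actually contains. A quick Python sketch of my own that counts candidate delimiters in a few sample lines can settle it (the sample lines below are made up):

```python
def count_delims(lines, candidates=(b'" | "', b"|", b"\t", b";")):
    """Count occurrences of each candidate delimiter across the given lines."""
    counts = {c: 0 for c in candidates}
    for line in lines:
        for c in candidates:
            counts[c] += line.count(c)
    return counts

sample = [b'"one" | "two" | "three"', b'"four" | "five" | "six"']
print(count_delims(sample))
```

If every data row shows the full '" | "' pattern, the first answer's terminator (with the surrounding quotes) matches the file; if the fields are unquoted, plain '|' is the right choice.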
