I am trying to run the following command in SQL Server and it's not working:
BULK INSERT dbo.Sales
FROM 'C:\Users\sram1\Downloads\SalesFile20161103\SALECOPY.TXT'
WITH
(
FIRSTROW = 1,
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '0x0a'
)
Here are the error messages that are printed:
Msg 4866, Level 16, State 1, Line 131
The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 131
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 131
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
I looked through Stack Overflow and saw that I should change the ROWTERMINATOR; I have tried both '0x0a' and '\r\n'. My data is tab-separated, but it appears that in some cases the tab is two spaces, other times more, other times less. Is this perhaps the root of the problem? If so, how do I fix it?
My data is tab separated, but it appears that in some cases, the tab is 2 spaces
No, a tab character can't be two spaces. Tab-separated doesn't mean "the data lines up when displayed on the screen"; it means there's an ASCII tab character (0x09) between each column value.
If this is a one-time thing, you might import your data into Excel and export it as tab-delimited. If it's a regular thing, you'll want to learn how to examine the file for nonprinting characters, change line endings, and fix up delimiters.
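If it helps, the file can be inspected without leaving SQL Server. A minimal sketch, reusing the path from the question, that dumps the first bytes so the real terminators are visible:

-- Peek at the raw bytes: tab = 0x09, space = 0x20, LF = 0x0A, CR = 0x0D.
SELECT SUBSTRING(f.BulkColumn, 1, 200) AS FirstBytes
FROM OPENROWSET(BULK 'C:\Users\sram1\Downloads\SalesFile20161103\SALECOPY.TXT',
    SINGLE_BLOB) AS f;

If the output shows 0x20 0x20 between values instead of 0x09, the file really is space-padded rather than tab-delimited, and the exporter, not the terminator settings, needs fixing.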
HTH.
Related
I have a CSV file (200,000 rows) with about 20 columns. One of the columns can contain a large amount of text data.
I read the CSV with OPENROWSET BULK, using a format file. The large text column is defined as varbinary(max) in the format file.
This is the command I use:
SELECT *
FROM OPENROWSET(BULK 'F:\Source.csv', FORMATFILE='F:\Source.fmt',
ERRORFILE = 'F:\Bulk_log.txt', FIRSTROW = 2) as t1
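The actual format file isn't shown here, so the following non-XML sketch uses a hypothetical 4-column layout purely to illustrate the shape; a host-file data length of 0 on the last field means it is delimited only by its terminator rather than capped at a fixed size:

12.0
4
1   SQLCHAR   0   100   ","      1   Col1        ""
2   SQLCHAR   0   100   ","      2   Col2        ""
3   SQLCHAR   0   100   ","      3   Col3        ""
4   SQLCHAR   0   0     "\r\n"   4   LargeText   ""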
On one of the rows I get the following error:
Msg 4866, Level 16, State 1, Line 4
The bulk load failed. The column is too long in the data file for row 119426, column 4. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 4
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 4
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
When I look at the row causing the problem, the large text field has 69,339 bytes of data.
With the ERRORFILE option, it shows the problem starts at 66,367 bytes.
Is there a limit on the maximum number of bytes OPENROWSET BULK can read from large fields?
The weird thing is: when I copy the problem row to a separate CSV file containing just that single row, everything works fine.
I also tried to read the data with SSIS, and there everything works fine as well.
But I would like to stick to 'basic' SQL Server (2014) and not move my load method to SSIS.
How can I solve this problem?
I have this little SQL script to import a semicolon-separated file into a specific table of my database:
BULK
INSERT foo_bar
FROM 'C:\Users\JohnDoe\projects\foo\ftp-data-importator\bar.txt'
WITH
(
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n',
FIRSTROW = 2,
MAXERRORS = 100000,
ERRORFILE = 'c:\temp\foobar_bulk_log.txt'
)
GO
And it's working like a charm.
The only problem is that some special Unicode characters, like ó or é, are not being inserted according to the encoding of the file.
So I added the following line inside the parentheses of the WITH clause:
DATAFILETYPE = 'widenative'
But instead of respecting the encoding, it breaks the whole execution and gives me the following error:
Msg 4866, Level 16, State 5, Line 5
The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
Msg 7301, Level 16, State 2, Line 5
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".
Where is the problem?
Instead of DATAFILETYPE, try using CODEPAGE = 1252.
Try specifying widechar instead of widenative. Your original statement is using character mode, not native BCP format. Also, ensure the source file is actually Unicode (UTF-16, not UTF-8).
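As a sketch, either suggestion slots into the original statement like this (which one works depends on the file's actual encoding):

BULK INSERT foo_bar
FROM 'C:\Users\JohnDoe\projects\foo\ftp-data-importator\bar.txt'
WITH
(
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n',
FIRSTROW = 2,
MAXERRORS = 100000,
ERRORFILE = 'c:\temp\foobar_bulk_log.txt',
CODEPAGE = '1252'               -- for an ANSI/Windows-1252 file containing ó, é, etc.
-- DATAFILETYPE = 'widechar'    -- alternatively, for a file saved as UTF-16
)
GO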
I have a CSV like this:
"F","003","abc""X","1","50A1","name","Z5AA1A005C","70008","","A1ZZZZ17","","","","","","""X","2","50A1","name","Z5AA1A005C","70007","","A1ZZZZ17","","","","","","""X","3","50A1","name","Z5AA1A005C","70000","","A1ZZZZ17","","","","","",""
I need to bulk insert it into table A, starting from the 2nd row:
BULK INSERT A FROM 'c:\csvtest.csv'
WITH
(
FIELDTERMINATOR='","',
ROWTERMINATOR='0x0a',
FIRSTROW = 2,
DATAFILETYPE = 'widenative'
)
The problem is that the insert fails and shows this error:
Msg 4866, Level 16, State 8, Line 15
The bulk load failed. The column is too long in the data file for row 1, column 15. Verify that the field terminator and row terminator are specified correctly.
Msg 7301, Level 16, State 2, Line 15
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".
I have tried ROWTERMINATOR values of '0x0a', '\n', '\r\n', and 'char(10)', but nothing works.
Although it will only be inserting data from row 2, row 1 still needs to be in the correct format, as I'm pretty sure SQL Server performs a 'pre-validation' pass against the schema to ensure you have half a chance of the data getting to the database. Row 1 fails this 'pre-validation' because you have not provided all the columns required by the table schema.
Try opening the file in Notepad, checking its line structure, and saving it again.
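A different, plainly more modern route, assuming SQL Server 2017 or later is available: the built-in CSV mode handles the surrounding quotes that the '","' field-terminator trick only approximates. A sketch:

BULK INSERT A FROM 'c:\csvtest.csv'
WITH
(
FORMAT = 'CSV',           -- SQL Server 2017+ only: parses quoted fields properly
FIELDQUOTE = '"',
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0a',
FIRSTROW = 2
)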
I have a CSV file with 350 million records and 350 columns. I have tried to import it into a local MSSQL database, but I keep getting the following error:
Msg 4866, Level 16, State 8, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
Msg 7301, Level 16, State 2, Line 1
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".
This is the code I was using to import the data:
BULK INSERT TABLE_NAME_HERE
FROM 'C:\PATH_TO_FILE\FILE_NAME.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
Can anyone please help me and point me in the right direction for working with these types of files?
I am trying to do a bulk upload into one table in our SQL database. This query was running fine before, when we had the database on a different server, but now, on the new server, I am getting an error.
Here is all I have:
SQL bulk import query:
BULK
INSERT NewProducts
FROM 'c:\newproducts.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
And the errors I am getting are:
Msg 4832, Level 16, State 1, Line 1
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Thanks for any help in advance.
For anybody else who comes across this question looking for an answer: this error also happens when the number of columns in your CSV file doesn't match the number of columns in the table you're doing the bulk insert into.
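A quick sketch for checking the table side of that comparison (table name taken from the statement in the question):

-- How many columns does the bulk insert target actually have?
SELECT COUNT(*) AS column_count
FROM sys.columns
WHERE object_id = OBJECT_ID('dbo.NewProducts');

Then compare that number against the fields in the first data line of the CSV.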
I've encountered this before and there are a few things to look for:
Make sure that your CSV file doesn't have any blank rows at the top.
Make sure that there are no additional blank rows at the end of the file.
Make sure that the ROWTERMINATOR is actually \n and not \r\n (see the sketch after this list).
If you do all three of these and are still getting the error let me know.
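On the third point, a minimal sketch of the two row-terminator variants worth trying, reusing the statement from the question:

-- Unix-style line endings (LF only):
BULK INSERT NewProducts
FROM 'c:\newproducts.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

-- Windows-style line endings (CRLF):
BULK INSERT NewProducts
FROM 'c:\newproducts.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\r\n')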
In my case, the file I was trying to access was in a directory that the SQL Server process did not have access to. I moved my flat files to a directory SQL Server had access to, and this error was resolved.
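Since BULK INSERT reads the file under the SQL Server service account rather than under your own Windows profile, here is a quick sketch for finding out which account that is, so you can grant it read permission on the folder:

SELECT servicename, service_account
FROM sys.dm_server_services;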