Bulk Upload: "unexpected end of file" on new server - sql-server

I am trying to do a bulk upload into one table in our SQL database. This query ran fine before, when we had the database on a different server, but on the new server I am getting an error.
Here is all I have.
The SQL bulk import query:
BULK
INSERT NewProducts
FROM 'c:\newproducts.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
And the errors I am getting are:
Msg 4832, Level 16, State 1, Line 1
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Thanks for any help in advance.

For anybody else who comes across this question looking for an answer: this error also happens when the number of columns in your CSV file doesn't match the number of columns in the table you're doing the bulk insert into.
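As a sketch of that failure mode (hypothetical schema; the question doesn't show the table layout): if the table has an IDENTITY column that never appears in the CSV, a straight BULK INSERT expects one field too many. Bulk inserting through a view that exposes only the CSV's columns is one way around it.
-- Hypothetical layout: the table has 3 columns but the CSV carries only 2.
CREATE TABLE NewProducts
(
    Id    INT IDENTITY(1,1),   -- not present in the CSV
    Name  VARCHAR(100),
    Price DECIMAL(10,2)
);
GO
-- A view limited to the CSV's columns gives BULK INSERT a matching shape.
CREATE VIEW NewProductsCsv AS
    SELECT Name, Price FROM NewProducts;
GO
BULK INSERT NewProductsCsv
FROM 'c:\newproducts.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');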

I've encountered this before and there are a few things to look for:
Make sure that your csv file doesn't have any blank rows at the top.
Make sure that there are no additional blank rows at the end of the file.
Make sure that the ROWTERMINATOR is actually \n and not \r\n.
If you do all three of these and are still getting the error, let me know.
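If you're unsure which line ending the file actually uses, one hedged approach (a sketch, reusing the table and path from the question) is to spell the terminator out as a hex code, which sidesteps any escaping ambiguity:
BULK INSERT NewProducts
FROM 'c:\newproducts.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0a'   -- bare LF; use '\r\n' instead if the file is CRLF-terminated
)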

In my case the file I was trying to access was in a directory that the SQL Server process did not have access to. I moved my flat files to a directory SQL Server had access to and the error was resolved.
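A quick way to test this from a query window is the undocumented (but long-standing) xp_fileexist procedure, which runs in the engine's own security context:
-- Returns 1 in the "File Exists" column only when the SQL Server
-- service account can actually see the file.
EXEC master.dbo.xp_fileexist 'c:\newproducts.csv';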

Related

SQL Server bulk insert fails, but file imports easily with import wizard

I have an R script that combines years of FFIEC Bank Call Report schedules into flat files--one for each schedule--then writes each schedule to a tab-delimited, non-quoted flat file suitable for bulk inserting into SQL Server. Then I run this bulk insert command:
bulk insert CI from 'e:\CI.txt' with (firstrow = 2, rowterminator = '0x0a', fieldterminator = '\t')
The bulk insert will run for a while then quit, with this error message:
Msg 7301, Level 16, State 2, Line 4
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".
I've searched here for answers and the most common problem seems to be the rowterminator argument. I know that the files I've created have a line feed without a carriage return, so '0x0a' is the correct argument (but I tried '\n' and it didn't work).
Interestingly, I tried setting the fieldterminator to gibberish just to see what happened and I got the expected error message:
The bulk load failed. The column is too long in the data file for row 1, column 1.
So that tells me that SQL Server has access to the file and is indeed starting to insert it.
Also, I did a manual import (right-click on the database, Tasks -> Import Data) and SQL Server swallowed the file without a hitch. That tells me the layout of the table is fine, and so is the file?
Is it possible there's something at the end of the file that's confusing the bulk insert? I looked in a hex editor and it ends with data followed by 0A (the hex code for a line feed).
I'm stumped and open to any possibilities!
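For anyone without a hex editor handy, the same end-of-file check can be done from T-SQL; a sketch, assuming the path from the question:
-- Read the raw bytes and show the last ten bytes as hex;
-- an LF-terminated file should end in 0A.
DECLARE @raw VARBINARY(MAX);
SELECT @raw = BulkColumn
FROM OPENROWSET(BULK 'e:\CI.txt', SINGLE_BLOB) AS f;
SELECT RIGHT(CONVERT(VARCHAR(MAX), @raw, 2), 20) AS last_ten_bytes_hex;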

Bulk Insert returns errors Msg 7399 and Msg 7330

My problem is that when I run the bulk insert it returns the following error. Any idea what's causing it? Some help is appreciated.
It executes normally in my testing environment, but my production server returns this error.
Error message:
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Code:
BULK INSERT Table_ZZ
FROM 'e:\Folder\sometextfile.txt'
WITH
(ROWTERMINATOR = '')
Check that the CSV fields are correct; the error could be due to the number of columns in the DB not matching the CSV pattern (newly created columns or whatever).
In addition to this, I suggest using the newline character as the row terminator, specified by its ASCII code:
(ROWTERMINATOR = '0x0A')
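In the context of the question's statement, that would look like:
BULK INSERT Table_ZZ
FROM 'e:\Folder\sometextfile.txt'
WITH
(ROWTERMINATOR = '0x0A')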
Hope this helps.
Updating SQL Server with the latest patches solved my problem.
The server had not been patched; one day we decided to apply the patches, and that suddenly fixed the error.

Bulk Insert failing in SQL Server "Column is too long"

I am trying to run the following command in SQL Server and it's not working:
bulk insert dbo.Sales
from 'C:\Users\sram1\Downloads\SalesFile20161103\SALECOPY.TXT'
with
(
FIRSTROW = 1,
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '0x0a'
)
Here is the error message that is printed:
Msg 4866, Level 16, State 1, Line 131
The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 131
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 131
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
I looked throughout Stack Overflow and saw that I should change the rowterminator; I have tried both '0x0a' and '\r\n'. My data is tab separated, but it appears that in some cases, the tab is 2 spaces, other times it is more, other times it is less. Is this perhaps the root of the problem? If so, how do I fix it?
My data is tab separated, but it appears that in some cases, the tab is 2 spaces
No, a tab character can't be two spaces, and tab-separated doesn't mean "data lines up when displayed on the screen". It means there's an ASCII tab character between each pair of column values.
If this is a one-time thing, you might import your data into Excel, and export it as tab-delimited. If it's a regular thing, you'll want to learn how to examine the file to look for nonprinting characters, change line endings, and fix up delimiters.
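One way to check, from SQL Server itself, whether the file contains real tabs at all (a sketch, using the path from the question):
-- Load the raw file and find the first genuine tab character (CHAR(9));
-- a result of 0 means there are no tabs, only spaces.
DECLARE @raw VARBINARY(MAX);
SELECT @raw = BulkColumn
FROM OPENROWSET(BULK 'C:\Users\sram1\Downloads\SalesFile20161103\SALECOPY.TXT',
                SINGLE_BLOB) AS f;
SELECT CHARINDEX(CHAR(9), CAST(@raw AS VARCHAR(MAX))) AS first_tab_position;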
HTH.

SQL server 2014 error with openrowset bulk and large text field

I have a CSV file (200,000 rows) with about 20 columns. One of the columns can contain a large amount of text data.
I try to read the CSV with OPENROWSET BULK, using a format file. The large text column is mapped to varbinary(max) in the format file.
This is the command I use:
SELECT *
FROM OPENROWSET(BULK 'F:\Source.csv', FORMATFILE='F:\Source.fmt',
ERRORFILE = 'F:\Bulk_log.txt', FIRSTROW = 2) as t1
On one of the rows I get the following error:
Msg 4866, Level 16, State 1, Line 4
The bulk load failed. The column is too long in the data file for row 119426, column 4. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 4
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 4
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
When I look at the row causing the problem, the large text field has 69,339 bytes of data.
With the ERRORFILE option, it shows that the problem starts at 66,367 bytes.
Is there a limitation with openrowset and bulk on the maximum bytes it can read from large fields?
The weird thing is, when I copy the problem row to a separate CSV file containing just that single row, everything works fine...
I also tried to read the data with SSIS, and there everything works fine as well.
But I would like to stick to 'basic' SQL server (2014) and not move my load method to SSIS.
How can I solve this problem?
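One thing worth double-checking (an assumption, since the format file isn't shown): the MAX_LENGTH on the large field, because a failure at a suspiciously round byte count often traces back to a per-field length cap rather than a hard OPENROWSET limit. A minimal XML format-file sketch with hypothetical names, three columns instead of twenty:
<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <RECORD>
    <FIELD ID="1" xsi:type="CharTerm" TERMINATOR=","    MAX_LENGTH="512"/>
    <!-- the large field: set MAX_LENGTH well past the longest value -->
    <FIELD ID="2" xsi:type="CharTerm" TERMINATOR=","    MAX_LENGTH="1000000"/>
    <FIELD ID="3" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="512"/>
  </RECORD>
  <ROW>
    <COLUMN SOURCE="1" NAME="Col1"    xsi:type="SQLVARYCHAR"/>
    <COLUMN SOURCE="2" NAME="BigText" xsi:type="SQLVARYBIN"/>
    <COLUMN SOURCE="3" NAME="Col3"    xsi:type="SQLVARYCHAR"/>
  </ROW>
</BCPFORMAT>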

SQL server bulk insert rowterminator failed

I have a CSV like this:
"F","003","abc""X","1","50A1","name","Z5AA1A005C","70008","","A1ZZZZ17","","","","","","""X","2","50A1","name","Z5AA1A005C","70007","","A1ZZZZ17","","","","","","""X","3","50A1","name","Z5AA1A005C","70000","","A1ZZZZ17","","","","","",""
I need to bulk insert it into table A, starting from the 2nd row:
BULK INSERT A FROM 'c:\csvtest.csv'
WITH
(
FIELDTERMINATOR='","',
ROWTERMINATOR='0x0a',
FIRSTROW = 2,
DATAFILETYPE = 'widenative'
)
The problem is that the insert fails, and it shows this error:
Msg 4866, Level 16, State 8, Line 15
The bulk load failed. The column is too long in the data file for row 1, column 15. Verify that the field terminator and row terminator are specified correctly.
Msg 7301, Level 16, State 2, Line 15
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".
I have tried ROWTERMINATOR values '0x0a', '\n', '\r\n', and 'char(10)', but nothing works.
Although it will only be inserting data from row 2, row 1 still needs to be in the correct format, as I'm pretty sure SQL Server performs a 'pre-validation' pass against the schema to ensure you have half a chance of the data getting to the database. Row 1 fails this pre-validation because it does not provide all the columns the table schema expects.
Try opening the file in Notepad to check its line structure, then save it again.
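One more thing worth checking, not mentioned above: DATAFILETYPE = 'widenative' is meant for native-format files produced by bcp, not for a plain-text CSV. For a quoted, comma-separated text file, 'char' (or 'widechar' for Unicode text) is the usual choice; a sketch reusing the statement from the question:
BULK INSERT A FROM 'c:\csvtest.csv'
WITH
(
FIELDTERMINATOR='","',
ROWTERMINATOR='0x0a',
FIRSTROW = 2,
DATAFILETYPE = 'char'   -- text file, not bcp native format
)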
