I'm importing a CSV built with Cygwin shell commands into MS SQL Server 2014 using:
BULK INSERT import
from 'D:\tail.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\r', FIRSTROW = 1)
GO
I have confirmed that each row ends with \r\n. If I leave a CR/LF on the last row, the bulk import fails with Msg 4832:
Bulk load: An unexpected end of file was encountered in the data file.
If I end the file at the end of the last data row then the bulk import succeeds.
For very large CSVs a kludgy way around this problem is to find the number of rows and use the LASTROW setting for the BULK INSERT. Is there a more graceful way to let MS SQL know that, yes, the last row is terminated and not to complain about it or fail on it?
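For reference, the kludge looks something like this (the LASTROW value is illustrative; the row count is found beforehand with a shell command):
BULK INSERT import
from 'D:\tail.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\r', FIRSTROW = 1, LASTROW = 1048576) -- LASTROW value is illustrative
GO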
Use 0x0a (the hex value for the line feed character) as the row terminator:
BULK INSERT import
from 'D:\tail.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a', FIRSTROW = 1)
GO
Related
I'm new to SQL Server, so forgive me for being a bit of a noob here.
The code shown here returns the following error:
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".
Code:
BULK INSERT testingtable
FROM 'D:\TimeLords\data\db-test-file.csv'
WITH
(FORMAT = 'CSV',
FIELDQUOTE = '"',
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
TABLOCK)
I've tried using:
ROWTERMINATOR = '0x0a'
and
ROWTERMINATOR = '\r\n'
This is the CSV file: https://gyazo.com/0392b660c97e3cac27f2337993190c69
This is my SQL table: https://gyazo.com/fbbaf6204df9bb574d8887864cc95ea0
And this is the complete SQL query: https://gyazo.com/ffe020437f07524ce44420bedeebf0d4
I've scouted StackOverflow and can't find any solution which works. Any ideas would be appreciated.
Thanks
There's another potential culprit. I've been running BULK INSERTs into my SQL Server 2017 Express, and my syntax used FORMAT = 'CSV' and a ROWTERMINATOR of '\n' -- and it had been working fine for months.
I added a new column in the other system from which I was routinely exporting data as a CSV, and when I went to do another BULK INSERT, it failed because the CSV now had an extra column that didn't line up with the columns in my SQL table. DOH! I just needed to add that same new column to my SQL table and all was well again. A stupid error on my part, but maybe it will help someone else.
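A quick way to sanity-check the column count on the SQL side (table name from the question above; compare the result to the number of fields in the CSV header row) is something like:
SELECT COUNT(*) AS column_count
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'testingtable' -- should match the field count in the CSV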
Change FORMAT = 'CSV' to DATAFILETYPE = 'char'
or just remove the FORMAT = 'CSV' line as your file may not be RFC 4180 compliant.
BULK INSERT testingtable
FROM 'D:\TimeLords\data\db-test-file.csv'
WITH
(FIELDQUOTE = '"',
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
TABLOCK)
this has worked for me with this error.
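For the first option, the literal swap would look something like this (an untested sketch, keeping everything else from the question):
BULK INSERT testingtable
FROM 'D:\TimeLords\data\db-test-file.csv'
WITH
(DATAFILETYPE = 'char', -- swapped in for FORMAT = 'CSV'
FIELDQUOTE = '"',
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
TABLOCK)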
Old post, but hey, every bit of knowledge helps. You can also run into this issue if the CSV uses another encoding or variant, e.g. if you save the CSV for Macintosh or as UTF-8 (as you can in Excel), it is not compliant with FORMAT = 'CSV'. You can try other options such as ROWTERMINATOR = '\r' while removing FORMAT = 'CSV'; that did it for me for non-Windows CSV files.
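For a Macintosh-style CSV where rows end with a bare CR, that suggestion would look something like this (table and path reused from the question for illustration):
BULK INSERT testingtable
FROM 'D:\TimeLords\data\db-test-file.csv'
WITH
(FIELDQUOTE = '"',
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\r', -- bare carriage return, as written by "CSV for Macintosh" exports
TABLOCK)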
For me the error was an extra space on the end of the first row.
I am running SQL Server 2008 and using the BULK INSERT command. While inserting the data I am trying to strip the double quotes (") from the CSV file, which works partially but not for all the records. Please check my code and the screenshot of the result.
Bulk Insert tblUsersXTemp
from 'C:\FR0250Members161212_030818.csv'
WITH (FIELDTERMINATOR = '","',
ROWTERMINATOR = '"\n"',
--FormatFile =''
ERRORFILE = 'C:\bulk_insert_BadData.txt')
After you do the bulk insert, you could replace the double quotes.
UPDATE tblUsersXTemp
SET usxMembershipID = REPLACE(usxMembershipID, CHAR(34), '')
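If more than one column ends up with stray quotes, the same REPLACE can be applied to each affected column in a single UPDATE (the column names other than usxMembershipID are illustrative):
UPDATE tblUsersXTemp
SET usxMembershipID = REPLACE(usxMembershipID, CHAR(34), ''),
    usxFirstName = REPLACE(usxFirstName, CHAR(34), ''), -- illustrative column name
    usxLastName = REPLACE(usxLastName, CHAR(34), '')    -- illustrative column name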
You need a format file, I believe; that's what I think is going on.
If you use the following BULK INSERT command to import the data without a format file, you will end up with a quotation mark prefix on the first column values and a quotation mark suffix on the last column values.
Reference
Example from reference:
BULK INSERT tblPeople
FROM 'bcp.txt'
WITH (
DATAFILETYPE = 'char',
FIELDTERMINATOR = '","',
ROWTERMINATOR = '\n',
FORMATFILE = 'bcp.fmt');
You could also potentially have dirty data that uses quotes for more than just delimiters.
I'm trying to create a query to export a sheet from Excel to SQL Server. I came up with this query, yet I'm getting the error Invalid object name 'Sheet1$'.
How can I select from the sheet: "Sheet1"?
s = "INSERT INTO TestTable SELECT * FROM [Sheet1$] "
cn.Execute s
In your case I guess SQL Server doesn't have access to the Sheet1 file.
Check here how to make the file accessible, or what else could prevent SQL Server from locating your file.
There are two ways that I know of to achieve this.
1>> Save the sheet as a CSV file and bulk insert it (BULK INSERT reads delimited text, not the binary .xls format):
BULK INSERT TestTable
FROM 'C:\CSVData\sheet1.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n', --Use to shift the control to next row
ERRORFILE = 'C:\CSVDATA\SchoolsErrorRows.txt',
TABLOCK
)
But make sure that SQL Server has access to the folder from which you want to take the file, and that you have bulk import rights (a sketch of granting that right follows below).
Check out more info here.
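Granting bulk import rights, if that is what is missing, would look something like this (run as a sysadmin; the login name is illustrative):
GRANT ADMINISTER BULK OPERATIONS TO [DOMAIN\ImportUser]
-- or, equivalently, add the login to the bulkadmin server role:
-- ALTER SERVER ROLE bulkadmin ADD MEMBER [DOMAIN\ImportUser]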
2>> Alternatively, you could use the SQL Server Import and Export Wizard.
I am trying to import a .csv file into SQL Server using BULK INSERT and I am facing a weird problem. Initially it does not recognize any line feed or row terminator, which causes all the data to end up in the last column. But when I open the .csv file in Excel and re-save it as .csv, everything works and I am able to import the data using BULK INSERT.
My code is as follows:
BULK INSERT TS4_UCM_0 FROM 'C:\Test\stage1_Baseline1.csv'
WITH (
FIELDTERMINATOR =',',
FIRSTROW = 2,
ROWTERMINATOR ='\n')
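Based on the 0x0a row terminator tip further up this page, a variant worth trying when the original file uses bare line feeds would be:
BULK INSERT TS4_UCM_0 FROM 'C:\Test\stage1_Baseline1.csv'
WITH (
FIELDTERMINATOR = ',',
FIRSTROW = 2,
ROWTERMINATOR = '0x0a') -- hex for a bare line feed, which many non-Windows tools emit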
How can I load data records from an Excel file into an MS SQL database?
Provided your data types are consistent between the CSV columns and your database columns, a bulk insert would work.
BULK INSERT tablename
FROM 'C:\Temp\filename.csv'
WITH
(
FIRSTROW = 2,
MAXERRORS = 0,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
CSV files are in ASCII format and have problems with Unicode characters such as 'ی' – ramezani.saleh
For this problem I had to export my Excel file to a Unicode text file and then use:
BULK INSERT tablename FROM 'C:\Temp\filename.txt'
WITH (FIRSTROW = 2, MAXERRORS = 0, FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')
I think this works and solves the problem of CSV files with Unicode characters (such as 'ی').
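If the exported file is UTF-16 (which is what Excel's "Unicode Text" option produces), I believe you also need DATAFILETYPE = 'widechar'; a sketch with the same illustrative table and path:
BULK INSERT tablename
FROM 'C:\Temp\filename.txt'
WITH (
DATAFILETYPE = 'widechar', -- read the file as UTF-16 rather than single-byte characters
FIRSTROW = 2,
MAXERRORS = 0,
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '\n')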
Use SQuirreL SQL Client (Java based) + an Excel JDBC driver (for example http://sourceforge.net/projects/xlsql/) and copy the data with a SQL script.
You can export an Excel file to a CSV file and import it via SQL.