I'm new to SQL Server, so forgive me for being a bit of a noob here.
The code shown here returns the following error:
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".
Code:
BULK INSERT testingtable
FROM 'D:\TimeLords\data\db-test-file.csv'
WITH
(FORMAT = 'CSV',
FIELDQUOTE = '"',
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
TABLOCK)
I've tried using:
ROWTERMINATOR = '0x0a'
and
ROWTERMINATOR = '\r\n'
This is the CSV file: https://gyazo.com/0392b660c97e3cac27f2337993190c69
This is my SQL table: https://gyazo.com/fbbaf6204df9bb574d8887864cc95ea0
And this is the complete SQL query: https://gyazo.com/ffe020437f07524ce44420bedeebf0d4
I've scoured StackOverflow and can't find any solution that works. Any ideas would be appreciated.
Thanks
There's another potential culprit. I've been running BULK INSERTs into my SQL Server 2017 Express, and my syntax used FORMAT = 'CSV' and a ROWTERMINATOR of '\n' -- and it had been working fine for months.
I added a new column to the other system I was routinely exporting data from as a CSV, and when I went to do another BULK INSERT, it was failing because my CSV had an extra column that didn't line up with the columns in my SQL table. DOH! I just needed to add that same new column to my SQL db and all was well again. A stupid error on my part, but maybe it will help someone else.
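If you hit the same mismatch, the fix is simply to bring the table back in line with the file. A minimal sketch, reusing the question's table; the column name and type here are hypothetical:
-- Hypothetical: add the column the CSV export gained,
-- so the file's column count matches the table again.
ALTER TABLE testingtable
ADD NewExportColumn NVARCHAR(255) NULL; -- nullable, so existing rows are unaffected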
Change FORMAT = 'CSV' to DATAFILETYPE = 'char' (a sketch of that variant follows the code below),
or just remove the FORMAT = 'CSV' line, since your file may not be RFC 4180 compliant:
BULK INSERT testingtable
FROM 'D:\TimeLords\data\db-test-file.csv'
WITH
(FIELDQUOTE = '"',
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
TABLOCK)
This has worked for me with this error.
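And the first variant, declaring the file as plain character data instead of using the CSV parser (a sketch reusing the question's table and path; FIELDQUOTE is omitted since it belongs to the CSV parser):
BULK INSERT testingtable
FROM 'D:\TimeLords\data\db-test-file.csv'
WITH
(DATAFILETYPE = 'char', -- plain character data; skips the RFC 4180 CSV parser
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
TABLOCK)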
Old post, but hey, every bit of knowledge helps. You can also run into this issue if your CSV uses a different encoding or flavor, e.g. if you save the CSV for Macintosh or as UTF-8 (as you can in Excel); those are not compliant with FORMAT = 'CSV'. You can try other options, like ROWTERMINATOR = '\r' while removing FORMAT = 'CSV'; that did it for me for non-Windows CSV files.
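For example, a minimal sketch for a Macintosh-style (CR-terminated) file, reusing the first question's table and path as placeholders:
BULK INSERT testingtable
FROM 'D:\TimeLords\data\db-test-file.csv'
WITH
(FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\r', -- classic Mac line endings are a bare carriage return
TABLOCK)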
For me, the error was an extra space at the end of the first row.
I'm trying to import data from a .csv file into a SQL Server table.
Using the code below, I can read from the file:
BULK INSERT #TempTable
FROM '\\Data\TestData\ImportList.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2, LASTROW = 3)
GO
(I added LASTROW = 3 so I was just getting a subset of the data rather than dealing with all 2000 rows)
But I am getting multiple columns into a single column:
If I use the Import/Export wizard in SSMS, with the below settings, I see the expected results in the preview:
Can anyone give me some pointers as to how I need to update my query to perform correctly?
Here is a sample of what the CSV data looks like:
TIA.
You probably need to specify " as Text qualifier.
Your fields seem to be quoted and most likely contain commas, which are currently splitting your fields.
Or, if it works fine using <none> as Text qualifier, try to use FIELDQUOTE = '' or FIELDQUOTE = '\b' in your query. FIELDQUOTE defaults to '"'.
It's hard to tell what's really wrong without looking at some raw csv data that includes those quotes (as seen in your first screenshot).
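If the fields are indeed quoted, a hedged sketch of the quoted-CSV variant (table, path, and row range taken from the question; FORMAT = 'CSV' requires SQL Server 2017 or later):
BULK INSERT #TempTable
FROM '\\Data\TestData\ImportList.csv'
WITH (FORMAT = 'CSV', -- RFC 4180 parsing, SQL Server 2017+
FIELDQUOTE = '"', -- treat " as the text qualifier
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
FIRSTROW = 2,
LASTROW = 3)
GO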
I am doing a bulk operation on SQL Server 2008 and I get the OLE DB "BULK" error. I know that it is because my table does not have a column that the file has, or vice versa.
Despite knowing what is happening, this does not help me much. How can I find the exact line of the bulk load that is failing? Or any other hint to solve the error quickly.
Thank you.
Try this:
BULK INSERT [table_name]
FROM 'C:\...\...\[filename].csv' -- this is a server path, not a local one
WITH (
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
ERRORFILE = 'C:\...\...\[logfilename].log'
);
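Rows that fail to load are written to the ERRORFILE path, and SQL Server also creates a companion control file (same name with an .Error.Txt extension) with diagnostics for each rejected row, which usually points you straight at the offending line.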
I have a Persian CSV file and I need to read it into SQL Server with a bulk load.
I wrote this BULK INSERT:
BULK INSERT TEMP
FROM 'D:\t1.csv'
WITH(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
CODEPAGE = '1256'
);
but it cannot read UTF-8 encoding, and it reads the ی character as ?.
How can I write this correctly?
1. Go to the BULK INSERT documentation on MSDN
2. Find the section on CODEPAGE
3. See the note that says:
SQL Server does not support code page 65001 (UTF-8 encoding).
4. Research further, find Use Unicode Character Format to Import or Export Data (SQL Server), and see if that helps; a sketch of that approach follows the list
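A minimal sketch of that widechar route, assuming the file has first been re-saved as UTF-16 ("Unicode" in Notepad); table and path are from the question:
BULK INSERT TEMP
FROM 'D:\t1.csv'
WITH(
DATAFILETYPE = 'widechar', -- UTF-16 input; CODEPAGE is not needed here
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
);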
This problem is still there in SQL Server 2017; see here and here.
If your import is just an occasional exercise, i.e. if it's OK to import not using a script at all, what worked for me is simply importing the csv using Tasks -> Import -> Flat file.
I'm adding this here because this page is high up when you Google 'SQL Server does not support code page 65001'. Hope it helps some.
In addition to the now deprecated or obsolete earlier answers by others, I want to point out that as of today, in May 2022, with release version 15.0.2080.9 (SQL Server 2019), this works flawlessly for UTF-8.
Create a UTF-8 encoded file (I use one with a BOM)
then
BULK INSERT #tempTable1
FROM 'C:\....\file.csv' WITH (
CODEPAGE = '65001',
FIRSTROW = 2, --skip the first line
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n')
GO
works flawlessly for me, with many french and other characters.
I went through the documentation @marc_s linked to, and found the usage of DATAFILETYPE = widechar.
I then went ahead and tried it with my UTF-8 csv file, but it didn't work, giving me the error:
[...] the data file does not have a Unicode signature
I then re-saved my csv file with Notepad's Unicode format, retried the import, and voila, success.
Make sure all commas and line breaks are escaped (see here for how to save a valid CSV).
My full script (I'm using SQL Server 2017):
BULK INSERT [my_table]
FROM 'C:\path\to\file.csv'
WITH
(
FORMAT = 'CSV',
FIRSTROW = 2, -- if you have a title row, the first data row is 2nd
FIELDTERMINATOR = ',',
KEEPIDENTITY, -- remove it if you don't want identity to be kept
ROWTERMINATOR = '\n',
DATAFILETYPE = 'widechar',
ERRORFILE = 'C:\path\to\file_err.txt',
KEEPNULLS,
TABLOCK
)
Notes:
Make sure your date fields are in a valid SQL format (e.g. yyyy-mm-dd).
Regarding KEEPNULLS, read this question (e.g. if you have NULLs in your file, replace them with an empty string).
I'm importing a CSV compiled using Cygwin shell commands into MS SQL 2014 using:
BULK INSERT import
from 'D:\tail.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\r', FIRSTROW = 1)
GO
I have confirmed that each row ends with \r\n. If I leave a CR/LF on the last row, the bulk import fails with Msg 4832:
Bulk load: An unexpected end of file was encountered in the data file.
If I end the file at the end of the last data row then the bulk import succeeds.
For very large CSVs a kludgy way around this problem is to find the number of rows and use the LASTROW setting for the BULK INSERT. Is there a more graceful way to let MS SQL know that, yes, the last row is terminated and not to complain about it or fail on it?
Use the following:
BULK INSERT import
from 'D:\tail.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a', FIRSTROW = 1)
GO
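This works because a hexadecimal terminator is matched literally: 0x0a lines up with the final LF of each CRLF pair, so the file's closing \r\n terminates the last row cleanly. With ROWTERMINATOR = '\r', the \n of the final line dangles past the last terminator, which is what triggers the unexpected end-of-file. One side effect to watch for: the \r preceding each LF now ends up appended to the last field of every row, so that column may need trimming.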
How can I load data records from an Excel file into an MS SQL database?
Provided your data types are consistent between the CSV columns and your database columns, a bulk insert will work. Export the Excel sheet to a CSV first, then:
BULK INSERT tablename
FROM 'C:\Temp\filename.csv'
WITH
(
FIRSTROW = 2,
MAXERRORS = 0,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
CSV files are in ASCII format and have problems with Unicode characters such as 'ی' – ramezani.saleh
For this problem I must export my Excel file to a Unicode text file, and then I must use:
BULK INSERT tablename
FROM 'C:\Temp\filename.txt'
WITH (
FIRSTROW = 2,
MAXERRORS = 0,
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '\n'
)
I think this works, and it solves the problem of CSV files with Unicode characters (such as 'ی').
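Note that an Excel "Unicode text" export is UTF-16, so you may also need DATAFILETYPE = 'widechar' in the WITH clause (as in the widechar answer above) for those characters to survive the import.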
Use SQuirreL SQL Client (Java-based) + an Excel JDBC driver (for example http://sourceforge.net/projects/xlsql/) and copy the data with a SQL script.
You can export an Excel file to a CSV file and import it via SQL.