BULK INSERT from CSV does not read last row

I have the following T-SQL statement:
BULK INSERT #TempTable
FROM "C:\csvfile.csv"
WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR ='\n', FIRSTROW = 2, KEEPIDENTITY )
I am testing it on a 3-row CSV file whose first row contains the headers, so there are 2 data rows.
However, it only reads line 2 and never line 3.
Any idea why?

A line break was needed after the last row. Ugh.
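To spell that out (the statement below just restates the one above with comments; the fix is in the file, not the SQL):
-- Unchanged statement; as the answer above says, the last data line in
-- C:\csvfile.csv must itself be followed by the '\n' row terminator,
-- otherwise the final row is never read.
BULK INSERT #TempTable
FROM 'C:\csvfile.csv'
WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2, KEEPIDENTITY )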

Insert plus-minus Unicode character into SQL Server DB (TSQL)

I have a CSV file in UTF-8 encoding and I would like to import its data into a SQL Server DB table.
Some cells contain values like:
±40%;
16.5±10%;
All columns load fine, but in the columns containing the ± character the value shows up garbled in the DB.
For all columns where I want to store this character I use nvarchar(50) with the collation Latin1_General_100_CS_AS_WS_SC_UTF8.
Is there any way to store this character in the DB?
Thank you
EDIT
I use this to load the CSV file:
BULK INSERT [dbo].[x]
FROM 'c:\Users\x\Downloads\x.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ';', -- CSV field delimiter
ROWTERMINATOR = '\n', -- row terminator (line feed)
ERRORFILE = 'c:\Users\x\Downloads\xx.csv',
TABLOCK
);
I also tried changing the SSMS options:
'Tools' -> 'Options' -> 'Environment' -> 'Fonts and Colors' -> select 'Grid Results'
and set the font to Arial, but with no positive result.
I have over 20 million records in many files that I want to import.
Have you tried adding CODEPAGE = 65001 (UTF-8) to the WITH clause of the BULK INSERT statement?
BULK INSERT [dbo].[x]
FROM 'c:\Users\x\Downloads\x.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ';', -- CSV field delimiter
ROWTERMINATOR = '\n', -- row terminator (line feed)
ERRORFILE = 'c:\Users\x\Downloads\xx.csv',
CODEPAGE = 65001,
TABLOCK
);
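If the server rejects CODEPAGE = 65001 (older SQL Server versions do not accept the UTF-8 code page for bulk load), a fallback sketch is to re-save the file as UTF-16 and use DATAFILETYPE = 'widechar' instead; paths and terminators are kept from the question, and the re-save step is an assumption:
BULK INSERT [dbo].[x]
FROM 'c:\Users\x\Downloads\x.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ';', -- CSV field delimiter
ROWTERMINATOR = '\n', -- row terminator (line feed)
DATAFILETYPE = 'widechar', -- assumes the file was re-saved as UTF-16 (Unicode)
ERRORFILE = 'c:\Users\x\Downloads\xx.csv',
TABLOCK
);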

Bulk insert doesn't insert any rows from my txt file

My BULK INSERT doesn't insert any rows from my txt file.
Here is the code I use for the bulk insert:
BULK INSERT mytablename
FROM '\\myIP\myfolder\myfile.txt' WITH (
FIELDTERMINATOR = ';'
,ROWTERMINATOR = '\n'
)
I got "0 rows affected" after executing this code
My file is a txt file with 1 row for each record and each colum is separated by ' ; '
There is no header with column name inside my file.
thanks a lot
The encoding of my file was UTF-8 with BOM! I tried it with plain UTF-8 (without BOM) and it worked!
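An alternative sketch (not what was actually done here): on a SQL Server version that accepts the UTF-8 code page, adding CODEPAGE = 65001, as in the previous answer, may let BULK INSERT read the UTF-8-with-BOM file as-is instead of re-saving it:
BULK INSERT mytablename
FROM '\\myIP\myfolder\myfile.txt' WITH (
FIELDTERMINATOR = ';'
,ROWTERMINATOR = '\n'
,CODEPAGE = 65001 -- read the file as UTF-8; assumes a version that supports this code page
)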

Insert data from CSV to SQL server, insert problem

I have a problem with BULK INSERT. I have about 7 million records in the CSV file, but when I run the following query, only about half of the total records end up in the table.
The query I use:
BULK INSERT *Table_Name* FROM 'file path'
WITH
(
FIRSTROW = 1, -- the CSV doesn't have a header
FIELDTERMINATOR = '0x0a',
ROWTERMINATOR = '\n',
ERRORFILE = '*error file path*',
TABLOCK
)
Table Creation query:
CREATE TABLE *Table_Name*
(
ID int NOT NULL IDENTITY(1,1) PRIMARY KEY,
*Column_Name* varchar(50)
);
Where am I making a mistake?
How can I load all the records from the CSV file (all 7 million)?
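No answer is recorded here, but one reading of the symptom (my observation, not from the thread): FIELDTERMINATOR = '0x0a' is a line feed, the same character as the '\n' row terminator, and since the table has two columns (ID plus *Column_Name*) BULK INSERT expects two fields per row, so each inserted row can swallow two file lines, which would roughly halve the row count. A minimal sketch of a workaround under that assumption, loading through a one-column staging table so each line maps to exactly one row and letting the IDENTITY column assign itself (the staging table is made up for illustration):
-- Hypothetical staging table with a single column; with one value per line
-- no FIELDTERMINATOR is needed at all.
CREATE TABLE #Staging (*Column_Name* varchar(50));

BULK INSERT #Staging
FROM 'file path'
WITH
(
FIRSTROW = 1, -- the CSV has no header
ROWTERMINATOR = '\n',
ERRORFILE = '*error file path*',
TABLOCK
);

-- Copy into the real table; ID is filled in by the IDENTITY property.
INSERT INTO *Table_Name* (*Column_Name*)
SELECT *Column_Name* FROM #Staging;

DROP TABLE #Staging;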

SQL - read CSV file with unexpected EOF error

I need to read data from a .csv file which contains many records, but the last row contains the string "the_end", and after it there is no LF character.
Here is my csv file:
1,James,Smith,19750101
2,Meggie,Smith,19790122
3,Robert,Smith,20071101
4,Alex,Smith,20040202
the_end
Below is my SQL script, which reads the data into a temporary table:
create table #Data
(
id int,
first_name varchar(50),
last_name varchar(50),
birthdate smalldatetime
)
bulk insert #Data
from 'C:\Users\Michał\Desktop\csvtest.csv'
with
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
select * from #Data
drop table #Data
Error is:
Msg 4832, Level 16, State 1, Line 13
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 13
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 13
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
I want to read all the data into my table without this last "the_end" label line, obviously. How can I manage that? I cannot modify the file, because it comes from an outside source and I cannot change it.
Use the LASTROW option:
bulk insert #Data
from 'C:\Users\Michał\Desktop\csvtest.csv'
with
(
FIELDTERMINATOR = ','
, ROWTERMINATOR = '\n'
, lastrow = 4
)
A dynamic SQL example using a variable for the last line:
create table #Data (
id int
, first_name varchar(50)
, last_name varchar(50)
, birthdate smalldatetime
);
declare @lastline int = 4;
declare @sql nvarchar(max) = '
bulk insert #Data
from ''C:\Users\Michał\Desktop\csvtest.csv''
with (
FIELDTERMINATOR = '',''
, ROWTERMINATOR = ''\n''
, lastrow = ' + convert(nvarchar(11), @lastline) + '
);';
exec sp_executesql @sql;
select * from #Data
drop table #Data
If the LASTROW option doesn't work for you, manually delete the last row and re-run your bulk-load process. If everything works fine, create a VB.NET executable that opens the file, strips out the last row, and re-saves it. You can then schedule the delete-last-row step and the bulk-load step with the Windows Task Scheduler.
https://www.sevenforums.com/tutorials/11949-elevated-program-shortcut-without-uac-prompt-create.html

Bulk insert of multiple files in SQL Server

I have several chunks of a table in .dat files.
I want to import all of these chunks into a single SQL Server table.
To load one of them I do:
BULK INSERT dbo.Tab
FROM 'C:\Data\1.dat'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
);
But how do I append the rest of the .dat files to the table?
You fire multiple BULK INSERT commands:
BULK INSERT dbo.Tab
FROM 'C:\Data\1.dat'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
);
BULK INSERT dbo.Tab
FROM 'C:\Data\2.dat'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
);
...
Alternatively (and probably better for performance), use some other program to merge the files together first.
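If the chunk files follow a numeric naming pattern, the repeated statements can also be generated in a loop with dynamic SQL. A sketch assuming the files are named 1.dat through N.dat under C:\Data (the @n file count is illustrative):
DECLARE @i int = 1, @n int = 10; -- @n = number of chunk files (assumed)
DECLARE @sql nvarchar(max);

WHILE @i <= @n
BEGIN
    -- Build one BULK INSERT per file and run it.
    SET @sql = N'BULK INSERT dbo.Tab
FROM ''C:\Data\' + CONVERT(nvarchar(11), @i) + N'.dat''
WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'');';
    EXEC sp_executesql @sql;
    SET @i += 1;
END;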
