Bulk insert doesn't insert any rows from my txt file - sql-server

My BULK INSERT doesn't insert any rows from my txt file. Here is the code I use:
BULK INSERT mytablename
FROM '\\myIP\myfolder\myfile.txt' WITH (
FIELDTERMINATOR = ';'
,ROWTERMINATOR = '\n'
)
I get "0 rows affected" after executing this code.
My file is a plain text file with one row per record, and each column is separated by ';'.
There is no header row with column names inside the file.
Thanks a lot.

The encoding of my file was UTF-8 with BOM! I re-saved it as UTF-8 (without BOM) and it worked!
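For anyone hitting the same thing: a UTF-8 BOM is easy to check for and strip in a script before pointing BULK INSERT at the file. A small Python sketch (helper names are mine, not from any library):

```python
import codecs

def has_utf8_bom(path):
    """Return True if the file starts with the 3-byte UTF-8 BOM (EF BB BF)."""
    with open(path, "rb") as f:
        return f.read(3) == codecs.BOM_UTF8

def strip_utf8_bom(path):
    """Rewrite the file in place without its UTF-8 BOM, if present."""
    with open(path, "rb") as f:
        data = f.read()
    if data.startswith(codecs.BOM_UTF8):
        with open(path, "wb") as f:
            f.write(data[len(codecs.BOM_UTF8):])
```

Alternatively, keeping the BOM and adding a CODEPAGE option to the BULK INSERT (as in the answer below about the ± character) can also work on newer SQL Server versions.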

Related

Insert plus-minus Unicode character into SQL Server DB (TSQL)

I have a CSV file in UTF-8 encoding and I would like to import its data into a SQL Server table.
Some cells contain values like:
±40%;
16.5±10%;
All columns load perfectly, but the columns containing the ± character display incorrectly in the DB.
For every column where I want to store this character I use nvarchar(50) with collation Latin1_General_100_CS_AS_WS_SC_UTF8.
Is there any way to store this character in the DB?
Thank you
EDIT
I use the following to load the CSV file:
BULK INSERT [dbo].[x]
FROM 'c:\Users\x\Downloads\x.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ';', --CSV field delimiter
ROWTERMINATOR = '\n', -- row terminator
ERRORFILE = 'c:\Users\x\Downloads\xx.csv',
TABLOCK
);
I also tried changing the SSMS options:
'Tools' -> 'Options' -> 'Environment' -> 'Fonts and Colors' -> select 'Grid Results'
and set the font to Arial, but without success.
I have over 20 million records across many files that I want to import.
Have you tried adding a CODEPAGE=65001 (UTF-8) to the WITH clause of the BULK INSERT statement?
BULK INSERT [dbo].[x]
FROM 'c:\Users\x\Downloads\x.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ';', --CSV field delimiter
ROWTERMINATOR = '\n', -- row terminator
ERRORFILE = 'c:\Users\x\Downloads\xx.csv',
CODEPAGE = 65001,
TABLOCK
);
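To illustrate why the code page matters: ± is a single character but two bytes in UTF-8, and decoding those bytes with a single-byte code page (which is what a default CODEPAGE often amounts to) produces the familiar two-character mojibake instead. A quick Python sketch of the same effect:

```python
# '±' (U+00B1) encodes to two bytes in UTF-8.
utf8_bytes = "±".encode("utf-8")
print(utf8_bytes)  # b'\xc2\xb1'

# Reading those two bytes back with a single-byte code page such as
# Latin-1 turns them into two separate characters -- classic mojibake.
print(utf8_bytes.decode("latin-1"))  # Â±
```

One caveat: if I recall correctly, CODEPAGE = 65001 is only supported on SQL Server 2016 (and 2014 SP2) or later; older versions reject it.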

Insert data from CSV to SQL server, insert problem

I have a problem with BULK INSERT. There are about 7 million records in the CSV file, but when I run the following query, only about half of them end up in the table.
The query I use:
BULK INSERT *Table_Name* FROM 'file path'
WITH
(
FIRSTROW = 1, -- the CSV doesn't have a header
FIELDTERMINATOR = '0x0a',
ROWTERMINATOR = '\n',
ERRORFILE = '*error file path*',
TABLOCK
)
Table Creation query:
CREATE TABLE *Table_Name*
(
ID int NOT NULL IDENTITY(1,1) PRIMARY KEY,
*Column_Name* varchar(50)
);
Where am I making a mistake?
How can I load all 7 million records from the CSV file?
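Not a full answer, but a diagnostic sketch in Python (the helper name is mine) that counts which line-ending styles are actually present in the file; mixed endings are a common reason only part of a file loads:

```python
from collections import Counter

def count_line_endings(path):
    """Count CRLF vs bare-LF vs bare-CR line endings (reads the whole file)."""
    with open(path, "rb") as f:
        data = f.read()
    crlf = data.count(b"\r\n")
    return Counter(crlf=crlf,
                   lf=data.count(b"\n") - crlf,   # LF not preceded by CR
                   cr=data.count(b"\r") - crlf)   # CR not followed by LF
```

If both `crlf` and `lf` come back nonzero, no single ROWTERMINATOR can match every row. Also note that, per the SQL Server documentation, ROWTERMINATOR = '\n' is treated as '\r\n' on import, so a file with bare LF endings needs ROWTERMINATOR = '0x0a'.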

I want to bulk import from CSV file in sql but \n for new line is not working in SQL

I want to bulk import from a CSV file in SQL, but '\n' is not working as the row terminator: it doesn't read any records from the CSV file if I use '\n', but when I use
ROWTERMINATOR = '0x0A'
it mixes up all the records.
This is the code I am using in my stored procedure:
Create Table #temp
(
Field1 nvarchar(max) null,
Field2 nvarchar(max) null,
Field3 nvarchar(max) null
)
BULK INSERT #temp
FROM 'c:\file.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n', --not working
--ROWTERMINATOR = '\r', --not working
--ROWTERMINATOR = char(10), ---not working
--ROWTERMINATOR = char(13), ---not working
TABLOCK
)
INSERT INTO table_name
(
tbl_field1,tbl_field2,tbl_field3
)
SELECT
field1,
field2,
field3
FROM #temp
Thanks in advance.
I did it with the help of #DVO. Thank you, #DVO, for the answer; it is working fine per your instructions. I used Notepad++ to show the hidden characters and handled them accordingly.
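For reference, the Notepad++ "show all characters" check can also be scripted. This Python sketch (the helper name is mine) returns the first few lines with control characters made visible, so the real field and row terminators show up:

```python
def visible_lines(path, n_lines=5):
    """Return the first n_lines as repr() strings so \r, \n, \t are visible."""
    with open(path, "rb") as f:
        return [repr(raw) for raw, _ in zip(f, range(n_lines))]
```

For example, a line shown as `b'a,b\r\n'` ends in CRLF, while `b'c,d\n'` ends in a bare LF; seeing both in one file explains why a single ROWTERMINATOR fails.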

Bulk insert .txt file in SQL

I'm trying to import a .txt file into Advanced Query Tool (the SQL client I use). So far, I have:
CREATE TABLE #tb_test
(
id INTEGER,
name varchar(10),
dob date,
city char(20),
state char(20),
zip integer
);
insert into #tb_test
values
(1,'TEST','2015-01-01','TEST','TEST',11111)
;
bulk insert #tb_test
from 'h:\tbdata.txt'
with
(
fieldterminator = '\t',
rowterminator = '\n'
);
I receive an error message saying there's a syntax error on line 1. Am I missing a database qualifier for #tb_test (like db.#tb_test)?
Here's a line from the tbdata.txt file:
2,'TEST2','2012-01-01','TEST','TEST',21111
I was curious about this question and found the following solution:
Your data is comma-separated, but you are trying to split it by TAB.
Two options: change the file data to be TAB-separated, or change fieldterminator = '\t' to fieldterminator = ','.
The DATE format has issues when loading directly from a file; my best solution is to change the temp field dob to VARCHAR(20) and then convert it to DATE when passing it on to the final display/data storage.
Here is the corrected code:
CREATE TABLE #tb_test
(
id INTEGER,
name varchar(10),
dob varchar(20),
city char(20),
state char(20),
zip integer
);
insert into #tb_test
values
(1,'TEST','2015-01-01','TEST','TEST',11111)
;
bulk insert #tb_test
from 'h:\tbdata.txt'
with
(
fieldterminator = ',',
rowterminator = '\n'
);
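Since dob is loaded as VARCHAR and converted later, it can help to verify up front that every date value really is YYYY-MM-DD. A Python sketch (the helper name, column position, and single-quote handling are assumptions based on the sample line above):

```python
from datetime import date

def bad_dates(path, field_sep=",", date_col=2):
    """Return (line_number, value) pairs whose date column is not YYYY-MM-DD."""
    bad = []
    with open(path, encoding="utf-8") as f:
        for i, line in enumerate(f, start=1):
            value = line.rstrip("\r\n").split(field_sep)[date_col].strip("'")
            try:
                date.fromisoformat(value)  # raises ValueError on bad dates
            except ValueError:
                bad.append((i, value))
    return bad
```

Any rows it reports can then be fixed (or filtered) before the final CONVERT to DATE in SQL Server.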

BULK INSERT from CSV does not read last row

I have the following T-SQL statement:
BULK INSERT #TempTable
FROM 'C:\csvfile.csv'
WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR ='\n', FIRSTROW = 2, KEEPIDENTITY )
I am test-running it on a 3-row CSV file whose first row contains the headers, so there are 2 data rows.
However, it only ever reads line 2, never line 3.
Any idea why?
A line break was needed after the last row. Ugh.
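For scripted imports, the missing final line break can be fixed automatically before the load; a Python sketch (the helper name is mine):

```python
def ensure_trailing_newline(path, terminator=b"\r\n"):
    """Append the row terminator if the file does not already end with it."""
    with open(path, "rb+") as f:
        f.seek(0, 2)                 # jump to end of file
        size = f.tell()
        if size == 0:
            return                   # empty file: nothing to terminate
        f.seek(max(0, size - len(terminator)))
        if f.read() != terminator:
            f.write(terminator)
```

Pass `terminator=b"\n"` instead if the file uses bare LF endings (and ROWTERMINATOR = '0x0a' on the SQL side).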
