I have several chunks of a table in .dat files, and I want to import all of these chunks into a single SQL Server table.
For a single file I do:
BULK INSERT dbo.Tab
FROM 'C:\Data\1.dat'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
);
but how do I append the rest of the .dat files to the table?
You fire multiple BULK INSERT commands, one per file:
BULK INSERT dbo.Tab
FROM 'C:\Data\1.dat'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
);
BULK INSERT dbo.Tab
FROM 'C:\Data\2.dat'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
);
...
Alternatively (and probably better for performance), use some other program to merge the files together first.
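If the chunk files follow a predictable naming pattern, you can also generate the repeated statements in a loop with dynamic SQL. A minimal sketch, assuming the files are named 1.dat through N.dat (the file count and path here are assumptions; adjust them to your case):

```sql
-- Sketch: import numbered chunk files in a loop via dynamic SQL.
-- Assumes files C:\Data\1.dat ... C:\Data\10.dat exist.
DECLARE @i int = 1,
        @n int = 10,          -- number of chunk files (assumption)
        @sql nvarchar(max);

WHILE @i <= @n
BEGIN
    SET @sql = N'BULK INSERT dbo.Tab
FROM ''C:\Data\' + CAST(@i AS nvarchar(10)) + N'.dat''
WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'');';

    EXEC sys.sp_executesql @sql;
    SET @i += 1;
END;
```

Each iteration still issues a separate BULK INSERT, so this is a convenience over typing the statements out, not the performance win that pre-merging the files gives.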
Related
I am trying to bulk insert a data file into a SQL Server table. The data file's structure is:
Col1FSCol2FSCol3
Val1FSVal2FSVal3
FS is a control character (File Separator), corresponding to byte 0x1C.
However, the following BULK INSERT statement does not work:
bulk insert schema.table
from 'filepath'
with (
datafiletype = 'char'
, codepage = 'ACP'
, firstrow = 2
, fieldterminator = '0x1C'
, rowterminator = '\n'
, tablock
)
We just solved the problem: if FIELDTERMINATOR is specified in hex notation, ROWTERMINATOR must also be specified in hex notation.
Therefore, the solution is:
bulk insert schema.table
from 'filepath'
with (
datafiletype = 'char'
, codepage = 'ACP'
, firstrow = 2
, fieldterminator = '0x1C'
, rowterminator = '0x0A'
, tablock
)
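For reference, the same rule applies if the file uses Windows-style CRLF line endings: the row terminator is then carriage return plus line feed, which in hex notation is 0x0D0A. A variant of the statement above for that case (whether your file actually ends lines with CRLF is an assumption you should check in a hex editor):

```sql
bulk insert schema.table
from 'filepath'
with (
    datafiletype = 'char'
  , codepage = 'ACP'
  , firstrow = 2
  , fieldterminator = '0x1C'   -- FS control character
  , rowterminator = '0x0D0A'   -- CR + LF, for Windows line endings
  , tablock
)
```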
I have a CSV file in UTF-8 encoding and I would like to import its data into a SQL Server table.
Some cells contain values like:
±40%;
16.5±10%;
All columns load perfectly, but the columns containing the ± character come out garbled in the DB.
For all columns where I want to store this character I use nvarchar(50) with the collation Latin1_General_100_CS_AS_WS_SC_UTF8.
Is there any way to store this character in the DB?
Thank you
EDIT
I load the CSV file with:
BULK INSERT [dbo].[x]
FROM 'c:\Users\x\Downloads\x.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ';', --CSV field delimiter
ROWTERMINATOR = '\n', --Use to shift the control to next row
ERRORFILE = 'c:\Users\x\Downloads\xx.csv',
TABLOCK
);
I also tried changing the SSMS options (Tools -> Options -> Environment -> Fonts and Colors -> select 'Grid Results' and set the font to Arial), but without any positive result.
I have over 20 million records in many files that I want to import.
Have you tried adding CODEPAGE = 65001 (UTF-8) to the WITH clause of the BULK INSERT statement?
BULK INSERT [dbo].[x]
FROM 'c:\Users\x\Downloads\x.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ';', --CSV field delimiter
ROWTERMINATOR = '\n', --Use to shift the control to next row
ERRORFILE = 'c:\Users\x\Downloads\xx.csv',
CODEPAGE = 65001,
TABLOCK
);
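Note that CODEPAGE = 65001 is only accepted on more recent SQL Server versions; older ones reject it with "The code page 65001 is not supported by the server." If that happens, one workaround, sketched here on the assumption that you can re-save the source file, is to convert the file to UTF-16 and load it with DATAFILETYPE = 'widechar' instead of a codepage:

```sql
-- Sketch: load a UTF-16 ("Unicode") re-save of the same file.
-- Assumes the CSV has been re-saved as UTF-16 in an editor.
BULK INSERT [dbo].[x]
FROM 'c:\Users\x\Downloads\x.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = ';',
    ROWTERMINATOR = '\n',
    DATAFILETYPE = 'widechar', -- UTF-16 input instead of CODEPAGE
    ERRORFILE = 'c:\Users\x\Downloads\xx.csv',
    TABLOCK
);
```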
I have a problem with BULK INSERT. I have about 7 million records in a CSV file, but when I run the following query, only half of the records end up in the table.
The query I use:
BULK INSERT *Table_Name* FROM 'file path'
WITH
(
FIRSTROW = 1, -- the CSV has no header
FIELDTERMINATOR = '0x0a',
ROWTERMINATOR = '\n',
ERRORFILE = '*error file path*',
TABLOCK
)
Table Creation query:
CREATE TABLE *Table_Name*
(
ID int NOT NULL IDENTITY(1,1) PRIMARY KEY,
*Column_Name* varchar(50)
);
Where is my mistake?
How can I load all 7 million records from the CSV file?
I want to bulk import from a CSV file into SQL Server, but \n is not working as the row terminator: no records are read from the CSV file when I use \n, and when I use
ROWTERMINATOR = '0x0A'
it mixes up all the records.
This is the code I am using in my stored procedure:
Create Table #temp
(
Field1 nvarchar(max) null,
Field2 nvarchar(max) null,
Field3 nvarchar(max) null
)
BULK INSERT #temp
FROM 'c:\file.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n', --not working
--ROWTERMINATOR = '\r', --not working
--ROWTERMINATOR = char(10), ---not working
--ROWTERMINATOR = char(13), ---not working
TABLOCK
)
INSERT INTO table_name
(
tbl_field1,tbl_field2,tbl_field3
)
SELECT
field1,
field2,
field3
FROM #temp
Thanks in advance.
I did it with the help of @DVO. Thank you, @DVO, for the answer; it works as per your instructions. I used Notepad++ to see the hidden characters and handled them accordingly.
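For readers hitting the same thing: the hidden characters Notepad++ reveals are often CR LF pairs (Windows line endings) that a bare \n does not match cleanly. A sketch of the fix for that specific case, assuming the editor shows CR LF at the end of each line (if it shows only LF, '0x0a' would be the terminator instead):

```sql
-- Sketch: row terminator matching CR+LF as shown by Notepad++ (assumption).
BULK INSERT #temp
FROM 'c:\file.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',     -- CSV field delimiter
    ROWTERMINATOR = '0x0d0a',  -- CR+LF in hex notation
    TABLOCK
)
```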
I have the following T-SQL statement:
BULK INSERT #TempTable
FROM 'C:\csvfile.csv'
WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR ='\n', FIRSTROW = 2, KEEPIDENTITY )
I am test-running it on a three-row CSV file whose first row contains the headers, so there are two data rows.
However, it only reads line 2 and never line 3.
Any idea why?
A line break was needed after the last row. Ugh.