Import text file data into SQL Server database

I have a text file in the format below.
I want to import it into a SQL Server database, splitting it into several columns:
Terminal, NetSales, NetAjustment, CancelsCnt, CancelAmount,
CashesCnt, CashesAmount, ClaimsCnt, ClaimsAmount, SalesCommission,
CashCommission, NetDue
I have tried to load the text file into SQL Server using SSIS, but it inserts everything into one column instead of splitting it. I then used SQL scripting to split it into several columns, but that is not working either.
I'm having some difficulty splitting the columns from the text file.
Any ideas or help on how I can capture those columns in a proper format?

I would suggest using the SSIS Bulk Insert Task.
Bulk Insert Task in SSIS
It provides the same functionality as the T-SQL BULK INSERT statement.
It allows you to specify where the real first row starts via its FIRSTROW parameter.
Here is a conceptual example.
SQL
CREATE TABLE dbo.tbl (
    Terminal     VARCHAR(20),
    NetSales     VARCHAR(30),
    NetAjustment VARCHAR(100),
    CancelsCnt   INT
    -- ... remaining columns
);
BULK INSERT dbo.tbl
FROM 'e:\Temp\inputFile.csv'
WITH (FORMAT='CSV'
, DATAFILETYPE = 'char' -- { 'char' | 'native' | 'widechar' | 'widenative' }
, FIELDTERMINATOR = '\t' -- for a TAB
, ROWTERMINATOR = '\n'
, FIRSTROW = 8
, CODEPAGE = '65001');
-- test
SELECT * FROM dbo.tbl;

Related

BULK INSERT type mismatch when table created from same .CSV

I receive update information for items on a daily basis via a CSV file that includes date/time information in the format YYYY-MM-DDThh:mm:ss.
I used the Management Studio task "Import Flat File..." to create a table dbo.fullItemList and import the contents of the initial file. It identified the date/time columns as type datetime2(7) and imported the data correctly. I then copied this table to create a blank table dbo.dailyItemUpdate.
I want to create a script that imports the CSV file to dbo.dailyItemUpdate, uses a MERGE function to update dbo.fullItemList, then wipes dbo.dailyItemUpdate ready for the next day.
The bit I can't get to work is the import. As the table already exists, I'm using the following:
BULK INSERT dbo.dailyItemUpdate
FROM 'pathToFile\ReceivedFile.csv'
WITH
(
DATAFILETYPE = 'char',
FIELDQUOTE = '"',
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
TABLOCK
)
But I get a "type mismatch..." error on the date/time columns. How come the BULK INSERT fails, even though the data type was picked up by the "Import Flat File" function?
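If it helps to isolate the failing values, one hedged workaround (the column names below are placeholders, not the actual schema) is to bulk insert into an all-varchar staging table first and convert explicitly, so bad date/time strings show up as NULLs instead of load errors:
-- Hypothetical staging table: every column as text, so BULK INSERT cannot fail on types
CREATE TABLE dbo.dailyItemUpdate_stage
(
    ItemId       VARCHAR(50),   -- placeholder column
    UpdatedAtRaw VARCHAR(50)    -- holds the raw YYYY-MM-DDThh:mm:ss text
);

BULK INSERT dbo.dailyItemUpdate_stage
FROM 'pathToFile\ReceivedFile.csv'
WITH (DATAFILETYPE = 'char', FIELDQUOTE = '"', FIRSTROW = 2,
      FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);

-- TRY_CONVERT with style 126 (ISO 8601) returns NULL instead of raising an error,
-- which makes the offending rows easy to find before the MERGE
INSERT INTO dbo.dailyItemUpdate (ItemId, UpdatedAt)
SELECT ItemId, TRY_CONVERT(datetime2(7), UpdatedAtRaw, 126)
FROM dbo.dailyItemUpdate_stage;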

How to bulk insert with derived columns in SQL Server?

Newbie to SQL Server!
I am trying to perform a bulk insert into SQL Server. I have the following CSV file named input.csv:
NO,Name,age,Reference,dateTime,Category
1,Stack#mail,23,Kiop,2017-03-02T12:23:00,D
2,OverEnd#Yahoo,22,CSK,2017-03-03T12:23:00,I
I have to move that CSV data into SQL Server using BULK INSERT, into the table schema below:
create table BulkInsertTemp
(
no int,
name nvarchar(50),
age int,
Ref nvarchar(30),
currentDatetime datetime,
Category nvarchar(40)
)
Now I need the data stored in SQL Server like this:
no   Name      age   Ref    currentDatetime        Category
------------------------------------------------------------
1    Stack     23    Kiop   2017-03-02 12:23:00    D
2    OverEnd   22    CSK    2017-03-03 12:23:00    I
I just tried the query below for another table I moved into SQL Server.
create table bulkInsert(no varchar(50),name varchar(50));
BULK INSERT bulkInsert
FROM 'C:\MyInput\BulkInsert\BulkInsertData.txt'
WITH
(FIRSTROW = 1,
ROWTERMINATOR = '\n',
FIELDTERMINATOR = ',',
ROWS_PER_BATCH = 10000)
That query worked when there was no need to modify the data.
But for input.csv I have to change column values, for example a name of "Stack#mail" should be stored as "Stack" in SQL Server.
I am new to the bulk insert option, so I don't know how to derive columns from existing ones.
Can anyone please guide me on how to meet this requirement?
I would recommend building an SSIS package to do this. If you don't know how, or don't have time, you could run the SQL Server Import and Export Wizard, which will actually create an SSIS package for you behind the scenes.
Hope that gets you going in the right direction.
Noel
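If you'd rather stay in T-SQL, one hedged alternative (the staging table name, file path, and string logic below are assumptions, not part of Noel's answer) is to bulk insert into an all-text staging table and derive the cleaned columns while copying into BulkInsertTemp:
-- Hypothetical staging table mirroring the raw CSV layout, everything as text
CREATE TABLE dbo.BulkInsertStage
(
    no          VARCHAR(10),
    name        VARCHAR(50),
    age         VARCHAR(10),
    Ref         VARCHAR(30),
    dateTimeRaw VARCHAR(30),
    Category    VARCHAR(40)
);

BULK INSERT dbo.BulkInsertStage
FROM 'C:\MyInput\BulkInsert\input.csv'   -- assumed location of input.csv
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- Derive the cleaned name (text before the '#') while loading the real table
INSERT INTO BulkInsertTemp (no, name, age, Ref, currentDatetime, Category)
SELECT CAST(no AS INT),
       LEFT(name, CHARINDEX('#', name + '#') - 1),   -- 'Stack#mail' becomes 'Stack'
       CAST(age AS INT),
       Ref,
       CAST(dateTimeRaw AS DATETIME),                -- ISO 8601 text converts directly
       Category
FROM dbo.BulkInsertStage;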

How to import from text file to sql server table having millions of records

I have a text file of a co-author data set containing author IDs and numbers of co-authored papers, separated by spaces. I want to import this data into a SQL Server table; the number of records in the text file is in the millions (the file is about 73 MB).
Please suggest a way to import this file into a SQL Server table.
Thanks
BULK
INSERT yourtable
FROM 'location with filename'
WITH
(
FIELDTERMINATOR = ' ',
ROWTERMINATOR = '\n'
);
You can find more details here:
http://www.codeproject.com/Tips/775961/Import-CSV-or-txt-File-Into-SQL-Server-Using-Bulk
Alternatively, a GUI approach (for example, the SQL Server Import and Export Wizard) can be useful.
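For reference, here is a minimal sketch of a matching target table; the actual file layout is not shown in the question, so the assumption below is that each line holds a pair of author IDs plus the number of papers they co-authored:
-- Assumed target table for the space-separated co-author records
CREATE TABLE CoAuthorStats
(
    AuthorId1  INT,
    AuthorId2  INT,
    PaperCount INT
);

BULK INSERT CoAuthorStats
FROM 'C:\data\coauthors.txt'   -- placeholder path
WITH (FIELDTERMINATOR = ' ', ROWTERMINATOR = '\n', TABLOCK);   -- TABLOCK helps with large loads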

Bulk insert with text qualifier in SQL Server

I am trying to bulk insert a few records into a test table from a CSV file:
CREATE TABLE Level2_import
(wkt varchar(max),
 area VARCHAR(40)
)
BULK
INSERT level2_import
FROM 'D:\test.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
The bulk insert should skip the first row and insert the data into the table. It gets rid of the first row all right, but it gets confused by the delimiters. The first column is wkt, and the column value is double-quoted and contains commas within the value.
So I guess my question is: is there a way to tell BULK INSERT that the double-quoted part is one column, regardless of the commas within it?
The CSV file looks like this:
"MULTIPOLYGON (((60851.286135090661 510590.66974495345,60696.086128673756 510580.56976811233,60614.7860844061 510579.36978015327,60551.486015895614)))", 123123.22
You need to use a 'format file' to implement a text qualifier for bulk insert. Essentially, you will need to teach the bulk insert that there are potentially different delimiters for each field.
Create a text file called "level_2.fmt", save it, and give it the following contents:
11.0
2
1 SQLCHAR 0 8000 "\"," 1 wkt SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 40 "\r\n" 2 area SQL_Latin1_General_CP1_CI_AS
The first line, "11.0" refers to your version of SQL. The second line shows that your table, [level2_import], has two columns. Each line after that will describe a column, and obeys the following format:
[Source Column Number][DataType][Min Size][Max Size][Delimiter pattern][Destination Column Number][Destination Column Name][Case sensitivity of database]
Once you've created that file, you can read in your data with the following bulk insert statement:
BULK INSERT level2_import
FROM 'D:\test.csv'
WITH
(
FIRSTROW = 2,
FORMATFILE='D:\level_2.fmt'
);
Refer to this blog for a detailed explanation of the format file.
SQL Server 2017 finally added support for text qualifiers and the CSV format defined in RFC 4180. It should be enough to write:
BULK INSERT level2_import
FROM 'D:\test.csv'
WITH ( FORMAT = 'CSV', ROWTERMINATOR = '\n', FIRSTROW = 2 )
Try renaming the format file from .fmt to .txt; that worked for me.
I had this issue working with LDAP data: the dn contains commas, as do other fields that contain dns. Try changing your field terminator to another, unused character, like a pipe | or semicolon ;. Do this in both the data and the file definition.
So the code should be:
CREATE TABLE Level2_import
(wkt varchar(max),
 area VARCHAR(40)
)
BULK
INSERT level2_import
FROM 'D:\test.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n'
)
and your CSV:
"MULTIPOLYGON (((60851.286135090661 510590.66974495345,60696.086128673756 510580.56976811233,60614.7860844061 510579.36978015327,60551.486015895614)))"; 123123.22

Easiest way to import CSV into SQL Server 2005

I have several CSV files of about 5K each that I need to import into SQL Server 2005.
This used to be simple with DTS. I tried to use SSIS previously and it seemed to be about 10x as much effort, and I eventually gave up.
What would be the simplest way to import the CSV data into SQL Server? Ideally, the tool or method would create the table as well; since there are about 150 fields in it, this would simplify things.
Sometimes with this data there will be 1 or 2 rows that need to be manually modified because they do not import correctly.
Try this:
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
Here is a summary of the code from the link:
Create Table:
CREATE TABLE CSVTest
(ID INT,
FirstName VARCHAR(40),
LastName VARCHAR(40),
BirthDate SMALLDATETIME)
GO
Import data:
BULK
INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = ','
,ROWTERMINATOR = '\n'
--,FIRSTROW = 2
--,MAXERRORS = 0
)
GO
Use the content of the table:
SELECT *
FROM CSVTest
GO
