Date import from CSV to Microsoft SQL Server

I'm trying to import a CSV file containing data into my SQL Server database using the SQL Server Import Wizard in Microsoft SQL Server Management Studio.
The date column in the CSV file looks like this:
02-04-2021
03-04-2021
05-04-2021
06-04-2021
07-04-2021
08-04-2021
09-04-2021
10-04-2021
12-04-2021
13-04-2021
14-04-2021
15-04-2021
16-04-2021
17-04-2021
19-04-2021
20-04-2021
21-04-2021
22-04-2021
23-04-2021
24-04-2021
26-04-2021
27-04-2021
28-04-2021
29-04-2021
30-04-2021
In the wizard I chose date as the source data type, and the destination table column has data type date (I also tried datetime and datetime2, with the same trouble).
The import completed successfully, but the result is:
2021-01-04
2021-02-04
2021-03-04
2021-05-04
2021-06-04
2021-07-04
2021-08-04
2021-09-04
2021-10-04
2021-12-04
2021-04-13
2021-04-14
2021-04-15
2021-04-16
2021-04-17
2021-04-19
2021-04-20
2021-04-21
2021-04-22
2021-04-23
2021-04-24
2021-04-26
2021-04-27
2021-04-28
2021-04-29
2021-04-30
As you can see, days and months are mixed up, so the dates are wrong. Some rows are correct, but in other rows the month appears where the day should be.
What can I do?

Here is how to do it via T-SQL in SSMS.
As #Lamu pointed out, the input file's date format is ambiguous, so the following line specifies it explicitly:
SET DATEFORMAT DMY;
USE tempdb;
GO
DROP TABLE IF EXISTS dbo.tbl;
CREATE TABLE dbo.tbl
(
inputDate DATE
);
-- tell SQL Server the date format used in the *.csv file
SET DATEFORMAT DMY;
BULK INSERT dbo.tbl
FROM 'e:\temp\Faenno.csv'
WITH
(
DATAFILETYPE = 'char', -- or 'widechar' for UTF-16 files
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
CODEPAGE = '65001'
);
SELECT * FROM dbo.tbl;
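To see what SET DATEFORMAT changes in isolation, you can cast one of the ambiguous strings from the file under each setting (a quick sketch using a value from the question):

SET DATEFORMAT MDY;
SELECT CAST('02-04-2021' AS date); -- parsed as February 4th: 2021-02-04 (wrong)

SET DATEFORMAT DMY;
SELECT CAST('02-04-2021' AS date); -- parsed as April 2nd: 2021-04-02 (correct)

Strings like '13-04-2021' only parse as DMY (there is no month 13), which is why some of the imported rows were correct and others flipped.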

Related

How to create a backup table from a table with current date & time in sql server

How to create a backup table from a table with the current date & time in SQL Server.
The spaces in the date & time should be replaced by underscores.
For the backup I am doing this:
select * into BKP_TABLE_STUDENT
from TABLE_STUDENT
And for fetching the datetime I am using this:
select convert(varchar, getdate(), 0)
Above gives this format -> Mar 2 2022 4:02PM
So, I need to combine the above datetime and the table name.
E.g. Table name will be BKP_TABLE_STUDENT_Mar_2_2022_4_02PM
You can concatenate your table name with a formatted string.
I would prefer a 24-hour clock to AM/PM, but it's your call.
SELECT CONCAT('BKP_TABLE_STUDENT_',FORMAT(getdate(),'MMM_dd_yyyy_hh_mm_tt')) as backup_name
returns
BKP_TABLE_STUDENT_Mar_02_2022_11_24_AM
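Since a table name cannot be a variable in a plain SELECT ... INTO, you would then feed that generated name into dynamic SQL. A minimal sketch, assuming the TABLE_STUDENT source table from the question:

DECLARE @backupName sysname =
    CONCAT('BKP_TABLE_STUDENT_', FORMAT(GETDATE(), 'MMM_dd_yyyy_hh_mm_tt'));

DECLARE @sql nvarchar(max) =
    'SELECT * INTO ' + QUOTENAME(@backupName) + ' FROM TABLE_STUDENT;';

EXEC sys.sp_executesql @sql;

QUOTENAME guards against the generated name breaking the statement.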

Import text file data into SQL Server database

I have a text file in the format below.
I want to import it into a SQL Server database, splitting it into several columns:
Terminal, NetSales, NetAjustment, CancelsCnt, CancelAmount,
CashesCnt, CashesAmount, ClaimsCnt, ClaimsAmount, SalesCommission,
CashCommission, NetDue
I tried to insert the text file into SQL Server using SSIS, but it puts everything into one column instead of splitting it. I then used SQL scripting to split it into several columns, but that isn't working either.
I'm having some difficulty splitting the columns out of the text file.
Any ideas or help on how I can capture the column data in a proper format?
I would suggest using the SSIS Bulk Insert Task.
Bulk Insert Task in SSIS
Its functionality is identical to the T-SQL BULK INSERT statement.
It allows you to specify where the real first row starts via its FIRSTROW parameter.
Here is a conceptual example.
CREATE TABLE dbo.tbl (
Terminal VARCHAR(20),
NetSales VARCHAR(30),
NetAjustment VARCHAR(100),
CancelsCnt INT
...
);
BULK INSERT dbo.tbl
FROM 'e:\Temp\inputFile.csv'
WITH (FORMAT='CSV'
, DATAFILETYPE = 'char' -- { 'char' | 'native' | 'widechar' | 'widenative' }
, FIELDTERMINATOR = '\t' -- for a TAB
, ROWTERMINATOR = '\n'
, FIRSTROW = 8
, CODEPAGE = '65001');
-- test
SELECT * FROM dbo.tbl;

How to bulk insert with derived columns in SQL Server?

I'm a newbie to SQL Server!
I'm trying to perform a bulk insert into SQL Server. I have the following CSV file named input.csv:
NO,Name,age,Reference,dateTime,Category
1,Stack#mail,23,Kiop,2017-03-02T12:23:00,D
2,OverEnd#Yahoo,22,CSK,2017-030-03T12:23:00,I
I have to move that CSV file into SQL Server using BULK INSERT, into the table schema below:
create table BulkInsertTemp
(
no int,
name nvarchar(50),
age int,
Ref nvarchar(30),
currentDatetime datetime,
Category nvarchar(40)
)
Now I need to store in SQL like:
no Name age Ref currentDatetime category
--------------------------------------------------------
1 Stack 23 Kiop 2017-03-02 12:23:00 D
2 OverEnd 22 CSK 2017-03-03 12:23:00 I
I tried the query below for another table, to move it into SQL Server.
create table bulkInsert(no varchar(50),name varchar(50));
BULK INSERT bulkInsert
FROM 'C:\MyInput\BulkInsert\BulkInsertData.txt'
WITH
(FIRSTROW = 1,
ROWTERMINATOR = '\n',
FIELDTERMINATOR = ',',
ROWS_PER_BATCH = 10000)
My query works when there is no need to modify the data.
But for input.csv I have to change column values; for example, if the name is "Stack#mail" it should be stored as "Stack" in SQL.
I'm new to the bulk insert option, so I don't know how to derive columns from existing ones.
Can anyone please guide me on how to meet this requirement?
I would recommend building an SSIS package to do this. If you don't know how, or don't have time, you could run the SQL Server Import and Export Wizard, which will actually create an SSIS package for you behind the scenes.
Hope that gets you going in the right direction.
Noel
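If you'd rather stay in T-SQL instead of SSIS, another common pattern (a sketch, not what the wizard generates) is to BULK INSERT the raw file into a staging table and derive the columns on the way into the target:

-- Staging table matching the raw file layout
CREATE TABLE dbo.stg_input
(
    no VARCHAR(10), name VARCHAR(50), age VARCHAR(10),
    Ref VARCHAR(30), currentDatetime VARCHAR(30), Category VARCHAR(40)
);

BULK INSERT dbo.stg_input
FROM 'C:\MyInput\input.csv'
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- Derive columns while moving to the target:
-- keep only the part of name before the '#'
INSERT INTO dbo.BulkInsertTemp (no, name, age, Ref, currentDatetime, Category)
SELECT CAST(no AS int),
       LEFT(name, CHARINDEX('#', name + '#') - 1),
       CAST(age AS int),
       Ref,
       TRY_CAST(currentDatetime AS datetime),
       Category
FROM dbo.stg_input;

Appending '#' inside CHARINDEX makes names without a '#' pass through unchanged, and TRY_CAST (SQL Server 2012+) turns unparseable datetimes into NULL rather than failing the whole load.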

Error of importing data from txt file to IBM netezza SQL database due to date format

I need to upload data from a text file into a Netezza table. It is not working because the date format differs between the Netezza table and the text file: the Netezza column is a date only, but in the text file it is a datetime.
Is there any way to convert the datetime into a date while uploading?
Below is the file format with one row of data -
AS_OF_DATE|ID
10/01/2015 00:00:00|40
Below is the Netezza query I am using to upload:
INSERT INTO
LND_FINANCE_CUSTOMER
(AS_OF_DATE,ID)
SELECT
*
FROM EXTERNAL 'D:123.txt'
USING (
QUOTEDVALUE DOUBLE
DELIMITER '|'
MAXERRORS 4
DATESTYLE MDY
DATEDELIM '/'
MAXROWS 0
Y2BASE 2000
ENCODING internal
REMOTESOURCE ODBC
ESCAPECHAR '\');
0 rows are inserted because of the date format. If I manually change the datetime format to date only in the text file, then it works fine.
If you force the external file specification to timestamp, the load should work fine. It will be cast back to a date on the insert into LND_FINANCE_CUSTOMER. See the code below:
INSERT INTO LND_FINANCE_CUSTOMER (AS_OF_DATE,ID)
SELECT * FROM EXTERNAL 'D:\123.txt'
(AS_OF_DATE timestamp
,ID integer)
USING ( QUOTEDVALUE DOUBLE DELIMITER '|' MAXERRORS 4 DATESTYLE MDY DATEDELIM '/' MAXROWS 0 Y2BASE 2000 ENCODING internal REMOTESOURCE ODBC ESCAPECHAR '\');

bulk insert a date in YYYYMM format to date field in MS SQL table

I have a large text file (more than 300 million records). There is a field containing date in YYYYMM format. Target field is of date type and I'm using MS SQL 2008 R2 server. Due to huge amount of data I'd prefer to use bulk insert.
Here's what I've already done:
bulk insert Tabela_5
from 'c:\users\...\table5.csv'
with
(
rowterminator = '\n',
fieldterminator = ',',
tablock
)
select * from Tabela_5
201206 in file turns out to be 2020-12-06 on server, whereas I'd like it to be 2012-06-01 (I don't care about the day).
Is there a way to bulk insert date in such format to a field of date type?
kind regards
maciej pitucha
Run SET DATEFORMAT ymd before the bulk insert statement.
Note that yyyy-mm-dd and yyyy-dd-mm are not generally safe in SQL Server, which is what causes this. Always use yyyymmdd. For more, see "best way to convert and validate a date string" (the comments especially highlight misunderstandings here).
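A quick illustration of why the dashed form is language-dependent for datetime while yyyymmdd is not (sketch):

SET LANGUAGE british; -- sets DATEFORMAT dmy
SELECT CAST('2012-06-01' AS datetime); -- read as yyyy-dd-mm: 6 January 2012
SELECT CAST('20120601'   AS datetime); -- always 1 June 2012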
There isn't a way to do this with BULK INSERT alone. You have two options:
1. Insert the date into a varchar field, then run a query to convert it into a datetime field
2. Use SSIS
You may not care about the day, but SQL does.
YYYYMM is not a valid SQL date.
A date must have a day component.
In your example it was parsed as YYMMDD.
You could insert into a varchar as Jaimal proposed, then append a 01 and convert.
I would read the data in .NET, append the 01, use DateTime.ParseExact, and insert row by row asynchronously. You can catch any parse that is not successful.
Or you might be able to do a global replace in the CSV of "," to "01,". It is worth a try.
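The varchar-staging approach suggested above might look like this (a sketch; Tabela_5_stg and the dateField column are hypothetical names, and style 112 is the unambiguous yyyymmdd format):

-- Stage the raw YYYYMM values as text
CREATE TABLE Tabela_5_stg (rawDate varchar(6) /* , ... other columns ... */);

BULK INSERT Tabela_5_stg
FROM 'c:\users\...\table5.csv'
WITH (rowterminator = '\n', fieldterminator = ',', tablock);

-- Append '01' as the day, giving yyyymmdd, then convert
INSERT INTO Tabela_5 (dateField)
SELECT CONVERT(date, rawDate + '01', 112)
FROM Tabela_5_stg;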