SQL DATETIME Insert from Excel? - sql-server

So I'm having a rather strange problem. I have a column (let's say column A) in Excel with data that looks like this:
4/11/2015 10:14
I have a bunch of other columns, but in the SQL INSERT statement I build within Excel, the data (when copied out) looks like this:
42105.4561921296
The ="INSERT INTO TABLE VALUES ('"&A1&"', Etc....)" is in the data format of "general" and the Date column is in the format of "Custom" where there is a M/DD/YYYY MM/HH type within.
The SQL column is of data type DATETIME, so of course it doesn't accept the odd number it receives.
Any ideas? Changing the format of the "SQL INSERT" column doesn't change the result.

You are right - Excel formats only change the way the numbers are displayed, not the underlying value of the cell. In this case, the value of the cell is an Excel date-time serial number: the number of days since January 1, 1900, with a decimal fraction for the time.
I'd recommend using Excel's TEXT function to convert Excel's numeric date-time value to a text string that can be inserted into SQL:
Instead of:
INSERT INTO TABLE VALUES ('"&A1&"', Etc....)"
Try:
INSERT INTO TABLE VALUES ('"&TEXT(A1,"YYYY-MM-DD HH:MM")&"', Etc...)"

The best format for inserting a date-time into a DATETIME column is to present the date and time as YYYY-MM-DD Hour:Minute, e.g. 2015-04-15 12:52.
To build such a datetime string in Excel you can use this set of Excel functions:
(where A1 contains the datetime to be saved)
=YEAR(A1)&"-"&RIGHT("00"&MONTH(A1),2)&"-"&RIGHT("00"&DAY(A1),2)&" "&RIGHT("00"&HOUR(A1),2)&":"&RIGHT("00"&MINUTE(A1),2)&":"&RIGHT("00"&SECOND(A1),2)

Related

When using SSIS to import a CSV file, how can you replace missing dates with the current UTC datetime?

I am creating an SSIS package to import a CSV file into a SQL Server table.
The CSV file has a date column and the value will be missing from some of the rows.
The missing value is represented by two consecutive commas (i.e. val1,,val3).
When the value is missing, I want to insert the current date and time in UTC.
Within a Derived Column Transformation I am using:
REPLACENULL(DateCreated, GetUtcDate())
This doesn't work; instead, the value 0001-01-01 00:00:00.0000000 is being inserted.
Details:
The SQL Server table column has the data type datetime2(7)
The flat file connection manager uses database timestamp with precision [DT_DBTIMESTAMP2]
How can I replace the missing values with the current UTC datetime?
The column is probably not NULL, so the REPLACENULL isn't triggering.
The default flat file source behavior is to replace empty column values with an empty string or empty date. Double-check your flat file source and ensure that you are importing empty columns as NULLs (the "Retain null values from the source as null values in the data flow" option).
REPLACENULL will then see the column as NULL and replace it with GetUtcDate()
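If the flat file source setting can't be changed, a post-load patch in T-SQL is a possible fallback (a sketch, not part of the original answer; the table and column names are hypothetical): overwrite the sentinel value reported in the question with the current UTC time.
UPDATE dbo.ImportedRows
SET    DateCreated = SYSUTCDATETIME()   -- current UTC time at datetime2 precision
WHERE  DateCreated = '0001-01-01';      -- the value the failing expression produced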

How to write a WHERE condition for a column with a decimal datatype in SQL

I have a date column of type "PIC S9(7) COMP-3" in IBM DB2 on AS/400.
When I bring these values into SQL Server, the column's data type becomes decimal(7,0); this is just to keep the DB2 datatype similar in SQL Server.
Now I would like to find out whether there is any "space" or "numeric" value in this particular date column.
The date column is like this:
DATE
-------
4040404
(a space)
404040
2020202
(a space)
202020
A WHERE condition like this gives the error
"Error converting data type varchar to numeric."
Select ID, DATE
from Table1
Where DATE = ''
How to resolve this?
Because spaces occur in the column, the data really needs to be treated as varchar.
Explicitly convert the column to varchar (to avoid data type conversion errors) and then check for stray spaces or digits using wildcards, e.g.
where Convert(varchar(10), Column_Date) like '%[ ]%' or
where Convert(varchar(10), Column_Date) like '%[0-9]%' depending on what you want to filter out.
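A minimal sketch of that idea (the table variable here is hypothetical; the real column arrived from DB2 as decimal(7,0)):
Declare @Table1 Table (ID int, [DATE] decimal(7,0) NULL)
Insert Into @Table1 Values (1, 4040404), (2, NULL), (3, 2020202)
-- Converting to varchar first avoids the varchar-to-numeric conversion error;
-- a NULL converts to NULL and simply drops out of the result
Select ID, [DATE]
From   @Table1
Where  Convert(varchar(10), [DATE]) like '%[0-9]%'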

SSIS Convert m/dd/yyyy to yyyymmdd with inconsistencies

I'm loading many files into a SQL Server database. I have one flat file that has a date column coming in as string [DT_STR].
I have TWO "date fields" in my database. One is varchar, one is datetime.
Converting the datetime column is no issue; I just use Data Conversion/Derived Column if necessary. However, this varchar column is giving me trouble. Our database values for this column should be in yyyymmdd format. However, in this single file the format of the dates changes.
Normally I'd use a SUBSTRING(...) expression here, but the difficulty is that the format of these dates varies. Some examples of values could be:
08/16/2017
8/16/2017
08/6/2017
08/06/2017
10/6/2017
10/06/2017
This makes the challenge harder. I tried LEN([DATE]) == NUM_HERE ? do_THING : OTHER_CALC, but this approach failed because the length of 10/6/2017 is the same as 8/06/2017, which gives me the wrong result. Does anyone have a good workaround for this?
Perhaps a simple convert to date and then into the final format. If 2012+, use try_convert() to trap any bogus dates.
Example
Declare @YourTable Table ([SomeCol] varchar(50))
Insert Into @YourTable Values
('08/16/2017')
,('8/16/2017')
,('08/6/2017')
,('08/06/2017')
,('10/6/2017')
,('10/06/2017')
Select *
,Formatted = convert(varchar(8),convert(Date,SomeCol),112)
from @YourTable
Returns
SomeCol Formatted
08/16/2017 20170816
8/16/2017 20170816
08/6/2017 20170806
08/06/2017 20170806
10/6/2017 20171006
10/06/2017 20171006
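And a quick illustration of the try_convert() safety net mentioned above (SQL Server 2012+, assuming default US date settings): a bogus date yields NULL instead of raising an error.
Select try_convert(date, '02/30/2017')                            -- NULL: February 30th doesn't exist
      ,convert(varchar(8), try_convert(date, '8/16/2017'), 112)   -- 20170816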
Convert the varchar data to datetime, then convert that to a formatted string:
SELECT CONVERT(varchar(8), CONVERT(datetime, '8/6/2017'), 112)

SQL Server - data lost when converted datetime to varchar

I had to restore a table that contained a datetime column.
I used BULK INSERT to insert the data from a CSV file. However, the import couldn't insert the datetime values because SQL Server saw them as a different format.
I ended up modifying the table, removing the datetime data type and replacing it with a varchar.
The issue is the data got converted from this format: 7/15/2015 3:41:57 PM to something like this: 47:47.0
Is there a way I can convert these values back or is the data lost?
As @ChrisSteele mentioned, your data is hosed. It likely got this way because of Excel's cool feature of converting date-time strings to numbers. Try re-saving the original file in Notepad, or changing the format of the column from Date/Datetime to Text if you're using Excel.

bulk insert a date in YYYYMM format to date field in MS SQL table

I have a large text file (more than 300 million records). There is a field containing a date in YYYYMM format. The target field is of date type and I'm using MS SQL Server 2008 R2. Due to the huge amount of data I'd prefer to use BULK INSERT.
Here's what I've already done:
bulk insert Tabela_5
from 'c:\users\...\table5.csv'
with
(
rowterminator = '\n',
fieldterminator = ',',
tablock
)
select * from Tabela_5
201206 in the file turns out to be 2020-12-06 on the server, whereas I'd like it to be 2012-06-01 (I don't care about the day).
Is there a way to bulk insert date in such format to a field of date type?
kind regards
maciej pitucha
Run SET DATEFORMAT ymd before the bulk insert statement
Note that yyyy-mm-dd and yyyy-dd-mm are not safe in SQL Server generally: the session's language/DATEFORMAT settings are what cause this. Always use yyyymmdd. For more, see "best way to convert and validate a date string" (the comments especially highlight misunderstandings here)
There isn't a way to do this with BULK INSERT alone. You have two options:
1. Insert the date into a varchar field, then run a query to convert it into a datetime field
2. Use SSIS
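A sketch of option 1 (the staging table and column names are hypothetical): bulk insert the raw YYYYMM text into a varchar column, then append '01' and convert with the yyyymmdd style.
create table dbo.Tabela_5_staging (RawDate varchar(6))
-- after BULK INSERT into the staging table:
select convert(date, RawDate + '01', 112) as ParsedDate  -- style 112 = yyyymmdd, so 201206 becomes 2012-06-01
from dbo.Tabela_5_staging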
You may not care about the day, but SQL does.
YYYYMM is not a valid SQL date.
A date must have a day component.
In your example it was parsed as YYMMDD.
You could insert into a varchar as Jaimal proposed, then append a 01 and convert.
I would read the data in .NET, append the 01, use DateTime.ParseExact, and insert row by row asynchronously. You can catch any parse that is not successful.
Or you might be able to do a global replace in the CSV of "," with "01,". It is worth a try.
