I am trying to combine a SmallDateTime field and a Time value (result of a scalar-valued function) into a DateTime and I keep getting the following error:
Conversion failed when converting date and/or time from character
string.
Here are the variables used throughout:
DECLARE @STARTDATETIME AS DATETIME
DECLARE @ENDDATETIME AS DATETIME
SELECT @STARTDATETIME = '8/29/2016 12:00:00'
SELECT @ENDDATETIME = '8/30/2016 12:00:00'
Column definitions:
FT_START_DATE SmallDateTime
FT_END_DATE SmallDateTime
FT_START_TIME Int
FT_END_TIME Int
The date fields do not contain a time portion. The time fields are basically 24-hour time without the colon dividers. (Example: 142350 = 14:23:50)
Here's the function that is called in my queries:
USE [PWIN171]
GO
/****** Object: UserDefinedFunction [dbo].[dbo.IPC_Convert_Time] Script Date: 9/13/2016 4:50:49 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER FUNCTION [dbo].[dbo.IPC_Convert_Time]
(
@time int
)
RETURNS time
AS
BEGIN
DECLARE @Result time
SELECT @Result = CONVERT(time
, STUFF(
STUFF(
RIGHT('000000' + CONVERT(varchar(6), @time), 6)
, 5, 0, ':')
, 3, 0, ':')
)
RETURN @Result
END
Example 1 - Fails:
This is what I'm after in general.
SELECT * FROM FT WITH (NOLOCK)
WHERE
CAST(FT_END_DATE AS DATETIME) + DBO.[dbo.IPC_Convert_Time](FT_END_TIME) BETWEEN @STARTDATETIME AND @ENDDATETIME;
Example 2 - Works:
This one runs, but it won't return records from 8/29 because FT_END_DATE has no time portion, so those rows fall before 12:00:00 on 8/29.
SELECT * FROM FT WITH (NOLOCK)
WHERE
FT_END_DATE BETWEEN @STARTDATETIME AND @ENDDATETIME
AND CAST(FT_END_DATE AS DATETIME) + DBO.[dbo.IPC_Convert_Time](FT_END_TIME) BETWEEN @STARTDATETIME AND @ENDDATETIME;
I suppose I could do one where I split apart my parameters and check that the end time is between the time portion of the parameters as well, but that seems to be a step in the wrong direction. The error only seems to appear when there is no other usage of FT_START_DATE or FT_END_DATE in the WHERE clause.
The time converting function works fine in every scenario I have created. I have even tried Example 2 with parameters that would give it 100% overlap with the data covered by Example 1 in case there was bad data causing the error, but it runs fine.
I also don't know exactly where the error is occurring, because it only references the line the select statement begins on, and not the actual location in the code.
Why does it behave like this?
UPDATE:
TIMEFROMPARTS is not available because this is on SQL Server 2008
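On 2012+ the conversion function could probably be replaced with something like this (just a sketch, not an option here on 2008):
SELECT TIMEFROMPARTS(FT_END_TIME / 10000,       -- hours   (14 from 142350)
                     (FT_END_TIME / 100) % 100, -- minutes (23)
                     FT_END_TIME % 100,         -- seconds (50)
                     0, 0)                      -- fractions, precision
FROM FT;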
If I understand this correctly, this can be done much more simply:
Try this:
DECLARE @d DATE=GETDATE();
DECLARE @t TIME=GETDATE();
SELECT @d;
SELECT @t;
SELECT CAST(@d AS datetime)+CAST(@t AS datetime);
A pure date and a pure time can simply be added to combine them...
UPDATE: Read your question again...
Try this
SELECT FT_END_DATE
,FT_END_TIME
,CAST(FT_END_DATE AS DATETIME) + DBO.[dbo.IPC_Convert_Time](FT_END_TIME) AS CombinedTime
,*
FROM FT
to see if your attempt is doing the right thing.
If yes, it might help to create a CTE and do the filter on the named column (see the sketch below).
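A rough sketch of that idea (cte and CombinedTime are just placeholder names, untested against your table):
WITH cte AS
(
    SELECT *
          ,CAST(FT_END_DATE AS DATETIME) + DBO.[dbo.IPC_Convert_Time](FT_END_TIME) AS CombinedTime
    FROM FT
)
SELECT * FROM cte
WHERE CombinedTime BETWEEN @STARTDATETIME AND @ENDDATETIME;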
Sometimes the engine does not work in the order you would expect.
As CTEs are fully inlined, it is quite possible that this will not help...
SQL Server is well known for bringing up such errors, because a type check can happen before the conversion takes place...
It might be an idea to use the given SELECT with INTO #tbl to push the result set into a new table and do your logic from there...
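A sketch of that temp-table approach (#tbl and CombinedTime are just placeholder names):
SELECT *
      ,CAST(FT_END_DATE AS DATETIME) + DBO.[dbo.IPC_Convert_Time](FT_END_TIME) AS CombinedTime
INTO #tbl
FROM FT;

SELECT * FROM #tbl
WHERE CombinedTime BETWEEN @STARTDATETIME AND @ENDDATETIME;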
Related
I'm getting an error when I try to run a simple aggregating query.
SELECT MAX(CAST(someDate as datetime)) AS MAX_DT FROM #SomeTable WHERE ISDATE(someDate) = 1
ERROR: Conversion failed when converting date and/or time from character string.
Non-date entries should be removed by the WHERE clause, but that doesn't seem to be happening. I can work around it with an explicit CASE statement inside the MAX(), but I don't want to hack up the query if I can avoid it. If I use a lower COMPATIBILITY_LEVEL, it works fine. If I have fewer than 2^17 rows, it works fine.
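The CASE workaround I mentioned looks roughly like this (guarding the cast per row):
SELECT MAX(CASE WHEN ISDATE(someDate) = 1 THEN CAST(someDate AS datetime) END) AS MAX_DT
FROM #SomeTable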
-- SQLServer 15.0.4043.16
USE AdventureWorks;
GO
ALTER DATABASE AdventureWorks SET COMPATIBILITY_LEVEL = 150;
GO
-- delete temp table if exists
DROP TABLE IF EXISTS #SomeTable;
GO
-- create temp table
CREATE TABLE #SomeTable (
someDate varchar(20) DEFAULT GETDATE()
);
-- load data, need at least 2^17 rows with at least 1 bad date value
INSERT #SomeTable DEFAULT VALUES;
DECLARE @i int = 0;
WHILE @i < 17
BEGIN
INSERT INTO #SomeTable (someDate) SELECT someDate FROM #SomeTable
SET @i = @i + 1;
END
GO
-- create invalid date row
WITH cteUpdate AS (SELECT TOP 1 * FROM #SomeTable)
UPDATE cteUpdate SET someDate='NOT_A_DATE'
-- error query
SELECT MAX(CAST(someDate as datetime)) AS MAX_DT
FROM #SomeTable
WHERE ISDATE(someDate) = 1
--ERROR: Conversion failed when converting date and/or time from character string.
-- delete temp table if exists
DROP TABLE IF EXISTS #SomeTable;
GO
I would recommend try_cast() rather than isdate():
SELECT MAX(TRY_CAST(someDate as datetime)) AS MAX_DT
FROM #SomeTable
This is a much more reliable approach: instead of relying on some heuristic to guess whether the value is convertible to a datetime (as isdate() does), try_cast actually attempts the conversion and returns null if it fails - which the aggregate function max() happily ignores.
try_cast() (and its sister function try_convert()) is a very handy function that many other databases are missing.
I actually just encountered the same issue (except I did cast to float). It seems that the SQL Server 2019 Optimizer sometimes (yes, it's not reliable) decides to execute the calculations in the SELECT part before it applies the WHERE.
If you set the compatibility level to a lower version, this also results in a different optimizer being used (the engine always uses the optimizer of the compatibility level). Older query optimizers seem to always execute the WHERE part first.
It seems lowering the compatibility level is already the best solution, unless you want to replace every CAST with TRY_CAST (which would also mean you won't spot actual errors as easily, such as a faulty WHERE that causes your calculation to return NULL instead of the correct value).
We have a job with a couple of steps, and almost all of the steps use getdate(); instead we want to get the date from a specific table and column. The table includes only two columns: status, which is always 'ready' (doesn't change), and statusdate (dynamic). The plan is to create a stored procedure and replace the getdate() calls with that stored procedure.
How do I write the stored procedure? How do I declare a variable?
CREATE PROCEDURE SP_DATE
@StatusDate DATETIME
AS
BEGIN
SELECT StatusDate
FROM [DB_Name].[dbo].[Table_Name]
WHERE status = 'ready'
END
Thank you!
Your jobs use the getdate() function, so in order to replace it with a custom programmatic object you should use a function as well, not a stored procedure. With a function like this
CREATE FUNCTION dbo.StatusDate ()
RETURNS DATETIME
AS
BEGIN
RETURN (SELECT
StatusDate
FROM Table_Name
WHERE status = 'ready')
END
you can replace getdate directly
SELECT
id
FROM Your_Job_List yjl
WHERE yjl.aDate < dbo.StatusDate()--getdate()
Yet there are some questions about the design. The single biggest task of an RDBMS is joining tables, and perhaps a query similar to the next one might be better:
SELECT
id
FROM Your_Job_List yjl
,Table_Name tn
WHERE yjl.aDate < tn.StatusDate
AND tn.status = 'ready'
CREATE PROCEDURE spRunNextDate
AS
BEGIN
--SET NOCOUNT ON
declare @runDate datetime
select @runDate = MIN(StatusDate)
from [DB_Name].[dbo].[Table_Name]
where [status] = 'ready'
IF (@runDate IS NULL)
BEGIN
PRINT 'NO Dates in Ready State.'
RETURN 0
END
PRINT 'Will Run Date of ' + CAST(@runDate as varchar(20))
-- Notice the MIN or MAX usage above to get only one date per "run"
END
GO
There are huge holes and questions raised in my presumptuous sp above, but it might get you thinking about why your question implies that there is no parameter. You are going to need a way to mark the day "done" or else it will be run over and over.
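For example, the last step inside spRunNextDate might flag the row so the same date is not picked up again (the 'done' value is only an assumption about your schema):
UPDATE [DB_Name].[dbo].[Table_Name]
SET [status] = 'done'          -- assumed value; use whatever marks the row as processed
WHERE [status] = 'ready'
AND StatusDate = @runDate;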
I have a simple stored procedure in which I want to set two variables, one for the current time and the other for the current date. I need them in hhmm format for the time and yyyyMMdd format for the date.
Here is the code I have so far:
BEGIN
DECLARE @d AS DATETIME
DECLARE @t as TIME
SET @d = GETDATE()
SET @t = SYSDATETIME()
But everything I've tried to use to change the format of those 2 variables does not help me out. The only examples I've found online are for formatting values in regular queries.
Can anyone point me in the right direction as far as what I should do to get these values? Thanks in advance.
If 2012+ you can use Format()
Example
DECLARE @d as varchar(8) = format(GetDate(),'yyyyMMdd')
DECLARE @t as varchar(4) = format(SYSDATETIME(),'HHmm') -- use hh for 12 hour time
Select @d,@t
Returns
(No column name) (No column name)
20180511 1738
For 2005
DECLARE @d as varchar(8) = convert(varchar(10),GetDate(),112)
DECLARE @t as varchar(8) = replace(left(convert(varchar(25),SYSDATETIME(),108),5),':','')
I have a SQL table with one varchar data type column, having dates in different formats.
Example Data:
2017-06-30 09:40:05.130,
2017-08-21 12:52:23.063000000,
26/4/2016 00:00:00,
5/4/2016 00:00:00
I have a requirement to load data from this column (varchar data type) into another SQL table column with a datetime data type.
I tried the following queries:
Select
convert(datetime,convert(char(19),'2017-08-21 12:52:23.063000000'))
Select
convert(datetime,convert(char(19),'26/4/2016 00:00:00'),103)
Can anybody help me with a single query that converts dates in all of these different formats?
It's an ugly task, but this is the best I can think of:
select convert(datetime2, '9/20/2017')
select convert(datetime2, '2017-06-30 09:40:05.130')
select convert(datetime2, '2017-08-21 12:52:23.063000000')
select convert(datetime, '5/4/2016 00:00:00')
However, keep in mind, as @Eli pointed out in the comments, your second example might lose precision. Also, it won't handle '26/4/2016 00:00:00'.
The above examples will return as follows (in same order):
2017-09-20 00:00:00.0000000
2017-06-30 09:40:05.1300000
2017-08-21 12:52:23.0630000
2016-05-04 00:00:00.000
You need error handling to skip failed conversions, so you also need a cursor to handle each row of data. That way you process the values that can be converted and deal with the values that cannot be converted however you see fit, without having your loop interrupted by a conversion error. A small example:
DECLARE @tbl TABLE (dat NVARCHAR(64))
INSERT INTO @tbl (dat) VALUES ('2017-06-30 09:40:05.130')
INSERT INTO @tbl (dat) VALUES ('26/4/2016 00:00:00')
INSERT INTO @tbl (dat) VALUES ('2017-08-21 12:52:23.063000000')
INSERT INTO @tbl (dat) VALUES ('5/4/2016 00:00:00')
DECLARE @cdat NVARCHAR(64)
DECLARE c CURSOR LOCAL FAST_FORWARD READ_ONLY FOR
SELECT dat FROM @tbl
OPEN c
FETCH NEXT FROM c INTO @cdat
WHILE @@FETCH_STATUS = 0 BEGIN
BEGIN TRY
/*convert and do the job*/
SELECT CONVERT(DATETIME2, @cdat) --into normal table
END TRY
BEGIN CATCH
/*handle failed conversions*/
SELECT @cdat --into fails table?
PRINT 'failed conversion, data: ' + @cdat
END CATCH
FETCH NEXT FROM c INTO @cdat
END
CLOSE c
DEALLOCATE c
If you are using SQL Server 2012 or above, you can use the TRY_PARSE function.
This is not a native SQL function; rather, it depends on the .NET runtime. It also lets you parse data culture-wise, which is not possible with TRY_CONVERT or TRY_CAST. It may create performance overhead, but it tries its best to parse the data to the specified data type.
declare @t table (dates nvarchar(50));
insert into @t values
('2017-06-30 09:40:05.130'),
('2017-08-21 12:52:23.063000000'),
('26/4/2016 00:00:00'),
('5/4/2016 00:00:00') ;
select
try_parse(dates as datetime) parsed_dates ,
dates
from @t;
--result
parsed_dates dates
30.06.2017 09:40:05 2017-06-30 09:40:05.130
21.08.2017 12:52:23 2017-08-21 12:52:23.063000000
NULL 26/4/2016 00:00:00
04.05.2016 00:00:00 5/4/2016 00:00:00
see demo
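The culture argument is what would handle the dd/MM/yyyy row that came back NULL above; for example, assuming those rows really are day-first (British 'en-GB' culture):
select
try_parse(dates as datetime using 'en-GB') parsed_dates_gb,
dates
from @t;
-- '26/4/2016 00:00:00' now parses, but note that 'en-GB' reads '5/4/2016' as 5 April,
-- so you must know which culture each row was written in.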
I tried the following query; it worked for me for the different formats in my case:
declare @col varchar(100)='2017-08-21 12:52:23.063000000'
Select
case
when isdate(convert(char(19),@col))=1 then convert(datetime,convert(char(19),@col))
else convert(datetime,convert(char(19),@col),103)
end Col
I tried it with all of the dates above, and it worked.
I have created two functions to work with ISO 8601 dates:
CREATE FUNCTION IPUTILS_STR_TO_ISODATE (
@isostr VARCHAR(30))
RETURNS DATETIME
AS
BEGIN
RETURN CONVERT(DATETIME, @isostr, 126);
END;
GO
CREATE FUNCTION IPUTILS_ISODATE_TO_STR (
@date VARCHAR(30))
RETURNS VARCHAR(30)
AS
BEGIN
DECLARE @result VARCHAR(30);
SET @result = CONVERT(VARCHAR(30), @date, 126);
RETURN @result;
END;
GO
I can't get them to work correctly for some reason. If I do:
select dbo.IPUTILS_ISODATE_TO_STR(dbo.IPUTILS_STR_TO_ISODATE('1965-04-28T12:47:43'));
I get:
apr 28 1965 12:47PM
instead of:
1965-04-28T12:47:43
if I do:
select convert(VARCHAR(30), dbo.IPUTILS_STR_TO_ISODATE('1965-04-28T12:47:43'), 126);
I get:
1965-04-28T12:47:43
Is this a bug or am I doing something wrong?
Why are you not testing these functions individually first and then in combination? If you do test them individually you will likely see the problem ;-). Check the datatype of the @date input parameter on the IPUTILS_ISODATE_TO_STR function: it is VARCHAR(30) instead of DATETIME.
Having the incorrect datatype for the input parameter means that an implicit conversion from a real DATETIME into VARCHAR, without a specified "style", happens as the value comes into the function. This is the same as doing CONVERT(VARCHAR(30), @date). And the result of this implicit conversion (i.e. the value stored in @date) is what gets passed to the SET @result = CONVERT(VARCHAR(30), @date, 126); line.
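So the fix, if you keep the function, is simply to accept a DATETIME; a sketch of the corrected definition:
ALTER FUNCTION IPUTILS_ISODATE_TO_STR (
@date DATETIME)  -- was VARCHAR(30)
RETURNS VARCHAR(30)
AS
BEGIN
RETURN CONVERT(VARCHAR(30), @date, 126);
END;
GO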
Also, I would suggest not doing this in the first place (i.e. not creating either of these functions) if they are going to be used in SELECT statements or WHERE clauses. Using the CONVERT() function in those places is repetitive, but also much faster. T-SQL scalar UDFs and multi-statement TVFs do not perform well, and you can slow down your queries by using them. In this particular case there is no real computation or formula being done, so you aren't really gaining much beyond not needing to remember the "style" number. Also, T-SQL functions prevent the query from getting a parallel execution plan. But if these are just being used in simple SET statements to manipulate a variable that is then used in a query, that should be fine.
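For comparison, the inline equivalent without any UDF (style 126 is the ISO 8601 style already used above):
-- string to DATETIME, and back to the ISO 8601 string:
SELECT CONVERT(DATETIME, '1965-04-28T12:47:43', 126);
SELECT CONVERT(VARCHAR(30), CONVERT(DATETIME, '1965-04-28T12:47:43', 126), 126);  -- 1965-04-28T12:47:43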