How to compare Time value in table with Current Time in SQL? - sql-server

I have a table named TimeList:
| Slot |
==============
| 10:00 |
| 11:00 |
| 12:00 |
| 13:00 | and so on
The times are saved as varchar(5).
The desired result should show the rows whose time is later than the current time. For example, if the current time is 11:12 A.M., the result should be:
| Slot |
==============
| 12:00 |
| 13:00 |
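For anyone wanting to reproduce this, a minimal setup matching the description (table and column names from the question, sample data assumed) would be:
CREATE TABLE TimeList (Slot varchar(5));
INSERT INTO TimeList (Slot)
VALUES ('10:00'), ('11:00'), ('12:00'), ('13:00');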
I tried converting the two values to time and comparing them:
SELECT *
FROM TimeList
WHERE Convert(Time, Slot) > Convert(Time, GETDATE())
But it didn't work; the error said that Time is not a recognizable format in SQL.
Is there any way I could compare the two time slots?

Depends on the version of SQL Server you're running. There is a CAST(.. as time) in SQL Server 2008 or later. So... to compare the current date/time with the TimeList where the times are converted to "time, if it were today," something like this should work:
SELECT *
FROM TimeList
WHERE Convert(Datetime, FORMAT (GETDATE(), 'd') + ' ' + Slot) > GETDATE()
Alternatively, if you want to compare the times to the current time as text (note the capital HH - FORMAT's lowercase hh is a 12-hour clock, which would break the comparison for afternoon slots):
SELECT *
FROM TimeList
WHERE Slot > FORMAT(GETDATE(), N'HH\:mm')

Try this:
SELECT *
FROM TimeList
WHERE Slot > CONVERT(time,GETDATE())

Thank you very much for all the answers; fortunately, I found the answer to my question, inspired by your answers.
The solution is:
SELECT *
FROM TimeList
WHERE Slot > CONVERT(varchar(5),GETDATE(), 108)
It seems that style 108 converts the value to the char/varchar time format (hh:mm:ss) that Slot is stored in, and varchar(5) keeps just the hh:mm part.
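As a quick sanity check (a hand-run sketch, not from the original post): style 108 renders the time as zero-padded 24-hour text, so plain string comparison orders chronologically.
-- Style 108 yields 'hh:mm:ss'; varchar(5) truncates that to 'hh:mm'
SELECT CONVERT(varchar(5), GETDATE(), 108) AS NowAsText -- e.g. '11:12'
-- Zero-padded 24-hour strings compare in chronological order:
SELECT CASE WHEN '12:00' > '11:12' THEN 'later' ELSE 'earlier' END -- 'later'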

Related

Summing over a numeric column with moving window of varying size in Snowflake

I have a sample dataset given as follows:
time | time_diff | amount
time1 | time1-time2 | 1000
time2 | time2-time3 | 2000
time3 | time3-time4 | 3000
time4 | time4-time5 | 4500
time5 | NULL | 1000
Quick explanation: the first column gives the time of the transaction, the second column gives the difference from the next row, i.e. the transaction interval (in hours), and the third column gives the money made in a particular transaction. We have sorted the data in ascending order using the time column.
Some values are given as;
time | time_diff | amount
time1 | 2. | 1000
time2 | 3. | 2000
time3 | 1. | 3000
time4 | 19. | 4500
time5 | NULL | 1000
The goal is to find, for a given time, the total of the transactions which occurred within 24 hours of that transaction. For example, the output for time1 should be 1000+2000+3000=6000, because if we add the value at time4, the total time interval becomes 25 hours, hence we omit the value of 4500 from the sum.
Example output:
time | amount
time1 | 6000
time2 | 9500
time3 | 7500
time4 | 4500
time5 | 1000
The concept of a moving window sum should work, to my knowledge, but here the width of the window is variable. That's the challenge I am facing. Can I kindly get some help here?
You could ignore the time_diff column and use a theta self-join based on a timestamp range, like this:
WITH srctab AS ( SELECT TO_TIMESTAMP_NTZ('2020-04-15 00:00:00') AS "time", 1000::INT AS "amount"
UNION ALL SELECT TO_TIMESTAMP_NTZ('2020-04-15 00:02:00'), 2000::INT
UNION ALL SELECT TO_TIMESTAMP_NTZ('2020-04-15 00:05:00'), 3000::INT
UNION ALL SELECT TO_TIMESTAMP_NTZ('2020-04-15 00:06:00'), 4500::INT
UNION ALL SELECT TO_TIMESTAMP_NTZ('2020-04-16 00:01:00'), 1000::INT
)
SELECT t1."time", SUM(t2."amount") AS tot
FROM srctab t1
JOIN srctab t2 ON t2."time" BETWEEN t1."time" AND TIMESTAMPADD(HOUR, +24, t1."time")
GROUP BY t1."time"
ORDER BY t1."time";
Minor detail: if your second column gives the time difference with the next row then I'd say the first value should be 10500 (not 6000) because it's only your 5th transaction of 1000 which is more than 24 hours ahead... I'm guessing your actual timestamps are at 0, 2, 5, 6 and 25 hours?
Another option might be to use a sliding WINDOW function by tweaking your transactional data to include each hour.
It's perhaps overkill, but it might be a useful technique.
First, generate a placeholder for each hour using the timestamps. I utilised TIME_SLICE to map each timestamp into neat hour blocks, and GENERATOR with DATEADD to back-fill each hour, putting a zero in where no transactions took place.
So now I can use the sliding window function knowing that I can safely choose the 23 preceding hours.
Copy|Paste|Run
WITH SRCTAB AS (
SELECT TO_TIMESTAMP_NTZ('2020-04-15 00:00:00') AS TRANS_TS, 1000::INT AS AMOUNT
UNION ALL SELECT TO_TIMESTAMP_NTZ('2020-04-15 02:00:00'), 2000::INT
UNION ALL SELECT TO_TIMESTAMP_NTZ('2020-04-15 05:00:00'), 3000::INT
UNION ALL SELECT TO_TIMESTAMP_NTZ('2020-04-15 06:00:00'), 4500::INT
UNION ALL SELECT TO_TIMESTAMP_NTZ('2020-04-16 01:00:00'), 1000::INT
)
SELECT
TRANS_TIME_HOUR
,SUM(AMOUNT) OVER ( ORDER BY TRANS_TIME_HOUR ROWS BETWEEN 23 PRECEDING AND CURRENT ROW ) OVERKILL
FROM (
SELECT
TRANS_TIME_HOUR,
SUM(AMOUNT) AMOUNT
FROM
(
SELECT
DATEADD(HOUR, NUMBER, TRANS_TS) TRANS_TIME_HOUR,
-- keep the amount only in the transaction's own hour block; the back-filled hours get 0
DECODE( DATEADD(HOUR, NUMBER, TRANS_TS), TIME_SLICE(TRANS_TS, 1, 'HOUR', 'START'), AMOUNT, 0) AMOUNT
FROM
SRCTAB,
(SELECT SEQ4() NUMBER FROM TABLE(GENERATOR(ROWCOUNT => 24)) ) G
)
GROUP BY
TRANS_TIME_HOUR
)

SAS PROC SQL where between 4 mondays ago and last monday

I am trying to sum up the past 4 weeks of forecast vs. sales data, with each week starting on a Monday.
For reference, today is 8/6, so I want to start collecting their weekly sales and forecast from 7/5 going up until the previous week's Monday, 7/26. I eventually plan to sum this 4-week forecast into one row using a do until last. statement, grouping by Store and SKU, so that I can use a put statement to make a new column signalling whether they sold more than they forecasted or less.
For instance, let's pretend they had the below data:
|Mon_DT |STORE |SKU |wk_FCST|wk_Sales|
|05July21:00:00:00 | 5 | abc | 10 | 12 |
|12July21:00:00:00 | 5 | abc | 10 | 16 |
|19July21:00:00:00 | 5 | abc | 10 | 7 |
|26July21:00:00:00 | 5 | abc | 10 | 12 |
With the do until last., the forecast will read as 40 and sales as 47, and I'll say if sales < forecast then LOWforecast = 'Y';
However, I am having trouble just getting the BETWEEN statement to work to pull only the last 4 weeks (starting on Monday).
DATA Getweeks;
StartOversell = intnx('week.2',today(),-4);
endoversell = intnx('week.2',today(),-1);
format StartOversell yymmdd10.;
format endoversell yymmdd10.;
Run;
Proc sql;
connect to odbc (dsn='****' uid='****' pwd='***');
create table work.Forecast1 as select distinct * from connection to odbc
(select MON_DT as DATE, Store_Number as Store, PROD_PKG_ID as SKU, FCST AS WK_FCST, SALES AS WK_sales, DIFF_QTY
From FCST
where Mon_DT >= 'StartOversell'd and Mon_DT <= 'endoversell'd );
disconnect from odbc;
quit;
I tried to use a macro variable as well but no luck.
Use macro variables, since the code uses explicit pass-through and the DB is expecting a database-compliant date literal. All your SQL must be DB-compliant in explicit pass-through - not SAS SQL.
Suppose you're using MS SQL Server and it needs dates as "MM/DD/YYYY" for literals. The MMDDYYS10. format creates a value that looks like that, so you can convert the values using the put() function and store the result in a macro variable.
It's also a good idea to include the quotes in the macro variable in this case, because Oracle needs single quotes, not double - not sure about MS SQL. The quote() function can be used to add the quotes without issue.
DATA Getweeks;
StartOversell = intnx('week.2',today(),-4);
call symputx('startOversell', quote(put(startOverSell, MMDDYYS10.), "'"));
...
Run;
%put &startOversell;
Proc sql;
connect to odbc (dsn='****' uid='****' pwd='***');
create table work.Forecast1 as select distinct * from connection to odbc
(select MON_DT as DATE, Store_Number as Store, PROD_PKG_ID as SKU, FCST AS WK_FCST, SALES AS WK_sales, DIFF_QTY
From FCST
where Mon_DT >= &startOverSell and Mon_DT <= &endOverSell );
disconnect from odbc;
quit;
Edit: you may want to consider what happens if you run it on a Monday and check if your dates align as expected.

Get HH:MM:SS from a datetime in MSSQL

I have the following issue:
I have a datetime field, which contains entries like this: "1970-01-01 22:09:26.000"
I would like to extract only the 22:09:26 (hh:mm:ss) part, but I am unable to convert it into 24h format. I used FORMAT and CONVERT, but received the am/pm culture (for CONVERT I tried style 13).
What is the simplest way to construct the formula to give back the above mentioned format?
Thank you!
1st way
You can select the format you wish from https://www.mssqltips.com/sqlservertip/1145/date-and-time-conversions-using-sql-server/
select replace(convert(nvarchar(20), CAST('1970-01-01 22:09:26.000' AS datetime), 114),'-',':')
2nd way
It is not a conversion, but if your entries are all in the same format then you can use the below:
select right('1970-01-01 22:09:26.000',12)
Updated, if you have null dates as well:
1.
select case when value is not null
then replace(convert(nvarchar(20), CAST(value AS datetime), 114),'-',':')
else null
end
2.
select case when value is not null then right(value,12)
else null end
To get just the time portion of a datetime, you just need to cast or convert it to the appropriate data type. If you really want to format your data right in the query, this is very possible with format, and I am not sure what issues you were facing there:
declare @t table(d datetime);
insert into @t values(dateadd(minute,-90,getdate())),(dateadd(minute,-60,getdate())),(dateadd(minute,-30,getdate())),(dateadd(minute,90,getdate()));
select d
,cast(d as time) as TimeValue
,format(d,'HH:mm:ss') as FormattedTimeValue
from @t;
Output
+-------------------------+------------------+--------------------+
| d | TimeValue | FormattedTimeValue |
+-------------------------+------------------+--------------------+
| 2020-08-10 11:51:15.560 | 11:51:15.5600000 | 11:51:15 |
| 2020-08-10 12:21:15.560 | 12:21:15.5600000 | 12:21:15 |
| 2020-08-10 12:51:15.560 | 12:51:15.5600000 | 12:51:15 |
| 2020-08-10 14:51:15.560 | 14:51:15.5600000 | 14:51:15 |
+-------------------------+------------------+--------------------+
By using style 108 we can get the time portion in 'HH:mm:ss' format.
DECLARE @now DATETIME = GETDATE()
SELECT CONVERT(NVARCHAR(20), @now, 108)

Convert integer value to DateTime in SQL Server 2012

I have a table called "EventLog" which has a column called nDateTime of type int.
This is the table "EventLog" with some values:
-----------------
|   nDateTime   |
-----------------
|   978307200   |
|   978307219   |
|   978513562   |
|   978516233   |
|   978544196   |
|  1450379547   |
|  1472299563   |
|  1472299581   |
|  1472300635   |
|  1472300644   |
|  1472300673   |
-----------------
I need to get the DateTime value, and I tried the following statements, but I receive these errors:
Test #1:
SELECT CONVERT(DATETIME, CONVERT(CHAR(8), nDateTime), 103) AS 'Formatted date'
FROM EventLog
The error says:
Conversion failed when converting date and/or time from character string.
Test #2: modified from here:
SELECT CONVERT(DATETIME, nDateTime, 103) AS 'Formatted date'
FROM EventLog
And Test #3 goes:
SELECT CAST(nDateTime AS datetime) AS 'Formatted date'
FROM EventLog
The duplicate question doesn't answer my question, because both Test #2 and Test #3 generate this error:
Arithmetic overflow error converting expression to data type datetime.
I admit that I have never seen such a value used as a date, and because of that, I'm kind of confused about how to proceed.
My question is: How can I get the valid DateTime value from the sample data?
Almost every time you see a date/time represented as an integer, that number represents the passage of time since a known epoch. This is the basis of Unix time which is, put simply, the number of seconds which have elapsed since 1st January 1970 00:00:00.
Using this, we can check some of the values you have provided:
declare @dt DATETIME = '1970-01-01' -- epoch start
print dateadd(second,978307200,@dt ) -- Jan 1 2001 12:00AM
print dateadd(second,1472300673,@dt ) -- Aug 27 2016 12:24PM
Seems possible, but who knows?!
You can check every date in your table simply using
declare @dt DATETIME = '1970-01-01' -- epoch start
SELECT
nDateTime AS OriginalData,
DATEADD(second, nDateTime,@dt) AS ActualDateTime
FROM EventLog
Just for giggles, I took a stab at using the base date of 1970-01-01, but without KNOWING the base, it is just a guess:
Declare @Log table (DateInt int)
Insert Into #Log values
(978307200),
(978307219),
(978513562),
(978516233),
(978544196),
(1450379547),
(1472299563),
(1472299581),
(1472300635),
(1472300644),
(1472300673)
Select DateInt,Converted= DateAdd(SECOND,DateInt,'1970-01-01') From @Log
Returns
DateInt Converted
978307200 2001-01-01 00:00:00.000
978307219 2001-01-01 00:00:19.000
978513562 2001-01-03 09:19:22.000
978516233 2001-01-03 10:03:53.000
978544196 2001-01-03 17:49:56.000
1450379547 2015-12-17 19:12:27.000
1472299563 2016-08-27 12:06:03.000
1472299581 2016-08-27 12:06:21.000
1472300635 2016-08-27 12:23:55.000
1472300644 2016-08-27 12:24:04.000
1472300673 2016-08-27 12:24:33.000
The "2038" Problem with Unix Timestamps
There's a serious issue with writing code to convert UNIX Timestamps that are based on seconds... DATEADD can only handle INTs, and that brings us to the "2038/Y2K38/Friday the 13th" problem (the day of the week when the "wraparound" to the most negative number an INT can hold happens after the date below).
That means that the largest positive value it can handle is 2147483647. If we use DATEADD to add that number of seconds to the UNIX Epoch of the first instant of the year 1970, we end up with a DATETIME that clearly explains what they mean by the "2038" issue.
SELECT DATEADD(ss,2147483647,'1970');
The Standard Fix for the "2038" Problem
The standard way to get around that is to first store the UNIX Timestamp as a BIGINT and do two date adds... one for seconds and one for days.
There are 86400 seconds in a day. If we do Integer Division and ...
Use the Quotient to derive the number of days to add to '1970'...
And use the Remainder to derive the number of seconds to add to '1970'...
... we'll get the correct date not only for the MAX INT value...
DECLARE @SomeUnixTS BIGINT = 2147483647
,@SecsPerDay BIGINT = 86400
;
SELECT DATEADD(ss,@SomeUnixTS%@SecsPerDay,DATEADD(dd,@SomeUnixTS/@SecsPerDay,'1970'))
;
... but also for the last possible date in seconds for SQL Server. If we calculate the UNIX Timestamp (in seconds) for the last possible second that's available in SQL Server...
SELECT DATEDIFF_BIG(ss,'1970','9999-12-31 23:59:59');
... it still works with lots of room to spare and no "2038" problem.
DECLARE @SomeUnixTS BIGINT = 253402300799
,@SecsPerDay BIGINT = 86400
;
SELECT DATEADD(ss,@SomeUnixTS%@SecsPerDay,DATEADD(dd,@SomeUnixTS/@SecsPerDay,'1970'))
;
UNIX Timestamps Based on Milliseconds
Working with UNIX Timestamps that are based on milliseconds is only slightly different, but they must be handled the same way...
DECLARE @SomeUnixTS BIGINT = DATEDIFF_BIG(ms,'1970','9999-12-31 23:59:59.999')
,@msUnixEpoch DATETIME2(3) = '1970'
,@msPerDay BIGINT = 86400000
;
SELECT SomeUnixTS = @SomeUnixTS
,msUnixEpoch = @msUnixEpoch
,Converted = DATEADD(ms,@SomeUnixTS%@msPerDay,DATEADD(dd,@SomeUnixTS/@msPerDay,@msUnixEpoch))
;
As a bit of a sidebar, you have to wonder what Microsoft was or was not thinking when they created DATEDIFF_BIG() but didn't create a DATEADD_BIG(). Even more amazing is that they have SQL Server running in UNIX environments and still no CONVERT(ts) functionality.
Here's what's new in 2022 in the area I'm talking about...
https://learn.microsoft.com/en-us/sql/sql-server/what-s-new-in-sql-server-2022?view=sql-server-ver16#language
And, last but not least, do not convert UNIX Timestamps that are based on milliseconds directly to DATETIME because the rounding in DATETIME can take you to the next day, week, month, and even year. You must do a "units place" detection for "9" and "-1" and make the appropriate substitution of "7" and "-3" respectively.
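As a small sketch of that rounding hazard (values chosen purely for illustration): DATETIME has roughly 1/300-second precision, so a millisecond value ending in .999 rounds up and can cross a date boundary, while DATETIME2(3) keeps it.
DECLARE @ts DATETIME2(3) = '2020-12-31 23:59:59.999';
SELECT CAST(@ts AS DATETIME) AS RoundedToNextYear -- 2021-01-01 00:00:00.000
      ,@ts AS ExactMilliseconds; -- 2020-12-31 23:59:59.999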
Your input is > 8 digits, hence it is throwing an arithmetic overflow error. If it is 8 digits you will get converted data:
For Example:
DECLARE @ndatetime int = 978307200
SELECT CONVERT(datetime, convert(varchar(10), @ndatetime, 112))
-- this throws arithmetic overflow error
DECLARE @ndatetime int = 97830720 -- with 8 digits only
SELECT CONVERT(datetime, convert(varchar(10), @ndatetime, 112))
This returns the converted date.
You can try TRY_CONVERT, which will return NULL if the conversion fails:
DECLARE @ndatetime int = 978307200
SELECT TRY_CONVERT(datetime, convert(varchar(10), @ndatetime, 112))

Show averages of a dataset, different date/time ranges

DB: MS SQL Server 11.0.3156.
I have a table where I record periodic data values. The key columns are:
fldObjectGUID (varchar), fldDataTimestamp (datetime), fldConfigItem (varchar), fldConfigItemValue (numeric)
I want to retrieve data for different time frames (day, week, month). But to keep the number of returned data points to a manageable number (e.g. < 350), I'd like to get averages.
For example:
Day - Return all Data (already got this!)
Week - Return the data in hourly average values (e.g. there would be 24 * 1 Hour Averages, * 7 days)
Month - Return the data in 3-hourly average values (e.g. 8 * Average
over 3 hours, * 30)
Yearly - Return the data in daily average values (e.g. 1 * Average
over 24 hours, * 365)
A small example of the data set is shown here:
+--------------------------------------------------------------------------------+
+ fldObjectGUID | fldRecordUpdatedTimestamp | fldConfigItem | fldConfigItemValue |
+ 40010000 | 2015-06-16 18:20:48.000 | ICMPResponseTime | 4.00 |
+ 40010000 | 2015-06-16 19:22:00.000 | ICMPResponseTime | 15.00 |
+ 40010000 | 2015-06-16 20:22:14.000 | ICMPResponseTime | 4.00 |
+ 40010000 | 2015-06-17 17:35:19.000 | ICMPResponseTime | 6.00 |
+ 40010000 | 2015-06-17 18:36:26.000 | ICMPResponseTime | 4.00 |
+ 40010000 | 2015-06-28 02:18:31.000 | ICMPResponseTime | 19.00 |
+ 40010000 | 2015-06-28 03:18:54.000 | ICMPResponseTime | 9.00 |
+ 40010000 | 2015-06-02 17:25:16.000 | ICMPResponseTime | 3.00 |
+------------------------------------------------------------------------------------+
Data is added for an object (fldObjectGUID) at different rates. This could be one row every 5 minutes or 1 one row every hour. There can be gaps in the data (hours or even days). I want to graph the fldConfigItemValue data for each object over different time frames; Day (last 24 hours), Week, Month and Year. The periods of the returned data don't need to be exact. So, a month could just be the last 30 days, or just 1 calendar month back from today's date.
The SQL only needs to return data for a single fldObjectGUID and fldConfigItem combination - I'll then amend the SQL at run-time to get the data for the required object/configitem.
There may be gaps in the data, so no data points within a given period. So, the return value can be zero.
I'm retrieving data using Classic ASP, creating the SQL statement and parsing the results, so I could achieve the result programmatically in my ASP code. For the 'Week' required set, I could make repeated calls to the DB, using the AVG function and a WHERE clause to retrieve a subset of records (NOW to NOW - 1 hour), store the value, then repeat with a WHERE clause for (NOW - 1 hour to NOW - 2 hours), and just step back in time until I've got all the values for a week. The 'Month' and 'Yearly' routines would be the same, just with different timeframes in the WHERE clauses.
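For concreteness, a single iteration of that per-hour approach would look something like this (a sketch using the table and column names from this question):
SELECT AVG(fldConfigItemValue) AS avgValue
FROM tblObjectHealthCheckData_Historic
WHERE fldObjectGUID = '10050400'
  AND fldConfigItem = 'AvailableRAM'
  AND fldDataTimestamp > DATEADD(HOUR, -1, GETDATE())
  AND fldDataTimestamp <= GETDATE();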
However, even to me, this seems a clumsy way of doing it and just one SQL routine (or a different SQL routine for Week, Month and Year) must be quicker and / or more elegant.
At the moment, I have some SQL (from StackOverflow?) that I thought might work and I have my code build up the SQL for the 'Month' view like this (I've hard-coded the fldObjectGUID and fldConfigItem in the example, to make the example clearer):
SELECT top 30 convert(date, l.fldDataTimestamp) as 'fldDataTimestamp_result', l.fldConfigItemValue
FROM tblObjectHealthCheckData_Historic l
INNER JOIN (
SELECT MIN(fldDataTimestamp) first_timestamp
FROM tblObjectHealthCheckData_Historic
where fldObjectGUID = '10050400' and fldConfigItem = 'AvailableRAM'
group by Convert(Date, fldDataTimestamp)
) sub_l ON (sub_l.first_timestamp = l.fldDataTimestamp)
where fldObjectGUID = '10050400' and l.fldConfigItem = 'AvailableRAM'
order by fldDataTimestamp desc
But this gets just the first data point for each day (as you can guess, whilst I do understand SQL and programming, they are a hobby, not something I do for a living) and so I'm struggling to fix this code.
I'm assuming that people agree it's more efficient doing this in a single SQL routine than making many separate SQL calls - but can anyone help?
I would try the DATEPART function; this way you can get different parts of fldRecordUpdatedTimestamp and then AVG the fldConfigItemValue field.
This goes down to a single hour of your timestamp (it could go down to minutes; check MSDN for DATEPART in T-SQL). So if you wish to get daily averages per week, then you need to include:
day_fldRecordUpdatedTimestamp
week_fldRecordUpdatedTimestamp
This will average for each day inside each week.
The below example shows the average per month - mind the year: if you have more than a year's worth of data, make sure you include year_fldRecordUpdatedTimestamp etc.
WITH PartsTable As
(
SELECT
fldObjectGUID
, fldRecordUpdatedTimestamp
, fldConfigItem
, fldConfigItemValue
, DATEPART(HOUR, fldRecordUpdatedTimestamp) As hour_fldRecordUpdatedTimestamp
, DATEPART(DAY, fldRecordUpdatedTimestamp) As day_fldRecordUpdatedTimestamp
, DATEPART(WEEK, fldRecordUpdatedTimestamp) As week_fldRecordUpdatedTimestamp
, DATEPART(MONTH, fldRecordUpdatedTimestamp) As month_fldRecordUpdatedTimestamp
, DATEPART(YEAR, fldRecordUpdatedTimestamp) As year_fldRecordUpdatedTimestamp
FROM
YourLogTable
--WHERE
-- Perhaps set a limit here to not get a huge set in the first step.
)
SELECT
COUNT(1) As setcount /* Shows how many rows are in each AVG calculation. */
, fldObjectGUID
, fldConfigItem
, month_fldRecordUpdatedTimestamp /* Change this column for the specific span you're interested in. */
, AVG(fldConfigItemValue) As avg_fldConfigItemValue
FROM
PartsTable
GROUP BY
fldObjectGUID
, fldConfigItem
, month_fldRecordUpdatedTimestamp /* Change this column for the specific span you're interested in. */
;
One final note: make sure you include month_, week_ etc. column in both SELECT and GROUP BY.
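For instance, a sketch of the final SELECT adjusted for daily averages within each week (to be swapped into the statement above, after the PartsTable CTE; note both period columns appear in SELECT and GROUP BY):
SELECT
COUNT(1) As setcount
, fldObjectGUID
, fldConfigItem
, week_fldRecordUpdatedTimestamp
, day_fldRecordUpdatedTimestamp
, AVG(fldConfigItemValue) As avg_fldConfigItemValue
FROM
PartsTable
GROUP BY
fldObjectGUID
, fldConfigItem
, week_fldRecordUpdatedTimestamp
, day_fldRecordUpdatedTimestamp
;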
