LogParser: get results from the last hour

I have CPU monitors constantly working on my computer.
I would like Log Parser to parse results only from the last hour. Does anyone know how to do that?
SELECT TO_STRING(QUANTIZE(TO_TIMESTAMP(Field1, 'mm/dd/yyyy HH:mm:ss.ll'),10 ), 'hh:mm:ss') AS Time, AVG(TO_REAL(Field2)) AS Cpu
INTO .\output\cpu.csv
FROM .\logs\*.csv
WHERE Time >= SUB( TO_LOCALTIME(SYSTEM_TIMESTAMP()), TIMESTAMP('0000-01-02', 'yyyy-MM-dd') )
The last line results in an error. Does anyone know how I can do that? Thanks!

Your WHERE clause is wrong; instead of pulling 1 hour, you're pulling 1 day.
The following worked on IIS logs:
SELECT TO_STRING(QUANTIZE(TO_TIMESTAMP(date, time),10 ), 'hh:mm:ss') AS Time
INTO asdf.csv
FROM W3SVC5\*ex*.log
WHERE TO_LOCALTIME(TO_TIMESTAMP(date, time)) >= SUB( TO_LOCALTIME(SYSTEM_TIMESTAMP()), TIMESTAMP('0000-01-01 01:00', 'yyyy-MM-dd HH:mm') )
For your particular case you should be safe just using this bit of the WHERE clause:
SUB( TO_LOCALTIME(SYSTEM_TIMESTAMP()), TIMESTAMP('0000-01-01 01:00', 'yyyy-MM-dd HH:mm') )
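For completeness, here is a sketch of the full query with that WHERE clause dropped in (my guess at the final form, not tested): it reuses the original format string for Field1, adds the GROUP BY that Log Parser needs once AVG appears next to a non-aggregate column, and keeps TO_LOCALTIME on the assumption that the logged timestamps are in UTC; drop it if they are already local time.
SELECT TO_STRING(QUANTIZE(TO_TIMESTAMP(Field1, 'mm/dd/yyyy HH:mm:ss.ll'), 10), 'hh:mm:ss') AS Time, AVG(TO_REAL(Field2)) AS Cpu
INTO .\output\cpu.csv
FROM .\logs\*.csv
WHERE TO_LOCALTIME(TO_TIMESTAMP(Field1, 'mm/dd/yyyy HH:mm:ss.ll')) >= SUB( TO_LOCALTIME(SYSTEM_TIMESTAMP()), TIMESTAMP('0000-01-01 01:00', 'yyyy-MM-dd HH:mm') )
GROUP BY Time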

Related

SQL Server: use function after certain value in table

I am trying to find the time difference between two specific points in stored conversations. These points can differ in each conversation, which makes it difficult for me. I need the time difference between the Agent's message and the first EndUser response after it.
In the example in CaseNr 1234 below I need the time difference between MessageNrs 3&4, 5&6 and 7&8.
In CaseNr 2345 I need the time difference between MessageNrs 3&4, 5&6, 7&8 and 10&11.
In CaseNr 4567 I need the time difference between 2&3 and 4&5.
As shown, the order of Agent and EndUser messages can differ in each conversation, as can the positions these types appear in.
Is there a way to calculate the time difference the way I have described it in SQL Server?
I think this code should help you.
with t(MessageNr, CaseNr, Type, AgentTime, EndUserTime) as
(
    select
        t1.MessageNr,
        t1.CaseNr,
        t1.Type,
        t1.EntryTime,
        -- first EndUser entry in the same case that follows this Agent entry
        (select top 1 t2.EntryTime
         from [Your_Table] as t2
         where t1.CaseNr = t2.CaseNr
           and t2.[Type] = 'EndUser'
           and t1.EntryTime < t2.EntryTime
         order by t2.EntryTime) as userTime
    from [Your_Table] as t1
    where t1.[Type] = 'Agent'
)
select t.*, DATEDIFF(second, AgentTime, EndUserTime) as TimeDifference
from t;
It appears the logic you require is the time difference between an Agent row and the immediately following EndUser row.
You can do this with LEAD, which will be more performant than the use of self-joins.
SELECT *,
       DATEDIFF(second, t.EntryTime, t.NextTime) AS TimeDifference
FROM (
    SELECT *,
           LEAD(CASE WHEN t.[Type] = 'EndUser' THEN t.EntryTime END)
               OVER (PARTITION BY t.CaseNr ORDER BY t.MessageNr) AS NextTime
    FROM myTable t
) t
WHERE t.[Type] = 'Agent'
  AND t.NextTime IS NOT NULL

AVG of a calculated field

I want to calculate the average (AVG) member lifetime. For that I need a calculation between the member_since column and the current system time. What syntax do I need for this? I want the outcome in years, so that afterwards I can calculate the AVG of the outcome.
SELECT yelping_since
FROM [Star model incremental]
WHERE (date >= '2011-12-31 00:00:00.000') AND (date <= '2013-01-01 00:00:00.000') AND (city = N'Toronto')
This is the view on which the calculation needs to be done.
The calculation needs to be done between Yelping_since and the current time to get the total membership time.
I think this is what you are looking for, hope it helps:
SELECT
    AVG(MemberLifeTime) AS AvgMemberLifeTime
FROM
(
    SELECT
        DATEDIFF(year, Yelping_since, GETDATE()) AS MemberLifeTime
    FROM
        [vYourView]
) test
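One caveat (my addition, not part of the answer above): DATEDIFF(year, ...) only counts calendar-year boundaries crossed, so every member contributes a whole number of years. If a fractional average is wanted, a variation on the same hypothetical view could measure in days instead:
SELECT
    AVG(DATEDIFF(day, Yelping_since, GETDATE()) / 365.25) AS AvgMemberLifeTimeYears
FROM
    [vYourView]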

Error Convert String to DateTime

SELECT CONVERT(datetime,'17/05/2015 22:15:00',103)
output:
2015-05-17 22:15:00.000
I want to combine two columns, Date and Time, into one datetime value.
Example columns, Date and Time:
**Date**       **Time**
17/05/2015     22:15:00
But this query gives an error:
SELECT CONVERT(datetime,[Date]+' '+[Time],103) FROM LPTables
Conversion failed when converting date and/or time from character string.
Just add the time portion to the date portion:
SELECT DATEADD(ms, DATEDIFF(ms, '00:00:00', [Time]), CONVERT(DATETIME, [Date], 103))
FROM LPTables
This will give you accuracy to the millisecond.
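As a quick sanity check (my addition), the same expression with the literal values from the question should reproduce the expected output:
SELECT DATEADD(ms,
               DATEDIFF(ms, '00:00:00', '22:15:00'),
               CONVERT(DATETIME, '17/05/2015', 103)) AS CombinedDateTime
-- expected: 2015-05-17 22:15:00.000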
Just enclose your columns with ().
SELECT CONVERT(datetime,([Date]+' '+[Time]),103) FROM LPTables
WHERE ISNULL([Date],'')!='' AND ISNULL([Time],'')!=''

SQL Datediff in seconds with decimal places

I am trying to extract the difference between two SQL DateTime values in seconds, with decimal places for some performance monitoring.
I have a table, "Pagelog" which has a "created" and "end" datetime. In the past I have been able to do the following:
SELECT DATEDIFF(ms, pagelog_created, pagelog_end)/1000.00 as pl_duration FROM pagelog
However I have started getting the following error:
Msg 535, Level 16, State 0, Line 1
The datediff function resulted in an overflow. The number of dateparts separating two date/time instances is too large. Try to use datediff with a less precise datepart.
I have seen numerous responses to this error stating that I should use a less precise unit of measurement. But this hardly helps when I need to distinguish between 2.1 seconds and 2.9 seconds, because DATEDIFF(s,..,..) will return INT results and lose the accuracy I need.
I originally thought that this had been caused by a few values in my table having a huge range but running this:
SELECT DATEDIFF(s, pagelog_created, pagelog_end) FROM pagelog
ORDER BY DATEDIFF(s, pagelog_created, pagelog_end) DESC
Returns a max value of 30837, which is 8.5 hours or 30,837,000 milliseconds, well within the range of a SQL INT as far as I know?
Any help would be much appreciated, as far as I can tell I have two options:
Somehow fix the problem with the data, finding the culprit values
Find a different way of calculating the difference between the values
Thanks!
The StackOverflow magic seems to have worked, despite spending hours on this problem last week, I re-read my question and have now solved this. I thought I'd update with the answer to help anyone else who has this problem.
The problem here was not that there was a large range; there was a negative range, which obviously results in a negative overflow. It would have been helpful if the SQL Server error were a little more descriptive, but it's not technically wrong.
So in my case, this was returning values:
SELECT * FROM pagelog
WHERE pagelog_created > pagelog_end
Either remove the values, or omit them from the initial result set!
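For completeness (my addition, not part of the original answer), omitting the offending rows from the initial query is as simple as:
SELECT DATEDIFF(ms, pagelog_created, pagelog_end)/1000.00 AS pl_duration
FROM pagelog
WHERE pagelog_created <= pagelog_end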
Thanks to Ivan G and Andriy M for your responses too
You can try to avoid overflow like this:
DECLARE @dt1 DATETIME = '2013-01-01 00:00:00.000'
DECLARE @dt2 DATETIME = '2013-06-01 23:59:59.997'

SELECT DATEDIFF(DAY, CAST(@dt1 AS DATE), CAST(@dt2 AS DATE)) * 24 * 60 * 60
SELECT DATEDIFF(ms, CAST(@dt1 AS TIME), CAST(@dt2 AS TIME))/1000.0
SELECT DATEDIFF(DAY, CAST(@dt1 AS DATE), CAST(@dt2 AS DATE)) * 24 * 60 * 60
     + DATEDIFF(ms, CAST(@dt1 AS TIME), CAST(@dt2 AS TIME))/1000.0
First it gets the number of seconds in the whole days from the DATE portion of the DATETIME, then the number of seconds from the TIME portion, and finally it adds the two together.
There won't be an overflow, because DATEDIFF in milliseconds between the minimum and maximum values of the TIME data type cannot exceed the INT range.
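As a worked check of that approach (my arithmetic, not from the answer): with the sample variables above, the date portion gives 151 days × 86,400 s = 13,046,400 s, the time portion gives 86,399,997 ms / 1000.0 = 86,399.997 s, and the sum is 13,132,799.997 seconds.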
You could of course do something like this:
SELECT
DATEDIFF(ms, DATEADD(s, x.sec, pagelog_created), pagelog_end) * 0.001
+ x.sec AS pl_duration
FROM pagelog
CROSS APPLY (
SELECT DATEDIFF(s, pagelog_created, pagelog_end)
) x (sec)
;
As you can see, first the difference in whole seconds between pagelog_created and pagelog_end is taken. Then those seconds are added back to pagelog_created, the difference in milliseconds between that value and pagelog_end is calculated, and the result is added to the seconds.
However, since, as per your investigation, the table doesn't seem to have rows that could cause the overflow, I'd also double check whether that particular fragment was the source of the error.
-- Another approach: show the gap between consecutive rows of an events table as hours:minutes:seconds
with cte as (
    select rownum = row_number() over(partition by T.TR_ID order by T.[date]),
           T.*
    from [dbo].[TR_Events] T
)
select cte.[date], nex.[date],
       convert(varchar(10), datediff(s, cte.[date], nex.[date]) / 3600) + ':' +
       convert(varchar(10), datediff(s, cte.[date], nex.[date]) % 3600 / 60) + ':' +
       convert(varchar(10), datediff(s, cte.[date], nex.[date]) % 60)
from cte
left join cte nex on nex.TR_ID = cte.TR_ID and nex.rownum = cte.rownum + 1

Float as DateTime

SQL Server 2008
I almost have, I think, what I'm looking to do. I'm just trying to fine tune the result. I have a table that stores timestamps of all transactions that occur on the system. I'm writing a query to give an average transaction time. This is what I have so far:
With TransTime AS (
select endtime-starttime AS Totaltime
from transactiontime
where starttime > '2010-05-12' and endtime < '2010-05-13')
Select CAST(AVG(CAST(TotalTime As Float))As Datetime)
from TransTime
I'm getting the following result:
1900-01-01 00:00:00.007
I can't figure out how to strip the date off and just display the time, 00:00:00.007. Any help would be appreciated. Thank you.
You want to cast as a TIME. A float can't be converted to TIME directly, so go via DATETIME first:
With TransTime AS (
    select endtime - starttime AS Totaltime
    from transactiontime
    where starttime > '2010-05-12' and endtime < '2010-05-13')
Select CAST(CAST(AVG(CAST(TotalTime As Float)) As Datetime) As TIME(7))
from TransTime
It's that first subtraction that's your problem, and why are you casting the result to DATETIME (or even TIME)?
With TransTime AS
(
-- get time in milliseconds
select DATEDIFF(ms, starttime, endtime) AS Totaltime
from transactiontime
where starttime > '2010-05-12' and endtime < '2010-05-13'
)
Select AVG(CAST(TotalTime As Float)) AS average_time_in_msec
FROM TransTime
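If you still want the result displayed as a time-style value rather than a raw millisecond count (my addition, building on the query above), you can add the average back onto a zero date and cast it:
With TransTime AS
(
    select DATEDIFF(ms, starttime, endtime) AS Totaltime
    from transactiontime
    where starttime > '2010-05-12' and endtime < '2010-05-13'
)
Select CAST(DATEADD(ms, CAST(AVG(CAST(TotalTime As Float)) AS INT), 0) As TIME(3)) AS average_time
FROM TransTime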
You probably want to use the DATEPART function to do this. Check out the DATEPART documentation for the available dateparts.
