Calling the native function:
SELECT CURRENT_DATE()
returns yesterday's date. This is causing a lot of downstream issues, as it is a native function called in many tasks and procedures. I can confirm that this didn't happen as of last week. I can also confirm that our Snowflake TIMEZONE parameter is set to Pacific Time, so our time zone should not be causing this.
Since I cannot change this function and I don't see an obvious session parameter that would cause it, where can I start looking?
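One diagnostic starting point (a sketch, not a fix): check at what level the session's TIMEZONE parameter is actually being set, and compare the session-local date against an explicit conversion. The `level` column of SHOW PARAMETERS tells you whether the value comes from the account, user, or session.

```sql
-- Where is the session getting its timezone from?
SHOW PARAMETERS LIKE 'TIMEZONE' IN SESSION;

-- Compare the session-local date/time with an explicit Pacific conversion
SELECT CURRENT_DATE(),
       CURRENT_TIMESTAMP(),
       CONVERT_TIMEZONE('America/Los_Angeles', CURRENT_TIMESTAMP());
```

If the tasks run under a different user or role, also check the parameter at that level; a session-level override set by a driver or tool would not show up when you test interactively.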
Related
I've got a stored procedure that's timing out despite the fact that I've set both the server's execution-timeout and the connection's execution-timeout to zero, which should make it unlimited.
It times out at exactly 10 minutes, which is the default timeout, so it would seem to be still getting that from somewhere.
Any ideas?
Note that this stored procedure used to run for hours without timing out, but recently I've made some changes to it, such as using a cursor for the iteration, a temporary table, and some explicit transactions; maybe that has something to do with the problem.
Fixed it! It seems there’s a third (and possibly even a fourth) place where the timeout can be set – under Options -> Query Execution (the possible fourth is in the Query window’s context menu Query Options).
Let's say I have a table with a date column; can I attach some sort of "watcher" that can take action if the date gets smaller than getdate()? Note that the date is larger than getdate() at the time of insertion.
Are there any tools that I might be unaware of in SQL Server 2008/2012?
Or would the best option be to poll the data from another application?
Edit: Note that there is no insertion/update taking place.
You could set up a SQL Job which runs periodically and executes a stored procedure which can then handle the logic around past dates.
https://msdn.microsoft.com/en-gb/library/ms187910.aspx
For example, a SQL Job could be set up to run once daily to find users' birthdays and send out an automated email.
In your case, a job could be set up to run every minute (if required) which detects past dates and does something with those records. I would suggest adding some kind of flag to each record so that it isn't actioned again the next time the job runs.
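A minimal sketch of what the job's stored procedure body might look like; the table and column names here are invented for illustration, and the `Actioned` flag is the "don't process twice" marker suggested above:

```sql
-- Find rows whose date has slipped into the past and mark them handled,
-- returning the affected ids so further processing can act on them.
UPDATE dbo.MyWatchedTable
SET    Actioned = 1
OUTPUT inserted.Id
WHERE  DateColumn < GETDATE()
  AND  Actioned = 0;
```

The OUTPUT clause lets the same statement that flags the rows also hand their keys to whatever follow-up logic the job performs, avoiding a race between a SELECT and a later UPDATE.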
Alternatively if you have a lot of servers and databases, you could centralise your job scheduling using a third-party tool such as ActiveBatch.
I am storing timestamps in database as expressed in UTC, e.g. '2015-03-27 08:32:46.024 +00:00'
And I am using the current server's time zone to convert the timestamps to local time.
select
SWITCHOFFSET (CAST(CAST('2015-03-27 08:32:00.000 +00:00' AS datetime2(3)) as DateTimeOffset(3)), DATEPART(TZ, SYSDATETIMEOFFSET ()) )
This works fine as long as I am looking at wintertime timestamps while we are in wintertime.
But now, in summer time, last Friday's wintertime timestamp gets expressed in summer time: '2015-03-27 10:32:00.000 +02:00'.
I would like to see '2015-03-27 09:32:00.000 +01:00'.
Anyone that has a solution without storing winter- and summertime in the database?
As I mentioned in the question's comments, this is usually best done in application code, rather than in the database. However, there are some scenarios where having it in the database can be quite useful, or necessary. For example, you may need to group by a date in a specific time zone, or you may need to convert many items in bulk, or generate data for a reporting system.
For these, you can use my SQL Server Time Zone Support project.
Follow the installation instructions, then you can do this:
SELECT Tzdb.UtcToLocal('2015-03-27 08:32:46.024', 'Europe/Paris')
I am guessing at your time zone. You can choose any from the list here.
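As a side note: if you are on SQL Server 2016 or later, the built-in AT TIME ZONE operator is also DST-aware, so the offset depends on the timestamp itself rather than on the current server offset. It uses Windows time-zone names rather than IANA ones; 'Central European Standard Time' below is my guess at your zone:

```sql
SELECT CAST('2015-03-27 08:32:46.024' AS datetime2(3))
       AT TIME ZONE 'UTC'
       AT TIME ZONE 'Central European Standard Time';
-- yields an offset of +01:00, because 2015-03-27 is before the DST switch
```

The first AT TIME ZONE declares the stored value as UTC; the second converts it, picking the winter or summer offset that was in force at that instant.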
My website is fairly large, and looping through all the nodes with the Umbraco APIs to get release and expire dates is timing out, so I wrote the following query to find the expire and release dates of nodes:
SELECT D.nodeId, D.releaseDate, D.expireDate
FROM dbo.cmsDocument D
WHERE D.newest = 1 AND
(
D.releaseDate IS NOT NULL
OR
D.expireDate IS NOT NULL
)
Can anyone please confirm whether this is right or wrong? If it's wrong, what is the proper way to get these values?
Thanks
Anz
I don't know anything about Umbraco, but I do know that not every DBMS will use an index when evaluating expressions like your_column_name IS NOT NULL.
If your target dbms doesn't treat IS NOT NULL as a sargable expression, then it won't use an index. Instead, it will do a full table scan, which can take a long time on a big table. And if it takes a long time, I suppose a timeout is still possible.
It's also possible that there isn't an index on releaseDate or expireDate. On a big table, that will slow you down a lot, too.
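If the index really is missing, one option on SQL Server 2008+ is a filtered index, which only indexes the rows where the date column is populated; the index name below is invented, and whether to INCLUDE extra columns depends on the queries you run:

```sql
-- Index only the documents that actually have an expiry date
CREATE NONCLUSTERED INDEX IX_cmsDocument_expireDate
    ON dbo.cmsDocument (expireDate)
    INCLUDE (nodeId, releaseDate)
    WHERE expireDate IS NOT NULL;
```

A filtered index stays small on a table where most rows have NULL in the column, and it directly matches the `expireDate IS NOT NULL` predicate in the question's query.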
If you want to know which documents expired before today, maybe so you can delete them, I'd (perhaps naively) expect this standard SQL statement to work. (But I'd also expect some Umbraco housekeeping module to make this unnecessary.)
select nodeId
from cmsDocument
where expireDate < CURRENT_DATE
Question:
Does passing DateTime.Now as a parameter to a proc prevent SQL Server from caching the query plan? If so, then is the web app missing out on huge performance gains?
Possible Solution:
I thought DateTime.Today.AddDays(1) might be a solution. It would pass the same end date to the SQL proc for any given day, and the user would still get the latest data. Please address this as well.
Given Example:
Let's say we have a stored procedure. It reports data back to a user on a webpage. The user can set a date range. If the user sets today's date as the "end date," which includes today's data, the web app passes DateTime.Now to the sql proc.
Let's say that one user runs a report--5/1/2010 to now--over and over several times. On the webpage, the user sees 5/1/2010 to 5/4/2010. But the web app passes DateTime.Now to the sql proc as the end date. So, the end date in the proc will always be different, although the user is querying a similar date range.
Assume the number of records in the table and number of users are large. So any performance gains matter. Hence the importance of the question.
Example proc and execution (if that helps to understand):
CREATE PROCEDURE GetFooData
    @StartDate datetime,
    @EndDate datetime
AS
SELECT *
FROM Foo
WHERE LogDate >= @StartDate
  AND LogDate < @EndDate
Here's a sample execution using DateTime.Now:
EXEC GetFooData '2010-05-01', '2010-05-04 15:41:27' -- passed in DateTime.Now
Here's a sample execution using DateTime.Today.AddDays(1)
EXEC GetFooData '2010-05-01', '2010-05-05' -- passed in DateTime.Today.AddDays(1)
The same data is returned for both executions, since the current time is 2010-05-04 15:41:27.
The query plan will be cached regardless of parameter values. Parameters essentially guarantee that a consistent, reusable query exists, since they are type-safe as far as SQL Server is concerned.
What you want is not query plan, but result caching. And this will be affected by the behavior you describe.
Since you seem to handle whole days only, you can try passing in dates, not datetimes, to minimize different parameter values. Also try caching query results in the application instead of doing a database roundtrip every time.
Because you invoke a stored procedure rather than a direct query, the only thing that changes is the actual batch you send to SQL: EXEC GetFooData '2010-05-01', '2010-05-05' vs. EXEC GetFooData '2010-05-01', '2010-05-04 15:41:27'. This is a trivial batch that will generate a trivial plan. While it is true that, from a strict technical point of view, you are losing some performance, it will be all but unmeasurable. The details of why this happens are explained in this response: Dynamically created SQL vs Parameters in SQL Server
The good news is that by a minor change in your SqlClient invocation code, you'll benefit from even that minor performance improvement mentioned there. Change your SqlCommand code to be an explicit stored procedure invocation:
SqlCommand cmd = new SqlCommand("GetFooData", connection);
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.AddWithValue("@StartDate", dateFrom);
cmd.Parameters.AddWithValue("@EndDate", DateTime.Now);
As a side note, storing localized times in the database is not a very good idea, due to clients being in different time zones than the server and due to the complications of the daylight-saving-time switchover. A much better solution is to always store UTC time and simply format it to the user's local time in the application.
In your case, you are probably fine if the second parameter is just drifting upward in real time.
However, it is possible to become a victim of parameter sniffing where the first execution (which produces the cached execution plan) is called with parameters which produce a plan which is not typically good for the other parameters normally used (or the data profile changes drastically). The later invocations might use a plan which is sometimes so poor that it won't even complete properly.
If your data profile changes drastically with different parameter choices, and the execution plan becomes poor for certain choices, you can mask the parameters into local variables; this effectively prevents parameter sniffing in SQL Server 2005. There is also WITH RECOMPILE (either in the SP or in the EXEC, though for heavily called SPs this is not a viable option). In SQL Server 2008, I would almost always use OPTIMIZE FOR UNKNOWN, which avoids producing a plan based on parameter sniffing.
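To make the two techniques above concrete, here is a sketch using the question's own GetFooData procedure; the local-variable version targets SQL Server 2005, and the commented-out hint is the 2008+ alternative:

```sql
-- Variant 1: mask parameters into local variables (SQL Server 2005).
-- The optimizer cannot sniff @s/@e, so it builds a generic plan.
CREATE PROCEDURE GetFooData
    @StartDate datetime,
    @EndDate datetime
AS
BEGIN
    DECLARE @s datetime = @StartDate;
    DECLARE @e datetime = @EndDate;

    SELECT *
    FROM Foo
    WHERE LogDate >= @s
      AND LogDate < @e;

    -- Variant 2 (SQL Server 2008+): keep the original parameters and
    -- append a hint to the SELECT instead of masking them:
    --   OPTION (OPTIMIZE FOR UNKNOWN)
END
```

Both variants trade a potentially great plan for a consistently average one: the plan is built for the statistical average of the data rather than for whichever parameter values happened to arrive first.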