Bit of a strange issue, not sure I will be able to explain it properly, but I will try.
I have created a Table with SQL CREATE TABLE and I set up a watermark column called _ingested_timestamp.
CREATE TABLE IF NOT EXISTS MY_TABLE
(
id INT NOT NULL,
...
...
_ingested_timestamp TIMESTAMP(3),
WATERMARK FOR _ingested_timestamp AS _ingested_timestamp - INTERVAL '10' SECOND,
PRIMARY KEY(id) NOT ENFORCED
)
WITH(...);
I then have a SELECT using data from the above table, something like this:
SELECT
item.id,
...
...
item._ingested_timestamp as event_time
FROM MY_TABLE as item JOIN ....;
However, this fails with an exception along the lines of "Conversion to relational algebra failed to preserve datatypes".
Specifically, there seems to be a mismatch because event_time is expected to be ROWTIME but is not (or the other way around):
validated type:
RecordType(INTEGER NOT NULL id, ..., TIMESTAMP(3) *ROWTIME* event_time) NOT NULL
converted type:
RecordType(INTEGER NOT NULL id, ..., TIMESTAMP(3) event_time) NOT NULL
As you can see, the only difference in the exception message seems to be about the type of event_time.
Can anyone help me out here?
What exactly is the issue here, and how can I use a rowtime field in a SQL query without facing this issue?
Thanks
Related
I have a problem converting a date from varchar to date format:
Msg 241, Level 16, State 1, Line 15
Conversion failed when converting date and/or time from character string.
I'm trying to convert/cast it like SELECT convert(DATE, '25.02.2019');. I can't change the string order because the data come from an existing table.
I know that the solution is easy, but I'm still missing something and haven't got it yet :(
If you are unable to fix the underlying problem (that the table uses the wrong data type), you need to apply the correct DATETIME Style, which for dd.MM.yyyy is 104:
SELECT CONVERT(DATE, '25.02.2019', 104);
If at all possible, though, you should correct the original table. You should never store dates as VARCHAR; there is not one good reason to do so, and lots of good reasons not to. It will save you a lot of headaches if you change the data type to DATE, and then you won't have to worry about conversion errors. The longer you leave it, the worse it will get. If you can't change the table, have a word with your DBA and ask them to change it. If you don't have a DBA, find someone who can.
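If you do get to change the table, a minimal migration sketch could look like the following (the table and column names are hypothetical, and it assumes every existing value really is a valid dd.MM.yyyy string):
-- Add a proper DATE column alongside the varchar one (names are hypothetical)
ALTER TABLE dbo.YourTable ADD RealDate DATE NULL;
-- Populate it from the string column using style 104 (dd.MM.yyyy)
UPDATE dbo.YourTable SET RealDate = CONVERT(DATE, StringDate, 104);
-- Once everything downstream reads RealDate, drop the old varchar column
ALTER TABLE dbo.YourTable DROP COLUMN StringDate;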
Some good articles on this below:
Bad habits to kick : choosing the wrong data type
Bad habits to kick : mis-handling date / range queries
ADDENDUM
If you are unable to change the actual column because it is used by other processes, you can still sanitise the column by using a check constraint, and optionally include a computed column so you always have access to a real date and not a varchar:
e.g.
IF OBJECT_ID(N'tempdb..#DateTest', 'U') IS NOT NULL
DROP TABLE #DateTest;
CREATE TABLE #DateTest
(
StringDate CHAR(10) NOT NULL,
RealDate AS CONVERT(DATE, StringDate, 104),
CONSTRAINT CHK_DateTest__RealDate CHECK (TRY_CONVERT(DATE, StringDate, 104) IS NOT NULL)
);
This will allow you to continue to add/edit varchar dates:
-- insert valid date and check output
INSERT #DateTest (StringDate) VALUES ('25.02.2019');
SELECT RealDate
FROM #DateTest;
The check constraint will prevent you from adding any dates that are not dates:
--Try to insert invalid date
INSERT #DateTest (StringDate) VALUES ('29.02.2019');
This will throw an error:
The INSERT statement conflicted with the CHECK constraint "CHK_DateTest__RealDate". The conflict occurred in database "tempdb", table "dbo.#DateTest___________________________________________________________________________________________________________000000000704", column 'StringDate'.
You can even index the column:
CREATE NONCLUSTERED INDEX IX_DateTEst ON #DateTest (RealDate);
With the index in place, you can take advantage of the benefits that storing dates properly gives.
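For example, a date range query against the computed column (a sketch using the temp table above) can use the index instead of converting strings row by row:
-- Range query on the computed DATE column
SELECT StringDate, RealDate
FROM #DateTest
WHERE RealDate >= '2019-01-01'
  AND RealDate < '2019-03-01';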
Your string is in the wrong format.
Should be:
SELECT convert(DATE, '2019-02-25')
Edit: If you can't get the date in that format, put 104 as a third argument.
Here's a list of the optional format arguments. https://www.w3schools.com/SQL/func_sqlserver_convert.asp
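For example (a short sketch; TRY_CONVERT requires SQL Server 2012 or later and returns NULL instead of raising an error when the string is not a valid date):
-- Style 104 tells SQL Server the string is dd.MM.yyyy
SELECT CONVERT(DATE, '25.02.2019', 104);
-- Safer variant: returns NULL for malformed strings instead of failing
SELECT TRY_CONVERT(DATE, '25.02.2019', 104);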
I have a table that is filled using a stored procedure. This stored procedure uses a view that pulls attributes from other databases.
To illustrate, it is something like:
ALTER PROCEDURE theSp
AS BEGIN
INSERT INTO dbo.theTable (attr1, attr2, amount, attr4)
SELECT attr1, attr2, amount, attr4
FROM theView
END
The view is defined this way:
select attr1, attr2, amount, attr4
from db1.theTable
where date >='anyDate'
and the values are correctly inserted, but if the view is used this way:
select attr1, attr2, amount, attr4
from db2.theTable
where date >='anyDate'
this message is shown:
Checking identity information: current identity value '1252'.
DBCC execution completed. If DBCC printed error messages, contact your system administrator.
Msg 515, Level 16, State 2, Procedure theSp, Line 16
Cannot insert the value NULL into column 'amount', table 'db2.dbo.theTable'; column does not allow nulls. INSERT fails.
Note: the 'amount' attribute in both the db1 and db2 tables allows null, but I never insert null; instead, I insert 0.
So I filtered to check whether the amount attribute is null and did not get any results, meaning there are no null values in the amount attribute.
Does anyone know a possible solution?
By looking at the error message that you have posted
Cannot insert the value NULL into column 'amount', table 'db2.dbo.theTable'; column does not allow nulls. INSERT fails. The statement has been terminated.
there is a NULL value for the amount column in db2.dbo.theTable. You can use ISNULL() to get rid of this issue.
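For example, a sketch of the INSERT from the procedure with amount defaulted to 0 whenever it comes through as NULL (names taken from the question):
INSERT INTO dbo.theTable (attr1, attr2, amount, attr4)
SELECT attr1, attr2, ISNULL(amount, 0), attr4
FROM theView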
If you want to see the NULL values, you have to use a query like
select * from db2.dbo.theTable where amount IS NULL
Have a look at is Null vs =Null to see why you weren't seeing those null records in your previous query.
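A quick illustration of the difference (with the default ANSI_NULLS setting, any comparison with NULL evaluates to UNKNOWN, so = NULL never matches):
-- Returns no rows even when NULLs exist, because amount = NULL is never TRUE
select * from db2.dbo.theTable where amount = NULL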
It is also good to have some logging in place to validate the data when something goes wrong, for example printing the values before inserting them.
I have a SELECT that retrieves rows by comparing a DATETIME field to the highest available value in another table.
The two tables have the following structure:
DeletedRecords
- Id (Guid)
- RecordId (Guid)
- TableName (varchar)
- DeletionDate (datetime)
And another table which keeps track of synchronizations, with the following structure:
SynchronizationLog
- Id (Guid)
- SynchronizationDate (datetime)
In order to get all the records that have been deleted since the last synchronization, I run the following SELECT:
SELECT
[Id],[RecordId],[TableName],[DeletionDate]
FROM
[DeletedRecords]
WHERE
[TableName] = '[dbo].[Person]'
AND [DeletionDate] >
(SELECT TOP 1 [SynchronizationDate]
FROM [dbo].[SynchronizationLog]
ORDER BY [SynchronizationDate] DESC)
The problem occurs if I do not have any synchronizations available yet: the T-SQL SELECT does not return any rows, while it should return all the rows because there are no synchronization records available.
Is there a T-SQL function like COALESCE that I can use with DateTime?
Your subquery should look something like this:
SELECT COALESCE(MAX([SynchronizationDate]), '0001-01-01')
FROM [dbo].[SynchronizationLog]
It says: get the last date, but if there is no record (or all values are NULL), then use '0001-01-01' as the start date.
NOTE: '0001-01-01' is for DATETIME2; if you are using the old DATETIME data type, it should be '1753-01-01'.
Also please note (from https://msdn.microsoft.com/en-us/library/ms187819(v=sql.100).aspx)
Use the time, date, datetime2 and datetimeoffset data types for new work. These types align with the SQL Standard. They are more portable. time, datetime2 and datetimeoffset provide more seconds precision. datetimeoffset provides time zone support for globally deployed applications.
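Putting it together with the query from the question (a sketch, assuming SynchronizationDate is a DATETIME2 so the '0001-01-01' fallback is valid):
SELECT
    [Id],[RecordId],[TableName],[DeletionDate]
FROM
    [DeletedRecords]
WHERE
    [TableName] = '[dbo].[Person]'
    AND [DeletionDate] >
        (SELECT COALESCE(MAX([SynchronizationDate]), '0001-01-01')
         FROM [dbo].[SynchronizationLog])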
EDIT
An alternative solution is to use NOT EXISTS (you have to test whether its performance is better or not):
SELECT
[Id],[RecordId],[TableName],[DeletionDate]
FROM
[DeletedRecords] DR
WHERE
[TableName] = '[dbo].[Person]'
AND NOT EXISTS (
SELECT 1
FROM [dbo].[SynchronizationLog] SL
WHERE DR.[DeletionDate] <= SL.[SynchronizationDate]
)
I'd like to have two columns in a database, one for tracking whether or not the user has submitted something, and another for the timestamp of that submission.
How can I structure the table definition so that the state of these two columns is never inconsistent?
Basically, I'd like the boolean field to be driven by whether or not a SubmittedDate column is null. Here's a snippet of the table definition:
CREATE TABLE SomeSchema.SomeTable
(
...
SubmittedDate datetime NULL,
Submitted bit NOT NULL DEFAULT(0), -- Drive off of SubmittedDate?
...
)
What's the best way to accomplish this?
Thanks!
Use only one column, the DATETIME one. It serves double duty: the column being null means it wasn't submitted, and if a value exists, you also know when.
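If consumers still need a flag, it can be derived in the query (a minimal sketch using the column names from the question):
SELECT SubmittedDate,
       CASE WHEN SubmittedDate IS NULL THEN 0 ELSE 1 END AS Submitted
FROM SomeSchema.SomeTable;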
Use a computed column:
CREATE TABLE SomeSchema.SomeTable
(
...
SubmittedDate datetime NULL,
Submitted as cast(case when SubmittedDate is null then 0 else 1 end as bit)
)
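A quick check that the computed column stays in sync (a sketch; it assumes the elided columns allow these inserts):
INSERT SomeSchema.SomeTable (SubmittedDate) VALUES (GETDATE());
INSERT SomeSchema.SomeTable (SubmittedDate) VALUES (NULL);
-- Submitted is 1 for the first row and 0 for the second
SELECT SubmittedDate, Submitted FROM SomeSchema.SomeTable;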
I am running into an error I am having trouble figuring out.
I have 2 tables and I'm trying to copy data from one to the other (simplified view):
MyTable
-------
ID varchar(11) do not allow nulls
Field01 numeric(6,0) allow nulls
MyTable_Temp
------------
ID varchar(11) do not allow nulls
Field01 numeric(6,0) allow nulls
My query looks like this:
DELETE FROM dbo.MyTable
INSERT INTO dbo.MyTable([ID],[Field01])
SELECT ID, Field01 FROM [dbo].MyTable_Temp WITH (NOLOCK)
However when I run my query it throws this error:
Msg 242, Level 16, State 3, Procedure TRG_MyTable, Line 6
The conversion of a char data type to a datetime data type resulted in an out-of-range datetime value.
If I comment out the Field01 part of the query it runs fine. How can a numeric field throw a datetime error?
It looks to me like you've got some kind of trigger on the destination table that's firing (TRG_MyTable in the error message is a giveaway). It's probably doing something like inserting a timestamped record into an audit table somewhere and is getting confused.
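To confirm, here is a sketch of how you could list the triggers on the destination table and inspect their definitions (it assumes the table lives in the current database):
-- List the triggers defined on dbo.MyTable and show their bodies
SELECT t.name, OBJECT_DEFINITION(t.object_id) AS trigger_definition
FROM sys.triggers AS t
WHERE t.parent_id = OBJECT_ID(N'dbo.MyTable');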