querying SQL Server from Apache Drill and datetimeoffset - sql-server

I am trying to query SQL Server from Apache Drill but I get some issues when the SQL Server tables contain datetimeoffset (SQL Server type) columns.
Any SELECT query from Drill to SQL Server on this kind of table leads to the response:
Error:
VALIDATION ERROR: Unknown SQL type -155
I am certain it comes from the datetimeoffset column in the table, since I tested the same kind of queries on tables with no datetimeoffset columns and obtained satisfying results.
I went through the Drill documentation (e.g. this page: https://drill.apache.org/docs/supported-data-types/) and tried to cast the datetimeoffset column to SQL types that Drill supports, but nothing worked and Apache Drill kept returning the same error.
Do you have any idea how to get through this please?

I can confirm that it is in fact a problem with DATETIMEOFFSET (https://msdn.microsoft.com/en-us/library/bb677267.aspx) not being handled well by the JDBC library.
The only resolution I currently know of is to convert the data on the way out, like so:
,CAST(my_datetimeoffset_col AS DATETIME2) as datetime_local
,CAST(SWITCHOFFSET(my_datetimeoffset_col, 0) AS DATETIME2) as datetime_utc
,DATEPART(TZ, my_datetimeoffset_col) as datetime_tz
You could choose to output only UTC or only local time, but in my experience DATETIMEOFFSET was chosen precisely because both were going to be needed.
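Putting those conversions together, a view on the SQL Server side lets Drill query the data without ever encountering JDBC type -155. This is only a sketch: the table, column, and view names below are placeholders, not from the original question.

```sql
-- Hypothetical view that hides the DATETIMEOFFSET column from Drill.
-- Drill queries the view instead of the base table, so the JDBC
-- driver never has to describe the unsupported type.
CREATE VIEW dbo.my_table_for_drill AS
SELECT
     id
    ,CAST(my_datetimeoffset_col AS DATETIME2)                  AS datetime_local
    ,CAST(SWITCHOFFSET(my_datetimeoffset_col, 0) AS DATETIME2) AS datetime_utc
    ,DATEPART(TZ, my_datetimeoffset_col)                       AS datetime_tz_minutes
FROM dbo.my_table;
```

DATEPART(TZ, ...) returns the offset in minutes, so local time can be reconstructed from the UTC value when needed.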

Related

SQL Server date datatype becomes nvarchar

After so many years, I have become part of a project that uses SQL Server. This time it's SQL Server 2017. I found a very weird behavior when I create a table:
create table test (sampledate date)
If I run the above and check the data type of the column sampledate, it shows nvarchar instead of date. This causes an error in my application.
By the way, I'm using DBVisualizer to check the data type; I don't believe the tool itself is the cause.
Have a look at this section of the Microsoft documentation. It explains that for down-level clients, backward compatibility of the date data type is ensured by converting it to String/Varchar. So it may well come from the DBVisualizer setup after all.
For your app, check the version of the client you use.
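To rule the client out, you can ask SQL Server's own catalog views what type the column really has; `dbo.test` below assumes the table from the question was created in the default schema.

```sql
-- Check the real column type in the server's catalog.
-- If this reports 'date', the nvarchar is a client-side conversion
-- performed by a down-level driver, not a storage problem.
SELECT c.name AS column_name,
       t.name AS type_name
FROM sys.columns AS c
JOIN sys.types   AS t ON t.user_type_id = c.user_type_id
WHERE c.object_id = OBJECT_ID('dbo.test');
```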

AWS SCT : What Redshift data type is equivalent to SQL Server's datetime?

We're trying to migrate our DWH from current SQL Server 2016 to AWS Redshift by AWS SCT (Schema Conversion Tool).
Our SQL Server tables have columns of type datetime (YYYY-MM-DD hh:mm:ss.000).
They are normally converted to Redshift timestamp columns by SCT at the schema level.
But the data copy to Redshift failed in the SCT Data Extraction Agent.
(The extracted data was successfully uploaded to S3.)
I suppose that's due to the datetime type difference, even though I believe Redshift timestamp allows up to six digits of fractional seconds.
If you have any workaround for this, kindly let me know how to convert them without any issues.
Sincerely,
Sachiko
The equivalent column is most likely TIMESTAMP without a time zone, and it will be stored in UTC. The format of the dumped values may also cause problems if it's not one of the formats that Redshift/Postgres expects, though.
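If it turns out to be a format mismatch, two things are worth trying on the Redshift side. This is a sketch only: the bucket path, IAM role ARN, and table names are placeholders.

```sql
-- 1. Let COPY detect the timestamp format, including fractional
--    seconds, instead of relying on the default parsing.
COPY my_schema.my_table
FROM 's3://my-bucket/extract/'
IAM_ROLE 'arn:aws:iam::123456789012:role/my-copy-role'
TIMEFORMAT 'auto';

-- 2. Inspect why rows were rejected on the previous attempt.
SELECT starttime, colname, type, raw_field_value, err_reason
FROM stl_load_errors
ORDER BY starttime DESC
LIMIT 10;
```

The err_reason and raw_field_value columns of stl_load_errors usually make it obvious whether the failure is really a timestamp-format issue.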

Using datetime2 with Informatica

I have a requirement for dates that predate 1753, the minimum for datetime on SQL Server. On the DB side the solution is clear: change to the datetime2 type. But it seems Informatica still treats the column as datetime; I suppose datetime2 is not supported. Is there any workaround that could enable me to insert pre-1753 dates into a datetime2 column?
Informatica version is 9.1.0, on SQL Server 2008.
If upgrading to a newer version of Informatica won't help or isn't an option, then you've got limited options. One option is to CONVERT the datetime2 to a suitable VARCHAR representation on the SQL Server side, then feed that to Informatica and let the implicit casting of the value back to a suitable DATETIME do the hard work for you.
You'll need to be very careful with this, though: if you end up migrating text dates such as 02/03/2015 02:33, and regional settings differ between databases or servers, you could import that as either the 2nd of March or the 3rd of February. It's best to use a culture-invariant date format, such as yyyy-mm-dd - that way it's always unambiguous.
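As a sketch of that CONVERT step (table and column names here are placeholders): style 126 gives the culture-invariant ISO 8601 form, which avoids the day/month ambiguity entirely.

```sql
-- Emit datetime2 as an unambiguous ISO 8601 string for Informatica,
-- e.g. '1650-02-03T02:33:00.0000000'. varchar(27) is long enough for
-- datetime2(7)'s full precision.
SELECT CONVERT(varchar(27), my_datetime2_col, 126) AS my_date_text
FROM dbo.my_source_table;
```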
-Steve

INSERT to Oracle table from SQL Server database link

I have created a database link to a SQL Server 2012 database in Oracle 11gR2 using the Oracle Gateway for SQL Server. The connection is working fine and I am able to run queries and stored procedures against SQL Server.
The problem arises when I try to retrieve an XML column from SQL Server. The Oracle documentation clearly states that if the database is in a UTF character set (AL32UTF8), XML is supported, but in the LONG datatype format.
I am able to query the XML column with
SET LONG 5000;
SELECT "XMLColumn" FROM "xmltable"@sqlserver;
but when I try to insert this into an Oracle table with a LONG column, it gives the following error:
SQL Error: ORA-00997: illegal use of LONG datatype
Is there any workaround for this problem?
I even tried to convert the incoming XML to a CLOB as suggested by Sanders, which makes perfect sense, but somehow that too throws the same error. In the query below, Name is of course the XML column from SQL Server.
CREATE TABLE TEMPCLOB
AS
SELECT TO_LOB("Name") AS "Name" FROM "xmldata"@sqlserver;
LONG is a deprecated datatype in Oracle. You can spend time on this problem, but I'd advise you to convert the value immediately into a CLOB or XMLTYPE and have no problem at all.
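If TO_LOB keeps failing over the gateway link, a PL/SQL fetch can sometimes sidestep it, because PL/SQL implicitly converts a fetched LONG into a VARCHAR2 (up to roughly 32 KB; longer documents would be truncated). This is only a sketch under that assumption; the local staging table name is made up, and the remote names are taken from the question.

```sql
-- Hypothetical local staging table for the converted XML.
CREATE TABLE tempclob_local (xml_clob CLOB);

-- Fetch each remote LONG value into PL/SQL, where it arrives as a
-- VARCHAR2 (truncated beyond ~32760 bytes), then store it as a CLOB.
BEGIN
  FOR r IN (SELECT "Name" FROM "xmldata"@sqlserver) LOOP
    INSERT INTO tempclob_local (xml_clob)
    VALUES (TO_CLOB(r."Name"));
  END LOOP;
  COMMIT;
END;
/
```

Because of the truncation limit this is only a partial workaround; it is fine for small XML documents but not for arbitrarily large ones.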

SQL Server JDBC DATE gets wrong metadata result type

I have a strange problem:
When I create a column in SQL Server 2008 (JDBC Driver 2.0, sqljdbc4.jar, Java 1.6):
create table ctypes (dbms_date DATE NOT NULL, dbms_date_null DATE)
The data type in the database is correctly date.
But when I access the table with a JDBC SELECT,
the metadata says that the type is nvarchar, yet the getDate() function still works.
The problem is that I am programming a framework that generically copies data and must rely on the data type.
I am not a big SQL Server specialist, so maybe some configuration on the SQL Server side is responsible; I left as much at the default values as possible.
The problem was fixed by upgrading to JDBC Driver 3.0.
