Teradata to Snowflake conversion - snowflake-cloud-data-platform

I am working on a Teradata DDL conversion and came across the code below.
How can I convert the following Latin-to-Unicode DDL code from Teradata to Snowflake?
OTRANSLATE(TRANSLATE(acc_name USING LATIN_TO_UNICODE WITH ERROR),'?;*<>"$+()&##''-/\!','') AS deal_name
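A minimal Python sketch of what this expression does may help when porting. The inner TRANSLATE(... USING LATIN_TO_UNICODE WITH ERROR) converts Latin-encoded text to Unicode, which Snowflake does not need because its strings are already UTF-8; the outer OTRANSLATE removes a set of special characters, which maps naturally onto Snowflake's TRANSLATE(acc_name, &lt;chars&gt;, '') with an empty target alphabet. The sketch below mimics only the character-removal step; the character set is copied from the question (the doubled ## and '' in the original appear to be quoting escapes):

```python
# Characters the OTRANSLATE call strips (one copy of each; the doubled
# "##" and "''" in the original appear to be quoting escapes).
SPECIALS = '?;*<>"$+()&#\'-/\\!'

def clean_deal_name(acc_name: str) -> str:
    """Remove every character in SPECIALS, keeping everything else -
    the same effect as OTRANSLATE(acc_name, SPECIALS, '')."""
    return acc_name.translate({ord(c): None for c in SPECIALS})
```

In Snowflake the whole expression would then reduce to a single TRANSLATE(acc_name, ..., '') AS deal_name with that character set as the source alphabet; verify the quoting of the single quote and backslash against your deployment before relying on it.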

Related

Oracle TIMESTAMP(6) convert to SQL Server datetime2 error

I am using SSIS to convert some Oracle data to SQL Server. I found that an Oracle TIMESTAMP(6) value like this:
26-DEC-82 12.00.00.000000000 AM
causes the conversion in SSIS to fail with:
Error: Year, Month, and Day parameters describe an un-representable DateTime
I think it's because SSIS doesn't know whether it's 2082 or 1982, so it doesn't know how to convert. How can I convert the Oracle dates to something with a four-digit (yyyy) year part?
Update: I tried the TO_CHAR function mentioned by Hadi. I can now see that the year is 2682 for most rows. I added a picture showing the TO_CHAR output next to the original plate_date and sold_date columns. As you can see, most of the years are 26xx; the two exceptions are 18xx. Can someone explain?
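On the update: TO_CHAR only displays what is stored, so years printing as 26xx mean those rows genuinely contain century 26 (most likely they were inserted long ago with a mismatched format mask). The windowing that makes a bare '82' resolve to 1982 is Oracle's RR format model, applied on input, e.g. TO_DATE('26-DEC-82', 'DD-MON-RR'). A simplified Python sketch of the RR rule, valid while the current year's last two digits fall in 00-49 (the full Oracle rule also depends on the current year's own half-century):

```python
def rr_century(two_digit_year: int, current_year: int = 2023) -> int:
    """Simplified sketch of Oracle's RR windowing (for current years xx00-xx49):
    two-digit years 00-49 resolve to the current century, 50-99 to the previous."""
    century = current_year - current_year % 100
    if two_digit_year >= 50:
        century -= 100
    return century + two_digit_year
```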
In the Oracle Source use an SQL Command to convert TimeStamp to nvarchar using TO_CHAR() function and use the universal data format yyyy-MM-dd HH:mm:ss:
TO_CHAR(SOURCECOLUMN, 'YYYY-MM-DD HH24:MI:SS')
And in SSIS data flow add a derived column with the following expression:
(DT_DATE)[SOURCECOLUMN]
Or add a data conversion transformation and convert the column to date data type.
In SQL Server, the datatype "timestamp" has nothing to do with dates or times.
Microsoft has renamed its old "timestamp" datatype to "rowversion", because it is just an 8-byte number used to record a sequence of "row changed" events.
Oracle's "timestamp", on the other hand, really is about time: Oracle's "timestamp" extends their "date" datatype.
Unfortunately, SQL Server still recognises "timestamp" as a valid datatype name.
So, I suspect that your error message may have something to do with the timestamp/timestamp homonym.

Insert to CLOB through ODP.NET

I am using ODP.NET and an Oracle database. I have to save data longer than 4000 characters to a CLOB field. When I do this through a simple SQL statement and ExecuteNonQuery, an exception occurs: PLS-00172: string literal too long.
The question is: how can I save this long data?
I cannot use or create procedures (there is no way to get permission). I can only use ODP.NET.
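PLS-00172 means the string literal embedded in the statement text exceeds the literal-length limit; the idiomatic fix is not to build a literal at all but to bind the value as a parameter (in ODP.NET, an OracleParameter with OracleDbType.Clob), which requires no procedure and has no literal-length ceiling. If a literal-only statement were truly unavoidable, the usual workaround is concatenating TO_CLOB() chunks that each stay under the limit. A Python sketch of that chunked-statement builder (the table and column names are placeholders, and binding remains the right answer in real code):

```python
def chunk_for_literals(text: str, limit: int = 4000) -> list[str]:
    """Split a long string into pieces that each fit in one SQL literal."""
    return [text[i:i + limit] for i in range(0, len(text), limit)]

def build_insert(table: str, column: str, text: str) -> str:
    """Build an INSERT whose value is TO_CLOB() chunks joined with ||.
    Illustrative only; real ODP.NET code should bind the CLOB as a parameter."""
    pieces = " || ".join("TO_CLOB('%s')" % c.replace("'", "''")
                         for c in chunk_for_literals(text))
    return f"INSERT INTO {table} ({column}) VALUES ({pieces})"
```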

SSIS: Handling 1/0 Fields in Data Flow Task

I am building a Data Flow Task in an SSIS package that pulls data in from an OLE DB Source (MS Access table), converts data types through a Data Conversion Transformation, then routes that data to an OLE DB Destination (SQL Server table).
I have a number of BIT columns for flag variables in the destination table and am having trouble with truncation when converting these 1/0 columns to (DT_BYTES,1). Converting from DT_WSTR and DT_I4 to (DT_BYTES,1) results in the same truncation, and I have verified that it is happening at that step through the Data Viewer.
It appears that I need to create a derived column similar to what is described in the answers to the question linked below, but instead of converting to DT_BOOL, I need to convert to (DT_BYTES,1), as casting from DT_BOOL to DT_BYTES is apparently illegal?
SSIS Converting a char to a boolean/bit
I have made several attempts at creating a derived column with variations of the logic below, but haven’t had any luck. I am guessing that I need to use Hex literals in the “1 : 0” portion of the expression, but I haven’t been able to find valid syntax for that:
(DT_BYTES,1)([Variable_Name] == (DT_I4)1 ? 1 : 0)
Am I approaching this incorrectly? I can’t be the first person to need to insert BIT data into a SQL Server table, and the process above just seems unnecessarily complex to me.
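For what it's worth, SQL Server BIT columns normally surface in SSIS as DT_BOOL rather than DT_BYTES, so the (DT_BYTES,1) detour may be avoidable entirely. If the destination really does demand a one-byte binary, the truncation likely comes from the difference between the character '1' (byte 0x31) and the value 1 (byte 0x01). A Python sketch of the mapping the derived column would need to perform:

```python
def flag_to_byte(value) -> bytes:
    """Map a 1/0 flag (int or string) to a single byte.
    Note b'\\x01' (the value one) is not b'1' (the character, byte 0x31);
    conflating the two is what produces the truncation described above."""
    return b"\x01" if str(value).strip() == "1" else b"\x00"
```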

SSIS OLE DB Command Date Parameter Format

I am using an OLE DB Command task to execute an INSERT statement. The INSERT statement accepts a number of parameters represented by ?. SSIS maps SQL Server DATETIME columns to parameters of type DT_DBTIMESTAMP which I think is fine.
The INSERT query fails as the DT_DBTIMESTAMP is passed to SQL Server as a string in format 'yyyy-MM-dd hh:mm:ss' but the database is expecting 'dd/MM/yyyy hh:nn:ss'. The error is due to SQL Server treating the 'day' and 'month' parts the wrong way around.
I've seen responses to questions around formatting dates in SSIS using derived columns etc. but I already have the DT_DBTIMESTAMP value (it has no format!) and the problem occurs when SSIS sets the parameter value as a string, and I can't see how to control the format so it outputs in 'dd/MM/yyyy hh:mm:ss'.
I've tried setting LocaleID and language but still seems to make no difference. An interesting observation is that this error does not occur when running through Visual Studio, only from a SQL Agent job.
Any help greatly appreciated.
What errors do you get when running this?
The only way I see to solve this is to convert the DT_DBTIMESTAMP to DATETIME between reading the source file and writing to the SQL table.
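One way to sidestep the locale question entirely is to hand SQL Server a string form it parses identically under every language/DATEFORMAT setting: the ISO 8601 'yyyy-MM-ddTHH:mm:ss' shape (or the unseparated 'yyyyMMdd' for plain dates). A Python sketch of rendering a timestamp that way; the SSIS equivalent would be a derived column building the same string:

```python
from datetime import datetime

def to_unambiguous_literal(dt: datetime) -> str:
    """Render a timestamp in ISO 8601 form, which SQL Server parses
    the same way regardless of login language or DATEFORMAT."""
    return dt.strftime("%Y-%m-%dT%H:%M:%S")
```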

SSIS Package: convert between unicode and non-unicode string data types

I am connecting to an Oracle DB and the connection works, but I get the following error for some of the columns:
Description: Column "RESOURCE_NAME" cannot convert between unicode
and non-unicode string data types.
Value for RESOURCE_NAME:
For Oracle: VARCHAR2(200 BYTE)
For SQL Server: VARCHAR(200 BYTE)
I can connect to the Oracle DB via Oracle SQL Developer without any issues. Also, I have the SSIS package setting Run64BitRuntime = False.
The Oracle data type VARCHAR2 appears to be equivalent to NVARCHAR in SQL Server, or DT_WSTR in SSIS. Reference
You will have to convert using the Data Conversion Transformation, or CAST or CONVERT functions in SQL Server.
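Conceptually, the error is SSIS refusing to silently bridge a non-Unicode column (DT_STR, bytes in some code page) and a Unicode one (DT_WSTR). What the Data Conversion Transformation does can be sketched in Python (cp1252 here is an assumed code page; the real one depends on the source database's NLS settings):

```python
def to_unicode(raw: bytes, code_page: str = "cp1252") -> str:
    """The DT_STR -> DT_WSTR step: decode code-page bytes into Unicode text.
    Picking the wrong code page mangles accented characters, which is why
    SSIS insists the conversion be made explicit."""
    return raw.decode(code_page)
```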
If the package works in one machine and doesn't in other; Try setting the NLS_LANG to right language, territory and character set and test the package.
[Command Prompt]> set NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1
The easiest way around this is to open the SSIS package (the .dtsx file) in Notepad and do a global find-and-replace of all instances of validateExternalMetadata="True" with validateExternalMetadata="False".
note: we encountered this issue when connecting to an Oracle 11g database on Linux through SSIS.
On the OLE DB Source: Advanced Editor -> Input and Output Properties -> Output Columns -> select the RESOURCE_NAME column and change its data type to DT_WSTR; you can also change the length as required.
You can use a SQL command in SSIS and use CONVERT or CAST. If SSIS still gives you an error, it's because of the metadata. Here is how you can fix it:
Open the Advanced Editor.
Under the Input and Output properties, Expand Source Output.
Expand Output columns
Select the column which is causing the issue.
Go to Data Type Properties and change the DataType to your desired type: DT_STR, DT_TEXT, etc.
You can just double-click the "Data Conversion" block in the Data Flow and change every item to "Unicode string [DT_WSTR]". That works.
If everything above failed: create a table variable and insert the data into it, then select all records from it as the source. Use SET NOCOUNT ON in the script.
I encountered a very similar problem even using SQL Server rather than Oracle. In my case, I was using a Flat File as a data source, so I just went into the Flat File Connection Manager and manually changed the column type to be a Unicode string:
I don't know if this would fix your problem or not, but it helped me - hopefully someone else will be helped too. (I was inspired to try that by this previous answer to this question BTW, just to give credit where credit's due).