Type Conversion changes between Local PC and Server environment - sql-server

I am taking some simple data from an SQL table, making a small transformation and converting it to Unicode. Then I output it into an Oracle CHAR(1 byte) field on an Oracle server.
This works without any error on my local PC. I then deploy to the server and it says that
"Column "A" cannot convert between unicode and non-unicode string data types".
After trying several things I threw my hands up in the air and just took out the data conversion to Unicode, and now it is broken and won't run on my PC.
BUT - it now works on the server and is all happy. I've searched and found that others have had this problem, but none seem to find the cause; they just work around it in other ways.
Why can I not have my PC and my server work the same? All tables and data connections are the SAME for both. No change other than execution location.

I got the same issue in my SSIS package and found no solution.
It was simple data, not containing any Unicode characters, and it didn't throw any exception when converted using a SQL query or .NET code... but it throws an exception in SSIS when using the Data Conversion Transformation.
Workarounds
I made a simple workaround to achieve this (you can use it if this error occurs again):
I replaced the Data Conversion component with a Script Component.
I marked the columns I want to convert as input.
For each column I want to convert I created an output column of type DT_WSTR.
In the script, for each column I used the following code (assuming the input is inColumn and the output is outColumn):
If Not Row.inColumn_IsNull AndAlso _
   Not String.IsNullOrEmpty(Row.inColumn) Then
    Row.outColumn = Row.inColumn
Else
    Row.outColumn_IsNull = True
End If
OR
If the source is an OLE DB Source, you can use a casting function in the OLE DB source command.
For example:
SELECT CAST([COLUMN] as NVARCHAR(255)) AS [COLUMN]
FROM ....

Related

Inserting string data into XML column: 'nil=\"true\"'

SQL Server 2017
I'm trying to debug and re-engineer an SSIS package which gets string data from a web service, then inserts it into an SQL table (into a column of type XML).
The input string data looks something like this:
<InvoiceInfo><acceptedDate nil=\"true\" /><.... more stuff...
There is no overall problem with the formation of the data - all terminated correctly and so on. But SQL Server hates this: nil=\"true\". When I run the SQL generated by the SSIS package in SSMS, I get this error:
XML parsing: line 1, character 32, A string literal was expected
There's probably a good reason for this. I don't know much about XML, but AFAIK this "\" escaping doesn't really belong there.
If I do some pre-processing on the input string, replacing "nil=\"true\"" with a zero-length string, it works.
The most bizarre thing is that, when this same SQL is run in an ExecuteSQL task in the SSIS package, it works. I'm completely mystified. It's not a problem with implicit type conversion; if I add an explicit CONVERT and run it in SSMS -
INSERT INTO TheTable(XMLcolumn) VALUES
(CONVERT(XML,'<InvoiceInfo><acceptedDate nil=\"true\" /><.... more stuff...'))
it still fails in the same way.
Two questions:
How can this SQL work from SSIS but fail in SSMS?
What are these "nil=\"true\"" values doing in the XML?
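A minimal sketch of the pre-processing workaround described in the question, assuming the stray \" sequences are JSON-style escaping that leaked into the string from the web service (the table and column names are the ones used in the question; the XML literal is abbreviated):

```sql
-- Strip the bogus nil=\"true\" attributes before the CONVERT,
-- as the question describes; the input literal is shortened here.
INSERT INTO TheTable (XMLcolumn)
VALUES (CONVERT(XML,
    REPLACE('<InvoiceInfo><acceptedDate nil=\"true\" /></InvoiceInfo>',
            ' nil=\"true\"', '')));
```

Note that the self-closing <acceptedDate /> element stays well-formed after the attribute is removed.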

Using SSIS and T-SQL to convert date to dd.mm.yyyy

When I use T-SQL to convert a datetime into dd.mm.yyyy for a CSV output using SSIS, the file is produced with dd-mm-yyyy hh:mm:ss, which is not what I need.
I am using:
convert(varchar,dbo.[RE-TENANCY].[TNCY-START],104)
which appears correct in SSMS.
Which is the best way to handle the conversion to be output from SSIS?
Not as simple as I thought it would be.
It works for me.
Using your query as a framework for driving the package
SELECT
CONVERT(char(10),CURRENT_TIMESTAMP,104) AS DayMonthYearDate
I explicitly declared a length for our dd.mm.yyyy value; since it's always going to be 10 characters, let's use a data type that reflects that.
Run the query and you can see it correctly produces 13.02.2019.
In SSIS, I added an OLE DB Source to the data flow and pasted in my query
I wired up a flat file destination and ran the package. As expected, the string that was generated by the query entered the data flow and landed in the output file as expected.
If you're experiencing otherwise, the first place I'd check is double-clicking the line between your source and the next component and choosing Metadata. Look at what is reported for the tenancy start column. If it doesn't indicate DT_STR/DT_WSTR, then SSIS thinks the data type is a date variant and is applying locale-specific rules to the format. You might also need to check how the column is defined in the flat file connection manager.
The most precise control over a date's output format can be achieved with T-SQL FORMAT(), which is available since SQL Server 2012.
It is slightly slower than CONVERT() but gives the desired flexibility.
An example:
SELECT TOP 4
name,
FORMAT(create_date, 'dd.MM.yyyy') AS create_date
FROM sys.databases;
name     create_date
-------  -----------
master   08.04.2003
tempdb   12.02.2019
model    08.04.2003
msdb     30.04.2016
P.S. Take into account that FORMAT() produces NVARCHAR output, which differs from your initial conversion logic, so an extra cast to VARCHAR(10) may be necessary.
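For instance, a sketch combining that cast with the earlier FORMAT() example:

```sql
-- FORMAT() returns NVARCHAR; cast it back to VARCHAR(10)
-- if the downstream column is non-Unicode.
SELECT CAST(FORMAT(CURRENT_TIMESTAMP, 'dd.MM.yyyy') AS VARCHAR(10)) AS DayMonthYearDate;
```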

SSIS: Handling 1/0 Fields in Data Flow Task

I am building a Data Flow Task in an SSIS package that pulls data in from an OLE DB Source (MS Access table), converts data types through a Data Conversion Transformation, then routes that data to an OLE DB Destination (SQL Server table).
I have a number of BIT columns for flag variables in the destination table and am having trouble with truncation when converting these 1/0 columns to (DT_BYTES,1). Converting from DT_WSTR and DT_I4 to (DT_BYTES,1) results in the same truncation, and I have verified that it is happening at that step through the Data Viewer.
It appears that I need to create a derived column similar to what is described in the answers to the question linked below, but instead of converting to DT_BOOL, I need to convert to (DT_BYTES,1), as casting from DT_BOOL to DT_BYTES is apparently illegal?
SSIS Converting a char to a boolean/bit
I have made several attempts at creating a derived column with variations of the logic below, but haven’t had any luck. I am guessing that I need to use Hex literals in the “1 : 0” portion of the expression, but I haven’t been able to find valid syntax for that:
(DT_BYTES,1)([Variable_Name] == (DT_I4)1 ? 1 : 0)
Am I approaching this incorrectly? I can’t be the first person to need to insert BIT data into a SQL Server table, and the process above just seems unnecessarily complex to me.

SSIS OLE DB Command Date Parameter Format

I am using an OLE DB Command task to execute an INSERT statement. The INSERT statement accepts a number of parameters represented by ?. SSIS maps SQL Server DATETIME columns to parameters of type DT_DBTIMESTAMP which I think is fine.
The INSERT query fails as the DT_DBTIMESTAMP is passed to SQL Server as a string in format 'yyyy-MM-dd hh:mm:ss' but the database is expecting 'dd/MM/yyyy hh:nn:ss'. The error is due to SQL Server treating the 'day' and 'month' parts the wrong way around.
I've seen responses to questions around formatting dates in SSIS using derived columns etc. but I already have the DT_DBTIMESTAMP value (it has no format!) and the problem occurs when SSIS sets the parameter value as a string, and I can't see how to control the format so it outputs in 'dd/MM/yyyy hh:mm:ss'.
I've tried setting LocaleID and language but still seems to make no difference. An interesting observation is that this error does not occur when running through Visual Studio, only from a SQL Agent job.
Any help greatly appreciated.
What errors do you get when running this?
The only way I see to solve this is to convert the DT_DBTIMESTAMP to DATETIME between reading the source file and writing to the SQL table.
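One possible sketch, as an assumption rather than a verified fix: since SET DATEFORMAT controls how SQL Server parses date strings for the session, you could pin it inside the OLE DB Command's statement so the 'yyyy-MM-dd hh:mm:ss' string is read year-first regardless of the Agent account's language. The table and column names here are hypothetical:

```sql
-- Force year-month-day parsing for this batch, then insert;
-- dbo.TargetTable and InvoiceDate are placeholder names.
SET DATEFORMAT ymd;
INSERT INTO dbo.TargetTable (InvoiceDate)
VALUES (?);
```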

SSIS Package: convert between unicode and non-unicode string data types

I am connecting to an Oracle DB and the connection works, but I get the following error for some of the columns:
Description: Column "RESOURCE_NAME" cannot convert between unicode
and non-unicode string data types.
Value for RESOURCE_NAME:
For Oracle: VARCHAR2(200 BYTE)
For SQL Server: VARCHAR(200 BYTE)
I can connect to the Oracle DB via Oracle SQL Developer without any issues. Also, I have the SSIS package setting Run64BitRuntime = False.
The Oracle data type VARCHAR2 appears to be equivalent to NVARCHAR in SQL Server, or DT_WSTR in SSIS.
You will have to convert using the Data Conversion Transformation, or CAST or CONVERT functions in SQL Server.
If the package works on one machine and doesn't on another, try setting NLS_LANG to the right language, territory, and character set, then test the package.
[Command Prompt]> set NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1
The easiest way around this is to open the SSIS package in Notepad (the dtsx file) and do a global find and replace of all instances of validateExternalMetadata="True" with validateExternalMetadata="False".
Note: we encountered this issue when connecting to an Oracle 11g database on Linux through SSIS.
On the OLE DB source, open the Advanced Editor -> Input and Output Properties -> Output Columns -> select the RESOURCE_NAME column and change its data type to DT_WSTR; you can also change the length as required.
You can use a SQL command in SSIS with CONVERT or CAST. If SSIS still gives you an error, it's because of the metadata. Here is how you can fix it:
Open the Advanced Editor.
Under the Input and Output properties, Expand Source Output.
Expand Output columns
Select the column which is causing the issue.
Go to Data Type Properties and change the DataType to your desired type: DT_STR, DT_TEXT, etc.
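As a sketch of the SQL-command approach mentioned above (the source table name is a placeholder; on an Oracle source the equivalent type would be NVARCHAR2):

```sql
-- Cast at the source so SSIS sees the column as Unicode (DT_WSTR).
SELECT CAST(RESOURCE_NAME AS NVARCHAR(200)) AS RESOURCE_NAME
FROM SOURCE_TABLE;  -- placeholder table name
```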
You can just double-click the "Data Conversion" block in the Data Flow and, for every item, change it to "Unicode String [DT_WSTR]".
This works.
If everything above failed: create a table variable and insert the data into it, then select all records from it as the source. Use SET NOCOUNT ON in the script.
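A minimal sketch of that workaround, with placeholder names (@Staging, SOURCE_TABLE) since the answer doesn't give the real ones:

```sql
SET NOCOUNT ON;  -- suppress row-count messages that can confuse SSIS

DECLARE @Staging TABLE (
    RESOURCE_NAME NVARCHAR(200)  -- declare the column with the Unicode type SSIS expects
);

INSERT INTO @Staging (RESOURCE_NAME)
SELECT CAST(RESOURCE_NAME AS NVARCHAR(200))
FROM SOURCE_TABLE;  -- placeholder source table

SELECT RESOURCE_NAME FROM @Staging;  -- use this result set as the SSIS source
```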
I encountered a very similar problem even using SQL Server rather than Oracle. In my case, I was using a Flat File as a data source, so I just went into the Flat File Connection Manager and manually changed the column type to be a Unicode string.
I don't know if this would fix your problem or not, but it helped me - hopefully someone else will be helped too. (I was inspired to try that by this previous answer to this question BTW, just to give credit where credit's due).