Unable to insert rows containing the '#' special character from a CSV file into a SQL database using SSIS packages - sql-server

When I try to import a CSV file into SQL tables using SSIS packages, I am not able to insert the rows that contain the '#' special character.
It is throwing the following error:
'[DCNV - Unicode to NonUnicode [14]] Error: The "DCNV - Unicode to
NonUnicode.Outputs[Data Conversion Output].Columns[PrimaryAddr1]"
failed because truncation occurred, and the truncation row disposition
on "DCNV - Unicode to NonUnicode.Outputs[Data Conversion
Output].Columns[PrimaryAddr1]" specifies failure on truncation. A
truncation error occurred on the specified object of the specified
component.'
Please help me with this issue. How can I allow the '#' special character to be inserted into the database tables? The column related to this issue is the customer address (DT_WSTR).

Check the source as well as the destination data types to make sure you are specifying them correctly - Jet has probably guessed wrong; it only looks at the first few rows to guess what data type your data is.
Also notice the error is actually complaining about a Unicode to non-Unicode conversion - make sure the source/destination lengths are correct and match your data.
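The truncation half of that error can often be confirmed before the package ever runs. Here is a minimal sketch, assuming a column named PrimaryAddr1 and a destination width of 20 (both illustrative, not from the original question), that scans a CSV for values longer than the destination column:

```python
import csv
import io

# Assumed width of the non-Unicode destination column (illustrative only).
DEST_WIDTH = 20

# Stand-in for the real CSV file; the '#' value is quoted because it
# also happens to contain the field delimiter.
sample = io.StringIO(
    "Id,PrimaryAddr1\n"
    "1,12 Short St\n"
    '2,"Apt #4, 1500 Very Long Boulevard"\n'
)

reader = csv.DictReader(sample)
too_long = [
    (row["Id"], row["PrimaryAddr1"])
    for row in reader
    if len(row["PrimaryAddr1"]) > DEST_WIDTH
]
for row_id, value in too_long:
    print(f"Row {row_id} would truncate: {value!r} is {len(value)} chars")
```

Note that in this sketch the '#' itself is harmless; it is the length of the value (32 characters against a 20-character column) that trips the truncation disposition.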

Related

Column "" cannot convert between unicode and non-unicode string data types

I am trying to import data from a flat file into an Azure SQL Database table, and I have a Merge to combine it with another source too. But when I map the fields from the flat file to the Azure SQL Database table, I keep getting an error like:
Column "Location" cannot convert between unicode and non-unicode string data types
Looking at some forums, I tried changing the data type of the field to Unicode string [DT_WSTR], and I even tried string [DT_STR].
The destination field in the Azure SQL database is the Location field.
Can anyone please suggest what I am missing here? Any help is greatly appreciated.
Changing the column data types from the component's Advanced Editor will not solve the problem. If the imported values contain Unicode characters, you cannot convert them to non-Unicode strings, and you will receive the exception above. Before suggesting some solutions, I highly recommend reading this article to learn more about data type conversion in SSIS:
SSIS Data types: Change from the Advanced Editor vs. Data Conversion Transformations
Getting back to your issue, there are several solutions you could try:
Changing the destination column data type (if possible).
Using the Data Conversion transformation, implement error-handling logic where the values that throw exceptions are redirected to a staging table or cleaned up before being re-imported into the destination table. You can refer to the following article: An overview of Error Handling in SSIS packages.
In the Flat File Connection Manager, go to the Advanced tab and change the column data type to DT_STR.
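The failure mode behind this error can be sketched outside SSIS: converting a Unicode value to a single-byte code page only works when every character exists in that code page. A hypothetical illustration, with Windows-1252 standing in for the DT_STR code page:

```python
# DT_WSTR-like Unicode values forced into a DT_STR-like single-byte
# code page (Windows-1252 here).
values = ["Main Street", "Curaçao", "東京"]

results = {}
for v in values:
    try:
        v.encode("cp1252")
        results[v] = "converts"
    except UnicodeEncodeError:
        results[v] = "cannot convert between unicode and non-unicode"

for v, outcome in results.items():
    print(f"{v!r}: {outcome}")
```

The first two values survive because every character has a cp1252 code point; the Japanese value is the kind of data that forces either a Unicode destination column (nvarchar) or one of the workarounds above.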

SSIS: Handling 1/0 Fields in Data Flow Task

I am building a Data Flow Task in an SSIS package that pulls data in from an OLE DB Source (MS Access table), converts data types through a Data Conversion Transformation, then routes that data to an OLE DB Destination (SQL Server table).
I have a number of BIT columns for flag variables in the destination table and am having trouble with truncation when converting these 1/0 columns to (DT_BYTES,1). Converting from DT_WSTR and DT_I4 to (DT_BYTES,1) results in the same truncation, and I have verified that it is happening at that step through the Data Viewer.
It appears that I need to create a derived column similar to what is described in the answers to the question linked below, but instead of converting to DT_BOOL, I need to convert to (DT_BYTES,1), as casting from DT_BOOL to DT_BYTES is apparently illegal?
SSIS Converting a char to a boolean/bit
I have made several attempts at creating a derived column with variations of the logic below, but haven't had any luck. I am guessing that I need to use hex literals in the "1 : 0" portion of the expression, but I haven't been able to find valid syntax for that:
(DT_BYTES,1)([Variable_Name] == (DT_I4)1 ? 1 : 0)
Am I approaching this incorrectly? I can’t be the first person to need to insert BIT data into a SQL Server table, and the process above just seems unnecessarily complex to me.
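For what it's worth, the mapping the derived column is trying to express is tiny when written outside the SSIS expression language. A sketch (the function name and the strict "1"/"0" input are assumptions for illustration):

```python
def flag_to_byte(value: str) -> bytes:
    """Rough analogue of (DT_BYTES,1)(value == "1" ? 0x01 : 0x00)."""
    # Emit a single raw byte, not the character "1" or "0".
    return b"\x01" if value.strip() == "1" else b"\x00"

print(flag_to_byte("1"))  # b'\x01'
print(flag_to_byte("0"))  # b'\x00'
```

The point of the sketch is that the target of the conversion is a single byte 0x01 or 0x00, not the text characters "1"/"0"; whatever expression syntax is used, it has to produce exactly one byte for a (DT_BYTES,1) column.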

SSIS (ASCII needed): "Code page is 1252 and is required to be 20127"

I have a requirement to export a database to a tab-delimited file in the ASCII format. I am using derived columns to convert any Unicode strings to non-Unicode strings. For example, a former Unicode text stream is now cast like this:
(DT_TEXT,20127)incomingMessage
But SSIS is still looking for ANSI. I am still seeing an error at the Flat File Destination:
The code page on input column <column_name> is 1252 and is required to be 20127.
This happens for any column in the table, not just Unicode ones.
This is what I have been doing to ensure ASCII is used:
In the Flat File Connection Manager, used Code page "20127 (US-ASCII)"
Used a Derived Column to cast data types
In the OLE DB source, set the default code page to 20127
Any thoughts?
How about using the Data Conversion transformation? Connect the flat file source to the Data Conversion and then change the metadata on the fly to suit your needs. You should be able to delete the Derived Column task if you change the metadata to handle the Unicode issues in the Data Conversion task. Then you can process the records accordingly through the OLE DB components without issues.
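The gap between the two code pages is easy to demonstrate: Windows-1252 accepts accented and typographic characters that strict US-ASCII (code page 20127) rejects, so data that passed as ANSI can still fail an ASCII destination. A small illustration (sample strings are made up):

```python
samples = ["plain text", "café", "en \u2013 dash"]

for s in samples:
    in_1252 = True
    try:
        s.encode("cp1252")
    except UnicodeEncodeError:
        in_1252 = False
    in_ascii = True
    try:
        s.encode("ascii")  # US-ASCII == code page 20127
    except UnicodeEncodeError:
        in_ascii = False
    print(f"{s!r}: cp1252={in_1252}, us-ascii={in_ascii}")
```

The accented 'é' and the en dash are valid in 1252 but not in 20127, which is why every column has to be genuinely ASCII-clean before the 20127 flat file destination accepts it.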

How to prevent SSIS from truncating the last field of the last data row in a flat file?

I have an SSIS package that unzips and loads a text file. It has been working great from the debugger and from the various servers it has been uploaded to on its way to our production environment.
My problem right now is this: a file was being loaded, everything was going great, but all of a sudden, on the very last data row (according to the error message), the last field was truncated. I assumed the file we received was probably messed up, cracked it open, and everything was good there....
It's a |-delimited file with no text qualifier and {CR}{LF} as the row delimiter. Since the field with the truncation error is the last field in the row (and in this case the last field of the entire file), its delimiter is {CR}{LF} rather than |.
The file looks pristine, and I've even loaded it into Excel with no issues and no complaints. I have run this file through my local machine, running the package via the debugger in VS 2008, and it ran perfectly. Has anybody had any issues with behavior like this at all? I can't test it much in the environment where it's crashing, because it is our production environment and these are peak hours.... so any advice is GREATLY appreciated.
Error message:
Description: Data conversion failed. The data conversion for column "ACD_Flag" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.". End Error Error: 2013-02-01 01:32:06.32 Code: 0xC020902A Source: Load ACD file into Table HDS Flat File 1 [9] Description: The "output column "ACD_Flag" (1040)" failed because truncation occurred, and the truncation row disposition on "output column "ACD_Flag" (1040)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component. End Error Error: 2013-02-01 01:32:06.32 Code: 0xC0202092 Source: Load ACD file into Table [9] Description: An error occurred while processing file "MY FLAT FILE" on data row 737541.
737541 is the last row in the file.
Update: originally I had the row delimiter {CR}, but I have updated it to {CR}{LF} to attempt to fix this issue... although to no avail.
Update:
I am able to recreate the error message that you added to your question. The error happens when a line contains more column delimiters than you have defined in the Flat File Connection Manager.
Here is a simple example to illustrate it. I created a simple file as shown below.
I created a package and configured the Flat File Connection Manager with the settings shown below.
I configured the package with a Data Flow task to read the file and populate the data into a database table. When I executed the package, it failed.
I clicked the Execution Results tab in BIDS. It displays the same message that you posted in your question.
[Flat File Source [44]] Error: Data conversion failed. The data conversion for column "Column 1" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
[Flat File Source [44]] Error: The "output column "Column 1" (128)" failed because truncation occurred, and the truncation row disposition on "output column "Column 1" (128)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
[Flat File Source [44]] Error: An error occurred while processing file "C:\temp\FlatFile.txt" on data row 2.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "Flat File Source" (44) returned error code 0xC0202092. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
Hope it helps to identify your problem.
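The extra-delimiter condition described above is cheap to check for before loading. A sketch (the file content and column names are made up; a real check would read the flat file from disk):

```python
lines = [
    "Id|Name|ACD_Flag",  # header defines 3 columns
    "1|Alice|Y",
    "2|Bo|b|Y",          # stray | -> 4 fields, triggers the error
    "3|Carol|N",
]

expected = lines[0].count("|") + 1
bad = [
    (n, line)
    for n, line in enumerate(lines[1:], start=2)
    if len(line.split("|")) != expected
]
for n, line in bad:
    print(f"Line {n}: expected {expected} fields, got {len(line.split('|'))}")
```

Running something like this against the failing file would point straight at any line whose field count disagrees with the connection manager's column definition.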
Previous answer:
I think that the value in the last field on the last row of your file probably exceeded the value of OutputColumnWidth property of the last column on the Flat File Connection Manager.
Right-click the Flat File Connection Manager on your SSIS package. Click Advanced tab page on the Flat File Connection Manager Editor. Click on the last column and check the value on the OutputColumnWidth property.
Now, verify the length of data on the last field of the last row in the file that is causing your package to fail.
If that is cause of the problem, here are two possible options to fix this:
Increase the OutputColumnWidth property on the last column to an appropriate length that meets your requirements.
If you do not care about truncation warnings, you can change the truncation error output on the last column in the Flat File Source Editor. Double-click the Flat File source, click Error Output, and change the Truncation column value to either Ignore failure or Redirect row. I prefer Redirect row because it gives you the ability to track data issues in the incoming file by redirecting the invalid rows to a separate table and taking the necessary actions to fix the data.
Hope that gives you an idea to resolve your problem.
So I've come up with an answer. The other answers are extremely well thought out and good, but I solved this using a slightly different technique.
I had all but eliminated the actual possibility of truncation, because once I looked into the data in the flat file, it just didn't make sense... truncation could definitely NOT be occurring. So I decided to focus on the second half of the error message: or one or more characters had no match in the target code page.
After some intense Googling I found a few sites like this one: http://social.msdn.microsoft.com/Forums/en-US/sqlintegrationservices/thread/6d4eb033-2c45-47e4-9e29-f20214122dd3/
Basically the idea is that if you know truncation isn't happening, you have characters without a code page match, so a switch from 1252 ANSI Latin I to 65001 UTF-8 should make a difference.
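The "no match in the target code page" half of the message can be reproduced directly: some byte sequences are valid UTF-8 but contain bytes with no mapping at all in 1252, so a source configured as 1252 chokes on them while 65001 reads them fine. A sketch with an assumed sample character:

```python
# Bytes of a character that exists under UTF-8 (code page 65001) but
# whose middle byte (0x9D) is undefined in Windows-1252.
data = "\u2764".encode("utf-8")  # e2 9d a4

decoded_utf8 = data.decode("utf-8")  # fine under 65001
try:
    data.decode("cp1252")
    mapped_in_1252 = True
except UnicodeDecodeError:
    mapped_in_1252 = False  # "no match in the target code page"

print(decoded_utf8, mapped_in_1252)
```

This is the mechanism behind the 1252-to-65001 switch: the bytes in the file never change, only the code page the source uses to interpret them.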
Since this has been moved to production, and the production environment is the only environment having this issue, I wanted to be 100% sure I had the correct fix in, so I made one more change. I had no text qualifier, but SSIS still sets the default TextQualified property for each column in the Flat File Connection Manager to TRUE. I set ALL of them to false (not just the column in question). So now the package doesn't see that it needs a qualifier, go to the qualifier, find <none>, and then not look for a qualifier... it just flat out doesn't use a qualifier, period.
Between these two changes the package finally ran successfully. Since both changes were done in the same release, and I've only received this error in production, and I can't afford to switch different things back and forth for experimental purposes, I can't speak to which change finally did it, but I can tell you those were the only two changes I made.
One thing to note: the production machine running this package is: 10.50.1617 and my machine I am developing on (and most of the machines I am testing on) are: 10.50.4000. I've raised this as a possible issue with our Ops DBA and hopefully we'll get everything consistent.
Hopefully this will help anybody else who has a similar issue. If anybody would like additional information or details (I feel as if I've covered everything), please just comment here and let me know. I will gladly update this to make it more helpful for anybody coming along in the future.
It only happens on the one server? And you aren't using a text qualifier? We have had this happen before. This is what fixed it.
Go to that server and open the XML file. Search for TextQualifier and see if it says:
<DTS:Property DTS:Name="TextQualifier" xml:space="preserve"><none></DTS:Property>
If it doesn't, make it say that.
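If you have many packages to check, the property can be inspected programmatically. A sketch, assuming the DTS namespace used in .dtsx files and inlining the fragment above instead of reading a package from disk:

```python
import xml.etree.ElementTree as ET

DTS_NS = "www.microsoft.com/SqlServer/Dts"  # namespace used by .dtsx files

# Inline stand-in for a fragment read out of a real package file.
fragment = (
    f'<DTS:Property xmlns:DTS="{DTS_NS}" DTS:Name="TextQualifier" '
    'xml:space="preserve">&lt;none&gt;</DTS:Property>'
)
prop = ET.fromstring(fragment)
print(prop.get(f"{{{DTS_NS}}}Name"), "=", prop.text)
```

A loop over the Property elements of each package would flag any connection manager whose TextQualifier is not the literal &lt;none&gt;.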
I had the exact same error. My source text file contained Unicode characters, and I solved it by saving the text file using Unicode encoding (instead of the default UTF-8 encoding) and checking the Unicode checkbox in the Data Source dialog.
Just follow these simple steps.
1. Right-click the OLE DB source or destination object, and then click Show Advanced Editor….
2. On the Advanced Editor screen, click the Component Properties page.
3. Set AlwaysUseDefaultCodePage to True.
4. Click OK.
Clicking OK saves the settings for use with the current OLE DB source or destination object within the SSIS package.
I know this is a whole year later, but when I opened the Flat File Connection Manager, the text qualifier read "_x003C_none_x003E_". I replaced that escaped-hex garbage with the literal <none> it should be (_x003C_ and _x003E_ are the XML-escaped angle brackets), and it stopped dropping the last row of the file.
The steps below may help you solve your problem.
1. Open the Advanced Editor by right-clicking the source.
2. Click Component Properties.
3. Set AlwaysUseDefaultCodePage to TRUE.
4. Save the changes.

What happens to ignored rows when Ignore failure is selected in SSIS?

There are three components in the data flow: an OLE DB source, a Data Conversion transformation, and a Flat File destination. If I select the Ignore failure option in the Data Conversion transformation and some rows get ignored at that data conversion level, will those ignored rows move on to the target? Or where can I get those ignored rows? Will they be available in the log file?
What happens to the erroneous rows when I select Fail component as the option?
The standard behavior when you select Ignore failure seems to be to ignore the offending values, not the rows that contain them. I just did a quick test (SQL Server 2008 R2), and the rows are imported with NULL in place of the offending values.
You can find a very helpful page with more information on error handling in SSIS Data Flow tasks here. An overview from that page of the relevant error handling options and what they mean:
Fail Component:
The Data Flow task fails when an error or a truncation occurs. Failure is the default option for an error and a truncation.
Ignore Failure:
The error or the truncation is ignored and the data row is directed to the output of the transformation or source.
Redirect Row:
The error or the truncation data row is directed to the error output of the source, transformation, or destination.
As of SQL Server 2014, if you set the Truncation Row Disposition to "RD_IgnoreFailure", the column data will be truncated to the specified length and inserted into the destination. For example, if the column size is 50 and the data length is 70, the first 50 characters will be inserted into the destination.
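The truncation behaviour described above amounts to a silent prefix cut. A one-line sketch (50 and 70 are the example sizes from that answer):

```python
COLUMN_WIDTH = 50  # declared destination column size

value = "x" * 70  # incoming value, 20 characters too long
inserted = value[:COLUMN_WIDTH]  # what ignore-failure handling keeps

print(len(inserted))
```

Everything past the declared width is simply discarded, with no warning row anywhere, which is why Redirect row is usually the safer disposition if the dropped tail might matter.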
Why did you tag both SQL Server 2000 and 2008? It's different in each version; even from 2005 to 2008 it changes. In 2008 you can, for example, drag the failed rows to another OLE DB destination and insert them into a failed-log table.
