Potential Loss of Data reading from CSV with decimal - sql-server

I have read a large number of questions and answers on this and I still can't get it to work.
I have a csv like the following:
Field1;Field2;Field3
CCC;DDD;0.03464
EEE;FFF;0.08432
...
When I attach a Flat File Source in SSIS, it gives me the following:
[Sample CSV [2]] Error: Data conversion failed. The data conversion
for column "Field3" returned status value 2 and status text "The value
could not be converted because of a potential loss of data.".
I have already changed the output column to DT_DECIMAL, with a scale of 5, in the advanced properties, but I still get the same error.
Any clue on this?
It seems like a simple solution that I am somehow overlooking.
Thanks!

There are many values that cannot be converted to DT_DECIMAL. You can detect the values that cause this error by using the Flat File Source's error output, which redirects the rows that cause errors while the data is being loaded.
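As a side note, one common culprit with semicolon-delimited files is the decimal separator: if the package locale expects a comma, values such as 0.03464 will not convert to DT_DECIMAL. A quick way to check suspect values outside SSIS is a small C# test along these lines (the sample values are made up; decimal.TryParse with CultureInfo.InvariantCulture expects a dot as the separator):
using System;
using System.Globalization;

class DecimalCheck
{
    static void Main()
    {
        // Made-up sample values; replace with strings taken from the failing column.
        string[] samples = { "0.03464", "0.08432", "0,08432", "N/A" };

        foreach (string s in samples)
        {
            decimal value;
            // InvariantCulture expects "." as the decimal separator,
            // matching the values shown in the CSV above.
            bool ok = decimal.TryParse(s, NumberStyles.Number,
                                       CultureInfo.InvariantCulture, out value);
            Console.WriteLine(ok
                ? s + " -> " + value
                : s + " -> cannot be converted (this row would be redirected by the error output)");
        }
    }
}
Any value that fails here is a candidate for the rows the error output will catch.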
Helpful Links
ERROR HANDLING IN SSIS WITH AN EXAMPLE STEP BY STEP
SSIS error when loading data from flat files

Related

Text Was Truncated or One or More Characters Has No Match in the Target Code Page

For the life of me, I cannot seem to get past the following error:
Error: 0xC020901C at Import Data - APA, APA Workbook [2]: There was an error with APA Workbook.Outputs[Excel Source Output].Columns[Just] on APA Workbook.Outputs[Excel Source Output]. The column status returned was: "Text was truncated or one or more characters had no match in the target code page.".
Error: 0xC020902A at Import Data - APA, APA Workbook [2]: The "APA Workbook.Outputs[Excel Source Output].Columns[Just]" failed because truncation occurred, and the truncation row disposition on "APA Workbook.Outputs[Excel Source Output].Columns[Just]" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
I have an SSIS package that is trying to load data from an Excel file into a SQL Server table. I understand SSIS takes a "Snapshot" of the data and uses this to build the column sizes. My database column for this column is: nvarchar(512).
So some things I have done to try and rectify this are as follows:
Added "IMEX=1" to the extended properties of the Excel Connection string
Created an Excel file with 10 rows and each row has 512 characters in this "Just" column so that SSIS will recognize the size
Went into the Advanced Editor for the Source, then "Input and Output Properties". Then went to the Just column, changed DataType to "Unicode String [DT_WSTR]", and changed the Length to 512
After I did the above, I ran the code and the 10 rows of data were imported with no issue. But when I run it against the real Excel file, the error appears again.
I have found that if I add a column with the character length of this field and sort by it, largest first, the package works. But if the file is left as sent by the user, it errors out.
I would appreciate any help on how to solve this, as all of my Google searches state the above would work, but unfortunately it does not.
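For reference, the IMEX=1 change mentioned in the first point goes into the Extended Properties of the Excel connection string; assuming the ACE OLE DB provider and an .xlsx file (the path below is a placeholder), it looks roughly like this:
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data\Workbook.xlsx;Extended Properties="Excel 12.0 Xml;HDR=YES;IMEX=1";
Note that the Excel driver still guesses column types and lengths from only the first few rows by default (the TypeGuessRows setting), which is why a long value deeper in the real file can trigger truncation even when a small test file loads cleanly.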

Importing CSV in SSIS: truncation error

I am importing a CSV file in SSIS that has many string columns. I have set the column widths to more than the maximum length, but I am still getting the errors below:
[Input CSV File [114]] Error: Data conversion failed. The data
conversion for column "Functionality" returned status value 4 and
status text "Text was truncated or one or more characters had no match
in the target code page.".
[Input CSV File [114]] Error: The "Input CSV File.Outputs[Flat File
Source Output].Columns[Functionality]" failed because truncation
occurred, and the truncation row disposition on "Input CSV
File.Outputs[Flat File Source Output].Columns[Functionality]"
specifies failure on truncation. A truncation error occurred on the
specified object of the specified component.
[Input CSV File [114]] Error: An error occurred while processing file
"D:\Prateek\SSIS_UB_PWS\January.csv" on data row 236.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The
PrimeOutput method on Input CSV File returned error code 0xC0202092.
The component returned a failure code when the pipeline engine called
PrimeOutput(). The meaning of the failure code is defined by the
component, but the error is fatal and the pipeline stopped executing.
There may be error messages posted before this with more information
about the failure.
As a workaround, I have set the widths to 500 or 1000 and it now lets me continue, but the actual lengths are only in the double digits.
Kindly suggest what the possible cause could be.
Check the value of the 'Functionality' column at row number 236, and then verify what is allowed. In the Advanced Editor of the source you can increase the length (if there are no special characters) if you are loading the data into a table.
A truncation warning appears when your source column length is bigger than the destination column length, so the source value is truncated to fit its destination. Could you share the length of the source column and the length of the destination column?
I found the error; sorry for my poor understanding of the issue.
Actually, one of the columns has multiple commas (,) in its data, and those were spilling into other columns (the text qualifier was set to <none>). Hence I was getting bigger values than expected in the other columns as well.
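To illustrate the failure mode with a made-up row (column names other than Functionality are invented): without a text qualifier, every comma is treated as a column delimiter, so the fields after it shift and later columns pick up text that belongs elsewhere, which is how an apparently short column can suddenly exceed its defined width.
ID,Functionality,Comments
236,Export, import and sync,OK          <- parsed as five fields instead of three
236,"Export, import and sync",OK        <- with the text qualifier set to " it parses as three fields
Setting the connection manager's text qualifier to a double quote (and quoting such fields in the source file) keeps the embedded commas inside a single column.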
Thanks for your help!

SSIS Package error- SSIS Error Code DTS_E_PROCESSINPUTFAILED

SSIS job has failed and posting the below error
[Product Sales [749]] Error: An exception has occurred during data insertion, the message returned from the provider is: The given value of type String from the data source cannot be converted to type float of the specified target column.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Product Sales" (749) failed with error code 0xC020844B while processing input "ADO NET Destination Input" (752). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
Can someone please advise if you have come across this kind of error?
Thank you
Your error message is explaining the issue to you: "The given value of type String from the data source cannot be converted to type float of the specified target column."
Open the component that is failing and review the metadata. You have a float column somewhere and you are passing this column a string that can't be converted to a float, such as empty space or an alphanumeric value.
If you want to ensure these values are floats, you can add a script component above the one that is failing and write some code to ensure the value is properly sanitized:
string input = "1.1"; //Replace with your input buffer value
float result;
float.TryParse(input, out result); //Result = 0.0 if value was not parsed
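Expanded into the Script Component context, a minimal sketch might look like the following; SalesAmount (DT_WSTR input) and SalesAmountFloat (DT_R4 output) are hypothetical column names, not taken from the question:
// Inside the Script Component placed just before the failing destination.
// The default input "Input 0" generates the Input0Buffer type automatically.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    float result;
    if (!Row.SalesAmount_IsNull &&
        float.TryParse(Row.SalesAmount.Trim(), out result))
    {
        Row.SalesAmountFloat = result;        // clean value, pass it through
    }
    else
    {
        Row.SalesAmountFloat_IsNull = true;   // blank or non-numeric: send NULL instead of failing the buffer
    }
}
Sending NULL (or 0, as in the snippet above) for unparsable values keeps the destination from rejecting the whole buffer and makes the bad rows easy to find afterwards.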
Please add a Data Conversion transformation between the source and destination to change the data type from string to float; it should resolve your issue.
If you are still facing the issue, let me know the exact problem: what the source is and which SSIS task you are using.
Use an OLE DB source and destination instead of ODC, try to reduce the column name lengths, avoid parentheses in column names, and use the "Table or view - fast load" access mode; this should solve it. I had the same problem when loading from an Analysis Services cube through a DAX query into a SQL table on my local machine.

SSIS Blank Int Flat File Fail on Load

I have an SSIS package I am using to load a Fixed Width flat file. I have put in all the column lengths and have two packages working correctly against similar files. The third, however, keeps throwing the following error:
[Source 1 [16860]] Error: Data conversion failed. The data conversion for column "Line Number"
returned status value 2 and status text "The value could not be converted because of a
potential loss of data.".
[Source 1 [16860]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.
The "output column "Line Number" (16957)" failed because error code 0xC0209084
occurred, and the error row disposition on "output column "Line Number" (16957)"
specifies failure on error. An error occurred on the specified object of the specified
component. There may be error messages posted before this with more information about
the failure.
After doing some testing, this happens for any column that uses the DT_I4 data type and has a blank in it. I was going to try a derived column, but this seems to fail for some of the columns even if I change them to a string data type to handle the blank as a NULL and then convert to an INT later in the data flow.
In the source and the destination tasks I have the "Retain null values" checkbox ticked; however, this hasn't changed anything.
Any suggestions on handling this error where INT seems to fail at converting a blank to a NULL?
DT_I4 maps to a four byte signed integer in SSIS.
You were on the right track with your derived column. You just need to add the right expression.
You can try this expression:
ISNULL([Line Number]) ? "0":[Line Number]
This link may also be of use - see the postcode column in the example
http://www.bidn.com/blogs/DonnyJohns/ssas/1919/handling-null-or-implied-null-values-in-an-ssis-derived-column
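Since the failing values here are blank strings rather than true NULLs, a variant of that expression which also catches empty or whitespace-only values could look like this (a sketch in SSIS expression syntax; it assumes the column arrives as a Unicode string, DT_WSTR, and maps blanks to NULL rather than 0):
TRIM([Line Number]) == "" ? NULL(DT_I4) : (DT_I4)TRIM([Line Number])
A DT_STR column would need a (DT_WSTR) cast before TRIM can be applied.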
I ended up using the approach from this blog post:
http://www.proactivespeaks.com/2012/04/02/ssis-transform-all-string-columns-in-a-data-flow-stream/
to handle all of the null and blank columns via a script task and data conversion.

Why when I try to output a query result to CSV in SSIS instead of decimal numbers I get ones and zeros?

I have created a very simple Data Flow in SSIS that is run inside a loop.
IMAGE 1 http://img407.imageshack.us/img407/1553/step1f.jpg
I have a simple OLE DB Source control which is connecting to a SQL Server and running quite a complex query to split daily data by 30 minute intervals as shown below.
IMAGE 2 http://img168.imageshack.us/img168/857/step2vs.jpg
I then have a Flat File Destination control which takes the output from the OLE DB Source and saves it as a comma-delimited CSV file. As you can see above, the numbers are decimals to two decimal places, but in the CSV file below they show up as ones and zeros.
IMAGE 3 http://img341.imageshack.us/img341/5494/step3w.jpg
What can I do to get the CSV output to match the query input? I have tried converting the numbers to varchar in the query, but I got the same result. I also tried changing the column types in the Connection Manager, but again got the same result.
I managed to resolve this issue by changing the DataType property for each column of data I was importing. I had to change them to "double-precision float [DT_R8]", and then the CSV was saved with the proper decimal values.
Very annoying, but I hope that helps someone.
IMAGE 4 http://img687.imageshack.us/img687/3749/step4dp.jpg
