I have Microsoft's Contoso dataset. My goal is to fill tables in my database from this dataset, which is a flat file.
More specifically, the flat file contains these three columns of interest:
CurrencyLabel
CurrencyName
CurrencyDescription
My goal is to remove duplicates and insert the unique results into my table, which has these columns:
1. CurrencyKey
2. CurrencyLabel
3. CurrencyName
4. CurrencyDescription
The CurrencyKey column is the primary key of the table, and it should be filled automatically with new keys when the results are loaded through SSIS.
In SSIS, I have made a control flow which contains the data flow.
In the data flow I have the flat file source, then an Aggregate transformation that groups by all three columns (CurrencyLabel, CurrencyName, CurrencyDescription), and then an OLE DB destination pointing to my table.
The goal is to fill all four columns, with CurrencyKey increasing automatically for each unique row coming out of the aggregate.
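For reference, a minimal sketch of what the target table could look like if the key is meant to be generated by the database itself; the IDENTITY property and the column sizes here are assumptions for illustration, not the actual DimCurrency definition:
CREATE TABLE dbo.DimCurrency (
    CurrencyKey         int IDENTITY(1,1) NOT NULL PRIMARY KEY,  -- assigned by SQL Server on insert
    CurrencyLabel       nvarchar(10)  NOT NULL,
    CurrencyName        nvarchar(50)  NOT NULL,
    CurrencyDescription nvarchar(100) NULL
);
With a definition like this, the OLE DB destination would only need the three flat-file columns mapped and CurrencyKey left unmapped.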
The columns of the flat file and its first two rows:
OnlineSalesKey|SalesQuantity|SalesAmount|TotalCost|UnitCost|UnitPrice|CustomerLabel|Title|FirstName|MiddleName|LastName|BirthDate|MaritalStatus|Gender|EmailAddress|YearlyIncome|TotalChildren|NumberChildrenAtHome|Education|Occupation|HouseOwnerFlag|NumberCarsOwned|Phone|CurrencyLabel|CurrencyName|CurrencyDescription|IsWorkDay|IsHoliday|HolidayName|EuropeSeason|NorthAmericaSeason|AsiaSeason|ProductLabel|ProductName|ProductDescription|Manufacturer|BrandName|ColorName|Size|Weight|ProductCategoryLabel|ProductCategoryName|ProductCategoryDescription|StoreManager|StoreType|StoreName|StoreDescription|Status|OpenDate|CloseDate|StorePhone|StoreFax|Customer_address|store_address
19609697|1|313.52|180.22|180.22|391.9|18157||Isabella||Hall|1953-03-06|S|F|isabella80#adventure-works.com|90000|3|2|Partial College|Professional|1|1|1 (11) 500 555-0181|001|USD|US Dollar|WorkDay|0|None|Holiday|Spring/Back to Business|Holiday|0402095|Fabrikam SLR Camera M150 Orange|Digital camera ñ SLR, 5.2 in x 2.8 in x 3.7 in, 19.2 oz|Fabrikam, Inc.|Fabrikam|Orange|3.5 x 5 x 3.3 |5.25|04|Cameras and camcorders |Cameras and camcorders |246|Online|Contoso Europe Online Store|Contoso Europe Online Store|On|2004-09-03 00:00:00||731-555-0117|731-555-0117|Bundesallee 4422|Downtown Berlin, Germany
19609733|1|166.4|95.65|95.65|208|14866||Hailey|R|Peterson|1970-10-21|S|F|hailey14#adventure-works.com|40000|2|2|Partial College|Clerical|1|2|1 (11) 500 555-0123|001|USD|US Dollar|WorkDay|0|None|Holiday|Spring/Back to Business|Holiday|0504005|The Phone Company Smart phones 160x160 M26 Black|Smart phones 160x160, AC adapter, stylus, protective cover, installation CD-ROM, application manual|The Phone Company|The Phone Company|Black|4.5 x 3.1 x 0.6|4.5999999999999996|05|Cell phones|Cell phones|246|Online|Contoso Europe Online Store|Contoso Europe Online Store|On|2004-09-03 00:00:00||731-555-0117|731-555-0117|Pflugstr 24|Downtown Berlin, Germany
You can see that the currency fields in these rows are:
001|USD|US Dollar
1. CurrencyLabel
2. CurrencyName
3. CurrencyDescription
(Screenshots: my table, my data flow, the Aggregate group by settings, and the mappings on the OLE DB destination.)
Right now nothing seems to be written to my table, and the SSIS package fails with this output:
Information: 0x4004300A at Fill Currency Table, SSIS.Pipeline:
Validation phase is beginning.
Information: 0x4004300A at Fill Currency Table, SSIS.Pipeline:
Validation phase is beginning.
Warning: 0x80049304 at Fill Currency Table, SSIS.Pipeline: Warning:
Could not open global shared memory to communicate with performance
DLL; data flow performance counters are not available. To resolve, run
this package as an administrator, or on the system's console.
Information: 0x40043006 at Fill Currency Table, SSIS.Pipeline: Prepare
for Execute phase is beginning.
Information: 0x40043007 at Fill Currency Table, SSIS.Pipeline: Pre-
Execute phase is beginning.
Information: 0x402090DC at Fill Currency Table, Flat File Source [30]:
The processing of file "C:\Users\george\Desktop\Data_201813_10M.txt"
has started.
Information: 0x4004300C at Fill Currency Table, SSIS.Pipeline: Execute
phase is beginning.
Information: 0x402090DE at Fill Currency Table, Flat File Source [30]:
The total number of data rows processed for file
"C:\Users\george\Desktop\Data_201813_10M.txt" is 10000001.
Error: 0xC0202009 at Fill Currency Table, OLE DB Destination [102]:
SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error
code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Server Native
Client 11.0" Hresult: 0x80004005 Description: "The statement has been
terminated.".
An OLE DB record is available. Source: "Microsoft SQL Server Native
Client 11.0" Hresult: 0x80004005 Description: "Cannot insert the
value NULL into column 'CurrencyKey', table
'DataWarehouse.dbo.DimCurrency'; column does not allow nulls. INSERT
fails.".
Error: 0xC0209029 at Fill Currency Table, OLE DB Destination [102]:
SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "OLE DB
Destination.Inputs[OLE DB Destination Input]" failed because error code
0xC020907B occurred, and the error row disposition on "OLE DB
Destination.Inputs[OLE DB Destination Input]" specifies failure on
error. An error occurred on the specified object of the specified
component. There may be error messages posted before this with more
information about the failure.
Error: 0xC0047022 at Fill Currency Table, SSIS.Pipeline: SSIS Error
Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component
"OLE DB Destination" (102) failed with error code 0xC0209029 while
processing input "OLE DB Destination Input" (115). The identified
component returned an error from the ProcessInput method. The error is
specific to the component, but the error is fatal and will cause the
Data Flow task to stop running. There may be error messages posted
before this with more information about the failure.
Information: 0x40043008 at Fill Currency Table, SSIS.Pipeline: Post
Execute phase is beginning.
Information: 0x402090DD at Fill Currency Table, Flat File Source [30]:
The processing of file "C:\Users\george\Desktop\Data_201813_10M.txt"
has ended.
Information: 0x4004300B at Fill Currency Table, SSIS.Pipeline: "OLE DB
Destination" wrote 1 rows.
Information: 0x40043009 at Fill Currency Table, SSIS.Pipeline: Cleanup
phase is beginning.
Task failed: Fill Currency Table
Warning: 0x80019002 at DataWarehouse2018: SSIS Warning Code
DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but
the number of errors raised (3) reached the maximum allowed (1);
resulting in failure. This occurs when the number of errors reaches the
number specified in MaximumErrorCount. Change the MaximumErrorCount or
fix the errors.
Any idea?
Thank you
Related
I'm trying to import a large CSV file into SQL Server through the SQL Server Management Studio "Import and Export" Wizard.
The data in question is the "Parcels - Comma-Separated Values" CSV file.
When I try to import it just as is, these are the errors given:
Error 0xc02020a1: Data Flow Task 1: Data conversion failed. The data conversion for column "LEGAL_DESC" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
(SQL Server Import and Export Wizard)
Error 0xc020902a: Data Flow Task 1: The "Source - parcels_csv.Outputs[Flat File Source Output].Columns[LEGAL_DESC]" failed because truncation occurred, and the truncation row disposition on "Source - parcels_csv.Outputs[Flat File Source Output].Columns[LEGAL_DESC]" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
(SQL Server Import and Export Wizard)
Error 0xc0202092: Data Flow Task 1: An error occurred while processing file "C:\Users\tobyr\OneDrive\Desktop\RealEstate\Data\parcels.csv" on data row 13.
(SQL Server Import and Export Wizard)
Error 0xc0047038: Data Flow Task 1: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on Source - parcels_csv returned error code 0xC0202092. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
I tried cleaning it up: replacing the 'None' values with spaces (maybe SQL Server doesn't know that 'None' = NULL), using the suggested types, increasing 'header rows to skip', and changing the 'header rows delimiter' to a comma. These are the results after just cleaning it up as described above (it's giving me all checkmarks on the review tab):
Error 0xc02020a1: Data Flow Task 1: Data conversion failed. The data conversion for column "SITUS_ADDR_NBR_SUFFIX" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
(SQL Server Import and Export Wizard)
Error 0xc0209029: Data Flow Task 1: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "Source - denParcels4_csv.Outputs[Flat File Source Output].Columns[SITUS_ADDR_NBR_SUFFIX]" failed because error code 0xC0209084 occurred, and the error row disposition on "Source - denParcels4_csv.Outputs[Flat File Source Output].Columns[SITUS_ADDR_NBR_SUFFIX]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
Error 0xc0202092: Data Flow Task 1: An error occurred while processing file "C:\Users\tobyr\OneDrive\Desktop\RealEstate\Data\denParcels4.csv" on data row 2.
(SQL Server Import and Export Wizard)
Error 0xc0047038: Data Flow Task 1: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on Source - denParcels4_csv returned error code 0xC0202092. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
I have also tried setting it to ignore all errors, but that just creates an empty table. I also tried creating an empty table and using a 'BULK INSERT' query, but nothing has worked.
You might find it easier to:
remove constraints from the target table
set string column sizes to a larger size
configure the import process to use a text qualifier (") on strings
Probably not critical, but a quick review of record two in the file https://www.denvergov.org/media/gis/DataCatalog/parcels/csv/parcels.csv shows an empty value in the first column (SCHEDNUM); perhaps the target table has a "not null" constraint there.
I believe the other errors are related to text qualifiers not being used; some records have strings wrapped in quotes.
If you are tied to the target table keeping its constraints, then a longer process is suggested below.
Thanks
In this circumstance I would port the file to a new [staging] table that does not have any constraints (e.g. not null). I would also use a text qualifier when importing the data into the fresh table. To avoid all reasonable doubt, set the string columns on the [staging] table to a generous size, e.g. nvarchar(255).
I'd suggest using bcp to quickly fire the file into a temporary [staging] table, as it does not require much configuration to lift and shift a text file straight into a table, as long as the table has the same number of columns as there are delimited values in each file record.
bcp also provides a facility to keep going even if there are errors; by default it allows 10 errors before giving up.
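A rough sketch of such a bcp call (the server, database, table, and file names are placeholders; -c loads the file as character data, -t sets the field terminator, -F 2 skips the header row, -m sets the allowed error count, and -e writes rejected rows to an error file):
bcp StagingDB.dbo.ParcelsStaging in "C:\Data\parcels.csv" -S MyServer -T -c -t, -F 2 -m 10 -e "C:\Data\parcels_errors.log"
Note that bcp does not strip text qualifiers, so any wrapping quotes will land in the staging columns and have to be cleaned up afterwards (e.g. with REPLACE).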
Once everything is loaded into the [staging] table, create a table with a structure identical to the target, just with no rows.
Then build a MERGE statement to sweep the "good" records from the staging table and insert them into the target table, and use the capabilities of the MERGE statement to load the failed records into a failures table (or perform a NOT EXISTS), as sketched below.
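A rough T-SQL sketch of that sweep, assuming hypothetical tables dbo.Parcels (target, keyed on SCHEDNUM), dbo.ParcelsStaging, and dbo.ParcelsFailures, and showing only a few of the columns:
-- Sweep rows with a usable key from staging into the target;
-- rows already present in the target are left alone.
MERGE dbo.Parcels AS tgt
USING (
    SELECT SCHEDNUM, LEGAL_DESC, SITUS_ADDR_NBR_SUFFIX      -- plus the remaining columns
    FROM dbo.ParcelsStaging
    WHERE SCHEDNUM IS NOT NULL AND SCHEDNUM <> ''            -- "good" records only
) AS src
    ON tgt.SCHEDNUM = src.SCHEDNUM
WHEN NOT MATCHED BY TARGET THEN
    INSERT (SCHEDNUM, LEGAL_DESC, SITUS_ADDR_NBR_SUFFIX)
    VALUES (src.SCHEDNUM, src.LEGAL_DESC, src.SITUS_ADDR_NBR_SUFFIX);

-- Anything that did not make it across can be parked in a failures table for review.
INSERT INTO dbo.ParcelsFailures (SCHEDNUM, LEGAL_DESC, SITUS_ADDR_NBR_SUFFIX)
SELECT s.SCHEDNUM, s.LEGAL_DESC, s.SITUS_ADDR_NBR_SUFFIX
FROM dbo.ParcelsStaging s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Parcels p WHERE p.SCHEDNUM = s.SCHEDNUM);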
Context
I am developing a simple SSIS package that contains a Data Flow Task with:
1 OLE DB Source
2 Lookup Transformations
1 OLE DB Destination
1 OLE DB Destination for error output rows
I am using the fast load option in both OLE DB Destinations, and I have configured the error output of the first one to redirect rows to the second destination.
Question
From many online articles, I read that using the fast load option will cause the entire batch to fail and be redirected, not only the erroneous rows:
Error Handling With OLE DB Destinations
Error output in OLE DB Destination. How to redirect a row?
How to handle Failed Rows in a Data Flow
Have your SSIS Fast Load (Bulk Load) and Row by Row Error Messages too
But when the package is executed, only 2 rows are redirected and all other rows are imported successfully. And I checked that the sum of the row counts in both destinations is equal to the source row count, which means that only the erroneous rows are redirected.
Note that:
Max Commit size = 2147483647
Batch size is empty
The Table lock and Check constraints options are checked
I am using SQL Server 2014 with Visual Studio 2013
I didn't find any similar case online. Any explanation?
What @DanGuzman mentioned is true, since there are two phases of validation for the data when it comes to the OLE DB Destination:
Client-side validation
Server-side validation
1. The Client-side validation:
When data flows from the pipeline into the OLE DB destination, the pipeline columns (external columns) are mapped to the OLE DB destination input columns, which must have data types compatible with the server-side column data types (Database Engine). If an error occurs while data is passed from the external columns to the OLE DB destination input columns, the error row can be redirected on its own.
Example: an implicit conversion failure, such as when a DT_STR field is mapped to a DT_DATE column and contains an invalid date value.
When we say that the fast load option loads data in batches, we are talking about the phase when data is sent from the OLE DB destination input columns to the destination itself (the Database Engine).
2. Server-side validation
This type of validation is done when inserting data into the destination, e.g. identity, primary key, or foreign key violations.
If an error occurs in this phase, the whole batch is rejected and all of its rows are redirected to the error output.
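As a rough T-SQL analogy of that server-side behaviour (the table and values below are made up for illustration): a single multi-row INSERT is rejected as a whole when one of its rows violates a constraint, much like a failing fast-load batch being redirected in one piece.
-- Hypothetical table with a primary key.
CREATE TABLE dbo.DemoBatch (Id int NOT NULL PRIMARY KEY, Label varchar(20) NOT NULL);
INSERT INTO dbo.DemoBatch (Id, Label) VALUES (1, 'existing');

-- One multi-row insert = one "batch": the row (1, 'duplicate') violates the primary key,
-- so the whole statement fails and rows 2 and 3 are not inserted either.
INSERT INTO dbo.DemoBatch (Id, Label)
VALUES (2, 'ok'), (1, 'duplicate'), (3, 'ok');

SELECT * FROM dbo.DemoBatch;  -- still contains only the row with Id = 1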
Right, so I have a Foreach Loop with a data flow inside, going through a group of files that all have the exact same format. The data flow changes a few things with a Derived Column, and everything gets dumped into a SQL Server database table which will become my staging table.
The problem is that some files throw up an error even though the files are all formatted identically; the error is always around the date. It will go through 4 files with no problem and then stop working on the 5th file.
What am I trying to do?
I want to get a whole load of files with the same format and put the data from within into an SQL database while changing some formats.
What have I tried to do?
I tried to reformat the date to dd/mm/yyyy, which is the format I want the date to have anyway.
The reformatting worked, but then the same file that had errors before came up with a type cast error.
This is the error I get:
[OLE DB Destination [59]] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80004005 Description: "Invalid character value for cast specification".
[OLE DB Destination [59]] Error: There was an error with OLE DB Destination.Inputs[OLE DB Destination Input].Columns[Amount] on OLE DB Destination.Inputs[OLE DB Destination Input]. The column status returned was: "The value could not be converted because of a potential loss of data.".
[OLE DB Destination [59]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "OLE DB Destination.Inputs[OLE DB Destination Input]" failed because error code 0xC0209077 occurred, and the error row disposition on "OLE DB Destination.Inputs[OLE DB Destination Input]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "OLE DB Destination" (59) failed with error code 0xC0209029 while processing input "OLE DB Destination Input" (72). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
It's happened before and I got around it by creating a new foreach loop to handle the new files with the new format (I could not see any change in the format but did it to continue).
(Screenshots: the data flow layout and the Derived Column configuration.)
Any help would be greatly appreciated! If you need me to clarify anything, just let me know.
Your columns appear to be in different orders. I replaced the tabs with pipes and got the following:
Working:
Staffordshire County Council|Staffordshire County Council Other|247 Cars Willenhall Ltd 15/06/2017|1126.97|Transport - Escorts|Transport - Escorts|opendatacommunities.org/id/county-council/staffordshire
Not Working:
Staffordshire County Council|Childrens Services SEND|247 Cars Willenhall Ltd|273.42|06/07/2017|Transport - Escorts|Transport - Escorts opendatacommunities.org/id/county-council/staffordshire
In the first one the date is on the other side of the amount, and appears to be included in the "247 Cars Willenhall Ltd" string.
If you enable the Data Viewer by right-clicking the arrow between the last two components and selecting the option, you'll get a clear view of how this is affecting your data flow (while running/debugging the package).
I have an error in my SSIS package that I don't understand:
Error: 0xC0202009 at InsertStudent, InsertStudent [303]: SSIS Error
Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code:
0x80040E57. An OLE DB record is available. Source: "Microsoft SQL
Server Native Client 11.0" Hresult: 0x80040E57 Description: "The
statement has been terminated.". An OLE DB record is available.
Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80040E57
Description: "String or binary data would be truncated.".
Error: 0xC0209029 at InsertStudent, InsertStudent [303]: SSIS Error
Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The
"InsertStudent.Inputs[OLE DB Command Input]" failed because error code
0xC020906E occurred, and the error row disposition on
"InsertStudent.Inputs[OLE DB Command Input]" specifies failure on
error. An error occurred on the specified object of the specified
component. There may be error messages posted before this with more
information about the failure.
Error: 0xC0047022 at InsertStudent, SSIS.Pipeline: SSIS Error Code
DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component
"InsertStudent" (303) failed with error code 0xC0209029 while
processing input "OLE DB Command Input" (308). The identified
component returned an error from the ProcessInput method. The error is
specific to the component, but the error is fatal and will cause the
Data Flow task to stop running. There may be error messages posted
before this with more information about the failure.
Error: 0xC0047022 at InsertStudent, SSIS.Pipeline: SSIS Error Code
DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Merge
Join" (406) failed with error code 0xC0047020 while processing input
"Merge Join Left Input" (411). The identified component returned an
error from the ProcessInput method. The error is specific to the
component, but the error is fatal and will cause the Data Flow task to
stop running. There may be error messages posted before this with
more information about the failure.
One of your columns is getting an input value that is larger (in size) than the column size defined in your database table.
It's unfortunate that the error message does not also give you the name of the offending column. I think you are stuck with troubleshooting this the hard way, by looking at the values column by column, until Microsoft fixes the error message.
See Microsoft Connect: Please fix the "String or binary data would be truncated" message to give the column name
...and all the commentary on
Why isn't “String or Binary data would be truncated” a more descriptive error?
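One way to do that column-by-column check up front is to compare the longest incoming value in each candidate column with the declared sizes of the destination columns. A rough sketch, with made-up source and destination names (swap in your own):
-- Longest value per candidate column in the source feeding the OLE DB Command / Merge Join.
SELECT
    MAX(LEN(FirstName))    AS MaxFirstName,
    MAX(LEN(LastName))     AS MaxLastName,
    MAX(LEN(EmailAddress)) AS MaxEmailAddress
FROM dbo.StudentSource;

-- Declared sizes of the destination columns, for comparison.
SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = 'Student';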
Check the component connected to the OLE DB Command transformation and the left input of the Merge Join transformation; the size of the input string coming from that component is larger than the column size, which is why the data truncation error happens there.
The solution would be to fix the data length in that component or to change the target table structure (enlarge the column size), for example as sketched below.
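A minimal sketch of the column-widening option, with placeholder table and column names (keep the column's original NULL/NOT NULL setting when you widen it):
-- Widen the destination column so that longer incoming values fit.
ALTER TABLE dbo.Student
    ALTER COLUMN StudentName nvarchar(255) NULL;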
In my case this error was connected with differing column lengths in the DB. For example, if you create the source DB, you need to be sure that each column has a sufficient maximum length and that you don't put values into it that are too long.
So unfortunately SSIS doesn't tell you the problematic column, and you need to find it yourself or enlarge the maximum length of every column.
This is my first experience with SSIS and I'm just going nuts: NOTHING WORKS.
(Don't be afraid of the big post: most of it is just error output.)
I've got two MS SQL databases with the same fields, and I have to transfer data from the first one, where everything is nvarchar(32), aka DT_WSTR, into the second one, where the types are different.
The data and some examples:
"Timestamp" is datetime2(7) in the destination, and the source looks like ISO 8601: 2013-12-19T00:00:00.000
"Value" is "real" numbers with scientific notation; test examples are: 17e+10, 17.14, 17.14e+5, 1715E+4, 1714
And four columns with just different ints (bigint, bigint, tinyint, int).
Now for what I've tried (warning: lots of quoted output ahead):
Derived Column. I used casts like "(DT_DBTIMESTAMP2,7)time" and "(DT_R4)value". Perhaps I'm using the wrong types, but I strongly doubt it: I googled it a lot and most articles (like this one) say that I'm right.
Error: 0xC0049064 at Import from ODS to DWH, Derived Column [2]: An error occurred while attempting to perform a type cast.
Error: 0xC0209029 at Import from ODS to DWH, Derived Column [2]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "Derived Column" failed because error code 0xC0049064 occurred, and the error row disposition on "Derived Column.Outputs[Derived Column Output].Columns[timestamp]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
Error: 0xC0047022 at Import from ODS to DWH, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Derived Column" (2) failed with error code 0xC0209029 while processing input "Derived Column Input" (3). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
Error: 0xC02020C4 at Import from ODS to DWH, OLE DB Source [62]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
Error: 0xC0047038 at Import from ODS to DWH, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on OLE DB Source returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
Explicitly changing the types in the source (so they match the destination) and connecting directly to the destination.
Error: 0xC020901C at Direct, OLE DB Source [32]: There was an error with OLE DB Source.Outputs[OLE DB Source Output].Columns[time] on OLE DB Source.Outputs[OLE DB Source Output]. The column status returned was: "The value could not be converted because of a potential loss of data.".
Error: 0xC0209029 at Direct, OLE DB Source [32]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "OLE DB Source.Outputs[OLE DB Source Output].Columns[time]" failed because error code 0xC0209072 occurred, and the error row disposition on "OLE DB Source.Outputs[OLE DB Source Output].Columns[time]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
Error: 0xC0047038 at Direct, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on OLE DB Source returned error code 0xC0209029. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
Data Conversion. Same result: failure.
Error: 0xC02020C5 at Conversion, Data Conversion [2]: Data conversion failed while converting column "time" (74) to column "Copy of time" (11). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
Error: 0xC0209029 at Conversion, Data Conversion [2]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "Data Conversion.Outputs[Data Conversion Output].Columns[Copy of time]" failed because error code 0xC020907F occurred, and the error row disposition on "Data Conversion.Outputs[Data Conversion Output].Columns[Copy of time]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
Error: 0xC0047022 at Conversion, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Data Conversion" (2) failed with error code 0xC0209029 while processing input "Data Conversion Input" (3). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
Error: 0xC02020C4 at Conversion, OLE DB Source [62]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
Error: 0xC0047038 at Conversion, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on OLE DB Source returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
So now I've just run out of tools and ideas for how to do this: most instructions just say something like "map this to that and it will work". I've been struggling with this so hard and for so long that I even created a Stack Exchange account for it. Any help appreciated. Cheers.
This message
The value could not be converted because of a potential loss of data.
tells you pretty much everything you need to know. Your problem isn't with the types you're using, it's with the data in your source.
The most likely cause is that at least one row of data in your source ODS has values in a format that cannot be recast into (for example) a DT_DBTIMESTAMP2.
My recommended approach would be to cast all your source columns explicitly in your SQL source statement.
So instead of
select time, value from source
do
select cast(time as datetime) as time, cast(value as int) as value from source
In fact, I would run this query against your source table to make sure all the values can be correctly cast into the final types. (My guess is that they can't, or you wouldn't be getting casting errors.)
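On SQL Server 2012 and later you can go a step further and hunt down the offending rows directly with TRY_CONVERT; a sketch, using the same hypothetical source table and the destination types described above:
-- Rows whose text values cannot be converted to the destination types.
SELECT [time], [value]
FROM source
WHERE (TRY_CONVERT(datetime2(7), [time]) IS NULL AND [time] IS NOT NULL)
   OR (TRY_CONVERT(real, [value]) IS NULL AND [value] IS NOT NULL);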
Another thing you could do is change your task components from "Fail" on error to "Redirect error rows" and push the error rows into a file destination so you can see which rows are getting kicked out by the transformation component.
New answer to an old question, but I just pulled all of my hair out trying to solve this error. Multiple error paths, data conversions, derived columns, etc. were not working. Two things solved this error:
First, I noticed that columns with null values were causing errors when dealing with currency (DT_CY).
I finally solved this by checking a little box, "Retain null values from the source as null values in the data flow", within the Flat File Source node. (Why is this not selected by default!?)
Second, I was having a hard time converting dates to strings; an easy solution would be to do an UPDATE statement within an Execute SQL Task in the control flow.
For example: your database column is varchar or nvarchar, and your source has data coming in as mm/dd/yyyy. No matter how many times you try wrestling with the above error, converting this into a string fails.
I solved this by loading the data (you can use a temp table if you'd like) and then converting the column with a SQL query. I personally used:
update MY_TABLE_NAME
set [MY_DATE_COLUMN] = convert(varchar(8), convert(datetime, [MY_DATE_COLUMN]), 112)  -- style 112 = yyyymmdd
where (FILTER_STATEMENT_TO_UPDATE_ONLY_WHAT_YOU_JUST_LOADED)
Hopefully this can assist others who stumble upon this post.
(DT_R8)(REPLACE(Knowledgeable,"\"","") == "" ? NULL(DT_R8) : (DT_R8)REPLACE(Knowledgeable,"\"",""))
Try the above code.
What's happening is that you are trying to convert a value that contains double quotes, and that is why it's giving you an error. Replacing the double quotes with an empty string will solve the problem.