Getting error while loading data into table - snowflake-cloud-data-platform

I am getting this error while loading data into a Snowflake table from a TSV file:
Unable to copy files into table.
End of record reached while expected to parse column "SALESFORCE_ACTIVITIES_REPORT" ["INSERT_DATE".19]' File '#SALESFORCE_ACTIVITIES_REPORT/ui1659508402073/Salesforce Report- Activities.tsv',line 5179, character 784 Row 5178, column "SALESFORCE_ACTIVITIES_REPORT" ["INSERT_DATE".19]
I am not able to understand the error. Please help me with it.
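In general, this Snowflake error means the parser reached the end of a record before it had parsed every expected column: row 5178 of the TSV has fewer tab-separated fields than the target table, commonly because of an unescaped newline or a missing tab inside the data. A minimal sketch of a COPY that loads what it can and lets you inspect the rejected rows afterwards (the table name and path are taken from the error message; the table stage and the format options are assumptions):

COPY INTO SALESFORCE_ACTIVITIES_REPORT
FROM @%SALESFORCE_ACTIVITIES_REPORT/ui1659508402073/
FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '\t' SKIP_HEADER = 1
               ERROR_ON_COLUMN_COUNT_MISMATCH = FALSE)
ON_ERROR = 'CONTINUE';

-- Inspect the rows the last COPY rejected:
SELECT * FROM TABLE(VALIDATE(SALESFORCE_ACTIVITIES_REPORT, JOB_ID => '_last'));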

Related

Text Was Truncated or One or More Characters Has No Match in the Target Code Page

For the life of me, I cannot seem to get past the following error:
Error: 0xC020901C at Import Data - APA, APA Workbook [2]: There was an error with APA Workbook.Outputs[Excel Source Output].Columns[Just] on APA Workbook.Outputs[Excel Source Output]. The column status returned was: "Text was truncated or one or more characters had no match in the target code page.".
Error: 0xC020902A at Import Data - APA, APA Workbook [2]: The "APAC Workbook.Outputs[Excel Source Output].Columns[Just]" failed because truncation occurred, and the truncation row disposition on "APA Workbook.Outputs[Excel Source Output].Columns[Just]" specifies failure on truncation. A truncation error occurred on the specified object of the specified component."
I have an SSIS package that is trying to load data from an Excel file into a SQL Server table. I understand SSIS takes a "snapshot" of the data and uses it to work out the column sizes. The database column for this field is nvarchar(512).
Some things I have done to try to rectify this:
Added "IMEX=1" to the extended properties of the Excel Connection string
Created an Excel file with 10 rows and each row has 512 characters in this "Just" column so that SSIS will recognize the size
Went into the Advanced Editor for the source, then "Input and Output Properties", then went to the Just column and changed the DataType to "Unicode String [DT_WSTR]" and the Length to 512
After I did the above, I ran the code and the 10 rows of data were imported with no issue. But when I run it against the real Excel file, the error appears again.
I have found that if I add a column holding the character length of this column and sort by it, largest first, the package works. But if the file is left as sent by the user, it errors out.
I would appreciate any help on how to solve this, as all of my Google searches state the above should work, but unfortunately it does not.
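One detail that would explain the sort-by-length behavior: by default the ACE driver samples only the first 8 rows of the sheet when guessing column types and lengths, so putting the longest values first changes what it guesses. The sample size is controlled by the TypeGuessRows registry value (the exact path varies by Office version; the one below is only an example, and 0 makes the driver scan a much larger sample), and IMEX=1 sits in the Extended Properties of the connection string, roughly like this (the file path is hypothetical):

Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data\APA Workbook.xlsx;Extended Properties="Excel 12.0 Xml;HDR=YES;IMEX=1"

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines\Excel\TypeGuessRows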

Potential Loss of Data reading from CSV with decimal

I have read a large number of questions and answers on this and I still can't get it to work.
I have a csv like the following:
Field1;Field2;Field3
CCC;DDD;0.03464
EEE;FFF;0.08432
...
When I attach a Flat File Source in SSIS, it gives me the following:
[Sample CSV [2]] Error: Data conversion failed. The data conversion
for column "Field3" returned status value 2 and status text "The value
could not be converted because of a potential loss of data.".
I have already changed the output to DT_DECIMAL, with 5 as the scale, in the advanced properties, but I still get the same error.
Any clue on this?
It seems like a simple solution that I am somehow overlooking.
Thanks!
There may be values that cannot be converted to DT_DECIMAL. You can detect the values that cause this error by using the Flat File Source's error output, which redirects the rows that fail while loading data; a quick way to list them outside SSIS is sketched after the links below.
Helpful Links
ERROR HANDLING IN SSIS WITH AN EXAMPLE STEP BY STEP
SSIS error when loading data from flat files
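As that check, if you first stage Field3 as plain text, T-SQL's TRY_CAST shows exactly which values will not convert (dbo.StagingSampleCsv is a hypothetical staging table with Field3 stored as varchar):

SELECT Field1, Field2, Field3
FROM dbo.StagingSampleCsv
WHERE TRY_CAST(Field3 AS decimal(18,5)) IS NULL
  AND Field3 IS NOT NULL;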

Importing CSV in SSIS: truncation error

I am importing a CSV file in SSIS that has many string columns. I have set the column widths to more than the maximum length, but I am still getting the errors below.
[Input CSV File [114]] Error: Data conversion failed. The data
conversion for column "Functionality" returned status value 4 and
status text "Text was truncated or one or more characters had no match
in the target code page.".
[Input CSV File [114]] Error: The "Input CSV File.Outputs[Flat File
Source Output].Columns[Functionality]" failed because truncation
occurred, and the truncation row disposition on "Input CSV
File.Outputs[Flat File Source Output].Columns[Functionality]"
specifies failure on truncation. A truncation error occurred on the
specified object of the specified component.
[Input CSV File [114]] Error: An error occurred while processing file
"D:\Prateek\SSIS_UB_PWS\January.csv" on data row 236.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The
PrimeOutput method on Input CSV File returned error code 0xC0202092.
The component returned a failure code when the pipeline engine called
PrimeOutput(). The meaning of the failure code is defined by the
component, but the error is fatal and the pipeline stopped executing.
There may be error messages posted before this with more information
about the failure.
As a workaround, I have set the lengths to 500 or 1000 and it now lets me continue, but the actual lengths are only in the double digits.
Kindly suggest what the possible cause could be.
Check what the value of the 'Functionality' column is at row 236, and then verify what is allowed. In the Advanced Editor of the source you can increase the length (if there are no special characters) if you are loading the data into a table.
A truncation warning appears when your source column length is bigger than the destination column length, so the source value is truncated to fit its destination. Could you share the length of the source column and the length of the destination column?
I got the error; sorry for my poor understanding of the issue.
One of the columns actually has multiple commas (,) in its data, and that data was spilling into other columns (the text qualifier was set to <none>). Hence I was getting bigger values than expected in the other columns as well.
Thanks for your help!
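To illustrate with a made-up row: without a text qualifier the embedded commas become column breaks, while with " set as the text qualifier the field stays intact:

101,Smith, John, Jr.,Active      -- no qualifier: parsed as five columns
101,"Smith, John, Jr.",Active    -- qualifier ": parsed as three columns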

SSIS Excel column name reading from file

I have a source file that has column names longer than 100 characters.
The column names are identical up to the first 80 characters and only the remaining 20 characters differ; see the example below.
ABC_XXXX_MNOAP : XYZABC_PAGELOADER_CLICKS_MANOPD_YXBDBAGD : VIEWS
ABC_XXXX_MNOAP : XYZABC_PAGELOADER_CLICKS_MANOPD_YXBDBAGD : CLICKS & THROGHS
While I am loading this file into SSIS, the full column names are not loaded; they are trimmed.
The error thrown was:
There is more than one data source column with the same name.
Please help me understand how to get the full column metadata from Excel.
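For context, the Excel driver truncates the header names it reads (to 64 characters in many versions), so names that are identical over their long prefix collide as duplicates. One possible workaround, offered here as an assumption rather than a confirmed fix: set HDR=NO in the connection string's Extended Properties so the driver emits generic column names (F1, F2, ...) and the real headers arrive as the first data row, which you can then capture and map yourself:

Extended Properties="Excel 12.0 Xml;HDR=NO;IMEX=1"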

SSIS loading data error

I have a CSV file which contains this data:
EmployeeCode,EmployeeName,EmployeeSalary,Date
101,raju,1000,2/2/2003
102,krish,100,3/4/2005
103,rishbh,320,12/9/2007
104,rani,4690,12/8/2008
105,olesia,2000,17/4/2009
106,olga,2000,12/6/2010
107,mellisa,3330,12/4/2011
And I have a table called employees:
EmployeeCode nvarchar(50)
EmployeeName nvarchar(50)
EmployeeSalary money
Date datetime
When I try to load this CSV file into my table using the SSIS package, it gives me an error:
1) [ADO NET Destination [2]] Error: An exception has occurred during
data insertion, the message returned from the provider is: The given
value of type String from the data source cannot be converted to type
date of the specified target column.
2) [SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED.
The ProcessInput method on component "ADO NET Destination" (2) failed
with error code 0xC020844B while processing input "ADO NET Destination
Input" (9). The identified component returned an error from the
ProcessInput method. The error is specific to the component, but the
error is fatal and will cause the Data Flow task to stop running.
There may be error messages posted before this with more information
about the failure.
However, if I remove the date column from the CSV, everything inserts without a problem. What is wrong with the date column? Why is it not accepting the values?
Use OLE DB Destination rather than ADO NET Destination
This issue is due to the default data type for flat file columns being string [DT_STR], which corresponds to varchar in the database, whereas your destination table uses nvarchar, which corresponds to Unicode string [DT_WSTR].
Either use a Derived Column transformation to change string [DT_STR] to Unicode string [DT_WSTR],
or
go to the Advanced Editor on the source >> Input and Output Properties >> Output Columns >> Data Type property >> change the data type to Unicode string [DT_WSTR] for EmployeeCode and EmployeeName.
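In a Derived Column transformation, the casts would look like this (expression syntax only; the length of 50 matches the nvarchar(50) columns):

(DT_WSTR,50)EmployeeCode
(DT_WSTR,50)EmployeeName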
First of all, you should use an OLE DB Destination instead of an ADO NET Destination if your table is huge. Now to the issue: in your table the data type of Date is datetime, so either change the input column to a date/time type (right-click the source component and go to the Advanced Editor), or add a Data Conversion between source and destination and convert the date to database timestamp [DT_DBTIMESTAMP]. I hope you will not face any issue when you do this.
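As a sketch, the equivalent Derived Column cast for the date path is:

(DT_DBTIMESTAMP)Date

This only succeeds when the incoming string is in a format the runtime locale can parse, which leads directly to the next answer.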
I had the same issue, and my answer is going to sound silly. I struggled with it for 2 days and ended up realizing that the date format was wrong in my CSV file.
For example:
105,olesia,2000,17/4/2009 -- this row fails because 17 ends up being taken as the month, which is invalid
It blew my mind when I realized it! Hope this helps anyone else going forward!
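A quick T-SQL sketch makes the ambiguity visible (style 101 is mm/dd/yyyy, style 103 is dd/mm/yyyy):

SELECT TRY_CONVERT(datetime, '17/4/2009', 101) AS us_format,  -- NULL: there is no month 17
       TRY_CONVERT(datetime, '17/4/2009', 103) AS uk_format;  -- 2009-04-17 00:00:00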
