SSIS : Converting DT_NTEXT to image file - sql-server

I am running an SSIS package that reads from a data source and converts a column into image files.
For that I am using the "Export Column" transformation, and everything seems to be all right except that the generated files cannot be opened (they are not valid images).
The file names end with .JPG, and the resulting files have a plausible size, but they cannot be opened.
The source column is of type DT_NTEXT.
I wonder what could be wrong that causes this.
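One way to narrow this down (outside the original package) is to dump the column with a small console program and check whether the stored bytes really are a JPEG. This is a minimal diagnostic sketch, not the Export Column setup itself; the connection string, the dbo.Photos table and the PhotoData column are hypothetical placeholders for your own schema.

using System;
using System.Data.SqlClient;
using System.IO;

class DumpImageColumn
{
    static void Main()
    {
        // Placeholder connection string - point this at your own server and database.
        const string connectionString = "Server=.;Database=MyDb;Integrated Security=SSPI;";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            // CAST to varbinary(max) so the provider returns raw bytes (DT_IMAGE in SSIS terms)
            // instead of Unicode text (DT_NTEXT).
            "SELECT TOP (1) CAST(PhotoData AS varbinary(max)) FROM dbo.Photos", conn))
        {
            conn.Open();
            var bytes = (byte[])cmd.ExecuteScalar();

            // A real JPEG starts with the bytes FF D8. If they are missing, the column holds
            // something else (for example base64 or hex text), which would explain files of a
            // plausible size that no image viewer can open.
            bool looksLikeJpeg = bytes.Length >= 2 && bytes[0] == 0xFF && bytes[1] == 0xD8;
            Console.WriteLine($"First bytes: {bytes[0]:X2} {bytes[1]:X2} - JPEG? {looksLikeJpeg}");

            File.WriteAllBytes(@"C:\temp\test.jpg", bytes);
        }
    }
}

If the leading bytes are not FF D8, the column most likely stores a text representation of the image, and Export Column fed from a DT_NTEXT column will simply write that text out as Unicode, which matches the symptom described above.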

Related

SSIS Export OLEDB Source to a Flat File with UTF-8

I am trying to export an OLEDB source (from a stored procedure) to a UTF-8 flat file, but am getting the following error:
[Flat File Destination [2]]
Error: Data conversion failed. The data conversion for column "name" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
The name column is defined in the stored procedure as nvarchar(30).
In the advanced editor of the OLEDB source, I have the AlwaysUseDefaultCodePage set to true and the DefaultCodePage set to 65001.
In the advanced editor for the flat file, for the External Columns and Input Columns, the Data Type is Unicode string [DT_WSTR] with a length of 30.
The connection manager for the flat file has the Unicode checkbox un-checked and the Code page is: 65001 (UTF-8).
I am stumped right now and any help would be appreciated.
Thanks,
David
EDIT:
I added a redirect of errors and truncations to a flat file destination but nothing was sent to the file.
Also, when I have a data viewer on the OLE DB source it comes up with all the records.
The data viewer for the Destination also shows all the records. The length of the name in both viewers is 30 characters (from Excel).
I gave up on getting the data flow to work and coded a C# script task instead.
I changed the output of my data flow to produce a Unicode file by checking the Unicode check box in the flat file connection manager.
I then have the new C# script read the Unicode file one line at a time and output each line to another flat file using Encoding.UTF8, adding a new-line character at the end of the line variable.
After the new file is created, I delete the input file and rename the new file to be the same path and name as the original input file. This is also done in the C# script.
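A minimal sketch of that re-encoding script, assuming it runs inside an SSIS Script Task; the method and path names are illustrative, not the original code, and in the package the path would normally come from a variable such as Dts.Variables["User::OutputFile"].

using System.IO;
using System.Text;

public static class ReencodeToUtf8
{
    public static void Run(string unicodeFilePath)
    {
        string tempPath = unicodeFilePath + ".utf8.tmp";

        // The flat file destination wrote the file as Unicode (UTF-16), so read it back that way.
        using (var reader = new StreamReader(unicodeFilePath, Encoding.Unicode))
        // new UTF8Encoding(false) writes UTF-8 without a byte-order mark; pass true instead
        // if the downstream consumer expects a BOM.
        using (var writer = new StreamWriter(tempPath, false, new UTF8Encoding(false)))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                // WriteLine appends the newline, matching "adding a new-line character
                // at the end of the line variable" above.
                writer.WriteLine(line);
            }
        }

        // Delete the input file and rename the new file to the original path, as described above.
        File.Delete(unicodeFilePath);
        File.Move(tempPath, unicodeFilePath);
    }
}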

Can't import characters due to incorrect code page

I have an SSIS job to import data from a flat file into an SQL Server table. I'm having an issue regarding the encoding of the source file and destination table.
The file is a UTF-8 encoded CSV file with some standard accented Latin characters (ãóé, etc.). My destination table is defined as having the Latin1_General_CI_AS collation, which means I can manually insert the following text with no problem: "JOÃO ANTÓNIO".
When I declare the Flat File source, it automatically determines the file as having the 65001 code page (UTF-8), and infers the string [DT_STR] data type for each column. However, the SSIS package automatically assumes the destination table as having the 1252 Code Page, giving me the following error:
Validation error. <STEPNAME>: <STEPNAME>: The code page 65002 specified on output column "<MYCOLUMN>" (180) is not valid. Select a different code page for output column "<MYCOLUMN>".
I understand why, since the database collation is defined as having that code page. However, if I try to set the flat file data source as having the Latin1 1252 encoding, the SSIS package executes but imports characters incorrectly:
JOÃO ANTÓNIO (Flat File) -> JOAO ANTÓNIO (Database).
I have already tried to configure the flat file source as Unicode compliant, but after I configure each column with a Unicode-compliant data type, I can't update the destination step, since SSIS infers data types directly from the database and doesn't allow me to change them.
Is there a way to keep the flat file source as being CP 1252, but also importing the correct characters? What am I missing here?
Thanks to Larnu's comment I've been able to get around this problem.
Since SSIS doesn't allow implicit data conversion, I needed to set up a data conversion step first (a Derived Column transformation). Since the source columns were already set up as DT_STR[65002], I had to configure new derived columns from an expression, converting from the source code page into the destination code page, with the following expression:
(DT_STR, 50, 1252)<SourceColumn>
Where a direct cast to DT_STR is made, stating that the column will have a maximum size of 50 characters and that the data will be represented with the 1252 code page.

Non-obvious truncation error during flat file import in SSIS "on data row 387"

While trying to implement Hadi's solution to my question about importing into SSIS the file with the max filename in the folder, I encountered the following error:
Data Flow Task, Flat File Source [28]: Data conversion failed. The data conversion for column "AsofDateTime" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page."
Data Flow Task, Flat File Source [28]: The "Flat File Source.Outputs[Flat File Source Output].Columns["AsofDateTime"]" failed because truncation occurred, and the truncation row disposition on "Flat File Source.Outputs[Flat File Source Output].Columns["AsofDateTime"]" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
An error occurred while processing file "\Share\ABC_DE_FGHIJKL_MNO_PQRST_U-1234567.csv" on data row 387.
I spent hours trying to find out what is specific about "row 387", trying this and that, removing and changing the source data, but did not get a hint at all - still the same error. The SSIS package worked OK with an explicitly specified filename, and the script correctly picks up the file with the max filename, but these parts simply do not work together, resulting in the above error.
Answer: While the LAST file should be imported, SSIS takes table headers from the FIRST file in the folder.
Newer file versions had been changed per discussions with the client; some columns were removed.
Solved by cleaning up older .csv file versions from import folder.
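For reference, the "pick the file with the max filename" step can be as small as the sketch below, e.g. inside a Script Task; the folder path and file mask are hypothetical and would normally come from package variables. Keeping the mask tight, or clearing out older file versions as in the fix above, stops SSIS from reading its column metadata from an outdated file.

using System;
using System.IO;
using System.Linq;

public static class LatestFilePicker
{
    // Returns the full path of the file whose name sorts highest, or null if none match.
    public static string GetMaxFileName(string folder, string mask = "*.csv")
    {
        return Directory.GetFiles(folder, mask)
                        .OrderByDescending(f => Path.GetFileName(f), StringComparer.OrdinalIgnoreCase)
                        .FirstOrDefault();
    }
}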

How to Import Just The FileName into SQL via SSIS

I am trying to write a file name to a table in my database - at the moment all I am achieving is importing the whole file path.
I have a Foreach Loop which, on the Collection page, looks in a specific folder for a specific file type (the retrieved file name is fully qualified).
This has a variable mapping to "ImportInvoiceFilePath"
Then within that is a Data Flow Task, which includes the flat file source and a derived column that supplies the file path to be written to the database.
This works fine - but what I am trying really hard to do, and can't work out, is how to get just the file name (no extension) written to the database as well.
Literally worked it out. I set my Foreach Loop to retrieve Name Only, then in the connection to my source file, under Expressions, put:
#[User::ProcessingInvoiceFilePath] + "\\"+#[User::ImportInvoiceFileName]+".saf"
Where .saf is the file extension.
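If the loop retrieves the fully qualified path instead, a small Script Component (or Script Task) can strip it down to the bare name before it is written to the table; this is just an illustrative sketch, not part of the accepted setup above.

using System.IO;

public static class FileNameHelper
{
    public static string NameOnly(string fullPath)
    {
        // e.g. "\\Share\Invoices\INV0001.saf" -> "INV0001"
        return Path.GetFileNameWithoutExtension(fullPath);
    }
}

Path.GetFileNameWithoutExtension removes both the folder part and the extension in one call.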

SSIS read flat file skip first row

First of all, I did spend quite some time on research, and I know there are many related questions, but I can't find the right answer to this question.
I'm creating a SSIS package, which does the following:
1. Download and store a CSV file locally, using an HTTP connection.
2. Read in the CSV file and store it in SQL Server.
Due to the structure of my flat file, the flat file connection keeps giving me errors, both in SSIS and in the SQL Import Wizard.
The structure of the file is:
"name of file"
"columnA","columnB"
"valueA1","valueB1"
"valueA2","valueB2"
Hence the row delimiter is end of line {CR}{LF} and the column delimiter is a comma {,}, with text qualifier ".
I want to import only the values, not the name of the file or the column names.
I played around with the settings and got the right preview with the following settings:
- Header rows to skip: 0
- Column names in the first data row: no
- 2 self-configured columns (string with columnWidth = 255)
- Data rows to skip: 2
When I run the SSIS Package or SQL Import Wizard I get the following error:
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on Flat File Source returned error code 0xC0202091. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
I can't figure out what goes wrong and what I can do to make this import work.
If you want to skip the file name and the column names, you need to set Header rows to skip to 2. You should also check whether the file actually uses line feeds (LF) instead of CR+LF. Checking the line breaks in a text editor isn't enough to detect the difference, as most editors correctly display files with either CR+LF or LF line endings.
You can check the results of your settings by clicking the "Preview" button in your flat file source. If the settings are correct, you'll see a grid with your data properly aligned. If not, you'll get an error, or the data will be wrong in some way, e.g. a very large number of columns, column names in the first data row, etc.
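To check which line endings the file really uses (the point above about text editors hiding the difference), here is a quick console sketch that counts CR+LF pairs versus bare LFs; the file path is a placeholder.

using System;
using System.IO;

class LineEndingCheck
{
    static void Main()
    {
        byte[] bytes = File.ReadAllBytes(@"C:\temp\input.csv");
        int crlf = 0, bareLf = 0;

        for (int i = 0; i < bytes.Length; i++)
        {
            if (bytes[i] == 0x0A)                           // LF
            {
                if (i > 0 && bytes[i - 1] == 0x0D) crlf++;  // preceded by CR
                else bareLf++;
            }
        }

        Console.WriteLine($"CR+LF: {crlf}, bare LF: {bareLf}");
    }
}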
