SSIS Flat File Source Text Qualifier being ignored - sql-server

I am using SSIS to insert data from a flat file into a database.
I have created a Data Flow Task for that, using a Flat File Source and an ADO NET Destination to insert the data.
Below is how my settings look for the Flat File Source.
Below is how my "Columns" tab looks.
This works fine when I run it using BIDS, and the data is inserted properly into the database. It even works with DTExec.exe when run locally.
Now, the problem is with executing the package on the server using DTExec.exe. On the server the data is inserted, but the text qualifier (double quotes) is completely ignored, so the quotes end up in the database. The same package works fine when run locally. I have attached an image below of how the data is stored in the database.
I have checked the SQL Server version and SSIS version locally and on the remote server, and both are the same.
What can be the problem? Can anyone help?

So I found a solution for this problem, thanks to LukeBI's answer here.
Create a string variable called TextQualifier and assign it the value " (a double quote).
Select the connection manager, and in the Properties window select 'Expressions'. See below.
Click ..., add the property 'TextQualifier' and assign the variable @[User::TextQualifier]. See the image below.
Now it's working fine. It will even work on a 64-bit OS now.
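In short, assuming the connection manager in question is the flat file connection used by the source, the setup comes down to:

Variable:   User::TextQualifier  (String)  Value: "
Expression: Flat File Connection Manager -> Properties -> Expressions -> TextQualifier = @[User::TextQualifier]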

In the flat file source, click "Columns". Make sure that when you preview the data there are no quotes in the preview. Otherwise you may have to look back at your file and make sure that BOTH the text qualifier and delimiter are correct.
If this does not work, then please take a screenshot of the "Columns" screen as well and post it. A screenshot of the actual file layout would help as well. Hope this helps!

Within your Flat File Connection Manager, on the "Advanced" page, you are given a view of the properties for each field. You will see that each field has a Name, a ColumnDelimiter, a number of greyed-out fields, a DataType, and a choice of whether or not it is text qualified.
There, mark the column(s) that you want treated as text qualified by setting the TextQualified option to True.

Related

Read file in SSIS Project into a variable

My SSIS projects tend to run queries that require changes as they move between environments; for example, the table schema might change, or a value in the WHERE clause. I've always either put my SQL into a Project Parameter, which is hard to edit since formatting is lost, or put it directly into the Execute SQL Task/Data Flow Source and then manually edited it between migrations, which is also not ideal.
I was wondering, though: if I added my SQL scripts as files within the project, could these be read back in? For example, if I put in a query like this:
select id, name from %schema%.tablename
I'd like to read this into a variable; then it's easy to use an expression, as I do with Project Parameters, to replace %schema% with the appropriate value. The .sql files within the project could then be edited with little effort, or even tested through an Execute SQL Task that's disabled/removed before the project goes into the deployment flow. But I've been unable to find how to read in a file using a relative path within the project. Also, I'm not even sure these files get deployed to the SSIS server.
Thanks for any insight.
I've added a text file query.sql to an SSIS (SQL 2017) Project in Visual Studio, but I've found no way to pull the contents of query.sql into a variable.
Native tooling approach
For an Execute SQL Task, there's an option to source your query directly from a file.
Set your SQLSourceType to File Connection and then specify a file connection manager in the FileConnection section.
Do be aware that while this is handy, it's also ripe for someone escalating their permissions. If I had access to the file the SSIS package is looking for, I could add a DROP DATABASE, create a new user and give them SA rights, etc. Anything the account that runs the SSIS package can do, a nefarious person could exploit.
Roll your own approach
If you're adamant about reading the file yourself, add two Variables to your SSIS package and supply values like the following
User::QueryPath -> String -> C:\path\to\file.sql
User::QueryActual -> String -> SELECT 1;
Add a Script Task to the package. Specify User::QueryPath as a ReadOnly variable and User::QueryActual as a ReadWrite variable.
Within Main you'd need code like the following:
string filePath = this.Dts.Variables["User::QueryPath"].Value.ToString();
this.Dts.Variables["User::QueryActual"].Value = System.IO.File.ReadAllText(filePath);
The meat of the matter is System.IO.File.ReadAllText. Note that this doesn't handle checking whether the file exists, whether you have permission to access it, etc. It's just a bare-bones read of a file (and is also open to the same injection challenges as the above method; it's just that this way you own maintaining it versus the fine engineers at Microsoft).
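If you want something a bit more defensive, a minimal sketch of the Script Task's Main might look like the following. It assumes the two variables above are wired up as ReadOnly/ReadWrite; the "Read query file" label passed to FireError is just an illustrative sub-component name.

public void Main()
{
    // Path supplied via the ReadOnly variable User::QueryPath
    string filePath = this.Dts.Variables["User::QueryPath"].Value.ToString();

    if (!System.IO.File.Exists(filePath))
    {
        // Fail the task with a useful message instead of throwing later
        this.Dts.Events.FireError(0, "Read query file", "File not found: " + filePath, string.Empty, 0);
        this.Dts.TaskResult = (int)ScriptResults.Failure;
        return;
    }

    // Load the file contents into the ReadWrite variable User::QueryActual
    this.Dts.Variables["User::QueryActual"].Value = System.IO.File.ReadAllText(filePath);
    this.Dts.TaskResult = (int)ScriptResults.Success;
}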
You can build your query by using both a Variable and a Parameter.
For example:
Parameter A: dbo
Build your variable A (string type) as : "Select * FROM server.DB." + ParameterA + ".Table"
So if you need to change the schema, just changing Parameter A will give you the corresponding query in Variable A.
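In SSIS expression syntax, and assuming ParameterA is a String project parameter (a package parameter would be referenced as @[$Package::ParameterA]) and the variable has EvaluateAsExpression set to True, the variable's expression would look roughly like:

"SELECT * FROM server.DB." + @[$Project::ParameterA] + ".Table"

With ParameterA set to dbo, the variable then evaluates to SELECT * FROM server.DB.dbo.Table.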

SSIS - ANSI flatfile always saved as UTF-8 (w/o BOM)

I am facing an issue with SSIS where a customer wants a file (previously delivered in UTF-8) to be delivered in ANSI-1252. No big deal, I thought: change the file connection manager and done... unfortunately it wasn't that simple. I've been stuck on this for a day and am clueless about what to try next.
The package itself:
IN - an OLE DB Source with a query. The source database fields are NVARCHAR.
Next, I have created a Data Conversion block where I convert the incoming DT_WSTR to DT_STR using the 1252 code page.
After that is an outbound flat file destination. The flat file connection is tab delimited using code page 1252. I have mapped the converted columns to the columns used in this flat file. Below are some screenshots of the connection manager and destination block.
Now, when I create a new .txt file from Explorer it is ANSI (as detected by Notepad++).
When the package runs, the file becomes UTF-8 without BOM.
I have tried experimenting with the checkbox for overwriting, as suggested in SSIS - Flat file always ANSI never UTF-8 encoded,
as well as rebuilding the project from scratch and experimenting with the data conversion.
Does anyone have a suggestion on what I am missing here? The strange thing is that we have a different package with exactly the same blocks, built previously, and it does output an ANSI file (I checked the package from top to bottom). However, we are getting mixed results on different machines: some machines will give an ANSI file, others the UTF-8 file.
Has this been solved already? My suggestion is to delete the whole Data Flow Task and re-create it. I suspect the metadata is stuck and gets re-applied at each execution.
I believe you don't need to change anything in your SSIS package; just check your editor settings (Notepad++). Go to Settings --> Preferences --> New Document.
You need to uncheck the 'Apply to opened ANSI files' checkbox.
Kindly check and let me know if it works for you.
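One thing worth checking before changing the package: for content that contains only ASCII characters, ANSI-1252 and UTF-8 without a BOM produce byte-for-byte identical files, so editors such as Notepad++ can only guess at the encoding, which may explain the mixed results across machines. A minimal sketch to check whether the file actually contains anything that distinguishes the two (the path is just an example):

using System;
using System.IO;

class EncodingCheck
{
    static void Main()
    {
        // Example path; point this at the file the package produced
        byte[] bytes = File.ReadAllBytes(@"C:\temp\output.txt");

        // UTF-8 BOM is the byte sequence EF BB BF at the start of the file
        bool hasUtf8Bom = bytes.Length >= 3 && bytes[0] == 0xEF && bytes[1] == 0xBB && bytes[2] == 0xBF;

        // Any byte >= 0x80 means the 1252 and UTF-8 representations can differ
        bool hasNonAscii = Array.Exists(bytes, b => b >= 0x80);

        Console.WriteLine("UTF-8 BOM: " + hasUtf8Bom + ", non-ASCII bytes: " + hasNonAscii);
        // If both are false, the bytes are identical either way and the "encoding" is only the editor's guess.
    }
}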

SSIS Permission Issue Flat Files

I recently had to move my files to a new SSIS server. Everything seems to be working except when I try to execute a bulk insert, it tells me:
(Cannot bulk load because the file "E:\FlatFiles\SSG\apmast.txt" could
not be opened. Operating system error code 21(The device is not
ready.).".
It does this for all my flat files. I found an article saying you need to give the MSSQLSERVER user full control of the files, which I did, but this does not seem to fix it. Any other ideas? Do I need to give other files the same permissions? I really don't want to just throw full control around if I don't have to. Thanks.
I figured it out: it turns out that a bulk insert tells the server to look locally for the text file. I was trying to get SSIS to do a bulk insert of flat files from one server into another SQL Server on the network. As soon as I put the flat files on the remote server, it grabbed and used them. This seems like a very odd way for it to work; I would expect it to push the files to the SQL Server instead of asking the SQL Server to look for the files locally via a hard path.

Foreach Container to loop through Multiple Excel Files to load

I have had packages in the past where I was looping through multiple text files in a folder and loading them into SQL Server tables.
Now I am asked to create a package which will loop through multiple Excel files in a folder and load them into a SQL Server table.
I went through the following steps to create this package, assuming it shouldn't be much different from what I have in other packages where I loop through multiple flat files.
Added an Execute SQL Task truncating my staging table - a simple TRUNCATE TABLE statement.
Added a Foreach Loop Container, selected the Foreach File Enumerator, and created a variable called File_Path with data type String.
Added a Data Flow Task.
Added an Excel data source and configured the Excel Connection Manager by selecting any one Excel file in the destination folder. (At this point it is configured correctly, as it is not showing any red cross or warning messages.)
Then I selected the Excel File Connection Manager and, in the Properties window under Expressions, selected the ConnectionString property and used the user variable @[User::File_Path].
At this point the Excel data source is showing a red cross, as it needs further configuration.
I have tried a few things, like changing the Data Access Mode from Table Name to Table Name or View Name Variable and passing the variable @[User::File_Path], but it gives me the following error.
Can someone please have a look and advise where I am going wrong and how I can fix this? Any advice or a pointer in the right direction is much appreciated.
Thank you.
You shouldn't use an expression on the ConnectionString property, but on the ExcelFilePath property.
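Put together, a working configuration looks roughly like this (the folder and file mask are examples; setting DelayValidation to True on the connection manager is usually needed so the package can validate before the variable holds a real path):

Foreach Loop Container: Foreach File Enumerator, Folder = C:\ExcelFiles (example), Files = *.xlsx, Retrieve file name = Fully qualified
Variable Mappings: @[User::File_Path] -> Index 0
Excel Connection Manager -> Properties -> Expressions -> ExcelFilePath = @[User::File_Path]
Excel Connection Manager -> DelayValidation = True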

When I save SSIS package the changes are lost

I have a SQL Server 2005 SSIS package with an Execute SQL Task. I edit the SQL statement and go through the OKs to make the change. I go back into the Execute SQL Task, and the changes have taken hold. Great.
I now go to save the package. I click Save. I look back in the Execute SQL Task, and the changes have been lost.
What is going on here and how can I stop it?
Click the Execute SQL Task.
Press F4 to see the list of properties.
Click the Expressions line and then the little [...] box to see the expressions.
Change the expressions (or remove them if you don't want your Name/SqlStatementSource/etc. to be set dynamically).
This is a nice feature for reusability and template-based development of SSIS packages. We use it all the time.
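For example (a hypothetical expression; the variable name is illustrative), an expression like this on the task's SqlStatementSource property will silently overwrite whatever you type into the editor each time the package is validated or saved:

SqlStatementSource  =  "SELECT * FROM " + @[User::TableName]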
I had the same issue. The reason was that I used a config file with all the executables included, and the SQL code is just a property of one of the executables stored in the config file. When you change your code, the config file does NOT get updated, but when you close/open your project the values get pulled from the config file, which reverts your change to its initial state!
In order to fix this you need to either exclude your executable from the config file (which is what I did) or re-create the config file every time you change the package.
I hope this helps.
I found I was unable to change the "To" in the Send Mail Task. Or rather, I would press OK on the new value, then bring it up again and it would show the old value. What I did was edit the .dtsx file in Notepad, searching for the old value and changing it.
