Update table column value with SSIS package Name - sql-server

I am creating a .csv file with SSIS.
Once the .csv file is created, I need to update one of the table's column values with the name of the SSIS package that created it, just to indicate that those rows have been exported to the .csv file.
Querying msdb.dbo.sysssispackages does not work for me; I need other options to achieve this.

I found a solution here:
http://www.wiseowl.co.uk/blog/s359/files-foreach.htm
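For the update itself, one common pattern (a minimal sketch, assuming a hypothetical table dbo.ExportLog with an ExportedBy column) is to add an Execute SQL Task after the file export and map the SSIS system variable System::PackageName to its parameter:

-- Run from an Execute SQL Task after the export succeeds.
-- Map System::PackageName to parameter 0 on the Parameter Mapping tab;
-- the table and column names here are hypothetical.
UPDATE dbo.ExportLog
SET    ExportedBy = ?          -- receives the package name at run time
WHERE  ExportedBy IS NULL;     -- flag only rows not yet exported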

Related

Creating Netezza table from csv file with header

I need to create a table in Netezza from a csv file with a header. The target table should have the same columns as the headers in the source file, and its structure should change flexibly based on the source file headers. Is that possible?
I don't know what your expectations are for this. You will need datatypes as well as column names, and short of using NVARCHAR(100) for everything and crossing your fingers, I don't know how to address that from the csv file alone...
If you can get your source system to provide another csv file with metadata in the form COLNAME,DATATYPE(Precision), you can certainly turn that into a valid CREATE TABLE statement in Netezza, and then use something like this to get you the rest of the way:
CREATE EXTERNAL TABLE demo_ext SAMEAS emp USING (dataobject
('/tmp/demo.out') DELIMITER '|');
More info here:
https://www.ibm.com/support/knowledgecenter/en/SSULQD_7.2.1/com.ibm.nz.load.doc/c_load_create_external_tbl_expls.html
From the question, I'm assuming the requirement is that you need to create a new table or alter the existing one every time based on your source csv file. Is that right?
@Moutusi Das
As mentioned by Lars G Olsen, the option would be to have a separate file for table creation which contains the column names and datatypes. You can use a Unix script to read the table-creation file and create/alter the table in Netezza.
Then load the table using the external table command.
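Putting those two steps together, a minimal sketch might look like this (the table and column names are hypothetical, and SKIPROWS 1 assumes the csv's first line is the header):

-- DDL generated by the script from the metadata file
CREATE TABLE emp (
    emp_id   INTEGER,
    emp_name VARCHAR(100)
);

-- Load the csv through a transient external table
INSERT INTO emp
SELECT * FROM EXTERNAL '/tmp/demo.csv'
SAMEAS emp
USING (DELIMITER ',' SKIPROWS 1);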

Does SSIS consider wild card characters in CSV input?

I have a SQL table that stores a filename and an SSIS package name. Whenever a file gets dropped into a directory, the corresponding SSIS package gets triggered by looking it up in the mapping table.
If I store the file name as, say, a*.csv in the database and the corresponding SSIS package as sample-ssis.dtsx, will I be able to trigger the same package for any csv file starting with "a"? Can someone please help me with this.
Sure, you can read the file name into a variable and use a script task to loop through your mapping table and see if any of the filename-with-wildcard entries in the mapping table match the file name in the variable.
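If you would rather do the match in the database than in .NET, a sketch of the lookup (assuming a hypothetical mapping table dbo.PackageMap) could convert the stored wildcard into a LIKE pattern:

-- Convert the stored wildcard (a*.csv) into a LIKE pattern (a%.csv),
-- and ? into _, then match the incoming file name.
-- Table and column names are hypothetical.
DECLARE @FileName NVARCHAR(260) = N'abc123.csv';  -- the dropped file's name

SELECT PackageName
FROM   dbo.PackageMap
WHERE  @FileName LIKE REPLACE(REPLACE(FilePattern, '*', '%'), '?', '_');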

Use SSIS to import multiple .csv files that each have unique columns

I keep running into issues creating an SSIS project that does the following:
inspect a folder for .csv files -> for each csv file -> insert into [db].[each csv file's name]
Each csv file and its corresponding table in the database have their own unique columns.
I've tried the foreach loop found in many write-ups, but the issue comes down to the flat file connection: it expects each csv file to have the same columns as the file before it and errors out when not presented with those column names.
Is anyone aware of a workaround for this?
Every flat file format would have to have its own connection, because the connection is what tells SSIS how to interpret the data set contained within the file. If it didn't exist, it would be the same as telling SQL Server you want data out of a database without specifying a table or its columns.
I guess the thing you have to consider is: how are you going to tell a data flow task which column in a source component maps to which column in a destination component? Will it always be the same column name? Without a connection manager there is no way to map the columns unless you do it dynamically.
There are still a few ways you can do what you want, and you just need to search around, because I know there are answers on this subject.
You could create a Script Task and do the import in .NET.
You could create an Execute SQL Task and use BULK INSERT or OPENROWSET into a temporary staging table, and then use dynamic SQL to map and import into the final table, as sketched below.
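A rough sketch of that staging approach (all object names here are hypothetical, and dbo.Staging is assumed to be recreated for, or wide enough to hold, each incoming file):

-- @FilePath and @TableName would come from the ForEach loop variables.
DECLARE @FilePath  NVARCHAR(260) = N'C:\drop\customers.csv';
DECLARE @TableName SYSNAME       = N'customers';
DECLARE @sql       NVARCHAR(MAX);

-- Stage the raw rows (FIRSTROW = 2 skips the header line).
SET @sql = N'BULK INSERT dbo.Staging FROM ''' + @FilePath + N'''
    WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2);';
EXEC sys.sp_executesql @sql;

-- Move the staged rows into the matching destination table.
SET @sql = N'INSERT INTO ' + QUOTENAME(@TableName) + N' SELECT * FROM dbo.Staging;';
EXEC sys.sp_executesql @sql;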
Try keeping a mapping table with the columns below:
FileLocation
FileName
TableName
Add all the details to the table.
Create user variables for all the column names and one for the result set.
Read the data from the table using an Execute SQL Task and keep it in the single result-set variable (see the sketch after this list).
In the For Each Loop container's variable mappings, map all the columns to the user variables.
Create two connection managers, one for Excel and the other for the csv file.
Pass the CSV file connection string as @[User::FileLocation]+@[User::FileName].
Inside the For Each Loop container use Bulk Insert, and assign the source and destination connections as well as the table name from the User::TableName variable.
If you need any details, please post and I will try to help.
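For reference, the Execute SQL Task query feeding the result set could be as simple as this (the mapping table name dbo.FileMapping is hypothetical):

-- Full result set for the Execute SQL Task; the three columns line up
-- with the ForEach loop's variable mappings.
SELECT FileLocation, FileName, TableName
FROM   dbo.FileMapping;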
You could look into BiML Script, which dynamically creates and executes a package, based on available meta data.
I've got two options for you here:
1) A Script Component, to dynamically create the table structures in SQL Server.
2) Within the For Each Loop container, an Execute SQL Task with an OPENROWSET clause, as sketched below.
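For option 2, the Execute SQL Task statement could look something like this (ad hoc distributed queries must be enabled, and the file path, format file, and table name are all hypothetical; on SQL 2014, delimited text read through OPENROWSET(BULK ...) needs a format file):

-- Pull the csv straight into a new table per loop iteration.
SELECT *
INTO   dbo.Customers
FROM   OPENROWSET(BULK 'C:\drop\customers.csv',
                  FORMATFILE = 'C:\drop\customers.fmt',
                  FIRSTROW   = 2) AS src;   -- FIRSTROW = 2 skips the header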

Inserting data into the newly created tables from table variable SSIS and not one single table

I have been searching for about a week now and I was wondering if anyone might have a clue. I wrote a package to do the following:
loop through a parent folder and its subfolders for csv files with a particular naming structure (works)
create a table for each .csv based on the enumeration of each file (works)
import the data into SQL Server into each file's own table, using the enumerated file name as the table name in the OLE DB Destination (which does not work). It works if there is one fixed destination table for everything, but when I use a table-name variable it does not work.
What I did was add an Execute SQL Task to the ForEach container to create a table, with a variable for the file path mapped as an expression in the ForEach container, inside a create-table query under the SqlStatementSource expression property. The tables are created, but when I use the variable that was mapped in the ForEach loop as the table name variable in the OLE DB Destination, I get an error asking me to check whether the table exists. The tables are created, but I cannot get the data inserted into their own tables, even when I bypass the "Destination table has not been provided" error and run the package. I set DelayValidation to true and still nothing. SSIS, from what I have seen so far, does some cool things; however, I am stuck right now. What else am I doing wrong?
I forgot to mention that the data is going to SQL Server.
Thanks for everything.
You can't create an OLE DB Destination at design time with a variable for a table name. The OLE DB Destination needs to know the table name and the columns so that it can pre-map the data flow to the table columns.
You have a couple of other options:
You can use BiML to dynamically create your dataflows and destinations.
You can use an Execute SQL transformation as your dataflow destination, and write a dynamic SQL statement that inserts each row of the dataflow into the desired table, as sketched below.
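A sketch of what that per-row dynamic insert could look like on the SQL side (table and column names are hypothetical; in practice the table name would come from the ForEach loop variable and the values from the dataflow row):

-- Build the INSERT around the run-time table name, then pass the row
-- values as parameters.
DECLARE @TableName SYSNAME = N'Import_File1';
DECLARE @sql NVARCHAR(MAX) =
    N'INSERT INTO ' + QUOTENAME(@TableName) + N' (Col1, Col2) VALUES (@p1, @p2);';

EXEC sys.sp_executesql @sql,
     N'@p1 NVARCHAR(100), @p2 NVARCHAR(100)',
     @p1 = N'value1', @p2 = N'value2';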

SSIS Dynamic Mapping column

I'm a little new to SSIS and I need to import some flat files into SQL tables of the same structure.
(Assume the table already exists with the same structure, and the table name and flat file name are the same.)
I thought to create a generic package (SQL 2014) to import all those files by looping through a folder.
I tried to create a data flow task in a Foreach Loop container; in the data flow task I dropped a Flat File Source and an ADO.NET Destination.
I have set the file source to a variable so that every time it loops through it gets the new file; similarly, I set the ADO.NET table name to a variable so that each time it selects a different table according to the file name.
Since both the source column names and the destination column names are the same, I assumed it would map the columns automatically.
But with a simple map it didn't let me run the package, so I added a column on the source, selected a table, and mapped it.
When I ran the package I assumed it would automatically re-map everything, but while the first file ran, the second file failed, complaining about mapping issues.
Can someone let me know whether this is achievable by doing some dynamic mapping, or in any other way?
Any help would be much appreciated.
Thanks,
Ned
