I know how to parameterize the OLE DB destination: we can do it by changing the data access mode to "Table name or view name variable". But I want parameterization at the Data Flow Task (DFT) level, the same way the SQL command of a Lookup transformation can be parameterized at the Data Flow Task level (as per the attached screenshot).
In the same way as the Lookup, can we parameterize the OLE DB destination table name at the DFT level?
We have around 5000 tables in Oracle, and the same 5000 tables exist in SQL Server. Each table's columns change frequently, but at any point in time the source and destination columns will always be the same. Creating 5000 Data Flow Tasks is a big pain, and the mapping has to be redone every time a table definition changes, such as when a column is added or removed.
We tried SSMA (SQL Server Migration Assistant for Oracle), but it was very slow at transferring a huge amount of data, so we moved to SSIS.
I have followed the approach below in SSIS:

- I created a staging table that holds a table name, a source query (Oracle), and a target query (SQL Server), used that table in an Execute SQL Task, and stored its result set as a full result set.
- I created a Foreach Loop Container off that Execute SQL Task's result set, with the object variable and three variables: table name, source query, and destination query.
- In the Data Flow Task source I chose an OLE DB Source with the Oracle connection and set the data access mode to "SQL command from variable" (passing the source query from the loop mapping variable).
- In the Data Flow Task destination I chose an OLE DB Destination with the SQL Server connection and set the data access mode to a SQL command from a variable (passing the target query from the loop mapping variable).

I am looping this over all 5000 tables, but it is not working. Can you please guide me on how to build this dynamically for 5000 tables from Oracle to SQL Server using SSIS? Any sample code/help would be greatly appreciated. Thanks in advance.
Using SSIS, when thinking about a dynamic source or destination, you have to take into consideration that the only case where you can do that is when the metadata is well defined at run-time. In your case:

Each table's columns vary frequently, but at any point in time the source and destination columns will always be the same.

You have to think about building packages programmatically rather than looping over tables; a sketch of the generation loop is shown below.
Yes, you can use loops in case you can classify the tables into groups based on their metadata (column names, data types, ...). Then you can create a package for each group.
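As a starting point, here is a minimal sketch of that generation loop using the SSIS runtime API. It only creates a skeleton .dtsx per table; wiring the source and destination components into the pipeline is covered by the articles linked at the end of this answer. The table names and output path are hypothetical.

```csharp
// Generates one skeleton package per table with the SSIS runtime API
// (Microsoft.SqlServer.ManagedDTS assembly).
using Microsoft.SqlServer.Dts.Runtime;

class PackageGenerator
{
    static void Main()
    {
        var app = new Application();
        string[] tables = { "EMPLOYEES", "DEPARTMENTS" }; // hypothetical list

        foreach (string table in tables)
        {
            var pkg = new Package { Name = "Import_" + table };

            // Add an empty Data Flow Task to the control flow; the pipeline
            // components are added separately through the pipeline API.
            Executable exec = pkg.Executables.Add("STOCK:PipelineTask");
            ((TaskHost)exec).Name = "DFT Import " + table;

            app.SaveToXml(@"C:\Packages\Import_" + table + ".dtsx", pkg, null);
        }
    }
}
```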
If you are familiar with C#, you can dynamically import tables without the need for SSIS. You can refer to the following project to learn more about reading from Oracle and importing into SQL Server using C#:
Github - SchemaMapper
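For illustration, here is a minimal sketch of that plain C# approach, assuming the Oracle managed provider (Oracle.ManagedDataAccess), matching column layouts on both sides (as stated in the question), and hypothetical connection strings:

```csharp
// Streams one Oracle table into its identically-shaped SQL Server
// counterpart using SqlBulkCopy; call it in a loop over your table list.
using System.Data.SqlClient;
using Oracle.ManagedDataAccess.Client;

class OracleToSqlCopier
{
    static void CopyTable(string tableName, string oracleConnStr, string sqlConnStr)
    {
        using (var src = new OracleConnection(oracleConnStr))
        using (var dst = new SqlConnection(sqlConnStr))
        {
            src.Open();
            dst.Open();

            var cmd = new OracleCommand("SELECT * FROM " + tableName, src);
            using (var reader = cmd.ExecuteReader())
            using (var bulk = new SqlBulkCopy(dst))
            {
                bulk.DestinationTableName = tableName;
                bulk.BulkCopyTimeout = 0;   // no timeout for large tables
                bulk.BatchSize = 10000;
                bulk.WriteToServer(reader); // streams rows, low memory use
            }
        }
    }
}
```

Because the source and destination columns are identical, the default ordinal column mapping of SqlBulkCopy is enough here; otherwise you would populate bulk.ColumnMappings explicitly.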
I will provide some links that you can refer to for more information about creating packages programmatically and dynamic column mapping:
How to manage SSIS script component output columns and its properties programmatically
How to Map Input and Output Columns dynamically in SSIS?
Implementing Foreach Looping Logic in SSIS
I have an Excel file with some columns, as shown below.
I am using it as Excel Source in SSIS package.
I have unpivoted the columns 2012, 2013, and 2014 using the Unpivot transformation, whose results look like:
How can I send the output of this Unpivot transformation out of its Data Flow and run an Execute SQL Task on it, or send it to another Data Flow in the same package for various transformations? I do not want to write the output of the Unpivot transformation to an OLE DB destination and then read it back.
Thanks in advance.
If you need to execute a SQL query against the Excel data, you can do it inside the Data Flow Task using an OLE DB Command transformation (the OLE DB Command transformation runs a SQL statement for each row in a data flow).
Or you can use a Recordset Destination to store the data in an in-memory ADO recordset held in an Object variable and use it outside the Data Flow Task; a sketch of reading that variable back follows the links below.
You can refer to the following links to learn more:
SSIS OLE DB Command Transformation
Microsoft Docs - OLE DB Command Transformation
Use a Recordset Destination
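For the Recordset Destination option, here is a minimal sketch of reading the Object variable back inside a Script Task; the variable name User::UnpivotedRows is hypothetical:

```csharp
// Loads the ADO recordset produced by the Recordset Destination into a
// DataTable. Add User::UnpivotedRows as a ReadOnly variable on the task.
using System.Data;
using System.Data.OleDb;

public void Main()
{
    var table = new DataTable();
    new OleDbDataAdapter().Fill(table, Dts.Variables["User::UnpivotedRows"].Value);

    foreach (DataRow row in table.Rows)
    {
        // Work with each unpivoted row here (or hand the DataTable to
        // whatever logic you need outside the Data Flow).
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}
```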
Is there any way in SSIS through which I can get the data first and then use that data inside my Script Task to generate a SQLite file?
You can use an OLE DB connection manager and an OLE DB Source together to pull the data out of SQL Server. Then add a Script Component, configure it as a destination, select the input columns, and add the code necessary to fill your SQLite tables.
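As an illustration only, here is a minimal sketch of such a Script Component destination, assuming the System.Data.SQLite provider is referenced from the script project; the file path, table name, and input columns (Id, Name) are hypothetical placeholders for your own schema:

```csharp
// Script Component (destination) that writes each incoming row to SQLite.
using System.Data.SQLite;

public class ScriptMain : UserComponent
{
    private SQLiteConnection conn;
    private SQLiteCommand insert;

    public override void PreExecute()
    {
        base.PreExecute();
        conn = new SQLiteConnection(@"Data Source=C:\temp\export.sqlite;Version=3;");
        conn.Open();

        // Create the target table if this is a fresh file.
        new SQLiteCommand(
            "CREATE TABLE IF NOT EXISTS MyTable (Id INTEGER, Name TEXT)",
            conn).ExecuteNonQuery();

        insert = new SQLiteCommand(
            "INSERT INTO MyTable (Id, Name) VALUES (@Id, @Name)", conn);
        insert.Parameters.Add("@Id", System.Data.DbType.Int32);
        insert.Parameters.Add("@Name", System.Data.DbType.String);
    }

    // Called once per row flowing into the component.
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        insert.Parameters["@Id"].Value = Row.Id;
        insert.Parameters["@Name"].Value = Row.Name;
        insert.ExecuteNonQuery();
    }

    public override void PostExecute()
    {
        insert.Dispose();
        conn.Close();
        base.PostExecute();
    }
}
```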
I am using the OLE DB Source component as part of a Data Flow Task, but I would like to keep the SQL query in an external file rather than embedded in the task itself. Is there an easy way to accomplish this?
You could use a Script Task in your control flow to read the SQL query from the file and store it in a string variable. You could then use that variable as the source for your OLE DB Source component (data access mode "SQL command from variable").
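For example, a minimal Script Task sketch, where the variable names are hypothetical:

```csharp
// Reads the query text from a file into a package variable. List
// User::QueryFilePath under ReadOnlyVariables and User::SourceQuery under
// ReadWriteVariables in the task editor.
public void Main()
{
    string path = Dts.Variables["User::QueryFilePath"].Value.ToString();
    Dts.Variables["User::SourceQuery"].Value = System.IO.File.ReadAllText(path);
    Dts.TaskResult = (int)ScriptResults.Success;
}
```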
How can I make identical output from a transformation go to two separate places, e.g., an OLE DB destination and a DataReader destination?
Background:
I have an existing package that reads data from a text file, does some transformations, and loads the data into a SQL Server table.
Now I'm trying to make the package callable from a Reporting Services (SSRS) report. I'm following the instructions here: http://msdn.microsoft.com/en-us/library/ms159215.aspx
It says to make my data go into a DataReader destination so the report will have access to it. So I want the output of the final transformation to go to both the SQL table and the DataReader destination.
Use a Multicast transformation and send its output to both a DataReader destination and an OLE DB destination in your SSIS package.
When you create your datasets in SSRS, use the name of the output object from your SSIS package. Your dataset in the report should then populate with the fields and data from the SSIS package.
Perhaps the Multicast transformation?