SSIS variables from an Excel file - sql-server

I have a stored procedure on a server that generates a table in my database. In SSIS I query some columns from that table and then append some dummy columns filled with static values. I hold the query in a variable (SQL command from variable); in that query I use something like select a, b, c from X where @[User::variable1] = '' and @[User::variable2] = '', and so on for all four variables.
My question is: I need to be able to change the values of those variables (variable1 to variable4) for 48 different scenarios (possibly more), so replacing them manually would be a pain since it would lead to over 130 combinations. Is there a way to pass the values from an Excel file to the package at runtime?
ex:
column1      column2   column3   column4
12.03.2015   def       ghi       jkl
12.04.2015   456       789       012
..
..
I need to loop through all the rows in the Excel file, and the results should be exported to files.
Everything I described above I have already built, except for the part where I get the values for those four variables from the Excel file. I need help only with that part.
Any help would be great.
Thank you,
Cristian

Yes, this is possible.
Create a connection to Excel
Create a transit table to store the Excel content (with column names matching the Excel columns)
Create a Data Flow task to transfer the content from Excel into the transit table
Create an Execute SQL Task
Fetch rows from the transit table one by one in a loop or cursor
Dynamically build a SQL string from the values read from the transit table
Execute it using sp_executesql
Use a "Result set" if you want to output any recordset
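The looping steps above can be sketched in T-SQL like this (a minimal sketch; TransitTable and its column names are assumptions based on the Excel layout in the question, and the filter columns in X are hypothetical):

```sql
-- Assumed transit table loaded from Excel by the Data Flow task
-- (table and column names are hypothetical).
DECLARE @v1 nvarchar(50), @v2 nvarchar(50), @v3 nvarchar(50), @v4 nvarchar(50);
DECLARE @sql nvarchar(max);

DECLARE scenario_cursor CURSOR FOR
    SELECT column1, column2, column3, column4 FROM dbo.TransitTable;

OPEN scenario_cursor;
FETCH NEXT FROM scenario_cursor INTO @v1, @v2, @v3, @v4;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Parameterize the query instead of concatenating the values,
    -- so quoting and injection are not a concern.
    SET @sql = N'SELECT a, b, c FROM X
                 WHERE col1 = @p1 AND col2 = @p2 AND col3 = @p3 AND col4 = @p4;';
    EXEC sp_executesql @sql,
         N'@p1 nvarchar(50), @p2 nvarchar(50), @p3 nvarchar(50), @p4 nvarchar(50)',
         @p1 = @v1, @p2 = @v2, @p3 = @v3, @p4 = @v4;

    FETCH NEXT FROM scenario_cursor INTO @v1, @v2, @v3, @v4;
END

CLOSE scenario_cursor;
DEALLOCATE scenario_cursor;
```

Each iteration corresponds to one scenario row from the Excel file; what you do with each result set (here, the SELECT output) depends on how you export to files.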

Related

Inserting data into SQL table from SAS dataset isn't working as expected

I am attempting to insert a SAS dataset into an existing table on a SQL Server. This is via a simple proc sql statement.
proc sql;
insert into Repo.Test_Table
select * from Work.MetaTable;
quit;
One of the fields, [Method], is not inserting into the SQL table as expected. The [Method] field in the SAS table contains several brackets and other punctuation so I think this is causing a problem. For example, the Work.MetaTable looks like this:
Field_ID   Method
1          ([Field_1]<=[Field_8])
2          ([Field_4]=[Field_5])
When I run the proc sql to insert this into SQL, it only inserts the first open bracket "(" and this is the case for every row. For example, those two rows look like this in the SQL table:
Field_ID   Method
1          (
2          (
The [Method] field in SQL is nvarchar(max).
Does anyone know what might be causing the issue here and how I can get around it?

SSIS: enrich query and table with input file as base

I need to extract data from a DB2 database to SQL Server. I need to build my queries based on an Excel file: I have 176 records, and for each one I need to create a repeating query and put the result into SQL Server.
So, for example:
I have an Excel file with a Number, From date, To date, and a Country.
The query should use this information from each record:
SELECT *
FROM dbo.Test
WHERE Number = excel.Number1 AND Date BETWEEN excel.fromDate1 AND excel.toDate1 AND Country = excel.country1
And then another query with
SELECT *
FROM dbo.Test
WHERE Number = excel.Number2 AND Date BETWEEN excel.fromDate2 AND excel.toDate2 AND Country = excel.country2
Etc...
How should I do something like this in SSIS?
If needed I can put the DB2 and Excel data in MS SQL
You can proceed with the following approach:
Extract the data rows from Excel and put them into an SSIS Object variable
Use a Foreach Loop to get each row from the Object variable, mapping the row values to separate variables
Inject the variable values into the SQL SELECT command with expressions
Run a Data Flow task based on that SQL command, transform the data, and load it into the target
Overall, your task seems feasible, but it requires some knowledge of enumerating an Object variable in a Foreach Loop and writing variable expressions.
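The expression step can be sketched like this (a sketch only; it assumes string variables User::Number, User::FromDate, User::ToDate, and User::Country have been mapped in the Foreach Loop). Set it as the Expression on the string variable that holds the SQL command, with EvaluateAsExpression set to True:

```
"SELECT * FROM dbo.Test
 WHERE Number = " + @[User::Number] + "
   AND Date BETWEEN '" + @[User::FromDate] + "' AND '" + @[User::ToDate] + "'
   AND Country = '" + @[User::Country] + "'"
```

The Data Flow's source then uses that variable as "SQL command from variable", so each loop iteration runs the query for one Excel record.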

ETL Script to dynamically map multiple EXECUTE SQL resultset to multiple tables (table name based on sql file provided)

I have a source folder with SQL files (I could set them up as stored procedures as well). I know how to loop over and execute SQL tasks in a Foreach container. The part where I'm stuck: I need to take the final result set of each SQL query and shove it into a table with the same name as the SQL file.
So, Folder -> script1.sql, script2.sql, etc. -> ETL -> goes to table script1, table script2, etc.
EDIT: Based on the comment made by Joe, I just want to say that I'm aware of using an insert within a script, but I need to insert into a table on a different server, and linked servers are not an ideal solution.
Any pseudocode or link to tutorials would be extremely helpful. Thanks!
I would add the table creation to the script. It is probably the simplest way to do this. If your script is Select SomeField From Table1, you could change it to Select SomeField Into script1 From Table1. Then there is no need to map columns in SSIS, which, in my experience, is not easy to do.
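A minimal sketch of that suggestion, using the example script from the answer (the table and column names are just the example's):

```sql
-- Original script1.sql:
-- SELECT SomeField FROM Table1;

-- Rewritten so the script itself creates and fills a table named after the file.
-- SELECT ... INTO creates the target table, so drop it first in case it exists:
IF OBJECT_ID('dbo.script1') IS NOT NULL
    DROP TABLE dbo.script1;

SELECT SomeField
INTO dbo.script1
FROM Table1;
```

Each script would get the same treatment with its own file name as the target table name.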

How to evaluate a sql string stored in excel column using SQL Server Integration Services?

I know how to insert data from Excel columns into a table (I followed a very basic tutorial: use an Excel source, a data conversion, and an OLE DB destination).
But my problem is a little more elaborate: I have ~100 Excel files that store in column E a SQL string created with a formula like =CONCATENATE("insert into sometable (", IF(B2 > ...), C2, ")").
Using Integration Services, I want to execute the value of column E of every Excel file.
|A   |B  |C       |D  |E                          |
|Juan|200|'Activo'|...|'insert into sometable ...'|
How to achieve this?
You may want to make sure that column E has a header. Inside a Foreach Loop container, extract just that one column (which holds your query) and store it in a string variable. Within that Foreach Loop container, put an Execute SQL Task and use dynamic SQL to execute the query. The caveat is that if the package fails, you will need to know which statement it failed on so that you don't have to repeat the whole process. And if there are any spurious values in column E, your databases would be bombed.
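In T-SQL terms, the Execute SQL Task ends up doing the equivalent of the following (a sketch; @stmt stands in for the SSIS string variable filled from column E, and the statement shown is a made-up example):

```sql
DECLARE @stmt nvarchar(max);
-- In the package, this value comes from the string variable holding column E.
SET @stmt = N'insert into sometable (colA, colB) values (''Juan'', 200)';
EXEC sp_executesql @stmt;
```

Logging each executed statement to a table before running it would make it possible to resume after a failure instead of repeating the whole batch.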

How to insert retrieved rows into another table using ssis

I have a table with 500 rows. I want to retrieve only 10 rows and insert them into another table using the control flow only. Through a Data Flow task we could use an OLE DB source and OLE DB destination, but I want to get the result using an Execute SQL Task and a Foreach Loop. Is it possible to do it that way? My idea is: get the set of ten records, then use a Foreach Loop to iterate over every row and insert it into the table with an Execute SQL Task. The destination table needs to be created on the fly. I tried some approaches but am not getting anywhere. Please see the attached image.
Example taken from Northwind.
Create variables (in the variables collection) that represent the columns of the table you'll create at runtime.
Example:
CustomerID as String
OrderID as Int32
Then create an Execute SQL Task with the query below to select the first 10 rows:
Select top 10 * from orders
Use "Full result set", and in the Result Set configuration store the table rows in a variable: Variable Name: User::Result, Result Name: 0.
Drop in another Execute SQL Task and create the table on the fly:
IF OBJECT_ID('myOrders') IS NOT NULL
    DROP TABLE myOrders;

CREATE TABLE myOrders (
    OrderID int,
    CustomerID varchar(50)
);
Combine the two flows from the Execute SQL Tasks and connect them to the Foreach Loop.
Drag in a Foreach Loop. In Collection, set the enumerator type to Foreach ADO Enumerator.
In the enumerator configuration, select the User::Result variable, which stores the top 10 rows from the Execute SQL Task, and select the radio button "Rows in the first table".
In variable mappings, map the column variables you created in the first step; the index will be 0 for the first column and 1 for the second column.
Drag an Execute SQL Task inside the Foreach Loop and write the query below:
Insert into myOrders( OrderID,CustomerID)
values
(?,?)
Map the parameters in the Parameter Mapping configuration of the Execute SQL Task:
Variable Name: OrderID, Direction: Input, Data Type: LONG, Parameter Name: 0
Variable Name: CustomerID, Direction: Input, Data Type: VARCHAR, Parameter Name: 1
I hope you are doing this in "study mode"; there is no reason to do it in the control flow rather than the data flow.
Anyway, your screenshot is correct. I would just add another Execute SQL Task at the beginning to create your destination table.
Then your Execute SQL Task should have the query that brings back the 10 rows you want; its result set should be set to "Full result set", and on the Result Set tab you should map the result set to an Object variable.
Configure your Foreach Loop container as a Foreach ADO Enumerator over that variable.
On each iteration of the Foreach Loop you will have access to the values in the variables; you can then use another Execute SQL Task to insert them into the newly created table.