I am new to developing SSIS packages and need some help coming up with a solution.
I have 10 different stored procedures whose output I have to export to a text file. All 10 procedures return the same set of columns; only the calling parameters differ.
I cannot figure out how to approach this.
Could you please help me understand how to export the output of a stored procedure to a tab-delimited text file, and how to build the SSIS package?
Thanks
This is hard to explain without a picture for each step. I do not seem to be able to include pictures, so I will try to describe it in as much detail as I can.
First, set up a connection to the database you are going to run the stored procedures against. This means creating a connection manager via "New OLE DB Connection". You will need valid login information for the database to create this connection.
In the Control Flow I would add an "Execute SQL Task". Set the result set to Full result set and set the connection to the one you created in the prior step. To call a stored procedure from the SQL task, use something like "exec ? = dbo.usp_check_load_table_all @JobCode = ?, @TransId = ?, @Status = ?, @TurnStatusOff = ?". The first ? is the return code from the stored procedure; the others are the parameters the procedure runs with. You are running 10 different stored procedures and I only know how to run one, but you could create ten packages, one for each, and concatenate the files when they are done. In the parameter mapping you set the variables that supply the values. Make sure to create a USER::ReturnValue variable of type Long for the return code. The result set needs one entry, a USER::Results variable of type Object.
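To make the parameter mapping concrete, here is a rough sketch of the SQLStatement and a typical OLE DB parameter mapping. The User:: variable names and data types for the four input parameters are placeholders I have invented; only User::ReturnValue comes from the description above.
-- Hypothetical SQLStatement for the Execute SQL Task (OLE DB connection).
-- Parameter Mapping (by ordinal):
--   0 -> User::ReturnValue    (Long,   Direction: ReturnValue)
--   1 -> User::JobCode        (String, Direction: Input)   -- placeholder variable
--   2 -> User::TransId        (Long,   Direction: Input)   -- placeholder variable
--   3 -> User::Status         (String, Direction: Input)   -- placeholder variable
--   4 -> User::TurnStatusOff  (String, Direction: Input)   -- placeholder variable
EXEC ? = dbo.usp_check_load_table_all
    @JobCode = ?, @TransId = ?, @Status = ?, @TurnStatusOff = ?;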
Now add a Foreach Loop with an ADO enumerator, using USER::Results as the source variable. This lets you read each row one at a time. You must create user variables for the variable mappings to go into.
I would then add a Data Flow Task with a Derived Column transformation and set up each of the fields you want to write to the file from the USER:: variables you created for the foreach loop.
I would create a flat file connection in the Connection Managers area as a delimited file, tab delimited. You will need a file that looks like the output you want, because you will need to map each field in the file.
Add a Flat File Destination under the Derived Column task and point it at the flat file connection you just created. Now map each field to the output.
I hope this is helpful, as I was once new to SSIS myself.
I am a newbie to SSIS and am currently struggling with an Execute SQL Task: saving its result into a result set and exporting each table to its own CSV using a Data Flow Task.
There are 15 .sql files stored in a folder, and I have created a variable called FolderPath pointing to it. Then I create a Foreach Loop container that reads from the folder, with a variable called SQLfile in its variable mappings.
Inside the Foreach container I have an Execute SQL Task whose file connection I made variable, editing the expression to FolderPath + SQLfile.
Executing this loop works when the result set is set to None. Now I am trying to connect the tables created by this loop to a Data Flow Task. I have no idea how to do this, but I am guessing it has something to do with the result set. When I change the result set to Full result set, my loop breaks. I am assuming you can't have a result set inside a loop.
Now I am completely lost, as I don't know how to save the results of those 15 tables or how to declare them as the source in the Data Flow Task.
Store the query in a variable. Then use the OLE DB Source component in the Data Flow Task and set its data access mode to "SQL command from variable".
I keep running into issues creating an SSIS project that does the following:
inspect a folder for .csv files -> for each csv file -> insert into [db].[each .csv file's name]
Each csv and its corresponding table in the database has its own unique columns.
I've tried the foreach loop found in many write-ups, but the issue comes down to the flat file connection: it seems to expect each csv file to have the same columns as the file before it, and errors out when not presented with those column names.
Is anyone aware of a workaround for this?
Every flat file format would have to have its own connection, because the connection is what tells SSIS how to interpret the data set contained within the file. If it didn't exist, it would be the same as telling SQL Server you want data out of a database without specifying a table or its columns.
I guess the thing you have to consider is: how are you going to tell a Data Flow Task which column in the source component maps to which column in the destination component? Will it always be the same column name? Without a connection manager there is no way to map the columns unless you do it dynamically.
There are still a few ways you can do what you want; you just need to search around, because I know there are answers on this subject.
You could create a Script Task and do the import in .NET.
You could create an Execute SQL Task and use BULK INSERT or OPENROWSET to load a temporary staging table, and then use dynamic SQL to map and load the final table, as sketched below.
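A rough sketch of that staging-table approach, assuming placeholder file, column, and table names (in the package you would supply these via parameter mapping or by building the statement with an expression):
-- Hypothetical Execute SQL Task body: bulk load the raw file into a staging
-- table, then push it into the real target table with dynamic SQL.
DECLARE @TableName SYSNAME = N'MyTargetTable';    -- placeholder target table

IF OBJECT_ID('tempdb..#staging') IS NOT NULL DROP TABLE #staging;
CREATE TABLE #staging (Col1 VARCHAR(255), Col2 VARCHAR(255), Col3 VARCHAR(255));

BULK INSERT #staging
FROM 'C:\import\file1.csv'                        -- placeholder file path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

DECLARE @sql NVARCHAR(MAX) =
      N'INSERT INTO dbo.' + QUOTENAME(@TableName)
    + N' SELECT Col1, Col2, Col3 FROM #staging;';
EXEC sys.sp_executesql @sql;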
Try keeping a mapping table with the columns below:
FileLocation
FileName
TableName
Add all of the file details to this table.
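A minimal sketch of what that mapping table could look like; the table name and sample values here are placeholders:
-- Hypothetical mapping table; the Execute SQL Task below would simply run
-- SELECT FileLocation, FileName, TableName FROM dbo.FileImportMapping
-- and store the rows in the Object-typed result set variable.
CREATE TABLE dbo.FileImportMapping
(
    FileLocation VARCHAR(260) NOT NULL,   -- e.g. 'C:\imports\'
    FileName     VARCHAR(260) NOT NULL,   -- e.g. 'customers.csv'
    TableName    SYSNAME      NOT NULL    -- e.g. 'Customers'
);

INSERT INTO dbo.FileImportMapping (FileLocation, FileName, TableName)
VALUES ('C:\imports\', 'customers.csv', 'Customers'),
       ('C:\imports\', 'orders.csv',    'Orders');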
Create user variables for all of the column names and one for the result set.
Read the data from the mapping table using an Execute SQL Task and store it in the single result set variable.
In the Foreach Loop container's variable mappings, map all the columns to the user variables.
Create two connection managers, one for Excel and the other for the csv file.
Pass the CSV file connection string as @[User::FileLocation] + @[User::FileName].
Inside the Foreach Loop container, use a Bulk Insert Task and assign the source and destination connections, as well as the table name, from the User::TableName variable.
If you need any more details, please post and I will try to help.
You could look into BimlScript, which dynamically creates and executes a package based on the available metadata.
I have two options for you here.
1) A Script Component, to dynamically create the table structures in SQL Server.
2) Within a Foreach Loop container, use an Execute SQL Task with an OPENROWSET clause (a rough sketch follows below).
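The statement inside that loop might look roughly like this; the data file, format file, and table names are placeholders, and OPENROWSET(BULK ...) needs a format file to describe the csv layout:
-- Hypothetical per-file load, driven by the Foreach Loop variables.
INSERT INTO dbo.TargetTable
SELECT src.*
FROM OPENROWSET(
         BULK 'C:\imports\customers.csv',          -- placeholder data file
         FORMATFILE = 'C:\imports\customers.fmt',  -- placeholder format file
         FIRSTROW = 2
     ) AS src;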
I am trying to convert an old xp_cmdshell process that creates output files from within a cursor. I broke it down into a function call that pulls a list of dates and a stored procedure that processes an individual day's worth of data.
When I try to get this working in SSIS, I keep going around in circles. I have an OLE DB Source in a Data Flow Task that references the function call stored in a variable. I can pass that output to an OLE DB Command that runs the SQL command "exec ? = export_data ?".
This is where I get stuck. I need to export each date as a separate file, but I can't get the Flat File Destination to work. I tried editing the function call to add a second (null) column, "result", to contain the output from each command call, but all I get is the '0' return value instead of the in-memory table. I was able to get the Recordset Destination to work, but couldn't progress from there.
I know there's some extra work to be done with the flat file destination to get the distinct files, but I'd be happy just to get one file with some relevant data at this point.
I am working with SSIS 2008. I have a select query named sqlquery1 that returns some rows:
aq
dr
tb
This query is not implemented in the SSIS package at the moment.
I am calling a stored procedure from an OLE DB Source within a Data Flow Task. I would like to pass the data obtained from the query to the stored procedure parameter.
Example:
I would like to call the stored procedure by passing the first value aq
storedProcedure1 'aq'
then pass the second value dr
storedProcedure1 'dr'
I guess it would be something like a loop. I need this because the data generated by the OLE DB Source through the stored procedure needs to be sent to another destination, and this must be done for each record returned by sqlquery1.
I would like to know how to run the query sqlquery1 and pass its output as a parameter when calling the other stored procedure.
How do I do this in SSIS?
Conceptually, your solution will look like this: execute your source query to generate your result set and store it in a variable. Then iterate through those results, and for each row, call your stored procedure with that row's value and send the results to a new Excel file.
I'd envision your package looking something like this
An Execute SQL Task, named "SQL Load Recordset", attached to a Foreach Loop Container, named "FELC Shred Recordset". Nested inside that, I have a File System Task, named "FST Copy Template", which is a precedence constraint for a Data Flow Task, named "DFT Generate Output".
Set up
As you're a beginner, I'm going to try and explain in detail. To save yourself some hassle, grab a copy of BIDSHelper. It's a free, open source tool that improves the design experience in BIDS/SSDT.
Variables
Click on the background of your Control Flow. With nothing selected, right-click and select Variables. In the new window that pops up, click the button that creates a New Variable 4 times. The reason for clicking on nothing is that until SQL Server 2012, the default behaviour of variable creation is to create them at the scope of the current object. This has resulted in many lost hairs for new and experienced developers alike. Variable names are case sensitive so be aware of that as well.
Rename Variable to RecordSet. Change the Data type from Int32 to Object
Rename Variable1 to ParameterValue. Change the data type from Int32 to String
Rename Variable2 to TemplateFile. Change the data type from Int32 to String. Set the value to the path of your output Excel File. I used C:\ssisdata\ShredRecordset.xlsx
Rename the last variable (Variable3) to OutputFileName. Change the data type from Int32 to String. Here we're going to do something slightly advanced. Click on the variable and hit F4 to bring up the Properties window. Change the value of EvaluateAsExpression to True. In Expression, set it to "C:\\ssisdata\\ShredRecordset." + @[User::ParameterValue] + ".xlsx" (or whatever your file and path are). What this does is configure the variable to change as the value of ParameterValue changes, which helps ensure we get a unique file name. You're welcome to change the naming convention as needed. Note that you need to escape the \ any time you are in an expression.
Connection Managers
I have made the assumption you are using an OLE DB connection manager. Mine is named FOO. If you are using ADO.NET the concepts will be similar but there will be nuances pertaining to parameters and such.
You will also need a second Connection Manager to handle Excel. If SSIS is temperamental about data types, Excel is flat out psychotic-stab-you-in-the-back-with-a-fork-while-you're-sleeping about data types. We're going to wait and let the data flow actually create this Connection Manager to ensure our types are good.
Source Query to Result Set
The SQL Load Recordset is an instance of the Execute SQL Task. Here I have a simple query to mimic your source.
SELECT 'aq' AS parameterValue
UNION ALL SELECT 'dr'
UNION ALL SELECT 'tb'
What's important to note on the General tab is that I have switched my ResultSet from None to Full result set. Doing this makes the Result Set tab go from being greyed out to usable.
You can observe that I have assigned the Variable Name to the variable we created above (User::RecordSet) and that the Result Name is 0. That is important, as the default value, NewResultName, doesn't work.
FELC Shred Recordset
Grab a Foreach Loop Container and we will use that to "shred" the results that were generated in the preceding step.
Configure the enumerator as a Foreach ADO Enumerator. Use User::RecordSet as your ADO object source variable, and select "Rows in the first table" as your Enumeration mode.
On the Variable Mappings tab, you will need to select your variable User::ParameterValue and assign it the Index of 0. This will result in the zeroth element in your recordset object being assigned to the variable ParameterValue. It is important that you have data type agreement, as SSIS won't do implicit conversions here.
FST Copy Template
This is a File System Task. We are going to copy our template Excel file so that we have a well-named output file (it has the parameter name in it). Configure it as
IsDestinationPathVariable: True
DestinationVariable: User::OutputFileName
OverwriteDestination: True
Operation: Copy File
IsSourcePathVariable: True
SourceVariable: User::TemplateFile
DFT Generate Output
This is a Data Flow Task. I'm assuming you're just dumping results straight to a file so we'll just need an OLE DB Source and an Excel Destination
OLEDB dbo_storedProcedure1
This is where your data is pulled from your source system with the parameter we shredded in the Control Flow. I am going to write my query in here and use the ? to indicate it has a parameter.
Change your Data access mode to "SQL Command" and in the SQL command text that is available, put your query
EXECUTE dbo.storedProcedure1 ?
I click the Parameters... button and fill it out as shown
Parameters: @parameterValue
Variables: User::ParameterValue
Param direction: Input
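For context only, here is a hypothetical sketch of what dbo.storedProcedure1 could look like; the source table and filter column are invented, but the output columns match the Excel destination created below:
-- Hypothetical procedure: accepts the shredded value and returns the columns
-- (name, number, type, low, high, status) that the Excel destination expects.
CREATE PROCEDURE dbo.storedProcedure1
    @parameterValue VARCHAR(10)
AS
BEGIN
    SET NOCOUNT ON;
    SELECT name, number, type, low, high, status
    FROM dbo.SomeSourceTable          -- placeholder table
    WHERE category = @parameterValue; -- placeholder filter column
END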
Connect an Excel Destination to the OLE DB Source. Double-click it, and in the Excel Connection Manager section, click New... Determine whether you need the 2003 or 2007 format (.xls vs .xlsx) and whether you want your file to have header rows. For your File Path, put in the same value you used for your @[User::TemplateFile] variable and click OK.
We now need to populate the name of the Excel Sheet. Click that New... button and it may bark that there is not sufficient information about mapping data types. Don't worry, that's semi-standard. It will then pop up a table definition something like
CREATE TABLE `Excel Destination` (
`name` NVARCHAR(35),
`number` INT,
`type` NVARCHAR(3),
`low` INT,
`high` INT,
`status` INT
)
The "table" name is going to be the worksheet name, or precisely, the named data set in the worksheet. I made mine Sheet1 and clicked OK. Now that the sheet exists, select it in the drop down. I went with the Sheet1$ as the target sheet name. Not sure if it makes a difference.
Click the Mappings tab and things should auto-map just fine so click OK.
Finally
At this point, if we ran the package it would overwrite the template file every time. The secret is we need to tell that Excel Connection Manager we just made that it needs to not have a hard coded name.
Click once on the Excel Connection Manager in the Connection Managers tab. In the Properties window, find the Expressions section and click the ellipses ... Here we will configure the Property ExcelFilePath and the Expression we will use is
@[User::OutputFileName]
If your icons and such look different, that's to be expected. This was documented using SSIS 2012. Your work flow will be the same in 2005 and 2008/2008R2 just the skin is different.
If you run this package and it doesn't even start, and there is an error about ACE 12 or Jet 4.0 not being available, then you are on a 64-bit machine and need to tell BIDS/SSDT that you want to run in 32-bit mode.
Ensure the Run64BitRuntime value is False. This project setting can be found by right-clicking on the project and expanding the Configuration Properties; it will be an option under Debugging.
Further reading
A different example of shredding a recordset object can be found on How to automate the execution of a stored procedure with an SSIS package?
I am trying to create an SSIS package that queries data from a table, and calls a stored procedure in another database with each row.
In my old DTS package, I was doing this:
EXEC myStoredProcedure ?, ?, ?
...and then I mapped the parameters. However, in SSIS, I can't figure out how to make this work.
I have a Data Flow task, which first runs a query for the data. It passes the data to an OLE DB Destination. I set the Data access mode to "SQL command", but when I try to put in the SQL above, I get "Invalid Parameter Count" when it parses the SQL. I can't get to the Mappings screen. Any ideas?
In the Data Flow, the OLE DB Command transformation can be used to execute a SQL statement for each row in the data flow (MSDN documentation).
Alternatively, you can store the source result set in a variable of data type object and use a Foreach Loop container in the Control Flow (example here).
You will need to use an Execute SQL Task. In the SQLStatement section you can add the code to execute the stored procedure.
In order to pass in parameters, use the ? syntax and specify the parameters in the "Parameter Mapping" section.
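For illustration, a minimal sketch of that SQLStatement using the procedure from the question; the variable names in the mapping comment are placeholders:
-- Hypothetical SQLStatement for the Execute SQL Task (OLE DB connection).
-- Parameter Mapping tab (by ordinal):
--   0 -> User::Param1, 1 -> User::Param2, 2 -> User::Param3 (all Direction: Input)
EXEC dbo.myStoredProcedure ?, ?, ?;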
A good example can be found here.