Find source and destination tables across all packages in a solution? - sql-server

I have around 10 solutions .sln, each of them with multiple packages in it.
I would like to be able to get:
solution name
package name
mapping between the source and target columns
source and target table(s)
source connection or source DB
Would that be possible by parsing the .dtsx file? I understand I can save it to an .xml file.
For now, what I did was to configure the SQL Server Profiler and to run all packages to retrieve the current SQL queries, dumping them in a table and parsing them directly in SQL Server.
I wonder whether there is a better solution, using an external tool (like Biml) or parsing the files directly.
Any suggestions are appreciated.
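Since a .dtsx file is itself XML, the connection managers (and, deeper in, the column mappings) can be pulled out with any XML parser. Below is a minimal sketch in Python; the DTS namespace and attribute names match recent SSIS project formats, but exact element paths vary by version, so treat this as a starting point rather than a complete extractor. The inline sample fragment is hand-made for illustration.

```python
# Sketch: list connection managers from a .dtsx package (plain XML).
# Exact element paths vary across SSIS versions; this targets the
# common DTS namespace layout.
import xml.etree.ElementTree as ET

DTS = "www.microsoft.com/SqlServer/Dts"

def list_connection_managers(dtsx_xml: str):
    """Return (name, connection_string) pairs found in the package XML."""
    root = ET.fromstring(dtsx_xml)
    results = []
    for cm in root.iter(f"{{{DTS}}}ConnectionManager"):
        name = cm.get(f"{{{DTS}}}ObjectName")
        obj = cm.find(f"{{{DTS}}}ObjectData/{{{DTS}}}ConnectionManager")
        conn = obj.get(f"{{{DTS}}}ConnectionString") if obj is not None else None
        if name:  # skips the nested inner ConnectionManager element
            results.append((name, conn))
    return results

# Tiny hand-made fragment for illustration; real packages are much larger.
sample = f"""<DTS:Executable xmlns:DTS="{DTS}">
  <DTS:ConnectionManagers>
    <DTS:ConnectionManager DTS:ObjectName="SrcDB">
      <DTS:ObjectData>
        <DTS:ConnectionManager DTS:ConnectionString="Data Source=srv1;Initial Catalog=Staging;"/>
      </DTS:ObjectData>
    </DTS:ConnectionManager>
  </DTS:ConnectionManagers>
</DTS:Executable>"""

print(list_connection_managers(sample))
```

Column-level source-to-target mappings live deeper, inside each Data Flow's pipeline XML (component, inputColumn and externalMetadataColumn elements), and can be walked the same way; solution and package names come from the .sln and .dtproj files on disk.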

Related

Need advice on SSIS solution with variable based connection managers

I am working on an SSIS solution to get 20 different txt files from a specific folder and upload them to different SQL Server database tables. I added a mapping table with table_name, file_name, file_path, full_connection_string. How do I tell the connection manager which connection to use for a certain file?
Which variables/parameters to use and where?
I do not want to have 20 txt connections (one per known filename) and 20 database connections.
All online tutorials are old and don't match Visual Studio 2019 UI.
Any help is highly appreciated!
You can use a parameterized connection string.
Define a variable: _Server (string)
Select the connection
In the Properties window, select [Expressions]
Use your variable for the ServerName property
Since you have a loop, you can load the proper server name into the variable, so on each iteration the connection will point to the specific server.
You will need one database connection manager for each target database as this is scoped at the database level.
You will need a flat file connection manager for each unique file metadata. You can have twenty Sales-date.txt files that use a single flat file connection manager and then expressions will take care of consuming the different files.
However, if you have Sales.txt and Customers.txt, the metadata (that is, the columns inside the file) will be different, and that's fine, but you'll have to create a flat file connection manager for each of those types. This is the contract you're making with the SSIS engine: I promise all of the files this FFCM will touch conform to this standard. You will also need a Data Flow Task for each of these FFCMs, as the engine computes how many rows of data it can operate on at a time based on the type and column constraints in the source.
If it were me, I'd spend a few days looking at Biml. Biml is the Business Intelligence Markup Language, and it allows you to describe your problem in a repeatable way. For example: for each of these file types, I need an SSIS package with a Foreach File enumerator to pick up the current file; inside that, a Data Flow Task to ingest the file; then a File System Task to archive the file out of the working folder.
You've already started down this path by identifying your metadata (table_name, file_name, file_path, full_connection_string); the only thing remaining is to describe the contents of your files. If you look through my SO answers, you'll find plenty that use Biml to create a reproducible solution.

SSIS: How to Load differently named flat files from a folder using SSIS

I'm very new to SSIS packages so please forgive me if this is a simple query.
I have 2 SSIS packages that have been set up;
The first picks up a csv file, formats the data slightly (cuts off a prefix in one of the columns) and places it in another folder with an updated filename and timestamp.
The second package imports the formatted file into a SQL database table.
The issue I have is that the incoming file names for the first package may differ, though the structure of the data remains the same.
Is there a way to configure the flat file connection manager to pick up any files in the C:\Incoming\ folder?
Thanks in advance.
You can use a Foreach Loop container to get the files in a folder, and use expressions on the flat file connection manager to pick up each file.
For a detailed answer, you can refer to this article:
Loop through Flat Files in SQL Server Integration Services

Save SSIS package(.dtsx) in SQL Server Table

All,
As I understand it, the code behind an SSIS package is just an XML file. Is there any way I can load this whole XML into a SQL Server table and, at runtime, create a package on any given server using this XML file?
So basically it is as simple as storing the XML file (or blob data) in a SQL Server table and loading it back onto the file system to create an actual SSIS package out of this XML.
TIA
Yes, this can be done. I haven't tried it, but I don't see any reason why it wouldn't work.
Just note that whichever system you use to extract and run the package will need the SSIS engine installed.
However, I would like to advise that there is a more elegant way of creating SSIS packages dynamically. Here is the link - https://msdn.microsoft.com/en-us/library/ms135946.aspx
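Since the package body is plain XML text, the round trip can be sketched in a few lines. In this self-contained example sqlite3 stands in for SQL Server; against SQL Server you would use pyodbc (or similar) with an NVARCHAR(MAX) or xml column instead, and the table and package names here are illustrative:

```python
# Sketch: store package XML in a table, then write it back out as a
# .dtsx file. sqlite3 is a stand-in so the snippet is runnable as-is.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ssis_packages (name TEXT PRIMARY KEY, package_xml TEXT)")

package_xml = '<DTS:Executable xmlns:DTS="www.microsoft.com/SqlServer/Dts"/>'

# Save the package body into the table.
conn.execute("INSERT INTO ssis_packages VALUES (?, ?)", ("MyPackage", package_xml))

# Later: pull it back and materialize it on disk as a .dtsx file.
row = conn.execute(
    "SELECT package_xml FROM ssis_packages WHERE name = ?", ("MyPackage",)
).fetchone()
with open("MyPackage.dtsx", "w", encoding="utf-8") as f:
    f.write(row[0])
```

Note that SQL Server already offers built-in package storage in msdb and, from SQL Server 2012 onward, the SSISDB catalog, which may remove the need for a custom table.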

Speeding Up ETL DB2 to SQL Server?

I came across this blog post when looking for a quicker way of importing data from a DB2 database to SQL Server 2008.
http://blog.stevienova.com/2009/05/20/etl-method-fastest-way-to-get-data-from-db2-to-microsoft-sql-server/
I'm trying to figure out how to achieve the following:
3) Create a BULK INSERT task, and load up the file that the Execute Process Task created. (Note you have to create a .FMT file for fixed-width import. I created a .NET app to load the FDF file (the transfer description), which auto-creates a .FMT file for me, and a SQL CREATE statement as well – saving time and tedious work.)
I've got the data in a TXT file and a separate FDF with the details of the table structure. How do I combine them to create a suitable .FMT file?
I couldn't figure out how to create the suitable .FMT files.
Instead I ended up creating replica tables from the source DB2 system in SQL Server and ensured that the column order was the same as what was coming out of the IBM File Transfer Utility.
Using an Excel sheet to control which file transfers/tables should be loaded, allowing me to enable/disable them as I please, along with a Foreach Loop in SSIS, I've got a suitable solution to load multiple tables quickly from our DB2 system.
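For reference, the non-XML .FMT layout that step 3 asks for can be generated mechanically once you know the column names and fixed widths. A hypothetical sketch (the make_fmt helper and the column list are illustrative; parsing the widths out of the DB2 .FDF transfer description is site-specific and left out):

```python
# Sketch: generate a non-XML bcp format (.FMT) file for a fixed-width
# text file. Each data line is: host field order, host data type,
# prefix length, data length, terminator, server column order,
# server column name, collation.
def make_fmt(columns, version="10.0"):
    """columns: list of (column_name, width_in_bytes); version should
    match your bcp/SQL Server version (10.0 for SQL Server 2008)."""
    lines = [version, str(len(columns))]
    for i, (name, width) in enumerate(columns, start=1):
        # Fixed-width fields have no terminator except the row
        # terminator on the last column.
        terminator = '"\\r\\n"' if i == len(columns) else '""'
        lines.append(
            f'{i}  SQLCHAR  0  {width}  {terminator}  {i}  {name}  ""'
        )
    return "\n".join(lines) + "\n"

print(make_fmt([("CustomerId", 10), ("CustomerName", 40), ("Balance", 12)]))
```

The resulting file can then be referenced in the load step, e.g. BULK INSERT dbo.Customers FROM 'customers.txt' WITH (FORMATFILE = 'customers.fmt');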

Loading many flatfiles into SQL Server 2005

I have a very annoying task. I have to load >100 CSV files from a folder into a SQL Server database. The files have column names in the first row. The data type can be varchar for all columns. The table names in the database can simply be the filenames of the CSVs. What I currently do is use the Import/Export Wizard from SSMS: I choose flat file from the dropdown box, choose the file, Next -> Next -> Next and Finish! Any ideas on how I can automate such a task with Integration Services or any other practical method?
Note: Files are on my local PC, DB-server is somewhere else, so I cannot use BULK INSERT.
You can use an SSIS Foreach Loop container to enumerate the file names, matching a particular pattern. Use a variable to dynamically hold the current file name. Then, in the Data Flow Task, use a Flat File Source for the source and an OLE DB Destination for the destination.
Please post some sample file names so that I can guide you properly.
Thanks
Achudharam
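The Foreach Loop answer above covers the SSIS route; since the question allows "any other practical method", the wizard clicks can also be scripted. A sketch in Python (the helper names are illustrative; in practice you would execute the statements over pyodbc and prefer parameterized INSERTs over string-built values):

```python
# Sketch: for every CSV in a folder, build a CREATE TABLE (all varchar,
# named after the file) plus INSERT statements from its rows.
import csv
import io
import os

def statements_for_csv(table_name, csv_text):
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, data = rows[0], rows[1:]
    cols = ", ".join(f"[{c}] varchar(255)" for c in header)
    yield f"CREATE TABLE [{table_name}] ({cols});"
    col_list = ", ".join(f"[{c}]" for c in header)
    for row in data:
        # Minimal quoting for the sketch; use parameterized queries in practice.
        values = ", ".join("'" + v.replace("'", "''") + "'" for v in row)
        yield f"INSERT INTO [{table_name}] ({col_list}) VALUES ({values});"

def load_folder(folder):
    for fname in os.listdir(folder):
        if fname.lower().endswith(".csv"):
            table = os.path.splitext(fname)[0]  # table name = file name
            with open(os.path.join(folder, fname), newline="") as f:
                for stmt in statements_for_csv(table, f.read()):
                    print(stmt)  # or cursor.execute(stmt) via pyodbc

print(list(statements_for_csv("demo", "id,name\n1,O'Brien\n")))
```

Because the script runs on the local PC and only sends SQL statements over the connection, it sidesteps the BULK INSERT limitation of the files not being visible to the remote server.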
