SSIS Transfer Task That Handles Schema Changes

I'm using SSIS with SQL Server 2k5 to build a transfer task to copy all of the data from one database to another. This works quite well, except for one problem - the source database will periodically have schema changes (generally just additions like new columns) but the transfer task seems to choke if the two schemas don't match exactly. Is there some way that I can use SSIS to first bring the target DB up to date with the source DB's schema, and then do the transfer?

You can open the package programmatically and re-save it before executing. You can also programmatically build the package using the SSIS object model.
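
If you go the object-model route, a minimal sketch (assuming the Microsoft.SqlServer.Dts.Runtime assembly is referenced, and using a hypothetical package path and variable name) might look like this:

```csharp
using System;
using Microsoft.SqlServer.Dts.Runtime;   // SSIS runtime object model

class TransferRunner
{
    static void Main()
    {
        // Hypothetical package path; adjust for your environment.
        const string packagePath = @"C:\SSIS\TransferAll.dtsx";

        Application app = new Application();

        // Load the package, tweak whatever needs tweaking (connections,
        // variables, and so on), re-save it, then execute it.
        Package pkg = app.LoadPackage(packagePath, null);

        // Example: override a variable; "SourceDatabase" is an assumed name.
        if (pkg.Variables.Contains("SourceDatabase"))
            pkg.Variables["SourceDatabase"].Value = "StagingDB";

        app.SaveToXml(packagePath, pkg, null);    // re-save before running

        DTSExecResult result = pkg.Execute();
        Console.WriteLine("Package result: " + result);
    }
}
```

Note that the data flow's column metadata lives inside the package, so for schema changes the package usually needs to be rebuilt or regenerated through the object model rather than just re-saved.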

Related

Methods to transfer Tables from source database to destination database using SSIS dynamically

I am relatively new to SSIS and have to come up with an SSIS package for work that dynamically moves certain tables from one SQL Server database to another SQL Server database. I have the following constraints that need to be met:
Source table names and destination table names may differ, so a straight copy with the Transfer SQL Server Objects Task does not work.
Only certain columns may be transferred from source table to destination table.
This package needs to run every 5 minutes so it has to be relatively fast.
The transfer must be dynamic such that if there are new source tables, the package need not be reconfigured with hard coded values.
I have the following ideas for now:
Use the Transfer SQL Server Objects Task, but I'm not sure the above requirements can be met, especially the selective transfer of tables and dynamic mapping of columns.
Use SQLBulkCopy in a script component to perform the migration (a sketch of this idea follows below).
I would appreciate it if anyone could give some direction as to how I can go about meeting the requirements, and whether my existing ideas are possible.
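
For the second idea, here is a minimal sketch of a per-table SqlBulkCopy with explicit column mappings; the connection strings, table names, and column map are placeholders, and a real package would read them from a configuration table so new tables need no hard-coded changes:

```csharp
using System;
using System.Data.SqlClient;

class BulkCopyExample
{
    // Copy only selected columns from a source table to a (possibly
    // differently named) destination table.
    static void CopyTable(string sourceConn, string destConn,
                          string sourceTable, string destTable,
                          (string src, string dest)[] columnMap)
    {
        using (var src = new SqlConnection(sourceConn))
        using (var dst = new SqlConnection(destConn))
        {
            src.Open();
            dst.Open();

            // Select only the columns we intend to transfer.
            string columnList = string.Join(", ",
                Array.ConvertAll(columnMap, m => m.src));
            var cmd = new SqlCommand(
                $"SELECT {columnList} FROM {sourceTable}", src);

            using (var reader = cmd.ExecuteReader())
            using (var bulk = new SqlBulkCopy(dst))
            {
                bulk.DestinationTableName = destTable;
                foreach (var (srcCol, destCol) in columnMap)
                    bulk.ColumnMappings.Add(srcCol, destCol);

                bulk.BulkCopyTimeout = 0;      // no timeout for large tables
                bulk.WriteToServer(reader);    // streams rows to the destination
            }
        }
    }
}
```

Driven from a table of (source table, destination table, column list) rows, this keeps the 5-minute schedule realistic and avoids reconfiguring the package when new tables appear.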

SSIS, SQL, SAP HANA DB (ODBC)

Hi, I am using SSIS (MSSQL) to copy data between multiple tables. This had been working fine until recently, when the SAP team started updating the schema of the tables without telling me.
I have multiple tables that they continue to add columns to; this in turn makes my SSIS job of copying the data across fail.
Is there a way in SSIS that I can look at the source table and adjust my destination table to reflect the changes on the fly?
I'm quite new at SSIS and don't mind running a script out of the GUI but wondered if this was an option within the GUI I'm already familiar with.
So, in short: can I allow in SSIS for new columns being added to source tables, and have my destination tables updated automatically, to stop my jobs failing (and, while we're at it, map source to destination tables automatically)?
You'll have to include the new columns in the data flow, i.e. source and destination (include and map them). So basically you CANNOT automate what you're looking for in SSIS. Hope it helps.
Look into BiML Script, which lets you create and execute SSIS packages dynamically based on the metadata available at run time.
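
If you do want to run a script before the data flow, one hedged option is to diff INFORMATION_SCHEMA.COLUMNS between the two tables and add whatever is missing on the destination. The sketch below assumes both tables are SQL Server tables reachable from a single connection (e.g. via a linked server or three-part names) and uses placeholder object names; with an ODBC source such as SAP HANA you would query its catalog views instead:

```csharp
using System.Data.SqlClient;

class SchemaSync
{
    // Build and run ALTER TABLE ... ADD statements for any column that
    // exists on the source table but not on the destination table.
    // All database/table names here are placeholders.
    const string Sql = @"
DECLARE @stmt nvarchar(max);
SET @stmt = N'';

SELECT @stmt = @stmt +
       N'ALTER TABLE dbo.DestTable ADD ' + QUOTENAME(s.COLUMN_NAME) + N' '
     + s.DATA_TYPE
     + CASE WHEN s.CHARACTER_MAXIMUM_LENGTH IS NULL THEN N''
            WHEN s.CHARACTER_MAXIMUM_LENGTH = -1    THEN N'(MAX)'
            ELSE N'(' + CAST(s.CHARACTER_MAXIMUM_LENGTH AS nvarchar(10)) + N')'
       END
     + N' NULL;' + CHAR(10)
FROM   SourceDb.INFORMATION_SCHEMA.COLUMNS AS s
WHERE  s.TABLE_SCHEMA = 'dbo' AND s.TABLE_NAME = 'SourceTable'
  AND NOT EXISTS (SELECT 1
                  FROM INFORMATION_SCHEMA.COLUMNS AS d
                  WHERE d.TABLE_SCHEMA = 'dbo'
                    AND d.TABLE_NAME  = 'DestTable'
                    AND d.COLUMN_NAME = s.COLUMN_NAME);

EXEC sys.sp_executesql @stmt;";

    static void Run(string destConnectionString)
    {
        using (var conn = new SqlConnection(destConnectionString))
        {
            conn.Open();
            new SqlCommand(Sql, conn).ExecuteNonQuery();   // run before the data flow
        }
    }
}
```

Keeping the tables aligned this way does not refresh the data flow's own column metadata, so to actually move the new columns you still need to regenerate the package, which is where BiML (or the object-model approach mentioned above) comes in.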

Create reusable controls/packages in one dtsx package in SSIS?

We use SSIS for our data automation. The caveat is that we don't use the normal way mentioned online: in our environment, we save the Package.dtsx file on a server, and a Windows job executes it using dtexec.exe.
I have multiple SSIS packages to pull data from various sources (Oracle, MySQL, SQL Server), and the general flow for them is the same. The table names are different, but I will use data as the table name for one of the sources/SSIS packages.
backup the table data into bak_data on the destination DB
import new data from the source into data
compare data quality (row count) against data and bak_data
if the data quality meets our threshold, send a success e-mail (execute task against our destination DB using sp_send_dbmail)
if the data quality does not meet our threshold, backup data to bad_data then restore from bak_data to data and send failure e-mail
Since the steps are always the same I thought I could use Control Flow Package Parts and then just use variables for the table names and what not.
But upon further investigation I realized I cannot do that, because the Control Flow Package Part (.dtsxp) is a separate file referenced from the Package.dtsx file?
I can copy it to our automation server but not sure if that will be enough to work when Package.dtsx is executed using dtexec.
Is there any way I can create reusable controls/packages given my constraints/situation?
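
One workaround that fits the dtexec constraint is to keep a single parameterized Package.dtsx and have the Windows job pass the table names in with /SET overrides, so the same package serves every table. A hedged sketch of the caller; the variable names (User::TableName, User::BackupTableName), the file path, and the table list are all assumptions:

```csharp
using System.Diagnostics;

class RunPackagePerTable
{
    static void Main()
    {
        // Placeholder table list; in practice this could come from a config table.
        string[] tables = { "data", "orders", "customers" };

        foreach (string table in tables)
        {
            string args =
                "/F \"D:\\Jobs\\Package.dtsx\" " +
                $"/SET \\Package.Variables[User::TableName].Properties[Value];{table} " +
                $"/SET \\Package.Variables[User::BackupTableName].Properties[Value];bak_{table}";

            var psi = new ProcessStartInfo("dtexec.exe", args) { UseShellExecute = false };
            using (var proc = Process.Start(psi))
            {
                proc.WaitForExit();          // dtexec returns 0 on success
            }
        }
    }
}
```

Inside the package, the backup, import, row-count check, and mail steps then reference those variables (for example through expressions on Execute SQL Tasks), so only one .dtsx has to live on the automation server.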

Mimic DTS Copy SQL Server Objects Task

I am in the process of migrating a web application database from SQL Server 2000 to SQL Server 2008. Currently there is a DTS package that is used to deploy content changes from a staging database to the production database.
The DTS package is using a Copy SQL Server Objects task with the following options selected: Copy Data (Append Data) and Use Collation. The specific tables to copy are selected in the "Select Objects" dialog.
Because this is the only DTS package we have, it doesn't make much sense to learn and implement an SSIS solution, IMO, so I want to recreate the functioning of the DTS package using only T-SQL.
Writing the Insert and Select is not a problem. What I need to know is how the "Append Data" option works.
Is it looking at each row in the source, finding matching rows in the destination, and comparing and updating as necessary, OR is it ignoring existing rows and simply appending new rows?
If it is indeed comparing and updating, is it safe to use the SQL Server Checksum function on the data as a method of comparison against the target or is there a better way? Ideally, I'd like to avoid any schema changes.
Please check this MSDN article: Migrating DTS Packages to Integration Services.
You might be able to migrate the single DTS package to an SSIS package very easily using the tool noted in the article.
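
If you do end up hand-writing the T-SQL, here is a hedged sketch of the two behaviours the question distinguishes, append-only versus compare-and-update, with the comparison done via CHECKSUM so no schema changes are needed. The table, key, and column names (Staging/Prod, dbo.Content, Id, Title, Body) are placeholders, and note that CHECKSUM can produce collisions, so BINARY_CHECKSUM or an explicit column comparison is safer where exactness matters:

```csharp
using System.Data.SqlClient;

class ContentDeploy
{
    // 1) Append-only: insert rows that do not yet exist in the destination.
    const string AppendOnly = @"
INSERT INTO Prod.dbo.Content (Id, Title, Body)
SELECT s.Id, s.Title, s.Body
FROM   Staging.dbo.Content AS s
WHERE  NOT EXISTS (SELECT 1 FROM Prod.dbo.Content AS p WHERE p.Id = s.Id);";

    // 2) Compare-and-update: refresh rows whose content has changed,
    //    using CHECKSUM as a cheap (but collision-prone) comparison.
    const string CompareAndUpdate = @"
UPDATE p
SET    p.Title = s.Title,
       p.Body  = s.Body
FROM   Prod.dbo.Content AS p
JOIN   Staging.dbo.Content AS s ON s.Id = p.Id
WHERE  CHECKSUM(p.Title, p.Body) <> CHECKSUM(s.Title, s.Body);";

    static void Deploy(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            new SqlCommand(CompareAndUpdate, conn).ExecuteNonQuery();
            new SqlCommand(AppendOnly, conn).ExecuteNonQuery();
        }
    }
}
```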

Create & Copy Access tables to SQL via SSIS? SQL 2008

I am trying to come up with a way to pull the tables out of an Access database, automate the creation of those same tables in a SQL 2008 DB, and move the data to the new tables. This process will happen on a regular basis, and there may be different tables each time.
I would like to do this totally in SSIS.
C# SQL CLR objects are an option.
The main issue I have been running into is how to get the Access table's schema and then convert that to a SQL script that I can run via SSIS.
Any ideas?
TIA
J
SSIS cannot adapt to new tables at runtime. (You can change connections, or move a source to a table with a different name but the same schema.) So it's not really easy to do what I think you are asking: upsize an arbitrary set of tables in an Access DB to SQL (mirroring their structure, data, naming, etc.) so that you can then write some straight SQL to transform the data into another SQL database or another part of the same database.
You can access the SSIS object model from C# and build a package (or modify a template package) programmatically and then execute it. This might offer the best bang for your buck, but the SSIS object model is kind of deep. The SSIS Team blog has finally started putting up examples (a year after I had to figure a lot of this out for myself).
There is always the upsizing wizard, and I'm sure there are some third party tools.
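
On the sub-problem of reading the Access tables' schema from code, here is a hedged sketch using the OLE DB schema rowsets; the provider, file path, and the deliberately tiny type map are assumptions, and a real version would map many more OleDbTypes (and honour column sizes and ordinal positions):

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.OleDb;
using System.Text;

class AccessSchemaReader
{
    // Minimal OleDbType -> SQL Server type map; illustrative only.
    static readonly Dictionary<OleDbType, string> TypeMap =
        new Dictionary<OleDbType, string>
    {
        { OleDbType.Integer,  "int" },
        { OleDbType.Double,   "float" },
        { OleDbType.VarWChar, "nvarchar(255)" },
        { OleDbType.WChar,    "nvarchar(255)" },
        { OleDbType.Date,     "datetime" },
        { OleDbType.Boolean,  "bit" }
    };

    static void Main()
    {
        // Hypothetical Access file path and provider.
        const string connStr =
            @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data\Source.accdb";

        using (var conn = new OleDbConnection(connStr))
        {
            conn.Open();

            // List user tables (filters out system and linked objects).
            DataTable tables = conn.GetOleDbSchemaTable(
                OleDbSchemaGuid.Tables, new object[] { null, null, null, "TABLE" });

            foreach (DataRow table in tables.Rows)
            {
                string tableName = (string)table["TABLE_NAME"];

                var ddl = new StringBuilder($"CREATE TABLE dbo.[{tableName}] (\n");
                // Restrictions: catalog, schema, table, column.
                DataTable cols = conn.GetSchema(
                    "Columns", new[] { null, null, tableName, null });

                foreach (DataRow col in cols.Rows)
                {
                    var oleType = (OleDbType)Convert.ToInt32(col["DATA_TYPE"]);
                    string sqlType = TypeMap.TryGetValue(oleType, out var t)
                        ? t : "nvarchar(max)";          // crude fallback
                    ddl.Append($"    [{col["COLUMN_NAME"]}] {sqlType} NULL,\n");
                }
                ddl.Length -= 2;                        // drop trailing ",\n"
                ddl.Append("\n);");

                // Feed this DDL to the SQL 2008 side (e.g. an Execute SQL Task),
                // then bulk-load the data table by table.
                Console.WriteLine(ddl);
            }
        }
    }
}
```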
