I am new to SSIS. I am using SSIS 2012 to transfer data from Excel (COZYROC Excel Source Plus component) to a SQL Server database (OLE DB Destination). My requirement is that whenever the columns in the Excel file do not match the mapped columns, or any columns are missing, I should log an error message in the database.
Please help me resolve this problem.
I don't believe that is possible.
SSIS (and SSRS and other applications) require a 'contract' between source and destination such that any changes to the source will throw a mapping error and force the developer to re-map the data flow.
SSIS is not capable of a scenario such as 'if the source columns change, pump the rest and log the changes to another path'.
This is also an example of why Excel makes a terrible data source for a normalized database/ETL project: users can easily change the Excel doc in ways that break the data mapping.
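That said, if the goal is just to log the mismatch rather than keep loading data, one common workaround is to validate the header row yourself before the data flow runs, and fail the package with a logged error if it doesn't match. Below is a minimal sketch of the idea in plain C# (the kind of thing you would put in a Script Task); the connection strings, sheet name, expected column list, and the dbo.ImportErrors table are all assumptions for illustration:

```csharp
// Sketch of a pre-validation step (e.g. in an SSIS Script Task) that checks the
// Excel header row against the expected columns and logs any mismatch to SQL Server.
// Connection strings, sheet name, and the dbo.ImportErrors table are assumptions.
using System;
using System.Data.OleDb;
using System.Data.SqlClient;
using System.Linq;

class ExcelHeaderCheck
{
    static void Main()
    {
        string[] expected = { "CustomerId", "Name", "Amount" }; // hypothetical mapped columns
        string excelConn = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\input.xlsx;"
                         + "Extended Properties='Excel 12.0;HDR=YES'";

        using (var conn = new OleDbConnection(excelConn))
        {
            conn.Open();
            // GetOleDbSchemaTable returns one row per column in the worksheet
            var schema = conn.GetOleDbSchemaTable(OleDbSchemaGuid.Columns,
                new object[] { null, null, "Sheet1$", null });
            var actual = schema.Rows.Cast<System.Data.DataRow>()
                               .Select(r => r["COLUMN_NAME"].ToString()).ToArray();

            var missing = expected.Except(actual, StringComparer.OrdinalIgnoreCase).ToArray();
            if (missing.Length > 0)
            {
                using (var sql = new SqlConnection("Server=.;Database=Staging;Integrated Security=SSPI;"))
                using (var cmd = new SqlCommand(
                    "INSERT INTO dbo.ImportErrors(ErrorMessage) VALUES (@msg)", sql))
                {
                    cmd.Parameters.AddWithValue("@msg",
                        "Missing Excel columns: " + string.Join(", ", missing));
                    sql.Open();
                    cmd.ExecuteNonQuery();
                }
                // Fail the package here (e.g. Dts.TaskResult = ScriptResults.Failure)
            }
        }
    }
}
```

This doesn't let the data flow adapt to the changed sheet, but it does meet the "log the error in the database" part before the mapping error ever occurs.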
Good luck.
Related
I am relatively new to SSIS and have to come up with an SSIS package for work that dynamically moves certain tables from one SQL Server database to another. I have the following constraints that need to be met:
Source and destination table names may differ, so directly copying tables with the Transfer SQL Server Objects Task does not work.
Only certain columns may be transferred from the source table to the destination table.
This package needs to run every 5 minutes, so it has to be relatively fast.
The transfer must be dynamic: if new source tables are added, the package should not need to be reconfigured with hard-coded values.
I have the following ideas for now:
Use the Transfer SQL Server Objects Task, but I'm not sure the above requirements can be met, especially the selective transfer of tables and dynamic column mapping.
Use SqlBulkCopy in a Script Component to perform the migration (a rough sketch follows below).
I would appreciate it if anyone could give some direction as to how I can meet these requirements, and whether my existing ideas are feasible.
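To illustrate the second idea, here is a rough sketch of the SqlBulkCopy route I have in mind, assuming a config table (not shown here) supplies each source/destination table pair and its column list, and that column names match on both sides:

```csharp
// Rough sketch of the SqlBulkCopy idea: table names and column lists come from
// a (hypothetical) config table, so a new source table only needs a config row.
using System.Data;
using System.Data.SqlClient;

class DynamicCopy
{
    static void Copy(string srcConn, string dstConn,
                     string srcTable, string dstTable, string[] columns)
    {
        // Table/column names are assumed to come from trusted config, not user input.
        string columnList = string.Join(", ", columns);
        using (var src = new SqlConnection(srcConn))
        using (var dst = new SqlConnection(dstConn))
        {
            src.Open();
            dst.Open();
            using (var cmd = new SqlCommand($"SELECT {columnList} FROM {srcTable}", src))
            using (IDataReader reader = cmd.ExecuteReader())
            using (var bulk = new SqlBulkCopy(dst) { DestinationTableName = dstTable })
            {
                foreach (var col in columns)            // only the selected columns are mapped
                    bulk.ColumnMappings.Add(col, col);  // assumes same names on both sides
                bulk.WriteToServer(reader);
            }
        }
    }

    static void Main()
    {
        // Example call; in practice the names would come from the config table.
        Copy("Server=SRC;Database=SourceDb;Integrated Security=SSPI;",
             "Server=DST;Database=DestDb;Integrated Security=SSPI;",
             "dbo.Customers", "dbo.Clients",
             new[] { "CustomerId", "Name", "Amount" });
    }
}
```

The 5-minute job would loop this over the config rows, so adding a new source table means adding a config row rather than changing the package.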
Hi, I am using SSIS (MS SQL Server) to copy data between multiple tables. This had been working fine until recently, when the SAP team started updating the table schemas without telling me.
They keep adding columns to several of these tables, which in turn makes my SSIS data-copy job fail.
Is there a way in SSIS that I can look at the source table and adjust my destination table to reflect the changes on the fly?
I'm quite new to SSIS and don't mind running a script outside the GUI, but I wondered whether this was an option within the GUI I'm already familiar with.
So, in short: can I, in SSIS, allow for new columns being added to source tables and update my destination tables automatically to stop my jobs from failing
(Oh, and map source to destination tables automatically)?
You'll have to include the new columns in the data flow, i.e. source and destination (include and map them). So basically you CANNOT automate what you're looking for in SSIS. Hope it helps.
Look into BimlScript, which lets you create and execute SSIS packages dynamically based on the metadata available at run time.
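If you don't mind running a script outside the GUI (as you mentioned), one option is a pre-step that compares the source and destination schemas and adds any missing columns before the copy runs. A minimal sketch, with placeholder connection strings and table name; note it only handles added columns (not renames or type changes), and the data flow's own mappings would still need regenerating, which is where Biml comes in:

```csharp
// Sketch: add columns that exist in the source table but not in the destination.
// Connection strings and table names are placeholders; the type rendering is simplified.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

class SchemaSync
{
    static Dictionary<string, string> GetColumns(string connStr, string table)
    {
        var cols = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(
            @"SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
              FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = @t", conn))
        {
            cmd.Parameters.AddWithValue("@t", table);
            conn.Open();
            using (var r = cmd.ExecuteReader())
                while (r.Read())
                {
                    string type = r.GetString(1);
                    if (!r.IsDBNull(2))   // e.g. varchar(50) / varchar(max)
                        type += "(" + (r.GetInt32(2) == -1 ? "max" : r.GetInt32(2).ToString()) + ")";
                    cols[r.GetString(0)] = type;
                }
        }
        return cols;
    }

    static void Main()
    {
        string srcConn = "Server=SRC;Database=SapDb;Integrated Security=SSPI;";   // placeholder
        string dstConn = "Server=DST;Database=Staging;Integrated Security=SSPI;"; // placeholder
        var src = GetColumns(srcConn, "MyTable");
        var dst = GetColumns(dstConn, "MyTable");

        using (var conn = new SqlConnection(dstConn))
        {
            conn.Open();
            foreach (var kv in src)
                if (!dst.ContainsKey(kv.Key))
                    using (var alter = new SqlCommand(
                        $"ALTER TABLE MyTable ADD [{kv.Key}] {kv.Value} NULL", conn))
                        alter.ExecuteNonQuery();
        }
    }
}
```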
We use SSIS for our data automation. The caveat is that we don't use the normal approach mentioned online: in our environment, we save the Package.dtsx file on a server, where a Windows job executes it using dtexec.exe.
I have multiple SSIS packages that pull data from various sources (Oracle, MySQL, SQL Server), and the general flow is the same for all of them. The table names differ, but I will use data as the table name for one of the sources/SSIS packages:
back up the table data into bak_data on the destination DB
import new data from the source into data
compare data quality (row counts) between data and bak_data
if the data quality meets our threshold, send a success e-mail (an Execute SQL Task against our destination DB using sp_send_dbmail)
if the data quality does not meet our threshold, back up data to bad_data, restore bak_data to data, and send a failure e-mail
Since the steps are always the same, I thought I could use Control Flow Package Parts and then just use variables for the table names and so on.
But upon further investigation I realized I cannot do that, because the Control Flow Package Part (.dtsxp) is a separate file referenced from the Package.dtsx file?
I can copy it to our automation server, but I'm not sure that will be enough for it to work when Package.dtsx is executed using dtexec.
Is there any way I can create reusable controls/packages given my constraints?
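From what I've read so far, package parts seem to be a design-time feature: their contents are compiled into the consuming Package.dtsx when the project is built, so the deployed .dtsx should be self-contained and the .dtsxp should not be needed on the automation server, though I haven't verified this. As a fallback, I'm considering one package driven by variables instead. A minimal sketch of executing it through the SSIS object model (User::TableName and User::BackupTable are hypothetical variable names; requires a reference to Microsoft.SqlServer.ManagedDTS):

```csharp
// Sketch: execute the built Package.dtsx with table names supplied at run time
// through package variables, instead of hard-coding them in each package.
// The variable names and package path are hypothetical.
using Microsoft.SqlServer.Dts.Runtime;

class RunWithVariables
{
    static void Main()
    {
        var app = new Application();
        Package pkg = app.LoadPackage(@"C:\jobs\Package.dtsx", null);
        pkg.Variables["User::TableName"].Value = "data";
        pkg.Variables["User::BackupTable"].Value = "bak_data";
        DTSExecResult result = pkg.Execute();
        System.Console.WriteLine(result);
    }
}
```

The dtexec equivalent would be the /Set option, e.g. dtexec /F Package.dtsx /Set "\Package.Variables[User::TableName].Properties[Value];data", which would fit the existing Windows job.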
What's the correct way to export data from an Excel 2013 file to a SQL Server database? The data from the Excel file should be transferred to SQL Server whenever the Excel file is saved.
I know many answers to this are available, but my question is a bit different: every time the Excel data changes, or the user clicks the Save button, the data in the database also needs to be updated.
The easiest way to do this is with an SSIS package. SSIS (SQL Server Integration Services) ships with SQL Server and handles moving and transforming data between formats.
You can create a package by right-clicking on your target database in SQL Server Management Studio and selecting Tasks > Import Data. In the wizard that comes up, choose "Microsoft Excel" from the drop-down labelled Data Source, then follow the wizard through. You'll have the choice of importing the Excel data into a new table or mapping it into an existing table.
If you want to do this programmatically, you can save your package at the end of the wizard and then invoke it via code. But that's a different question.
What you want is not possible (as far as I know). You can use an SSIS package to migrate the Excel sheet into SQL Server, but it is impossible for SSIS to determine whether someone clicked Save or made changes to the Excel file. An SSIS package can be programmed to run on a schedule or on demand. You should investigate SSIS packages; they are not easy to learn, but they do what you need.
Maybe you would like to try a tool I have developed? It's an Excel Add-In that exports Excel data to SQL Server.
There is a feature that automatically exports the data to SQL Server every time you press the Save button in Excel. If you need to update the database every time a cell value changes, you will need to add a few lines of VBA code to push the data to SQL Server.
I currently give beta testers a free license, so if you are interested in testing it out, send me an email. :)
www.sqlpreads.com
I'm using SSIS with SQL Server 2k5 to build a transfer task to copy all of the data from one database to another. This works quite well, except for one problem - the source database will periodically have schema changes (generally just additions like new columns) but the transfer task seems to choke if the two schemas don't match exactly. Is there some way that I can use SSIS to first bring the target DB up to date with the source DB's schema, and then do the transfer?
You can open the package programmatically and re-save it before executing. You can also programmatically build the package using the SSIS object model.
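For example, a minimal sketch of the open/re-save route using the Microsoft.SqlServer.Dts.Runtime API; the package path, connection-manager name, and connection string are placeholders:

```csharp
// Sketch: open the package, adjust it via the object model, re-save, then run.
// Path, connection-manager name, and connection string are placeholders.
// Requires a reference to Microsoft.SqlServer.ManagedDTS.
using System;
using Microsoft.SqlServer.Dts.Runtime;

class RefreshAndRun
{
    static void Main()
    {
        var app = new Application();
        Package pkg = app.LoadPackage(@"C:\packages\Transfer.dtsx", null);

        // Example adjustment: repoint a connection manager before the run.
        pkg.Connections["Source"].ConnectionString =
            "Data Source=SRC;Initial Catalog=ProdDb;Integrated Security=SSPI;"; // placeholder

        app.SaveToXml(@"C:\packages\Transfer.dtsx", pkg, null); // re-save before executing
        DTSExecResult result = pkg.Execute();
        Console.WriteLine(result);
    }
}
```

Rebuilding the whole package from the source metadata each run (the object-model route) is more work up front, but it is the approach that survives arbitrary schema additions.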