Dynamically create destination table from source server with SSIS - sql-server

I need a bit of advice on how to solve the following task:
I have a source system based on IBM DB2 (IBMDA400) with a lot of tables whose structure changes rapidly, often daily. I must load specified tables from the DB2 system into an MS SQL Server 2008 R2 instance, so I thought SSIS would be the best choice.
My first attempt was just to add both data sources, drop all tables in MSSQL and recreate them with a "Select * Into #Table From #Table", but I was not able to get this working because I could not combine the two OLE DB connections. I also tried this with an OPENROWSET statement, but the SQL Server does not allow that for security reasons and I am not allowed to change that.
My second try was to read the tables from the source manually, drop and recreate the tables within a Foreach Loop, and then load the data via a Data Flow Task. But I got stuck on getting the metadata out of the Execute SQL Task... so I don't get the column names and types.
I cannot believe that this is so hard to achieve. Why is there no "create table if it does not exist" checkbox on the Data Flow Task?
Of course I searched for the problem here before posting but could not find a solution.
Thanks in advance,
Pad

This is the solution I ended up with:
Create a file/table which is used to select the source tables.
Important: create a linked server on your SQL instance, or a working connection string for OPENROWSET (I was not able to do the latter, so I chose the linked server).
Query the source file/table.
Loop through the result set.
Use variables and a Script Task to build your query.
Drop the destination table.
Build another query string with INSERT INTO ... FROM OPENROWSET (or, if you use a linked server, OPENQUERY).
Execute this statement.
Done.
As I said above, I am not quite happy with this, but for now it should be OK. I will update this if I find another solution.
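For illustration, here is a minimal sketch of the per-table statement such a loop might build. The linked server name DB2LINK and the schema/table names are hypothetical, and SELECT ... INTO is used so that the dropped destination table is recreated from the metadata returned by the source query:

IF OBJECT_ID(N'dbo.MYTABLE', N'U') IS NOT NULL
    DROP TABLE dbo.MYTABLE;

-- OPENQUERY sends the pass-through query to the linked DB2 server;
-- SELECT ... INTO then creates dbo.MYTABLE from whatever columns it returns.
SELECT *
INTO dbo.MYTABLE
FROM OPENQUERY(DB2LINK, 'SELECT * FROM MYLIB.MYTABLE');

In the package, the Script Task (or an expression) would substitute the table name from the loop variable into this string before the Execute SQL Task runs it.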

Related

Dynamic column mapping for both Source and destination in data flow tasks from Oracle to SQL Server

We have around 5000 tables in Oracle and the same 5000 tables exist in SQL Server. Each table's columns vary frequently, but at any point in time the source and destination columns are always the same. Creating 5000 Data Flow Tasks is a big pain, and the mapping would have to be redone every time a table definition changes, such as when a column is added or removed.
I tried SSMA (SQL Server Migration Assistant for Oracle), but it is very slow for transferring huge amounts of data, so I moved to SSIS.
I have followed the approach below in SSIS:
Created a staging table that holds the table name, the source query (Oracle) and the target query (SQL Server); used that table in an Execute SQL Task and stored the result set as a full result set.
Created a Foreach Loop container over that Execute SQL Task result set, with the object variable and three string variables: table name, source query and destination query.
In the Data Flow Task source I chose an OLE DB Source for the Oracle connection and set the data access mode to "SQL command from variable" (passing the source query from the loop mapping variable).
In the Data Flow Task destination I chose an OLE DB Destination for the SQL Server connection and set the data access mode to "SQL command from variable" (passing the target query from the loop mapping variable).
Looping over all 5000 tables this way is not working. Can you please guide me on how to set this up dynamically for 5000 tables from Oracle to SQL Server using SSIS? Any sample code/help would be greatly appreciated. Thanks in advance.
Using SSIS, when thinking about a dynamic source or destination you have to take into consideration that this only works when the metadata is well defined. In your case:
Each table's columns vary frequently, but at any point in time the source and destination columns are always the same.
You have to think about building packages programmatically rather than looping over tables.
Yes, you can use loops in case you can classify the tables into groups based on their metadata (column names, data types, ...). Then you can create a package for each group.
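As a hedged sketch of that classification idea, the following query groups the destination tables on the SQL Server side by a "signature" built from their column names and data types; tables that share a signature could then share one package (the STRING_AGG column requires SQL Server 2017+ and can be dropped on older versions):

-- Build a comma-separated column signature per table, then group identical signatures.
;WITH signatures AS
(
    SELECT t.TABLE_SCHEMA,
           t.TABLE_NAME,
           STUFF((SELECT ',' + c.COLUMN_NAME + ':' + c.DATA_TYPE
                  FROM INFORMATION_SCHEMA.COLUMNS AS c
                  WHERE c.TABLE_SCHEMA = t.TABLE_SCHEMA
                    AND c.TABLE_NAME = t.TABLE_NAME
                  ORDER BY c.ORDINAL_POSITION
                  FOR XML PATH('')), 1, 1, '') AS column_signature
    FROM INFORMATION_SCHEMA.TABLES AS t
    WHERE t.TABLE_TYPE = 'BASE TABLE'
)
SELECT column_signature,
       COUNT(*) AS table_count,
       STRING_AGG(TABLE_SCHEMA + '.' + TABLE_NAME, ', ') AS tables_in_group  -- SQL Server 2017+
FROM signatures
GROUP BY column_signature
ORDER BY table_count DESC;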
If you are familiar with C#, you can dynamically import tables without needing SSIS at all. You can refer to the following project to learn more about reading from Oracle and importing into SQL Server using C#:
Github - SchemaMapper
I will provide some links that you can refer to for more information about creating packages programatically and dynamic columns mapping:
How to manage SSIS script component output columns and its properties programmatically
How to Map Input and Output Columns dynamically in SSIS?
Implementing Foreach Looping Logic in SSIS

How to Copy/Consolidate data from different tables hosted on different MS SQL Servers and save them into one Table on another MS SQL Server

I am a newbie in SQL, so please bear with me; I am hoping you can help/guide me. I have a table on 5 MS SQL Servers with identical columns, and I want to consolidate the data into a separate table on a separate MS SQL Server.
The challenge is that I only have "read only" permission on the source tables (the 5 MS SQL Servers), but I do have permission to create a table in the destination MS SQL Server database.
Another challenge is that I want to truncate or extract parts of the text in one column of the source table and save them into different columns in the destination table.
The next challenge is that the destination table needs to query the source tables once a day for any updates.
Appreciate it very much if you can help/guide me. Many thanks in advance.
You'll need to set up a linked server and use either an SSIS package to pull the data into the form you need, or OPENROWSET/OPENQUERY queries with an insert on the server where you do have write privileges.
Either pre-create a table to put the new data in, or, if a permanent table is not needed, build up a temporary table or insert the data into a table variable.
To concatenate fields into a new field, use something like the examples below:
SELECT (field1 + field2) AS NewField
or
SELECT (SUBSTRING(field1, 2, 2) + SUBSTRING(field2, 3, 1)) AS NewField
Finally, you should set all of this up as a SQL Server Agent job scheduled to your needs.
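Putting those pieces together, here is a hedged sketch of the kind of statement the job could run for one source server. The linked server, database, table and column names are all hypothetical, and the SUBSTRING positions are placeholders for whichever parts of the text column you actually need:

-- Pull from one read-only source via its linked server and split the text column.
INSERT INTO dbo.ConsolidatedTable (SourceServer, PartA, PartB)
SELECT 'Server1',                          -- tag which source the row came from
       SUBSTRING(src.TextColumn, 1, 10),   -- first extracted part
       SUBSTRING(src.TextColumn, 11, 5)    -- second extracted part
FROM [Server1].[SourceDb].[dbo].[SourceTable] AS src;

Repeat the statement (or build it dynamically) for the other four linked servers, and add a WHERE clause on a date or identity column if you only want the rows added since the last run.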
Apologies if this is not as detailed as you like, but it seems there are many questions to be answered and not enough detail to help further.
Alternatively, you could also do a lookup upon lookup (using SSIS):
Data Flow Task > download the first table completely to the destination server
then
Data Flow Task > read from the destination server and do a lookup against the 2nd origin server (if there is a match you might update; if not, insert)
Repeat until all 5 of them are done.
This is NOT the most elegant or efficient solution, but it will definitely get the work done.

Can Access generate CREATE TABLE script code like SQL Server can?

I have an MS Access file containing hundreds of tables, and I need to create these tables at runtime using C#. So I want to generate a script and use that query inside C# to create the tables.
Is there a way that MS Access can generate this SQL script automatically?
Best regards
No, Access itself cannot automatically create DDL (CREATE TABLE ...) code like SQL Server can. It is entirely possible that some third-party product might be able to scan through an Access database and write DDL statements for each table, but recommendations for such a third-party product would be off-topic on Stack Overflow.
Also, as mentioned in the comments to the question, creating an empty database file and then creating each table "from scratch" via DDL is not really necessary for an Access database. Since an Access database is just a file you can distribute your application with a database file that already contains the empty tables (and other database objects as required).
You can use an SSIS package to generate the create table command.
Start a new SSIS package. Add a connection manager for the Access database.
Then add a connection manager for a SQL Server database.
When you configure the Data Flow Task, select the Access database as the source and the SQL Server database as the destination. When choosing the table or view for the destination, hit the [New] button and you will get a table creation script translated from the Access table definition into SQL Server DDL.

Importing data from different SQL Server to another when trigger fires

I need to create a trigger on one server that fires when a certain table is changed, after which I have to do some calculations with the data so I can import it (it's not just copying) into another table on a different server.
Also, I only want to import the new data that hasn't already been imported by an earlier trigger.
How would I go about this?
Create a linked server to your target server, then use a MERGE statement to perform the appropriate action depending on whether the record already exists in the destination table.
I would refrain from creating triggers that refer to remote servers, though; it can have performance and reliability implications. Consider using Service Broker if you want low latency while still keeping the update reliable.
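As a hedged sketch of that upsert idea (all server, database, table and column names are hypothetical): MERGE cannot use a remote table as its target, so when pushing rows to a linked server a common equivalent is an UPDATE plus INSERT pair against the four-part name (or running the MERGE itself on the target server, e.g. via EXEC ... AT):

BEGIN TRANSACTION;

-- Update rows that already exist on the target server.
UPDATE tgt
SET tgt.Amount = src.Amount
FROM [TargetServer].[TargetDb].[dbo].[TargetTable] AS tgt
INNER JOIN dbo.SourceTable AS src ON src.Id = tgt.Id;

-- Insert rows that do not exist there yet.
INSERT INTO [TargetServer].[TargetDb].[dbo].[TargetTable] (Id, Amount)
SELECT src.Id, src.Amount
FROM dbo.SourceTable AS src
WHERE NOT EXISTS (SELECT 1
                  FROM [TargetServer].[TargetDb].[dbo].[TargetTable] AS tgt
                  WHERE tgt.Id = src.Id);

COMMIT TRANSACTION;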
You can create a linked server between two SQL Servers to send data from one to the other.
To create a linked server you need to use the system stored procedure sp_addlinkedserver; you can also do it through SQL Server Management Studio, I believe.
Here is an example you can try:
EXECUTE sp_addlinkedserver @server = N'serverip/hostname', @srvproduct = N'', @provider = N'SQLNCLI', @datasrc = N'serverip/hostname';
You can view if your linked server has been created by querying sys.servers
You can query the linked server database with the following syntax:
SELECT *
FROM [linkedservername].[database].[schema].[table]
More information: http://msdn.microsoft.com/en-us/library/ff772782.aspx
For the trigger updating only data which hasn't been handled before, there are many ways to do this. If your source table has an 'update date' type column, you can filter on that. Alternatively, if your table has an identity column and you want to copy data incrementally, you can store in a table the 'last id' that was copied over on each run; the next run of the trigger then starts from that id + 1, so rows are only transferred once.
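A hedged sketch of that 'last id' approach follows; the linked server, table and column names are hypothetical, and dbo.TransferLog is assumed to hold a single watermark row:

DECLARE @lastId INT;

-- Read the watermark left by the previous run.
SELECT @lastId = ISNULL(MAX(LastCopiedId), 0)
FROM dbo.TransferLog;

-- Push only the rows added since then to the remote table.
INSERT INTO [LinkedServerName].[TargetDb].[dbo].[TargetTable] (Id, SomeColumn)
SELECT s.Id, s.SomeColumn        -- apply your calculations here
FROM dbo.SourceTable AS s
WHERE s.Id > @lastId;

-- Advance the watermark for the next run.
UPDATE dbo.TransferLog
SET LastCopiedId = (SELECT ISNULL(MAX(Id), @lastId) FROM dbo.SourceTable);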

Copy table to a different database on a different SQL Server

I would like to copy a table from one database to another. I know you can easily do the following if the databases are on the same SQL Server.
SELECT * INTO NewTable FROM existingdb.dbo.existingtable;
Is there any easy way to do this if the databases are on two different SQL Servers, without having to loop through every record in the original table and insert it into the new table?
Also, this needs to be done in code, outside of SQL Server Management Studio.
Yes. Add a linked server entry, and use SELECT INTO with the four-part database object naming convention.
Example:
SELECT * INTO targetTable
FROM [sourceserver].[sourcedatabase].[dbo].[sourceTable]
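Since this has to be done in code rather than through Management Studio, here is a hedged sketch of the whole sequence, run on the destination server; the names are hypothetical and the login mapping (sp_addlinkedsrvlogin) depends on your security setup:

-- One-time setup: register the source server as a linked server.
EXEC sp_addlinkedserver
    @server = N'sourceserver',
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'sourcehost';

-- Map logins; this variant uses the caller's own credentials.
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'sourceserver',
    @useself = N'TRUE';

-- Copy: SELECT INTO creates targetTable with the source table's column definitions.
SELECT *
INTO targetTable
FROM [sourceserver].[sourcedatabase].[dbo].[sourceTable];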
If it's only copying tables, then linked servers or generated scripts will work fine, but if the secondary table already contains some data then I'd suggest using a third-party comparison tool.
I'm using Apex Diff, but there are also a lot of other tools out there, such as those from Red Gate or Dev Art.
Third-party tools are not necessary, of course, and you can do everything natively; it's just more convenient. Even if you're on a tight budget you can use these in trial mode to get things done.
Here is a good thread on a similar topic with a lot more examples of how to do this in pure SQL.
SQL Server (2012) provides another way to generate a script for SQL Server databases with their objects and data. This script can be used to copy the tables' schema and data from the source database to the destination one in our case.
Using SQL Server Management Studio, right-click the source database in Object Explorer, then from Tasks choose Generate Scripts.
In the Choose Objects window, choose Select Specific Database Objects to specify the tables that you will generate a script for, then choose the tables by ticking the box beside each one. Click Next.
In the Set Scripting Options window, specify the path where you will save the generated script file, and click Advanced.
In the Advanced Scripting Options window that appears, set Types of Data to Script to Schema and Data. You can also decide here whether you want to script the indexes and keys in your tables. Click OK.
Back in the Set Scripting Options window, click Next.
Review the Summary window and click Next.
You can monitor the progress in the Save or Publish Scripts window. If there is no error, click Finish and you will find the script file in the specified path.
The SQL scripting method is useful for generating one single script for the tables' schema and data, including the indexes and keys. But again, this method doesn't generate the table creation scripts in the correct order if there are relations between the tables.
Microsoft SQL Server Database Publishing Wizard will generate all the necessary insert statements, and optionally schema information as well if you need that:
http://www.microsoft.com/downloads/details.aspx?familyid=56E5B1C5-BF17-42E0-A410-371A838E570A
Generate the scripts?
Generate a script to create the table then generate a script to insert the data.
Check out sp_generate_inserts for generating the data insert script.
Create the database with "Script Database as... > CREATE To".
Within SSMS on the source server, use the export wizard with the destination server database as the destination.
Source instance > YourDatabase > Tasks > Export data
Data Source = SQL Server Native Client
Validate/enter Server & Database
Destination = SQL Server Native Client
Validate/enter Server & Database
Follow through wizard
