Load data, keys and indexes with SQL Server Integration Services (SSIS) - sql-server

I have created an SQL Server Integration Services (SSIS) package that loads data from one server to another (records from a table on one server into a table on the other).
It works properly, but unfortunately the destination table does not have the keys and indexes that the source table has.
How to load data with keys and indexes?

SSIS is used to move data from one place to another. Keys and indexes are part of the structure of the destination table, not part of the data itself, so SSIS cannot "load" them. Potentially the destination structure you move the data into could be very different from the source (and in fact I'd expect this in most cases, for example if you're moving data out of a transactional system into a data warehouse). You also need to consider that a package could be reading from multiple sources, each with different indexes and keys.
If you're looking to replicate structure rather than data, then you need a different tool. This could be as simple as using SSMS to script the table out from the source and re-running that script on the destination, or something more advanced such as using Visual Studio database projects.
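The scripted structure that you re-run on the destination is just ordinary DDL. As a minimal sketch, assuming a hypothetical dbo.Customer table with one primary key and one nonclustered index, the script would contain something like:

-- Recreate the source table's key and index on the destination
-- (table, constraint and index names here are illustrative)
ALTER TABLE dbo.Customer
    ADD CONSTRAINT PK_Customer PRIMARY KEY CLUSTERED (CustomerId);

CREATE NONCLUSTERED INDEX IX_Customer_LastName
    ON dbo.Customer (LastName, FirstName);

In SSMS you get this script from the source table via right-click > Script Table as > CREATE To.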

Related

Load all CSVs from path on local drive into AzureSQL DB w/Auto Create Tables

I frequently need to validate CSVs submitted from clients to make sure that the headers and values in the file meet our specifications. Typically I do this by using the Import/Export Wizard and having the wizard create the table based on the CSV (the file name becomes the table name, and the headers become the column names). Then we run a set of stored procedures that checks the information_schema for said table(s) and matches that up with our specs, etc.
Most of the time, this involves loading multiple files at a time for a client, which becomes very time consuming and laborious very quickly when using the Import/Export Wizard. I tried using an xp_cmdshell SQL script to load everything from a path at once to achieve the same result, but xp_cmdshell is not supported in Azure SQL DB.
https://learn.microsoft.com/en-us/azure/azure-sql/load-from-csv-with-bcp
The above says that one can load using bcp, but it also requires the table to exist before the import... I need the table structure to mimic the CSV. Any ideas here?
Thanks
If you want to load the data into your target SQL DB, you can use Azure Data Factory (ADF) to upload your CSV files to Azure Blob Storage and then use a Copy Data activity to load the data from the CSV files into Azure SQL DB tables, without creating those tables upfront.
ADF's Copy Data activity supports auto-creating the sink tables.
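Once ADF has auto-created the tables, the validation step from the question can stay in T-SQL against INFORMATION_SCHEMA. A minimal sketch, assuming a hypothetical dbo.ClientSpec table that lists the expected headers per file:

-- Report expected headers that are missing from the auto-created table
-- (dbo.ClientSpec and its columns are illustrative)
SELECT s.expected_column
FROM dbo.ClientSpec AS s
WHERE s.file_name = 'clientfile1'
  AND NOT EXISTS (SELECT 1
                  FROM INFORMATION_SCHEMA.COLUMNS AS c
                  WHERE c.TABLE_NAME = 'clientfile1'
                    AND c.COLUMN_NAME = s.expected_column);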

Methods to transfer Tables from source database to destination database using SSIS dynamically

I am relatively new to SSIS and have to come up with an SSIS package for work such that certain tables must be dynamically moved from one SQL Server database to another SQL Server database. I have the following constraints that need to be met:
Source table names and destination table names may differ, so directly copying a table does not work with the Transfer SQL Server Objects task.
Only certain columns may be transferred from source table to destination table.
This package needs to run every 5 minutes so it has to be relatively fast.
The transfer must be dynamic, such that if there are new source tables, the package need not be reconfigured with hard-coded values.
I have the following ideas for now:
Use the Transfer SQL Server Objects task, but I'm not sure if the above requirements can be met, especially selective transfer of tables and dynamic mapping of columns.
Use SqlBulkCopy in a Script Component to perform the migration.
I would appreciate if anyone could give some direction as to how I can go about meeting the requirements and if my existing ideas are possible.
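Neither of those two ideas is sketched here, but a common metadata-driven alternative is to keep the table and column mappings in a control table and generate the copy statements from it. A rough sketch, assuming a linked server named SRC that points at the source server (all object names are illustrative):

-- Hypothetical control table: one row per table to copy
CREATE TABLE dbo.TransferMap
(
    source_table sysname       NOT NULL,  -- e.g. 'dbo.Orders'
    dest_table   sysname       NOT NULL,  -- e.g. 'dbo.OrdersCopy'
    column_list  nvarchar(max) NOT NULL   -- e.g. 'OrderId, CustomerId, Total'
);

-- Build and run one INSERT ... SELECT per mapped table
DECLARE @src sysname, @dst sysname, @cols nvarchar(max), @sql nvarchar(max);
DECLARE map_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT source_table, dest_table, column_list FROM dbo.TransferMap;
OPEN map_cur;
FETCH NEXT FROM map_cur INTO @src, @dst, @cols;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = CONCAT('INSERT INTO ', @dst, ' (', @cols, ') ',
                      'SELECT ', @cols, ' FROM SRC.SourceDb.', @src, ';');
    EXEC sys.sp_executesql @sql;
    FETCH NEXT FROM map_cur INTO @src, @dst, @cols;
END
CLOSE map_cur;
DEALLOCATE map_cur;

New source tables then only need a new row in dbo.TransferMap rather than a package change, though for a 5-minute schedule you would still want each run to copy only new or changed rows.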

Dynamic column mapping for both Source and destination in data flow tasks from Oracle to SQL Server

We have around 5000 tables in Oracle, and the same 5000 tables exist in SQL Server. Each table's columns vary frequently, but at any point in time the source and destination columns are always the same. Creating 5000 Data Flow Tasks is a big pain. Further, the mappings need to be redone every time a table definition changes, such as when a column is added or removed.
We tried SSMA (SQL Server Migration Assistant for Oracle), but it is very slow for transferring huge amounts of data, so we moved to SSIS.
I have followed the below approach in SSIS:
I created a staging table that holds the table name, the source query (Oracle) and the target query (SQL Server), used that table in an Execute SQL Task, and stored the result set as a full result set.
I created a Foreach Loop Container off that Execute SQL Task's result set, with the object variable and three variables: table name, source query and destination query.
In the Data Flow Task source, I chose an OLE DB Source for the Oracle connection and set the data access mode to an SQL command from a variable (passing the source query from the loop's mapped variable).
In the Data Flow Task destination, I chose an OLE DB Destination for the SQL Server connection and chose the data access mode as an SQL command from a variable (passing the target query from the loop's mapped variable).
And I am looping it for all 5000 tables, but it is not working. Can you please guide me on how to create this dynamically for 5000 tables from Oracle to SQL Server using SSIS? Any sample code/help would be greatly appreciated. Thanks in advance.
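For reference, a minimal sketch of the control table and the Execute SQL Task query described above (all object names are illustrative):

-- Control table read by the Execute SQL Task (full result set)
CREATE TABLE dbo.TableLoadConfig
(
    table_name   sysname       NOT NULL,
    source_query nvarchar(max) NOT NULL,  -- Oracle SELECT statement
    target_query nvarchar(max) NOT NULL   -- SQL Server SELECT for the target
);

INSERT INTO dbo.TableLoadConfig (table_name, source_query, target_query)
VALUES ('EMPLOYEES',
        'SELECT EMPLOYEE_ID, FIRST_NAME, LAST_NAME FROM HR.EMPLOYEES',
        'SELECT EMPLOYEE_ID, FIRST_NAME, LAST_NAME FROM dbo.EMPLOYEES');

-- Query used in the Execute SQL Task; the Foreach Loop Container then maps
-- the three columns to the TableName, SourceQuery and TargetQuery variables
SELECT table_name, source_query, target_query
FROM dbo.TableLoadConfig;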
Using SSIS, when thinking about a dynamic source or destination, you have to take into consideration that the only case where you can do that is when the metadata is well defined at run time. In your case:
"Each table's columns vary frequently, but at any point in time the source and destination columns are always the same."
You have to think about building packages programmatically rather than looping over tables.
Yes, you can use loops if you can classify tables into groups based on their metadata (column names, data types, ...). Then you can create a package for each group.
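One way to find such groups is to compare column metadata on the SQL Server side; a rough sketch (STRING_AGG requires SQL Server 2017 or later):

-- Group tables that share an identical column signature; tables in the
-- same group could share a single package/data flow
;WITH sig AS
(
    SELECT TABLE_SCHEMA, TABLE_NAME,
           STRING_AGG(CONCAT(COLUMN_NAME, ':', DATA_TYPE), ',')
               WITHIN GROUP (ORDER BY ORDINAL_POSITION) AS column_signature
    FROM INFORMATION_SCHEMA.COLUMNS
    GROUP BY TABLE_SCHEMA, TABLE_NAME
)
SELECT column_signature,
       COUNT(*) AS table_count,
       STRING_AGG(CAST(TABLE_SCHEMA + '.' + TABLE_NAME AS nvarchar(max)), ', ') AS tables_in_group
FROM sig
GROUP BY column_signature
ORDER BY table_count DESC;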
If you are familiar with C#, you can dynamically import tables without needing SSIS. You can refer to the following project to learn more about reading from Oracle and importing into SQL Server using C#:
Github - SchemaMapper
I will provide some links that you can refer to for more information about creating packages programmatically and dynamic column mapping:
How to manage SSIS script component output columns and its properties programmatically
How to Map Input and Output Columns dynamically in SSIS?
Implementing Foreach Looping Logic in SSIS

Moving SQL server data into Dynamics 365 CRM

I have more than 500 tables in SQL Server that I want to move to Dynamics 365, and I am using SSIS so far. The problem with SSIS is that the destination entity of Dynamics CRM has to be specified along with the mappings, and hence it would be foolish to create separate data flows for entities for hundreds of SQL Server table sources. Is there any better way to accomplish this?
I am new to SSIS, and I don't feel this is the correct approach; I am just simulating the Import/Export Wizard of SQL Server. Please let me know if there are better ways.
It's amazing how often this gets asked!
SSIS cannot have dynamic dataflows because the buffer size (the pipeline) is calculated at design time (as opposed to execution time).
The only way you can re-use a dataflow is if all the source-to-target mappings are the same, e.g. if you have two tables with exactly the same DDL structure.
One option (horrible IMO) is to concatenate all columns into a massive pipe-separated VARCHAR and write this to a custom staging table in your destination with two columns, e.g. (table_name, column_dump), and then "unpack" this in your target system via a post-load SQL statement.
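For what it's worth, a rough sketch of that staging-table idea (all names are made up, and CONCAT_WS needs SQL Server 2017 or later):

-- One wide staging table receives every source table in the same two-column
-- shape, so a single dataflow can be reused for all of them
CREATE TABLE dbo.StagingDump
(
    table_name  sysname      NOT NULL,
    column_dump varchar(max) NOT NULL
);

-- Source query for one table: pack all columns into one pipe-separated string
SELECT 'dbo.Customer' AS table_name,
       CONCAT_WS('|', CustomerId, FirstName, LastName, Email) AS column_dump
FROM dbo.Customer;

The post-load SQL statement on the target side then has to split column_dump back into real columns per table_name before the rows can be pushed into CRM, which is where the real pain of this approach lives.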
I'd bite the bullet, put on your headphones and start churning out the SSIS dataflows one by one - you'd be surprised how quick you can bang them out!
ETL works that way: you have to map the source, the destination and the columns. If you want that to be dynamic, it is possible with an Execute SQL Task inside a Foreach Loop Container.
But when using the KingswaySoft CRM destination connector, this is a little tricky (it may or may not be possible), as it needs very specific column mapping between source and destination.
In that case too, when the source schema comes from OLE DB, it is better to have separate Data Flow Tasks for each table.

Copy data from one table and save it into another table in different database on different SQL Server

I have two different databases in two different SQL Servers. The databases are identical in schema but contain different data in one of the tables.
I want to copy all the data from one table in one database to the same table in the other database so that I can get rid of the database from which I am copying the data.
The data is too large, so I cannot create data scripts and run them on the other database.
How can I achieve this?
There are many ways, like an SSIS transfer or SELECT * INTO, but I prefer the approach below if you are just transferring data.
Create a linked server on the source server for the destination server; then you can refer to the destination server with a four-part name.
Assuming the source server is A and the linked server for the destination is B, moving the data is as simple as:
INSERT INTO B.DatabaseName.SchemaName.TableName
SELECT * FROM dbo.TableName;  -- runs on the source server and database
If the data is huge and you are worried about timeouts, you can write a simple script which copies it in batches, like:
WHILE (1 = 1)
BEGIN
    INSERT INTO B.DatabaseName.SchemaName.TableName
    SELECT TOP (10000) s.*
    FROM dbo.TableName AS s   -- source server and database
    -- skip rows already copied (replace Id with the table's key column)
    WHERE NOT EXISTS (SELECT 1
                      FROM B.DatabaseName.SchemaName.TableName AS d
                      WHERE d.Id = s.Id);

    IF (@@ROWCOUNT = 0)
        BREAK;
END
For creating the linked server, you can follow the SQL Server documentation on linked servers.
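A minimal sketch of creating that linked server with sp_addlinkedserver (server and login names are placeholders):

-- Register the destination SQL Server instance as linked server B
EXEC master.dbo.sp_addlinkedserver
     @server = N'B',
     @srvproduct = N'',
     @provider = N'MSOLEDBSQL',            -- SQL Server OLE DB provider
     @datasrc = N'DestinationServerName';  -- network name of the destination

-- Map logins to a login that exists on the destination server
EXEC master.dbo.sp_addlinkedsrvlogin
     @rmtsrvname = N'B',
     @useself = N'False',
     @rmtuser = N'remote_login',
     @rmtpassword = N'remote_password';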
You have the following options available to you. Not all of these will work, depending on your exact requirements and the networking arrangements between the servers.
1. SQL Server Management Studio - Import & Export Wizard: this is accessed from the right-click menu for a database > Tasks > Import Data (or Export Data).
2. SQL query using a Linked Server: a Linked Server configured between the two servers allows you to reference databases on one from the other, in much the same way as if they were on the same server. Any valid SQL query approach for transferring data between two tables within one database will then work, provided you fully qualify the table names as Server.Database.Schema.Table.
3. SSIS: create an SSIS package with both servers as connections, and a simple workflow to move the data from one to the other. There is plenty of information available online on how to use SSIS.
4. Export to a flat-file format, then import: this could be done using the Import/Export Wizard above or SSIS, but instead of piping the data directly between the two servers, you would output the data from the source table into a suitable flat-file format on the filesystem. CSV is the most commonly used format for this. This file can then be moved to the destination server using any file-transfer approach (compressed, e.g. to a Zip file, if desired) and imported into the destination table.
5. Database backup and restore: similar to (4), but instead of using a flat file, you create a backup of the source database via Tasks > Back Up... You then move that backup as a file (just like the CSV approach) and restore it onto the destination server. Now you have two databases on the destination server, and can move data from one to the other locally.
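For option 5, the Tasks > Back Up... and restore steps can also be expressed as plain T-SQL; a minimal sketch with placeholder database names and file paths:

-- On the source server
BACKUP DATABASE SourceDb
TO DISK = N'C:\Backups\SourceDb.bak'
WITH INIT, COMPRESSION;

-- Copy the .bak file to the destination server, then restore it there;
-- logical file names are placeholders (check them with RESTORE FILELISTONLY)
RESTORE DATABASE SourceDb_Copy
FROM DISK = N'D:\Backups\SourceDb.bak'
WITH MOVE 'SourceDb'     TO N'D:\Data\SourceDb_Copy.mdf',
     MOVE 'SourceDb_log' TO N'D:\Data\SourceDb_Copy_log.ldf';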
I hope this query helps you!
INSERT INTO [dbo].[tablename] (Column1, Column2, Column3)
SELECT Column1, Column2, Column3
FROM [Database1].[dbo].[tablename];
Thanks!!!
