How to copy data from one database to another on a different server? - database

I have 2 DBs with the same schema on different servers.
I need to copy data from table T to the same table T in a test database on a different server and network.
What is the easiest way to do it?
I heard that data can be dumped to a flat file and then inserted into the database. How does that work?
Can this be achieved using SQL*Plus and an Oracle database?
Thank you!

Use Oracle export to export a whole table to a file, copy the file to serverB, and import it.
http://www.orafaq.com/wiki/Import_Export_FAQ
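For example, a minimal exp/imp sketch (the scott/tiger credentials, the TNS aliases, and table T are placeholders):
exp scott/tiger@dbA tables=T file=t.dmp log=t_exp.log
# copy t.dmp to serverB, then import into the second database:
imp scott/tiger@dbB tables=T file=t.dmp ignore=y log=t_imp.log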
You can also use rsync to copy Oracle .dbf datafiles to another server, but syncing individual datafiles is fragile; syncing all of the database's files (with the instance shut down) works more reliably.
For groups of records, write a query to build a pipe-delimited file (or whatever delimiter suits your data) with the rows you need to move. Copy that file to serverB. Write a control file for sqlldr and use sqlldr to load the rows into the table; sqlldr is part of the Oracle installation. A sketch follows the link below.
http://www.thegeekstuff.com/2012/06/oracle-sqlldr/
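A minimal sketch, assuming table T has columns id, name, and city (all names and paths here are placeholders). First spool the rows from SQL*Plus on serverA:
SET HEADING OFF FEEDBACK OFF PAGESIZE 0 LINESIZE 1000 TRIMSPOOL ON
SPOOL /tmp/t_rows.dat
SELECT id || '|' || name || '|' || city FROM t WHERE somecolumn = 'somevalue';
SPOOL OFF
Copy /tmp/t_rows.dat to serverB and write a control file t.ctl:
LOAD DATA
INFILE '/tmp/t_rows.dat'
APPEND
INTO TABLE t
FIELDS TERMINATED BY '|'
(id, name, city)
Then load it into the test database:
sqlldr userid=scott/tiger@testdb control=t.ctl log=t.log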
If you have DB listeners up on each server and tnsnames knows about both, you can insert directly over a database link:
insert into mytable@remote
select * from mytable
where somecolumn = somevalue;
Look at the remote table section:
http://docs.oracle.com/cd/B19306_01/server.102/b14200/statements_9014.htm
If this is going to be an ongoing thing, create a DB link from the instance on serverA to the instance on serverB.
You can then do anything you have permissions for with data on one instance, the other, or both.
http://psoug.org/definition/CREATE_DATABASE_LINK.htm
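A minimal sketch of such a link (the link name, credentials, and TNS alias are placeholders):
CREATE DATABASE LINK serverb_link
  CONNECT TO myuser IDENTIFIED BY mypassword
  USING 'ORCL_SERVERB';  -- tnsnames.ora alias for the instance on serverB
INSERT INTO t@serverb_link SELECT * FROM t;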

Related

Load all CSVs from path on local drive into AzureSQL DB w/Auto Create Tables

I frequently need to validate CSVs submitted by clients to make sure that the headers and values in the file meet our specifications. Typically I do this by using the Import/Export Wizard and having the wizard create the table based on the CSV (the file name becomes the table name, and the headers become the column names). Then we run a set of stored procedures that check INFORMATION_SCHEMA for said table(s) and match that up with our specs, etc.
Most of the time this involves loading multiple files at a time for a client, which becomes very time-consuming and laborious when using the Import/Export Wizard. I tried using an xp_cmdshell SQL script to load everything from a path at once with the same result, but xp_cmdshell is not supported by Azure SQL DB.
https://learn.microsoft.com/en-us/azure/azure-sql/load-from-csv-with-bcp
The above says that one can load using bcp, but it also requires the table to exist before the import... I need the table structure to mimic the CSV. Any ideas here?
Thanks
If you want to load the data into your target SQL DB, you can use Azure Data Factory (ADF) to upload your CSV files to Azure Blob Storage, and then use a Copy Data activity to load the data from the CSV files into Azure SQL DB tables, without creating those tables upfront.
ADF supports auto-creating sink tables via the Copy activity sink's 'Auto create table' option.
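As a rough sketch, the relevant part of the Copy activity's pipeline JSON looks like this (the activity and dataset names are placeholders):
{
  "name": "CopyCsvToSql",
  "type": "Copy",
  "inputs":  [ { "referenceName": "BlobCsvDataset", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "AzureSqlDataset", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "DelimitedTextSource" },
    "sink":   { "type": "AzureSqlSink", "tableOption": "autoCreate" }
  }
}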

Copy data from one table and save it into another table in different database on different SQL Server

I have two different databases in two different SQL Servers. The databases are identical in schema but contain different data in one of the tables.
I want to copy all the data from one table in one database to the same table in the other database so that I can get rid of the database from which I am copying the data.
The data is too large, so I cannot generate data scripts and run them against the other database.
How can I achieve this?
There are many ways (SSIS transfer, SELECT ... INTO, etc.), but I prefer the approach below if you are just transferring data.
Create a linked server on the source server for the destination server; then you can refer to the destination server with a four-part name.
Assuming the source server is A and the linked server pointing at destination server B is named B, moving the data is as simple as:
insert into B.databasename.Schema.Table
select * from dbo.MyTable  -- this table is in the source server and db
If the data is huge and you are worried about timeouts, you can write a simple script that moves it in batches. Note you need a unique key (Id in this sketch) so each pass picks up only rows that have not been copied yet:
while (1 = 1)
begin
    insert into B.databasename.Schema.Table
    select top (10000) s.*
    from dbo.MyTable as s            -- this is in the source server and db
    where not exists (select 1
                      from B.databasename.Schema.Table as d
                      where d.Id = s.Id);

    if (@@rowcount = 0)
        break;
end
To create the linked server, you can follow this; a sketch is below.
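A minimal sketch (the provider, destination host, and remote login are placeholders):
EXEC sp_addlinkedserver
     @server     = N'B',
     @srvproduct = N'',
     @provider   = N'MSOLEDBSQL',
     @datasrc    = N'destination-host';
EXEC sp_addlinkedsrvlogin
     @rmtsrvname  = N'B',
     @useself     = N'FALSE',
     @rmtuser     = N'remote_login',
     @rmtpassword = N'remote_password';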
You have the following options available to you. Not all of these will work, depending on your exact requirements and the networking arrangements between the servers.
1. SQL Server Management Studio - Import & Export Wizard: this is accessed from the right-click menu for a database > Tasks > Import Data (or Export Data).
2. SQL query using a Linked Server: a Linked Server configured between the two servers allows you to reference databases on one from the other, much as if they were on the same server. Any valid SQL query approach for transferring data between two tables within one database will then work, provided you fully qualify the table names as Server.Database.Schema.Table.
3. SSIS: create an SSIS package with both servers as connections, and a simple workflow to move the data from one to the other. There is plenty of information available online on how to use SSIS.
4. Export to flat-file format, then import: this could be done using the Import/Export Wizard above or SSIS, but instead of piping the data directly between the two servers, you output the data from the source table into a suitable flat-file format on the filesystem. CSV is the most commonly used format for this. The file can then be moved to the destination server using any file transfer approach (compressed, e.g. to a Zip file, if desired) and imported into the destination table.
5. Database backup and restore: similar to (4), but instead of using a flat file, you create a backup of the source database via Tasks > Back Up... You then move that backup as a file (just like the CSV approach) and restore it onto the destination server. Now you have two databases on the destination server and can move data from one to the other locally; a sketch follows this list.
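A minimal backup/restore sketch for option 5 (the database name, paths, and logical file names are placeholders):
BACKUP DATABASE SourceDb
    TO DISK = N'C:\backups\SourceDb.bak'
    WITH INIT;
-- copy SourceDb.bak to the destination server, then:
RESTORE DATABASE SourceDb_Copy
    FROM DISK = N'C:\backups\SourceDb.bak'
    WITH MOVE N'SourceDb'     TO N'D:\data\SourceDb_Copy.mdf',
         MOVE N'SourceDb_log' TO N'D:\data\SourceDb_Copy_log.ldf';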
I hope this query helps you!
INSERT INTO [dbo].[tablename] (Column1, Column2, Column3)
SELECT Column1, Column2, Column3
FROM [Database1].[dbo].[tablename];
(For a different server, prefix the source with a linked server name to make a four-part name.)
Thanks!

exporting query from SQL SERVER 2000 to csv file

This is the first time I want to create a .csv file myself. I wish to create it using a query I run against a database in SQL Server. Can anyone tell me how to do so? One more thing: if I change any data in the database, will the data in the .csv file auto-update?
You can use the bcp utility to export data from the DB to a CSV file. More details on this can be found here.
However, the file won't be auto-updated if data changes. You will need to regenerate the file.
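A minimal sketch of the export (the server, database, table, and output path are placeholders; -c writes character data, -t, sets a comma delimiter, -S names the server, and -T uses Windows authentication):
bcp "SELECT Col1, Col2, Col3 FROM MyDb.dbo.MyTable" queryout C:\export\mytable.csv -c -t, -S myserver -T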

Speeding Up ETL DB2 to SQL Server?

I came across this blog post when looking for a quicker way of importing data from a DB2 database to SQL Server 2008.
http://blog.stevienova.com/2009/05/20/etl-method-fastest-way-to-get-data-from-db2-to-microsoft-sql-server/
I'm trying to figure out how to achieve the following:
3) Create a BULK Insert task, and load up the file that the Execute Process task created. (Note you have to create a .FMT file for fixed-width import. I created a .NET app to load the FDF file (the transfer description), which auto-creates a .FMT file for me, and a SQL CREATE statement as well, saving time and tedious work.)
I've got the data in a TXT file and a separate FDF with the details of the table structure. How do I combine them to create a suitable .FMT file?
I couldn't figure out how to create the suitable .FMT files.
Instead, I ended up creating replica tables from the source DB2 system in SQL Server and ensured that the column order was the same as what was coming out of the IBM File Transfer Utility.
Using an Excel sheet to control which file transfers/tables should be loaded, allowing me to enable/disable them as I please, along with a Foreach Loop container in SSIS, I've got a suitable solution to load multiple tables quickly from our DB2 system.
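For anyone who still needs the .FMT route: a minimal non-XML format file for a fixed-width file might look like this sketch (the 10.0 version targets SQL Server 2008; the column names and widths are made up):
10.0
3
1  SQLCHAR  0  10  ""      1  CustomerId    ""
2  SQLCHAR  0  30  ""      2  CustomerName  SQL_Latin1_General_CP1_CI_AS
3  SQLCHAR  0  8   "\r\n"  3  SignupDate    ""
It is then referenced from the load, e.g. BULK INSERT dbo.Customers FROM 'C:\data\customers.txt' WITH (FORMATFILE = 'C:\data\customers.fmt');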

Load data from one oracle database table to another oracle database table

I want to load data from one database table to another database table.
For example, table 'tbl' exists in both databases db1 and db2, and I want to copy all data from 'tbl' of 'db1' to 'tbl' of 'db2' in Oracle.
Any help would be appreciated.
I would make use of either exp/imp or expdp/impdp (10g+) for this.
The older exp/imp commands are slower, but have the advantage that the export file is created and read on the client system. The expdp/impdp commands are much faster, but the file is created on and read from the server where the databases live. So, if your databases are on different servers, you'll need to copy the export files around. Also, Data Pump requires an Oracle directory object to be set up by the DBA.
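A Data Pump sketch (the credentials, directory object, and file names are placeholders; the older exp/imp take similar tables=/file= arguments):
expdp scott/tiger@db1 tables=tbl directory=DATA_PUMP_DIR dumpfile=tbl.dmp logfile=tbl_exp.log
# copy tbl.dmp between the servers' directory paths, then:
impdp scott/tiger@db2 tables=tbl directory=DATA_PUMP_DIR dumpfile=tbl.dmp table_exists_action=append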
In order to do this, you will need to create a database link between the two schemas. Here is a link to a tutorial that may help.
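A sketch, assuming the link is created in db2 (the credentials and the TNS alias for db1 are placeholders):
CREATE DATABASE LINK db1_link CONNECT TO scott IDENTIFIED BY tiger USING 'DB1_TNS';
INSERT INTO tbl SELECT * FROM tbl@db1_link;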
