Updating the local database from a remote database

I have a local database used for development and testing purposes and a remote database in the live environment. Both are Oracle. The local database has the same table structure, but its data is old. I need to update the local database with the data from the remote database. (I tried exp/imp, but it fails with an error like "object already exists".) Can anyone help me figure out what the problem may be, or what the best method is to do this?

You'll need to drop your local tables first. If I recall correctly, exp/imp does not remove existing tables or data before importing.
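As a minimal PL/SQL sketch, assuming you are connected as the local schema owner and really do want to drop every table in that schema:

BEGIN
  FOR t IN (SELECT table_name FROM user_tables) LOOP
    -- CASCADE CONSTRAINTS also drops foreign keys referencing the table
    EXECUTE IMMEDIATE 'DROP TABLE "' || t.table_name || '" CASCADE CONSTRAINTS';
  END LOOP;
END;
/

Alternatively, imp accepts IGNORE=Y, which suppresses the "object already exists" errors; note that the imported rows are then appended to whatever old data is still in the tables, so you would want to truncate them first.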

Related

Use Remote Database If Object Is Not Found

I am in search of a better solution. I'd like to set up a database where some local tables/sprocs/views are used as the default, with a remote database as a fallback.
The setup I have right now is to copy the schema and/or data of the tables I don't want to change from the remote database, then create a view for each view and table I only need read-only access to. The sprocs are also just copies from the remote database. This cuts down on the time spent copying data as well.
What I am wondering is whether there is a lower-level way to tell SQL Server to fall back on the remote database if an object is not found locally, and only fail if the object is still not found there.
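To make the current setup concrete, here is a minimal sketch of one of the read-only views, assuming a linked server named RemoteSrv and placeholder database/table names:

CREATE VIEW dbo.Customers
AS
SELECT *
FROM RemoteSrv.RemoteDb.dbo.Customers;  -- four-part name through the linked server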

Oracle database syncing

I have a local and a remote Oracle database table. The remote table is updated whenever a new user is registered. Currently I use a Java scheduler to query the remote database every 30 minutes and copy the newly added rows into my local table. It would be really good if both tables were kept in sync, that is, if a new entry added to the remote table were reflected in my local table as well. Can anyone suggest an efficient way to achieve this?
Try:
creating a database link between the local and the remote database, then
creating a materialized view over that link to replicate the remote table into the local database.
See https://oracle-base.com/articles/misc/materialized-views; a sketch follows below.
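A minimal sketch with placeholder names, refreshing every 30 minutes to match the current scheduler (REFRESH FAST also requires a materialized view log on the remote table; fall back to REFRESH COMPLETE if you cannot create one):

CREATE DATABASE LINK remote_db
CONNECT TO app_user IDENTIFIED BY app_password
USING 'remote_tns_alias';

CREATE MATERIALIZED VIEW users_mv
REFRESH FAST
START WITH SYSDATE
NEXT SYSDATE + 30/1440  -- 30 minutes, expressed as a fraction of a day
AS
SELECT * FROM users@remote_db;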

Copy records from a table on one SQL instance to an identical table on a different SQL instance

We had an intern who was given written instructions for deleting old data from a database based on dates (from within our ERP system). They were fascinated by the results and just kept deleting instead of stopping at the required date. There are now 4 years of missing records in the production database. I have these records in my development database, which is in a different instance on a different server. Is there a way to transfer just those 4 years' worth of data from my development database to my production database, checking, of course, that there are no duplicates (there is a unique index on transaction number)?
I haven't tried anything yet because I'm not sure where to start. I do have a test database on the same instance as the production database that I could use to test the transfer with.
There are several ways to do this. Assuming that this is on a different machine, you will want to create a Linked Server on your dev machine pointing at the target server (or, technically, a link from the production server to your dev machine could be used as well). Then perform an insert of the selected records from the source into the target.
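For example, a sketch of the linked-server setup (server, host, and login names are placeholders):

EXEC sp_addlinkedserver
    @server = N'PRODSRV',
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'prod-host.example.com';

EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'PRODSRV',
    @useself = N'FALSE',
    @locallogin = NULL,        -- applies to all local logins
    @rmtuser = N'transfer_user',
    @rmtpassword = N'secret';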
More efficiently, you can use the Export Data functionality. Right-click on the database (not the server/instance, but the database) and select Tasks / Export Data from the popup menu. This pops up the SQL Server Import and Export Wizard. Use a query that selects just the missing date range to pick the data for export.
If security considerations interfere with this, create a duplicate of the table(s) with alternate names (e.g. MyInvRecords) in a new database, and export the data into those tables. Back up that DB, transfer it to someplace accessible from the target server, restore that DB, then transfer the rows back into the original DB.
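The backup/restore leg of that approach, as a sketch (database name and file paths are placeholders):

BACKUP DATABASE TransferDb
TO DISK = N'C:\Backups\TransferDb.bak';

-- copy the .bak file somewhere the target server can see, then on the target:
-- (the logical file names below are assumptions; check with RESTORE FILELISTONLY)
RESTORE DATABASE TransferDb
FROM DISK = N'C:\Backups\TransferDb.bak'
WITH MOVE N'TransferDb' TO N'C:\Data\TransferDb.mdf',
     MOVE N'TransferDb_log' TO N'C:\Data\TransferDb_log.ldf';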
I haven't had to use anything but these methods before, so one of them should work for you.
A basic insert will work just fine, and the unique index on transaction number makes the duplicate check easy:

INSERT INTO ProdDB.schema.YourTable ([Columns])
SELECT [Columns]
FROM TestDB.schema.YourTable AS src
WHERE src.YourDateColumn BETWEEN @RangeStart AND @RangeEnd  -- your date-range predicates here
  AND NOT EXISTS (SELECT 1
                  FROM ProdDB.schema.YourTable AS tgt
                  WHERE tgt.TransactionNumber = src.TransactionNumber);

Keep SQL Server table updated from Access table

I am attempting to keep a table in SQL Server updated from an Access table. Any time a change is made in the Access table, I would like that change reflected in the SQL Server table. The two tables can be identical. I have created an ODBC connection from Access to SQL Server and can export the table to SQL Server; I just don't know what must be done to keep that table updated. Any suggestions are appreciated.
Should this be implemented from within Access or within SQL Server?
Can you just add the SQL Server table to the Access database as a linked table (there are useful articles on how to add linked tables)? That way, users of the Access database (let's hope there aren't many!) are in effect editing the SQL Server table directly.
If this isn't desirable, then how about creating another table in the SQL Server database and adding it to the Access database as a linked table? Then add a trigger so that when an insert/update/delete is made to this table, the same operation is done on your main table.
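A rough sketch of such a trigger, with placeholder table and column names:

CREATE TRIGGER trg_StagingSync
ON dbo.StagingTable
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- remove the old versions of updated rows and any deleted rows
    DELETE m
    FROM dbo.MainTable AS m
    INNER JOIN deleted AS d ON d.ID = m.ID;

    -- add the new versions of inserted/updated rows
    INSERT INTO dbo.MainTable (ID, Col1, Col2)
    SELECT ID, Col1, Col2
    FROM inserted;
END;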
I think setting up a Linked Server in SQL Server could be easier to implement than an automatic export of data from Access.
According to the MSDN page,
Many types of OLE DB data sources can be configured as linked servers, including Microsoft Access and Excel.
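For example, something along these lines (the file path is a placeholder, and the provider is an assumption: older .mdb files use the Jet provider rather than ACE):

EXEC sp_addlinkedserver
    @server = N'ACCESS_SRV',
    @provider = N'Microsoft.ACE.OLEDB.12.0',
    @srvproduct = N'Access',
    @datasrc = N'C:\Data\MyDatabase.accdb';

-- Access tables are then reachable via a four-part name with catalog/schema omitted:
SELECT * FROM ACCESS_SRV...MyAccessTable;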
Access has no "event" that fires when a row is updated/inserted/deleted, as far as I know, although as JeffO points out, data macros could do what you want.
You could also sync them periodically. There are several ways to schedule the sync task (SQL Server Agent, a Windows service, Windows Task Scheduler, a timer in an application, etc.), but you still have to deal with all the problems that come with synchronization when both tables can be modified, the worst being data conflict resolution. There is no easy solution for that.
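If a one-way sync with Access as the source of truth is enough, a scheduled MERGE sidesteps the conflict problem. A sketch, assuming a linked server to the Access file (like the one above) and placeholder names:

MERGE dbo.MainTable AS tgt
USING (SELECT ID, Col1, Col2 FROM ACCESS_SRV...MyAccessTable) AS src
    ON tgt.ID = src.ID
WHEN MATCHED AND (tgt.Col1 <> src.Col1 OR tgt.Col2 <> src.Col2) THEN
    UPDATE SET tgt.Col1 = src.Col1, tgt.Col2 = src.Col2
WHEN NOT MATCHED BY TARGET THEN
    INSERT (ID, Col1, Col2) VALUES (src.ID, src.Col1, src.Col2)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;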
Perhaps if you explained the problem you have that you are solving with synching data in SQL server and Access someone might be able to point you in the direction of a solution that doesn't have these problems.

SQL Server database remote transfer - best method

I have two databases, one on a remote server the other local. (SQL Server 2008)
The database on my local server has the entire structure set up but no data. I would like to copy the data from the remote server to my server, and I am wondering about the best method of doing this.
The main issue I am experiencing is that the user I have on the remote database has limited permissions. I cannot read the stored procedures or user-defined functions, so when I use the Import/Export wizard I do not get the schema, etc. A regular dump/restore is not working for me either, as it restores the tables without the primary keys/foreign keys and the stored procedures.
I'd like to do something like this:
INSERT INTO localtable SELECT * FROM remotedb.table
I was having issues because of the IDENTITY fields, and I had to explicitly name all of the columns. Also, I am not sure whether SQL Server Management Studio lets you work across two different databases, remote and local, in one query, so I was looking for any advice.
I have also tried applications like SQL FTP and Backup, but they fail because they run out of memory (I have 16GB of memory on the machine and the DB is only about 4GB). I can also use the SQL Server Import/Export wizard, but then I don't get the schema information. I also tried SQL Compare from Red Gate, and it runs into issues with the permissions. Unfortunately I do not have the time to request and gain access to a new user, so I was hoping someone had a creative idea.
You can definitely use SQL Server backups for this. It will not run out of memory; if it does, please tell us the message (because you are likely misinterpreting it). This is the fastest possible and most complete solution.
You can tell the export wizard to also script the schema. It is hidden under "Advanced" somewhere (terrible UI), but the script will be extremely big and I know of no way to execute it.
You can drop all schema objects except the PKs in the target database. Then you can use remote queries to copy all the data over. You will not get any problems with foreign keys and identity columns if you drop them beforehand. After you are done you can recreate all those objects. It is probably best to wrap all of this in a transaction, because that way you get consistent source data from a single point in time.
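A sketch of the per-table copy over a remote query, assuming a linked server named REMOTESRV and placeholder names; SET IDENTITY_INSERT lets you keep the original identity values:

SET IDENTITY_INSERT dbo.MyTable ON;

-- an explicit column list is required while IDENTITY_INSERT is on
INSERT INTO dbo.MyTable (Id, Col1, Col2)
SELECT Id, Col1, Col2
FROM REMOTESRV.RemoteDb.dbo.MyTable;

SET IDENTITY_INSERT dbo.MyTable OFF;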
