I have one SQL database behind Microsoft CRM and one local SQL database at a remote site. They have different schemas.
I must keep both synchronized (2-way sync).
They can't communicate directly, so in the middle there is a REST service and a temporary database.
So, every time there is an insert or an update on one of the databases, it is propagated to the temporary database and from the temporary database to the destination DB.
I don't know how to manage conflicts. For example, both databases have a PERSON table. In one database I update the name, in the other I update the birthday. How can I sync both databases without losing either update?
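For illustration, the kind of column-level merge I have in mind (rather than last-writer-wins on the whole row) might look like this in T-SQL. PERSON_STAGING and the per-column *ModifiedAt timestamps are hypothetical names, assuming the temporary database records when each column last changed:

    -- Apply a column from the staging/temporary copy only when its change
    -- is newer than what the destination already has.
    UPDATE dest
    SET dest.Name           = CASE WHEN stg.NameModifiedAt > dest.NameModifiedAt
                                   THEN stg.Name ELSE dest.Name END,
        dest.NameModifiedAt = CASE WHEN stg.NameModifiedAt > dest.NameModifiedAt
                                   THEN stg.NameModifiedAt ELSE dest.NameModifiedAt END,
        dest.Birthday           = CASE WHEN stg.BirthdayModifiedAt > dest.BirthdayModifiedAt
                                       THEN stg.Birthday ELSE dest.Birthday END,
        dest.BirthdayModifiedAt = CASE WHEN stg.BirthdayModifiedAt > dest.BirthdayModifiedAt
                                       THEN stg.BirthdayModifiedAt ELSE dest.BirthdayModifiedAt END
    FROM dbo.PERSON AS dest
    JOIN dbo.PERSON_STAGING AS stg
        ON stg.PersonId = dest.PersonId;

That way a name change on one side and a birthday change on the other both survive; two sides only conflict when they touch the same column, and the newer timestamp wins. Is something like this reasonable, or is there a better pattern?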
Thank you
I want to create a daily process where I reload all rows from table A into table B. Over time, table A rows will change due to changes in the source system and also because of aging/deletion of records in the origin table. Table A gets truncated/reloaded daily in step 1. Table B is the master table that just gets new/updated rows.
From a historical point of view, I want to keep track of ALL the rows in table B and be able to do a point in time comparison for analytics purposes.
So I need to do two things: daily, insert rows from table A into table B if they don't exist, and also create a new record in table B if the record already exists but ANY of the columns have changed. At one point I attempted to use temporal tables, but I had too many false positives on 'real' changes; certain columns were throwing things off because a date/time column was updated (the only real change in the row).
I'm using an Azure SQL Managed Instance database (Microsoft SQL Azure (RTM) - 12.0.2000.8).
At my disposal I have SSMS, SQL Server and also Azure Data Factory.
Any suggestions on the best way to do this or tools to help with this?
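To make the requirement concrete, this is roughly the logic I'm after (just a sketch; TableA, TableB, BusinessKey and the column list are placeholders, and the noisy date/time audit column is deliberately left out of the comparison):

    -- Insert a new version into TableB when the key is new, or when any
    -- relevant column differs from the latest version already in TableB.
    -- NULL handling is simplified for brevity.
    INSERT INTO dbo.TableB (BusinessKey, Col1, Col2, LoadDate)
    SELECT a.BusinessKey, a.Col1, a.Col2, GETDATE()
    FROM dbo.TableA AS a
    LEFT JOIN dbo.TableB AS b
        ON  b.BusinessKey = a.BusinessKey
        AND b.LoadDate = (SELECT MAX(b2.LoadDate)
                          FROM dbo.TableB AS b2
                          WHERE b2.BusinessKey = a.BusinessKey)
    WHERE b.BusinessKey IS NULL                                        -- brand-new key
       OR HASHBYTES('SHA2_256', CONCAT_WS('|', a.Col1, a.Col2))
          <> HASHBYTES('SHA2_256', CONCAT_WS('|', b.Col1, b.Col2));    -- a real change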
There are two approaches, either of which you can implement:
Temporal tables
Change Data Capture (CDC)
CDC is the more commonly used approach: you can create an Azure Data Factory pipeline that loads delta data, based on change data capture (CDC) information in the source Azure SQL Managed Instance database, into Azure Blob storage.
To implement CDC, you can follow this Microsoft tutorial: Incrementally load data from Azure SQL Managed Instance to Azure Storage using change data capture (CDC)
Note: you also need to create a storage account, which is required but not covered in the tutorial above.
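For reference, enabling CDC on the source side comes down to two system procedures; dbo.TableA is just an example name, and @supports_net_changes = 1 assumes the table has a primary key:

    -- Enable CDC for the database, then for the table to be tracked.
    EXEC sys.sp_cdc_enable_db;

    EXEC sys.sp_cdc_enable_table
        @source_schema        = N'dbo',
        @source_name          = N'TableA',
        @role_name            = NULL,   -- no gating role
        @supports_net_changes = 1;      -- lets the pipeline query net changes

    -- The Data Factory pipeline in the tutorial then reads changed rows via
    -- cdc.fn_cdc_get_net_changes_dbo_TableA(@from_lsn, @to_lsn, 'all').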
I have a local and a remote Oracle database table. The remote table is updated whenever new users are registered. Right now I am using a Java scheduler to query the remote database every 30 minutes and update the newly added values in my local table. It would be really good if both these tables were in sync, that is, if a new entry added to the remote table were also reflected in my local table. Can anyone suggest an efficient way to achieve this?
Try:
to create a database link between the local and the remote database
to create a materialized view to replicate the remote table to the local table.
See https://oracle-base.com/articles/misc/materialized-views
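A minimal sketch of those two steps (connection details, object names and the 30-minute interval are placeholders; a fast refresh also needs a materialized view log on the remote master table):

    -- 1) Database link from the local database to the remote one.
    CREATE DATABASE LINK remote_db
      CONNECT TO remote_user IDENTIFIED BY remote_password
      USING 'remote_tns_alias';

    -- 2) Materialized view that refreshes itself every 30 minutes,
    --    replacing the Java scheduler.
    CREATE MATERIALIZED VIEW users_mv
      BUILD IMMEDIATE
      REFRESH FAST
      START WITH SYSDATE
      NEXT SYSDATE + 30/1440
      AS SELECT * FROM users@remote_db;

    -- On the remote database (required for fast, i.e. incremental, refresh):
    -- CREATE MATERIALIZED VIEW LOG ON users;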
I have two databases, A and B, in the same SQL Server instance. I need to write a trigger: after an update of a table in database B, it should fetch data from a few tables in database A and then insert data into some table in database B. The issue is that the user who will be accessing database B does not have access to database A. If I write the trigger under the 'sa' account, will it work when the user inserts some data in database B? Also, what would I have to do if database A were on a different SQL Server?
It can work, but you have to do a few things to get there. Here's the easiest way (though not necessarily the best):
1. Set the owner of both databases to 'sa'.
2. Turn on cross-database ownership chaining for both databases.
3. Turn on TRUSTWORTHY for the source database (B).
4. Edit the trigger and add WITH EXECUTE AS OWNER before the FOR clause.
Note that while this works, it has significant security considerations (particularly #2 and #3). Here is a link that explains this and some other methods and some of the security issues: http://msdn.microsoft.com/en-us/library/ms188304.aspx
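Roughly, the steps above look like this as a script (database, table and column names are placeholders):

    -- 1) Both databases owned by sa.
    ALTER AUTHORIZATION ON DATABASE::A TO sa;
    ALTER AUTHORIZATION ON DATABASE::B TO sa;

    -- 2) Cross-database ownership chaining on both databases.
    ALTER DATABASE A SET DB_CHAINING ON;
    ALTER DATABASE B SET DB_CHAINING ON;

    -- 3) TRUSTWORTHY on the database that holds the trigger (B).
    ALTER DATABASE B SET TRUSTWORTHY ON;

    -- 4) The trigger runs as the database owner, so the caller's own
    --    permissions on database A no longer matter.
    USE B;
    GO
    CREATE TRIGGER dbo.trg_SomeTable_Update
    ON dbo.SomeTable
    WITH EXECUTE AS OWNER
    FOR UPDATE
    AS
    BEGIN
        INSERT INTO dbo.TargetTable (Col1)
        SELECT a.Col1
        FROM A.dbo.SourceTable AS a
        JOIN inserted AS i ON i.SourceId = a.Id;
    END;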
I have one database which is updated every minute, and I need to run some transactions against it.
But the same DB is used by many applications, so to run my transactions, can I create a copy of the DB on my server? When records are updated in the parent DB, the values in my copy should be updated as well. I would use the duplicated DB only to query records; there will not be any updates/inserts.
Is there a way to achieve this in SQL Server 2008?
Background information
Let's say I have two database servers, both SQL Server 2008.
One is in my LAN (ServerLocal), the other one is on a remote hosting environment (ServerRemote).
I have created a database on ServerLocal and have an exact copy of that database on ServerRemote. The database on ServerRemote is part of a web application, and I would like to keep its data up to date with the data in the database on ServerLocal.
ServerLocal is able to communicate with ServerRemote, but this is one-way traffic; communication from ServerRemote to ServerLocal isn't available.
Current solution
I thought replication would be a nice solution, so I've made ServerLocal a publisher and subscriptions are pushed to ServerRemote. This works fine: when a snapshot is transferred to ServerRemote, the existing data is purged and the ServerRemote database is once again an exact replica of the database on ServerLocal.
The problem
Records that exist on ServerRemote but don't exist on ServerLocal are removed. This doesn't matter for most of my tables, but in some of my tables I'd like to keep the existing data (aspnet_users, for instance) and update the records if necessary.
What kind of replication fits my problem?
Option C: Transactional replication.
I've done this before where you have data in a subscription database and don't want it overwritten by the snapshot. You can set your initial snapshot to not delete the existing records and either don't create the records that are in the publisher (assume they are there) or create the records that are in the publisher (assume they are not there).
Take a look at which of those is correct for your situation, or leave a comment with more details on how you originally got your data into the subscriber. I'm not too familiar with aspnet_users and what that is. Transactional replication only helps if you don't need the data in the subscriber back at the publisher; if you do, you'll have to use merge replication.
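As a concrete example of that snapshot behaviour, the @pre_creation_cmd option of sp_addarticle controls what happens to existing subscriber data when the snapshot is applied; the publication and table names below are placeholders:

    -- 'none' leaves existing rows in the subscriber table alone;
    -- the default 'drop' recreates the table, and 'delete'/'truncate'
    -- clear it before the snapshot is applied.
    EXEC sp_addarticle
        @publication      = N'MyPublication',
        @article          = N'aspnet_users',
        @source_owner     = N'dbo',
        @source_object    = N'aspnet_users',
        @pre_creation_cmd = N'none';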