A little background: I work at a large company with many branches. We have several applications with separate databases, sometimes on different servers, but every database contains a table listing the branches and their relationships. I want to synchronize these tables automatically whenever one of them changes.
My question is: what are the best practices for automatically synchronizing tables across different databases (Microsoft SQL Server 2008)?
Are there SQL Server features for this purpose? Is an external tool a good option? Or is it better to write a small application and run it as a service or via the scheduler?
You can use replication (a built-in SQL Server feature) to synchronize different databases.
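Replication is normally configured through the SSMS wizards, but it can also be scripted with the system stored procedures. The following is only a rough sketch of publishing a single branch table with transactional replication; the names BranchDb, BranchPublication, dbo.Branch and OTHERSERVER are hypothetical, and it assumes a distributor is already configured on the publishing instance.

-- Hypothetical names throughout; the distributor must already be set up, and
-- the Log Reader / Distribution Agent jobs are not created by this script.
USE BranchDb;
GO
EXEC sp_replicationdboption @dbname = N'BranchDb', @optname = N'publish', @value = N'true';
GO
-- Create a transactional publication and its Snapshot Agent job.
EXEC sp_addpublication @publication = N'BranchPublication', @status = N'active';
EXEC sp_addpublication_snapshot @publication = N'BranchPublication';
GO
-- Publish the branch table as an article.
EXEC sp_addarticle @publication = N'BranchPublication',
                   @article = N'Branch',
                   @source_owner = N'dbo',
                   @source_object = N'Branch';
GO
-- Push the publication to the copy of the database on the other server.
EXEC sp_addsubscription @publication = N'BranchPublication',
                        @subscriber = N'OTHERSERVER',
                        @destination_db = N'BranchDb',
                        @subscription_type = N'Push';
GO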
You can also use triggers or log shipping to sync your tables as records are added, updated, or deleted; a minimal trigger sketch is shown after the links below:
Here are some links about replication.
Here are some links about log shipping.
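To illustrate the trigger approach, here is a minimal sketch that mirrors changes to a branch table into the same table on another server. It assumes a linked server named [BranchServer2] already exists, that both databases contain dbo.Branch (BranchId primary key, Name, ParentId), and that MSDTC is available, because cross-server DML inside a trigger runs as a distributed transaction; all of these names are hypothetical.

-- Hypothetical objects: linked server [BranchServer2], database BranchDb,
-- table dbo.Branch (BranchId PK, Name, ParentId). Requires MSDTC, since the
-- cross-server statements promote the trigger to a distributed transaction.
CREATE TRIGGER dbo.trg_Branch_Sync
ON dbo.Branch
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Remove the remote copies of rows that were deleted or updated locally.
    DELETE r
    FROM [BranchServer2].BranchDb.dbo.Branch AS r
    INNER JOIN deleted AS d ON d.BranchId = r.BranchId;

    -- Re-insert the current version of the inserted/updated rows.
    INSERT INTO [BranchServer2].BranchDb.dbo.Branch (BranchId, Name, ParentId)
    SELECT i.BranchId, i.Name, i.ParentId
    FROM inserted AS i;
END;

Replication or log shipping scales better than per-table triggers, but a trigger like this can be enough for a single small lookup table.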
I use SQL Server 2008 with SSMS, and want to publish tables to two different servers.
One server is used to run my application, the other is used to back up data, and the tables' schemas are the same.
1. Is it possible to publish the same tables to different servers?
2. Is there any better way to back up data immediately?
I have a transactional database (SQL Server 2014) with around 60 tables, and there is a requirement to create a separate database for reporting purposes.
This will only need to run every 24 hours; however, I will need to move the data into a different, more query-friendly schema.
Because of this, I was hoping I could just create some views on the transactional DB, then create tables based on those views in the reporting DB and copy the data across.
I originally thought of writing a scheduled Windows service that somehow extracts data from the tables and inserts it into the new ones, but then realized that any schema change would have to be applied in two places, and also figured that an Enterprise SQL Server license must have some tricks for this.
I then looked into 'database mirroring' for specific tables, but that feature looks like it will soon be deprecated.
'Log shipping' looks like more of a disaster recovery solution!
Is there an industry 'best' approach to this problem?
You will need to devise an ETL process to extract data from your source database, transform it, and load it into your reporting database. There are many tools available to make this easier: SSIS, Azure Data Factory (for Azure SQL), and plenty of other options. You can also use SQL Server Agent to schedule stored procedures that run your ETL process.
Your target database will look quite different from your source database, and there is really no quick way (quick as in scheduling a backup) to accomplish this. There is a lot of material on data warehouse and ETL design available to help you decide how to proceed.
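As a rough illustration of the stored-procedure route, here is a minimal sketch that does a full nightly reload of one reporting table from a flattening view, assuming both databases sit on the same instance; the names ReportingDb, dbo.SalesFlat, and dbo.vSalesFlat are hypothetical. A SQL Server Agent job can then call the procedure on a 24-hour schedule.

-- Hypothetical objects: reporting database ReportingDb, target table
-- ReportingDb.dbo.SalesFlat, and a flattening view dbo.vSalesFlat in the
-- transactional database. Assumes both databases are on the same instance.
CREATE PROCEDURE dbo.usp_LoadReportingSales
AS
BEGIN
    SET NOCOUNT ON;

    BEGIN TRY
        BEGIN TRANSACTION;

        -- Full reload: empty the reporting table, then copy the flattened data.
        TRUNCATE TABLE ReportingDb.dbo.SalesFlat;

        INSERT INTO ReportingDb.dbo.SalesFlat (OrderId, OrderDate, CustomerName, Amount)
        SELECT OrderId, OrderDate, CustomerName, Amount
        FROM dbo.vSalesFlat;

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
        THROW;
    END CATCH;
END;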
We have a requirement to move data between different database instances on a regular basis (e.g. some customers are willing to pay more for better performance), so this is not going to be a one-off.
The database tables have referential integrity constraints. Is there a way this can be done without rewriting a SQL script (or some other method) every time we migrate a customer's data?
I came across this: How to move data between multiple database's table while maintaining foreign-key relationships/referential integrity?. However, it appears that we have to write a script every time we migrate data (please correct me if I misunderstood the answer on that thread).
Thanks
Edit:
Both servers are using SQL Server 2012 (same version). It's an Azure SQL database.
They are not necessarily linked (no firewall between them).
We are only transferring some data, not the whole database. This is only for certain customers who opted to pay more.
The schemas are exactly the same in both databases.
Preyash - please see the documentation on the Split-Merge tool. The Split-Merge tool enables you to move data between databases, as you have described, based on a sharding key (e.g., customer ID). One modification that you will need for your application is to add a shard map (i.e., a database that understands the global state of which customer resides in which database).
Have a look at Azure Data Sync. It is much more aligned with your requirements, although you may end up needing another Azure SQL DB to act as the hub. Azure Data Sync follows a hub-and-spoke pattern and lets you set up flexible sync directions with a sync lag of only a few minutes. It is simpler and can be set up very quickly, without any scripts, as you wanted.
We have an ERP application that stores its data in an Oracle database, and we also have many other applications that use the ERP database (the same database, but different instances). We run into performance issues when the ERP and the other applications use the same database.
We are planning to split the database server into three: one for the ERP and two others for reporting and the other applications. These new database servers are derived from the ERP database, since they use the same structure and data, so we could say they are mirrors of the ERP database. Sometimes data on a mirror database will be updated by another application, and that change should also be applied to the ERP database.
What best practices and methods should be used for mirroring in this situation?
Is it enough to use Oracle Data Guard?
This is the picture of the architecture plan.
Data Guard does not allow writing to the standby. Active Data Guard does allow reading from the standby while it applies transactions from the primary node. So the report server using your ERP Mirror 1 is not a problem as long as it only reads data; writing from the other applications to ERP Mirror 2 is. What you are looking for is Advanced Replication or Oracle Streams, which is a very complex undertaking. Maybe offloading your reporting to a Data Guard standby solves your problem.
If one has a number of databases (due to separate application front-ends) that provide a complete picture - for example a CRM, accounting, and product database - what methods are available to centralize/abstract this data for easy reporting?
Essentially, I'm wondering if there is a way to automatically pull data from multiple databases into a central repository that is continuously updated from the three databases and which can be used for reporting?
I'm also open to alternative best-practice suggestions.
Look into building a data warehouse.
It is difficult to provide very specific info, since no version of SQL Server is given, but SQL Server Data Warehouse Cribsheet has some general information.
You can have views that join data from all your other databases; a sketch is shown below.
Or do you want replicated data on all servers?
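To make the view suggestion concrete, here is a minimal sketch of a reporting view that joins data across databases, assuming the CRM and accounting databases are on the same instance and the product database is reachable through a linked server; every object name here (CrmDb, AccountingDb, [ProductServer], and so on) is hypothetical.

-- Hypothetical objects: CrmDb and AccountingDb on the local instance,
-- ProductDb reachable through a linked server named [ProductServer].
CREATE VIEW dbo.vCustomerOverview
AS
SELECT c.CustomerId,
       c.CustomerName,
       a.OutstandingBalance,
       p.ProductCount
FROM CrmDb.dbo.Customer AS c
LEFT JOIN AccountingDb.dbo.Account AS a
       ON a.CustomerId = c.CustomerId
LEFT JOIN (SELECT CustomerId, COUNT(*) AS ProductCount
           FROM [ProductServer].ProductDb.dbo.CustomerProduct
           GROUP BY CustomerId) AS p
       ON p.CustomerId = c.CustomerId;

Queries that go through linked servers can be slow, which is why a scheduled load into a separate data warehouse is usually the better option once reporting volume grows.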