I am developing a web application in which I need to host the website on local servers, each with the database on the local machine itself; the local databases will change periodically. There is a central database through which I have to access all the data in all the remaining DBs.
The problem is that even when the internet connection is down, the local server will still update the local database, but when it regains the internet connection it has to update the central database with the locally modified data.
The tables (I mean the database schema, table names, and attributes) are the same in all the DBs. Newly added data should be appended, deleted data should be removed, and modified data should be updated.
I am using MySQL as the database, Apache Tomcat as the server, and JSP/Servlets for the business logic.
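To make the required behaviour concrete, here is a minimal sketch (assuming a hypothetical items table with an id primary key) of how locally changed rows could be pushed to the central MySQL database once the connection is back:

    -- Run against the CENTRAL database once connectivity is restored.
    -- Hypothetical table: items (id INT PRIMARY KEY, name VARCHAR(100), qty INT).

    -- New or modified rows collected from the local DB: insert, or update on key collision.
    INSERT INTO items (id, name, qty)
    VALUES (42, 'widget', 7)
    ON DUPLICATE KEY UPDATE
        name = VALUES(name),
        qty  = VALUES(qty);

    -- Rows deleted locally: remove them centrally by their collected ids.
    DELETE FROM items WHERE id IN (13, 27);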
Please visit http://dev.mysql.com/doc/refman/5.1/en/replication-howto.html
MySQL replication might do the job, but there are a few things that you have to consider, like:
the amount of data that has to be synchronized
the OS used on the master and slave servers
the internet connection issue - why do you disable the internet connection? One option might be a scheduled job (crontab)
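For reference, the basic master/slave setup from the linked how-to looks roughly like this (host names, credentials, and log coordinates are placeholders; binary logging and distinct server-ids must already be configured in my.cnf):

    -- On the master server: create an account the slave can replicate with.
    CREATE USER 'repl'@'%' IDENTIFIED BY 'repl_password';
    GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';

    -- On the slave server: point it at the master and start replicating.
    CHANGE MASTER TO
        MASTER_HOST     = 'central.example.com',
        MASTER_USER     = 'repl',
        MASTER_PASSWORD = 'repl_password',
        MASTER_LOG_FILE = 'mysql-bin.000001',
        MASTER_LOG_POS  = 4;

    START SLAVE;
    SHOW SLAVE STATUS;

Note that classic replication is one-directional, so the choice of which server acts as the master has to match the direction the data is supposed to flow.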
I am a new SQL developer (not a DBA or architect) and I'm working on a new system for the directors of a company.
This company has around 50 branch offices and each one of them uses the same desktop system for its inventory. The database of this system is not centralized; every branch office has its own individual database and it is not shared.
Now the directors of the company want to supervise all of the company's inventory, so they want the data of each branch to be transferred to a centralized database on the company server at the end of every day.
These are all SQL Server 2008 R2 databases. I am tasked with figuring out how to:
Transfer that data (it is not a complete replication of those databases because they only want some of it) to the centralized database without direct data access between the database of each branch and the database at the company server (they don't want it)
I have read a number of articles on the internet, but almost every one of them talks about transactional replication and the use of SSIS (this was my first option when they assigned me this task) to transfer the data between the SQL servers. I can't do that: they don't want a direct connection between the branch databases and the company database due to technical limitations imposed by the network administrators (also, I can only use ports 80 and 443).
I am hoping some of you with more experience with SQL and integration can help me with that.
I have been thinking about using web services, but I have no idea where to start. Sorry if this is a trivial question, but I have been working as an Android developer and all of this is new to me.
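One hedged sketch of the central-server side of such a web-service approach (table and column names here are hypothetical): each branch posts its daily rows over HTTPS (port 443), the service bulk-inserts them into a staging table, and a MERGE applies them to the central table:

    -- Central server (SQL Server 2008 R2). Hypothetical staging table filled by
    -- the web service from each branch's daily upload.
    MERGE dbo.Inventory AS target
    USING dbo.Inventory_Staging AS source
        ON  target.BranchId  = source.BranchId
        AND target.ProductId = source.ProductId
    WHEN MATCHED THEN
        UPDATE SET target.Quantity    = source.Quantity,
                   target.LastUpdated = source.LastUpdated
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (BranchId, ProductId, Quantity, LastUpdated)
        VALUES (source.BranchId, source.ProductId, source.Quantity, source.LastUpdated);

    -- Clear the staging rows once they have been applied.
    TRUNCATE TABLE dbo.Inventory_Staging;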
I'll start by explaining what the current setup is and then go on to where it needs to be.
Currently we have a local SQL Server database for a CMS. The database is updated from other servers on site with product information to display on the website, and CMS information is updated from an MVC application.
Moving forward, we need a remote server with a SQL Server database that is an identical copy of the local database; this database will never be updated from the remote location.
The problem arises when attempting to design a method to sync the data from the local database to the remote server with no downtime on either end. I know SQL Server Enterprise has features that would help in this case, but we do not have a licence for it at this time.
The best idea we have come up with is to log ship to the remote server, restore a copy from the database that is receiving the shipped logs, and then update the website's web.config to point to the newly restored database. This could work, but it seems overly complicated and leaves us with an ever-changing database name.
If anyone can think of a better/simpler solution, or a way to improve the current idea, it would be much appreciated.
If anything is not clear or more info is needed let me know.
I think the log shipping solution fits your needs. After a one-time setup process, the logs would be continually shipped from the local DB to the remote DB, keeping the remote in sync with the local one and providing you with the read-only copy.
Log shipping is available in all editions except Express.
There would be no need for the manual restore-and-repoint process you describe, and the database name would stay the same.
You can find more information here.
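A rough T-SQL sketch of the mechanism underneath (database name and paths are placeholders; in practice the setup is done through the log shipping wizard or the sp_add_log_shipping_* procedures):

    -- On the local (primary) server: back up the log on a schedule.
    BACKUP LOG CmsDb TO DISK = N'\\fileshare\logship\CmsDb_1200.trn';

    -- On the remote (secondary) server: the database is initialized once with
    -- RESTORE DATABASE ... WITH STANDBY, then each shipped log is applied.
    RESTORE LOG CmsDb
        FROM DISK = N'\\fileshare\logship\CmsDb_1200.trn'
        WITH STANDBY = N'D:\sql\standby\CmsDb_undo.dat';

    -- STANDBY keeps the secondary readable between restores, so the website can
    -- keep pointing at the same database name.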
I have currently been assigned to develop a sync application for my company. We have SQL Server on our database server, which will be synced with client databases. The client databases are not known in advance; they can be SQLite, MySQL, or whatever.
What this sync app does is detect changes that occur on the server and client databases, save these changes, and sync them. If changes occur on the server database they will be synced to the client database, and vice versa.
I did some research on this and came across many solutions. One of them is to use the Microsoft Sync Framework, but I could hardly find a good implementation example of it for syncing with remote databases.
Then I came across Change Data Capture (CDC) on SQL Server 2008. CDC works by reading changes to the source table from the transaction log and putting them into a separate change table; this table is then used for syncing.
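For reference, enabling CDC looks roughly like this (the database and table names are just examples); it requires sysadmin/db_owner rights, which is exactly the limitation mentioned below:

    -- Requires sysadmin to enable CDC on the database and db_owner on the table.
    USE MyDatabase;
    EXEC sys.sp_cdc_enable_db;

    EXEC sys.sp_cdc_enable_table
        @source_schema = N'dbo',
        @source_name   = N'Orders',
        @role_name     = NULL;   -- no gating role

    -- Changes then appear in cdc.dbo_Orders_CT and can be queried with
    -- cdc.fn_cdc_get_all_changes_dbo_Orders(@from_lsn, @to_lsn, N'all').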
Since I cannot use the CDC feature (I don't have sufficient database rights on my machine), I have started to develop my own solution that works similarly to CDC: I create a separate sync_table for each source table and create triggers that detect data changes and put the changed data in the sync_table.
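A minimal sketch of such a per-table capture trigger (the Orders table and the sync_Orders layout are hypothetical):

    -- Hypothetical shadow table: one per source table.
    CREATE TABLE dbo.sync_Orders (
        sync_id    INT IDENTITY PRIMARY KEY,
        order_id   INT      NOT NULL,
        operation  CHAR(1)  NOT NULL,            -- 'I', 'U' or 'D'
        changed_at DATETIME NOT NULL DEFAULT GETDATE()
    );
    GO

    CREATE TRIGGER dbo.trg_Orders_capture
    ON dbo.Orders
    AFTER INSERT, UPDATE, DELETE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Inserts, and the new side of updates.
        INSERT INTO dbo.sync_Orders (order_id, operation)
        SELECT i.order_id,
               CASE WHEN EXISTS (SELECT 1 FROM deleted) THEN 'U' ELSE 'I' END
        FROM inserted AS i;

        -- Pure deletes (rows in deleted with nothing in inserted).
        INSERT INTO dbo.sync_Orders (order_id, operation)
        SELECT d.order_id, 'D'
        FROM deleted AS d
        WHERE NOT EXISTS (SELECT 1 FROM inserted);
    END;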
However, I have been advised to do some more research before choosing the best implementation methodology.
I need to keep the following things in mind:
Databases may/may not be on the same network.
On the server side, the user must be able to select which tables will take part in the sync process.
Devices that will sync with the server database need to be registered first, meaning that all client devices will be registered by the user before they can start syncing.
As usual any help will be appreciated :)
There is an open source project called SymmetricDS with many of the same goals. Take a look at the documentation and data model to see how the problem was solved, and maybe you will get some ideas.
Instead of a separate shadow table for each source table, there is a single sym_data table where all the data is captured in comma-separated-value format. The advantage is one place to look for captured data and to retrieve changes that were part of the same transaction. The table is kept small by purging it often after data is transferred successfully.
It uses web protocols (HTTP) for data transfer. The advantage is leveraging existing web servers for performance, administration, and known filtering through firewalls. There is also a registration protocol used before clients are allowed to sync: the server admin "opens registration" for a client ID, which allows the client to connect for the first time.
It supports many different databases, so you'll find examples of how to write triggers and retrieve unique transaction IDs on those systems.
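To illustrate the single-shadow-table idea (a simplified sketch loosely modeled on sym_data, not the actual SymmetricDS schema):

    -- One capture table for all source tables; changed rows are stored as CSV.
    CREATE TABLE dbo.sync_data (
        data_id    INT IDENTITY PRIMARY KEY,
        table_name VARCHAR(128) NOT NULL,
        event_type CHAR(1)      NOT NULL,   -- 'I', 'U' or 'D'
        row_data   VARCHAR(MAX) NULL,       -- changed row values as CSV
        pk_data    VARCHAR(MAX) NULL,       -- primary key values as CSV
        created_at DATETIME     NOT NULL DEFAULT GETDATE()
    );

    -- A per-table trigger would then write one CSV row per change, e.g.:
    -- INSERT INTO dbo.sync_data (table_name, event_type, row_data, pk_data)
    -- SELECT 'Orders', 'U',
    --        CAST(i.order_id AS VARCHAR(20)) + ',' + i.status,
    --        CAST(i.order_id AS VARCHAR(20))
    -- FROM inserted AS i;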
I have a mobile DB (SQL Server CE) which synchronizes with my database on a SQL Server via merge replication.
After some troubles with the device, I had to copy the mobile DB off the device, but I couldn't sync any more before doing that.
There is some important information in it and I have no other device on which to use the DB.
I can read it with SSMS, but I need the data on the server and I don't want to copy it by hand (there are too many changed records in it, and also a lot of new ones).
Is it possible to sync the SQL CE database without a device?
Thank you
I have a WinForms business application that connects to a SQL Server on a server within the business network. Recently we have added an ASP.NET web site so some of the information within the system can be accessed from the Internet. This is hosted on the same server as the SQL Server.
Due to the limited bandwidth available to the business network from the Internet, we want to host the web site with a provider, but it needs access to the SQL Server database.
95% of data changes are made by the business using the WinForms application. The web site is essentially a read-only view of the data, but it is possible to add some data to the system, which accounts for the other 5%.
Is replication the best way to achieve the desired result, i.e. the SQL Server within the business network remains the master database (as most changes are made there) and is then replicated to the off-site server? If so, which type of replication would be the most suitable, and would it support replicating the small amount of data entered from the ASP.NET web site back to the main server?
The SQL Server is currently 2005 but can be upgraded as required for any replication requirements.
Are there other solutions to this problem?
Yes. Since the web application causes at most 5% of the transactions, you can separate it.
I mean, you can have a different DB which is a carbon copy of the master one and have the web application point to this DB.
You can set up bidirectional transactional replication, so that transactions made to the master DB are replicated to the secondary, and transactions made to the secondary DB are replicated back as well.
No need to upgrade, as SQL Server 2005 supports replication.
For further information check MSDN on replication here: Bidirectional Transactional Replication
In a nutshell, here are the steps you would follow (a rough T-SQL sketch is at the end of this answer):
Take a full backup of the master DB
Restore the DB onto the newly created DB server
Configure transactional replication between them.
For better performance, you can also have the primary DB mirrored onto some other DB server.
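A rough sketch of the first two steps and the start of the publication setup (database, share, and publication names are placeholders; the full bidirectional configuration, including loopback detection, is covered in the linked article):

    -- Step 1: full backup of the master DB on the business network server.
    BACKUP DATABASE BusinessDb TO DISK = N'\\fileshare\backups\BusinessDb.bak';

    -- Step 2: restore it onto the newly created DB server.
    RESTORE DATABASE BusinessDb
        FROM DISK = N'\\fileshare\backups\BusinessDb.bak'
        WITH MOVE 'BusinessDb'     TO N'D:\sql\data\BusinessDb.mdf',
             MOVE 'BusinessDb_log' TO N'D:\sql\data\BusinessDb_log.ldf';

    -- Step 3 (outline): enable the database for publication and create a
    -- transactional publication; repeat in the other direction for bidirectional.
    EXEC sp_replicationdboption
        @dbname  = N'BusinessDb',
        @optname = N'publish',
        @value   = N'true';

    EXEC sp_addpublication
        @publication = N'BusinessDb_pub',
        @repl_freq   = N'continuous',
        @status      = N'active';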