Best approach to move data between SQL Servers

Ok, let me explain the environment we are facing here:
We have an ASP.NET MVC 4 app that uses a SQL Server database.
This app isolates data in "projects", so when a user connects they can only work on the data of one of these projects.
Sometimes... a group of users have to travel to remote regions for some days to retrieve data for a single project, and quite often they won't be able to have an internet connection (even mobile or satellite solutions are often out of reach).
While the displaced team works on a project, people at the office can still work on the rest of the projects (but not on the one that is abroad).
So... we are pondering the possibility of using a laptop to act as a "mobile server", where users can download the data from a specific project before travelling. While abroad, they can work against this "mobile server", update any data on their project and, when they come back, they could upload their updated data to the main server.
Our idea is to create stored procedures on both servers (main and mobile) that execute different queries to update a project's data between them, passing the project identifier as a parameter, probably using linked servers so that main and mobile can see each other during update operations.
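To make the idea more concrete, something like this per-table procedure on the main server is what we have in mind (a rough sketch only; the linked server name MOBILE and the dbo.Tasks table with TaskId/ProjectId columns are placeholders, not our real schema):

```sql
-- Rough sketch: pull updated rows for one project from the mobile server
-- back into the main server over a linked server named [MOBILE].
-- Table and column names (dbo.Tasks, TaskId, ProjectId, ...) are placeholders.
CREATE PROCEDURE dbo.ImportProjectFromMobile
    @ProjectId INT
AS
BEGIN
    SET NOCOUNT ON;

    -- Update rows that exist on both servers
    UPDATE main
    SET    main.Title      = remote.Title,
           main.Status     = remote.Status,
           main.ModifiedAt = remote.ModifiedAt
    FROM   dbo.Tasks AS main
    JOIN   [MOBILE].[ProjectDb].dbo.Tasks AS remote
           ON remote.TaskId = main.TaskId
    WHERE  main.ProjectId = @ProjectId;

    -- Insert rows created while the team was away
    INSERT INTO dbo.Tasks (TaskId, ProjectId, Title, Status, ModifiedAt)
    SELECT remote.TaskId, remote.ProjectId, remote.Title, remote.Status, remote.ModifiedAt
    FROM   [MOBILE].[ProjectDb].dbo.Tasks AS remote
    WHERE  remote.ProjectId = @ProjectId
      AND  NOT EXISTS (SELECT 1 FROM dbo.Tasks AS main WHERE main.TaskId = remote.TaskId);
END;
```

We would need something like this for every table in the project, in both directions.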
Our questions here are:
Is this a good approach?
Is there any other better approach that we're not seeing?
Are there any risks we should pay attention to in this or other approaches?

I've never used bidirectional transactional replication, so if that works for you, problem solved. I do have quite a bit of experience with data migration, including merging large data sets into software-driven systems, and from that experience, replication has hurt us more than it has helped us (from a migration/merge point of view).
The biggest challenge, in my opinion, is going to be conflict resolution. I know you say that all of the data is in project-specific databases, but is there really no shared data at all? And what about multiple remote users updating the same data? In that case you're going to need a little more than just replication.
Instead of maintaining two databases at all times (one for mobile, one as the regular in-house db), why not a system where a job is run on your main system indicating that a project needs to be prepared for "offline mode" (the job could be stored procedures, SSIS packages, or straight T-SQL)? Whatever the technology used, this job would copy all of the requested project data to a new database on the remote server/laptop and mark the project in the main database as read-only, to prevent users in the office from updating that data.
Once the data is in offline mode on the remote server, the users can update and use the data as much as they want from that remote server. Then when the users get an internet connection or they are back in the office they can kick off another job that syncs the data to the main server, removes offline mode, and deletes/archives the remote database. Almost like a temporary project database.
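Something along these lines is what I mean by marking the project read-only on the main server (just a sketch; the dbo.Projects table and IsOffline column are assumptions about your schema):

```sql
-- Sketch only: flag a project as "checked out" on the main server.
-- dbo.Projects and the IsOffline column are assumptions about the schema.
ALTER TABLE dbo.Projects ADD IsOffline BIT NOT NULL DEFAULT 0;
GO

-- Run by the "prepare for offline mode" job after the project data has been
-- copied to the laptop database (and again with @Offline = 0 by the job
-- that merges the data back in).
CREATE PROCEDURE dbo.SetProjectOffline
    @ProjectId INT,
    @Offline   BIT
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE dbo.Projects SET IsOffline = @Offline WHERE ProjectId = @ProjectId;
END;
GO

-- Each write procedure (or a trigger) can then refuse changes from the
-- office while the project is away, e.g.:
-- IF EXISTS (SELECT 1 FROM dbo.Projects
--            WHERE ProjectId = @ProjectId AND IsOffline = 1)
--     RAISERROR('Project is checked out for offline use.', 16, 1);
```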
Seriously, it sounds like a fun project.
Technologies to look at:
SSIS (SQL Server Integration Services) - In my experience, this is extremely fast at moving data and gives you the ability to add logic for conflict resolution, error handling, etc. It's free (with certain SQL Server editions) and the community is huge, so supporting it should be easy. SSIS is not as dynamic as some of the specialized solutions out there.
A data migration suite like Pervasive's Data Integrator - I loved this, but it's expensive. You could write an entire solution in this product that handles the processing of your data bidirectionally, and like SSIS it allows for complex programming logic.
T-SQL - With a linked server you could just write straight queries (using stored procedures if you wanted). The problem here is security on the linked server; we don't use them because of this issue. See "Linked Servers: Good or Bad?"
Start using some of Microsoft's built-in change detection technologies right off the bat; they are much harder to introduce once you're already using the system. Change Data Capture (CDC) will give you a full history of the records updated, while Change Tracking will give you a lightweight summary of your changes. Using either technology will make syncing the data many times easier (see the Change Tracking sketch after the links below).
Change Tracking: http://msdn.microsoft.com/en-us/library/bb933874.aspx
Change Data Capture: http://msdn.microsoft.com/en-us/library/cc645937.aspx
SSIS: http://msdn.microsoft.com/en-us/library/ms169917.aspx
SQL Server Agent Jobs: http://msdn.microsoft.com/en-us/library/ms189237.aspx
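As a taste of what Change Tracking looks like in practice, here is a minimal sketch (SQL Server 2008+; the ProjectDb database and dbo.Tasks table are placeholder names):

```sql
-- Minimal Change Tracking sketch. Database and table names are placeholders.
ALTER DATABASE ProjectDb
SET CHANGE_TRACKING = ON
    (CHANGE_RETENTION = 7 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE dbo.Tasks
ENABLE CHANGE_TRACKING
    WITH (TRACK_COLUMNS_UPDATED = ON);

-- Later, the sync job asks for everything that changed since the last
-- synchronization version it stored:
DECLARE @last_sync BIGINT = 0;   -- persisted by the sync job

SELECT ct.TaskId,
       ct.SYS_CHANGE_OPERATION,  -- I, U or D
       t.Title,
       t.Status
FROM   CHANGETABLE(CHANGES dbo.Tasks, @last_sync) AS ct
LEFT JOIN dbo.Tasks AS t ON t.TaskId = ct.TaskId;

SELECT CHANGE_TRACKING_CURRENT_VERSION();  -- store this as the new @last_sync
```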

Related

Approaches for Database Synchronization

I have currently been assigned to develop a sync application for my company. We have SQL Server on our database server, which will be synced with the client databases. The client databases are not known; they could be SQLite or MySQL or whatever.
What this sync app does is detect changes that occur on the server and client databases, save these changes, and sync them. If changes occur on the server database they will be synced to the client database, and vice versa.
I did some research on it and came across several solutions. One of them is the Microsoft Sync Framework, but I could hardly find a good implementation example of using it for syncing with remote databases.
Then I came across Change Data Capture (CDC) on SQL Server 2008. CDC detects changes to the source table and puts them in a separate change table, which is then used for syncing.
Since I cannot use the CDC feature (I don't have sufficient database rights on my machine), I have started to develop my own solution which works the way CDC does: I create a separate sync_table for each source table, create triggers to detect data changes, and put this data in the sync_table.
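For illustration, my current trigger-based capture looks roughly like this (simplified; dbo.Customers and its columns are just example names, and there is one such table/trigger pair per source table):

```sql
-- Simplified sketch of the trigger-based capture described above.
-- dbo.Customers and its columns are example names only.
CREATE TABLE dbo.Customers_sync (
    SyncId     INT IDENTITY(1,1) PRIMARY KEY,
    CustomerId INT      NOT NULL,
    Operation  CHAR(1)  NOT NULL,                    -- I, U or D
    ChangedAt  DATETIME NOT NULL DEFAULT GETUTCDATE(),
    Processed  BIT      NOT NULL DEFAULT 0
);
GO

CREATE TRIGGER trg_Customers_sync
ON dbo.Customers
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Inserts and updates: the key appears in the "inserted" pseudo-table
    INSERT INTO dbo.Customers_sync (CustomerId, Operation)
    SELECT i.CustomerId,
           CASE WHEN EXISTS (SELECT 1 FROM deleted) THEN 'U' ELSE 'I' END
    FROM   inserted AS i;

    -- Deletes: the key appears only in "deleted"
    INSERT INTO dbo.Customers_sync (CustomerId, Operation)
    SELECT d.CustomerId, 'D'
    FROM   deleted AS d
    WHERE  NOT EXISTS (SELECT 1 FROM inserted);
END;
```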
However, I am advised to do some more research on it for choosing the best implementation methodology.
I need to keep the following things in mind,
Databases may/may not be on the same network.
On the server side, the user must be able to select which tables will take part in the sync process.
Devices that will sync with the server database need to be registered first. Meaning that all client devices will be registered by the user before they can start syncing.
As usual any help will be appreciated :)
There is an open source project called SymmetricDS with many of the same goals. Take a look at the documentation and data model to see how the problem was solved, and maybe you will get some ideas.
Instead of a separate shadow table for each source table, there is a single sym_data table where all the data is captured in comma-separated-value format. The advantage is one place to look for captured data and to retrieve changes that were part of the same transaction. The table is kept small by purging it often after data is transferred successfully.
It uses web protocols (HTTP) for data transfer. The advantage is leveraging existing web servers for performance and administration, and known filtering through firewalls. There is also a registration protocol used before clients are allowed to sync: the server admin "opens registration" for a client ID, which allows the client to connect for the first time. It supports many different databases, so you'll find examples of how to write triggers and retrieve unique transaction IDs on those systems.

How do I interface an xBase-based ERP to a web application?

I am required to set up a web application that will interact with an existing ERP system (WinMagi). The ERP is basically a front-end to an xBase (FoxPro) database. The database is located on an in-house server. The ERP, as far as I'm aware, doesn't have an API but can accept purchase orders, etc. through an EDI module. The web application should be able to accept online orders and query data for reporting.
My plan so far:
Synchronize the xBase DB to a SQL server instance on a cloud hosted VM.
(one-way from ERP -> SQL Server)
Use this sync process as an interface between the ERP and web application.
Push purchase orders back to the ERP using EDI.
My thinking here is that it would be safer from a data concurrency perspective to create or update data in the ERP through a controlled and accepted (by the ERP) interface.
Questions/Concerns:
What is the best way to update the SQL DB from the xBase DB? Are there any pre-existing libraries that can do this so I don't have to reinvent the wheel?
Would the xBase DB become locked during the sync? Or would the sync otherwise cause any issues for the live ERP?
How do I avoid data concurrency / integrity problems during the sync?
This system wouldn't be serving live data to the web app. What sort of issues can I expect due to this?
Should I prefer one language over another for this sort of project? My plan was to use Java/Hibernate MVC.
Am I perhaps going about this the wrong way? Would I be better off interfacing my web app directly with the xBase DB? Some problems that immediately spring to mind with this approach are networking issues between the office and the cloud-based VM and potential security vulnerabilities from opening up the ERP directly to the internet.
Any advice or suggestions you might be able to provide would be greatly appreciated!! Thanks in advance.
UPDATE - 3 Sep 2012
How I'm currently doing the data copy (it's not a synchronization) - runs nightly:
A linux box in the office copies the required DBFs from a read-only share on the ERP server to local storage.
The DBFs are converted to CSV using Dave Burton's fantastic dbf2csv Perl script.
The resulting CSVs are rsync'd to the remote VM. There are only small changes in the data so this is quite fast.
Once the rsync is complete the remote VM does a mysqlimport to the production DB.
Advantages of this approach
The ERP cannot be damaged in any way as the network access is read-only.
No custom logic has to be implemented to sync data and hence there are no concerns that the data could be wrong on the remote VM.
As the data copy runs at night the run time isn't too important.
Current run time is approx 7 minutes for over 1 million records with approx 20-30 fields per record.
Longest phases are the DBF copy and conversion to CSV.
Disadvantages
The DBFs have to be copied in full every time.
The DBFs have to be converted in full every time.
Tables that are being copied are locked during the mysqlimport. This isn't really too much of an issue though as the import runs during the night and the mysqlimport only takes about 20 seconds.
If you are using Visual FoxPro 3.0 or greater, you could use the built-in database container (DBC) to create a connection to the SQL Server DB. Then the views in the .DBC would do the heavy lifting of reading and updating the SQL Server tables.
I would envision a routine that loops through your FoxPro table, reads the rows, and then makes the updates to the SQL Server DB, so the FoxPro tables shouldn't be locked. To ensure this, you could first query the DBFs into a cursor, then loop through the cursor.
I would suggest adding a procedure to do concurrency checking.
Another option, to serve live FoxPro data to your web apps, would be to create a linked server in SQL Server to your FoxPro database. That way your FoxPro data could be accessed in real time.
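For example (assuming the Visual FoxPro OLE DB provider, VFPOLEDB, is installed on the SQL Server machine; the .DBC path and the "invoices" table below are just placeholders):

```sql
-- Sketch: linked server from SQL Server to a Visual FoxPro database.
-- Requires the VFP OLE DB provider (VFPOLEDB) on the SQL Server box;
-- the .DBC path and the "invoices" table are placeholders.
EXEC sp_addlinkedserver
     @server     = 'VFPDATA',
     @srvproduct = 'Visual FoxPro',
     @provider   = 'VFPOLEDB',
     @datasrc    = 'C:\erp\data\erp.dbc';

-- Query the FoxPro data in (near) real time from T-SQL:
SELECT *
FROM   OPENQUERY(VFPDATA, 'SELECT * FROM invoices WHERE invdate >= DATE() - 7');
```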
I am currently doing something similar - I have to make invoice transactions from a FoxPro-based system available through a web application that will be on a remote, hosted VM running SQL Server.
I will answer your first point based on what I'm doing - you can decide for yourself whether it would work for you!
What is the best way to update the SQL DB from the xBase DB? Are there any pre-existing libraries that can do this so I don't have to reinvent the wheel?
I didn't really look for any shared libraries. What I did was (somewhat simplified):
Added a field to the ERP-side transaction table that holds a CRC32 value based on other fields that I want to detect changes to (for example, the transaction balance).
Wrote a standalone EXE that scans the ERP-side transaction table on a timer, calculates a CRC32 value based on some fields, compares this to the last CRC32 value stored in the new field from point 1, and if different then something has changed and the transaction needs to be re-sent. This EXE was written in VFP for simplicity in accessing DBF files, and it runs as a Windows service. When I get time it will be re-done in C#.
Still in this EXE, once I have a list of new or changed transactions I convert them to JSON. I rolled my own JSON functions, but you could use Craig Boyd's from Sweet Potato Software or a number of others. There may be a PDF document associated with the transaction; if so, it is encoded and embedded in the JSON.
I send the JSON to a web service on the remote side using a class that leverages the standard Windows WinHTTP library (WinHttp.WinHttpRequest.5.1). The remote web service is essentially running Java. It decodes it all and updates SQL Server.

Suitable method for synchronising online and offline data

I have two applications, each with its own database.
1.) A desktop application with a VB.NET WinForms interface, which runs on an offline enterprise network and stores data in a central database [SQL Server].
All the data entry and other office operations are carried out and stored in the central database.
2.) The second application is built on PHP. It has HTML pages and runs as a website in an online environment. It stores all its data in a MySQL database.
This application is accessed by registered members only, and they are provided with different reports on the data processed by the first application.
Now I have to synchronize data between the online and offline database servers. I am planning the following:
1.) Write a small program to export all the data from SQL Server [the offline server] to a file in CSV format.
2.) Log in to the admin section of the live server.
3.) Upload the exported CSV file to the server.
4.) Import the data from the CSV file into the MySQL database.
Is the method I am planning good, or can it be tuned to perform better? I would also appreciate other good ways of doing data synchronisation, other than changing the applications (i.e. converting the network application to one that uses the MySQL database).
What you are asking for does not actually sound like bidirectional sync (that is, movement of data both ways, from SQL Server to MySQL and from MySQL to SQL Server), which is a good thing as it really simplifies things for you. Although I suspect your method of using CSVs (which I assume you would generate with something like BCP) would work, one of the issues is that you are moving ALL of the data every time you run the process, and you are basically overwriting the whole MySQL db every time. This is obviously somewhat inefficient, not to mention that during that time the MySQL db would not be in a usable state.
One alternative (assuming you have SQL Server 2008 or higher) would be to look into using this technique along with Change Tracking or Change Data Capture. These are capabilities within SQL Server that allow you to determine which data has changed since a certain point in time. What you could do is create a process that just extracts the changes since the last time you checked to a CSV file and then applies those to MySQL. If you do this, don't forget to apply the deletes as well.
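For example, with Change Data Capture the extract side might look something like this (a rough sketch; CDC requires an edition that supports it, and dbo.Orders with OrderId/CustomerId/Total are placeholder names):

```sql
-- Rough CDC sketch (SQL Server 2008+, edition permitting). dbo.Orders is a
-- placeholder table name.
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Orders',
     @role_name     = NULL;
GO

-- The extract job reads everything captured between two log sequence numbers,
-- including deletes (__$operation = 1), and writes it out (e.g. via bcp) to
-- be applied to MySQL.
DECLARE @from_lsn BINARY(10) = sys.fn_cdc_get_min_lsn('dbo_Orders');
DECLARE @to_lsn   BINARY(10) = sys.fn_cdc_get_max_lsn();

SELECT __$operation,          -- 1 = delete, 2 = insert, 4 = update (after image)
       OrderId, CustomerId, Total
FROM   cdc.fn_cdc_get_all_changes_dbo_Orders(@from_lsn, @to_lsn, N'all');
```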
I don't think there's an off the shelf solution for what you want that you can use without customization - but the MS Sync framework (http://msdn.microsoft.com/en-us/sync/default) sounds close.
You will probably need to write a provider for MySQL to make it go - which may well be less work than writing the whole data synchronization logic from scratch. Voclare is right about the challenges you could face with writing your own synchronization mechanism...
Do look into SQL Server Integration Services as a good alternative.

Data migration process for an application whose architecture is changing?

After having used an application for over 10 years, and having been constantly limited by its lack of extensibility, we have decided to rewrite it fully from scratch. Because the new architecture differs from the old application's, the database is also different. Here comes the problem: is there any industrial process for migrating the data from the previous database to the new one? Some tables are alike, others are not. Overall, we need a process that will help us make sure that no data or logical constraints are lost during the migration.
PS: The old and new database are both Oracle databases.
Although you don't specify this in your question, I assume that you're going to develop the new version of your application/database, and then at some switchover point you need to migrate all of the live data from your old database into your new database.
If this is the case, then you're really asking about two distinct processes: the migration (with some modifications) of the database structure, followed later by the migration of the data itself.
For the first process, the best tool is you, the developer (I don't mean you're a "tool" - you know what I mean). You could bring over the structure of the old database and then change it as necessary for the new version; however, this approach in general tends to leave too much of the old structure behind. I think it's better to take advantage of the situation and rebuild the database from the ground up, using the original database just as a general reference.
For the second process, I would treat the data migration as a separate task requiring a separately-written and -tested application. This application could be a set of scripts or a compiled application or whatever is most convenient for you. Because your old and new databases will not have the same structure (and may in fact be very different), there are no commercial tools out there that will handle this task for you automagically. By treating this as a distinct application that you write yourself, you can test the data conversion process many times before your "go live" date.
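For instance, the body of such a conversion application is mostly statements of this shape, repeated and refined per table (old_app/new_app and the column mapping here are invented for illustration; both databases being Oracle, plain SQL like this works on either side):

```sql
-- Hypothetical fragment of a migration script; old_app/new_app and the
-- table/column names are made up. Each mapping is written once, reviewed,
-- and re-run against a fresh copy of the new schema until every check passes.
INSERT INTO new_app.customer (customer_id, full_name, created_on)
SELECT cust_id,
       first_name || ' ' || last_name,   -- the old schema stored the name in two columns
       creation_date
FROM   old_app.customers;

-- Cheap sanity check after each table: the row counts must match.
SELECT COUNT(*) FROM old_app.customers;
SELECT COUNT(*) FROM new_app.customer;
```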
I've heard of several different ways to attack problems such as this. The simplest solution I've seen is to use a Microsoft Access database and ODBC connections to connect to both the new and old Oracle databases. You can then use Access to migrate and transform the data as you need.
The more elegant solution involves installing the Microsoft SQL Server development tools. You can use Business Intelligence Development Studio to create an SSIS package with two Oracle endpoints. SSIS can handle the heavy lifting of transforming the data between the databases, and you can run the package locally so you don't have to have an instance of SQL Server running anywhere.
There's a tutorial series for SSIS at:
http://www.developerdotstar.com/community/node/364
You might also want to check out Oracle Warehouse Builder (OWB). The name is a little confusing, but it's Oracle's ETL (Extract, Transform, and Load) package. I've never used it personally, but it might do what you're looking to do as well.

Microsoft Sync Services - good solution for me?

We upload sales transactions from our stores to the headoffice server. At the moment we use DTS (SQL Server Data Transformation Services), but we're planning on replacing that with Microsoft Sync Services for ADO.NET, as this seems to be Microsoft's preferred solution for this type of setup and we want to follow the standard (which will hopefully be around for a long time).
Here are the details of our setup and what we’re planning. I’m looking for some advice, especially about whether Sync Services fits into our solution.
Situation
Each store has a 3rd party EPOS system which stores sales in a Microsoft Access 2000 database, which we can access. Our headoffice database is SQL Server 2005, but will be upgraded to 2008. The headoffice is not on a VPN with all the stores, but we can open up our firewall to the stores’ IP addresses, so that they can send data directly to SQL Server. The stores are always connected to the internet via ADSL, although they do lose connection and we don’t want to lose sales data.
We are only uploading transactions from the store – definitions do not need to be downloaded.
Current solution
We have written a Windows service that runs on the store PC. This service downloads a DTS package from the server (which contains all the details of the upload) and runs it in the store – and this will upload sales to our server.
We chose DTS, because it is free when you install MSDE. We can’t use SSIS, because that would require a SQL Server licence at every store.
Another reason we chose DTS is that the details of the upload (i.e. which tables and fields to include) are stored on our headoffice server, so if we need to change things we can do that centrally and don’t need to install anything new at the stores. This isn’t a showstopper, but would be nice to have this ability in our new solution.
Potential solution - Microsoft Sync services for ADO.NET
We are currently building a proof of concept with Microsoft Sync services for ADO.NET. The idea is to put SQL CE (SQL Server Compact 3.5) in each store (client) and sync that to the headoffice SQL Server 2005 database (server). We’ll get the data into the SQL CE database either by (1) syncing it with the Access 2000 database or (2) getting the EPOS system developers to write sales straight into the SQL CE database – probably (2). But our main concern is getting the data from the store to the headoffice server. This method seems to be Microsoft’s preferred solution for occasionally connected systems and that is what made us look seriously at Sync Services.
I’m hoping that using this will mean that most of the work needed to upload the sales will be built into Sync Services and we won’t have to re-invent the wheel.
Potential solution - Upload to a custom webservice
There is also the possibility of uploading the sales transactions to a custom web service on our headoffice server and then into our SQL Server database. This means that we will have to build our own mechanism for determining which rows are new, as well as caching for when the systems are disconnected. Also, we might be missing out on other functionality that will come built into Sync Services.
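For example, on the headoffice side the web service could call something like this, so that batches re-sent after a dropped connection don't create duplicates (a sketch only; the dbo.Sales table and its columns are made-up names):

```sql
-- Sketch of an idempotent upload procedure on the headoffice server.
-- dbo.Sales and its columns are made-up names; the point is that keying on
-- (StoreId, TransactionId) makes re-sent uploads harmless.
CREATE PROCEDURE dbo.UploadSale
    @StoreId       INT,
    @TransactionId INT,
    @SoldAt        DATETIME,
    @Total         MONEY
AS
BEGIN
    SET NOCOUNT ON;

    IF NOT EXISTS (SELECT 1 FROM dbo.Sales
                   WHERE StoreId = @StoreId
                     AND TransactionId = @TransactionId)
    BEGIN
        INSERT INTO dbo.Sales (StoreId, TransactionId, SoldAt, Total)
        VALUES (@StoreId, @TransactionId, @SoldAt, @Total);
    END
END;
```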
Please let me know if you have any advice that will help, especially: "Is Sync Services the right solution?" The problem that we are trying to solve seems very generic (uploading sales from stores), and I'd like to solve it with a generic solution.
Microsoft Sync Services is more than you need, but it will certainly do what you want, and it was built with your type of application in mind.
As with most new technologies out of Microsoft (caution: generalization!), you may find that it's not as mature as you might like. It'll do what you need it to, but you may run into issues that aren't easily resolved because it hasn't been put through the wringer. As an early adopter, though, you may find that the Sync developers are eager to help you out when you get stuck, so this isn't as big a problem as it might seem.
Make sure you read through all the literature on it, some of which is here, or linked in the following sites:
http://msdn.microsoft.com/en-us/sync/default.aspx
http://msdn.microsoft.com/en-us/sync/bb887608.aspx
http://en.wikipedia.org/wiki/Microsoft_Sync_Framework
Given your one-way flow of information and centralized layout, though, I expect you should have few, if any, issues setting it up and using it.
Be sure to report your experience back here!
-Adam
