We have a Firebird database connected to our access control system, and a separate web app that I developed for our time and attendance, using SQL Server 2005 as the data source.
I wanted to use Entity Framework to connect to the Firebird database to access data like users, transactions, sites, etc. Since getting that connection working through the Firebird .NET provider is very complicated, the other option I have is creating a sort of replication (mirror) from the Firebird database to SQL Server.
I have done this with DTS previously (selecting the data and then inserting it) and it worked fine, but it involved many manual steps, and handling updates made it difficult.
Is there a simpler way to do this? Any suggestions would be appreciated.
Unfortunately you need to track what to replicate at the data level. If you are only pushing data to the MS SQL database, you could use a modified timestamp, or a record version field (create a generator, and set a trigger to update the version field on every update) to reduce what you select. Another popular option is to update a field to CURRENT_TRANSACTION, but if you do a backup/restore the transaction counter starts again from 0.
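For illustration, a minimal sketch of the record version approach in Firebird PSQL (the USERS table and RECORD_VERSION column are hypothetical names):

    -- create a generator and bump the version field on every insert/update
    CREATE GENERATOR GEN_RECORD_VERSION;

    SET TERM ^ ;
    CREATE TRIGGER TRG_USERS_VERSION FOR USERS
    ACTIVE BEFORE INSERT OR UPDATE POSITION 0
    AS
    BEGIN
      NEW.RECORD_VERSION = GEN_ID(GEN_RECORD_VERSION, 1);
    END^
    SET TERM ; ^

The replication job can then select only rows whose RECORD_VERSION is above the high-water mark it stored after the previous run.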
If you are sending data both ways it gets more complicated -- you need conflict resolution. You could look at something like the Microsoft Sync Framework, which can make use of the methods above.
I'm modeling a new microservice architecture, migrating part of a monolithic application to microservices.
I'm adding a new PostgreSQL database, and the idea is to use that database in the future. For now, though, I still need to keep the old SQL Server database updated, and also synchronize the PostgreSQL database whenever something new appears in the old database.
I've searched for ETL tools, but they are meant for moving data to a data warehouse (that's not what I need). I can't simply replicate the information, because the DB models are not the same.
Basically, I need a way to detect new rows inserted in the SQL Server database, transform that information, and insert it into my PostgreSQL database.
Any suggestions?
PostgreSQL's foreign data wrappers might be useful. My approach would be to change the frontend to use PostgreSQL and let PostgreSQL handle the split via its various features (triggers, rules, ...).
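As a rough sketch of the FDW route (tds_fdw is one wrapper that can reach SQL Server; every name below is hypothetical):

    -- install the wrapper and point it at the legacy SQL Server
    CREATE EXTENSION tds_fdw;

    CREATE SERVER legacy_mssql FOREIGN DATA WRAPPER tds_fdw
      OPTIONS (servername 'old-db-host', port '1433', database 'LegacyDb');

    CREATE USER MAPPING FOR CURRENT_USER SERVER legacy_mssql
      OPTIONS (username 'sync_user', password 'secret');

    -- expose a legacy table locally; transform it with ordinary SQL from here
    CREATE FOREIGN TABLE legacy_customers (
      id   integer,
      name varchar(100)
    ) SERVER legacy_mssql OPTIONS (table_name 'dbo.Customers');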
Take a look at StreamSets Data Collector. It can detect changes in SQL Server and insert/update/delete to any DB that has a JDBC driver, including Postgres. It is open source, but you can buy support. You can also change, add, remove, or rename fields in the data stream so that they match the target table.
What we do
We run a website that provides statistics. We used to run Access as the backend database, but we have now made the transition to SQL Server.
How we work
When we receive new statistics, we put them in a staging table for proofreading and testing before exporting them to the live database. We now use Access as a frontend for SQL Server with linked tables. This works fine.
What is the best way to have an Access database with staging tables that, when ready, can be exported to a table in SQL Server? Mind you, the final process should be fairly simple and non-technical, as the reason for using Access is its relatively user-friendly UI. Using SQL Server Management Studio would be too technical for the users handling the data.
Let me stress that the solution we need is not a one-time conversion of a table or database, but one for staging changes and then pushing them to SQL Server.
We ended up using linked tables and a local staging table, which we upsize when the data is ready to be pushed to the production database.
https://support.office.com/en-us/article/Move-Access-data-to-a-SQL-Server-database-by-using-the-Upsizing-Wizard-5d74c0df-c8cd-4867-8d07-e6e759d72924
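In practice, the push can be as simple as an Access append query against the linked table, so the users only run one saved query (all names below are made up):

    -- Access append query; dbo_Statistics is the linked SQL Server table
    INSERT INTO dbo_Statistics (SiteID, StatDate, Visitors)
    SELECT SiteID, StatDate, Visitors
    FROM StagingStatistics
    WHERE Approved = True;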
Here's some documentation on what I discussed in the comments. If you have never used Integration Services before, the beauty of it is that once the import procedure has been created, it can be used over and over again.
So once you have the data in Excel format (or even Access, if you really wanted), you can follow the steps in the link below:
Creating A Simple SSIS Package
We have an application that requires our customers to have a SQL Server instance on site. At their request, the application needs to synchronize the data in their database with a copy in our datacenter.
We're using .Net 3.5 SP1. We need to synchronize the data exactly, including IDENTITY columns.
We'd prefer to use something like LINQ to SQL that would let us make some simple select and insert/update calls against mapped entities. However, the IDENTITY columns seem to be a problem with LINQ and similar approaches.
We can do all of this with hand-built SQL statements, turning IDENTITY_INSERT on and off as needed, but I'd prefer a more elegant solution.
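For reference, the hand-built SQL we are trying to avoid looks roughly like this (the table and columns are just an example):

    -- T-SQL; allows explicit values in the IDENTITY column during the copy
    SET IDENTITY_INSERT dbo.Customers ON;

    INSERT INTO dbo.Customers (CustomerID, Name)
    VALUES (42, N'Example Customer');

    SET IDENTITY_INSERT dbo.Customers OFF;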
Thanks!
** Edit - We DO need to write our own solution, and we do need to use .Net 3.5 SP1 to do it. I won't waste your time explaining all the reasons why, but please limit suggestions to options within the .Net playground.
Microsoft Sync Framework could be your solution. This is the framework description from Microsoft:
Microsoft Sync Framework is a data synchronization platform from Microsoft that can be used to synchronize data across multiple data stores. Sync Framework includes a transport-agnostic architecture, into which data store-specific synchronization providers, modelled on the ADO.NET data provider API, can be plugged in.
Sync Framework is a comprehensive data synchronization solution that enables developers to build solutions that support synchronization of any database, on any data protocol over any network topology. msdn.microsoft.com
For your convenience, here is a link to a good tutorial on the subject.
If it is just a couple of tables that need to be synchronized, and there is not a lot of data in them (now or in the future), you could develop some sort of bulk copy from your servers and a bulk insert routine on the customer's server.
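A sketch of what that could look like (server, paths, and table names are placeholders):

    -- on your server: export in native format with the bcp command line tool
    -- bcp OurDb.dbo.Orders out orders.dat -n -S ourserver -T

    -- on the customer's server: load the transferred file with T-SQL
    BULK INSERT dbo.Orders
    FROM 'C:\sync\orders.dat'
    WITH (DATAFILETYPE = 'native');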
Since you said you can't use SQL Server replication services or SSIS, perhaps a backup/restore procedure could be written. You could take a scheduled backup of your database and make it available to calling applications, which could then copy the backup, restore it to another instance on the customer's server, and then pull all the data you need via any number of methods, so it would exist locally on the customer's servers.
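A hedged sketch of those two steps (the database name, paths, and logical file names are placeholders):

    -- on your side: take the scheduled backup
    BACKUP DATABASE OurDb TO DISK = 'C:\backups\OurDb.bak' WITH INIT;

    -- on the customer's instance, after copying the file over:
    RESTORE DATABASE OurDb_Copy
    FROM DISK = 'C:\sync\OurDb.bak'
    WITH MOVE 'OurDb'     TO 'C:\data\OurDb_Copy.mdf',
         MOVE 'OurDb_log' TO 'C:\data\OurDb_Copy.ldf';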
Beyond that, I think you may be asking for a maintenance and synchronization nightmare if you can't base your solution on tools that are made to do this sort of thing.
I have one SQL Server Express instance with a pretty normal, well-formed database. I need the data continuously replicated to a SQL Server Express instance on another server.
Now, I know that SQL Server Express does not include the Publisher part of built-in replication, so I'm looking for alternative solutions. I do not want to upgrade either of the databases.
Naturally, I could build my own replication with GUIDs, timestamps, etc. and transfer the data using my own code (as suggested in SQL Server Express database replication/synchronization), but I would like to avoid all that work, especially since the replication is really very basic.
Perhaps a generic trigger added to each table?
Perhaps some kind of database job?
Any suggestions?
You wouldn't be able to use any built-in job scheduling, because Express does not ship with SQL Server Agent.
Here are your options as I see it:
Write an application that transfers "articles" from your "publisher" db to your "subscriber" db(s)
Create a set of views that summarize the data you want published. Then create INSTEAD OF triggers on these views (you can't create an AFTER/FOR trigger on a view) to process that data and transfer it to your "subscriber"(s) -- see the sketch below.
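A rough sketch of that second option (every object name here is hypothetical):

    -- T-SQL: the published view plus an INSTEAD OF trigger that forwards rows
    CREATE VIEW dbo.PublishedOrders AS
      SELECT OrderID, CustomerID, Total FROM dbo.Orders;
    GO

    CREATE TRIGGER trg_PublishedOrders ON dbo.PublishedOrders
    INSTEAD OF INSERT
    AS
    BEGIN
      -- write the rows to the base table as usual
      INSERT INTO dbo.Orders (OrderID, CustomerID, Total)
      SELECT OrderID, CustomerID, Total FROM inserted;

      -- and queue them for the "subscriber" (picked up later by a transfer job)
      INSERT INTO dbo.SubscriberQueue (OrderID, QueuedAt)
      SELECT OrderID, GETDATE() FROM inserted;
    END;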
Neither of those is a very intensive task. In my opinion, just to have it centralized, I would go the first route. That way all of the logic is contained within the application, and your "publisher" database stays ignorant of the replication. Not to mention your application could handle an unavailable subscriber pretty easily.
I have two applications, each with its own database.
1.) A desktop application with a VB.NET WinForms interface; it runs on an offline enterprise network and stores data in a central database [SQL Server].
**All the data entry and other office operations are carried out and stored in the central database.
2.) The second application is built on PHP. It has HTML pages and runs as a website in an online environment. It stores all data in a MySQL database.
**This application is accessed by registered members only, and it provides them with different reports of the data processed by the first application.
Now I have to synchronize data between the online and offline database servers. I am planning the following:
1.) Write a small program to export all the data from SQL Server [the offline server] to a file in CSV format.
2.) Log in to the admin section of the live server.
3.) Upload the exported CSV file to the server.
4.) Import the data from the CSV file into the MySQL database.
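For step 4, I expect to use something like this (the file name and table are placeholders):

    -- MySQL: load the uploaded CSV into the reporting table
    LOAD DATA LOCAL INFILE '/uploads/export.csv'
    INTO TABLE transactions
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;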
Is the method I am planning good, or can it be tuned to perform better? I would also appreciate other good ways of synchronizing the data, short of changing the applications themselves (e.g. porting the network application to another one that uses the MySQL database).
What you are asking for does not actually sound like bidirectional sync (i.e. moving data both ways, from SQL Server to MySQL and from MySQL to SQL Server), which is a good thing, as it really simplifies things for you. Although I suspect your method of using CSVs (which I assume you would produce with something like BCP) would work, one issue is that you are moving ALL of the data every time you run the process, basically overwriting the whole MySQL db each time. This is obviously somewhat inefficient, not to mention that during that window the MySQL db would not be in a usable state.
One alternative (assuming you have SQL Server 2008 or higher) would be to look into using this technique along with integrated Change Tracking or Change Data Capture. These are capabilities within SQL Server that allow you to determine which data has changed since a certain point in time. What you could do is create a process that extracts just the changes since the last time you checked into a CSV file, and then applies those to MySQL. If you do this, don't forget to apply the deletes as well.
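A minimal sketch of the Change Tracking route (the database and table names are made up):

    -- T-SQL: enable tracking once
    ALTER DATABASE OfficeDb SET CHANGE_TRACKING = ON
      (CHANGE_RETENTION = 7 DAYS, AUTO_CLEANUP = ON);

    ALTER TABLE dbo.Entries ENABLE CHANGE_TRACKING;

    -- each sync run: pull only the rows changed since the last synced version
    DECLARE @last_version bigint = 0;  -- persist this value between runs
    SELECT ct.SYS_CHANGE_OPERATION, ct.EntryID, e.*
    FROM CHANGETABLE(CHANGES dbo.Entries, @last_version) AS ct
    LEFT JOIN dbo.Entries AS e ON e.EntryID = ct.EntryID;

    SELECT CHANGE_TRACKING_CURRENT_VERSION();  -- store this for the next run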
I don't think there's an off-the-shelf solution for what you want that you can use without customization -- but the MS Sync Framework (http://msdn.microsoft.com/en-us/sync/default) sounds close.
You will probably need to write a provider for MySQL to make it work -- which may well be less work than writing the whole data synchronization logic from scratch. Voclare is right about the challenges you could face with writing your own synchronization mechanism...
Do look into SQL Server Integration Services as a good alternative.