Synchronizing 2 databases

I have a database in MySQL and another database that runs on MS SQL.
The MySQL database is the backend for my website, which runs on Joomla.
I have an ERP running my store. This ERP is made by a third party in .NET.
A table called orders gets updated whenever a user places an order on my website.
The order details must be pushed to the orders table in my ERP.
The table structures in the two databases are totally different, so I will do the mapping myself.
My questions are:
1. How frequently should I transfer the data from my MySQL database to MS SQL?
2. Someone suggested that I could write a web service that would periodically pump data to my table in the ERP, so I started thinking about NuSOAP web services. Is this the right way, or is there a better way to do it?
3. I will also have to retrieve inventory-related information from my ERP to my MySQL database.

1: Depends on how often your data is changing, and how often you need to sync up (i.e., depends on your business).
2 & 3: A web service to transfer data could work just fine. But unless you're trying to come up with a general solution, this sounds like a lot more trouble than it's worth.
If I were doing this, I would export the data from SQL Server to a file, then import that file into MySQL (mysql my_db < file.sql).
Getting data OUT of SQL Server in this format isn't so easy (there's no equivalent to mysqldump on SQL Server). But check out this question for some ideas.
If the data itself is compatible between systems (if the columns are equivalent data types), you can overcome the table structure differences by just creating a query in SQL Server which exports the data in the correct order.
In fact, you may be able to create a query whose output is the file.sql for import into MySQL. For example, a query such as:
SELECT CONCAT(
    'INSERT INTO MYTABLE VALUES (',
    myColumn,
    ',',
    myOtherColumn,
    ');'
) AS SQL_STATEMENT
FROM MySourceTable
Produces output something like:
INSERT INTO MYTABLE VALUES (myColumnValue1, myOtherColumnValue1);
INSERT INTO MYTABLE VALUES (myColumnValue2, myOtherColumnValue2);
....
I've exported data from sql server that way on at least one occasion.
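For what it's worth, the export can also be scripted. Below is a minimal sketch in Python, assuming pyodbc for SQL Server access; the connection string, table and column names are placeholders, and text columns would need proper quoting/escaping before this is production-ready.
import pyodbc  # pip install pyodbc

# Connection string, table and column names below are placeholders.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=erp-host;"
    "DATABASE=erp_db;UID=export_user;PWD=secret"
)
cursor = conn.cursor()
cursor.execute("SELECT order_id, customer_id, total FROM dbo.Orders")

with open("file.sql", "w") as f:
    for order_id, customer_id, total in cursor:
        # Numeric columns only here; string columns would need quoting/escaping.
        f.write(f"INSERT INTO MYTABLE VALUES ({order_id}, {customer_id}, {total});\n")

conn.close()
# Then, on the MySQL side:  mysql my_db < file.sql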

How up to date do you need the MS SQL database to be? That is going to be the deciding factor.
I don't see any huge advantage to this being a web service.
This isn't a question.

Deciding how often you transfer the orders across is a business decision, not a technical one. But it is hard to see what competitive advantage you might gain from not processing your customers' orders as soon as possible, so it ought to be a no-brainer.
Without knowing a lot more about your infrastructure and architecture we cannot give you definitive advice about the approach. I would expect a decently written ERP package to include interfaces for importing and exporting information. Alas, such expectations are often confounded. If you do need to write your own interface, avoid web services. Unless you have a very peculiar set-up, all a web service will mean is that it will take longer to satisfy your customers. I think we have already agreed that is not a good idea.
Considerations for a Synchronization API:
* You need to track which new orders have not been transferred to the ERP database. A flag is clumsy; a queue is perhaps more elegant.
* Have a job/daemon polling continuously to identify orders which need to be transferred, and transfer them in near-real time (see the sketch after this list).
* Have a plan for handling the unavailability of the ERP database.
* Construct the mapping in a modular fashion so you do not have to rewrite the entire thing just because of a change to the structure of one of your tables.
* The inventory data will probably have to be pulled from the MySQL database, as it seems unlikely that the third party will allow you to put code into their database. But it's worth reading the contract.
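As a rough illustration of the queue-plus-polling idea, here is a minimal sketch in Python. The pymysql/pyodbc drivers, the order_sync_queue table and all column names are assumptions for the example, not anything from your actual schema.
import time
import pymysql   # MySQL driver (pip install pymysql)
import pyodbc    # SQL Server driver (pip install pyodbc)

def sync_pending_orders():
    web_db = pymysql.connect(host="localhost", user="web", password="secret", database="joomla_db")
    erp_db = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};SERVER=erp-host;DATABASE=erp_db;UID=sync;PWD=secret")
    try:
        with web_db.cursor() as cur:
            cur.execute("SELECT queue_id, order_id, customer_id, total FROM order_sync_queue WHERE transferred = 0")
            pending = cur.fetchall()
        for queue_id, order_id, customer_id, total in pending:
            # The field mapping between the two schemas lives here.
            erp_db.cursor().execute(
                "INSERT INTO dbo.ErpOrders (WebOrderId, CustomerId, Amount) VALUES (?, ?, ?)",
                order_id, customer_id, total)
            erp_db.commit()
            with web_db.cursor() as cur:
                cur.execute("UPDATE order_sync_queue SET transferred = 1 WHERE queue_id = %s", (queue_id,))
            web_db.commit()
    finally:
        web_db.close()
        erp_db.close()

while True:                         # near-real-time polling
    try:
        sync_pending_orders()
    except Exception as exc:        # plan for the ERP database being unavailable
        print("Transfer failed, will retry:", exc)
    time.sleep(30)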

Okay, based on the replies I got, I will rephrase my question, giving more details.
I have an eCommerce portal running on Joomla and VirtueMart (never mind what they are!).
The backend database here is MySQL.
I have an ERP written in .NET by my friend, and the DB used there is MS SQL.
Now I am going to host my eCommerce portal.
Following are the actions that will take place and the questions related to those actions.
Action 1:
At the start of the day my friend updates the inventory of various products in an ERP table.
Question 1:
I want the updated inventory from the ERP (MS SQL) to be reflected in my website database (MySQL) automatically. How do I do it?
Action 2:
People come to my site and place orders. These orders are stored in an orders table in my website database (MySQL).
Question 2:
I want this order-related data from my website (MySQL) to be updated in a corresponding table in my ERP (MS SQL).
Moreover, the structures of the tables in my ERP and my website are completely different.


I have text data, and want to get it into AWS

I have what is essentially a traditional relational database, consisting of four tables, all related with IDs. Currently this database resides in four tab-delimited text files, in an S3 bucket. Very little, if any, data will ever be added to these tables. It is an unchanging reference database. So it will be exclusively read from, never added to or edited.
I would like to access this database in an Alexa skill. I've built a few skills already, using NodeJS, so I know how that all works. But I'm anxious to learn how to link up a skill with a back-end DB. This skill will need to run SQL SELECT statements against this DB, based on user-provided parameters, and, based on the query filter, pull a set of records into an array that can be used by my skill's Lambda function.
Each of the current text files holds one of the four tables. The largest table is about 35k rows. The whole DB is maybe 5 MB, 90% of which is one of the four. Like I said, they are all connected with ID columns like a traditional RDBMS. This will not be for commercial purposes. Probably.
I am already familiar with SQL Server, it's the DB I know, and I'm comfortable with SQL Server Express and can whip something up there, but I'm open to learning NoSQL or some other method if it's more appropriate for this use case. And as this is mostly a learning exercise, if something is "just as good", it's good for me to know.
What is my best DB solution?
* NoSQL such as DynamoDB?
* Some sort of MySQL?
* SQL Server?
* Leave them as tab-delimited text and use them from the Lambda function directly?
Thanks, I don't want to start down the wrong road here.
A few options...
S3 Select
S3 Select (in Preview at the time of writing this) "enables applications to retrieve only a subset of data from an object by using simple SQL expressions. By using S3 Select to retrieve only the data needed by your application, you can achieve drastic performance increases – in many cases you can get as much as a 400% improvement."
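For example, with boto3 in Python, something like the following could filter one of the tab-delimited files in place. The bucket name, key and column positions are placeholders for illustration.
import boto3

s3 = boto3.client("s3")
resp = s3.select_object_content(
    Bucket="my-reference-bucket",            # placeholder bucket/key
    Key="tables/albums.tsv",
    ExpressionType="SQL",
    Expression="SELECT s._1, s._3 FROM s3object s WHERE s._2 = '1969'",
    InputSerialization={"CSV": {"FieldDelimiter": "\t", "FileHeaderInfo": "NONE"}},
    OutputSerialization={"CSV": {}},
)
for event in resp["Payload"]:
    if "Records" in event:
        print(event["Records"]["Payload"].decode("utf-8"))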
DynamoDB
The benefit of using DynamoDB is that there is no need to run a database server -- it is a fully-managed service. While it doesn't support SQL syntax, it is very fast and can suit many use-cases.
In fact, most projects should consider using a NoSQL database like DynamoDB for every situation, unless there is a particular reason to use SQL (such as business reporting).
Cost is based upon storage and provisioned capacity (which can scale-up and down based on demand).
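A lookup from a Lambda function is only a few lines with boto3; the table name and key below are made up for the sketch.
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("ReferenceAlbums")          # hypothetical table keyed on "album_id"
item = table.get_item(Key={"album_id": "A-1001"}).get("Item")
# Access is by key (or index) rather than arbitrary SQL joins,
# so the data model has to be designed around the queries you need.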
SQL Database
Yes, you can certainly run an SQL database, either through Amazon RDS (Relational Database Service) or on your own EC2 instance (e.g. MySQL or even Apache Derby). However, you are then paying for the server even when it isn't being used.
Using Microsoft SQL Server is probably too much for your use-case (and more expensive than using an open-source product).
I wonder if you could incorporate SQLite in your app, which would provide SQL capabilities without much overhead?
Do it in memory
5 MB is, quite frankly, not much data. You could simply load all the data into memory and do your manipulations from there. While the load might consume a few cycles, data access will be very quick after that.
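A sketch of that option, in Python for brevity (the same idea carries over to a NodeJS handler). Bucket, file, column and event field names are placeholders, and the files are assumed to have a header row.
import csv
import io
import boto3

s3 = boto3.client("s3")
TABLES = {}

def load_table(bucket, key, id_column):
    # Read one tab-delimited file from S3 into a dict keyed by its ID column.
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    reader = csv.DictReader(io.StringIO(body), delimiter="\t")
    return {row[id_column]: row for row in reader}

def handler(event, context):
    # Loaded once per container, then reused across invocations.
    if not TABLES:
        TABLES["albums"] = load_table("my-ref-bucket", "albums.tsv", "album_id")
    album = TABLES["albums"].get(event.get("album_id"))
    return {"found": album is not None, "album": album}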

Options for loading a bunch of disparate data sets into a 'target' schema

Background
5-10 data sources
Various formats (csv, psv, xml)
Different update schedules (weekly, monthly, quarterly)
Requirements
Only interested in some of the fields from each data source
Want to build a model from the various sources, into a single database (SQL Server)
Current platform/skillset
Azure
SQL Server
Considerations
Minimal code. Hopefully I can do this all via a UI/drag-and-drop interface.
Automation. Hoping I can drop the files onto a server when they need to be updated, and then "things" kick off (Azure Functions blob/FTP trigger?).
Questions
I haven't done much in the ETL space, but my initial thoughts point to something like SQL Server Integration Services, mainly because that's the only thing I've ever had experience with, ETL-wise.
Now that we have things like Azure Data Factory, SQL Data Warehouse, etc., would that be a better solution? Obviously the answer is "it depends", so what questions do I need to ask myself in order to clarify that? Can someone please point me to a good article to get started in this space?
TIA
The main question is where do you want to stage the data.
Many people are talking about Azure Data Lake as a staging area. There are pros and cons to this solution.
The pros: Azure Active Directory can be federated with your on-premises forest. Once that is done, regular access control lists (ACLs) can be used to restrict access.
The cons: you are using premium storage (SSD), which can cost a lot of money for a small to medium-sized company.
On the other hand, Azure Blob Storage has been around for a long time. One of the pros is the cost of this storage. A shared access signature (SAS) can be used to give anyone access to the account.
The con is that the SAS is the key to the whole kingdom. Unlike ADLS, you cannot assign privileges at the file level.
If you like SQL Server OPENROWSET or BULK INSERT, you are in for a treat. Support for those functions was added earlier this year.
Check out my article on MS SQL TIPS for the details.
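As a hedged sketch of what that can look like once the blob storage pieces are in place: this assumes an external data source named MyBlobStage and a SAS-backed database-scoped credential have already been created, and uses pyodbc purely as a way to run the T-SQL; server, table and file names are placeholders.
import pyodbc  # pip install pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver.database.windows.net;"
    "DATABASE=staging;UID=loader;PWD=secret"
)
# Load a blob-hosted CSV into a staging table via BULK INSERT.
conn.execute(
    """
    BULK INSERT dbo.VendorFeed
    FROM 'incoming/vendor_feed.csv'
    WITH (DATA_SOURCE = 'MyBlobStage', FORMAT = 'CSV', FIRSTROW = 2);
    """
)
conn.commit()
conn.close()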
As for scheduling, you can use a very simple PowerShell script in Azure Automation to create a hands-off process.
Azure Data Factory might be able to do some of these tasks; however, you are adding a lot more complexity than a simple T-SQL statement to load data into a table.
Last but not least, learn to love PowerShell. You can pretty much do any type of file processing with that language and the right .NET components.
Happy coding.
John Miner
The Crafty DBA

SQL Server move data between databases

We have a requirement where we will have to move data between different database instances on a regular basis (e.g., some customers are willing to pay more for better performance). So this is not going to be a one-off.
The database tables have referential integrity. Is there a way in which this can be done without rewriting an SQL script (or using some other method) every time we migrate a customer's data?
I came across this: How to move data between multiple database's table while maintaining foreign-key relationships/referential integrity?. However, it appears that we have to write a script every time we migrate data (please correct me if I misunderstood the answer on that thread).
Thanks
Edit:
Both servers are using SQL Server 2012 (the same version). It's an Azure SQL Server database.
They are not necessarily linked (no firewall between them).
We are only transferring some data, not the whole database. This is only for certain customers who opted to pay more.
The schemas are exactly the same in both databases.
Preyash - please see the documentation on the Split-Merge tool. The Split-Merge tool enables you to move data between databases, as you have described, based on a sharding key (e.g., customer ID). One modification that you will need for your application is to add a shard map (i.e., a database that understands the global state of which customers reside in which databases).
Have a look into Azure Data Sync. It is much more aligned with your requirements, but you may end up having another SQL Azure DB to maintain as a hub. Azure Data Sync follows a hub-and-spoke pattern and will let you do flexible directional syncs with a sync gap of a few minutes. It is simpler and can be set up very quickly without any scripts, as you wanted.

Oracle DB Access

I have a client/server application that currently has an Oracle 10g database. The company that I purchased the application from is not providing support. When I purchased the application, the company provided me with a SQL tool with read-only access to approximately 30-40 views.
Based on my analysis, the views provide some but not all of the data, and I want access to data which may be in other tables.
I am not a developer but the business owner, so excuse my naivety in some of the questions below.
Can I export/duplicate/replicate the Oracle DB to another Oracle DB, and will an Oracle DBA be able to view/access all the tables and understand the relationships?
What is the best way to create a duplicate DB that keeps in sync with the application DB which we currently have? We would like to use the duplicate DB as a backend for a website.
Thanks a lot!
ML
Assuming that the Oracle database resides on a server in your organization, it seems premature to be talking about replicating the data to a different database. It is certainly possible to do so. But you can also run many, many different applications against the same database. Unless you know that the current database server would not be able to cope with the additional workload of the new application, or you are planning on investing the time and effort to transform the data into a better data model as part of replicating it (which is extremely unlikely if you don't already know what the underlying data model is and whether it will work well for the new application), you probably want to start with the assumption that you can build the new application against the existing database.
A database developer or a DBA should be able (again, assuming that you own the server) to determine what underlying tables exist. That person should be able to at least get some idea of how the tables relate to each other based on the existing view definitions. If the original company did a good job building the database, a new developer/ DBA should have a relatively easy time understanding the relationships. If the original company did shoddy work or was intentionally secretive, it will be a more challenging undertaking.
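To make that concrete, the sort of thing a developer/DBA would run against the data dictionary looks roughly like this. It is a sketch in Python with the python-oracledb driver; the schema name and connection details are placeholders.
import oracledb  # pip install oracledb

conn = oracledb.connect(user="readonly", password="secret", dsn="erp-host/ORCL")
cur = conn.cursor()

# What tables exist in the application schema?
cur.execute("SELECT table_name FROM all_tables WHERE owner = :o ORDER BY table_name", o="ERPAPP")
print([row[0] for row in cur.fetchall()])

# The definitions of the 30-40 views you already have hint at how the base tables relate.
cur.execute("SELECT view_name, text FROM all_views WHERE owner = :o", o="ERPAPP")
for view_name, text in cur.fetchall():
    print(view_name)
    print(text)

conn.close()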

How do I interface an xBase based ERP to a web application?

I am required to setup a web application that will interact with an existing ERP system (WinMagi). The ERP is basically a front-end to an xBase (FoxPro) database. The database is located on an in-house server. The ERP, as far as I'm aware, doesn't have an API but can accept purchase orders, etc through an EDI module. The web application should be able to accept online orders and query data for reporting.
My plan so far:
Synchronize the xBase DB to a SQL server instance on a cloud hosted VM.
(one-way from ERP -> SQL Server)
Use this sync process as an interface between the ERP and web application.
Push purchase orders back to the ERP using EDI.
My thinking here is that it would be safer from a data concurrency perspective to create or update data in the ERP through a controlled and accepted (by the ERP) interface.
Questions/Concerns:
What is the best way to update the SQL DB from the xBase DB? Are there any pre-existing libraries that can do this so I don't have to reinvent the wheel?
Would the xBase DB become locked during the sync, or otherwise cause any issues for the live ERP?
How do I avoid data concurrency / integrity problems during the sync?
This system wouldn't be serving live data to the web app. What sort of issues can I expect due to this?
Should I prefer one language over another for this sort of project? My plan was to use Java/Hibernate MVC.
Am I perhaps going about this the wrong way? Would I be better off interfacing my web app directly with the xBase DB? Some problems that immediately spring to mind with this approach are networking issues between the office and the cloud-based VM and potential security vulnerabilities from opening up the ERP directly to the internet.
Any advice or suggestions you might be able to provide would be greatly appreciated!! Thanks in advance.
UPDATE - 3 Sep 2012
How I'm currently doing the data copy (it's not a synchronization) - runs nightly:
A linux box in the office copies the required DBFs from a read-only share on the ERP server to local storage.
The DBFs are converted to CSV using Dave Burton's fantastic dbf2csv Perl script.
The resulting CSVs are rsync'd to the remote VM. There are only small changes in the data so this is quite fast.
Once the rsync is complete the remote VM does a mysqlimport to the production DB.
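Roughly, that nightly job can be glued together with a short script. Here is a sketch in Python; the paths, hostnames, table list and the exact dbf2csv invocation are illustrative placeholders rather than my real setup.
import subprocess

TABLES = ["invoices", "invoice_lines", "stock"]   # placeholder table names

for t in TABLES:
    # 1. Copy the DBF from the read-only ERP share to local storage.
    subprocess.run(["cp", f"/mnt/erp_share/{t}.dbf", f"/var/staging/{t}.dbf"], check=True)
    # 2. Convert DBF -> CSV (the dbf2csv arguments shown are illustrative).
    subprocess.run(["dbf2csv", f"/var/staging/{t}.dbf", f"/var/staging/{t}.csv"], check=True)

# 3. Only the changed bytes go over the wire to the remote VM.
subprocess.run(["rsync", "-az", "/var/staging/", "webvm:/var/import/"], check=True)

# 4. Load the CSVs into the production MySQL DB on the remote VM.
subprocess.run(
    ["ssh", "webvm", "mysqlimport", "--local", "--fields-terminated-by=,", "production_db"]
    + [f"/var/import/{t}.csv" for t in TABLES],
    check=True,
)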
Advantages of this approach
The ERP cannot be damaged in any way as the network access is read-only.
No custom logic has to be implemented to sync data and hence there are no concerns that the data could be wrong on the remote VM.
As the data copy runs at night the run time isn't too important.
Current run time is approx 7 minutes for over 1 million records with approx 20-30 fields per record.
Longest phases are the DBF copy and conversion to CSV.
Disadvantages
The DBFs have to be copied in full every time.
The DBFs have to be converted in full every time.
Tables that are being copied are locked during the mysqlimport. This isn't really too much of an issue though as the import runs during the night and the mysqlimport only takes about 20 seconds.
If you are using Visual FoxPro 3.0 or greater, you could use the built-in database container (DBC) to create a connection to the SQL Server DB. Then the views in the .DBC would do the heavy lifting of reading and updating the SQL Server tables.
I would envision a routine that loops through your FoxPro table, reads the rows, and then makes the updates to the SQL Server DB. So the FoxPro tables shouldn't be locked. To ensure this, you could first query the DBFs into a cursor, then loop through the cursor.
I would suggest adding a procedure to do concurrency checking.
Another option to serve live FoxPro data in your web apps would be to create a linked server in SQL Server to your FoxPro database. That way your FoxPro data could be accessed in real time.
I am currently doing something similar - I have to make invoice transactions from a FoxPro-based system available through a web application that will be on a remote, hosted VM running SQL Server.
I will answer your first point based on what I'm doing - you can decide for yourself whether it would work for you!
What is the best way to update the SQL DB from the xBase DB? Are there any pre-existing libraries that can do this so I don't have to reinvent the wheel?
I didn't really look for any shared libraries. What I did was (somewhat simplified):
Added a field to the ERP-side transaction table that holds a CRC32 value based on other fields that I want to detect changes to (for example, the transaction balance).
Wrote a standalone EXE that scans the ERP-side transaction table on a timer, calculates a CRC32 value based on some fields, compares this to the last CRC32 value stored in the new field from point 1, and if different then something has changed and the transaction needs to be re-sent. This EXE was written in VFP for simplicity in accessing DBF files, and it runs as a Windows service. When I get time it will be re-done in C#.
Still in this EXE, once I have a list of new or changed transactions I convert them to JSON. I rolled my own JSON functions, but you could use Craig Boyd's from Sweet Potato Software or a number of others. There may be a PDF document associated with the transaction; if so, it is encoded and embedded in the JSON.
I send the JSON to a web service on the remote side using a class that leverages the standard Windows WinHTTP library (WinHttp.WinHttpRequest.5.1). The remote web service is essentially running Java. It decodes it all and updates the SQL Server.
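The same change-detection idea, sketched in Python rather than VFP: instead of writing the CRC back into the DBF, this version keeps the last-seen CRCs in a local JSON file. The dbfread library, field names, file paths and the web service URL are all assumptions for illustration.
import json
import os
import zlib
import requests                    # pip install requests
from dbfread import DBF            # pip install dbfread

STATE_FILE = "last_crcs.json"      # local store of the last CRC seen per transaction

def row_crc(rec):
    # CRC32 over the fields whose changes matter (e.g. the transaction balance).
    payload = "|".join(str(rec[f]) for f in ("TRANS_ID", "BALANCE", "STATUS"))
    return zlib.crc32(payload.encode("utf-8"))

def scan_and_push(dbf_path):
    seen = json.load(open(STATE_FILE)) if os.path.exists(STATE_FILE) else {}
    changed = []
    for rec in DBF(dbf_path):
        key = str(rec["TRANS_ID"])
        crc = row_crc(rec)
        if seen.get(key) != crc:                       # new or changed transaction
            changed.append({"id": key, "balance": str(rec["BALANCE"]), "status": rec["STATUS"]})
            seen[key] = crc
    if changed:
        requests.post("https://example.com/api/transactions", json=changed, timeout=30)
    with open(STATE_FILE, "w") as f:
        json.dump(seen, f)

scan_and_push("/mnt/erp_share/transactions.dbf")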
