Oracle database mirroring

We have an ERP application that stores its data in an Oracle database, and we also have a lot of other applications that use the ERP database: the same database, but different instances. We run into performance issues when the ERP and the other applications use the same database.
We are planning to split the database server into three: one for the ERP and two others for reporting and the other applications. The new database servers would be derived from the ERP database, because they use the same database structure and data, so you could say they are mirrors of the ERP database. Sometimes data on a mirror database will also be updated by another application, and that change should be propagated back to the ERP database.
What best practice and method should be used for mirroring under these conditions?
Is Oracle Data Guard enough for this?

Data Guard does not allow writing to the standby. Active Data Guard does allow reading from the standby while redo from the primary is being applied. So the report server using your ERP Mirror 1 is not a problem as long as it only reads data; writing from the other applications to ERP Mirror 2 is. What you are looking for is Advanced Replication or Oracle Streams, and that is a very complex undertaking. Maybe offloading your reporting to a Data Guard standby alone already solves your performance problems.
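As a rough sketch of the read-only part: this is how a physical standby is opened for queries while redo apply continues (which requires an Active Data Guard license; run on the standby):

    -- Pause redo apply, open the standby read-only, then restart apply
    -- so the standby keeps tracking the primary while serving queries.
    ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;
    ALTER DATABASE OPEN READ ONLY;
    ALTER DATABASE RECOVER MANAGED STANDBY DATABASE
      USING CURRENT LOGFILE DISCONNECT FROM SESSION;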

Related

Real-time report generation issue

We are developing an application that has 2.2 million customers, and we have built a REST API for it. The transaction-report API hits a huge table to generate a complex report. If more than 100k customers request this data at the same time, it hits my database server very hard. My question is: how can we generate the report without hitting the SQL Server database? And how do world-class organizations manage this type of application database?
We use ASP.NET MVC 5 and SQL Server.
There are at least two standard methods to generate a report without hitting the (OLTP) SQL Server database directly:
use snapshot isolation in the same OLTP database, so the report does not block other writing transactions and still reads consistent data (a sketch follows this list)
use data transferred (via ETL, log shipping, replication, ...) from the OLTP database to another (OLAP) database, i.e. to a warehouse and later to data marts/cubes
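A minimal sketch of the snapshot isolation option, assuming a database named ErpDb and a reporting query over a hypothetical dbo.Transactions table:

    -- One-time setup: allow snapshot isolation in the OLTP database.
    ALTER DATABASE ErpDb SET ALLOW_SNAPSHOT_ISOLATION ON;

    -- In the reporting session: the query sees a consistent snapshot and
    -- takes no shared locks, so concurrent writers are not blocked.
    SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
    BEGIN TRANSACTION;
    SELECT CustomerId, SUM(Amount) AS Total
    FROM dbo.Transactions
    GROUP BY CustomerId;
    COMMIT;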
you can also install another SQL Server instance in production and set up transactional replication on the database, then run the reports against the replicated copy

SQL Server move data between databases

We have a requirement to move data between different database instances on a regular basis (for example, some customers are willing to pay more for better performance), so this is not going to be a one-off.
The database tables have referential integrity. Is there a way this can be done without rewriting a SQL script (or some other method) every time we migrate a customer's data?
I came across this: How to move data between multiple database's table while maintaining foreign-key relationships/referential integrity?. However, it appears that we have to write a script every time we migrate data (please correct me if I misunderstood the answer on that thread).
Thanks
Edit:
Both servers are using SQL Server 2012 (same version). It's an Azure SQL database.
They are not necessarily linked (no firewall between them).
We are only transferring some data, not the whole database. This is only for certain customers who opted to pay more.
The schema is exactly the same in both databases.
Preyash - please see the documentation on the Split-Merge tool. The Split-Merge tool enables you to move data between databases, as you have described, based on a sharding key (e.g., customer ID). One modification that you will need in your application is a shard map (i.e., a database that understands the global state of which customer resides in which database).
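Conceptually, a shard map is just a lookup from sharding key to database; a simplified illustration (the real Split-Merge/elastic database tooling maintains this for you, and all names here are made up):

    -- One row per customer, recording which database currently holds
    -- that customer's data.
    CREATE TABLE dbo.ShardMap (
        CustomerId   INT     NOT NULL PRIMARY KEY,
        ServerName   SYSNAME NOT NULL,
        DatabaseName SYSNAME NOT NULL
    );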
Have a look at Azure Data Sync. It is much more aligned with your requirements, although you may end up maintaining another Azure SQL database to act as the hub: Azure Data Sync follows a hub-and-spoke pattern. It lets you set up flexible, directional syncs with a syncing gap of a few minutes, it is simpler, and it can be set up very quickly without any scripts, as you wanted.

Oracle DB Access

I have a client/server application that has an Oracle 10g database. The company that I purchased the application from is no longer providing support. When I purchased the application, the company provided me with a SQL tool that has read-only access to approximately 30-40 views.
Based on my analysis, the views provide some but not all of the data, and I want access to data that may be in other tables.
I am not a developer but the business owner, so excuse my naivety in some of the questions below.
Can I export/duplicate/replicate the Oracle DB to another Oracle DB, and will an Oracle DBA be able to view/access all the tables and understand the relationships?
What is the best way to create a duplicate DB that stays in sync with the application DB we currently have? We would like to use the duplicate DB as the backend for a website.
Thanks a lot!
ML
Assuming that the Oracle database resides on a server in your organization, it seems premature to be talking about replicating the data to a different database. It is certainly possible to do so, but you can also run many, many different applications against the same database. Unless you know that the current database server would not be able to cope with the additional workload of the new application, or you are planning to invest the time and effort to transform the data into a better data model as part of replicating it (which is extremely unlikely if you don't already know what the underlying data model is, let alone that it won't work well for the new application), you probably want to start with the assumption that you can build the new application against the existing database.
A database developer or a DBA should be able (again, assuming that you own the server) to determine what underlying tables exist, and should be able to get at least some idea of how the tables relate to each other from the existing view definitions. If the original company did a good job building the database, a new developer/DBA should have a relatively easy time understanding the relationships; if the original company did shoddy work or was intentionally secretive, it will be a more challenging undertaking.
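For what it's worth, the view definitions and their underlying tables can be read straight out of the Oracle data dictionary; a small sketch, assuming the vendor's objects live in a schema called APPSCHEMA:

    -- List the vendor's views together with their SQL definitions.
    SELECT view_name, text
    FROM   all_views
    WHERE  owner = 'APPSCHEMA';

    -- See which base tables each view depends on.
    SELECT name AS view_name, referenced_name AS base_table
    FROM   all_dependencies
    WHERE  owner = 'APPSCHEMA'
    AND    type = 'VIEW'
    AND    referenced_type = 'TABLE';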

How do I interface an xBase-based ERP to a web application?

I am required to set up a web application that will interact with an existing ERP system (WinMagi). The ERP is basically a front-end to an xBase (FoxPro) database. The database is located on an in-house server. The ERP, as far as I'm aware, doesn't have an API but can accept purchase orders, etc. through an EDI module. The web application should be able to accept online orders and query data for reporting.
My plan so far:
Synchronize the xBase DB to a SQL server instance on a cloud hosted VM.
(one-way from ERP -> SQL Server)
Use this sync process as an interface between the ERP and web application.
Push purchase orders back to the ERP using EDI.
My thinking here is that it would be safer from a data concurrency perspective to create or update data in the ERP through a controlled and accepted (by the ERP) interface.
Questions/Concerns:
What is the best way to update the SQL DB from the xBase DB? Are there any pre-existing libraries that can do this so I don't have to reinvent the wheel?
Would the xBase DB become locked during sync, or otherwise cause any issues for the live ERP?
How do I avoid data concurrency / integrity problems during the sync?
This system wouldn't be serving live data to the web app. What sort of issues can I expect due to this?
Should I prefer one language over another for this sort of project? My plan was to use Java/Hibernate MVC.
Am I perhaps going about this the wrong way? Would I be better off interfacing my web app directly with the xBase DB? Some problems that immediately spring to mind with this approach are networking issues between the office and the cloud-based VM and potential security vulnerabilities from opening up the ERP directly to the internet.
Any advice or suggestions you might be able to provide would be greatly appreciated!! Thanks in advance.
UPDATE - 3 Sep 2012
How I'm currently doing the data copy (it's not a synchronization) - runs nightly:
A Linux box in the office copies the required DBFs from a read-only share on the ERP server to local storage.
The DBFs are converted to CSV using Dave Burton's fantastic dbf2csv Perl script.
The resulting CSVs are rsync'd to the remote VM. There are only small changes in the data, so this is quite fast.
Once the rsync is complete, the remote VM does a mysqlimport into the production DB.
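For reference, the mysqlimport step is roughly equivalent to running one LOAD DATA statement per CSV file; a sketch, with the file path and table name made up:

    -- Roughly what mysqlimport --replace does for each CSV: REPLACE
    -- overwrites rows whose primary key already exists, which suits a
    -- full nightly re-import.
    LOAD DATA INFILE '/data/csv/transactions.csv'
    REPLACE INTO TABLE transactions
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n';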
Advantages of this approach
The ERP cannot be damaged in any way as the network access is read-only.
No custom logic has to be implemented to sync data and hence there are no concerns that the data could be wrong on the remote VM.
As the data copy runs at night the run time isn't too important.
Current run time is approx 7 minutes for over 1 million records with approx 20-30 fields per record.
Longest phases are the DBF copy and conversion to CSV.
Disadvantages
The DBFs have to be copied in full every time.
The DBFs have to be converted in full every time.
Tables that are being copied are locked during the mysqlimport. This isn't really too much of an issue though as the import runs during the night and the mysqlimport only takes about 20 seconds.
If you are using Visual FoxPro 3.0 or greater, you could use the built-in database container (DBC) to create a connection to the SQL Server DB. The views in the .DBC would then do the heavy lifting of reading and updating the SQL Server tables.
I would envision a routine that loops through your FoxPro table, reading the rows and then making the updates to the SQL Server DB. That way the FoxPro tables shouldn't be locked; to make sure of this, you could first query the DBFs into a cursor, then loop through the cursor.
I would suggest adding a procedure to do concurrency checking.
Another option, to serve live FoxPro data in your web apps, would be to create a linked server in SQL Server to your FoxPro database. That way your FoxPro data could be accessed in real time.
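A minimal sketch of that linked-server option, assuming the Visual FoxPro OLE DB provider (VFPOLEDB) is installed on the SQL Server machine; the server name and data path are made up:

    -- Register the FoxPro data directory as a linked server.
    EXEC sp_addlinkedserver
        @server = N'VFPDATA',
        @srvproduct = N'Visual FoxPro',
        @provider = N'VFPOLEDB',
        @datasrc = N'C:\erp\data';

    -- Query a DBF through the linked server.
    SELECT * FROM OPENQUERY(VFPDATA, 'SELECT * FROM invoices');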
I am currently doing something similar - I have to make invoice transactions from a FoxPro-based system available through a web application that will be on a remote, hosted VM running SQL Server.
I will answer your first point based on what I'm doing - you can decide for yourself whether it would work for you!
What is the best way to update the SQL DB from the xBase DB? Are there any pre-existing libraries that can do this so I don't have to reinvent the wheel?
I didn't really look for any shared libraries. What I did was (somewhat simplified):
Added a field to the ERP-side transaction table that holds a CRC32 value based on other fields that I want to detect changes to (for example, the transaction balance).
Wrote a standalone EXE that scans the ERP-side transaction table on a timer, calculates a CRC32 value based on some fields, compares this to the last CRC32 value stored in the new field from point 1, and if different then something has changed and the transaction needs to be re-sent. This EXE was written in VFP for simplicity in accessing DBF files, and it runs as a Windows service. When I get time it will be re-done in C#.
Still in this EXE, once I have a list of new or changed transactions, I convert them to JSON. I rolled my own JSON functions, but you could use Craig Boyd's from Sweet Potato Software or a number of others. There may be a PDF document associated with the transaction; if so, it is encoded and embedded in the JSON.
I send the JSON to a web service on the remote side using a class that leverages the standard Windows WinHTTP library (WinHttp.WinHttpRequest.5.1). The remote web service is essentially running Java. It decodes it all and updates SQL Server.
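The change-detection idea in the first two steps translates naturally to SQL as well; a hypothetical sketch using SQL Server's CHECKSUM in place of CRC32 (the table and columns are made up):

    -- Find rows whose current field values no longer match the stored
    -- hash; these are the new or changed transactions to re-send.
    SELECT t.TransactionId
    FROM   Transactions AS t
    WHERE  t.StoredChecksum IS NULL
       OR  t.StoredChecksum <> CHECKSUM(t.Balance, t.Status, t.CustomerId);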

How to automatically synchronize tables in different databases

A little background: I'm working in a large company with a lot of branches. We have several applications with separate databases, sometimes on different servers, but every database contains a table with the list of branches and their relationships. I want to automatically synchronize these tables whenever one of them changes.
My question is: what are the best practices of automatic synchronization of tables in different databases (Microsoft SQL Server 2008)?
Are there SQL Server features for that purpose? Is an external tool a good way? Or is it better to write a small application and run it as a service or via the scheduler?
You can use replication (a SQL Server built-in feature) to synchronize the different databases.
You can also use triggers or log shipping to sync your tables as records are added, updated, or deleted; a trigger-based sketch follows below.
The SQL Server documentation covers both replication and log shipping in detail.
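As an illustration of the trigger approach, a hypothetical trigger that pushes changes in a local Branches table to a copy on a linked server (LINKEDSRV, the database, and the columns are all made up; note this turns each local write into a distributed transaction):

    -- Re-sync the affected rows on the remote side whenever the local
    -- table changes: delete the old copies, then insert the new values.
    CREATE TRIGGER dbo.trgBranches_Sync
    ON dbo.Branches
    AFTER INSERT, UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;

        DELETE r
        FROM   LINKEDSRV.OtherDb.dbo.Branches AS r
        JOIN   inserted AS i ON i.BranchId = r.BranchId;

        INSERT INTO LINKEDSRV.OtherDb.dbo.Branches (BranchId, Name, ParentBranchId)
        SELECT BranchId, Name, ParentBranchId FROM inserted;
    END;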
