I have a question but I'm not sure of the word to use.
My problem: I have an application that uses a database to store information. The database can be in Access (local) or on a server (SQL Server or Oracle); we support these three kinds of databases. We want to give the user the ability to do what I think we can call versioning.
Let me explain: we have database 1. This is the master. We want to be able to create a database 2 that is identical to database 1, which we can give to someone else.
Each party then works on its own side, adding, modifying, and deleting records in this very complex database. Afterwards, we want database 1 to incorporate the changes from database 2, but with the possibility of dismissing some of those changes.
For your information, our application is already multi-user, so why don't we just use that and forget about this versioning? Because sometimes we need to give a copy of the database to another company on another site, and they can't connect to our server. They work on their side, and then we want to merge.
Is there anyone here with experience with this type of requirement? We have a lot of ideas, but most of them require a LOT of work: massive modifications to the database or to the existing queries.
This is a 2-million-line (and growing) C++ app, so rewriting it is not possible!
Thanks for any ideas that you may give us!
J-F
The term you are looking for is Database Replication. You can google that to get more information about the topic (my personal experience is limited).
This was already done by ical (an old SunOS calendar app).
What you store/remember/transmit when the app makes changes is not just the database contents, but the actual change log (e.g. "delete the record with ID 1", "update the record with ID 2 with these fields", "insert a record with these fields").
That way you can apply these changes to the master DB later on, AND filter them before applying.
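To make that concrete, here is a minimal sketch of what such a change-log table might look like. The table name, types, and serialization format are illustrative assumptions only (T-SQL flavoured; the same idea carries over to Access or Oracle):
-- Hypothetical change-log table, written to alongside every data change.
CREATE TABLE ChangeLog (
    ChangeId    INT IDENTITY(1,1) PRIMARY KEY,  -- preserves the order of changes
    TableName   VARCHAR(128) NOT NULL,          -- which table was touched
    Operation   CHAR(1)      NOT NULL,          -- 'I' insert, 'U' update, 'D' delete
    RecordId    INT          NOT NULL,          -- key of the affected record
    ChangedData VARCHAR(MAX) NULL,              -- serialized field values for I/U
    ChangedAt   DATETIME     NOT NULL DEFAULT GETDATE()
);
Merging database 2 back into database 1 then becomes a matter of replaying database 2's log rows in ChangeId order, skipping the ones you decide to dismiss.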
Related
I have a client/server application that currently has an Oracle 10g database. The company that I purchased the application from is not providing support. When I purchased the application, the company provided me with a SQL tool that has read-only access to approximately 30-40 views.
Based on my analysis, the views provide some but not all of the data, and I want access to data which may be in other tables.
I am not a developer but the business owner, so excuse my naivety in some of the questions below.
Can I export/duplicate/replicate the Oracle DB to another Oracle DB, and will an Oracle DBA be able to view/access all the tables and understand the relationships?
What is the best way to create a duplicate DB that stays in sync with the application DB we currently have? We would like to use the duplicate DB as the backend for a website.
Thanks a lot!
ML
Assuming that the Oracle database resides on a server in your organization, it seems premature to be talking about replicating the data to a different database. It is certainly possible to do so. But you can also run many, many different applications against the same database. You probably want to start with the assumption that you can build the new application against the existing database, unless you know that the current database server would not be able to cope with the additional workload, or you are planning on investing the time and effort to transform the data into a better data model as part of replicating it (which is extremely unlikely if you don't already know what the underlying data model is, or that it isn't going to work well for the new application).
A database developer or a DBA should be able (again, assuming that you own the server) to determine what underlying tables exist. That person should be able to get at least some idea of how the tables relate to each other based on the existing view definitions. If the original company did a good job building the database, a new developer/DBA should have a relatively easy time understanding the relationships. If the original company did shoddy work or was intentionally secretive, it will be a more challenging undertaking.
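If you do bring a developer in, Oracle's data dictionary is the natural starting point. A minimal sketch of the sort of queries they would run (assuming a sufficiently privileged account; 'APP_OWNER' is a placeholder for the application's schema name):
-- List the tables and views owned by the application schema
SELECT table_name FROM all_tables WHERE owner = 'APP_OWNER';
SELECT view_name  FROM all_views  WHERE owner = 'APP_OWNER';
-- Read the view definitions to see which base tables they draw from
SELECT view_name, text FROM all_views WHERE owner = 'APP_OWNER';
-- Declared foreign keys hint at how the tables relate
SELECT constraint_name, table_name, r_constraint_name
FROM   all_constraints
WHERE  owner = 'APP_OWNER' AND constraint_type = 'R';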
I have a web application that has an SQL database.
For clarity I'm using Asp.Net 4.0/c#/SQL Server 2008 Web edition.
I recently published the site, which was my first, by creating a deployment package for the database.
Now, a couple of months down the line, I need to update the database structure. The web application now has data that has been entered via the web, so I'll need to update the structure, then copy the data across.
As this is the first time I've done it, I'm unsure of the process I should follow - is there a standard practice for this kind of update?
Also, since some of the tables use incremental IDs, I need to ensure these remain the same in the newly updated database.
Any tips, links, advice appreciated.
Important Guidelines:
I assume you have not changed the structure entirely (meaning the key columns are the same, though there are workarounds for that too).
Steps are as follows:
Take an export of the database
Add or remove columns, or make whatever changes you want
Import the database back
Check the log for any rows/tables that were not updated successfully
Write SQL queries for them and run them to sync up
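On the point about incremental IDs: if the import recreates tables, SQL Server will normally regenerate IDENTITY values, so you have to copy them explicitly. A minimal T-SQL sketch, with placeholder table and column names:
-- Allow explicit values in the identity column of the rebuilt table
SET IDENTITY_INSERT dbo.Orders_New ON;
INSERT INTO dbo.Orders_New (OrderId, CustomerName, CreatedAt)
SELECT OrderId, CustomerName, CreatedAt
FROM   dbo.Orders_Old;   -- IDs are copied verbatim, not regenerated
SET IDENTITY_INSERT dbo.Orders_New OFF;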
Here are some general steps for this:
Take a backup of your online database and restore it locally
Modify the local database to suit your needs
Use a third-party comparison and synchronization tool to publish the changes to your production database
There are many of these available, and you can use them in trial mode to get the job done if you're on a tight budget. You can try tools from Red Gate, ApexSQL, Idera, Devart and others…
I am developing a multi-tenant app. I chose the "Shared Database/Separate Schemas" approach.
My idea is to have a default schema (dbo) and, when deploying changes to this schema, to apply the same update to the tenants' schemas (tenantA, tenantB, tenantC); in other words, to keep the schemas synchronized.
How can I synchronize the schemas of tenants with the default schema?
I am using SQL Server 2008.
The first thing you will need is a table or some other mechanism to store the version information of the schema, if nothing else so that you can bind your application and schema together. There is nothing more painful than running a version of the application against the wrong schema: it fails, corrupts data, etc.
The application should reject the database or shut down if it's not the right version. You might get some blowback when it's not right, but it protects you from the really bad day when the database corrupts your valuable data.
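A minimal version of that bookkeeping might look like this (names are illustrative, not a standard):
-- One row per schema upgrade that has been applied
CREATE TABLE dbo.SchemaVersion (
    VersionNumber INT      NOT NULL PRIMARY KEY,
    AppliedAt     DATETIME NOT NULL DEFAULT GETDATE()
);
-- On startup, the application checks it is talking to the schema it expects
SELECT MAX(VersionNumber) FROM dbo.SchemaVersion;
-- If that does not match the version the build was compiled against, refuse to run.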
You'll need a way to track changes, such as Subversion or something else. From SQL you can export the initial schema; from there, you will need a mechanism to track changes, using a nice tool like SQL Compare, and then match each schema change to an increment of the version number in the target database.
We keep each delta in a separate folder beneath the upgrade utility we built. This utility signs onto the server, reads the version info, and then applies the transform scripts from the next version onwards until it can find no more upgrade scripts in its subfolder. This gives us the ability to upgrade a database to the current version, no matter how old it is. If there are data transforms unique to a tenant, those are going to get tricky.
Of course you should always make a backup of the database to an external file, preferably with a human-identifiable version number so you can find it and restore it when the script(s) go bad. And eventually they will, so plan on figuring out how to recover and restore.
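In T-SQL that can be as simple as the following (the path and naming convention are just an example):
-- Back up to an external file with the schema version baked into its name
BACKUP DATABASE TenantDb
TO DISK = N'D:\Backups\TenantDb_v042.bak'
WITH INIT;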
I saw there is some sort of schema upgrader tool in the new VS 2010 but I haven't used it. That might also be useful to you.
There is no magic command to synchronize the schemas as far as I know. You would need to use a tool, either built in-house or bought (check out Red Gate's SQL Compare and SQL Examiner; you'll need to tweak them to compare different schemas).
Just synchronizing can often be tricky business, though. If you added a column, do you also need to fill that column with data? If you split a column into two new columns, there has to be conversion code for that sort of thing.
My suggestion would be to very carefully track any scripts that you run against the dbo schema and make sure that they also get run against the other schemas when appropriate. You can then use a tool like SQL Compare as an occasional sanity check to look for any unexpected differences.
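To illustrate what "run against the other schemas" can look like in practice, here is a rough sketch using dynamic SQL. It assumes your tenant schemas share a naming convention (tenantA, tenantB, ...) and shows a single hypothetical column addition, not a general framework:
-- Apply the same change to every tenant schema
DECLARE @schemaName SYSNAME, @sql NVARCHAR(MAX);
DECLARE tenant_cursor CURSOR FOR
    SELECT name FROM sys.schemas WHERE name LIKE 'tenant%';
OPEN tenant_cursor;
FETCH NEXT FROM tenant_cursor INTO @schemaName;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- The same ALTER you ran against dbo, retargeted at each tenant schema
    SET @sql = N'ALTER TABLE ' + QUOTENAME(@schemaName)
             + N'.Customers ADD MiddleName NVARCHAR(50) NULL;';
    EXEC sp_executesql @sql;
    FETCH NEXT FROM tenant_cursor INTO @schemaName;
END
CLOSE tenant_cursor;
DEALLOCATE tenant_cursor;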
I have a database in MySQL and another database that runs on MS SQL.
The MySQL is the backend database for my website running on Joomla.
I have an ERP running my store. This ERP was made by a 3rd party in .NET.
A table called orders gets updated whenever a user places an order on my website.
The order details must then get flushed to the orders table in my ERP.
The table structures in the two databases are totally different, so I will do the mapping myself.
My questions are:
How frequently should I transfer the data from my MySQL database to MS SQL?
Someone suggested that I could write a web service that would periodically pump data to my table in the ERP, so I started thinking about NuSOAP web services. Is this the right way, or is there a better way to do it?
I will also have to retrieve inventory-related information from my ERP to my MySQL database.
1: Depends on how often your data is changing, and how often you need to sync up (i.e., depends on your business).
2 & 3: A web service to transfer data could work just fine. But unless you're trying to come up with a general solution, this sounds like a lot more trouble than it's worth.
If I were doing this, I would export the data from SQL Server to a file, then import that file into MySQL (mysql my_db < file.sql).
Getting data OUT of SQL Server in this format isn't so easy (there's no equivalent of mysqldump for SQL Server), but check out this question for some ideas.
If the data itself is compatible between systems (if the columns are equivalent data types), you can overcome the table structure differences by just creating a query in SQL Server which exports the data in the correct order.
In fact, you may be able to create a query whose output is the file.sql for import into MySQL. For example, a query such as:
SELECT CONCAT(
    'INSERT INTO MYTABLE VALUES (',
    myColumn,        -- string columns would also need quoting and escaping
    ',',
    myOtherColumn,
    ');'
) AS SQL_STATEMENT
FROM MySourceTable   -- the SQL Server table being exported; note CONCAT
                     -- requires SQL Server 2012+, older versions use '+' with CAST
Produces output something like:
INSERT INTO MYTABLE VALUES (myColumnValue1, myOtherColumnValue1);
INSERT INTO MYTABLE VALUES (myColumnValue2, myOtherColumnValue2);
....
I've exported data from sql server that way on at least one occasion.
How up to date do you need the MS SQL database to be? That is going to be the deciding factor.
I don't see any huge advantage to this being a web service.
This isn't a question.
Deciding how often you transfer the orders across is a business decision, not a technical one. But it is hard to see what competitive advantage you might gain from not processing your customers' orders as soon as possible, so it ought to be a no-brainer.
Without knowing a lot more about your infrastructure and architecture, we cannot give you definitive advice about the approach. I would expect a decently written ERP package to include interfaces for importing and exporting information; alas, such expectations are often confounded. If you do need to write your own interface, avoid web services: unless you have a very peculiar set-up, all a WS will mean is that it takes longer to satisfy your customers, and I think we have already agreed that is not a good idea.
Considerations for a Synchronization API:
You need to track which new orders have not been transferred to the ERP database. A flag is clumsy; a queue is perhaps more elegant (see the sketch after this list).
Have a job/daemon polling continuously to identify orders which need to be transferred, and transfer them in near-real time.
Have a plan for handling the unavailability of the ERP database.
Construct the mapping in a modular fashion so you do not have to rewrite the entire thing just because of a change to the structure of one of your tables.
The inventory data will probably have to be pulled from the MySQL database, as it seems unlikely that the third party will allow you to put code into their database. But it's worth reading the contract.
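A minimal sketch of that queue idea on the MySQL side (the table and column names are invented placeholders, not anything Joomla or the ERP provides):
-- Every new order also gets a row here; the transfer job drains the queue.
CREATE TABLE order_transfer_queue (
    order_id    INT       NOT NULL PRIMARY KEY,
    enqueued_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- The polling job picks up whatever is pending...
SELECT order_id FROM order_transfer_queue;
-- ...and removes each row only after the corresponding ERP insert succeeds.
DELETE FROM order_transfer_queue WHERE order_id = 123;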
Okay, based on the replies I got, I will rephrase my question, giving more details.
I have an eCommerce portal running on Joomla and VirtueMart (never mind what they are!).
The backend database here is MySQL.
I have an ERP written in .NET by my friend, and the DB used there is MS SQL.
Now I am going to host my eCommerce portal.
Following are the actions that will take place, and questions related to those actions.
Action 1:
At the start of the day, my friend updates the inventory of various products in an ERP table.
Question 1:
I want the updated inventory from the ERP (MS SQL) to be reflected automatically in my website database (MySQL). How do I do it?
Action 2:
People come to my site and place orders. These orders are stored in an orders table on my website (MySQL).
Question 2:
I want this order data from my website (MySQL) to be updated in a corresponding table in my ERP (MS SQL).
Moreover, the structures of the tables in my ERP and my website are completely different.
I'm not sure if this is valid; however, I have a bugbear with SQL Server, and that is that I cannot organise objects into groups.
Imagine I'm working on a new section of work in a large database and I have perhaps 15 objects that I will be using regularly. What I want to do is sort of "favourite" them into a folder so that I don't have to trawl through all the objects in my databases.
I know I could organise objects by schema; however, these objects aren't necessarily schema-specific, as they cross boundaries.
Has anyone come across a method for organising objects into a favourites group? I know SQL Server projects organise scripts, but I can't see that they can organise tables.
Thanks
You can't do that with the native tools (SQL Server Management Studio) but there's a workaround: create a new empty database with those 15 tables - just the schema, not the data. Then when you're writing T-SQL code, you can quickly drag and drop elements out of those tables into your code.
The downside is that changes made in the real database won't be reflected in your working database, but you can automate that with a script to pull out the objects you need and recreate them in your working database. You can run that as often as you like (like every X hours, or as a SQL Agent job that runs when your local dev server starts up) without losing data, since you won't be modifying the structure in your "favorites" database.
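One quick way to create those schema-only copies is a SELECT ... INTO with a false predicate (database and table names here are placeholders; note that constraints and indexes are not copied, so SSMS's Generate Scripts wizard is the more faithful route):
-- Copies column structure only; the WHERE clause guarantees zero rows
SELECT *
INTO   FavoritesDb.dbo.Customers
FROM   ProductionDb.dbo.Customers
WHERE  1 = 0;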
I know I'm really late to the party, but the question showed up on the right under "Related" and I was curious enough to look.
There is a free add-in for Management Studio that seems to do exactly what you're asking:
http://www.sqltreeo.com/wp/dowload-free-ssms-add-in-to-create-own-folder-for-database-objects/
There is also a $65 commercial add-in which you may want to try as well. I haven't tried either so I'm not sure how well they work or what the paid version offers over the free add-in (if anything).
http://www.skilledsoftware.com/
It also can't hurt to vote for this Connect item and add a comment describing your business use case. While you may find it discouraging that it's been closed as Won't Fix, that is not necessarily a permanent decision:
http://connect.microsoft.com/SQLServer/feedback/details/209340