Data transfer from one database to another database in Odoo

I have one database and I want to transfer its data to a new database. All tables have the same fields in both databases. I can use the export feature of OpenERP, but I need to maintain the relationships between the Odoo tables, and there are so many tables that I don't know which ones to import first into the new database without breaking the data import for the other tables.
Is there any way I can do this easily and simply?

There are two ways in which you can take a backup:
- By hitting the database manager URL: server/web/database/manager
- By using the Import/Export and validation functionality that Odoo provides

- Backup: we can take a full backup of the system and store the zip file for future use. For that, we have to hit this URL: http://localhost:8069/web/database/manager
- Restore: in a similar manner, we can restore the database by uploading the zip file we downloaded earlier.
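Since Odoo stores everything in PostgreSQL, another option is to copy the whole database at the SQL level: pg_dump writes all table data before re-creating the foreign-key constraints, so the import-order problem disappears entirely. A minimal sketch, assuming shell access to the database server; the database names and user below are placeholders:

```python
import subprocess

# Placeholder names, adjust to your setup.
SOURCE_DB = "odoo_source"
TARGET_DB = "odoo_target"
PG_USER = "odoo"

# Dump schema and data together so every relationship between
# Odoo tables is preserved in one self-consistent archive.
subprocess.run(
    ["pg_dump", "-U", PG_USER, "-Fc", "-f", "/tmp/odoo_source.dump", SOURCE_DB],
    check=True,
)

# Create an empty target database owned by the Odoo user.
subprocess.run(["createdb", "-U", PG_USER, "-O", PG_USER, TARGET_DB], check=True)

# pg_restore loads all data first and only then re-creates the
# foreign-key constraints, so no manual table ordering is needed.
subprocess.run(
    ["pg_restore", "-U", PG_USER, "-d", TARGET_DB, "/tmp/odoo_source.dump"],
    check=True,
)
```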

Related

Database archive strategy

We have an Oracle database holding 10 years of data. We want to archive it, as the application is getting slow due to the large volume of data. The issue is that the system still needs to access the old data if a user wants it. How can we design for that?
For example, if we archive the data from 2010 to 2015 in an archive database and delete it from the current database, the application would query a table holding date ranges and then connect to the appropriate database, i.e. current or archive. In some cases it may also need to fetch data from both databases.
How to archive records in multiple tables within one access database
The solution above talks about archiving, but I need a strategy for when a user wants to access the old records.
Thanks
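For what it's worth, the date-based routing described above can be sketched as a small helper that checks the requested range against the archive cutoff and queries one or both databases. A minimal illustration using the python-oracledb driver; the cutoff date, DSNs, credentials and the orders table are all assumptions, not part of the question:

```python
from datetime import date
import oracledb  # python-oracledb; any DB-API 2.0 driver works the same way

ARCHIVE_CUTOFF = date(2016, 1, 1)  # hypothetical: rows before this live in the archive DB

def _connect(dsn):
    # Hypothetical credentials, replace with your own.
    return oracledb.connect(user="app", password="secret", dsn=dsn)

def fetch_orders(date_from, date_to):
    """Query the current DB, the archive DB, or both, based on the date range."""
    dsns = []
    if date_from < ARCHIVE_CUTOFF:
        dsns.append("db-archive/APP")   # placeholder archive DSN
    if date_to >= ARCHIVE_CUTOFF:
        dsns.append("db-current/APP")   # placeholder current DSN
    rows = []
    for dsn in dsns:
        with _connect(dsn) as conn:
            cur = conn.cursor()
            cur.execute(
                "SELECT * FROM orders WHERE order_date BETWEEN :d1 AND :d2",
                d1=date_from, d2=date_to,
            )
            rows.extend(cur.fetchall())
    return rows
```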

Restore data dumped from an Oracle database instance to a SQL Server database instance?

I would like to know the steps to restore data dumped from an Oracle database into a SQL Server database.
Our purpose is to get data from an external Oracle database outside our organization. Due to security concerns, the team that manages the data source refused to let us transfer the data through an ODBC server link. Instead, they dumped the selected tables we need so that we can restore the data inside our organization. Each table's data files include a .sql file to create the table and constraints, a ".ctl" file, and one or more ".ldr" files.
An extra complication: one of the tables contains a BLOB column that stores a lot of binary files, such as PDFs. This column accounts for most of the size of the dumped files; otherwise I could ask them to send us the data in Excel directly.
Can someone suggest what route we should take?
Either get them to export the data in an open format, or load it into an Oracle instance you have full control over. The .ctl and .ldr files suggest they used the old SQL*Loader.
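If loading the files yourselves is the route you take, one pragmatic approach is to treat the .ldr files as delimited text and bulk-insert them into SQL Server. A rough sketch with pyodbc; the connection string, file name, delimiter, table and columns are placeholders, and the BLOB column would need separate handling (e.g. reading each binary file and binding it as a varbinary parameter):

```python
import csv
import pyodbc

# Everything here is a placeholder: adjust the connection string,
# file name, table and columns to the .sql file the source team sent.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
    "DATABASE=staging;Trusted_Connection=yes;"
)
cur = conn.cursor()
cur.fast_executemany = True  # much faster bulk inserts

# Assuming the .ldr file is pipe-delimited text; check the .ctl file
# for the real delimiter and column order.
with open("customers_01.ldr", newline="", encoding="utf-8") as f:
    rows = [(int(cid), name, created)
            for cid, name, created in csv.reader(f, delimiter="|")]

cur.executemany(
    "INSERT INTO dbo.customers (id, name, created) VALUES (?, ?, ?)",
    rows,
)
conn.commit()
```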

How to add data to a database from a .bak file by SQL query

I have backup files and I want to add that data to the existing database, not overwrite it.
Scenario:
The software is installed on different systems and the data is entered from different PCs, but the data is of the same type. Backups (.bak files) are taken at run time, and now I want to combine all that data into a single database without overwriting anything. I have tried How to restore to a different database in sql server?, but that option is exhausted, and after so much googling I can't find an answer; I only found how to overwrite the database, but I don't want to lose any of the data. Thanks
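One common pattern for this, sketched below under stated assumptions: restore each .bak under a temporary database name so nothing is overwritten, copy the rows into the main database with INSERT ... SELECT, then drop the temporary database. All paths, logical file names (check them with RESTORE FILELISTONLY) and table names are placeholders, and clashing identity values or duplicate keys between the source PCs still have to be resolved:

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
    "DATABASE=master;Trusted_Connection=yes;",
    autocommit=True,  # RESTORE cannot run inside a transaction
)
cur = conn.cursor()

# 1. Restore the .bak under a *temporary* name so nothing is overwritten.
cur.execute("""
    RESTORE DATABASE MergeSource
    FROM DISK = 'C:\\backups\\branch_office.bak'
    WITH MOVE 'AppData'     TO 'C:\\data\\MergeSource.mdf',
         MOVE 'AppData_log' TO 'C:\\data\\MergeSource_log.ldf'
""")
while cur.nextset():  # drain informational result sets so the restore completes
    pass

# 2. Copy the rows into the main database; repeat per table.
cur.execute("""
    INSERT INTO MainDB.dbo.customers (name, created)
    SELECT name, created FROM MergeSource.dbo.customers
""")

# 3. Drop the temporary database once everything is merged.
cur.execute("DROP DATABASE MergeSource")
```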

Is it performant to put all Log tables into a different database catalog in SQL Server?

Supposing we have a web application that uses a SQL Server 2005 database, would it be better for performance to move all our custom log tables to a specific catalog?
Scenario
Our web application today uses different catalogs in SQL Server. Each catalog has tables related to a problem (domain/subject): db_financial, db_corporative, etc.
These catalogs already have many different log tables that register a history of the changes made by users during application usage: tb_log_product, tb_log_customer, tb_log_provider_prices, etc.
The goal
The goal is to know whether there is any advantage in moving the log tables to a specific catalog.
These log tables can hold lots of data, so I was wondering whether it is a good idea to move all of them to a different catalog such as db_log (or whether I should keep the log tables in the catalogs they are in now).
Logs are mostly used for auditing purposes and to keep history of what-happened and who-dun-it. If you have a database called db_operations and table such as tb_customer, I recommend that your log-table tb_log_customer be in the same database (db_operations).
Keeping them in the same database will allow you to take backups of both customer and customer-log table as a single unit of work. If your log was in a different database such as db_logs, you would have to back up db_operations and db_logs at the same time and still not get a pristine restore. Same issue applies to log shipping and mirroring techniques.
To manage the log tables, I'd recommend creating filegroup(s). The log tables can go on these filegroups, and the filegroup's path can be on a different volume/controller. To manage the size of the log tables, I propose deleting history after a certain period of time. I'd recommend taking a look at partitioning as well.
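For illustration, this is roughly what the filegroup approach looks like when driven from Python with pyodbc; the database name, file path and log table below are placeholders rather than anything from the question:

```python
import pyodbc

# Sketch only: database, path and table names are placeholders.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
    "DATABASE=db_financial;Trusted_Connection=yes;",
    autocommit=True,  # ALTER DATABASE cannot run inside a transaction
)
cur = conn.cursor()

# Add a filegroup for log tables, backed by a file on a separate volume.
cur.execute("ALTER DATABASE db_financial ADD FILEGROUP fg_logs")
cur.execute("""
    ALTER DATABASE db_financial
    ADD FILE (NAME = 'db_financial_logs',
              FILENAME = 'E:\\sqldata\\db_financial_logs.ndf')
    TO FILEGROUP fg_logs
""")

# New log tables are then created ON that filegroup: log I/O moves to
# the other volume while the table stays in the same database (and
# therefore in the same backup, as recommended above).
cur.execute("""
    CREATE TABLE dbo.tb_log_customer (
        log_id      INT IDENTITY PRIMARY KEY,
        customer_id INT NOT NULL,
        changed_at  DATETIME NOT NULL DEFAULT GETDATE(),
        changed_by  SYSNAME NOT NULL
    ) ON fg_logs
""")
```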

Database save, restore and copy

An application runs training sessions. The environment for each session (like a "mission" or "level" in games) is stored in a database.
Before starting a session, the user can choose which of many available databases to use.
During the session, the database may be modified.
After the session, the changed database is usually discarded, but sometimes it may be saved under a new or the same name.
Databases are often copied between non-connected computers (on a flash card).
If environment were stored in plain files, it would be easy: copy, load, save.
We currently use a similar approach: we store databases as MS SQL backups, copy and save them as files, and load them into the actual DBMS when a session starts. The main problem is modification: when the database schema changes, all the backups must be updated, which is error-prone.
Storing everything in a single database with an additional "environment id" relationship and providing utilities to load, save and copy environments seems too complex for the task.
What other possible ways are there to design for this functionality? This problem is probably not unique and must have some thought-out solution.
Firstly, I think you need to dispense with the idea of SQL Backups for this and shift to tables that record data changes.
Then you have a source database containing all your regular tables, plus another table that records a list of saved versions of it.
So table X might contain columns TestID, TestDesc, TestDesc2, etc.
Then you might have a table that contains SavedDBID, SavedDBTitle, etc.
Next, for each table X you have a table X_Changes. This has the same columns as table X, but also includes a SavedDBID column. This would be used to record any changed rows between the source database and the Saved one for a given SavedDBID.
When the user logs on, you create a clone of the source database. Then you use the Changes tables to make the clone's tables reflect the saved version. As the user updates the main tables in the clone, the changed rows should also be updated in the clone's Changes tables.
If the user decides to save their copy, use the Clone's changes tables to record the differences between the Source and the Clone in the original database, then discard the Clone.
I hope this is understandable. It will certainly make any schema changes easier to immediately reflect in the 'backups' as you'd only have one database schema to change. I think this is much more straightforward than using SQL Backups.
As for copying databases around using flash cards, you can give them a copy of the source database but only including info on the sessions they want.
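To make the scheme concrete, here is a minimal sketch of the tables and the "apply changes" step for a single table X, using the column names from the answer. It is a sketch under stated assumptions (SQL Server dialect, pyodbc, updates only); inserted and deleted rows would need a marker column or separate tables:

```python
# Sketch of the schema described above (SQL Server dialect);
# all names are illustrative.
DDL = """
CREATE TABLE X (
    TestID    INT PRIMARY KEY,
    TestDesc  NVARCHAR(200),
    TestDesc2 NVARCHAR(200)
);
CREATE TABLE SavedDB (
    SavedDBID    INT IDENTITY PRIMARY KEY,
    SavedDBTitle NVARCHAR(100) NOT NULL
);
-- Same columns as X, plus the key of the saved version it belongs to.
CREATE TABLE X_Changes (
    SavedDBID INT NOT NULL REFERENCES SavedDB(SavedDBID),
    TestID    INT NOT NULL,
    TestDesc  NVARCHAR(200),
    TestDesc2 NVARCHAR(200),
    PRIMARY KEY (SavedDBID, TestID)
);
"""

# Replay one saved version onto a freshly cloned X: overwrite each row
# that has a recorded change for the chosen SavedDBID.
APPLY_CHANGES = """
UPDATE x
SET    x.TestDesc  = c.TestDesc,
       x.TestDesc2 = c.TestDesc2
FROM   X AS x
JOIN   X_Changes AS c ON c.TestID = x.TestID
WHERE  c.SavedDBID = ?
"""

def apply_saved_version(conn, saved_db_id):
    """conn is a pyodbc connection to the clone database."""
    cur = conn.cursor()
    cur.execute(APPLY_CHANGES, (saved_db_id,))
    conn.commit()
```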
As one possible solution: virtualise your SQL Server. You can have multiple SQL Servers if you want, and you can clone and roll them back independently.
