MySQL Auto Export as .SQL Whenever Data Changes

I am trying to sync a local and a remote MySQL DB. I have completed the remote-side work and need an idea on how to export the local MySQL DB whenever the database changes. Any idea or existing technique?

If you want to "sync" two MySQL servers without using replication, you can use Percona Toolkit, specifically the tool called "pt-table-sync". See here:
http://www.percona.com/doc/percona-toolkit/2.2/pt-table-sync.html
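As a rough sketch (host names, database name, and credentials below are placeholders, not from the question), pt-table-sync can compare a database on two servers and bring them in line. Preview first, then apply:

# Preview the changes without applying anything
pt-table-sync --print h=localhost,D=mydb,u=local_user,p=local_pass h=remote.example.com,D=mydb,u=remote_user,p=remote_pass
# Apply the changes once the preview looks right
pt-table-sync --execute h=localhost,D=mydb,u=local_user,p=local_pass h=remote.example.com,D=mydb,u=remote_user,p=remote_pass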

Related

How to copy data from one database to another in Google Cloud

I am working on the Google Cloud App Engine platform.
I have to copy the data from one database on one instance to another database on another instance.
Both databases are Postgres 13:
Instance a:
database a_a;
Instance b:
database b_b;
I have to copy a_a's data into b_b.
I just want to copy the data without copying the entire database structure.
Is there a way to export and then import the data?
How can I do it?
Solved by using gcloud sql export sql gs://<bucket_name>/sqldumpfile.gz --database=db_name --offload and the right permissions for the service account.
All the source is here
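For reference, the export/import pair might look roughly like this (instance and bucket names are placeholders; the Cloud SQL service account needs write access to the bucket for the export and read access for the import):

# Export database a_a from instance a to a dump file in Cloud Storage
gcloud sql export sql instance-a gs://<bucket_name>/sqldumpfile.gz --database=a_a --offload
# Import that dump into database b_b on instance b
gcloud sql import sql instance-b gs://<bucket_name>/sqldumpfile.gz --database=b_b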

how to mirror a whole database cluster in postgresql

I'm using a PostgreSQL (9.6) database in my project, which is currently in the development stage.
For production I want to use an exact copy/mirror of the database cluster, with a slightly different name.
I am aware that I can make a backup and restore it under a different cluster name, but is there something like a mirror function via the psql client or pgAdmin (v4) that mirrors all my schemas and tables and puts them under a new cluster name?
In PostgreSQL you can use any existing database (which needs to be idle in order for this to work) on the server as a template when you want to create a new database with that content. You can use the following SQL statement:
CREATE DATABASE newdb WITH TEMPLATE someDbName OWNER dbuser;
But you need to make sure no user is currently connected to or using that database, otherwise you will get the following error:
ERROR: source database "someDbName" is being accessed by other users
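If you cannot simply wait for the other sessions to finish, a rough sketch of forcing them off before cloning (assuming you can connect as a superuser from a shell; note that unquoted database names fold to lowercase, hence somedbname here) is:

# Kick every other session off the template database (match the exact name stored in pg_database)
psql -U postgres -d postgres -c "SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = 'somedbname' AND pid <> pg_backend_pid();"
# With the template now idle, the clone succeeds
psql -U postgres -d postgres -c "CREATE DATABASE newdb WITH TEMPLATE somedbname OWNER dbuser;"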
Hope that helped ;)

Import DB2 files to SQL Server

Given the DAT file and the DDL file for each table in a DB2 database, can I import this data to SQL Server? I have no access to the original server or any copy of a DB2 server so connecting to a live instance isn't an option.
Can I do this without a live instance of DB2 or should I go back to the client and ask for CSV files? Is there a procedure or tool that makes this process smoother? I've tried to find a file-based connection string to use to connect to a set of DB2 files with no luck. I've also tried SwissSQLDB2ToSQLServer and SqlLinesData to see if they have a file-based option built in.
OK, given the comment above: you can't import DB2's container files (DAT, LRG, or anything else) directly. You need a CSV or equivalent. Yes, one way to get this is to run the EXPORT utility on a live DB2 database. HTH!
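If the client can run the EXPORT utility for you, the workflow is roughly this (all database, table, and server names below are placeholders): export each table to a delimited file on the DB2 side, then bulk-load it into a pre-created table on the SQL Server side, for example with bcp:

# On the DB2 side: dump one table to a comma-delimited file
db2 connect to SOURCEDB
db2 "EXPORT TO mytable.csv OF DEL SELECT * FROM myschema.mytable"
# On the SQL Server side: bulk-load the file into an existing table
bcp TargetDb.dbo.mytable in mytable.csv -S sqlserver_host -U sql_user -P sql_password -c -t,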

Where does Jubula keep its default tests database?

I would like to access Jubula's default database from outside Jubula or, at least, copy it entirely and move it to another computer. There is a reason I cannot access it from inside and export the tests as XML. Is there a way to do it?
It uses the H2 database by default, which is stored in your user's library. I don't know whether it's possible to access it from other applications or another machine, but I recommend not doing that.
What you want to do is set up a "conventional" database, like MySQL or PostgreSQL, and store your tests there. It has some benefits: multiple users can access it at the same time, you can easily make backups, etc. Just don't forget to install the Jubula database drivers, like I did.
To save your existing tests ("copy it entirely") into this new database, export all Projects to XML. Then disconnect from your H2 database and select the new database from your connections. Then import the XML file you've just exported, and all your tests will be there. You'll find these commands under the Test menu.

Export MSSQL DB, Import in shared environment

We are in the process of trying to migrate from a VPS to a shared environment. The VPS is running Studio Express 2005 and is therefore quite limited in terms of export functionality.
I have managed to export a database in .bak format and upload (restore) it to the shared environment.
However, here comes the problem: the schema has come along with the database, causing problems when connecting via ASP.
The table name structure is as follows: [SCHEMA].[TABLE_NAME].
The shared environment does not allow changing the schema or many advanced features. (It's running myLittleAdmin.)
So I guess the schema changes would have to be done on the database, which would then be exported and imported again.
PS: I'm new to MSSQL and more experienced in MySQL.
OK, so I have found a solution to this.
Export the schema from Studio Express using Right Click > Tools > Generate Script.
Execute this script on the server.
Open this file and find and replace the old user with your new one.
Use a tool such as SQL Dumper (http://sqldumper.ruizata.com/) to export the DB to .SQL.
Find and replace on this file too, again changing the old user to the new one.
Copy this SQL and execute it on the server.
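For the find-and-replace steps, a quick sketch from a shell (the user/schema names here are placeholders; a text editor's find-and-replace works just as well on Windows):

# Rewrite the old schema prefix to the new one in the generated scripts
sed -i 's/\[OLD_USER\]\./[NEW_USER]./g' schema_script.sql
sed -i 's/\[OLD_USER\]\./[NEW_USER]./g' dump.sql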
Job done!
Joe
