I am working on the Google Cloud App Engine platform.
I have to copy the data from a database on one instance to a database on another instance:
Both databases are Postgres 13.
Instance a:
database a_a;
Instance b:
database b_b;
I have to copy "a_a"'s data into b_b;
I just want to copy the data without copying the entire database structure.
Is there a way to export and then import the data?
How can I do it?
Solved by using gcloud sql export sql <instance_name> gs://<bucket_name>/sqldumpfile.gz --database=<db_name> --offload and granting the right permissions to the instance's service account.
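For reference, the export/import pair looks roughly like this (instance names, bucket name and file name are placeholders; the Cloud SQL service account of each instance needs read/write access to the bucket):
gcloud sql export sql instance-a gs://<bucket_name>/sqldumpfile.gz --database=a_a --offload
gcloud sql import sql instance-b gs://<bucket_name>/sqldumpfile.gz --database=b_b
Note that the SQL dump also contains the CREATE statements; if you really only want the rows, an alternative is a plain pg_dump --data-only against the source database, restored into the target with psql.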
Related
I have a Spring Boot application that needs to be deployed on PCF. I want to use an H2 database for local testing, but after it is deployed to PCF I will be using SQL Server. Each database needs its own schema, so I have two schema.sql files: one for H2 and another for SQL Server. How can I tell Spring that schema-h2.sql should be used for the local profile and schema-sqlserver.sql for the cloud profile?
You can set the spring.datasource.platform property to differentiate the schema and data SQL files.
For example, with
spring.datasource.platform=h2
the files should be named data-h2.sql and schema-h2.sql.
Make sure you also set spring.datasource.initialization-mode=always.
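A minimal sketch of how this could be wired to profiles, assuming profile-specific property files (the file and profile names are just examples):
# application-local.properties - used with the "local" profile, H2
spring.datasource.platform=h2
spring.datasource.initialization-mode=always
# application-cloud.properties - used with the "cloud" profile, SQL Server
spring.datasource.platform=sqlserver
spring.datasource.initialization-mode=always
With spring.datasource.platform=sqlserver active, Spring Boot will pick up schema-sqlserver.sql and data-sqlserver.sql instead of the H2 versions.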
I'm very new to Azure and have been tasked with automating the process of taking an existing version of our database, converting it to the newer version and then uploading that to Azure.
The conversion is done; that part's easy. What I'm struggling with is getting a .bacpac file out of SSMS using PowerShell. I know I can use the Export Data-tier Application function in SSMS to do this, but I need it to be automated. From there I can use something like the following to actually upload the database:
https://blogs.msdn.microsoft.com/brunoterkaly/2013/09/26/how-to-export-an-on-premises-sql-server-database-to-windows-azure-storage/
I have looked around and cannot find a solution to this, or even figure out where to start.
You can create bacpacs of your on-premises databases and place them in a local folder (c:\MyBacpacs) using SqlPackage:
sqlpackage.exe /Action:Export /SourceServerName:. /sdn:"DB_Foo" /tf:"c:\MyBacpacs\DB_Foo.bacpac"
You can then use AzCopy to upload the bacpacs to Azure Blob storage:
AzCopy /Source:"c:\MyBacpacs" /Dest:"https://exampleaccount.blob.core.windows.net/bacpacs" /DestKey:storageaccountkey /Pattern:*.bacpac
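If you want the whole thing as one automated step, a rough PowerShell sketch along these lines could chain the two tools (paths, server name and storage key are assumptions to replace with your own; it also assumes sqlpackage.exe and AzCopy are on the PATH):
# export-and-upload.ps1 - export a bacpac with SqlPackage, then upload it with AzCopy
$db        = "DB_Foo"
$bacpacDir = "C:\MyBacpacs"
$bacpac    = Join-Path $bacpacDir "$db.bacpac"
# 1. Export the data-tier application to a .bacpac
& sqlpackage.exe /Action:Export /SourceServerName:. "/sdn:$db" "/tf:$bacpac"
# 2. Upload every bacpac in the folder to the blob container
& AzCopy.exe /Source:$bacpacDir /Dest:"https://exampleaccount.blob.core.windows.net/bacpacs" /DestKey:"<storage_account_key>" /Pattern:"*.bacpac"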
I'm migrating a database from Azure VM to Azure SQL Database. I tried to use the "Deploy Database to Azure SQL Database" function in SSMS but it failed several times, seemingly due to the size of the database (110 GB). So I made a copy of the source database on the same source server, truncated the table with the majority of the data in it, then tried the deploy again. Success.
Now I need to get that data from the original source table into the destination table. I've tried two different approaches, and both gave errors.
1. In SSMS, I connected to both SQL servers and ran the following while connected to the destination database:
INSERT INTO dbo.DestinationTable
SELECT *
FROM [SourceServer].[SourceDatabase].dbo.SourceTable
With that I was given the error:
Reference to database and/or server name in
'SourceServer.SourceDatabase.dbo.SourceTable' is not supported in this
version of SQL Server.
2. In SSMS, I used the Export Data Wizard from the source table. When trying to start that job, I received this error during the validation phase:
Error 0xc0202049: Data Flow Task 1: Failure inserting into the
read-only column "CaptureId"
How can I accomplish what should be this seemingly simple task?
Instead of using "Deploy Database to Azure SQL Database" directly for large databases, you could try the steps below:
1. Extract a bacpac on a local machine.
2. Copy the bacpac to blob storage.
3. When creating the database, use the bacpac in blob storage as the source (see the sketch below).
This approach worked for us and is very fast. You may also have to ensure that the blob is in the same region as the Azure SQL server.
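For step 3, besides the portal's Import database option, something like the Azure CLI's az sql db import can load the bacpac from blob storage; a rough sketch with placeholder names (you may need to create an empty target database first):
az sql db import --resource-group my-rg --server my-sqlserver --name MyDatabase --admin-user <sql_admin> --admin-password <sql_admin_password> --storage-key-type StorageAccessKey --storage-key "<storage_account_key>" --storage-uri "https://exampleaccount.blob.core.windows.net/bacpacs/DB_Foo.bacpac"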
I've solved the issue. Using option #2, you simply need to tick the "Enable identity insert" checkbox and it works with no errors. This checkbox is inside the Edit Mappings sub-menu of the Export Data Wizard.
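For reference, that checkbox corresponds to SQL Server's IDENTITY_INSERT setting. If you ever end up doing the final copy in T-SQL on the destination instead (for example from a staging table you bulk-loaded, since cross-server queries are not supported in Azure SQL Database), the equivalent looks roughly like this (table and column names are made up):
-- Allow explicit values to be inserted into the identity column for this session
SET IDENTITY_INSERT dbo.DestinationTable ON;
INSERT INTO dbo.DestinationTable (CaptureId, Col1, Col2)
SELECT CaptureId, Col1, Col2
FROM dbo.StagingTable;
SET IDENTITY_INSERT dbo.DestinationTable OFF;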
I am trying to sync a local and a remote MySQL database. I have completed the remote-side work and now need an idea of how to export the local MySQL database whenever the database changes. Any ideas or existing techniques?
If you want to "sync" between two MySQL servers without using replication, you can use the Percona Toolkit tool called pt-table-sync. See here:
http://www.percona.com/doc/percona-toolkit/2.2/pt-table-sync.html
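A typical invocation looks something like this (host, database and table names are placeholders; --print only shows the statements it would run, --execute actually applies them to bring the second DSN in sync with the first):
pt-table-sync --print h=localhost,D=mydb,t=mytable h=remote.example.com,D=mydb,t=mytable
pt-table-sync --execute h=localhost,D=mydb,t=mytable h=remote.example.com,D=mydb,t=mytable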
We are in the process of trying to migrate from a VPS to a shared environment. The VPS is running Studio Express 2005, so it is quite limited in terms of export functionality.
I have managed to export a database in .bak format and upload (Restore) it to the shared environment.
However, here comes the problem: the schema has come along with the database, which causes problems when connecting via ASP.
The table name structure is as follows [SCHEMA].[TABLE_NAME].
The shared environment does not allow changing the schema or many other advanced features (it's running myLittleAdmin).
So I guess the schema changes would have to be done on the database, which would then be exported and imported.
P.S. I'm new to MSSQL and more experienced with MySQL.
OK, so I have found a solution to this.
Export the schema from Studio Express using Right Click > Tools > Generate Script.
Open the generated script and find-and-replace the old user with your new one.
Execute this script on the server.
Use a tool such as SQL Dumper (http://sqldumper.ruizata.com/) to export the DB to .SQL.
Find and replace on this file too, again swapping the old user for the new one.
Copy this SQL and execute it on the server.
Job done!
Joe
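If the shared environment lets you run plain T-SQL, another option is to move the tables into the target schema with ALTER SCHEMA ... TRANSFER instead of editing the scripts; a sketch assuming the old schema is called olduser and the target is dbo:
-- Repeat for each table that currently sits under the old schema
ALTER SCHEMA dbo TRANSFER olduser.TABLE_NAME;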