I am trying to migrate the Heroku shared database from app1 to ClearDB in app2, and I get the following error message:
$ heroku pgbackups:restore DATABASE "app1 database url" --app app2
DATABASE_URL does not match any of your databases
DATABASE (DATABASE_URL) <---restore--- b002.dump
! invalid format
! or to_url must be a postgres URL
You cannot use a Postgres backup against a MySQL database.
The only way to do this is via taps, which pushes the data through ActiveRecord. More information here:
http://devcenter.heroku.com/articles/taps
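With the legacy taps-based commands, the migration might be sketched like this (the app names come from the question; the local database URL is a placeholder, and the exact syntax can differ between toolbelt versions):

```shell
# Pull app1's Postgres data into a local database via taps
# (the local URL is a hypothetical example)
heroku db:pull postgres://localhost/app1_copy --app app1

# Push the local copy to app2; taps writes to app2's DATABASE_URL,
# which is the ClearDB MySQL database
heroku db:push postgres://localhost/app1_copy --app app2
```

Because taps moves rows through ActiveRecord rather than a database-native dump format, it can cross the Postgres/MySQL boundary, but it is slow for large datasets.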
This is my first Django deployment, on Google App Engine flexible.
After deployment, all static content is rendered correctly, but as soon as a request hits any ORM code, it fails (server error 503). I'm using PostgreSQL, and I think I've linked it correctly in my settings.py and app.yaml:
DB_HOST: '/cloudsql/my-app:us-central1:my-app-postgresql'.
DB_NAME: 'my-db' ...
Do I need to know anything special about deploying Django with PostgreSQL to App Engine? During the deployment, will all tables and data be recreated?
Finally, I deployed with DEBUG = True and see ProgrammingError at /: relation "my-app_my-app" does not exist.
How do I export my whole database to a Google Cloud SQL database?
Based on this, it looks like I will not be able to use the ORM.
Do I have to use "Django Nonrel"?
While searching, it seems the database transfer conflicts with the proxy-connection step that uses the cloud_sql_proxy.exe file, but I deleted the DB and repeated that step without success.
Finally, my solution consists of two steps:
After deployment, connect over SSH to the instance:
• From Google console > App Engine > Instances > SSH > display the gcloud command (faster) and copy the command.
• In your gcloud SDK, connect to your instance with the copied command, then:
Run sudo docker ps and copy your app's CONTAINER ID, then:
sudo docker exec -it <CONTAINER ID> /bin/bash
python3 manage.py makemigrations
python3 manage.py migrate
python3 manage.py createsuperuser
This will create all the necessary tables. If you don't have any data in your local PostgreSQL, you're done; otherwise, continue to step 2.
Dump your database and import it into Cloud SQL (doc)
• From your local machine, dump your Postgres database with:
pg_dump -h localhost -U postgres my_database_name | gzip > backup.gz
• Upload the backup.gz file to your bucket.
• From the Postgres instance in Cloud SQL, import your dump from the bucket.
Don't forget to delete it from the bucket afterwards, in case it is in a public folder.
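The upload and import steps above can be sketched with the gcloud CLI (the bucket name is a placeholder, and the instance name assumes the Cloud SQL instance from the question's connection string, my-app-postgresql):

```shell
# Upload the dump to a Cloud Storage bucket (bucket name is hypothetical)
gsutil cp backup.gz gs://my-backup-bucket/backup.gz

# Import it into the Cloud SQL Postgres instance; gzip-compressed
# SQL dumps are accepted by the import command
gcloud sql import sql my-app-postgresql gs://my-backup-bucket/backup.gz \
  --database=my_database_name
```

The Cloud SQL service account needs read access to the bucket object for the import to succeed.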
I just had the same issue as you today, and I solved it.
If you're like me, kinda new to all this cloud stuff, there is one rule to remember: by default, everything is locked down and blacklisted. That means that by default your App Engine service doesn't have the right to connect to your DB. That's why you get a 503: the request is rejected.
There are two ways to solve it: use the private IP address of your DB, or the public IP.
I chose the private IP, because the request stays inside my VPC, which is more secure.
Here is the GCP documentation, but to be quick:
I created a VPC connector in the same region as my project.
I added the VPC connector to my app.yaml file:
vpc_access_connector:
name: projects/my-project-12345/locations/europe-west1/connectors/my-connector
And it works like a charm!
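Creating the connector itself can be sketched with the gcloud CLI (the connector name matches the app.yaml snippet above; the network and IP range are assumptions):

```shell
# Create a Serverless VPC Access connector in the app's region
# (network and range are hypothetical examples)
gcloud compute networks vpc-access connectors create my-connector \
  --region=europe-west1 \
  --network=default \
  --range=10.8.0.0/28
```

The region must match the one referenced in the app.yaml vpc_access_connector name, and the /28 range must not overlap other subnets in the VPC.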
PS: also, if you are like me and ran 1,000,000 tests, don't forget to delete all unused versions of your app, because I've read that they cost money.
First off, I want to say that I am not a DB expert and I have no experience with the Heroku service.
I want to deploy a Play Framework application to Heroku, and I need a database to do so. So I created a PostgreSQL database with this command, since it's supported by Heroku:
Users-MacBook-Air:~ user$ heroku addons:create heroku-postgresql -a name_of_app
And I got this as response
Creating heroku-postgresql on ⬢ benchmarkingsoccerclubs... free
Database has been created and is available
! This database is empty. If upgrading, you can transfer
! data from another database with pg:copy
So the DB now exists but is empty, of course. For development I worked with a local H2 database.
Now I want to populate the DB on Heroku using an SQL file, since it's quite a lot of data, but I couldn't find out how to do that. Is there a command for the Heroku CLI where I can hand over the SQL file as an argument and it populates the database? The file basically consists of a few tables that get created and around 10,000 INSERT statements.
EDIT: I also have CSV files of all the tables, so a way to populate the Postgres DB with those would also be great.
First, run the following to get your database's name:
heroku pg:info --app <name_of_app>
In the output, note the value of "Add-on", which should look something like this:
Add-on: postgresql-angular-12345
Then, issue the following command:
heroku pg:psql <Add-on> --app <name_of_app> < my_sql_file.sql
For example (assuming your sql commands are in file test.sql):
heroku pg:psql postgresql-angular-12345 --app my_cool_app < test.sql
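For the CSV files mentioned in the edit, a sketch using psql's \copy through the same command (the table and file names are assumptions, and the table must already exist, e.g. created by the DDL part of your SQL file):

```shell
# Load a local CSV with a header row into an existing table;
# \copy streams the file from your machine, not the server
heroku pg:psql postgresql-angular-12345 --app my_cool_app \
  --command "\copy my_table FROM 'my_table.csv' WITH (FORMAT csv, HEADER true)"
```

One invocation per table; the CSV column order must match the table definition (or list the columns explicitly after the table name).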
So I'm trying to manually move Magento to a new server with the old-school FTP/phpMyAdmin method. I only have the SQL dump and the folders from the root of the old server, no access to it anymore, and I don't know the previous Magento version number.
Should I
do a fresh install of Magento on the new server and then substitute the folders and somehow import the SQL?
or
dump the files and SQL onto the new server first and then run the installer (or something else)?
Many thanks
Usually the following steps are followed:
Empty your cache and admin sessions
Empty your logs (MySQL)
Create a backup of your database and your files
Check that your new server supports Magento using this
Upload your files and import your DB. If your database is very large, import it directly into your MySQL server using an SQL command, like this:
mysql -u username -p database_name < file.sql
For this you need to be logged in to your server (MySQL).
Update your MySQL credentials in your etc file and update the path in the core_config_data table.
Place some dummy orders to check that emails are delivered correctly and that payment, shipping, etc. work.
Buy SSL certificates and avoid shared hosting.
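The core_config_data update mentioned above can be sketched like this (the domain is a placeholder; web/unsecure/base_url and web/secure/base_url are Magento's standard base-URL config paths):

```shell
# Point Magento's base URLs at the new server (example domain)
mysql -u username -p database_name -e \
  "UPDATE core_config_data SET value = 'http://www.example.com/' \
   WHERE path IN ('web/unsecure/base_url', 'web/secure/base_url');"
```

After changing these rows, clear Magento's cache so the new URLs take effect.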
I have created a database on the StratosLive server and my database URL is this:
jdbc:mysql://rss1.stratoslive.wso2.com/karshamarkuptool_karsha_opensource_lk
I tried Database Console > Tools > Back Up and it asks me for these credentials:
Target file name: ~/backup.zip Source directory:
jdbc:mysql://rss1.stratoslive.wso2.com/karshamarkuptool_karsha_opensource_lk
Source database name: karshamarkuptool_karsha_opensource_lk
Are my credentials right? It says there is no database found in the source directory.
If not, what is the way to get a backup of a Stratos database? How can I configure it to take an automatic weekly backup?
If you have MySQL installed in your local setup, you can take a backup just the same way you would back up a database that resides on your local database server. For example, the following command would get you a backup of a database that you created under the RSS manager of the StratosLive Data Services Server:
mysqldump -u your_username -pyour_password -h rss1.stratoslive.wso2.com database_name > local_file_system_path/backup.sql
(the host name is the one from the JDBC URL given to you)
Cheers,
Prabath
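The question also asked about an automatic weekly backup, which the answer doesn't cover. One common approach is a crontab entry on any machine that has mysql installed (schedule, paths, and credentials below are placeholders; note that % must be escaped as \% inside crontab):

```shell
# m h dom mon dow  command
# Every Sunday at 03:00: dump, compress, and timestamp the backup file
0 3 * * 0 mysqldump -u your_username -pyour_password -h rss1.stratoslive.wso2.com database_name | gzip > /path/to/backups/backup-$(date +\%F).sql.gz
```

Storing the password in the crontab line is convenient but insecure; a ~/.my.cnf file with restricted permissions is the usual alternative.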
I'm new to Heroku and currently trying to check out a copy of an app's database that's hosted on Heroku.
When I run "heroku db:pull --app myapp" I get a warning that looks like:
WARNING: Potentially Destructive Action
! This command will affect the app: myapp
I'm confused as to how pulling a copy of the DB from the production server can affect the live app. Most importantly, does it do anything to the database in production?
Thanks,
gearoid.
No, it won't do anything bad to your production database. The warning simply means it will DESTROY your local database when you pull; likewise, a PUSH would destroy the remote database.