This is my first Django deployment, on Google App Engine flexible.
After deployment, all static content is rendered correctly, but as soon as a request hits any ORM code, it fails (server error 503). I'm using PostgreSQL and I think I've linked it correctly in my settings.py and app.yaml:
DB_HOST: '/cloudsql/my-app:us-central1:my-app-postgresql'.
DB_NAME: 'my-db' ...
Do I need to know anything special about deploying Django with PostgreSQL to App Engine? During deployment, will all tables and data be recreated?
Finally, I deployed with DEBUG = True and now see ProgrammingError at /: relation "my-app_my-app" does not exist.
How do I export my whole database to the Google Cloud SQL database?
Based on this, it looks like I won't be able to use the ORM.
Do I have to use "Django Nonrel"?
From my searching, it seems the database transfer is supposed to happen at the proxy-connection step, using the cloud_sql_proxy.exe file, but I deleted the DB and repeated that step without success.
In the end, my solution consists of 2 steps:
Step 1: after deployment, connect over SSH to the App Engine instance:
• From the Google console > App Engine > Instances > SSH > display the gcloud command (faster) and copy the command.
• In your gcloud SDK shell, connect to your instance with the copied command.
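For reference, the copied command usually looks roughly like this (the instance, service, version, and project names below are placeholders, not values from this deployment):
gcloud app instances ssh "aef-default-20200101t000000-abcd" --service "default" --version "20200101t000000" --project "my-project"
Once connected: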
Run sudo docker ps and copy your app's CONTAINER ID, then:
sudo docker exec -it <CONTAINER ID> /bin/bash
python3 manage.py makemigrations
python3 manage.py migrate
python3 manage.py createsuperuser
This will create all the necessary tables. If you don't have any data in your local PostgreSQL, you're done; otherwise, continue with step 2.
Step 2: dump your database and import it into Cloud SQL (doc)
• From your local machine, dump your Postgres database with:
pg_dump -h localhost -U postgres my_database_name | gzip > backup.gz
• Upload the backup.gz file to a Cloud Storage bucket.
• From your Cloud SQL Postgres instance, import the dump from your bucket (see the sketch after this list).
Don't forget to delete the dump from the bucket afterwards, especially if it is in a publicly accessible folder.
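A rough sketch of the upload and import from the CLI (the bucket name is a placeholder; the instance and database names are taken from the question above, and the Cloud SQL service account needs read access to the bucket):
# upload the dump to a Cloud Storage bucket
gsutil cp backup.gz gs://my-staging-bucket/backup.gz
# import the gzipped dump into the Cloud SQL Postgres instance, targeting your database
gcloud sql import sql my-app-postgresql gs://my-staging-bucket/backup.gz --database=my-db
# remove the dump from the bucket when the import is done
gsutil rm gs://my-staging-bucket/backup.gz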
I just had the same issue as you today, and I solved it.
If you're like me and somewhat new to all this cloud stuff, there is one rule to remember: by default, everything is locked down. That means that by default your App Engine service doesn't have the right to connect to your DB; that's why you get a 503, because the request is rejected.
There are two ways to solve it: by using the private IP address of your DB, or the public IP.
I chose the private IP because the request stays inside my VPC, which is more secure.
Here is the GCP documentation.
But to be quick:
I created a VPC connector in the same region as my project.
I added the VPC connector to my app.yaml file:
vpc_access_connector:
name: projects/my-project-12345/locations/europe-west1/connectors/my-connector
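For the first step, a minimal sketch of creating the connector from the gcloud CLI (the network and IP range are assumptions; the connector name and region match the app.yaml entry above):
# enable the Serverless VPC Access API, then create the connector
gcloud services enable vpcaccess.googleapis.com
gcloud compute networks vpc-access connectors create my-connector --region=europe-west1 --network=default --range=10.8.0.0/28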
And it works like a charm!
PS: Also, if you are like me and did 1,000,000 tests, don't forget to delete all unused versions of your app, because I've read that they cost money.
Not sure if this is a stupid question, but I'm running a Neo4j database server (using Apollo Server) from my React application. Currently, I run it using node in a separate terminal (and I can navigate to it on localhost), then run npm start in a different terminal to get my application going. How can I keep the database up and running all the time, so that if customers use the product they can always access the database? Or, if this isn't good practice, how can I establish the database connection while I run my client code?
Technologies being used: ReactJS, Neo4j Database, GraphQL + urql
I tried moving the Apollo server code into the App.tsx file of my application to run it from there directly when my app is launched, but this was giving me errors. I'm not sure if this is the proper way to do it, as I think it should be abstracted out of the client code?
If you want to run your server in the cloud so that customers can access your React application, you need two things:
• A server/service to run your database, e.g. Neo4j AuraDB (Free/Pro) or other cloud marketplaces: https://neo4j.com/docs/operations-manual/current/cloud-deployments/
• A service to run your React application, e.g. Netlify, Vercel, or one of the cloud providers (GCP, AWS, Azure), which you then have to configure with the server URL + credentials of your Neo4j server
You can run neo4j-admin dump --to database.dump on your local instance to create a copy of your database content and upload it to the cloud service. For Neo4j 5.x the syntax is different, I think something like neo4j-admin database dump --path folder (see the sketch below).
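A minimal sketch of both forms, assuming a database named neo4j and a local /backups directory (flag names vary between major versions, so check neo4j-admin's --help output for your install):
# Neo4j 4.x
neo4j-admin dump --database=neo4j --to=/backups/neo4j.dump
# Neo4j 5.x (the flag may be --to-path rather than --path, depending on the exact version)
neo4j-admin database dump neo4j --to-path=/backups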
I have a Rails 5 app deployed with Google App Engine using Cloud SQL for MySQL following their tutorial.
When I run a database migration,
bundle exec rake appengine:exec -- bundle exec rake db:migrate
I get a deprecation warning:
WARNING: This command is deprecated and will be removed on or after 2018-10-31. Please use `gcloud builds submit` instead.
Before I go off on a vision quest to sort this out, has anyone else converted their Rails app to use gcloud builds for rake tasks like this? Mind sharing the gist? Thanks!
Go to the Cloud SQL Instances page in the Google Cloud Platform Console. ...
Select the instance you want to add the database to.
Select the Databases tab.
Click Create database.
In the Create a database dialog, specify the name of the database, and optionally the character set and collation. ...
Click Create.
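For reference, the same database can also be created from the command line; a quick sketch, assuming an instance named my-instance and a database named my-db:
# charset and collation are optional, as in the dialog above
gcloud sql databases create my-db --instance=my-instance --charset=utf8mb4 --collation=utf8mb4_general_ci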
If this isn't what you're looking for, then try to start over.
I ended up finding this answer which goes through installing cloud sql proxy so you can run the migration locally:
RAILS_ENV=production bin/rails db:migrate
I'm still interested in a new way to easily execute the command in the cloud, but running locally with a db proxy totally works for now.
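For anyone curious, a rough sketch of the proxy-plus-local-migration approach (the instance connection name and port are placeholders, and this assumes config/database.yml points the production database host at 127.0.0.1 while the proxy is running):
# start the Cloud SQL Proxy (v1 syntax), listening locally on port 3306
./cloud_sql_proxy -instances=my-project:us-central1:my-instance=tcp:3306 &
# run the migration locally over the proxied connection
RAILS_ENV=production bin/rails db:migrate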
First off, I want to say that I am not a DB expert and I have no experience with the Heroku service.
I want to deploy a Play Framework application to Heroku, and I need a database to do so. So I created a PostgreSQL database with this command, since it's supported by Heroku:
Users-MacBook-Air:~ user$ heroku addons:create heroku-postgresql -a name_of_app
And I got this as response
Creating heroku-postgresql on ⬢ benchmarkingsoccerclubs... free
Database has been created and is available
! This database is empty. If upgrading, you can transfer
! data from another database with pg:copy
So the DB now exists, but of course it is empty. For development I worked with a local H2 database.
Now I want to populate the DB on Heroku using an SQL file, since it's quite a lot of data, but I couldn't find out how to do that. Is there a command for the Heroku CLI where I can hand over the SQL file as an argument and it populates the database? The file basically consists of a few tables which get created and around 10,000 INSERT commands.
EDIT: I also have CSV files for all the tables, so a way to populate the Postgres DB from those would also be great.
First, run the following to get your database's name
heroku pg:info --app <name_of_app>
In the output, note the value of "Add-on", which should look something like this:
Add-on: postgresql-angular-12345
Then, issue the following command:
heroku pg:psql <Add-on> --app <name_of_app> < my_sql_file.sql
For example (assuming your sql commands are in file test.sql):
heroku pg:psql postgresql-angular-12345 --app my_cool_app < test.sql
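For the CSV files mentioned in the question's edit, one option is psql's \copy meta-command, which reads a local file and streams it over the same connection; a sketch with a hypothetical table and file name:
# load a local CSV into an existing table through the same pg:psql connection
echo "\\copy players FROM 'players.csv' WITH (FORMAT csv, HEADER)" | heroku pg:psql postgresql-angular-12345 --app my_cool_app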
I am very new to Google App Engine, please help me solve my problem.
I have created an instance in Google Cloud SQL; when I import an SQL file, it shows me an error like this:
ERROR 1227 (42000) at line 1088: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
How do I add the SUPER privilege to my instance?
As stated in the Cloud SQL documentation:
The SUPER privilege is not supported.
You can take a look at this page that explains how to import data to a Cloud SQL instance.
I also faced the same issue, but the problem was in the dumped SQL database. When exporting the database, use these flags:
--hex-blob --skip-triggers --set-gtid-purged=OFF
Here is the complete documentation of how to do it: https://cloud.google.com/sql/docs/mysql/import-export/importing. Once the data is exported, it can be imported using the command line, the gcloud shell, or the import option in the Cloud SQL console.
I used the import feature of the Cloud SQL console and it worked for me.
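A sketch of the command-line route, staging the dump in a Cloud Storage bucket (host, user, database, bucket, and instance names below are placeholders):
# export with the flags above so the dump imports cleanly into Cloud SQL
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF -h localhost -u root -p my_database > dump.sql
# stage the dump in a bucket, then import it into the Cloud SQL instance
gsutil cp dump.sql gs://my-staging-bucket/dump.sql
gcloud sql import sql my-instance gs://my-staging-bucket/dump.sql --database=my_database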
I ran into the same error when importing a gzipped dump (procured with mysqldump from a MySQL 5.1 instance) into a Google Cloud SQL instance of MySQL 5.6. The following statement in the SQL file was the problem:
DEFINER=`username`@`%`
The solution that worked for me was removing all instances of it using sed:
cat db-2018-08-30.sql | sed -e 's/DEFINER=`username`@`%`//g' > db-2018-08-30-CLEANED.sql
After removing them, the import completed with no errors. Apparently the SUPER privilege, which isn't available in Google Cloud SQL, is needed to set a DEFINER.
Another reference: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
Good luck!
I faced the same issue. You could try giving the user the SUPER privilege, but it isn't available in GCP Cloud SQL.
The statement
DEFINER=`username`@`%`
is an issue in your backup dump.
The workaround is to remove all such entries from the SQL dump file and then import the data from the GCP console.
cat DUMP_FILE_NAME.sql | sed -e 's/DEFINER=`<username>`@`%`//g' > NEW-CLEANED-DUMP.sql
After removing the entries from the dump, you can try re-importing.
For the use case of copying between databases within the same instance, it seems the only way to do this is using mysqldump, to which you have to pass some special flags so that it works without SUPER privileges. This is how I copied from one database to another:
DB_HOST=... # set to 127.0.0.1 if using cloud-sql proxy
DB_USER=...
DB_PASSWORD=...
SOURCE_DB=...
DESTINATION_DB=...
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF --column-statistics=0 -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $SOURCE_DB \
| mysql -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $DESTINATION_DB
Or if you just want to dump to a local file and do something else with it later:
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF --column-statistics=0 -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $SOURCE_DB \
> $SOURCE_DB.sql
See https://cloud.google.com/sql/docs/mysql/import-export/exporting#export-mysqldump for more info.
It's about how the data is exported. When you export from the console, it exports the whole instance, not just the schema, which requires the SUPER privilege for the project in which it was created. To export data to another project, export by targeting only the schema(s) in the advanced options. If you run into "could not find storage or object", save the exported schema locally, then upload it to your other project's storage and select it from there.
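The gcloud equivalent of that schema-targeted export looks roughly like this (instance, bucket, and database names are placeholders):
# export only the named database/schema to a bucket
gcloud sql export sql my-instance gs://my-bucket/dump.sql --database=my_database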
In case somebody is searching for this in 2018 (at least August), the solution is:
Create a database. You can do this from the UI: just go to the Databases menu and click "Create database".
After you click "Import" and select your SQL dump (previously saved in a bucket), press "Show advanced options" and select your DB (not that advanced, is it?!). Otherwise, the default is the system mysql database, which of course cannot support the import.
Happy importing.
I solved this by creating a new database in the SQL instance (the default database is sys for MySQL).
Steps (non-CLI version):
1) In GCP > SQL > Databases, create a new database, e.g. newdb
2) At the top of your SQL script, add: USE newdb; (see the sketch below)
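A quick way to do step 2 without editing the file by hand; a sketch, assuming the dump is in dump.sql and the new database is called newdb:
# prepend a USE statement so the import targets the new database
{ echo "USE newdb;"; cat dump.sql; } > dump-with-use.sql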
Hope that helps someone
The SUPER privilege is reserved exclusively for GCP itself.
To answer your question: you need to import the data into your own database, one in which you have permissions.
I am trying to migrate the Heroku shared database from app1 to ClearDB in app2 and I get the following error message:
$ heroku pgbackups:restore DATABASE "app1 database url" --app app2
DATABASE_URL does not match any of your databases
DATABASE (DATABASE_URL) <---restore--- b002.dump
! invalid format
! or to_url must be a postgres URL
You cannot use a postgres backup against a MySQL database.
The only way to do this is via taps, which pushes the data through ActiveRecord. More information here:
http://devcenter.heroku.com/articles/taps