I'm new to Heroku and I'm currently trying to check out a copy of an app's database that's hosted on Heroku.
When I run "heroku db:pull --app myapp" I get a warning that looks like:
WARNING: Potentially Destructive Action
! This command will affect the app: myapp
I'm confused as to how pulling a copy of the db from the production server can affect the live app. Most importantly, does it do anything to the production database?
Thanks,
gearoid.
No, it won't do anything bad to your production database. The warning simply means it will DESTROY your local database when you pull; likewise, a PUSH would destroy the remote database.
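If it helps: as far as I remember, the old taps-based db:pull also accepts an explicit local database URL as an argument, which makes it obvious which side gets overwritten (the local URL below is just an example):
heroku db:pull postgres://localhost/myapp_development --app myapp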
So, not sure if this is stupid to ask, but I'm running a Neo4j database server (using Apollo Server) from my React application. Currently, I run it using node in a separate terminal (and I can navigate to it on localhost), then run npm start in a different terminal to get my application going. How can I keep the database up and running at all times, so that if customers use the product they can always access the database? Or, if this isn't good practice, how can I establish the database connection while I run my client code?
Technologies being used: ReactJS, Neo4j Database, GraphQL + urql
I tried moving the Apollo server code into the App.tsx file of my application to run it from there directly when my app is launched, but this was giving me errors. I'm not sure if this is the proper way to do it, as I think it should be abstracted out of the client code?
If you want to run your server in the cloud so that customers can access your React application, you need two things:
One server/service to run your database, e.g. Neo4j AuraDB (Free/Pro) or one of the cloud marketplaces: https://neo4j.com/docs/operations-manual/current/cloud-deployments/
A service to run your React application, e.g. Netlify, Vercel, or one of the cloud providers (GCP, AWS, Azure), which you then have to configure with the server URL + credentials of your Neo4j server
You can run neo4j-admin dump --to=database.dump on your local instance to create a copy of your database content and upload it to the cloud service. For 5.x the syntax is different; I think it's neo4j-admin database dump <database> --to-path=<folder>.
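Once the database is running in AuraDB (or wherever), your Apollo server just needs the URL + credentials, typically from environment variables. A minimal sketch using the official neo4j-driver package (the variable names are my own, not from your setup):
const neo4j = require('neo4j-driver');
// for AuraDB the URI looks like neo4j+s://xxxxxxxx.databases.neo4j.io
const driver = neo4j.driver(
  process.env.NEO4J_URI,
  neo4j.auth.basic(process.env.NEO4J_USER, process.env.NEO4J_PASSWORD)
);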
I have a Rails 5 app deployed on Google App Engine, using Cloud SQL for MySQL, following their tutorial.
When I run a database migration,
bundle exec rake appengine:exec -- bundle exec rake db:migrate
I get a deprecation warning:
WARNING: This command is deprecated and will be removed on or after 2018-10-31. Please use `gcloud builds submit` instead.
Before I go off on a vision quest to sort this out, has anyone else converted their Rails app to use gcloud builds for rake tasks like this? Mind sharing the gist? Thanks!
1. Go to the Cloud SQL Instances page in the Google Cloud Platform Console. ...
2. Select the instance you want to add the database to.
3. Select the Databases tab.
4. Click Create database.
5. In the Create a database dialog, specify the name of the database, and optionally the character set and collation. ...
6. Click Create.
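If you'd rather stay on the command line, I believe the equivalent is (database and instance names below are placeholders):
gcloud sql databases create my_database --instance=my_instance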
If this isn't what you're looking for, then try starting over.
I ended up finding this answer, which goes through installing the Cloud SQL Proxy so you can run the migration locally:
RAILS_ENV=production bin/rails db:migrate
I'm still interested in a new way to easily execute the command in the cloud, but running locally with a db proxy totally works for now.
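For anyone following along, the proxy workflow looks roughly like this (the instance connection name is a placeholder; gcloud sql instances describe shows yours):
# terminal 1: tunnel local port 3306 to the Cloud SQL instance
./cloud_sql_proxy -instances=my-project:us-central1:my-instance=tcp:3306
# terminal 2: with database.yml pointing at 127.0.0.1:3306
RAILS_ENV=production bin/rails db:migrate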
I am using SQLite as my database for an offline app which is made in Electron.
For creating the database, I was using knex migrations.
The problem is that this only works in development: I migrate the database and then start the Electron process.
But when packaging the app for a production build, I need the migrations to run on the client machine on first start-up, so that the database gets created, and so that when there is an application update a new migration keeps the database up to date.
What is the appropriate approach for this? How do I run the migrations on app start-up, and how do I keep the migrations in the bundle?
Won't all the code be kept in app.asar? Will the migration code be run from there?
Also, where should the database be created on the client machine?
If you are using electron-builder, then you can add this to your electron-builder.json:
"extraFiles": "migrations/*",
where migrations is the folder where you keep the migrations.
To run the migrations automatically on app start-up, you can add the following code:
const knex = require('knex');                 // assumes knex is bundled with the app
const config = require('./knexfile');
const client = knex(config[env]);             // env = the current environment name
client.migrate.latest().catch(console.error); // returns a promise, so handle failures
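As for where the database should live on the client machine: nothing here is from electron-builder's docs, but a common pattern is to put the SQLite file in Electron's per-user data directory (app.asar is read-only, so the database can't live inside the bundle) and point knex at the unpacked migrations folder. A rough sketch, with illustrative names and paths that depend on your platform and electron-builder layout:
const path = require('path');
const { app } = require('electron');

const config = {
  client: 'sqlite3',
  connection: {
    // userData is a writable, per-user directory (e.g. %APPDATA%/YourApp on Windows)
    filename: path.join(app.getPath('userData'), 'app.sqlite3'),
  },
  migrations: {
    // assumes extraFiles placed "migrations" next to the executable;
    // adjust for your platform and electron-builder layout
    directory: path.join(path.dirname(app.getPath('exe')), 'migrations'),
  },
  useNullAsDefault: true, // recommended by knex for sqlite3
};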
First off, I want to say that I am not a DB expert and I have no experience with the Heroku service.
I want to deploy a Play Framework application to the Heroku service, and I need a database to do so. So I created a PostgreSQL database with this command, since it's supported by Heroku:
Users-MacBook-Air:~ user$ heroku addons:create heroku-postgresql -a name_of_app
And I got this as the response:
Creating heroku-postgresql on ⬢ benchmarkingsoccerclubs... free
Database has been created and is available
! This database is empty. If upgrading, you can transfer
! data from another database with pg:copy
So the DB now exists, but of course it's empty. For development I worked with a local H2 database.
Now I want to populate the DB on Heroku using an SQL file, since it's quite a lot of data. But I couldn't find out how to do that. Is there a command for the Heroku CLI where I can hand over the SQL file as an argument and it populates the database? The file basically consists of a few tables which get created and around 10000 INSERT commands.
EDIT: I also have CSV files from all the tables. So if there is a way to populate the Postgres DB with those, that would also be great.
First, run the following to get your database's name
heroku pg:info --app <name_of_app>
In the output, note the value of "Add-on", which should look something like this:
Add-on: postgresql-angular-12345
Then, issue the following command:
heroku pg:psql <Add-on> --app <name_of_app> < my_sql_file.sql
For example (assuming your sql commands are in file test.sql):
heroku pg:psql postgresql-angular-12345 --app my_cool_app < test.sql
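Regarding the CSV files from your EDIT: once you're in the psql session that heroku pg:psql opens, psql's client-side \copy command should handle those as well (table and file names below are placeholders):
heroku pg:psql postgresql-angular-12345 --app my_cool_app
\copy my_table FROM 'my_table.csv' WITH (FORMAT csv, HEADER);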
I need to implement an automatic transfer of daily backups from one DB to another DB. Both DBs and apps are hosted on Heroku.
I know this is possible to do manually from a local machine with the command:
heroku pgbackups:restore DATABASE `heroku pgbackups:url --app production-app` --app staging-app
But this process should be automated, and should not run from a local machine.
My idea is to write a rake task which will execute this command, and to run it daily with the help of the Heroku Scheduler add-on.
Any ideas on how best to do this? Or maybe there is a better way to handle this task?
Thanks in advance.
I managed to solve the issue myself. It turned out not to be so complex. Here is the solution; maybe it'll be useful to somebody else:
1. I wrote a script which copies the latest dump from a certain server to the DB of the current server:
namespace :backup do
  desc "copy the latest dump from a certain server to the DB of the current server"
  task restore_last_write_dump: :environment do
    # grab the URL of the most recent backup (strip the trailing newline)
    last_dump_url = %x(heroku pgbackups:url --app [source_app_name]).strip
    # restore it into the target app's DB, skipping the confirmation prompt
    system("heroku pgbackups:restore [DB_to_target_app] '#{last_dump_url}' -a [target_app_name] --confirm [target_app_name]")
    puts "Restored dump: #{last_dump_url}"
  end
end
2. To avoid authentication upon each request to the servers, create a .netrc file in the app root (see details here: https://devcenter.heroku.com/articles/authentication#usage-examples); a sample is shown after these steps.
3. Set up the Scheduler add-on for Heroku and add our rake task along with the frequency of its runs.
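For reference, the .netrc from step 2 looks roughly like this (email and token are placeholders; heroku auth:token prints your API token):
machine api.heroku.com
  login my-email@example.com
  password my_heroku_api_token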
That is all.