So my task is to write an assignment in Node.js and Bootstrap with a MongoDB database.
My next task is to share this with the assignee so that they can run the project in their local environment.
I can push the code to Git and share it, but how do I share the database as well?
You can check the MongoDB documentation for that:
mongodump -d <database_name> -o <directory_backup>
mongorestore -d <database_name> <directory_backup>
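For example, a minimal round trip, assuming a database called myapp (the names and paths are placeholders):
# On your machine: dump the database to a local directory
mongodump -d myapp -o ./dump
# Share the ./dump directory (e.g. zip it or commit it), then on the assignee's machine:
mongorestore -d myapp ./dump/myapp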
There is also a video on the mongoexport command.
You could use a database-as-a-service like mLab so that the database is the same on both machines.
Or, if you don't want an external database, you could create a Node.js script that initializes the database the first time the program is run.
Use Studio 3T and choose your collection. Right-click the collection and export it in your desired format. Your colleague can then import it the same way from Studio 3T.
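The same export/import round trip can also be done from the command line with mongoexport and mongoimport; a minimal sketch, assuming a users collection in a hypothetical myapp database:
# Export one collection to JSON (repeat per collection)
mongoexport -d myapp -c users -o users.json --jsonArray
# Your colleague imports it into their local database
mongoimport -d myapp -c users --file users.json --jsonArray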
So here is my scenario:
Today my server was restarted by our hosting provider (ACPI shutdown).
My Mongo database is a simple Docker container (mongo:3.2.18).
For an unknown reason the container wasn't restarted on reboot (restart: always was set in docker-compose).
I started it and noticed the volume mappings were gone.
I restored them to the old paths, restarted the mongo container and it started without errors.
I connected to the database and it was completely empty.
> show dbs
local 0.000GB
> use wekan
switched to db wekan
> show collections
> db.users.find();
>
I also already tried db.repairDatabase(); with no effect.
Now my _data directory contains a lot of *.wt files and more. (File list)
I found collection-0-2713973085537274806.wt which has a file size about 390MiB.
Judging by its size, this could be the data I need to restore.
Any way of restoring this data?
I already tried my luck with wt salvage according to this article, but I can't get it running (still trying).
I know: backups, backups, backups! Sadly this database wasn't backed up.
A related GitHub issue contains details about the software.
Update:
I was able to create a .dump file with the WiredTiger data engine tool. However, I can't get it imported into MongoDB.
Try running a repair on the MongoDB container; it should repair your database and the data should be completely restored.
Start the Mongo container in bash mode:
sudo docker-compose -f docker-compose.yml run mongo bash
or
docker run -it mongo bash
Once you are inside the Docker container, run the MongoDB repair:
mongod --dbpath /data/db --repair
The DB should be repaired successfully and all your data should be restored.
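Note that the repair can only work if the container actually sees the old data files; with the plain docker run variant above nothing is mounted. A sketch that combines both steps, assuming the *.wt files live under /var/lib/docker/volumes/wekan-db/_data (adjust to your real host path):
docker run -it \
  -v /var/lib/docker/volumes/wekan-db/_data:/data/db \
  mongo:3.2.18 \
  mongod --dbpath /data/db --repair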
I lost my database when I moved servers.
I still have the migrations in my Laravel folder, and I also have this rater.sql file in the root of my project. Is there a way I could get my database back with the migrations?
If you are using shared hosting you can't run commands directly, so you can open your files in an FTP file manager (I prefer PhpStorm) and then run the command below, as Hedam said:
php artisan migrate
Now you can export the database from your localhost and import it on your hosting (a sketch of this step follows below).
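A minimal sketch of that export/import step for MySQL, with placeholder credentials and database names:
# On your localhost, after the migrations have run
mysqldump -u local_user -p local_database > backup.sql
# On the hosting side (via SSH, or upload backup.sql and use phpMyAdmin's import)
mysql -u remote_user -p remote_database < backup.sql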
If you dropped your database, your data is lost.
You can, however, restore the database structure with:
php artisan migrate
Obviously, we do not know what rater.sql contains, so I suggest you check whether any data can be recovered from this file.
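To see whether rater.sql holds any data at all, a small sketch against a throwaway local MySQL database (the names are placeholders):
# Count INSERT statements -- zero means the file only contains structure
grep -c "INSERT INTO" rater.sql
# Load it into a fresh database to inspect it safely
mysql -u root -p -e "CREATE DATABASE rater_restore"
mysql -u root -p rater_restore < rater.sql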
I am very new to Google App Engine, please help me solve my problem.
I have created an instance in Google Cloud SQL, and when I import a SQL file it shows me an error like this:
ERROR 1227 (42000) at line 1088: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
How do I add the SUPER privilege to my instance?
As stated in the Cloud SQL documentation:
The SUPER privilege is not supported.
You can take a look at this page that explains how to import data to a Cloud SQL instance.
I also faced the same issue, but the problem was in the dumped SQL database. When exporting the database, use these flags:
--hex-blob --skip-triggers --set-gtid-purged=OFF
Here is the complete documentation on how to do it: https://cloud.google.com/sql/docs/mysql/import-export/importing. Once the data is exported, it can be imported using the command line, the gcloud shell, or the import option in the Cloud SQL console.
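Putting it together, a sketch of the export with those flags and the import via the gcloud CLI; the instance, bucket and database names are placeholders:
# Export from the source MySQL server
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF -u root -p source_db > source_db.sql
# Upload the dump to a bucket and import it into the Cloud SQL instance
gsutil cp source_db.sql gs://my-bucket/source_db.sql
gcloud sql import sql my-instance gs://my-bucket/source_db.sql --database=target_db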
I used the import feature of the Cloud SQL console and it worked for me.
I ran into the same error when backporting a gzipped dump (created with mysqldump from MySQL 5.1) into a Google Cloud SQL instance of MySQL 5.6. The following statement in the SQL file was the problem:
DEFINER=`username`@`%`
The solution that worked for me was removing all instances of it using sed:
cat db-2018-08-30.sql | sed -e 's/DEFINER=`username`@`%`//g' > db-2018-08-30-CLEANED.sql
After the removal, the backport completed with no errors. Apparently the SUPER privilege, which isn't available in Google Cloud SQL, is needed to run DEFINER.
Another reference: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
Good luck!
I faced the same issue. You could try giving the 'SUPER' permission to the user, but it isn't available in GCP Cloud SQL.
The statement
DEFINER=`username`@`%`
is an issue in your backup dump.
The workaround is to remove all such entries from the SQL dump file and then import the data from the GCP console.
cat DUMP_FILE_NAME.sql | sed -e 's/DEFINER=`<username>`@`%`//g' > NEW-CLEANED-DUMP.sql
After removing the entries from the dump, you can try importing it again.
For the use case of copying between databases within the same instance, it seems the only way to do this is with mysqldump, to which you have to pass some special flags so that it works without SUPER privileges. This is how I copied from one database to another:
DB_HOST=... # set to 127.0.0.1 if using cloud-sql proxy
DB_USER=...
DB_PASSWORD=...
SOURCE_DB=...
DESTINATION_DB=...
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF --column-statistics=0 -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $SOURCE_DB \
| mysql -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $DESTINATION_DB
Or if you just want to dump to a local file and do something else with it later:
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF --column-statistics=0 -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $SOURCE_DB \
> $SOURCE_DB.sql
See https://cloud.google.com/sql/docs/mysql/import-export/exporting#export-mysqldump for more info.
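If you connect through the Cloud SQL Proxy (the 127.0.0.1 case above), the proxy has to be started separately; a sketch, assuming the legacy cloud_sql_proxy binary and a placeholder instance connection name:
# Forward 127.0.0.1:3306 to the Cloud SQL instance
./cloud_sql_proxy -instances=my-project:us-central1:my-instance=tcp:3306 &
DB_HOST=127.0.0.1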
It's about how the data is exported. When you export from the console, it exports the whole instance, not just the schema, and that requires the SUPER privilege for the project in which the instance was created. To export data to another project, export only the target schema(s) via the advanced options. If you run into a "could not find storage or object" error, save the exported schema locally, upload it to the other project's storage bucket, and then select it from there.
In case somebody is searching for this in 2018 (at least August), the solution is:
Create a database. You can do this from the UI: just go to the Databases menu and click "Create database".
After you click "Import" and select your SQL dump (previously saved in a bucket), press "Show advanced options" and select your DB (not that advanced, is it?!). Otherwise, the default is the system mysql database, which of course cannot support the import.
Happy importing.
I solved this by creating a new database in the SQL instance (the default database is sys for MySQL).
Steps (non-CLI version):
1) In GCP > SQL > Databases, create a new database, e.g. newdb
2) In your SQL script, add: USE newdb;
Hope that helps someone
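For completeness, a CLI sketch of the same two steps (the instance, bucket and database names are placeholders):
# 1) Create the target database on the instance
gcloud sql databases create newdb --instance=my-instance
# 2) Import the dump into that database instead of editing the script
gcloud sql import sql my-instance gs://my-bucket/dump.sql --database=newdb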
The SUPER privilege is exclusively reserved for GCP.
For your question: you need to import the data into your own database, one on which you have permission.
I need to implement an automatic transfer of daily backups from one DB to another DB. Both DBs and apps are hosted on Heroku.
I know this is possible to do manually from a local machine with the command:
heroku pgbackups:restore DATABASE `heroku pgbackups:url --app production-app` --app staging-app
But this process should be automated and not run from a local machine.
My idea is to write a rake task which will execute this command, and to run it daily with the help of the Heroku Scheduler add-on.
Any ideas how it is better to do this? Or maybe there is a better way for this task?
Thanks in advance.
I managed to solve the issue myself; it turned out to be not so complex. Here is the solution, in case it is useful to somebody else:
1. I wrote a rake task which copies the latest dump from a certain server to the DB of the current server:
namespace :backup do
desc "copy the latest dump from a certain server to the DB of the current server"
task restore_last_write_dump: :environment do
last_dump_url = %x(heroku pgbackups:url --app [source_app_name])
system("heroku pgbackups:restore [DB_to_target_app] '#{last_dump_url}' -a [target_app_name] --confirm [target_app_name]")
puts "Restored dump: #{last_dump_url}"
end
end
2. To avoid authentication prompts on each request to the servers, create a .netrc file in the app root (see details here: https://devcenter.heroku.com/articles/authentication#usage-examples; a sketch of the file layout follows after this list).
3. Set up the Heroku Scheduler add-on and add the rake task along with its run frequency.
That is all.
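The .netrc from step 2 follows the layout in the linked Heroku article; a sketch with placeholder credentials (use a Heroku API token, not your account password):
cat > .netrc <<'EOF'
machine api.heroku.com
  login me@example.com
  password <heroku-api-token>
machine git.heroku.com
  login me@example.com
  password <heroku-api-token>
EOF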