How to share a website project with a MongoDB database

So my task is to write an assignment in Node.js and Bootstrap with a MongoDB database.
My next task is to share this with the assignee in a way so that he/she can run the project in his/her local environment.
I can push the code to Git and share it, but how do I share the database as well?

You can check the MongoDB documentation for this:
mongodump -d <database_name> -o <directory_backup>
mongorestore -d <database_name> <directory_backup>/<database_name>
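For example, a quick sketch assuming the assignment's database is called assignmentdb (the name is a placeholder):
# dump the database into ./backup/assignmentdb
mongodump -d assignmentdb -o ./backup
# restore it on the assignee's machine (--drop replaces any existing collections)
mongorestore --drop -d assignmentdb ./backup/assignmentdb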
There is also the mongoexport command if you only need individual collections as JSON or CSV.

You could use a database-as-a-service such as MongoDB Atlas (the successor to mLab) so that the database is the same on both machines.
Or, if you'd rather not depend on an external database, you could write a Node.js script that initializes the database the first time the program is run.
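A minimal sketch of such a seed script, assuming the official mongodb Node.js driver; the connection string, database name, collection, and sample documents are all placeholders:
// seed.js: a hypothetical first-run seed script (run with: node seed.js)
const { MongoClient } = require('mongodb');

const uri = process.env.MONGODB_URI || 'mongodb://localhost:27017';

async function seed() {
  const client = new MongoClient(uri);
  await client.connect();
  const db = client.db('assignmentdb'); // placeholder database name
  // only seed on the first run, i.e. while the collection is still empty
  if (await db.collection('users').countDocuments() === 0) {
    await db.collection('users').insertMany([
      { name: 'Alice', role: 'admin' },
      { name: 'Bob', role: 'user' },
    ]);
    console.log('database seeded');
  }
  await client.close();
}

seed().catch(console.error);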

Use Studio 3T: choose your collection, right-click it, and export it in your desired format. Your colleague can then import it the same way in Studio 3T.
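If you prefer the command line over Studio 3T, mongoexport and mongoimport do the same per-collection round trip (database and collection names are placeholders):
# export one collection to JSON, then import it on the other machine
mongoexport -d assignmentdb -c users -o users.json
mongoimport -d assignmentdb -c users --file users.json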

Related

How to get a SQL dump for development in Shopware 6?

We are developing a customer's shop and want to develop locally, but without all the user/order data.
How can we achieve this?
The question boils down to these points:
How do we create such dumps?
How do we import them?
How do we manage storage / make the dumps easily accessible for developers?
You can use Smile SA's gdpr-dump to create an anonymized dump, which is fine for development.
The dump can either be stored in Git, downloaded from the production server, or you might set up another server to store those dumps.
There is also a nice script from Kellerkinder specifically for Shopware 6: https://github.com/kellerkinderDE/shopware6-database-dump
To import, just use: mysql --force -u <user> -p my-database < dump.sql
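If you only need to exclude the user/order data rather than anonymize it, a plain mysqldump sketch can work too; the table names here are placeholders for your shop's actual tables:
# dump everything except the rows of the privacy-sensitive tables...
mysqldump -u <user> -p my-database --ignore-table=my-database.customer --ignore-table=my-database.order > dump.sql
# ...then append those tables' schema without any rows
mysqldump -u <user> -p --no-data my-database customer order >> dump.sql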

Restore CouchDB from .couch files

I'm trying to back up and restore a CouchDB database following the official documentation:
https://docs.couchdb.org/en/latest/maintenance/backups.html
"However, you can also copy the actual .couch files from the CouchDB data directory (by default, data/) at any time, without problem. CouchDB’s append-only storage format for both databases and secondary indexes ensures that this will work without issue."
Since the docs don't clearly show the steps for restoring from files, I copied the entire data folder, spun up a local CouchDB Docker container, and tried to copy the files into the container's /opt/couchdb/data folder.
But when I start/restart the container and open localhost:5984 to view the databases, I get: "This database failed to load."
What should I do after copying the files? Should copying them in directly work? When is the right time to copy them? Should I create the databases first?
Thank you all
I was able to resolve it this way:
https://github.com/apache/couchdb/discussions/3436
I think you may need to update the ownership of the backup files on your docker container.
This fixed the issue for me:
# recursively change ownership of data dir to couchdb:couchdb
docker exec <container_id> bash -c 'chown -R couchdb:couchdb /opt/couchdb/data'
Just replace <container_id> with your Docker container ID and the path with the location of the CouchDB data directory in your container.
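Putting the whole restore together, a sketch of the sequence (paths assume the default /opt/couchdb/data shown above):
# copy the backed-up .couch files into the container's data directory
docker cp ./data/. <container_id>:/opt/couchdb/data/
# fix ownership so the couchdb user can read the files
docker exec <container_id> bash -c 'chown -R couchdb:couchdb /opt/couchdb/data'
# restart so CouchDB picks the databases up
docker restart <container_id>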

How to populate a heroku postgresql database with a sql file

First off, I want to say that I am not a DB expert and I have no experience with the Heroku service.
I want to deploy a Play Framework application to Heroku, and I need a database to do so. So I created a PostgreSQL database with this command, since it's supported by Heroku:
Users-MacBook-Air:~ user$ heroku addons:create heroku-postgresql -a name_of_app
And I got this as response
Creating heroku-postgresql on ⬢ benchmarkingsoccerclubs... free
Database has been created and is available
! This database is empty. If upgrading, you can transfer
! data from another database with pg:copy
So the DB now exists, but it is of course empty. For development I worked with a local H2 database.
Now I want to populate the Heroku DB using a SQL file, since it's quite a lot of data, but I couldn't find out how to do that. Is there a command for the Heroku CLI where I can hand over the SQL file as an argument and have it populate the database? The file basically consists of a few tables which get created and around 10,000 INSERT commands.
EDIT: I also have CSV files of all the tables, so a way to populate the Postgres DB from those would also be great.
First, run the following to get your database's name
heroku pg:info --app <name_of_app>
In the output, note the value of "Add-on", which should look something like this:
Add-on: postgresql-angular-12345
Then, issue the following command:
heroku pg:psql <Add-on> --app <name_of_app> < my_sql_file.sql
For example (assuming your sql commands are in file test.sql):
heroku pg:psql postgresql-angular-12345 --app my_cool_app < test.sql
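For the CSV files mentioned in the edit, psql's \copy meta-command should also work through pg:psql, assuming the target table already exists and the CSV has a header row (table and file names are placeholders):
heroku pg:psql postgresql-angular-12345 --app my_cool_app -c "\copy my_table FROM 'my_table.csv' WITH (FORMAT csv, HEADER)"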

Google cloud sql instance super privilege error

I am very new to Google App Engine, please help me solve my problem.
I have created an instance in Google Cloud SQL, and when I import a SQL file it shows me an error like this:
ERROR 1227 (42000) at line 1088: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
How do I add the SUPER privilege to my instance?
As stated in the Cloud SQL documentation:
The SUPER privilege is not supported.
You can take a look at this page that explains how to import data to a Cloud SQL instance.
I also faced the same issue, but the problem was in the dumped SQL file itself. When exporting the database, use these flags:
--hex-blob --skip-triggers --set-gtid-purged=OFF
Here is the complete documentation on how to do it: https://cloud.google.com/sql/docs/mysql/import-export/importing. Once the data is exported, it can be imported using the command line, the Cloud Shell, or the import option in the Cloud SQL console.
I used the import feature of the Cloud SQL console and it worked for me.
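For the command-line route, a sketch of the import with gcloud; the instance, bucket, and database names are placeholders, and the dump has to sit in a Cloud Storage bucket the instance's service account can read:
# upload the (flag-corrected) dump, then import it
gsutil cp dump.sql gs://my-bucket/
gcloud sql import sql my-instance gs://my-bucket/dump.sql --database=my-database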
I ran into the same error when backporting a gzipped dump (procured with mysqldump from a 5.1 version of MySQL) into a Google Cloud SQL instance of MySQL 5.6. The following statement in the SQL file was the problem:
DEFINER=`username`@`%`
The solution that worked for me was removing all instances of it using sed:
cat db-2018-08-30.sql | sed -e 's/DEFINER=`username`@`%`//g' > db-2018-08-30-CLEANED.sql
After the removal, the backport completed with no errors. Apparently the SUPER privilege, which isn't available in Google Cloud SQL, is needed to set DEFINER.
Another reference: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
Good luck!
I faced the same issue. You could try giving the user the SUPER privilege, but it isn't available in GCP Cloud SQL.
The statement
DEFINER=`username`@`%`
is the issue in your backup dump.
The workaround is to remove all such entries from the SQL dump file and then import the data from the GCP console:
cat DUMP_FILE_NAME.sql | sed -e 's/DEFINER=`<username>`@`%`//g' > NEW-CLEANED-DUMP.sql
After removing the entries from the dump, you can retry the import.
For the use case of copying between databases within the same instance, it seems the only way to do this is with mysqldump, to which you have to pass some special flags so that it works without SUPER privileges. This is how I copied from one database to another:
DB_HOST=... # set to 127.0.0.1 if using cloud-sql proxy
DB_USER=...
DB_PASSWORD=...
SOURCE_DB=...
DESTINATION_DB=...
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF --column-statistics=0 -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $SOURCE_DB \
| mysql -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $DESTINATION_DB
Or if you just want to dump to a local file and do something else with it later:
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF --column-statistics=0 -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $SOURCE_DB \
> $SOURCE_DB.sql
See https://cloud.google.com/sql/docs/mysql/import-export/exporting#export-mysqldump for more info.
It's about how the data is exported. When you export from the console, it exports the whole instance, not just the schema, and that requires the SUPER privilege for the project in which it was created. To export the data to another project, simply export by targeting the schema(s) in the advanced options. If you run into "could not find storage or object", save the exported schema locally, then upload it to the other project's storage and select it from there.
In case somebody is searching for this in 2018 (at least August), the solution is:
Create a database. You can do this from the UI: just go to the Databases menu and click "Create database".
After you click "Import" and select your SQL dump (previously saved in a bucket), press "Show advanced options" and select your DB (not that advanced, are they?!). Otherwise the default is the system mysql database, which of course cannot support the import.
Happy importing.
I solved this by creating a new database in the SQL instance (the default database is sys for MySQL).
Steps (non-CLI version):
1) In GCP > SQL > Databases, create a new database, e.g. newdb (a CLI equivalent is sketched below)
2) In your SQL script, add: USE newdb;
Hope that helps someone.
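If you prefer the CLI for step 1, the equivalent gcloud command looks like this (the instance name is a placeholder):
gcloud sql databases create newdb --instance=my-instance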
The SUPER privilege is reserved exclusively for GCP itself.
To answer your question: you need to import the data into a database of your own, one in which you have permissions.

Oracle public server

I want to learn Oracle and try some queries and other SQL features of the Oracle database, but I don't want to install it and deal with all the related issues. So my question is: is there any publicly available Oracle server to which I can connect through a terminal and play around with it?
I mean a service where I can register and have some space allocated to my profile.
Take a look at: http://apex.oracle.com/
The only thing I can think of is SQLFiddle: http://sqlfiddle.com/
But it won't let you have a "private" space; you need to re-create your schema each time (though you can bookmark your script, which might be enough for you).
You could also try one of the pre-built virtual appliances - see
http://www.oracle.com/technetwork/community/developer-vm/index.html
If you need direct database access, you can run it in a Docker instance:
docker run -d -p 1521:1521 -p 8080:8080 alexeiled/docker-oracle-xe-11g
Then connect to it with sqlplus:
sqlplus system/oracle@localhost:1521/xe
See here for more passwords, info on apex, etc.
I just came across this: Oracle Live SQL. It is browser-based, so there is nothing to install locally, but you need an Oracle account. It offers browser-based SQL worksheet access to an Oracle database schema.
