Restore database problems - akeneo

I have restored a backup of the DB into the Akeneo PIM DB, but it is now not showing me any products within the system.
The DB is connecting properly, i.e. I can log in and the category tree, attribute lists, etc. are available, but my products (~500) are not showing in the Akeneo environment (0 products). The data IS in the DB - I can query and view it directly through SQL.
Anybody have any ideas? What have I missed?

Just got an answer from @gplanchat on the Akeneo Slack with the solution.
I needed to reindex my catalog - something I haven't seen referenced in the documentation. For others, this is what is needed:
bin/console akeneo:elasticsearch:reset-indexes --env=prod
bin/console pim:product:index --all --env=prod
bin/console pim:product-model:index --all --env=prod
Thanks to @gplanchat

Related

Load database from offline back-up Neo4j

I backed up a neo4j database using
bin/neo4j-admin dump --database=neo4j --to=c:/
Then I load a database from an archive created with the dump command as follows:
bin/neo4j-admin load --from=/var/lib/neo4j/data/c: --database=db
From Neo4j Enterprise Browser I execute
SHOW DATABASES
but I don't see the db previously loaded. How can I show it?
If you are replacing your existing database with name "db" then use the --force option:
bin/neo4j-admin load --from=/var/lib/neo4j/data/c: --database=dbase --force
If you are restoring into a new database, then after the load you need to create the database with CREATE DATABASE dbase
Note that I changed the name of your database from db to something else since database names in Neo4j must be at least 3 characters long.
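Putting it together, a minimal sketch of the whole sequence (the paths and the dbase name are placeholders, not verified against your setup) might be:

# Dump the source database to an archive (with the database offline)
bin/neo4j-admin dump --database=neo4j --to=/backups/neo4j.dump
# Load the archive under the new name, overwriting it if it already exists
bin/neo4j-admin load --from=/backups/neo4j.dump --database=dbase --force

Then, against the system database in the Browser or cypher-shell:

CREATE DATABASE dbase;
SHOW DATABASES;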

Google cloud sql instance super privilege error

I am very new to Google App Engine, please help me solve my problem.
I have created an instance in Google Cloud SQL, and when I import a SQL file it shows me an error like this:
ERROR 1227 (42000) at line 1088: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
How do I add the SUPER privilege to my instance?
As stated in the Cloud SQL documentation:
The SUPER privilege is not supported.
You can take a look at this page that explains how to import data to a Cloud SQL instance.
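If you prefer the command line over the console, the same import can be done with gcloud; the instance, bucket, and database names below are placeholders:

# Import a SQL dump from a Cloud Storage bucket into a specific database
gcloud sql import sql my-instance gs://my-bucket/dump.sql --database=mydb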
I also faced the same issue, but the problem was in the dumped SQL database. When exporting the database, use these flags:
--hex-blob --skip-triggers --set-gtid-purged=OFF
Here is the complete documentation of how to do it (https://cloud.google.com/sql/docs/mysql/import-export/importing). Once the data is exported, it can be imported using the command line, gcloud, or the import option in the Cloud SQL console.
I used the import feature of gcloud sql console and it worked for me.
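As a rough sketch, an export done with those flags and then staged in a bucket for the console or gcloud import could look like this (host, user, bucket, and database names are placeholders):

# Export with Cloud SQL-friendly flags
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF \
  -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASSWORD" "$SOURCE_DB" > dump.sql
# Stage the dump in Cloud Storage so the Cloud SQL import can read it
gsutil cp dump.sql gs://my-bucket/dump.sql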
I ran into the same error when importing a gzipped dump (produced with mysqldump from a 5.1 version of MySQL) into a Google Cloud SQL instance of MySQL 5.6. The following statement in the sql file was the problem:
DEFINER=`username`@`%`
The solution that worked for me was removing all instances of it using sed:
cat db-2018-08-30.sql | sed -e 's/DEFINER=`username`@`%`//g' > db-2018-08-30-CLEANED.sql
After removal the import completed with no errors. Apparently the SUPER privilege, which isn't available in Google Cloud SQL, is needed to set DEFINER.
Another reference: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
Good luck!
I faced the same issue. You could try giving the SUPER privilege to the user, but it isn't available in GCP Cloud SQL.
The statement
DEFINER=`username`@`%`
is an issue in your backup dump.
The workaround is to remove all such entries from the SQL dump file and import the data from the GCP console.
cat DUMP_FILE_NAME.sql | sed -e 's/DEFINER=`<username>`@`%`//g' > NEW-CLEANED-DUMP.sql
After removing the entries from the dump, you can try reimporting.
For the use case of copying between databases within the same instance, it seems the only way to do this is using mysqldump, to which you have to pass some special flags so that it works without SUPER privileges. This is how I copied from one database to another:
DB_HOST=... # set to 127.0.0.1 if using cloud-sql proxy
DB_USER=...
DB_PASSWORD=...
SOURCE_DB=...
DESTINATION_DB=...
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF --column-statistics=0 -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $SOURCE_DB \
| mysql -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $DESTINATION_DB
Or if you just want to dump to a local file and do something else with it later:
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF --column-statistics=0 -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $SOURCE_DB \
> $SOURCE_DB.sql
See https://cloud.google.com/sql/docs/mysql/import-export/exporting#export-mysqldump for more info.
It's about how the data is exported. When you export from the console, it exports the whole instance, not just the schema, which requires the SUPER privilege for the project in which it was created. To export data to another project, export by targeting the schema(s) in the advanced options. If you run into a "could not find storage or object" error, save the exported schema locally, upload it to the other project's storage bucket, then select it from there.
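For reference, the gcloud equivalent of a per-database export (rather than exporting the whole instance) is roughly the following; instance, bucket, and database names are placeholders:

# Export only the named database, not the whole instance
gcloud sql export sql my-instance gs://my-bucket/mydb-export.sql --database=mydb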
In case somebody is searching for this in 2018 (at least August), the solution is:
Create a database. You can do this from the UI: just go to the Databases menu and click "Create database".
After you click "Import" and select your SQL dump (previously saved in a bucket), press "Show advanced options" and select your DB (not that advanced, are they?!). Otherwise, the default is the system mysql database, which of course cannot support the import.
Happy importing.
I solved this by creating a new database in the SQL instance. (The default database is sys for MySQL.)
Steps (non-CLI version):
1) In GCP > SQL > Databases, create a new database, e.g. newdb
2) In your SQL script, add: USE newdb;
Hope that helps someone
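For anyone scripting this instead of clicking through the console, the same two steps could look roughly like this (instance, database, and file names are placeholders; the in-place edit assumes GNU sed):

# 1) Create the target database on the instance
gcloud sql databases create newdb --instance=my-instance
# 2) Prepend a USE statement so the dump imports into newdb instead of the default schema
sed -i '1i USE newdb;' dump.sql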
The SUPER privilege is exclusively reserved for GCP.
For your question, you need to import the data into a database in which you have permission.

OBIEE 11g - Copy MDS and BIPLATFORM schemas for new OBI install

I had OBIEE 11.1.1.6.4 installed on a Windows 7 server. The app installation was corrupted and I had to re-install. Well, I re-installed OBIEE 11.1.1.6.12 (the only install files I have). I want to use the same MDS and BIPLATFORM repositories as before; however, I do not have the schema passwords for the MDS and BIPLATFORM schemas. To get around this, I copied the database and gave the schemas new passwords.
When I attempt to hook the OBIEE 11.1.1.6.12 installation up to the copied database, the OBIEE config says the BIPLATFORM and MDS schemas cannot be found. Does anyone know if this could be a version issue? The copied database was from an 11.1.1.6.4 install and the app is now 11.1.1.6.12. Is this the problem?
Any thoughts?
I found the answer and wanted to post it for others. There is a table in the repository called dbo.schema_version_registry$. If you change the schema prefix in the copied database (required if using the same DB), then you will need to manually edit this table to reflect the new prefix!
Took me days to track down, but it's just a simple UPDATE against the table. Hope this helps someone.
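For reference, the fix amounts to an UPDATE against that table. The column name used below (owner) is an assumption based on typical repository layouts, so inspect your own schema_version_registry$ first and adjust the sketch accordingly:

-- Inspect the registered schema owners first
SELECT * FROM dbo.schema_version_registry$;
-- Point the rows at the new prefix (column name is an assumption; check your table)
UPDATE dbo.schema_version_registry$
SET owner = 'NEWPREFIX_BIPLATFORM'
WHERE owner = 'OLDPREFIX_BIPLATFORM';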

Can't find umbraco data storage

I have installed Umbraco via WebMatrix, and entered "server=(localdb)\v11.0;integrated security=true" as the connection string. The site works fine, but I can't find the database that Umbraco has created. When I open (localdb)\v11.0, it's not there.
I have tried searching the whole system for *.mdf, but no luck. Where can the data be?
I am using Umbraco 4.8.0.
Most likely the database is in the user profile folder of the account that Umbraco is running under. See this post for more complete explanation. You may also want to look at this other post about why LocalDB by default puts the database file in the root of the user profile of the account it runs under.
I have found it. With no database defined in the connection string, SQL Server uses the first or default database in the list, which appears to be the master database. SSMS and similar tools don't display these tables among the system objects, so I found them by querying the database.
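If you end up in the same situation, one quick way to confirm the tables landed in master is to query the catalog views; the name patterns below assume Umbraco 4.x table naming (umbracoNode, cmsContent, etc.):

-- Run against the master database on (localdb)\v11.0
SELECT name FROM sys.tables
WHERE name LIKE 'umbraco%' OR name LIKE 'cms%';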

django tools for pushing preinstalled data to database

Is there a tool for Django to install some static data (data necessary to have the application running) into the database?
./manage.py syncdb will create the database schema in the db; is there some tool for pushing static data to the db?
Thanks
Fixtures.
EDIT: Better example.
Fill your database with the static data you need, then execute the command:
./manage.py dumpdata --indent=4 > initial_data.json
After that, every ./manage.py syncdb will insert the data that you entered the first time into the database.
If you are familiar with the JSON format (or XML or YAML) you can edit the initial data by hand. For details check out the official documentation.
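A fixture can also be loaded explicitly at any time with loaddata, using the file produced by the dumpdata command above:

./manage.py loaddata initial_data.json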
