I am not a programmer and I don't work in this full time. I made a page for a client and tried to change the email from user@example.com to a personal address, trying different methods. The last one I used was the wp_mail_smtp plugin, and now the page shows a database error with 1222 lines of errors; the front page has the same problem of tables not appearing. The page is mangooglamping.com and it shows some problems too. I do not know how to restart the database, and I am using SSH from Google Cloud because I have everything online; the tutorials I follow do everything on the web and do not explain how to download the page to a local Mac.
Bitnami engineer here, you can easily modify the email address of the user by running these commands in the database
Obtain the id of the user
mysql -u root -p bitnami_wordpress -e "SELECT * FROM wp_users;"
Update the email address
mysql -u root -h DATABASEHOST -p bitnami_wordpress -e "UPDATE wp_users SET user_email='NEW-EMAIL' WHERE ID='ADMIN-ID';"
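If the full SELECT * output is hard to read, you can select just the columns you need; ID, user_login and user_email are standard columns of the wp_users table:
mysql -u root -p bitnami_wordpress -e "SELECT ID, user_login, user_email FROM wp_users;"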
You can learn more about how to do this in our documentation
https://docs.bitnami.com/aws/apps/wordpress/administration/reset-wp-admin-email-address/
"I do not know how to restart the database"
Simply run
sudo /opt/bitnami/ctlscript.sh restart mysql
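If the rest of the stack also needs a kick after the database comes back, running the same script without a service name restarts all of the stack's services:
sudo /opt/bitnami/ctlscript.sh restart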
I am trying to migrate a series of Trac projects originally hosted on CloudForge onto a new Bitnami virtual machine (Debian with the Trac stack installed).
The documentation on the Trac wiki regarding restoring from a backup is a little vague for me, but it suggests that I should be able to set up a new project
$ sudo trac-admin PROJECT_PATH initenv
stop the services from running
$ sudo /opt/bitnami/ctlscript.sh stop
copy the snapshot from the backup into the new project path and restart the services
$ sudo /opt/bitnami/ctlscript.sh start
and should be good to go.
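For reference, the copy step I mean is roughly this (paths are placeholders; this assumes the default SQLite backend, where the project database lives at PROJECT_PATH/db/trac.db):
# paths are placeholders; assumes the default SQLite backend
$ sudo cp /path/to/cloudforge_snapshot PROJECT_PATH/db/trac.db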
Having done this (and worked through quite a few issues on the way) I have now got to the point where the browser page shows
Trac Error
TracError: Unable to check for upgrade of trac.db.api.DatabaseManager: TimeoutError: Unable to get database connection within 0 seconds. (OperationalError: unable to open database file)
When I set up the new project, I note that I left the default (unedited) database string, but I have no idea what database type was used for the original CloudForge Trac project, i.e. is there an additional step needed to restore the database?
Any help would be greatly appreciated, thanks.
Edit
Just to add, the CloudForge was using Trac 0.12.5, new VM uses Trac 1.5.1. Not sure if this will be an issue?
Edit
More investigation and I'm now pretty sure that the CloudForge snapshot is not an SQLite (or other) database file - it looks like it might be a query-type response, as it starts and ends with:
BEGIN TRANSACTION;
...
COMMIT;
Thanks to anyone taking the time to read this, but I think I'm sorted now.
After learning more about SQLite, I discovered that the file sent by CloudForge was an SQLite dump of the database, and it was easy enough to migrate it to a new database instance using the command line:
$ sqlite3 location_of/new_database.db < dump_file.db
I think I also needed a prior step of removing the contents of the original new_database.db using the sqlite3 command-line shell (just type sqlite3 in a terminal):
sqlite> .open location_of/new_database.db
sqlite> BEGIN TRANSACTION;
sqlite> DELETE FROM each_table_in_database;
sqlite> COMMIT;
sqlite> .exit
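If you're not sure which tables need clearing, the .tables dot-command (a standard sqlite3 command) inside the same session lists every table in the open database:
sqlite> .tables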
I then had some issues with credentials on the Bitnami VM, so I needed to retrieve these (as per the Bitnami documentation) using
$ sudo cat /home/bitnami/bitnami_credentials
and add this USER_NAME as a TRAC_ADMIN using
$ trac-admin path/to/project/ permission add USER_NAME TRAC_ADMIN
NOTE that before and after this operation, be sure to stop and restart the Bitnami services using
$ sudo /opt/bitnami/ctlscript.sh stop
$ sudo /opt/bitnami/ctlscript.sh start
I am the guy from Trac Users. You need to understand that the user isn't really stored in the DB. You have some tables with columns holding the username, but there is no table for a user. Looking at your post, I think your setup used htdigest, and in that case your user info is in that credentials file. If you cat it, you should see something like
username:realmname:pwhash
I think the hash is MD5, but it doesn't really matter for your problem. So if you want to make a new user you have to use
htdigest [ -c ] passwdfile realm username
Then you should use trac-admin to grant the permission, and at that point your user should be able to log in.
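For example (the digest file path, realm and username below are placeholders, not your actual values; use -c only if the password file does not exist yet):
# placeholders: adjust the digest file path, realm and username to your setup
htdigest /path/to/trac.htdigest TracRealm newuser
trac-admin /path/to/project permission add newuser TRAC_ADMIN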
Cheers
Markus
I've set up a functional Laravel project; the database connection works and I've added some tables via the terminal, but now I'd like to see the data just as you can see everything with phpMyAdmin, for example. How do I locate the database and how do I open it, by default?
In Terminal:
cd into Project directory
vagrant ssh
cd into Project directory in machine
mysql -u[username] -p[password] - e.g mysql -uhomestead -psecret
Voilà! You are connected to MySQL...
SHOW databases;
USE [Database Name]; - e.g use Homestead;
SHOW TABLES; to see all tables
SELECT * from [table]; - e.g select * from users
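If you also want to see a table's structure, roughly what phpMyAdmin's Structure tab shows, DESCRIBE does it; the users table below is just Laravel's default migration used as an example:
DESCRIBE users;           -- column names, types, keys
SHOW CREATE TABLE users;  -- the full CREATE TABLE statement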
Are you using MySQL? If so, you can install the MySQL Workbench and enter the same database credentials to view your data similar to PHPMyAdmin. Alternatively, you can install PHPMyAdmin wherever you are developing (local machine/vagrant/remote). You can also view everything in tabular format with the command line and mysql as well, but it's not very friendly.
I am very new to Google App Engine, please help me solve my problem.
I have created an instance in Google Cloud SQL, and when I import a SQL file it shows me an error like this:
ERROR 1227 (42000) at line 1088: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
How do I add the SUPER privilege to my instance?
As stated at the Cloud SQL documentation:
The SUPER privilege is not supported.
You can take a look at this page that explains how to import data to a Cloud SQL instance.
I also faced the same issue, but the problem was in the dumped SQL file. When exporting the database, use these flags:
--hex-blob --skip-triggers --set-gtid-purged=OFF
Here is the complete documentation on how to do it: https://cloud.google.com/sql/docs/mysql/import-export/importing. Once the data is exported, it can be imported using the command line, Cloud Shell, or the import option in the Cloud SQL console.
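Put together, the export and import might look roughly like this; USER, DB_NAME, YOUR_BUCKET and YOUR_INSTANCE are placeholders, and gcloud sql import sql expects the dump to already be in a Cloud Storage bucket:
# USER, DB_NAME, YOUR_BUCKET and YOUR_INSTANCE are placeholders
mysqldump -u USER -p --hex-blob --skip-triggers --set-gtid-purged=OFF DB_NAME > dump.sql
gsutil cp dump.sql gs://YOUR_BUCKET/dump.sql
gcloud sql import sql YOUR_INSTANCE gs://YOUR_BUCKET/dump.sql --database=DB_NAME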
I used the import feature of the Cloud SQL console and it worked for me.
I ran into the same error when backporting a gzipped dump (produced with mysqldump from a 5.1 version of MySQL) into a Google Cloud SQL instance of MySQL 5.6. The following statement in the SQL file was the problem:
DEFINER=`username`@`%`
The solution that worked for me was removing all instances of it using sed:
cat db-2018-08-30.sql | sed -e 's/DEFINER=`username`@`%`//g' > db-2018-08-30-CLEANED.sql
After removing them, the backport completed with no errors. Apparently the SUPER privilege, which isn't available in Google Cloud SQL, is needed to set DEFINER.
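A quick sanity check before re-uploading is to confirm no DEFINER clauses remain in the cleaned file:
grep -c 'DEFINER=' db-2018-08-30-CLEANED.sql   # should print 0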
Another reference: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
Good luck!
I faced the same issue. You could try giving the SUPER privilege to the user, but it isn't available in GCP Cloud SQL.
The statement
DEFINER=`username`@`%`
is an issue in your backup dump.
The workaround is to remove all such entries from the SQL dump file and then import the data from the GCP console:
cat DUMP_FILE_NAME.sql | sed -e 's/DEFINER=`<username>`@`%`//g' > NEW-CLEANED-DUMP.sql
After removing the entries from the dump, you can retry the import and it should complete successfully.
For the use case of copying between databases within the same instance, it seems the only way to do this is with mysqldump, to which you have to pass some special flags so that it works without SUPER privileges. This is how I copied from one database to another:
DB_HOST=... # set to 127.0.0.1 if using cloud-sql proxy
DB_USER=...
DB_PASSWORD=...
SOURCE_DB=...
DESTINATION_DB=...
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF --column-statistics=0 -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $SOURCE_DB \
| mysql -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $DESTINATION_DB
Or if you just want to dump to a local file and do something else with it later:
mysqldump --hex-blob --skip-triggers --set-gtid-purged=OFF --column-statistics=0 -h $DB_HOST -u $DB_USER -p"$DB_PASSWORD" $SOURCE_DB \
> $SOURCE_DB.sql
See https://cloud.google.com/sql/docs/mysql/import-export/exporting#export-mysqldump for more info.
It's about how the data is exported. When you export from the console, it exports the whole instance, not just the schema, and that requires the SUPER privilege for the project in which it was created. To export data to another project, export by targeting only the schema(s) in the advanced options. If you run into "could not find storage or object", save the exported schema locally, upload it to the other project's storage, and then select it from there.
In case somebody is searching for this in 2018 (at least August), the solution is:
Create a database. You can do this from the UI: just go to the Databases menu and click "Create a database".
After you click "Import" and select your SQL dump (previously saved in a bucket), press "Show advanced options" and select your database (not that advanced, are they?!). Otherwise the default is the system mysql database, which of course cannot support the import.
Happy importing.
I solved this by creating a new database in the SQL instance. (The default database is sys for MySQL.)
Steps(Non-cli version):
1) In GCP > SQL > Databases, create a new database, e.g. newdb
2) In your SQL script, add: USE newdb; (see the sketch below)
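So the top of the edited dump would start with something like this (newdb being the database created in step 1):
USE newdb;
-- ...the rest of the original dump follows unchanged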
Hope that helps someone
The SUPER privilege is reserved for Google's own use in Cloud SQL and is not granted to customers.
For your question, you need to import the data into a database of your own, one on which you have the required permissions.
When I download a new WordPress from wordpress.org, paste it into my www folder of WAMP, create a new database in phpMyAdmin, then go to localhost and click the WordPress site, it asks me to create the config file and enter the database details. I do that correctly, but when I click submit, it says "Can't select database".
Any idea why this is?
I already have a local WordPress site that started saying there was an error connecting to the database. The config settings are all correct, so I tried to download a fresh WordPress site and use it, and I cannot even set up a fresh one. I have never encountered a fresh WordPress site not working like this before.
I did a mysqldump of my old site so that my boss could put it on his server. Not sure if that is relevant.
Cheers.
Please don't overcomplicate it.
In the "Database Host" field, include the port, e.g. "localhost:3308" in my case; that solved the problem.
For anyone who's still looking for solutions to this problem: please check your wp-config.php database credentials again. I had the same problem today and tried to over-complicate the matter by searching for advanced solutions, while I had a space in my DB_NAME value (it was supposed to be 'wpdb' and was ' wpdb').
This space completely messed up my connection, I was even close to reinstalling the whole thing and losing all data.
If using LAMP -
Make sure that all privileges are granted on that database to the MySQL user you created. In your MySQL shell, run the following:
GRANT ALL PRIVILEGES ON database_name.* TO database_user@localhost IDENTIFIED BY 'user_password';
Where,
database_name = your database name as per wp-config.php
database_user = your user name as per wp-config.php
user_password = your user password as per wp-config.php
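As a concrete sketch with hypothetical values, followed by a privilege reload:
-- hypothetical values; match them to DB_NAME, DB_USER and DB_PASSWORD in wp-config.php
GRANT ALL PRIVILEGES ON wordpress.* TO wp_user@localhost IDENTIFIED BY 'secret';
FLUSH PRIVILEGES;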
It sounds like either you have not created a database or your wp-config file is not set up correctly. Have you edited the wp-config.php file of WordPress and provided the correct database name and user credentials? See the famous 5-minute installation of WordPress for details.
It took me way too long to discover that my MySQL databases were not working on any other websites either. I reinstalled WampServer and it worked.
Verify the wp-config.php file.
I had the same error, and the problem was related to the quotes:
‘root’ vs 'root'
Edit: the correct one is the straight single quote: '
By default
username : root
password : leave it blank
database host : localhost
table_prefix : wp_
For me, the solution was giving wp_user all permissions on the wp_db database.
I achieved this using the command line interpreter (CLI):
$ mysql -uroot -p
MariaDB [(none)]> GRANT ALL ON `wp_db`.* TO `wp_user`@`localhost` IDENTIFIED BY 'pass';
MariaDB [(none)]> exit;
After these operations, I restarted the database:
$ sudo systemctl restart mysql
Of course, the database name (wp_db), username (wp_user), and password of the user (pass) can be different.
My problem was that when I granted permissions to the DB user, the host did not match what was in wp-config.php:
/** MySQL hostname */
define('DB_HOST', 'localhost:3306');
I was granting rights with
GRANT ALL ON databaseName.* TO 'UserName'@'127.0.0.1';
This gave me access when I was connected to localhost through SSH, but it did not match wp-config.php, so I had to grant like this; then it all worked:
GRANT ALL ON databaseName.* TO 'UserName'@'localhost';
For MariaDB the hostname should be localhost:3306 and for MySQL localhost:3308 (those were the ports in my setup); this fixed my error.
I had this issue when trying to transfer databases between WAMP server installations.
I opened phpMyAdmin on the new PC, got the port of the MySQL server (:3308), and added this port to DB_HOST ('localhost:3308'). This worked for me.
Please create your database with the name 'wordpress' before installing WordPress.
You may need to create the database yourself.
Visit http://localhost:8888/phpMyAdmin5
Create a new database with the name you prefer.
Don't add any tables; just use the name you chose for the database in the WordPress setup.
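If you'd rather use the command line than phpMyAdmin, the equivalent is a single statement at the MySQL prompt (the database name is your choice and just has to match DB_NAME in wp-config.php):
-- the name is your choice; it just has to match DB_NAME in wp-config.php
CREATE DATABASE wordpress CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;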
Can someone please help me download a database from cPanel?
I have a website hosted using cPanel/WHM.
The database is huge.
I want to make changes to the website, but I would like to work on it locally.
So I downloaded my website content.
When I try to download the database, the download stops in the middle because of the huge size of the database. How can I download the full database?
Since the database is huge, it can't be downloaded through phpmyadmin.
To download it through phpMyAdmin alone, parameters like the PHP execution time limit, MySQL timeouts, and MySQL cache sizes would need to be increased in files like php.ini and the MySQL configuration file.
Instead, mysqldump can be used.
Log in to your site using SSH.
Eg:
>>ssh user@your_website.com
>>Enter password: your_password
>>mysqldump -u [uname] -p[pass] [dbname] > [backupfile.sql]
[uname] Your database username
[pass] The password for your database (note there is no space between -p and the password)
[dbname] The name of your database
[backupfile.sql] The filename for your database backup
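To get the dump onto your local machine and load it there, something like this should work; the hostname, file name, and local user/database names are placeholders:
# hostnames, file names and the local user/database are placeholders
scp user@your_website.com:backupfile.sql .
mysql -u local_user -p local_database < backupfile.sql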