Drupal template installation issue - drupal-7

I am trying to install a Drupal template, but it shows me this error:
PHP memory limit 128M
Consider increasing your PHP memory limit to 196M to help prevent errors in the installation process. Increase the memory limit by editing the memory_limit parameter in the file /opt/users/ipg/w/i/ipg.wintergtiranacom/php53/php.ini and then restart your web server (or contact your system administrator or hosting provider for assistance). See the Drupal requirements for more information.
I have tried the ini_set method and the .htaccess method, and I can't find the /opt/users/ipg/w/i/ipg.wintergtiranacom/php53/php.ini path.
Can anyone help me, please?

Go to your php.ini file at the path mentioned in the error message (/opt/users/ipg/w/i/ipg.wintergtiranacom/php53/php.ini), search for the memory_limit directive, and increase the limit to 196M.
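If you can locate that file (a phpinfo() page shows the "Loaded Configuration File" path if the one in the error message does not exist on your host), the change is a single directive; a minimal sketch:
memory_limit = 196M
After saving, restart the web server so the new value is picked up.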

Try this:
# mysql -u admin -p<your password>
mysql> set global net_buffer_length=1000000;
Query OK, 0 rows affected (0.00 sec)
mysql> set global max_allowed_packet=1000000000;
Query OK, 0 rows affected (0.00 sec)
Then exit, restart your WAMP server, and re-install the template.

Related

How can I limit the size of the debug_kit.sqlite file in cakephp 3.x?

The debug_kit.sqlite file in the tmp directory grows with every request by approx. 1.5 MB. If I don't remember to delete it, I run out of disk space.
How can I limit its growth? I don't use the history panel, so I don't need the historic data. (Side question: why does it keep all historic requests anyway? The history panel only shows the last 10 requests, so why keep more than 10 requests in the database at all?)
I found out that DebugKit has garbage collection. However, it is not effective in reducing disk space, because SQLite needs to rebuild the database with the VACUUM command before space is actually freed. I created a PR to add vacuuming to the garbage collection: https://github.com/cakephp/debug_kit/pull/702
UPDATE: The PR has been accepted. You can solve the problem now by updating debug_kit to 3.20.3 (or higher): https://github.com/cakephp/debug_kit/releases/tag/3.20.3
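If DebugKit is installed through Composer (the usual setup for CakePHP 3.x), updating is one command; the version constraint below is just an example:
composer require --dev "cakephp/debug_kit:^3.20.3"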
Well, there is one main purpose for DebugKit: it provides a debugging toolbar and enhanced debugging tools for CakePHP applications. It lets you quickly see configuration data, log messages, SQL queries, and timing data for your application. The simple answer is: it exists just for debugging. Even though only the last 10 requests are shown, you can still query the database to get the full history for panels such as:
Cache
Environment
History
Include
Log
Packages
Mail
Request
Session
Sql Logs
Timer
Variables
Deprecations
It's safe to delete debug_kit.sqlite; it will simply be generated again. You can also disable DebugKit so the file is not recreated, or do what I did and run a cron job that deletes it every day (see the example below).
By the way, you should not enable DebugKit on staging or production. Hope this helps.
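A minimal crontab entry for that daily cleanup, assuming the application lives at /var/www/myapp (adjust the path to your installation):
0 3 * * * rm -f /var/www/myapp/tmp/debug_kit.sqlite
DebugKit recreates the file on the next request, so deleting it in development is harmless.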

Cannot export local databases using phpmyadmin

Environment:
Windows 10
XAMPP Win32 7.0.23
PHP: 7.0.23
MariaDB 10.1.28?
Magento 2.1.9
I cannot export any databases. I get that "Warning: a form on this page has more than 1000 fields" message, and then the export does not work. From what I've read, you're supposed to change the max_input_vars in php.ini. I checked my only php.ini file. The line was commented out. I removed the semicolon and increased the value - a few times. Here is what it looks like now:
; How many GET/POST/COOKIE input variables may be accepted
max_input_vars = 10000
No matter what I increased it to, I still get the same 1,000 fields message when I try to export a database. I searched the entire XAMPP directory and sub-directories, but found no other php.ini except for the one in the xampp/php directory. I would not think it needs to be higher than 10,000 for a fairly new Magento database with only a few products in it. When I imported the database into XAMPP, it was less than 2 MB in total size. I tried exporting another Magento database for the unmodified demo site, and I get the same warning and result.
Can someone help me? Thanks.
UPDATE:
I created info.php and it confirms that max_input_vars is currently 10000. I am selecting the database in phpMyAdmin and then clicking the Export button at the top. The export, in simple mode, selects all tables of the given database.
Put an info.php file into your document root with:
<?php
phpinfo();
?>
Call the file in your browser and check the value of max_input_vars.
Also specify whether you are exporting a whole database or a query result, for further advice.
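If the warning persists regardless of max_input_vars, a possible workaround is to bypass the phpMyAdmin form entirely and dump the database from the command line; the paths and database name below are only examples for a default XAMPP install on Windows:
C:\xampp\mysql\bin\mysqldump.exe -u root -p magento_db > C:\backups\magento_db.sql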

Installation of joomla 3.0 is not being finished

I was trying to work with Joomla 3.0. I have done everything that is necessary, but the page shown in the image has been displayed for a long time. The database tables are being created, but it does not move on to the next step. Can anyone help me out? TIA
With the very little information you provide, I can only tell you this:
Be sure you are selecting mysql and not mysqli, and that the username has all permissions granted on the database you provided.
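To double-check the permissions, you can grant them explicitly from a MySQL console; the database and user names here are placeholders for your own:
GRANT ALL PRIVILEGES ON joomla_db.* TO 'joomla_user'@'localhost';
FLUSH PRIVILEGES;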
In your server's home directory (which is the lowest directory for shared-hosting users), make a new file called "phprc" inside the .php folder; if the folder doesn't exist yet, create it. The period in front of the folder name means it is hidden, so make sure you can see hidden files in your FTP client, or use "ls -a" to list all files on the command line.
Add the following lines to the 'phprc' file:
Code:
max_execution_time = 3000
memory_limit = 128M
Then save it.
Normally, on a shared host, it can take a few minutes for the change to take effect, so try again after 5 or 10 minutes and it should work.
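To confirm the new values are actually loaded, assuming you have shell access, you can check what PHP reports:
php -i | grep -E "memory_limit|max_execution_time"
If you only have web access, a phpinfo() page shows the same values.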

Clearing the cakephp tmp/cache solves the issue for only one save call. What can be the reason?

I modified the schema of the MySQL database (added a new table, etc.) and cleared tmp/cache (except the directories).
Now the save into the new table happens only once (I have multiple save calls in a for loop, and all of them succeed), and it fails the next time I enter the flow.
I am using cakephp 1.3.
What else should I check ?
Got it.
The cache issue was one part of the problem, and it was fixed by clearing the files in the tmp/cache directory.
The lesson is:
If you make MySQL schema changes (add a new table or column, etc.), either clear the tmp/cache directory, or set the debug level to 3, refresh the page, and then set the debug level back to 0 (if on production).
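In CakePHP 1.3 the debug level lives in app/config/core.php; a minimal sketch of the temporary change described above:
// app/config/core.php
Configure::write('debug', 3); // forces model/schema caches to be rebuilt on the next request
// set it back to 0 for production once the caches have been refreshed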
I was also getting a save error, "MySQL server has gone away", because in the config the wait_timeout value was 600 seconds, but my script was taking longer than that, so model->save() was not working.
In my.cnf I updated the timeout to 4800, restarted MySQL, and that fixed the problem.
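The corresponding my.cnf change is a single line in the server section; 4800 is simply the value that worked here:
[mysqld]
wait_timeout = 4800
Restart MySQL after editing so the new timeout takes effect.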

Drupal website blocked because of many connection errors - website goes offline

From time to time, the number of database connections from our Drupal 6.20 system to our MySQL database reaches 100-150, and after a while the website goes offline. The error message when trying to connect to MySQL manually is "blocked because of many connection errors. Unblock with 'mysqladmin flush-hosts'". Since the database is hosted on Amazon RDS, I don't have permission to issue this command, but I can reboot the database, and once rebooted the website works normally again. Until next time.
Drupal reports multiple errors prior to going offline, of two types:
Duplicate entry '279890-0-all' for key 'PRIMARY'
query: node_access_write_grants /* Guest : node_access_write_grants */
INSERT INTO node_access (nid, realm, gid, grant_view, grant_update, grant_delete)
VALUES (279890, 'all', 0, 1, 0, 0)
in /var/www/quadplex/drupal-6.20/modules/node/node.module on line 2267.
Lock wait timeout exceeded; try restarting transaction
query: content_write_record /* Guest : content_write_record */
UPDATE content_field_rating SET vid = 503621, nid = 503621, field_rating_value = 1212
WHERE vid = 503621
in /var/www/quadplex/drupal-6.20/sites/all/modules/cck/content.module on line 1213.
The nids in these two queries are always the same and refer to two nodes that are frequently and automatically updated by a custom module. I can see a correlation between these errors and an unusually high number of web requests in the Apache logs. I would understand the website becoming slower because of this. But:
Why do these errors occur, and how can they be solved? It seems to me it's to do with several web requests trying to update the same node at the same time. But surely Drupal should deal with this by locking the tables etc? Or should I deal with it in some special way?
Despite the higher web load, why does the database completely lock and require to be rebooted? Wouldn't it be better if the website still had access to Mysql and so, once the load is lower, it can serve pages again? Is there some setting for this?
Thank you!
This can often be solved by checking one or all of these three things:
Are you out of disk space? From SSH, run df -h and make sure you still have free space.
Are the tables damaged? Repair the tables in phpMyAdmin, or see the CLI instructions here: http://dev.mysql.com/doc/refman/5.1/en/repair-table.html
Have you performance-tuned your MySQL with an /etc/my.cnf? See this for more ideas: http://drupal.org/node/51263
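A quick way to run the first two checks from a shell; the table and database names below are only examples, so repair whichever tables MySQL actually reports as crashed:
df -h
mysql -u root -p -e "REPAIR TABLE node_access;" drupaldb
Note that REPAIR TABLE only applies to MyISAM-family tables; InnoDB tables (the usual source of lock wait timeouts) cannot be repaired this way.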
