CakePHP: caching with APC still creates cache files, no performance benefit

My problem:
I am running an Apache Benchmark test to see whether CakePHP's APC engine works. However, if I set up Cake's caching configuration to use the APC engine, the cache files with serialized cached data are still being created in the tmp folder, which makes me think that file caching is being used.
I also get no performance benefit: with both the APC and File engines, the test result is ~4 sec. If I hardcode plain apc_add() and apc_fetch() calls in my controller, the result improves to ~3.5 sec.
So APC is working, but Cake somehow can't use it.
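Roughly, the hardcoded test looked like this (a sketch; the key name is illustrative):
$catalogsLatest = apc_fetch('catalogs_latest');
if ($catalogsLatest === false) {
    // Miss: build the data and push it straight into APC for an hour.
    $catalogsLatest = $this->Catalog->getCatalogs('latest', 5, array('Upload'));
    apc_add('catalogs_latest', $catalogsLatest, 3600);
}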
My setup:
bootstrap.php:
/*Cache::config('default', array(
    'engine'   => 'File',
    'duration' => '+999 days',
    'prefix'   => 'file_',
));*/
Cache::config('default', array(
    'engine'   => 'Apc',
    'duration' => '+999 days',
    'prefix'   => 'apc_',
));
controller:
$catalogsLatest = Cache::read('catalogsLatest');
if (!$catalogsLatest) {
    $catalogsLatest = $this->Catalog->getCatalogs('latest', 5, array('Upload'));
    Cache::write('catalogsLatest', $catalogsLatest);
}
php.ini:
[APC]
apc.enabled = 1
apc.enable_cli = 1
apc.max_file_size = 64M
If I check Cache::settings() in the controller before or after the cache calls, I get these results:
Array
(
[engine] => Apc
[path] => E:\wamp\www\cat\app\tmp\cache\
[prefix] => apc_
[lock] => 1
[serialize] =>
[isWindows] => 1
[mask] => 436
[duration] => 86313600
[probability] => 100
[groups] => Array
(
)
)
I am using CakePHP 2.2.4.

Yes, an APC cache will certainly boost the performance of your CakePHP-powered application. Check your settings against the instructions below, then run your benchmark test again and let me know the result.
You can cache your whole rendered HTML view with the APC cache engine in CakePHP.
Cake's CacheHelper will do that job for you. Suppose you have a PostsController and you want to cache all the view files related to this controller. First of all, add the following code to your controller:
class PostsController extends AppController {
public $helpers = array('Cache');
}
And in your bootstrap.php file you have to add the CacheDispatcher filter:
Configure::write('Dispatcher.filters', array(
    'CacheDispatcher'
));
Now, back in your PostsController, tell Cake which actions to cache and for how long:
public $cacheAction = array(
'view' => 36000,
'index' => 48000
);
This will cache the view action for 10 hours and the index action for roughly 13 hours (48000 seconds).
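Putting the controller pieces together, a minimal sketch would be:
class PostsController extends AppController {
    public $helpers = array('Cache');

    // Cache the rendered views: 'view' for 10 hours, 'index' for roughly 13 hours.
    public $cacheAction = array(
        'view'  => 36000,
        'index' => 48000
    );
}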
Let me know your Apache Benchmark test result. A very similar question is being discussed in another thread, https://stackoverflow.com/a/18916692/1431786, so check it out.
Thanks.

Related

cakephp paginator extremely slow

I have a CakePHP 2.4 application, and I'm having issues with the Paginator component. First off, it's not the database; it's definitely the execution of parsing the query results. I have DebugKit installed and can see that my MySQL query for the paginated data takes a whole 2 ms. The table has 2.5 million message records, and there are 500,000 users. Obviously proper indexing is in place. But the controller action is taking 6167.82 ms. So, here's my controller action:
$this->Paginator->settings = array(
    'Message' => array(
        'fields' => array(
            'Recipient.username',
            'Recipient.profile_photo',
            'Recipient.id',
            'Message.*'
        ),
        'joins' => array(array(
            'table' => 'users',
            'alias' => 'Recipient',
            'type' => 'LEFT',
            'conditions' => array(
                'Recipient.id = `Message`.`recipient_id`'
            )
        )),
        'conditions' => array(
            'Message.sender_id' => $this->Auth->user('id'),
            'Message.deleted_by_sender' => '0'
        ),
        'limit' => 10,
        'order' => 'Message.id DESC',
        'recursive' => -1
    )
);
$sents = $this->Paginator->paginate( 'Message' );
$this->set( 'sents', $sents );
$this->view = 'index';
I've googled this and searched Stack Overflow. The majority of the responses are about poor MySQL optimization, which isn't my case. The other half of the responses suggest Containable. So, I tried Containable. Using contain was actually slower, because it tried to grab even more data from the users table than just the username, photo, and id. Then, when Cake built the array from the query results, it executed nearly 500 ms slower with Containable because of the extra user data, I'm assuming.
I'm now going to dig into the Cake Paginator component and see why it's taking so long to build the response. I'm hoping someone beats me to it and has a good solution to help speed this up.
My web server is running ubuntu 12.04 with 3gb ram, apache and mod_php with apc installed and working for the model and core cache. The database is on a separate server. I also have a redis server persisting other user data and the cake session data. There is plenty of power here to parse 10 records from a mysql query containing about a dozen rows.
EDIT: ANSWER
As suggested first by Ilie Pandia, there was something else happening, such as a callback, that was slowing down the pagination. This was actually unrelated to the Paginator component. The Recipient model had a behavior that loaded an SDK in its setup callback for a third-party service. That service was taking several seconds to respond. This happened when the linked model in the query was loaded to filter the results. Hopefully anyone else looking for reasons why Cake might be performing poorly will also look at the callbacks on models in the application and its plugins.
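For anyone else hitting this, the offending behavior looked roughly like the following (a sketch only; the class and SDK names are made up):
App::uses('ModelBehavior', 'Model');

class ThirdPartyProfileBehavior extends ModelBehavior {
    public function setup(Model $model, $settings = array()) {
        // This blocked for several seconds waiting on the remote service, and it
        // ran every time the attached model was constructed, including when it was
        // pulled in as an associated model during pagination. Lazy-load or cache
        // the SDK instead.
        $this->sdk = new SomeThirdPartySdk($settings['api_key']);
        $this->sdk->connect();
    }
}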
I see no reason for this to run slow at all.
This suggests that there is some callback installed (either in the model or the controller) that does additional processing and inflates the action time that much.
That is assuming there is nothing else in the controller but what you wrote.
You could actually measure the time of the paginate call itself, and I think you will find that it is very fast. So the bottleneck is elsewhere in the code.
PS: You could also try to disable DebugKit for a while. Introspection may take very long for some particular cases.
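To time the paginate call itself, as suggested above, a quick sketch (for debugging only) is enough:
$start = microtime(true);
$sents = $this->Paginator->paginate('Message');
// Log how long pagination alone took, to separate it from callbacks and rendering.
$this->log('Paginate took ' . round(microtime(true) - $start, 3) . ' s', 'debug');
$this->set('sents', $sents);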
Install DebugKit for your application.
And inspect which query is taking too much time. From there, you should be able to track the bottleneck.

CakePHP: Why is my cached file causing huge spikes when it expires?

I'm using Cake 2.1.3 and currently have a page that is getting hundreds of views per second, so I have utilized caching in order to handle the load better. The problem is that once the cache expires, I get a spike in my server resources as well as hundreds of MySQL connections.
I'm wondering if I'm going about this the wrong way and should be running a cron to cache the page instead of how I'm currently doing it, or if there's another technique I'm not thinking of.
here's what my function looks like in my controller:
public function index() {
    $this->layout = 'ajax';
    if (isset($this->params['url']['callback'])) {
        $callback = $this->params['url']['callback'];
    } else {
        $callback = 'callback';
    }
    $this->set('callback', $callback);

    $today = date("Y-m-d");
    $end_date = strtotime('+1 day', strtotime($today));
    $end_date = date('Y-m-d', $end_date);
    $start_date = strtotime('-1 day', strtotime($today));
    $start_date = date('Y-m-d', $start_date);

    $total = Cache::read('popular_stories', 'short');
    if (!$total) {
        $total = $this->TrackStoryView->find('all', array(
            'fields' => array('COUNT(story_id) AS theCount', 'headline', 'url'),
            'conditions' => array('date BETWEEN ? AND ?' => array($start_date, $end_date)),
            'group' => 'story_id',
            'order' => array('theCount DESC'),
            'limit' => 20,
        ));
        Cache::write('popular_stories', $total, 'short');
    }
    $this->set('story', $total);
}
Here's what my Cache config looks like in my bootstrap.php file:
Cache::config('short', array(
'engine' => 'File',
'duration' => '+60 minutes',
'path' => CACHE,
'prefix' => 'cake_short_'
));
This is what's in my view file:
<?php
echo $callback . '('.json_encode($story).')';
?>
I was hoping that once the cached file expires, as soon as the first person accessed the page it would create a new cached file and serve that up for everyone. However, because hundreds of people are hitting it per second, it seems like this method isn't working for me, and maybe I should be caching the view via a cron somehow, or maybe there's a different way to cache that I'm not utilizing.
It sounds like you have the answer more or less figured out (create the cache automatically, not triggered by a user request).
To do this, look into Cake's AppShell class; the book talks about it here. You can then link this to a cron job. If you create the file through Cache::write, Cake should be aware that it is a new cache file and read it transparently. You might want to leave the "if cache not found" block in there just in case your cron job fails.
Shells & Tasks in cake are fun and allow you to free your application from using the request/response model exclusively.
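A minimal sketch of such a shell, reusing the find and cache key from the question (the shell name is made up):
// app/Console/Command/WarmCacheShell.php
App::uses('AppShell', 'Console/Command');

class WarmCacheShell extends AppShell {
    public $uses = array('TrackStoryView');

    public function main() {
        $start_date = date('Y-m-d', strtotime('-1 day'));
        $end_date   = date('Y-m-d', strtotime('+1 day'));

        // Same query the controller runs, but executed from cron instead of a user request.
        $total = $this->TrackStoryView->find('all', array(
            'fields' => array('COUNT(story_id) AS theCount', 'headline', 'url'),
            'conditions' => array('date BETWEEN ? AND ?' => array($start_date, $end_date)),
            'group' => 'story_id',
            'order' => array('theCount DESC'),
            'limit' => 20,
        ));

        Cache::write('popular_stories', $total, 'short');
        $this->out('popular_stories cache refreshed.');
    }
}
Scheduled every 30 minutes or so (well inside the 60-minute cache duration), for example with cd /path/to/app && Console/cake warm_cache, the cache is rewritten before users ever see it expire.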
TL;DR: It's not ideal to force a user request to rebuild the cache for you. Use a cron job or a trigger on data change.
Explanation:
"hundreds of views per second" is the problem. When it expires, there are "hundreds of views" during the time it's trying to create the cache file.
The first person hits it and it starts creating the cache; in the meantime, another hundred-plus people hit it, and each one looks and can't yet find a cache file... etc.
If you can manage it, try to manually create the cache when an item is updated, or run a cron job that creates a new cache every X minutes, as opposed to having it created by a user request.
Cake has lots of cool callbacks like afterSave() that you can use to trigger this kind of thing. If that doesn't make sense in your case, though, a cron job should be fine for you.
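If rebuilding on data change fits your app, a rough afterSave() sketch (this belongs on a model whose saves are relatively infrequent; for a table written hundreds of times per second, stick with the cron approach):
public function afterSave($created) {
    // Recompute and overwrite the cached list whenever a relevant record is saved.
    $total = $this->find('all', array(
        'fields' => array('COUNT(story_id) AS theCount', 'headline', 'url'),
        'group'  => 'story_id',
        'order'  => array('theCount DESC'),
        'limit'  => 20,
    ));
    Cache::write('popular_stories', $total, 'short');
}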
I think the answer lies in working out how long this query takes:
$total = $this->TrackStoryView->find('all', array(
    'fields' => array('COUNT(story_id) AS theCount', 'headline', 'url'),
    'conditions' => array('date BETWEEN ? AND ?' => array($start_date, $end_date)),
    'group' => 'story_id',
    'order' => array('theCount DESC'),
    'limit' => 20,
));
Let's say it takes 500 ms.
You are getting 100 hits a second, so when the cache clears, the first request makes the find call, and then 50 other requests also make the find call before the first one completes.
One alternative solution:
Make the cached content never expire. Set up a cron task that overwrites the cache by calling a different action which runs:
Cache::write('popular_stories', $total, 'short');
To overwrite the cached content.
This way, the hundreds of users per second will ALWAYS read from the cache.
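Concretely, that just means giving the entry a duration far longer than the cron interval, for example:
// bootstrap.php: from the user's point of view the entry never expires;
// the cron job overwrites it long before this duration is ever reached.
Cache::config('short', array(
    'engine'   => 'File',
    'duration' => '+1 week',
    'path'     => CACHE,
    'prefix'   => 'cake_short_'
));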

Multiple image upload - Meio Upload

I am using the MeioUpload plugin found at https://github.com/jrbasso/MeioUpload with CakePHP 2.x.
I'm currently using it for single image uploads; can anyone give advice on how to handle multiple image uploads using this plugin? Currently the db table storing the images holds filename, dir, mimetype and filesize fields for each image. I want to store more than one image for each of my posts when adding a new post. Any help would be much appreciated, thanks in advance :).
As I mentioned in my comment, you might want to try https://github.com/josegonzalez/upload, as MeioUpload is now deprecated, and its developer is working on that new upload plugin I linked to.
Either way, the following info for MeioUpload holds true for the new plugin, too.
MeioUpload is built to handle one uploaded file per corresponding set of fields. I don't think the example in MeioUpload's ReadMe is ideal, as it seems to imply that you have to have a table of 'images', whereas in reality you can have a table of just about anything, where each record holds one or more uploaded files (be it images, PDFs, MP3s... anything).
So, with that in mind, you have two solutions:
1) If your posts may have a potentially unlimited number of images (i.e., not a fixed, small number), then you can keep Posts and Images in separate tables and set up a hasMany relationship between them (a sketch of this follows the example below). See http://book.cakephp.org/2.0/en/models/associations-linking-models-together.html
2) If you know that each post will only ever have a maximum of, say, 3 or 4 (or some other relatively small number of) images, then you can implement 3 (or 4, or X) sets of image fields in your Posts table / model, each handling a separate upload. They'd be named, e.g., featured_image_filename, featured_image_dir, etc.; image2_filename, image2_dir, image2_mimetype, etc.; image3_filename, image3_dir, etc.
Your $actsAs would look something like:
var $actsAs = array(
    'MeioUpload.MeioUpload' => array(
        'featured_image_filename' => array(
            'fields' => array(
                'dir' => 'featured_image_dir',
                'filesize' => 'featured_image_filesize',
                'mimetype' => 'featured_image_mimetype'
            ),
        ),
        'image2_filename' => array(
            'fields' => array(
                'dir' => 'image2_dir',
                'filesize' => 'image2_filesize',
                'mimetype' => 'image2_mimetype'
            ),
        ),
        'image3_filename' => array(
            'fields' => array(
                'dir' => 'image3_dir',
                'filesize' => 'image3_filesize',
                'mimetype' => 'image3_mimetype'
            ),
        ),
    )
);
This second solution is hardly ideal database design, but sometimes when you know there'll never be more than a few images, it's just the easiest way to do it - both in terms of developing, and in terms of an easy to use UI.
Make sense?
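For completeness, a sketch of what the first (hasMany) solution could look like, assuming an images table with the same filename/dir/filesize/mimetype columns you already have plus a post_id foreign key:
// Post hasMany Image; each Image row holds exactly one MeioUpload-managed file.
class Post extends AppModel {
    public $hasMany = array('Image');
}

class Image extends AppModel {
    public $belongsTo = array('Post');
    public $actsAs = array(
        'MeioUpload.MeioUpload' => array(
            'filename' => array(
                'fields' => array(
                    'dir'      => 'dir',
                    'filesize' => 'filesize',
                    'mimetype' => 'mimetype'
                )
            )
        )
    );
}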

CakePHP 2.x: Custom Logging

I've got a CakePHP application that receives instant payment notifications from PayPal. I'd like to log the data that gets posted by PayPal. I could easily do that using something like this:
file_put_contents(LOGS . 'ipns.log', date('Y-m-d H:i:s ') . print_r($_POST, true) . "\n", FILE_APPEND|LOCK_EX);
But I prefer to do things "the CakePHP way™" whenever possible. I've already looked through the "Core Libraries > Logging" section of CakePHP's cookbook and am having trouble understanding it. I know it's not correct to do this:
CakeLog::write('ipns', print_r($_POST, true));
Although the above does seem to work, it can also cause problems, as shown here.
So what is the CakePHP way to do this? Or should I just use the raw PHP shown at the top of this question?
What you want is explained here: http://book.cakephp.org/2.0/en/core-libraries/logging.html#creating-and-configuring-log-streams
But I would suggest you read the whole page and not just that section.
I would write the IPN to a database table field by field, not into a file log. I can tell you this based on my experience with the PayPal API. The advantages are obvious: you can, for example, look up the IPNs for an order, search for errors, and so on.
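A rough sketch of that (the PaypalIpn model and its columns are made up; pick whichever IPN fields matter to you):
// In the controller action PayPal posts to:
$this->PaypalIpn->create();
$this->PaypalIpn->save(array('PaypalIpn' => array(
    'txn_id'         => isset($_POST['txn_id']) ? $_POST['txn_id'] : null,
    'payment_status' => isset($_POST['payment_status']) ? $_POST['payment_status'] : null,
    'mc_gross'       => isset($_POST['mc_gross']) ? $_POST['mc_gross'] : null,
    'raw_post'       => json_encode($_POST), // keep the full payload as well
)));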
According to the "Writing to log" paragraph of the Logging section of the 2.x cookbook:
CakeLog does not auto-configure itself anymore. As a result log files will not be auto-created anymore if no stream is listening. Make sure you got at least one default stream set up, if you want to listen to all types and levels. Usually, you can just set the core FileLog class to output into app/tmp/logs/:
CakeLog::config('default', array(
'engine' => 'File'
));
So, in order to make CakeLog::write('ipns', print_r($_POST, true)); write to the custom file app/tmp/logs/ipns.log, in app/Config/bootstrap.php, instead of:
/**
* Configures default file logging options
*/
App::uses('CakeLog', 'Log');
CakeLog::config('debug', array(
'engine' => 'File',
'types' => array('notice', 'info', 'debug'),
'file' => 'debug',
));
CakeLog::config('error', array(
'engine' => 'File',
'types' => array('warning', 'error', 'critical', 'alert', 'emergency'),
'file' => 'error',
));
write:
/**
* Configures default file logging options
*/
App::uses('CakeLog', 'Log');
CakeLog::config('default', array(
'engine' => 'File'
));
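Alternatively, you can keep the default debug/error streams and add a dedicated stream that listens only for this one type; a sketch:
App::uses('CakeLog', 'Log');
// Route the custom 'ipns' type to app/tmp/logs/ipns.log only.
CakeLog::config('ipn', array(
    'engine' => 'File',
    'types'  => array('ipns'),
    'file'   => 'ipns'
));
With that in place, CakeLog::write('ipns', print_r($_POST, true)); lands in ipns.log and nowhere else.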

CakePHP not loading associated properties with model on production server

This is a weird one.
I have a local server on which I develop apps. A product review app I developed works flawlessly on it and utilizes Cake's associative modeling ($hasMany, $belongsTo, et al.).
After pushing this app up to a production server, it fails. Gives me an error message:
Notice (8): Undefined property: AppModel::$Product [APP/controllers/reviews_controller.php, line 46]
ReviewsController::home() - APP/controllers/reviews_controller.php, line 46
Dispatcher::_invoke() - CORE/cake/dispatcher.php, line 204
Dispatcher::dispatch() - CORE/cake/dispatcher.php, line 171
[main] - APP/webroot/index.php, line 83
I've debug()'d $this and it shows, plain as day, that, while the local server is loading the associated models, the production server is not. The databases are mirror duplicates (literally, the production server was imported from the dev db), and I can manually load models, which tells me it's connecting to the DB just fine.
What on Earth is going on?
UPDATE
The sql query from the production server is this:
SELECT `Review`.`id`, `Review`.`title`, `Review`.`product_id`, `Review`.`score`, `Review`.`submitted`, `Review`.`reviewed`, `Review`.`right`, `Review`.`wrong`, `Review`.`user_id`, `Review`.`goals`
FROM `reviews`
AS `Review`
WHERE 1 = 1
ORDER BY `Review`.`submitted` desc LIMIT 10
The sql query from the dev server is this:
SELECT `Review`.`id`, `Review`.`title`, `Review`.`product_id`, `Review`.`score`, `Review`.`submitted`, `Review`.`reviewed`, `Review`.`right`, `Review`.`wrong`, `Review`.`user_id`, `Review`.`goals`, `User`.`id`, `User`.`username`, `Product`.`id`, `Product`.`name`
FROM `reviews`
AS `Review`
LEFT JOIN `users` AS `User` ON (`Review`.`user_id` = `User`.`id`)
LEFT JOIN `products` AS `Product` ON (`Review`.`product_id` = `Product`.`id`)
WHERE 1 = 1
ORDER BY `Review`.`submitted` desc LIMIT 10
UPDATE 2
Here's some of the code the errors point to:
$title = $this->Review->Product->find( 'first', array( 'fields' => array( 'Product.name' ), 'conditions' => array( 'Product.id' => $filter ) ) );
UPDATE 3
<?php
class Review extends AppModel {
    var $name = 'Review';
    var $displayField = 'title';
    // The associations below have been created with all possible keys; those that are not needed can be removed.
    var $belongsTo = array(
        'User' => array(
            'className' => 'User',
            'foreignKey' => 'user_id',
            'conditions' => '',
            'fields' => '',
            'order' => ''
        ),
        'Product' => array(
            'className' => 'Product',
            'foreignKey' => 'product_id',
            'conditions' => '',
            'fields' => '',
            'order' => ''
        )
    );
}
?>
I had this problem, and for me it was due to a missing field in one of the database tables. I'd triple-check to make sure both DB's are exactly the same (although you said they were...): feel free to use this 7-year-old app to check them :D http://www.mysqldiff.org/
Other people with this issue talked about filename issues and that all files should be lowercased, so that may be something to check as well...
Actually, from a quick glance, it might be worth using Containable to make sure your data calls are consistent.
If you don't want to go through the hassle of adding Containable (but I would urge you to do so; it is one of my favourite features of CakePHP), you may want to set recursive in your find() call just to make sure the associated models are loaded.
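A sketch of both suggestions, matching the 1.x style used in the question:
// In the Review model: attach Containable (keep the existing $belongsTo as-is).
var $actsAs = array('Containable');

// In the controller, ask for the associated models explicitly...
$reviews = $this->Review->find('all', array(
    'contain' => array('User', 'Product'),
    'order'   => 'Review.submitted DESC',
    'limit'   => 10
));

// ...or, without Containable, force the associations for this one call:
$reviews = $this->Review->find('all', array(
    'recursive' => 1,
    'order'     => 'Review.submitted DESC',
    'limit'     => 10
));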
Do you have a way of looking at the files on the server without going through FTP? I had a problem similar to this where the timestamps were messed up on the files and the server would not update them. I had to delete the files and then re-upload them. You may have already tried this, but I just thought I would suggest the possibility. Maybe some of those files are outdated on the server.
Have a great day!
Can you pastebin the Review model, Product model, AppModel, AppController, and the controller you are getting the error from?
The line:
Notice (8): Undefined property: AppModel::$Product [APP/controllers/reviews_controller.php, line 46]
Seems to indicate the Review Model is loading the AppModel and not the file you want it to. In that case the Review model won't have a Product association.
You can print the stack trace to see; here's some code I snatched from php.net:
echo "<div>Stack<br /><table border='1'>";
$aCallstack=debug_backtrace();
echo "<thead><tr><th>file</th><th>line</th><th>function</th><th>args</th></tr></thead>";
foreach($aCallstack as $aCall)
{
if (!isset($aCall['file'])) $aCall['file'] = '[PHP Kernel]';
if (!isset($aCall['line'])) $aCall['line'] = '';
echo "<tr><td>{$aCall['file']}</td><td>{$aCall['line']}".
"</td><td>{$aCall['function']}</td><td>";
debug (($aCall['arg']));
echo "</td></tr>";
}
echo "</table></div>";
die();
It's gonna be hard looking through all that though.
