CakePHP 3: Set cache duration in Controller when writing

I am using CakePHP 3 in my project and am still learning new things as I go. Today I had a requirement to use CakePHP's cache for data that I retrieve from the database. Every time I load a page that returns data from the database, it takes around 30 to 40 seconds.
So I went ahead and configured caching in my controller, which significantly improved page loading from around 30 seconds to less than 4 seconds.
Now what I want is to set the cache duration so that the cache expires after 1 hour and picks up the new data stored in the database.
This is my code that does the caching:
if (!($custstoredata = Cache::read('custstoredata'))) {
    # Code logic
    Cache::write('custstoredata', $customers);
    $this->set('data', $customers);
} else {
    $this->set('data', Cache::read('custstoredata'));
}
After doing some research online, I found that I can use Cache::set to configure the duration, so I went ahead and added Cache::set(array('duration' => '+1 hour')); to my if statement. But when I load the page in the browser, I get this error:
Error: Call to undefined method Cake\Cache\Cache::set()
I am not sure what the right way is to set the cache duration in the controller at the time the cache is written to a file.

I think I answered my own question.
I added the cache config below to my app.php file:
'reports_seconds' => [
    'className' => 'File',
    'path' => CACHE,
    'serialize' => true,
    'duration' => '+60 seconds',
]
Once I added the config above, I modified my if statement with the code below, which fixed the problem.
# Pass the config name to both read and write: if the read uses the default
# config it looks in a different cache and always misses.
if (($custstoredata = Cache::read('custstoredata', 'reports_seconds')) === false) {
    # Code logic
    Cache::write('custstoredata', $customers, 'reports_seconds');
    $this->set('data', $customers);
} else {
    $this->set('data', $custstoredata);
}

Related

Laravel 5.2: Handling database insertions using the Laravel Queues service

Good morning!
I have the following code in my Controller:
$num_rows = $this->drupalRequestRowCount();
$uris = $this->drupalRequestPrepareUri($num_rows, config('constants.IMPORT_DATA_URI'));

$requestHandler = new RequestHandler($uris[0]);
$content = $requestHandler->httpRequest();

// Dispatch each URI for the job that will handle
// the insertion of our Drupal data.
foreach ($uris as $uri) {
    $job = (new DataJob($uri))->onQueue('data_export_insert');
    $this->dispatch($job);
}

return 'Jobs dispatched';
I have a Drupal database with data that I want to synchronize with my Laravel database every night. I make use of an API I developed myself in Drupal that returns a lot of the data I need for my Laravel database.
The API returns a maximum of 10 items per request (each containing a lot of data). The Drupal database has roughly 17,000 items that I need to import every night. I thought the Queue API might be a good solution, importing the data in batches instead of all at once. The $uris array I loop over in the foreach appends an offset to the API URL based on $num_rows.
So if $num_rows returns 17000 items, the $uris array will contain 1700 entries, like so:
[0] => 'http://mywebsite.com/api/request_data.json?offset=0'
[1] => 'http://mywebsite.com/api/request_data.json?offset=10'
[2] => 'http://mywebsite.com/api/request_data.json?offset=20' etc.
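The offset list described above can be sketched as a small self-contained function; the name prepareUris and the batch size default are assumptions, since the question only shows the call site of drupalRequestPrepareUri():

```php
<?php
// Build one URI per batch of 10 rows: offset=0, 10, 20, ...
function prepareUris($numRows, $baseUri, $batchSize = 10) {
    $uris = [];
    for ($offset = 0; $offset < $numRows; $offset += $batchSize) {
        $uris[] = $baseUri . '?offset=' . $offset;
    }
    return $uris;
}

$uris = prepareUris(17000, 'http://mywebsite.com/api/request_data.json');
// count($uris) is 1700; $uris[1] ends in ?offset=10
```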
This means that I want to have 1700 jobs dispatched that Laravel will execute for me once I run the php artisan queue:listen database command.
The following code is the Job I want to execute 1700 times.
public function __construct($uri)
{
    $this->uri = $uri;
}

public function handle()
{
    try {
        Log::info('Inserting data into database for uri: ' . $this->uri);
        $requestHandler = new RequestHandler($this->uri);
        $content = $requestHandler->httpRequest();
        foreach ($content as $key => $value) {
            $id = $value->id;
            $name = $value->name;
            DB::insert('INSERT INTO table_1 (id, name) VALUES (?, ?)', [$id, $name]);
        }
    } catch (\Exception $e) {
        Log::error('Error in job: ' . $e->getMessage());
    }
}
My problem is that the jobs don't get executed at all. They are being dispatched correctly, because I can see 1700 rows when I check the jobs table in the database. But when I run php artisan queue:listen or php artisan queue:work, nothing happens, and nothing is logged from the handle function. The failed_jobs table doesn't have any records either.
Since the handle function is never executed, there is no way I can log anything that happens inside it. I tried finding information on the internet with examples, but all I can find are mailing examples, which I don't need.
Any help would be greatly appreciated!
Fixed the issue. Make sure you declare protected $payload in your job code, with $payload being the data you want to pass to the queue handler! It has to be stored in a property on the job class so it is serialized with the job.
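For reference, a sketch of what the job class can look like once the data lives in a property. This uses the class name and constructor from the question; the exact base class and traits vary between Laravel versions, so treat the boilerplate as an assumption:

```php
<?php
namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class DataJob implements ShouldQueue
{
    use InteractsWithQueue, Queueable, SerializesModels;

    // Stored on the object, so it is serialized into the jobs table with the
    // job and is available again when the worker calls handle().
    protected $uri;

    public function __construct($uri)
    {
        $this->uri = $uri;
    }

    public function handle()
    {
        // ... fetch $this->uri and insert the rows, as in the question ...
    }
}
```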

CakePHP 1.3 clears all cached pages after adding new post

I am using CakePHP 1.3 and trying to enable caching for view pages. The cache system works fine and caches all pages. But when we add a new post (insert a new record into the database) or edit an old one (update a record in the table), CakePHP deletes all cached pages, not just the edited page!
app/config/core.php :
Cache::config('default', array('engine' => 'File','duration' => 8640000));
app/controllers/articles_controller.php :
var $helpers = array('Cache');
var $cacheAction = array(
    'view' => array('duration' => 8640000),
    'latest' => array('duration' => 8640000),
);
How can I tell Cake to delete just the cached version of changed page and not all cached pages?
This is actually pretty hard, so I can't just give you a piece of code to solve it. You would need to edit the actual Cake files in the lib folder that manage caching. Note: this is strongly discouraged by the Cake people. However, lib/Cake/Cache/Engine/FileEngine.php is the file that implements the file engine's operations. You seem interested in the delete function:
/**
 * Delete a key from the cache
 *
 * @param string $key Identifier for the data
 * @return boolean True if the value was successfully deleted, false if it didn't exist or couldn't be removed
 */
public function delete($key) {
    if ($this->_setKey($key) === false || !$this->_init) {
        return false;
    }
    $path = $this->_File->getRealPath();
    $this->_File = null;
    //@codingStandardsIgnoreStart
    return @unlink($path);
    //@codingStandardsIgnoreEnd
}
Also, instead of editing the core Cake files you could add your own file engine and reuse parts of the Cake engine by moving the code there and extending it (that's totally fine in open source).
It's also possible that by reading the code used to implement the file caching engine you will find your actual solution. Good luck.
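One avenue that may not require touching the core at all: CakePHP 1.x ships a global clearCache() helper that deletes cached view files matching a name fragment. A sketch of calling it after a save to drop only the affected pages; the filename patterns are assumptions you would verify against what actually appears in app/tmp/cache/views:

```php
// In app/controllers/articles_controller.php (CakePHP 1.3).
// Cached view files are named after the request URL with slashes turned into
// underscores, e.g. articles_view_123.php, so a targeted delete is possible.
function clearArticleCache($id) {
    clearCache('articles_view_' . $id); // only this article's cached page
    clearCache('articles_latest');      // plus the cached listing
}
```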

How to determine a Wordpress version remotely?

I have numerous sites and it's becoming a nuisance keeping them all up to date, so I would ideally like to compile a list that displays the version of each website automatically, so I can see at the drop of a hat which ones need updating and so on.
I have remote access to all of their databases. I had thought about querying the wp_options table for the DB version, but as far as I am aware that isn't specific enough when it comes to smaller point releases.
Any thoughts?
Here's a demo plugin:
<?php
/** Plugin Name: My JSON data **/

add_filter( 'query_vars', function( $qv ){
    $qv[] = 'mydata';
    return $qv;
});

add_action( 'template_redirect', function(){
    $input  = get_query_var( 'mydata' );
    $secret = 'abcdefg'; // Edit this
    if( ! empty( $input ) )
    {
        if( $secret === $input )
        {
            $data = array(
                'version' => $GLOBALS['wp_version'],
                'foo'     => 'bar',
            );
            wp_send_json_success( $data );
        }
        else
        {
            wp_send_json_error();
        }
    }
} );
where example.com/?mydata=abcdefg gives
{"success":true,"data":{"version":"3.8.1","foo":"bar"}}
and example.com/?mydata=wrong shows:
{"success":false}
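The aggregating side of this could then be a plain PHP script that polls each site and prints the versions. A sketch, assuming every site runs the plugin above with the same secret; the site list and secret here are placeholders:

```php
<?php
$secret = 'abcdefg';
$sites  = ['http://example.com', 'http://example.org'];

foreach ($sites as $site) {
    // Hit the plugin's endpoint; @ suppresses warnings for unreachable hosts.
    $json     = @file_get_contents($site . '/?mydata=' . urlencode($secret));
    $response = $json ? json_decode($json, true) : null;

    if (!empty($response['success'])) {
        echo $site . ': WordPress ' . $response['data']['version'] . PHP_EOL;
    } else {
        echo $site . ': unreachable or wrong secret' . PHP_EOL;
    }
}
```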
I wouldn't recommend trying to bridge a system to check WordPress, especially since WordPress core has shipped with this functionality since 3.7.1.
WordPress 3.7.1+ auto-updates, so it would be best to upgrade all your WordPress sites - this would also be a great idea for security purposes.
What you might want to consider is removing any redundant plugins and having a plan for updating the remaining ones every few months too.
Third-party plugins are usually the reason a site is vulnerable, more so than the core of WordPress. Fight the fire before it starts in the first place! Use fewer plugins, or keep on top of the ones you have.

Cakephp keep session alive for long durations

I'm building an app called Trackosaur which tracks time spent on things you do. I'm using Cake 2 + jQuery 1.8 for this. The issue I'm facing is related to sessions timing out. I could adjust the session timeout through php.ini, but I need to keep a session alive for really long durations (10+ hours). So I set up an AJAX call to a trivial function in my UserController which just calls session_start().
JS
function keepAlive()
{
    $.ajax({
        type: 'get',
        url: '/users/keepalive'
    }).done(function(data){});
}
CAKE
public function keepalive()
{
    session_start();
}
The AJAX call is made every 10 minutes. I'm not really sure if this is a good way to keep the session alive. Is there a better way I could do this using something in Cake itself, as opposed to using session_start()?
Many thanks for your time :)
You can change the session timeout value in your core config file.
In CakePHP 1.3 it's easy: just find this line and change it to your value (36000 for 10 hours).
app/config/core.php
/**
* Session time out time (in seconds).
* Actual value depends on 'Security.level' setting.
*/
Configure::write('Session.timeout', '120');
In CakePHP 2, find this line and read the comment block above it for an explanation of how to configure the session time. I have not had to do this myself, but I think it should look like this (note that in CakePHP 2 the timeout is given in minutes, and the key inside the array is just 'timeout'):
Configure::write('Session', array(
    'defaults' => 'php',
    'timeout' => 600 // minutes, i.e. 10 hours
));

CakePHP: Reporting Failed Downloads with the Media View

I'm using CakePHP's Media view to force file downloads. My code is pretty much exactly like the example provided in the cookbook, which I'll paste here for your convenience:
<?php
class ExampleController extends AppController {
    public function download () {
        $this->viewClass = 'Media';
        // Download app/outside_webroot_dir/example.zip
        $params = array(
            'id'        => 'example.zip',
            'name'      => 'example',
            'download'  => true,
            'extension' => 'zip',
            'path'      => APP . 'outside_webroot_dir' . DS
        );
        $this->set($params);
    }
}
In the database, I have a field that keeps track of how many times the file was downloaded. I'm looking for a way to make sure that this number is as accurate as possible, so if a user's download gets cancelled or times out, the number does not increment. Is there some way for CakePHP's Media view to report that the download was, indeed, successful?
Detecting when a file has finished downloading is no easy task. This is something that would be done on the client side with javascript, but browsers do not give you any hooks for that.
There is a pretty clever solution here (setting a cookie and then looking for it with javascript), but it only tells you when the download has started.
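On the server side you can at least get close: instead of the Media view, stream the file manually and check connection_aborted() after the last byte. This only proves the server handed every byte to the client (buffers between you and the user can still lose the tail), so it is an approximation, not proof of a completed download. A sketch; the path and the counter update are placeholders:

```php
<?php
$path = APP . 'outside_webroot_dir' . DS . 'example.zip';

ignore_user_abort(true); // keep the script running after a client disconnect

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="example.zip"');
header('Content-Length: ' . filesize($path));

$fh = fopen($path, 'rb');
while (!feof($fh)) {
    echo fread($fh, 8192);
    flush();
    if (connection_aborted()) {
        break; // client cancelled mid-transfer
    }
}
fclose($fh);

if (!connection_aborted()) {
    // All bytes were sent; only now increment the download counter,
    // e.g. with a hypothetical $this->Download->incrementCount($id);
}
```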
