I know that it's good to minify assets because doing so reduces their file size, and that it's good to combine assets because doing so reduces the number of HTTP requests; both cut the time it takes for the page to load. This matters because there are still people on dial-up, and mobile devices often don't have a fast connection.
The thing I'm struggling with is how to easily add asset minification and combining into my workflow. I develop locally using CakePHP and I use Git for version control. When it's time to go live, I ssh into the server hosting the live site and merge in the latest commit.
Here's how I would go about rolling my own solution (only accounts for minification and is not tested!):
1.) My development environment's "app/Config/core.php" file would always have its "debug" level set to a value greater than 0 and the production environment's would always be at 0.
2.) On the file system, all CSS and JavaScript would be stored in external files, like so:
app/webroot/css/used-site-wide.css
app/webroot/css/used-on-a-few-pages.css
app/webroot/css/used-on-one-page.css
app/webroot/js/used-site-wide.js
app/webroot/js/used-on-a-few-pages.js
app/webroot/js/used-on-one-page.js
3.) Rather than using echo $this->Html->script(array('used-on-a-few-pages', 'used-on-one-page'), array('inline' => false)); in the view file, I would use this:
Configure::write('external_js', array('used-on-a-few-pages'));
Configure::write('inline_js', array('used-on-one-page'));
4.) Rather than using echo $this->fetch('script'); in the layout file, I would use this:
if (Configure::read('external_js') !== null) {
    $external_js = Configure::read('external_js');
    if (Configure::read('debug') == 0) {
        foreach ($external_js as &$external_js_filename) {
            $external_js_filename .= '-min';
        }
        unset($external_js_filename); // break the reference left by the foreach
    }
    echo $this->Html->script($external_js);
}
if (Configure::read('inline_js') !== null) {
    $inline_js = Configure::read('inline_js');
    if (Configure::read('debug') == 0) {
        foreach ($inline_js as &$inline_js_filename) {
            $inline_js_filename .= '-min';
        }
        unset($inline_js_filename); // break the reference before reusing the variable below
    }
    echo "\n<script type=\"text/javascript\">\n\t/* <![CDATA[ */";
    foreach ($inline_js as $inline_js_filename) {
        echo file_get_contents(JS . $inline_js_filename . '.js'); // read each file, not the whole array
    }
    echo "\n\t/* ]]> */\n</script>";
}
5.) Finally, I would set up Git to create the minified assets whenever a commit is made (a rough sketch of such a hook is below).
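Here's roughly what I imagine that hook looking like (untested, like the rest of this; YUI Compressor is just an example minifier, and the jar path is a placeholder). It would live at .git/hooks/pre-commit and be made executable:

#!/usr/bin/env php
<?php
// .git/hooks/pre-commit -- regenerate the *-min.* files before each commit.
$dirs = array('app/webroot/js' => 'js', 'app/webroot/css' => 'css');
foreach ($dirs as $dir => $ext) {
    foreach (glob("$dir/*.$ext") as $file) {
        if (substr($file, -strlen("-min.$ext")) === "-min.$ext") {
            continue; // don't re-minify generated output
        }
        $min = substr($file, 0, -strlen(".$ext")) . "-min.$ext";
        exec('java -jar yuicompressor.jar ' . escapeshellarg($file)
            . ' -o ' . escapeshellarg($min));
        exec('git add ' . escapeshellarg($min)); // stage the freshly built file
    }
}

Git hooks aren't shared through clones, so each developer (and the server, if it builds there) would need a copy.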
Using this setup, I would be working with the unminified assets in development and the minified ones in production. The thing is, I don't want to re-invent the wheel if I don't have to. I believe that re-inventing the wheel should only be done if you're solving a problem that is both significant and uncommon.
How do you all handle this?
Thanks!
If you're after something simpler than Mark Story's AssetCompress plugin, check this out:
https://github.com/joshuapaling/CakePHP-Combinator-Plugin
It will combine and minify JS and CSS files, and you can easily make it combine/minify only when debug mode is 0 (there's an example of that in the .markdown file on GitHub). It uses the modification dates of the included JS/CSS files to decide when it needs to build a new cached file.
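That "date modified" check is just an mtime comparison. Here's a generic sketch of the idea in plain PHP (this is not the plugin's actual code, and the file names are made up):

$sources = array('js/used-site-wide.js', 'js/used-on-a-few-pages.js');
$cached  = 'js/cache/combined.js';
$stale   = !file_exists($cached);
foreach ($sources as $src) {
    if (!$stale && filemtime($src) > filemtime($cached)) {
        $stale = true; // a source changed since the bundle was built
    }
}
if ($stale) {
    // rebuild the bundle by concatenating the sources (minification omitted here)
    file_put_contents($cached, implode("\n", array_map('file_get_contents', $sources)));
}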
It isn't nearly as full-featured as Mark Story's plugin, but it is simple, does the job, and it should only take you 10 or 15 mins to set up.
I've recently stumbled upon this situation myself. I tried the (now deprecated) Combinator plugin, which uses JSMin and isn't very reliable. Then I tried Mark Story's plugin, which is way too complicated and doesn't even do the builds automagically when files change; you have to run cake bake every time you wish to create the combined and minified files.
So I wrote my own simple helper, which you can check out here: https://github.com/Highstrike/cakephp-compressor. Everything is explained in the readme file and it's very easy to use, taking advantage of Google Closure for JS minification. It also handles HTML minification.
It's even easier than maurymmarques's solution because it's just a helper file, no controller or configuration needed.
I'm hoping it will help someone.
You can also check out this plugin: https://github.com/maurymmarques/minify-cakephp
It's very easy to install and configure.
Related
We are looking to speed up our website; more specifically, we are looking to lower TTFB. Our website consists mainly of pages that are dynamically generated based on the URL path (a subject is extracted) and on parameters in the URL path.
These values are fed into an SQL query, executed from PHP, that pulls the right data from our database.
Here is the issue:
These queries work perfectly to generate the pages and all the information associated with them (e.g. tags).
However, rerunning the code every time a visitor goes to a page takes a significant amount of time, resulting in a high TTFB/server response time. In essence, these pages only need to be updated via the SQL queries once a month; in between, it should be possible to serve them as pregenerated static HTML pages (until we trigger a refresh). We are currently using Cloudflare as a CDN, which has already been great at speeding up the website. Yet even with the "Cache Everything" page rule turned on, we can see that the PHP code, including the SQL queries, is still rerun.
The question:
Does anybody know a good way to accomplish this goal of caching the dynamic part of the website, whether with Cloudflare or some other way? I know that Akamai offers this service, but evidently there is some switching cost associated with moving the website to another CDN, and we are rather satisfied with Cloudflare.
Thanks in advance!
If your website has hundreds of pages with many visitors every day, you might want to implement some sort of caching mechanism to speed up page loading. Each client-server request triggers multiple database queries and server-side processing, all of which add to the overall page load time. The most common solution is to make copies of dynamic pages, called cache files, store them in a separate directory, and later serve them as static pages instead of regenerating the dynamic pages again and again.
Understanding Dynamic Pages & Cache Files
Cache files are static copies generated from dynamic pages. Each file is generated once and stored in a separate folder until it expires; when a user requests the content, the static file is served instead of a freshly generated page, bypassing the need to rebuild the HTML and query the database over and over with server-side code. For example, running several database queries and processing PHP code into HTML output takes a certain amount of time, increasing the overall load time of a dynamic page, but a cached file consists of plain HTML (you can open it in any text editor or browser), so it requires virtually no processing time at all.
Dynamic page: the picture below shows how a dynamic page is generated. As the name says, it's completely dynamic: it talks to the database and generates HTML output according to the variables the user provides with the request. For example, a user might want to list all the books by a particular author; the page does that by sending queries to the database and generating fresh HTML content. Each request takes a few seconds to process and uses a certain amount of server memory, which is not a big deal if the website receives very few visitors. But with hundreds of visitors requesting and generating dynamic pages over and over again, the load increases considerably, resulting in delayed output and HTTP errors in the client's browser.
[Figure: dynamic page example]
Cached file: the picture below illustrates how cached files are served instead of dynamic pages. As explained above, cached files are nothing but static web pages: they contain plain HTML, and the only way their content changes is if the developer manually edits the file. Since cached files require neither database connectivity nor processing time, they are an ideal way to consistently reduce server load and page loading time.
[Figure: cached file example]
PHP Caching
There are several ways to cache dynamic pages with PHP, but the most common method combines two features, the PHP output buffer and the filesystem functions, into one caching system.
PHP output buffer: it improves performance and shortens download time because output is sent to the browser not in pieces but as one whole HTML page held in a variable. The method is very simple; take a look at the code below:
<?php
ob_start(); // start the output buffer
/* the content */
ob_get_contents(); // get the contents of the output buffer
ob_end_flush(); // Send the output and turn off output buffering
?>
When you call ob_start() on the top of the code, it turns output buffering on, which means anything after this will be stored in the buffer, instead of outputting on the browser. The content in the buffer can be retrieved using ob_get_contents(). You should call ob_end_flush() at the end of the code to send the output to the browser and turn buffering off.
PHP filesystem: you may be familiar with PHP's filesystem functions; they are part of the PHP core and let us read and write files. Have a look at the following code.
$fp = fopen('/path/to/file.txt', 'w'); //open file for writing
fwrite($fp, 'I want to write this'); //write
fclose($fp); //Close file pointer
As you can see, fopen() on the first line opens the file for writing; the mode 'w' places the file pointer at the beginning of the file, and if the file does not exist, fopen() attempts to create it. On the second line, fwrite() writes the string to the opened file, and finally fclose() closes the file.
Implementing PHP caching
Now that the output buffer and the filesystem functions should be clear, we can combine the two to build our PHP caching system. Please have a look at the picture below; the flowchart gives the basic idea of how the cache system works.
[Figure: PHP cache system flowchart]
The cycle starts when a user requests content: we check whether a cached copy exists for the requested page. If it doesn't, we generate the page, save a cache copy, and output the result. If the cache already exists, we simply fetch the file and send it to the user's browser.
Take a look at the full PHP cache code below; you can copy and paste it into your PHP projects, and it should work flawlessly as depicted in the flowchart above. You can play with the settings in the code: the cache expiry time, the cache file extension, the ignored pages, and so on.
<?php
//settings
$cache_ext = '.html'; //file extension
$cache_time = 3600; //Cache file expires after this many seconds (1 hour = 3600 sec)
$cache_folder = 'cache/'; //folder to store Cache files
$ignore_pages = array('', '');
$dynamic_url = 'http://'.$_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI']; // requested dynamic page (full url; REQUEST_URI already includes the query string)
$cache_file = $cache_folder.md5($dynamic_url).$cache_ext; // construct the cache file name
$ignore = in_array($dynamic_url, $ignore_pages); //check if url is in ignore list
if (!$ignore && file_exists($cache_file) && time() - $cache_time < filemtime($cache_file)) { //check Cache exist and it's not expired.
ob_start('ob_gzhandler'); //Turn on output buffering, "ob_gzhandler" for the compressed page with gzip.
readfile($cache_file); //read Cache file
echo '<!-- cached page - '.date('l jS \of F Y h:i:s A', filemtime($cache_file)).', Page : '.$dynamic_url.' -->';
ob_end_flush(); //Flush and turn off output buffering
exit(); //no need to proceed further, exit the flow.
}
//Turn on output buffering with gzip compression.
ob_start('ob_gzhandler');
######## Your Website Content Starts Below #########
?>
<!DOCTYPE html>
<html>
<head>
<title>Page to Cache</title>
</head>
<body>
Lorem ipsum dolor sit amet, consectetur adipiscing elit. Integer ut tellus libero.
</body>
</html>
<?php
######## Your Website Content Ends here #########
if (!is_dir($cache_folder)) { //create a new folder if we need to
mkdir($cache_folder);
}
if(!$ignore){
$fp = fopen($cache_file, 'w'); //open file for writing
fwrite($fp, ob_get_contents()); //write contents of the output buffer in Cache file
fclose($fp); //Close file pointer
}
ob_end_flush(); //Flush and turn off output buffering
?>
You must place your page's PHP content between the marked comment lines. In fact, I'd suggest moving the two halves into separate header and footer files, so the same code can generate and serve cache files for all your dynamic pages. If you read the comments in the code carefully, you should find it pretty much self-explanatory.
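For example, if everything above the "Content Starts" marker lived in a cache_header.php and everything below the "Content Ends" marker in a cache_footer.php (the file names are just an illustration), each dynamic page would reduce to:

<?php require 'cache_header.php'; // serves the cached copy and exits, or turns on buffering ?>
<!DOCTYPE html>
<html>
<head><title>Page to Cache</title></head>
<body>...the page's dynamic content...</body>
</html>
<?php require 'cache_footer.php'; // writes the buffer to the cache file, then flushes it ?>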
You can do this at the edge with Cloudflare for better performance. The nice thing is that you can switch the domain into Development Mode at any time to see code changes, without having to change a server-side mechanism.
Your Page Rule would look something like the one below. Note the URL variable with a wildcard: any URL containing that variable name will be cached at the edge. You should easily see TTFB plus the entire HTML download come in under 50ms.
Also note the Expiration. You said a month, so I chose that setting, but I'd probably make it "1 Day", just so I can keep an eye on things.
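As a rough sketch, the rule would read something like this (the hostname and query parameter are hypothetical; the field names are from the Page Rules screen):

URL pattern:    *example.com/*subject=*
Cache Level:    Cache Everything
Edge Cache TTL: a month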
I am using the Drupal 7 Migrate module to create a series of nodes from JPG and EPS files. I can get them to import just fine, but I notice that when the import is done, none of the attached filefield and thumbnail files on the resulting nodes contain filename information.
Upon inspecting the file_managed table I see that both the filename and filemime fields are empty for ONLY the files that I attached via the migrate module. This also creates an issue with downloading the files.
Now I think the problem has to do with the fact that I am using "file_link" instead of "file_copy" as the file operation I specify. The problem is I am importing around 2TB (that's terabytes) of image files. We had to put in a special request with Rackspace just to get access to that much disk space on our server, so I can't go around copying files from one directory to the next because of space issues; "file_link" seems like the obvious choice.
Now you probably want to see how I am doing this exactly, so here is the code snippet:
$jpg_arguments = MigrateFileFieldHandler::arguments(NULL,
'file_link', FILE_EXISTS_RENAME, 'en', array('source_field' => 'jpg_name'),
array('source_field' => 'jpg_filename'), array('source_field' => 'jpg_filename'));
$this->addFieldMapping('field_image', 'jpg_uri')
->arguments($jpg_arguments);
As you can see I am specifying no base path (just like the beer.inc example file does). I have set file_link, the language, and the source fields for the description, title, and alt.
It is able to generate thumbnails from the JPGs, but those columns of data are still missing from the db table. I traced through the functions as best I could, but I don't see what is causing this. I tried running the URI from the table through the functions that generate the filename and the filemime, and they output just fine. It is like something is removing just those segments of data.
Does anyone have any idea what this could be? I am using the Drupal 7 Migrate module version 2.2. It is running on Drupal 7.8.
Thanks,
Patrick
Ok, so I have found the answer to yet another question of mine. This is actually an issue with the migrate module itself. The issue is documented here. I will be retracting this bounty (as soon as I figure out how).
I'm including a number of images as "Content" in my deployed XAP for Mango.
I'd like to enumerate these at runtime - is there any way to do this?
I've tried enumerating resources like:
foreach (string key in Application.Current.Resources.Keys)
{
Debug.WriteLine("Resource:" + key);
}
But the images aren't included in the list. I've also tried using embedded resources instead - but that didn't help. I can read the streams using Application.GetResourceStream(uri) but obviously I need to know the names in order to do this.
There is no API baked into WP7 that allows you to enumerate the contents of the XAP. You need to know the names of the content items before you can retrieve them.
There is probably some code floating around somewhere that can sniff out the ZIP catalog in the XAP, but I would strongly recommend that you don't bother. Instead, include some sensible resource, such as an XML file or a ResourceDictionary, that lists them.
Having found no practical way to read the Content files from a XAP, I build such a list at design time using T4.
See an example at https://github.com/mrlacey/phonegap-wp7/blob/master/WP7Gap/WP7Gap/MainPage.xaml.cs
This seems the right way to go as:
a) I'd rather build the list once at design time rather than on every phone which needs the code.
and
b) I shouldn't ever be building the XAP without being certain about what files I'm including anyway.
Plus it's a manual step to set the build action on all such files so adding a manual step to "Run Custom Tool" once for each build isn't an issue for me.
There is no way to enumerate the files set as "Content".
However, there is a way to enumerate files at runtime, if you set your files as "Embedded Resource".
Here is how you can do this:
1) Set the Build Action of your images to "Embedded Resource".
2) Use Assembly.GetCallingAssembly().GetManifestResourceNames() to enumerate the resource names.
3) Use Assembly.GetCallingAssembly().GetManifestResourceStream(resName) to get the file streams.
Here is the code:
// Requires: using System.IO; using System.Reflection;
public void Test()
{
    // Open a stream for every embedded resource in this assembly.
    foreach (String resName in GetResourcesNames())
    {
        Stream s = GetStreamFromEmbeddedResource(resName);
    }
}

string[] GetResourcesNames()
{
    // Lists the names of all files compiled in with Build Action "Embedded Resource".
    return Assembly.GetCallingAssembly().GetManifestResourceNames();
}

Stream GetStreamFromEmbeddedResource(string resName)
{
    // Opens a read-only stream over a single embedded resource.
    return Assembly.GetCallingAssembly().GetManifestResourceStream(resName);
}
EDIT: As quetzalcoatl noted, the drawback of this solution is that the images are embedded in the DLL, so if you have a high volume of images, the app load time might take a hit.
I would like to translate my ExtJS application in different languages. My issue is that I'm using ExtJS MVC framework, and most of my JS files are downloaded dynamically by the framework itself.
The ideal solution (that I thought of) would be to have an extra option in the Ext.Loader (or in my Ext.app.Application) that would define the language to use, and depending on this to automatically download such file as "a.MyClass.fr.js" after loading my "a.MyClass.js" (which would contain an Ext.apply, overriding my string resources). That's probably not available in the ExtJS framework at the moment.
The alternative solution I can see is to perform a trick on the server side. First, a cookie would be created on the client, set to the language. On the server side, I could catch all requests for JS files; if the cookie is set (to 'fr', for example), I'd combine the requested JS file (MyClass.js) with its i18n friend (MyClass.fr.js) on the fly and return the result. That would work, but it's really tricky because it drags in other concerns (caching...).
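To make the idea concrete, the server-side part could be a small PHP front controller that JS requests get rewritten to. This is only a sketch (the js.php name, file parameter, cookie name, and directory layout are all hypothetical), and it leaves out the cache headers you'd need in practice:

<?php
// js.php?file=MyClass -- serves js/MyClass.js, appending js/MyClass.<lang>.js
// when a language cookie is present.
$base = isset($_GET['file']) ? $_GET['file'] : '';
$base = preg_replace('/[^A-Za-z0-9_.-]/', '', $base); // crude whitelist against path tricks
$lang = isset($_COOKIE['lang']) ? preg_replace('/[^a-z]/', '', $_COOKIE['lang']) : null;

header('Content-Type: application/javascript');
readfile("js/$base.js"); // the class itself
if ($lang !== null && file_exists("js/$base.$lang.js")) {
    echo "\n";
    readfile("js/$base.$lang.js"); // the Ext.apply override with the translated strings
}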
Maybe the best way is to implement the first behavior I described in the ExtJS framework myself...
What do you think? I'm looking for a really clean and neat way of doing it! Thanks :)
I recently struggled with the same problem.
Finding a clean way to do this was quite a challenge; most alternatives boiled down to one of the following:
1) Duplicate your code base per locale (WTH)
2) Download localized files overriding each of your components (Maintenance hell? What about the poor translators?)
3) Use/generate a static file containing translations and refer to it (All languages are downloaded? Extra build step to generate it? How do you keep them in synch?)
I tried to get the best of all worlds and ended up with a utility class responsible for:
1) Loading the ExtJS translation files (which basically apply overrides to extjs base components)
2) Loading a locale-specific property resource bundle (specifying which locale to load) from the server.
3) Prototyping String with a translate() method which queries the loaded store (containing the message bundle from the server) and returns the translation based on the value of the string.
This is the gist of things:
Bundle & prototyping:
localeStore.load({
    callback: function(records, operation, success) {
        // Define translation function (NB! Must be defined before any components which want to use it.)
        function translate() {
            var record = localeStore.getById(this.valueOf());
            if (record === null) {
                alert('Missing translation for: ' + this.valueOf()); // key is not found in the corresponding messages_<locale>.properties file
                return this.valueOf(); // return key name as placeholder
            }
            return record.get('value');
        }
        String.prototype.translate = translate;
        callback.call(); // call back to caller (app.js / Ext.Application), loading rest of application
    }
});
As an example from a view:
this.copyButton = Ext.create('Ext.button.Button', {
    disabled: true,
    text: 'DOCUMENT_LIBRARY_MENU_COPYTO_BUTTON'.translate(),
    action: 'openCopyDialog'
});
Bundle on the server (messages_en.properties):
DOCUMENT_LIBRARY_MENU_COPYTO_BUTTON=Copy file
etc..
Pros:
No-fuss code: 'Your_key'.translate() is easy to read and makes it obvious that this is a localized string
None/little maintenance overhead (Keeping an override file for each locale? Jesus..)
You only load the locale you need - not the whole shebang.
If you really want to, you could even have your own translation for the ExtJS locale files in the same bundle.
You could write unit tests to ensure that all bundles contain the same keys, thus avoiding orphaned translations later
Cons:
Synchronous - the store must be loaded before your main app starts. I solved this by adding a callback from the utility class which was called once all texts were loaded.
No real-time population of texts.. though I didn't want to make my users overload the server either :P
So far my approach has worked out pretty well for my requirements.
Site load isn't noticeably slower and the bundles (containing ~200 keys/values per bundle) measure out at ~10kb during load.
There is currently no solution, so I decided to create my own hack/addon on top of Ext.Loader. I uploaded the code to GitHub: https://github.com/TigrouMeow/extjs-locale-loader. It's exactly what I needed and I really hope it will help others as well!
You should first complete your development phase and build your project (or use the ext-all.js file), and then translate (i18n) your UI.
see: http://docs.sencha.com/ext-js/4-0/#!/example/locale/multi-lang.html
The appropriate language modifier script (/ext/locale/ext-lang-xxx.js) needs to be loaded after Ext is loaded (including dynamically loaded classes). In the example above, I would probably have used Ext.Loader.loadScriptFile, but they eval a downloaded file directly. The only other thing is that your classes need to be built in different languages, or you just use variables and reference the language-specific variable file.
You could also use a variable in the Loader paths:
var lang = 'fr';
Ext.Loader.setConfig({
    paths: {
        'Ext': '.',
        'My': './src/my_own_folder/' + lang
    }
});
Looking to figure out how to measure the total PHP execution time of a CakePHP site. It looks like in 1.2 this was included in the rendered HTML as a HTML comment when in debug mode, but this is not happening on my 1.3 site, and in any case I want it as an element I can output to the user, not a comment.
I can do this easily in regular PHP using microtime() but I'm not sure where to add the code in CakePHP, and I suspect it might have a more robust execution timer anyway. Ideas?
Just in case anyone else is curious, I solved this by adding the following code to my layout.ctp. You could also do this in the controller and pass it in as a variable, which might be a little more classic MVC-friendly, but I wanted this on every page of the site without duplicating code in each controller.
Page rendered in <?php echo round((getMicroTime() - $_SERVER['REQUEST_TIME']) * 1000) ?>ms.
Use Debug Kit. Among other functionality, you can grab the total execution time via
DebugKitDebugger::requestTime()
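For example, you could echo that in your layout like this (assuming DebugKit is installed and loaded, and that requestTime() returns seconds):

Page rendered in <?php echo round(DebugKitDebugger::requestTime() * 1000); ?>ms.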
This may not be the "right" way to do it, but you can add the following PHP code back into app/webroot/index.php (at the very end). Then if you have debug on > 0, you'll get the old 1.2 functionality back.
if (Configure::read() > 0) {
echo "<!-- " . round(getMicrotime() - $TIME_START, 4) . "s -->";
}