I'm new to cakephp and I found this in the manual:
It’s a well known fact that serving assets through PHP is guaranteed
to be slower than serving those assets without invoking PHP. And while
the core team has taken steps to make plugin and theme asset serving
as fast as possible, there may be situations where more performance is
required. In these situations it’s recommended that you either symlink
or copy out plugin/theme assets to directories in app/webroot with
paths matching those used by CakePHP.
app/Plugin/DebugKit/webroot/js/my_file.js
becomes app/webroot/debug_kit/js/my_file.js
app/View/Themed/Navy/webroot/css/navy.css
becomes app/webroot/theme/Navy/css/navy.css
Are files in plugin/webroot/asset required to be read by PHP and then inserted into the HTML, rather than served directly by the server itself, because a plugin's webroot isn't really a directory the HTTP module can access?
The manual says soft links will speed this process up. Does CakePHP first look in /app/webroot/asset and then call the dispatcher to find the file in plugin/webroot/asset, read it and serve it?
Or is the process identical in how the file is found and read, except that Cake must use the dispatcher to locate the asset if it is not in the app/webroot/asset location?
For serving files...
Webservers are fastest
The default rewrite rules are as follows:
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^ index.php [L]
That means: if the request is for a file that the webserver can see, don't talk to PHP at all; just respond with the file's contents (or the appropriate headers). In this circumstance there is no "Does CakePHP first look in /app/webroot/asset ...", as there is no CakePHP - or PHP - involved in handling the request at all.
So, in brief that's:
Request
-> webserver
-> check if file exists
-> response (file contents)
If a different webserver (not apache) is being used, CakePHP expects equivalent rewrite rules. It will never check if the equivalent of app/webroot/<the current url> exists - as the webserver should be doing that itself.
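For instance, with nginx the equivalent rule is commonly written along these lines (a sketch; the location and front controller path are illustrative):

```nginx
# Sketch: serve the file from disk if it exists; otherwise hand off to PHP
location / {
    try_files $uri /index.php?$args;
}
```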
PHP is slow
If the request is for a file that does not exist in the webroot, things are much slower, quite simply because there are more processes involved. Even a PHP script like this:
<?php
// example app/webroot/index.php
$path = 'server/this/file.html';
echo file_get_contents($path);
exit;
is slower than an equivalent request handled directly by a webserver, as that's:
Request
-> webserver
-> check if file exists
-> invoke php
-> get file contents
-> respond to webserver
-> response
Plus, PHP wasn't specifically designed for serving files (as a webserver is, or should be), so it is inherently slower at doing so than a webserver alone.
CakePHP is slower
The only path that is directly web-accessible in a CakePHP project is `app/webroot`.
For a request handled by CakePHP, even using the Asset dispatch filter (which is a slimmed-down dispatch process), there is obviously more logic involved, so it's slower than the bare minimum logic required to serve a file with PHP. In brief, the request becomes:
Request
-> webserver
-> check if file exists
-> invoke php
-> Bootstrap CakePHP
-> Dispatch Request
-> Check Dispatch filters
-> check if request matches a configured plugin/theme file path
-> check if file exists
-> generate response
-> output response
-> respond to webserver
-> response
The difference in performance compared to letting the webserver handle the request for a static file can be very significant.
Conclusion
Serving files with PHP when it's not necessary is a waste of resources. If at all possible, let the response come from higher up the request chain: the webserver, a proxy, or preferably the user's own browser cache (~0 latency!).
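As a concrete illustration of the manual's recommendation quoted in the question, symlinking plugin assets out into the webroot could look like this (a sketch; the paths assume a stock CakePHP 2.x layout and the commands are run from the project root):

```shell
# Illustrative layout, then a symlink so the webserver serves the plugin
# assets directly, without invoking PHP at all.
mkdir -p app/Plugin/DebugKit/webroot/js app/webroot
ln -s ../Plugin/DebugKit/webroot app/webroot/debug_kit
# app/webroot/debug_kit/js/my_file.js now resolves as a plain static file.
```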
Related
I am running a WordPress SSL website which requires one tab to be non-SSL, because that tab runs a non-secure websocket and would otherwise be rejected due to mixed content.
To that end I am doing a redirect in the .htaccess file:
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_URI} ^/(stage/)$
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^(.*) http://%{HTTP_HOST}/$1 [R=301,L]
which indeed makes this specific tab insecure; however, it also results in the following two issues:
1.) After the switch from https to http I lose all WordPress status information - it basically behaves as if the user is not logged in. Going back to the other, secured tabs, the information is there again.
2.) This specific tab includes three iframes which I may only include via https, not via http. When I include them via http, then on top of issue 1.) I also lose WordPress database access entirely.
In fact the switch from https to http is only a workaround, but currently a fine compromise for the meantime. In Joomla it works without the issues mentioned, and now I would like to know whether there is a way to get rid of them in WordPress as well.
Thanks in advance,
best
Alex
WordPress' SECURE_AUTH_COOKIE is equal to 'wordpress_sec_' . md5(get_site_option('siteurl')). So as you change environments (secure to non-secure), that siteurl will change and your session cookie hash will be different. You'll need the user to log in on BOTH secure AND non-secure before proceeding.
The code in question is located in wp-includes/default-constants.php. Since this runs before any plugin/theme code, you'd have to override it at the wp-config.php level. Then check its use in the wp-includes/pluggable.php file to see whether further modifications are needed. The pluggable file runs after plugins init, so you can hook into filters there if needed.
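A minimal sketch of such a wp-config.php override (an assumption on my part: pinning COOKIEHASH to a fixed value so both the secure and non-secure environments derive the same cookie names; WordPress only defines this constant itself when it isn't already defined):

```php
<?php
// wp-config.php sketch (hypothetical): fix the cookie hash so that
// 'wordpress_sec_' . COOKIEHASH is identical on https and http.
define( 'COOKIEHASH', md5( 'example.com' ) );
```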
I'm trying to modularize my front-end, which is in AngularJS; with it we are using HAProxy as a load balancer, plus K8s.
Each ACL in the HAProxy configuration is attached to a different service in K8s, and since we are using Angular with the hash-bang enabled, we use that in the HAProxy configuration file as a way to identify the different modules.
Below is my HAProxy configuration, which is failing because I can't escape the # in the file, even after following the HAProxy documentation.
acl login-frontend path_beg /\#/login
use_backend login-frontend if login-frontend
acl elc-frontend path_beg /\#/elc
use_backend elc-frontend if elc-frontend
I have tried escaping it as /%23/login and /'#'/admin but without success.
Any idea would be greatly appreciated.
The fragment (everything following a # character) is defined in RFC 3986:
As with any URI, use of a fragment identifier component does not
imply that a retrieval action will take place. A URI with a fragment
identifier may be used to refer to the secondary resource without any
implication that the primary resource is accessible or will ever be
accessed.
and it is used on the client side; therefore a client (a browser, curl, ...) does not send it with a request. As reference: Is the URL fragment identifier sent to the server?
So there is no point in routing/ACLing on it. The reason haproxy provides an escape sequence for it is that you may want to include it in a body, a custom header and so on; but again, you will not obtain that part from the request line (the first line, containing the URI).
What is really happening here is that the user requests / from HAProxy, and Angular, in the user's browser, then parses the #/login and #/elc part to decide what to do next.
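If the routing must stay in HAProxy, the modules have to be distinguishable by something the proxy actually receives, i.e. a real path prefix. A sketch (the backend names and paths are assumptions, and the apps would need to be served under those prefixes instead of fragments):

```
# haproxy.cfg sketch: path prefixes do reach the proxy, unlike fragments
acl login-frontend path_beg /login
acl elc-frontend   path_beg /elc
use_backend login-backend if login-frontend
use_backend elc-backend   if elc-frontend
```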
I ran into a similar problem with my Ember app. For SEO purposes I split out my "marketing" pages and my "app" pages.
I then mounted my Ember application at /app and had HAProxy route requests to the backend that serviced my Ember app. A request for "anything else" (i.e. /contact-us) was routed to the backend that handled marketing pages.
/app/* -> server1 (Ember pages)
/ -> server2 (static marketing pages)
Since I had some URLs floating around out there on the web that still pointed to things like /#/login when they should now be /app/#/login, what I had to do was edit the index.html page served by my marketing backend and add JavaScript to that page that parsed the URL. If it detected /#/login, it forced a redirect to /app/#/login instead.
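That redirect logic could be sketched roughly like this (the function name and the /app mount point are assumptions):

```javascript
// Sketch: map a legacy "#/..." route to its new home under /app.
function appUrlFor(hash) {
  if (hash.indexOf('#/') === 0) {   // old-style app route, e.g. "#/login"
    return '/app/' + hash;
  }
  return null;                      // not an app route; stay on this page
}

// In the marketing page's index.html it would run as something like:
//   var target = appUrlFor(window.location.hash);
//   if (target) window.location.replace(target);
```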
I hope that helps you figure out how to accomplish the same for your Angular app.
I want to protect some files in a folder by requiring passwords for download.
However, the list of users that are allowed to download is in a MySQL table, with their passwords in md5 format (which means I cannot generate an htpasswd file).
To make it harder, I also need to allow some users to download some files and other users to download other files, without moving files around (separating files into multiple folders).
So what I need is some kind of auth API: when there is a request, nginx asks a script (let's say a PHP script) with the parameters username/password/ip/filename, and depending on the script's response, allows or disallows the download.
Is this possible?
What I've done so far:
1. Looking in the 3rd party modules list http://wiki.nginx.org/3rdPartyModules, where I found a module using PAM, but my server is Windows.
2. Googling lots of terms without any results.
3. Looking at the module development tutorials http://www.evanmiller.org/nginx-modules-guide.html
I'm not really good at C, so a pre-made module for Windows that bounces the request to a script (without proxying the download through it) would be best; if not, some pointers on how to make a module that meets my requirements would be appreciated.
You can use the http_auth_request module.
P.S. Do you actually know that nginx for Windows is not production-ready?
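A minimal sketch of how that can be wired up (the PHP endpoint, port and paths are assumptions; auth_request allows the request on a 2xx response from the subrequest and denies it on 401/403):

```nginx
# Sketch: nginx asks a PHP script before serving anything under /protected/
location /protected/ {
    auth_request /auth;                       # subrequest decides allow/deny
    root /data;
}
location = /auth {
    internal;                                 # not reachable from outside
    proxy_pass http://127.0.0.1:8080/check.php;
    proxy_pass_request_body off;              # decision only, no body needed
    proxy_set_header Content-Length "";
    proxy_set_header X-Original-URI $request_uri;
}
```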
I want to protect some files on my server from download, but my site needs to have access to them. I want to protect some subtitle files, which need to be accessed by the site, but I don't want anybody to download them. The site is hosted.
For example, some sites use strange strings that are tied to the user, the video and the IP. Could something like this be used in my case?
http://www11.some-site.com:182/d/qygiatnqvsulzrqmk7n6nbhddbcscvyguy4auc3fn4nvf23jp64tjcpa/File-needed.mp4?start=0
If you are using Apache, you have to use rewrite rules in your .htaccess file. If you are using another HTTP server brand, you should use almost the same logic that I will show here, so check your HTTP server's manual in that case.
Explanation:
When you type www.me.com/index.php, PHP sends the client the content generated by the echo commands you use inside your code.
When you type www.me.com/myfiles/iou345yo13i2u4ybo34ybu3/passwords.txt, your server will send the file's contents to the client browser, which will ask you to download it as a file or show it as a page, depending on the file extension.
Now, if you do something like this in your .htaccess file:
RewriteEngine On
RewriteRule ^myfiles/([^/]*)\.pdf$ /index.php?file=$1 [L]
# avoid direct access to your server's directory file listings
Options All -Indexes
Then if you type www.me.com/myfiles/file123.pdf, the server will instead execute index.php with the file name as the content of the "file" parameter, and there, in your code, you will be able to check the session to see if the user is authorized to download this file.
If the user is authorized, you then use the readfile() function to send the file to the user, and they will not see where it came from (I mean the real path).
Look on how to do this here:
PHP - send file to user
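Putting the pieces together, index.php could look roughly like this (a sketch: the session key, the storage folder name and the .pdf type are assumptions):

```php
<?php
// Sketch of the gatekeeper described above (folder/session names hypothetical).
session_start();
$name = isset($_GET['file']) ? $_GET['file'] : '';
$file = basename($name) . '.pdf';            // strip any path components
$path = __DIR__ . '/myfiles/' . $file;
if (empty($_SESSION['can_download']) || !is_file($path)) {
    http_response_code(403);
    exit('Forbidden');
}
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="' . $file . '"');
readfile($path);                             // the real path is never revealed
```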
Change your content's file permissions to 644.
I want to restrict access to a file or files using a .htaccess file. Basically, no one should be able to download the file(s) using a direct link to the file. However, the files should be accessible from my website.
For instance, say I have a file called Presentation.ppt. I want visitors to have access to it through my website, but if they try to download it or access it using a direct link, then the server should reject the request.
Is it possible to do that using .htaccess?
Thank you in advance,
You can deny access to the directory for every IP but the server's:
<Directory /dir/of/presentation>
Order Allow,Deny
Allow from 127.0.0.1
Deny from All
</Directory>
That won't work, as you pointed out.
How about using mod_rewrite with a rule that maps /dir/of/presentation/* to a forbidden page? That way a direct link won't work. A request for http://site/content/presentation.ppt
could get redirected to http://site/forbidden.html
Internally, you could make a link to http://authorizedRequest/presentation.ppt map to http://site/content/presentation.ppt
It's just security through obscurity. It wouldn't prevent anyone from typing your "secret" URI into their browser directly.
For instance, say I have a file called Presentation.ppt. I want visitors to have access to it through my website, but if they try to download it or access it using a direct link, then the server should reject the request.
Is it possible to do that using .htaccess?
It's possible, but there are ways to get around it. You need to check against the referer sent by the browser, but anyone can spoof that, and sometimes a browser may choose not to include a referer at all.
If you are trying to protect the file Presentation.ppt, put these rules in the htaccess file in your document root:
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^(https?://)?your_website.com
RewriteRule ^/?path/to/Presentation.ppt - [L,F]
If you want to protect a folder /path/images/ then:
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^(https?://)?your_website.com
RewriteRule ^/?path/images - [L,F]
Thank you all for your answers. I have tried all of your suggestions, but I still couldn't get it working. However, I did come up with a solution that does work.
Step 1: Disable the Indexes option on your web server by removing the word Indexes from the Options directive, leaving everything else the same. In some instances you may be able to do this in a .htaccess file. If you are unable to do it with .htaccess, you will have to look for the httpd.conf file on your server. It is usually located at /etc/apache/httpd.conf or /etc/httpd/conf/httpd.conf. Once you find it, turn the option off there.
Step 2: Create a folder within your webpage folder and call it whatever you want but make sure it is not easily guessable or that it is obvious (i.e. Joe33CompanyOCT2MeBoss). Then, move the files you want to hide or protect from your visitor into this folder.
Step 3: In your robots.txt file, disallow all bots and crawlers from indexing this folder or the files within it by entering "Disallow: /yourfoldername/".
Step 4: Then you will have to create a PHP file using code similar to that below. The code below will force a download.
<?php
// Using a local filesystem path (rather than the public URL) avoids a
// loop-back HTTP request and keeps the real location on the server.
$File1 = 'Joe33CompanyOCT2MeBoss/Presentation.ppt';
header("Content-Disposition: attachment; filename=\"".basename($File1)."\"");
header("Content-Type: application/force-download");
ob_end_clean();
flush();
readfile($File1);
exit;
This way the direct path to the file is hidden from your visitors: even though they can download the file, they simply don't know its actual URL, because the force-download PHP code doesn't reveal the actual path to the file. So now visitors to my website have to go through my webpage to download this file, instead of accessing it directly.
The following Stack Overflow questions were very instrumental in helping me solve my programming issues. Thanks,
How to Automatically Start a Download in PHP?
php file force download
The easiest approach (though not bulletproof) is to redirect the user agent when the HTTP_REFERER is not correct. This can be done using mod_rewrite in the server configuration or (second choice) inside a .htaccess file. It helps against simple hotlinking (links referencing your file by URL).
You should read the fine documentation of Apache's mod_rewrite.