CakePHP Application: some pages with SSL, some without

I have an application written with the CakePHP framework, and it is currently located in httpdocs. I want a few pages to be served over https://.
Detecting whether the user is already on https:// shouldn't be a problem. My issue is a different one: as far as I can tell, I would need to make a copy of the whole project and store it in httpsdocs, right? That sounds silly, but how is it supposed to work without duplicating the code? I think I'm missing something, but I can't see what.

I have never had to copy code for SSL. You should specify the site's path in the vhost.
On Apache there is one vhost for each, SSL and non-SSL. Both can point to the same webroot path.
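A minimal sketch of what that looks like in an Apache configuration (the domain, paths, and certificate files are placeholders):

```apache
# Plain HTTP vhost
<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/myapp/webroot
</VirtualHost>

# SSL vhost serving the very same code, no copy needed
<VirtualHost *:443>
    ServerName www.example.com
    DocumentRoot /var/www/myapp/webroot
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/example.crt
    SSLCertificateKeyFile /etc/ssl/private/example.key
</VirtualHost>
```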

If your web host requires you to put the https part of your website in httpsdocs, then you will need to put something there. But not the whole project: maybe only the webroot part (the part that is actually served up by the host).
Something like:
/cake/app/ --> your app code
/httpdocs/.. --> index.php and possibly CSS, images, etc.
/httpsdocs/.. --> a copy of index.php and the rest as well
Of course, you could also use an internal redirect in .htaccess.
One suggestion: now that Google indexes https URLs, you could also choose to make the whole site available over https.
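If you keep everything under one docroot, a minimal .htaccess sketch for forcing https on selected pages only (the page paths here are made-up examples; assumes mod_rewrite is available):

```apache
RewriteEngine On
# Redirect only the sensitive pages to https
RewriteCond %{HTTPS} !=on
RewriteRule ^(users/login|checkout)(/.*)?$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```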


Path Routing: Application Load Balancer for React Application

I'm trying to set up path routing in an AWS Application Load Balancer.
Example:
apple.mango.com/vault goes to instance1 port 80, and nginx routes it to /var/html/reactApp1/build/
apple.mango.com/flow goes to instance2 port 80, and nginx routes it to /var/html/reactApp2/build/
My configuration looks something like this:
Also, both /var/html/reactApp1/build/ and /var/html/reactApp2/build/ work just fine when hosted normally at, say, mango.com and apple.com.
Problem Statement:
When the application is visited via path routing, like apple.mango.com/vault or apple.mango.com/flow, it reaches the correct machine/root but fails to load the site as expected.
Upon inspecting the blank page, I see it does not load the node modules:
Where am I going wrong?
I know this question was asked almost 2 years ago, but I think I found the solution, so even if it is not useful now, it may be important in the future:
In the rules of your load balancer, you must add a rule that forwards traffic to your app when it requests /static/..., because that is where the bundle.js created by React lives:
I hope this will be as useful for you as it has been for me.
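A hedged sketch of creating such a rule via the AWS CLI (the ARNs are placeholders and the priority is arbitrary; adjust to your own listener and target group):

```shell
# Forward requests for React's static bundle (/static/*) to the
# same target group that serves the app at /vault or /flow.
aws elbv2 create-rule \
  --listener-arn <your-listener-arn> \
  --priority 10 \
  --conditions Field=path-pattern,Values='/static/*' \
  --actions Type=forward,TargetGroupArn=<your-target-group-arn>
```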

How to escape # character in HAProxy config?

I'm trying to modularize my front-end, which is in AngularJS; alongside it we are using HAProxy as a load balancer and K8s.
Each ACL in the HAProxy configuration is attached to a different service in K8s, and since we are using Angular with the hash-bang enabled, in the HAProxy configuration file we use that as a way to identify the different modules.
Below is my configuration, which is failing because I can't escape the # in the file, even after following the HAProxy documentation.
acl login-frontend path_beg /\#/login
use_backend login-frontend if login-frontend
acl elc-frontend path_beg /\#/elc
use_backend elc-frontend if elc-frontend
I have tried escaping it as /%23/login and /'#'/admin but without success.
Any idea would be greatly appreciated.
The fragment (everything following a # character) is defined in RFC 3986:
As with any URI, use of a fragment identifier component does not
imply that a retrieval action will take place. A URI with a fragment
identifier may be used to refer to the secondary resource without any
implication that the primary resource is accessible or will ever be
accessed.
It is used on the client side, so a client (a browser, curl, ...) does not send it with the request. As reference: Is the URL fragment identifier sent to the server?
So there is no point in routing or writing ACLs against it. The reason HAProxy provides an escape sequence for # is that you may want to include it in a body, a custom header, and so on; but you will never get that part from the request line (the first line of the request, containing the URI).
What is really happening here is that the user requests / from HAProxy, and Angular, running in the user's browser, then parses the #/login or #/elc part to decide what to do next.
I ran into a similar problem with my Ember app. For SEO purposes I split out my "marketing" pages and my "app" pages.
I then mounted my Ember application at /app and had HAProxy route requests to the backend that serviced my Ember app. A request for "anything else" (i.e. /contact-us) was routed to the backend that handled marketing pages.
/app/* -> server1 (Ember pages)
/ -> server2 (static marketing pages)
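In HAProxy terms, that split can be expressed with a plain path ACL instead of a fragment (the backend names here are made up):

```haproxy
# Route real URL paths, not fragments: /app/* to the SPA, the rest to marketing
acl is_app path_beg /app
use_backend ember_app if is_app
default_backend marketing_pages
```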
Since I had some URLs floating around on the web that still pointed to things like /#/login when they should now be /app/#/login, I edited the index.html served by my marketing backend and added JavaScript that parsed the URL. If it detected /#/login, it forced a redirect to /app/#/login instead.
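That redirect script might be sketched like this (the helper name and the route mapping are hypothetical; it assumes every legacy hash route simply needs an /app prefix):

```javascript
// Compute where a legacy hash URL like /#/login should redirect.
// Returns the new location, or null if no redirect is needed.
// Kept as a pure function so the mapping is testable outside a browser.
function legacyHashRedirectTarget(pathname, hash) {
  // Only rewrite hash routes served from the marketing root, not /app itself.
  if (pathname === '/' && hash.startsWith('#/')) {
    return '/app/' + hash;
  }
  return null;
}

// In the marketing site's index.html you would use it like:
// const target = legacyHashRedirectTarget(window.location.pathname, window.location.hash);
// if (target) window.location.replace(target);
```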
I hope that helps you figure out how to accomplish the same for your Angular app.

How to restrict access to a file or files using .htaccess?

I want to restrict access to a file or files using an .htaccess file. Basically, no one should be able to download the file(s) using a direct link; however, the files should be accessible from my website.
For instance, say I have a file called Presentation.ppt. I want the visitor to have access to it through my website, but if they try to download it or access it using direct link then the server should reject the request.
Is it possible to do that using .htaccess?
Thank you in advance,
You can deny access to the directory for every IP address but the server's:
<Directory /dir/of/presentation>
Order Allow,Deny
Allow from 127.0.0.1
Deny from All
</Directory>
That won't work, as you pointed out.
How about using mod_rewrite with a rule that maps /dir/of/presentation/* to a forbidden page? That way a direct link won't work. A request for http://site/content/presentation.ppt could get redirected to http://site/forbidden.html.
Internally, you could map a link like http://site/authorizedRequest/presentation.ppt to http://site/content/presentation.ppt.
It's just security through obscurity, though; it wouldn't prevent anyone from typing your "secret" URI into their browser directly.
For instance, say I have a file called Presentation.ppt. I want the visitor to have access to it through my website, but if they try to download it or access it using direct link then the server should reject the request.
Is it possible to do that using .htaccess?
It's possible, but there are ways around it. You need to check the referer sent by the browser, but anyone can spoof that, and sometimes a browser may not send a referer at all.
If you are trying to protect the file Presentation.ppt, put these rules in the htaccess file in your document root:
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^(https?://)?your_website\.com
RewriteRule ^/?path/to/Presentation.ppt - [L,F]
If you want to protect a folder /path/images/ then:
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^(https?://)?your_website\.com
RewriteRule ^/?path/images - [L,F]
Thank you all for your answers. I have tried all of your suggestions, but I still couldn't get it working. However, I did come up with a solution that does work.
Step 1: Disable the Indexes option on your web server by removing the word Indexes and leaving everything else the same. In some instances you may be able to do this in an .htaccess file. If you can't, look for the httpd.conf file on your server; it is usually located at /etc/apache/httpd.conf or /etc/httpd/conf/httpd.conf. Once you find it, turn the option off there.
Step 2: Create a folder within your webroot and call it whatever you want, but make sure it is not easily guessable or obvious (e.g. Joe33CompanyOCT2MeBoss). Then move the files you want to hide or protect from your visitors into this folder.
Step 3: In your robots.txt file, disallow all bots and crawlers from indexing the folder and the files within it by entering: Disallow: /yourfoldername/
Step 4: Then create a PHP file using code similar to the snippet below, which forces the download:
<?php
// Server-side path to the hidden file; this path is never shown to the visitor
$file = 'Joe33CompanyOCT2MeBoss/Presentation.ppt';
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Type: application/force-download');
header('Content-Length: ' . filesize($file));
ob_end_clean();
flush();
readfile($file);
exit;
This way the direct path to the file is hidden from the visitor: even though they can still download the file, they simply don't know its actual URL, because the force-download PHP code doesn't reveal the real path. So visitors to my website now have to go through my webpage to download this file instead of fetching it directly.
The following Stack Overflow questions were very instrumental in helping me solve my programming issues. Thanks,
How to Automatically Start a Download in PHP?
php file force download
The easiest (though not bulletproof) approach is to redirect the user agent when the HTTP_REFERER is not correct. This can be done using mod_rewrite in the server configuration or (second choice) inside an .htaccess file. It helps against simple hotlinking (links referencing your file by URL).
You should read the fine documentation of Apache's mod_rewrite.

Hiding Drupal admin

I'm trying to hide my Drupal 7 admin pages from the Internet, allowing access only from internal LANs.
I'm hiding /admin, /user, and */edit, but is there anything else I need to deny to disable access to all parts of the Drupal admin?
<Location ~ "(/([aA][dD][mM][iI][nN]|[uU][sS][eE][rR])|/[eE][dD][iI][tT])">
Order deny,allow
Deny from all
Allow from 12.193.10.0/24
</Location>
Apache seems to accept this, and URL-encoded characters in the request seem to be decoded before the request is handled (e.g. /%55ser).
Edit: I've noticed parameterized paths, so I'm going to check for those as well: ?q=admin
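A Location block cannot match the query string, so the ?q=admin form needs something like mod_rewrite instead; a hedged sketch reusing the LAN range from the block above:

```apache
RewriteEngine On
# Forbid ?q=admin..., ?q=user... and ?q=.../edit from outside the LAN
RewriteCond %{REMOTE_ADDR} !^12\.193\.10\.
RewriteCond %{QUERY_STRING} (^|&)q=(admin|user|[^&]*/edit) [NC]
RewriteRule ^ - [F]
```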
There are more than those you've listed; the */delete pages, for one.
Modules can tell Drupal that certain paths (other than those beginning with admin/) are supposed to be administrative by implementing hook_admin_paths().
You can invoke the same hook to get a list of all the patterns that should be treated as administrative, and update your vhost file accordingly:
$paths = module_invoke_all('admin_paths');
A devel printout of the $paths variable should give you a pretty good idea of the paths you need to hide. The printout will probably look completely different for your installation; it depends on which modules you have installed.

Deploying CakePHP on sub domain redirects to wrong location

I am trying to deploy my localhost CakePHP website to a subdomain. I am able to view the website, but it does not work correctly when I try to log in or register.
It does not show me any validation errors, nor does it allow me to log in or register, and it lands on the URL below:
subdomain.example.com/webroot/index.php?url=users/login
where it should be something like:
subdomain.example.com/users/login
I am using Go Daddy shared hosting.
The older settings described at http://bakery.cakephp.org/articles/cguyer/2009/10/18/mod-rewrite-on-godaddy-shared-hosting solved the issue.
Thank you every one for your help.
This sounds (and looks) like an Apache rewrite issue. There are a couple of things to check:
Check that the .htaccess files are where they are expected. You should have one in the app directory and another in the webroot directory. Sometimes, when we compress and/or transfer files to the web server, the .htaccess files get left behind.
Make sure the server you are running the site on has rewriting turned on. This may require a call to Go Daddy support, but my experience is they are always willing to help.
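For reference, the stock CakePHP root .htaccess looks like the sketch below; on shared hosts like GoDaddy, an explicit RewriteBase is often the missing piece (the path is an assumption; use your subdomain's folder):

```apache
<IfModule mod_rewrite.c>
    RewriteEngine on
    # RewriteBase /          <- often required on shared hosting
    RewriteRule ^$ app/webroot/ [L]
    RewriteRule (.*) app/webroot/$1 [L]
</IfModule>
```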
Good luck!
