Hiding Drupal admin - drupal-7

I'm trying to hide my Drupal 7 admin pages from the Internet, only allowing access from internal LANs.
I'm hiding /admin, /user and */edit, but is there anything else I need to deny to block access to all parts of the Drupal admin?
<Location ~ "(/([aA][dD][mM][iI][nN]|[uU][sS][eE][rR])|/[eE][dD][iI][tT])">
Order deny,allow
Deny from all
Allow from 12.193.10.0/24
</Location>
Apache seems to accept this, and URL-encoded characters in the request appear to be decoded before the match is applied (e.g. /%55ser is still caught).
Edit: I've noticed that admin pages are also reachable via query-string paths, so I'm going to check for those as well: ?q=admin
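Since the <Location> match never sees the query string, I'm thinking of mod_rewrite rules along these lines (an untested sketch, assuming mod_rewrite is enabled and reusing the 12.193.10.0/24 range from above):
RewriteEngine On
# catch ?q=admin..., ?q=user... and ?q=node/123/edit style requests
RewriteCond %{QUERY_STRING} (^|&)q=(admin|user)(/|&|$) [NC,OR]
RewriteCond %{QUERY_STRING} (^|&)q=[^&]*/edit(/|&|$) [NC]
# but let the internal LAN through
RewriteCond %{REMOTE_ADDR} !^12\.193\.10\.
RewriteRule ^ - [F]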

There are more than those you've listed; the */delete pages, for one.
Modules can tell Drupal that certain paths (other than those beginning with admin/) are supposed to be administrative by implementing hook_admin_paths().
You can invoke the same hook to get a list of all the patterns that should be treated as administrative, and update your vhost file accordingly:
$paths = module_invoke_all('admin_paths');
A devel printout of the $paths variable, on a stock install where only the core node and user modules contribute admin paths, looks something like this:
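Array
(
    [node/*/edit] => 1
    [node/*/delete] => 1
    [node/*/revisions] => 1
    [node/*/revisions/*/revert] => 1
    [node/*/revisions/*/delete] => 1
    [node/add] => 1
    [node/add/*] => 1
    [user/*/cancel] => 1
    [user/*/edit] => 1
    [user/*/edit/*] => 1
)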
It should give you a pretty good idea of the paths you need to hide. The printout will probably look completely different for your installation; it depends on which modules you have installed.

Fixing domain display on directory redirects

I have multiple domains directing to multiple directories on my system, as an example...
shopwebsite.co.uk > public_html/useraccounts/shopsite
carwebsite.co.uk > public_html/useraccounts/carsite
foodwebsite.co.uk > public_html/useraccounts/foodsite
This was fine for a while, until I realized that framed forwarding broke the sites' mobile responsiveness, so I changed all of the domains to simple redirects. This works, but as you can now work out, whenever somebody types in one of these URLs the site displays as something like:
https://mymainwebsite.co.uk/useraccounts/foodsite
Which is causing me problems in various ways. What I want is to attach each domain to its directory path, so the domain stays in the URL and works properly, while keeping mobile responsiveness.
This may seem like a really simple situation, but I have a couple of different domain registrars and my hosting is via Hostinger, so I can't manipulate certain back-end features, which makes it slightly more difficult. I am also still learning and don't have much knowledge of DNS, IP forwarding, etc., so I don't really know what I'm looking for.
If somebody can point me in the right direction, I can more than likely figure out the rest on my own. I have access to the .htaccess file for the main website, as well as DNS and the ability to park domains, etc.
Hope somebody can help me find a solution. Thanks.
I think you are looking for something called a "virtual host". This is a web-server configuration that allows you to run multiple websites on the same server. In Apache, a simple virtual host looks like this:
<VirtualHost *:80>
DocumentRoot "/public_html/useraccounts/shopsite"
ServerName shopwebsite.co.uk
# Other directives here
</VirtualHost>
<VirtualHost *:80>
DocumentRoot "/public_html/useraccounts/carsite"
ServerName carwebsite.co.uk
# Other directives here
</VirtualHost>
There are no redirects in this method. The browser connects to the web server and sends a "Host" header containing the domain name typed into the URL; the server then serves files from the directory configured in the "DocumentRoot" directive for the matching "ServerName".
You should contact your hosting provider's support team and ask them how to manage virtual hosts.

Nagios: creating custom menu for an authenticated user

Is there any 'easy' way to create customized web gui (for example, menu, default home page etc.) for a Nagios authenticated user? I have created a user for a customer, who has access to certain hostgroups only. But after logging in, the user can obviously see the default menu, which is customized for internal use. How can I prevent this?
There are ways to restrict what a user sees in the standard GUI; check the manual pages. Basically, a user will only see those hosts and services whose contact lists contain that user. You can do a bit more configuration for special cases in the etc/cgi.cfg file.
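For example, the authorization directives in a stock cgi.cfg look something like this; pointing them at only your internal contact ("nagiosadmin" is the stock name, yours may differ) keeps customers limited to their own hosts:
# cgi.cfg: only the internal admin sees everything; everyone else
# sees just the hosts and services they are a contact for
authorized_for_all_hosts=nagiosadmin
authorized_for_all_services=nagiosadmin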
If you want to restrict a user to very few predefined pages, you can do that with a few tricks in the web server configuration. You should have some understanding of how Apache config files work for this, and it assumes you can distinguish your customer from your company's employees by their IP addresses. If you can't, you can use groups and AuthGroupFile, but it will be a bit harder that way.
The basic idea is:
Allow everyone access to the static pages, images, css stuff etc.
Allow access to the CGIs only from the IPs your company uses
create special URLs for the customer that "hide" the real CGIs
This needs mod_authz_host, mod_rewrite and mod_proxy together with mod_proxy_http to work.
You should have a nagios.conf in your web server's configuration directory; its exact location and contents depend on your distribution and on whether you used an RPM or compiled Nagios yourself, so your directory paths may vary.
In the configuration for the CGI scripts, we put
<Directory /usr/local/nagios/sbin>
Order deny,allow
Deny from all
Allow from 127.0.0.1
# this should be the address of the web server:
Allow from 1.2.3.4
# this should be the address range your company uses:
Allow from 192.168.1.0/24
Require valid-user
</Directory>
This denies access to the CGIs to everyone but you.
Then, we define a few web pages that get rewritten to CGI scripts:
<Location />
RewriteEngine On
RewriteRule customer\.html$ http://127.0.0.1/nagios/cgi-bin/status.cgi?host=customerhost [P]
</Location>
So when anyone accesses customer.html, the server will fetch http://127.0.0.1/nagios/cgi-bin/status.cgi?host=customerhost using its internal proxy; this will create a new request to the CGI that seems to come from 127.0.0.1 and thus match the "Allow from 127.0.0.1" rule.
mod_proxy still needs some configuration:
ProxyRequests On
<Proxy *>
AddDefaultCharset off
Order deny,allow
Deny from all
# again, use your server IP:
Allow from 1.2.3.4
Allow from 127.0.0.1
</Proxy>
which restricts the proxy to internal Apache use and prevents other people on the Internet from using your proxy for anything else.
Of course, it's still the original CGIs that get executed, but your customer can't use them directly; he'll only be able to access the ones you've made available in your RewriteRules. The links and action pulldowns will still be there, but accessing them will result in error messages.
If you still want more, use a programming language of your choice (I've done this with Perl, but PHP, Python, Ruby, ... should work just as well), parse the objects.cache and status.dat files, and create your very own UI. Once you've written a few library functions to parse those files (which shouldn't be too difficult, as their syntax is trivial), creating your own GUI is just as hard, or as easy, as programming any other kind of web UI.
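For instance, here is a minimal sketch of such a parser in PHP; the status.dat location and the block and field names are taken from a stock install, so treat them as assumptions:
<?php
// Minimal sketch of a status.dat parser, assuming the stock block syntax:
//   blocktype {
//       key=value
//       ...
//   }
function parse_status_dat($file = '/usr/local/nagios/var/status.dat') {
    $blocks = array();
    $current = null;
    foreach (file($file) as $line) {
        $line = trim($line);
        if ($line === '' || $line[0] === '#') {
            continue;                              // skip blanks and comments
        }
        if (preg_match('/^(\w+)\s*\{$/', $line, $m)) {
            $current = array('_type' => $m[1]);    // a block opens
        } elseif ($line === '}' && $current !== null) {
            $blocks[] = $current;                  // the block closes
            $current = null;
        } elseif ($current !== null && strpos($line, '=') !== false) {
            list($key, $value) = explode('=', $line, 2);
            $current[$key] = $value;
        }
    }
    return $blocks;
}

// Example: print each service of one (hypothetical) host with its state.
foreach (parse_status_dat() as $b) {
    if ($b['_type'] === 'servicestatus' && $b['host_name'] === 'customerhost') {
        echo $b['service_description'], ': ', $b['current_state'], "\n";
    }
}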
After some research, I found a workaround for my case. The solution lies in the fact that, by default, Nagios uses a single password file (for HTTP auth) for two different directories:
$NAGIOS_HOME/sbin (where the CGI files are stored) and
$NAGIOS_HOME/share (where the HTML and PHP files are stored)
This means anyone authenticating as a user automatically gets access to both folders and their subfolders. This can be prevented by using a separate password file for each of the two directories.
Here is a snippet from a custom nagios.conf file with two different password files:
## BEGIN APACHE CONFIG SNIPPET - NAGIOS.CONF
ScriptAlias /nagios/cgi-bin "/usr/local/nagios/sbin"
<Directory "/usr/local/nagios/sbin">
Options ExecCGI
AllowOverride None
Order allow,deny
Allow from all
AuthType Digest
AuthName "Nagios Access"
AuthDigestFile /usr/local/nagios/etc/.digest_pw1
Require valid-user
</Directory>
Alias /nagios "/usr/local/nagios/share"
<Directory "/usr/local/nagios/share">
Options None
AllowOverride None
Order allow,deny
Allow from all
AuthType Digest
AuthName "Nagios Access"
AuthDigestFile /usr/local/nagios/etc/.digest_pw2
Require valid-user
</Directory>
## END APACHE CONFIG SNIPPETS
Now, for example, let's make a custom directory for customer1 under /var/www/html/customer1, copy all the HTML and PHP files from the Nagios ../share directory into it, customize them, and add an alias in Apache.
Alias /customer1 "/var/www/html/customer1"
<Directory "/var/www/html/customer1">
Options None
AllowOverride None
Order allow,deny
Allow from all
AuthType Digest
AuthName "Nagios Access"
AuthDigestFile /usr/local/nagios/etc/.digest_pw3
Require user customer1
</Directory>
Now one can add the same user/password for customer1 to password files 1 and 3, so that they have access both to the custom web GUI and to the CGI scripts. Of course, beforehand one must set appropriate contact groups in Nagios, so that after authentication the customer sees only the groups he/she is a contact for. The default Nagios share directory is secured with the nagios-admin (or whatever) user/password, which resides in password file 2 and, of course, in 1.
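With the digest files above, that could look like this (a sketch using Apache's htdigest tool; the realm argument must match the AuthName, and -c creates a file on first use):
# add customer1 to password files 1 and 3
htdigest /usr/local/nagios/etc/.digest_pw1 "Nagios Access" customer1
htdigest /usr/local/nagios/etc/.digest_pw3 "Nagios Access" customer1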

CakePHP Application: some pages with SSL, some without

I have an application written with the CakePHP framework, currently located in httpdocs. I want a few pages to be redirected to https://
Detecting whether the user is already on https:// or not shouldn't be a problem. My issue is a different one: as I understand it, I would need to make a copy of the whole project and store it in httpsdocs, right? This sounds silly, but how should it work without duplicating the code? I think I'm missing something, but I don't get it ...
I have never had to copy code for SSL. You should specify the path in the vhost for the site.
On Apache there is a vhost for each: SSL and non-SSL. Both can have the same webroot path.
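A minimal sketch, with placeholder domain, paths and certificate locations:
<VirtualHost *:80>
ServerName example.com
DocumentRoot /var/www/myapp/app/webroot
</VirtualHost>
<VirtualHost *:443>
ServerName example.com
# same webroot as the plain-HTTP vhost, so nothing is duplicated
DocumentRoot /var/www/myapp/app/webroot
SSLEngine on
SSLCertificateFile /etc/ssl/certs/example.com.crt
SSLCertificateKeyFile /etc/ssl/private/example.com.key
</VirtualHost>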
If your webhoster requires you to put the https part of your website in httpsdocs, then you will need to put something there. But not the whole project: maybe only the /web part (the part that is actually served up by the webhoster).
Something like
/cake/app/ --> your app code
/httpdocs/.. --> index.php and possibly css stuff, images etc
/httpsdocs/.. --> copy of index.php and the rest as well
Of course, you could also use some internal redirect in .htaccess
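For example, a hedged .htaccess sketch that sends a couple of (hypothetical) pages to https:// while everything else stays on plain HTTP:
RewriteEngine On
# force https:// for the login and checkout pages (placeholder names)
RewriteCond %{HTTPS} !=on
RewriteRule ^(login|checkout)(/.*)?$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]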
One suggestion: now that google indexes https urls, you could also choose to make the whole site available through https.

How to restrict access to a file or files using .htaccess?

I want to restrict access to a file or files using .htaccess file. Basically, no one should be able to download file(s) using direct link to the file. However, the file should be accessible from my website.
For instance, say I have a file called Presentation.ppt. I want the visitor to have access to it through my website, but if they try to download it or access it using direct link then the server should reject the request.
Is it possible to do that using .htaccess?
Thank you in advance,
You can deny access to the directory for every IP address but the server's:
<Directory /dir/of/presentation>
Order Deny,Allow
Deny from All
Allow from 127.0.0.1
</Directory>
That won't work, as you pointed out.
How about using mod_rewrite with a rule that maps /dir/of/presentation/* to a forbidden page? That way a direct link won't work: a request for http://site/content/presentation.ppt could get redirected to http://site/forbidden.html.
Internally, you could make a link to http://authorizedRequest/presentation.ppt map to http://site/content/presentation.ppt.
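A sketch of that idea, using the hypothetical paths above (the THE_REQUEST condition only matches the original request line, so the internal rewrite in the second rule isn't blocked by the first):
RewriteEngine On
# refuse direct requests for the real location
RewriteCond %{THE_REQUEST} \s/content/presentation\.ppt
RewriteRule ^content/presentation\.ppt$ - [F]
# the "secret" alias serves the same file via an internal rewrite
RewriteRule ^authorizedRequest/presentation\.ppt$ /content/presentation.ppt [L]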
It's just security through obscurity. It wouldn't prevent anyone from typing your "secret" URI into their browser directly.
For instance, say I have a file called Presentation.ppt. I want the visitor to have access to it through my website, but if they try to download it or access it using direct link then the server should reject the request.
Is it possible to do that using .htaccess?
It's possible, but there are ways to get around it. You need to check against the referer sent by the browser, but anyone can spoof that, and sometimes a browser may choose not to send a referer at all.
If you are trying to protect the file Presentation.ppt, put these rules in the htaccess file in your document root:
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^(https?://)?your_website.com
RewriteRule ^/?path/to/Presentation.ppt - [L,F]
If you want to protect a folder /path/images/ then:
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^(https?://)?your_website.com
RewriteRule ^/?path/images - [L,F]
Thank you all for your answers. I tried all of your suggestions, but I still couldn't get it working. However, I did come up with a solution that works.
Step 1: Turn off the Options Indexes directive on your web server by removing the word Indexes and leaving everything else the same. In some instances you may be able to do this in the .htaccess file. If you can't, you will have to look for the httpd.conf file on your server; it is usually located at /etc/apache/httpd.conf or /etc/httpd/conf/httpd.conf. Once you find it, turn the option off there.
Step 2: Create a folder within your webroot and call it whatever you want, as long as it is not easily guessable or obvious (e.g. Joe33CompanyOCT2MeBoss). Then move the files you want to hide or protect from your visitors into this folder.
Step 3: In your robots.txt file, disallow all bots and crawlers from indexing the folder and the files within it by entering "Disallow: /yourfoldername/".
Step 4: Create a PHP file using code similar to that below; it will force a download.
<?php
// Reference the file by filesystem path, not URL, so its location stays hidden.
$file = $_SERVER['DOCUMENT_ROOT'] . '/Joe33CompanyOCT2MeBoss/Presentation.ppt';
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($file));
ob_end_clean();
flush();
readfile($file);
exit;
This way the direct path to the file is hidden from your visitors: even though they can download the file, they don't know its actual URL, because the force-download PHP code never reveals the real path. So now visitors to my website have to go through my web page to download this file instead of accessing it directly.
The following Stack Overflow questions were instrumental in helping me solve my programming issues. Thanks,
How to Automatically Start a Download in PHP?
php file force download
The easiest (though not bulletproof) approach is to redirect the user agent when the HTTP_REFERER is not correct. This can be done using mod_rewrite in the server configuration or (second choice) inside an .htaccess file. It helps against simple hotlinking (links referencing your file by URL).
You should read the fine documentation of Apache's mod_rewrite.

Cake php host in server with multiple domain

I have a server with many domains/applications on it. I need to host a CakePHP application on that server. When I uploaded it, I got errors with the URLs.
For example, www.xyz.com/aboutus works; there is a controller called Aboutus.
But when I take the URL www.xyz.com/aboutus/add, it must go to the add method in the Aboutus controller. This works on my local system, but on the live server it shows an error saying the 'add' controller is missing.
Locally, I changed the document root in Apache, but I can't do that on the live server, as there are multiple sites.
You need to make sure that the ROOT, APP_DIR, and CAKE_CORE_INCLUDE_PATH variables in each site's webroot/index.php have been updated to point to the right paths (see below for my settings). Other than that, just make sure your host has mod_rewrite on and you should be good to go.
According to the CakePHP book for 2.0.x, it's easier to just change the include_path, but I haven't tried that yet: http://book.cakephp.org/2.0/en/deployment.html#multiple-cakephp-applications-using-the-same-core
The file-structure I use:
/cakephp
/cakephp_1_3
/cakephp_2_0_5
/public_html
/mysite1.com
/mysite2.com
/mysite3.com
//webroot/index.php (of one of my sites)
define('ROOT', DS.'home'.DS.'myusername'.DS.'public_html');
define('APP_DIR', DS.'mysite1.com');
define('CAKE_CORE_INCLUDE_PATH', DS.'home'.DS.'myusername'.DS.'cakephp'.DS.'cakephp_2_0_5'.DS.'lib');
(I just took the 3 lines that set the variables - they're not really three lines in a row like that)
Don't forget to make sure your database settings are still correct in app/Config/database.php