Hosting Multiple Domains on Same Server Port with Apache2 - apache2

How do I configure Apache2 via webmin or command-line (I'm using RHEL5 Linux) so that I can have multiple domains on the same server on the same port but in different subdirectories?
For instance, I'm trying to get homerentals.ws and homerepair.ws to be detected on port 80 (the default port) on the same server. I know that my DNS holds the two addresses, and web hits currently go to the same test page. Now all I need is for web hits to go to a subdirectory without showing that subdirectory in the URL. For instance, I do not want people going to http://homerentals.ws to be redirected to http://homerentals.ws/homerentals/. Instead, http://homerentals.ws would serve content from /var/www/html/homerentals, while http://homerepair.ws would serve from /var/www/html/homerepair, but the URLs would not look any different.
On IIS, I did this once with host-header detection. But I don't know how to do it on RHEL5 Linux via webmin or file editing. I'm stuck.

The feature you're describing is known as virtual hosts. Have a look at Apache's documentation. In general, you need to edit Apache's main configuration file to make things happen (on RHEL5 it lives at /etc/httpd/conf/httpd.conf rather than /etc/apache2/httpd.conf). Maybe it can be edited through webmin too, but I'm not familiar with it.
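For concreteness, here is a minimal sketch of name-based virtual hosts for the two domains in the question, using Apache 2.2 syntax (the version shipped with RHEL5, where the NameVirtualHost directive is required). The DocumentRoot paths are the ones from the question; append this to httpd.conf or place it in a file under conf.d/:

```apache
# Enable name-based virtual hosting on port 80
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName homerentals.ws
    ServerAlias www.homerentals.ws
    DocumentRoot /var/www/html/homerentals
</VirtualHost>

<VirtualHost *:80>
    ServerName homerepair.ws
    ServerAlias www.homerepair.ws
    DocumentRoot /var/www/html/homerepair
</VirtualHost>
```

Apache picks the vhost whose ServerName/ServerAlias matches the Host header the browser sends, so each domain serves from its own directory and the URL in the browser never shows the subdirectory. Restart Apache after editing (service httpd restart).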

Related

Filename path mapping, when using docker/container/vm

Xdebug provides information about files (e.g. in stack traces), and we can configure the way those filenames are formatted:
https://xdebug.org/docs/all_settings#filename_format
But on my host machine, the files have a different location than on my guest machine.
E.g.: /home/me/project/myapp VS /app
Is there any way to configure xdebug so that I can map the files and get the correct host file paths?
There is currently no way to do this with Xdebug.
I have been working on a plan to allow for something like this, but it has not yet resulted in an actual implementation. I would recommend filing an issue at Xdebug's issue tracker at https://bugs.xdebug.org to register your interest in this feature.

Xdebug remote debugging

I have been trying very hard to get Xdebug working. I have spent tens of hours but am still not making much progress. I think that is because some basic concepts are not very clear to me. One of them is "remote debugging".
Let's say I have a remote PHP file on a VPS. If I download it and debug it with Xdebug, how does the IDE know the local file is a copy of the remote one? Can someone describe what happens during debugging?
What I guess is: after setting the remote port in the IDE and configuring the browser, when I open the PHP file in the browser and a breakpoint is reached, the IDE will establish some connection with the server and display the remote file's content in the editor. Now I can watch variables, step through functions, etc., and any change I make to the file will be saved to the remote server. Is this understanding correct?
when I open the PHP file in the browser and a breakpoint is reached, the IDE will establish some connection with the server and display the remote file's content in the editor
That is not correct. The IDE does not establish the connection, but Xdebug/PHP does. The IDE acts like a server and listens for incoming debugging connections.
You don't mention which IDE you use, but most of them will allow you to set up a "path" mapping. Such a mapping tells the IDE how to map remote paths (the ones that PHP and Xdebug see) to the ones on your local system (the ones your IDE sees).
PhpStorm, for example, should ask you for a mapping if it can't find a file, but otherwise you can configure the mappings yourself in File -> Settings -> Build, Execution, Deployment -> Deployment, in the "Mappings" tab.
Other IDEs do it in other ways, but all IDEs (except Komodo) need to have the files available locally. Please note that the protocol does support having the files only on the remote side, but only the Komodo editor knows how to deal with that; PhpStorm does not yet.
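To make the direction of the connection concrete, here is a hedged sketch of the server-side php.ini settings that make Xdebug dial out to the listening IDE. These are Xdebug 2.x setting names; the extension path and the workstation IP are illustrative assumptions, not values from the question:

```ini
; php.ini on the remote server (Xdebug 2.x setting names)
zend_extension=/usr/lib64/php/modules/xdebug.so  ; actual path varies by system

xdebug.remote_enable=1          ; Xdebug connects OUT to the IDE, not the other way round
xdebug.remote_host=192.0.2.10   ; hypothetical IP of the machine where the IDE is listening
xdebug.remote_port=9000         ; port the IDE listens on (9000 is Xdebug's default)
```

With this in place, each request that carries the debug trigger makes PHP/Xdebug open a connection back to the IDE, which is exactly why the IDE has to sit there listening like a server.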

cakephp: warning 512 /tmp/cache/ not writable on shared host justhost

When I go to www.merryflowers.com/webroot/ I'm getting the following warnings. Based on the guidance I got from my previous post (cakephp: configuring cakephp on shared host justhost), I right-clicked on app/tmp/ (on the remote server) and all the folders within it and set the permissions to writable (i.e. 777). But I'm still getting the same warnings.
Since I'm using Windows 7 (where chmod doesn't work), I also tried CACLS at the command prompt for the tmp folder. Since I'm not familiar with CACLS, I don't know the exact command to make tmp writable to all. Can someone please help me out? Thank you.
Warning (512): /home/aquinto1/public_html/merryflowers.com/tmp/cache/ is not writable [CORE/cake/libs/cache/file.php, line 278]
Warning (512): /models/ is not writable [CORE/cake/libs/cache/file.php, line 278]
Warning (512): /persistent/ is not writable [CORE/cake/libs/cache/file.php, line 278]
Is your site hosted locally on your Windows machine, like through XAMPP or WAMP, etc? Those are *nix paths, not Windows paths.
Did you FTP to your sites - like, with an FTP client - and change the permissions? Doing this through FTP clients isn't always 100% reliable. It looks like you changed the perms on /tmp, but they didn't cascade to /tmp/cache, etc. like you thought. Try setting them all one by one.
According to your other post - cakephp: configuring cakephp on shared host justhost - your site is set up with remote hosting. I looked at their service briefly; from the looks of it, you can probably shell (ssh) into your server and get access to the command line. A lot of web hosts provide this these days, although you may have to specifically request that they enable it for you.
On a Windows machine, you can use PuTTY to shell into your remote server: http://www.chiark.greenend.org.uk/~sgtatham/putty/
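Once you're shelled in, a sketch of the fix looks like this. The path is taken from the warning message in the question; the -R flag is what makes the permission change cascade into the subfolders that the FTP client missed:

```shell
# Path from the warning message in the question
cd ~/public_html/merryflowers.com

# Recursively make tmp/ and everything beneath it world-writable,
# so the change cascades into tmp/cache, tmp/models, tmp/persistent, etc.
chmod -R 777 tmp

# Verify: each directory should now show drwxrwxrwx
ls -ld tmp tmp/cache
```

(777 is the quick fix CakePHP tutorials suggest for shared hosts; if your host runs PHP as your own user, a tighter mode like 755 may be enough.)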
HTH. :)

Cache Outgoing Data from browser

This might be a very broad question, but this is what I want: I open a website and enter some details, like my login credentials, or any other data that passes from my browser to the website. Now what I want is to cache (write to a temp file) whatever I send to that website. How can this be done? I tried to extract the data present in the packets flowing out of my machine, but I found only junk characters (maybe headers). Any ideas are welcome. I am using Ubuntu Linux and would like to achieve this using a shell script, C, or C++.
One option would be to use the Fiddler Web Debugger which is scriptable (C#).
Although it's a Win32 program, it can act as a proxy for any Linux machine. See Debug traffic from another machine (even a Mac or Unix box) for details.
There's also a native Linux app called dsniff which can be used to log all HTTP traffic to a file.

Building a centralized configuration repository

I'm trying to develop an open-source application to be a sort of centralized configuration management system for all Unix platforms, for example changing the root password, SSH configuration, DNS settings, /etc/hosts management, and others.
I need your feedback on what you recommend as the interface for all the configuration (a list of scripts will run on the Unix servers as clients to read the configuration and apply it on each system, in client-to-server mode).
Should I use LDAP to host the configurations, so any Unix OS can talk to the LDAP server to get its configuration?
Or should I just save the configuration in a database (e.g. MySQL) and build a web interface to read the database and serve the configuration to the client?
Or do you have any other ideas?
You might look into something like Chef or Puppet instead. Why re-invent the wheel?
curl can download a file from a URL and write it to standard output. For example, executing curl -sS http://someHost/file.cfg will download "file.cfg" from the specified web server. The -sS options instruct curl to print error messages but not progress diagnostics. By the way, curl supports many protocols, including HTTP, FTP and LDAP, so you have flexibility in the technology you use to host your centralised configuration repository (CCR).
You could use curl to retrieve a configuration file from the CCR, store the result in a local file and then parse that local file.
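A minimal sketch of that fetch-and-parse loop, assuming a simple key=value file format (an assumption for illustration). The demo pulls from a local file:// URL so it runs anywhere; against a real CCR you would swap in the http://someHost/file.cfg URL from above:

```shell
# Stand-in for the hosted file.cfg (the real one would live on the CCR server)
printf 'ssh_port=22\ndns_server=192.0.2.53\n' > /tmp/ccr-file.cfg

# Fetch it; -sS = silent, but still show errors.
# Replace file:// with http://someHost/file.cfg for a real repository.
curl -sS file:///tmp/ccr-file.cfg -o /tmp/local.cfg

# Parse the local copy as key=value pairs
while IFS='=' read -r key value; do
    echo "would apply: $key = $value"
done < /tmp/local.cfg
# prints: would apply: ssh_port = 22
#         would apply: dns_server = 192.0.2.53
```

The same pattern works over FTP or LDAP URLs, which is what gives the curl-based approach its flexibility.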
Check out Blueprint from DevStructure. It sounds like something along the lines of what you're trying to do. Basically it reverse engineers servers and detects everything that has changed from the install state. Open-source too.
https://github.com/devstructure/blueprint (Blueprint # Github)
We are also about to launch ConfigChief, which is a central configuration repository that would do what you want: a central point to store configuration (with features like versioning, audit, ACLs, inheritance, etc.).
Once you have that, combined with change notification, you can just run curl as Ciaran McHale suggests against the CCR and get your parsed configuration file back. This would eliminate the need to write scripts to generate config files externally.
If you are interested, you can signup for a beta at http://woot.configchief.com
DISCLAIMER: I guess it is obvious from the first word!
