Filename path mapping when using Docker/containers/VMs - Xdebug

If Xdebug provides information about files (e.g. in stack traces), the way the file names are reported can be configured:
https://xdebug.org/docs/all_settings#filename_format
But on my host machine the files are in a different location than on my guest machine.
E.g. /home/me/project/myapp vs. /app
Is there any way to configure xdebug so that I can map the files and get the correct host file paths?

There is currently no way to do this with Xdebug.
I have been working on a plan to allow for something like this, but it has not yet resulted in a concrete implementation. I would recommend that you file an issue in Xdebug's issue tracker at https://bugs.xdebug.org to register your interest in this feature.

Related

How to support multi-architecture docker-compose configuration for devcontainer.json?

In our engineering team we have people using older MacBook Pros as well as the new M1 (ARM) machines. We currently have two different docker-compose.yaml files that pull in different Docker images for our data services, depending on which architecture the host computer uses. This is not ideal, but it currently works fine. I want to make use of devcontainer.json so that our app layer can also live in Docker, which would make setting up a new machine within our eng org very easy. The problem I'm running into is that I'm not sure how to tell devcontainer.json which docker-compose.yaml file to use based on which architecture is in use.
My current thought is to have each developer set an env var on their host system that reflects their architecture, and then use that env var within devcontainer.json to point to the correct docker-compose.yaml file, but I'm wondering whether there is a better way to achieve my goal.
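A rough sketch of what I have in mind, assuming devcontainer.json expands ${localEnv:...} inside the dockerComposeFile property (worth verifying against your VS Code version) and assuming each developer exports a hypothetical DC_ARCH variable (e.g. arm64 or amd64) on the host:

{
  "name": "app",
  // hypothetical: resolves to docker-compose.arm64.yaml or docker-compose.amd64.yaml
  "dockerComposeFile": ["docker-compose.${localEnv:DC_ARCH}.yaml"],
  "service": "app",
  "workspaceFolder": "/workspace"
}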
I noticed that there is an open issue in the vscode-remote-release repo, but I think that pertains to the app-code image that gets built. It's not quite the situation I'm in, though that solution is probably one part of the solution to my question.

How do you allow the .shtml extension with webpack-dev-server?

My company has shared resources and uses *.shtml pages to keep the same look and feel across the intranet. When developing on my local machine, how can I tell webpack-dev-server to serve these files? At the moment it downloads the file instead of serving it.

Getting proxy information on Linux programmatically

I am currently using libproxy to get the proxy information (if any) on Red Hat and Debian Linux. It doesn't work all that well, but it's the only way I know of to get the proxy information from my code.
I need to stop using the lib, since in most cases it doesn't recognize the proxy.
Is there any way to acquire the proxy information? What I mean is: is there a file (or group of files) I can read, an environment variable, an API, or a system call that I can use to get the information?
GNOME-based code is OK, and KDE might help as well, but I am looking for something more generic.
The code is C.
Now, before anyone asks: I don't want to use libproxy anymore. Period. I don't want to start investigating why it doesn't work. I don't really want to know whether there is a new version of that lib. I know it might work; I just don't want to use it. I can't use it (just because). So please don't point me that way.
Code is appreciated.
Thanks.
In Linux, the "global proxy setting" is typically just a set of environment variables, usually set in /etc/profile. You can examine those variables to see what proxy is set.
The variables are:
http_proxy - the proxy for HTTP connections
ftp_proxy - the proxy for FTP connections
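For example (with a hypothetical proxy host), a system-wide setting in /etc/profile might look like:

export http_proxy=http://proxy.example.com:3128/
export ftp_proxy=http://proxy.example.com:3128/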
Using the Network Proxy Preferences tool under GNOME saves the information in the GConf database. The paths to the keys are /system/http_proxy and /system/proxy. You can read about the details of those trees at this page.
You can access the GConf database using the library API. Note that GConf is based on GObject. To examine the contents of this tree using the command line, try the following:
gconftool-2 -R /system/http_proxy
This will provide a "name = value" listing of the tree, which may be usable in your application. Note that this requires a system() call, so it's not recommended for a deployed application, but it might help you get started.
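For use from C, here is a minimal, untested sketch against the GConf client API, assuming the standard /system/http_proxy keys and building with pkg-config gconf-2.0:

#include <stdio.h>
#include <gconf/gconf-client.h>

/* Build (sketch): gcc proxy_gconf.c $(pkg-config --cflags --libs gconf-2.0) */
int main(void)
{
    GConfClient *client;
    gboolean use_proxy;
    gchar *host;
    gint port;

    g_type_init();  /* required on older GLib versions */
    client = gconf_client_get_default();

    /* GNOME stores the HTTP proxy under /system/http_proxy */
    use_proxy = gconf_client_get_bool(client, "/system/http_proxy/use_http_proxy", NULL);
    host = gconf_client_get_string(client, "/system/http_proxy/host", NULL);
    port = gconf_client_get_int(client, "/system/http_proxy/port", NULL);

    if (use_proxy && host != NULL)
        printf("HTTP proxy: %s:%d\n", host, port);
    else
        printf("No GNOME HTTP proxy configured\n");

    g_free(host);
    g_object_unref(client);
    return 0;
}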
GNOME has its own place to store the proxy settings, and I am sure KDE or any other DE has its own place too. Maybe you can look for any mention of where proxy settings should be stored in the Linux Standard Base. That could hint at a standard way of doing it, irrespective of distro or DE.
DE -> Desktop Environment
char* proxy = getenv("all_proxy");
This statement puts the value of the environment variable called all_proxy, which is used by the system as a global proxy, in your C variable.
To print it in bash, try env | grep 'all_proxy' | cut -d= -f 2.
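Expanding on that, a small self-contained sketch that falls back across the commonly used variable names (exactly which names your system honours is an assumption); note that getenv() returns NULL when a variable is unset:

#include <stdio.h>
#include <stdlib.h>

/* Return the first proxy-related environment variable that is set, or NULL. */
static const char *get_proxy_env(void)
{
    static const char *names[] = { "all_proxy", "ALL_PROXY", "http_proxy", "HTTP_PROXY" };
    size_t i;

    for (i = 0; i < sizeof(names) / sizeof(names[0]); i++) {
        const char *value = getenv(names[i]);
        if (value != NULL && value[0] != '\0')
            return value;
    }
    return NULL;  /* no proxy configured in the environment */
}

int main(void)
{
    const char *proxy = get_proxy_env();
    printf("proxy: %s\n", proxy != NULL ? proxy : "(none)");
    return 0;
}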

Hosting Multiple Domains on Same Server Port with Apache2

How do I configure Apache2 via webmin or command-line (I'm using RHEL5 Linux) so that I can have multiple domains on the same server on the same port but in different subdirectories?
For instance, I'm trying to get homerentals.ws and homerepair.ws to be detected on port 80 (the default port) on the same server. I know that my DNS holds the two addresses, and web hits currently go to the same test page. Now all I need is for web hits to go to a subdirectory, but without showing that subdirectory. For instance, I do not want people going to http://homerentals.ws and being redirected to http://homerentals.ws/homerentals/. Instead, http://homerentals.ws should serve /var/www/html/homerentals, while http://homerepair.ws should serve /var/www/html/homerepair, with no visible difference in the URL.
On IIS, I did this once with host-header detection. But I don't know how to do it on RHEL5 Linux via webmin or file editing. I'm stuck.
The feature you're describing is known as name-based virtual hosts. Have a look at Apache's documentation. In general you need to edit Apache's httpd.conf file (on RHEL it typically lives at /etc/httpd/conf/httpd.conf) to make things happen; maybe it can be edited through Webmin, but I'm not familiar with it.
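As a rough sketch for Apache 2.2 (the version shipped with RHEL5), name-based virtual hosts for the two domains might look like this, placed in httpd.conf or a file under /etc/httpd/conf.d/:

NameVirtualHost *:80

<VirtualHost *:80>
    ServerName homerentals.ws
    DocumentRoot /var/www/html/homerentals
</VirtualHost>

<VirtualHost *:80>
    ServerName homerepair.ws
    DocumentRoot /var/www/html/homerepair
</VirtualHost>

Each VirtualHost serves its own DocumentRoot, so the subdirectory never shows up in the URL.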

Is it possible to use relative paths for SSIS packages dtsConfig files?

I am trying to make our SQL Server Integration Services packages as portable as possible, and the one thing preventing that is that the path to the config is always an absolute path, which makes testing and deployment a headache. Are there any suggestions for making this more manageable?
Another issue is that when another developer gets the package out of source control, the path is specific to the original developer's machine.
If you are trying to execute your packages using Visual Studio then the configuration file path will be hardcoded in there. So if you move your project around you'll need to change the path in the package settings. To avoid this you could use the Environment variable option to store the configuration file path. Then you'll only need to change that.
For testing and deployment however you should probably use the dtexec utility to execute your packages. Make some batch files for that. Preferably one for each different environment. Here the configuration file path can be relative.
dtexec /File Package.dtsx /Conf configuration.dtsConfig
This is if your packages are on the file system. You can also store them in SQL Server, and you can store your configuration in SQL Server as well, which may provide more flexibility.
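A hedged sketch of such a batch file (the package and config file names are placeholders); %~dp0 expands to the folder the batch file lives in, so the relative paths keep working wherever the project is checked out:

@echo off
rem Change to this script's folder so relative paths resolve (hypothetical file names)
cd /d "%~dp0"
dtexec /File Package.dtsx /Conf dev.dtsConfig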
After several hours trying to make this work, I found a solution here (not the best one, but it works):
Locate your configuration files (.dtsConfig files) in the same directory as your solution file (.sln file).
ALWAYS open your solution by double-clicking the solution file (.sln file). This sets the ‘working folder’ to the directory the solution lives in, so your configuration file will be read correctly.
Otherwise the relative paths did not work for me.
Check out the free utility that can edit SSIS configuration file paths without BIDS:
http://ssisconfigeditor.codeplex.com/
My stock-standard trick for these sorts of problems is mapping drives, either by using a mapped network drive or by using subst (both methods are interchangeable).
E.g. map the location of your packages to N:\, then inside your package use paths such as N:\MyParentPackage.dtsx and N:\MyChildPackage.dtsx. The packages can be on totally different drives, in different folders, on different machines; it'll all work once you map the package location to N:\.
I usually put a script alongside the project files that maps the drive, so it can easily be run beforehand. One gotcha: if you're using subst on Vista through Windows 8, map it for both elevated and non-elevated sessions.
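A sketch of such a mapping script (the folder name is a placeholder):

@echo off
rem Map N: to the folder that holds the packages; remove the mapping later with "subst N: /d"
subst N: "%~dp0Packages"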
I use the same approach for file references in Visual Studio projects. The only issue with this approach: use it to solve too many issues in your dev environment and you'll run out of drive letters.
