ReactJS: Storing very simple settings/constants - reactjs

I am very new to ReactJS and I might be thinking completely wrong. In our react app I make a lot of AJAX calls to the backend. For example, in dev I make calls to http://localhost:3000, in production I (naturally) use another host, which changes depending on the installation.
The hosts are static, set once and never change.
How do I make the host-information manageable in React?
I read about redux/flux etc to store global variable, but this is overkill for us. We just need to have one string (URL/host-info) that we can replace with another. We can store the info in a file, as a command-line param or whatever. We just need it to be simple.
UPDATE: It turns out that I did not fully understand the build system. As Dan pointed out, we use webpack to package the solution. Using a loader, we could swap out our configuration settings in the code. We ended up using a simple string-replacement loader (string-replace-webpack-plugin), since environment variables were not suitable for our solution.

What you're describing are usually known as environment variables. You generally maintain a specific set of environment variables for each context your application is developed or run in.
For instance you might have an APP_HOST environment variable which would be set differently at your local machine, than it would at your server.
Most programs that run on the server can read the environment variables directly, but because React apps run in the client's browser, you'll have to make them aware of the appropriate environment variables before they are served.
The easiest way to do this is with envify.
Envify will allow you to reference environment variables from your frontend code.
// app.js
const host = process.env.APP_HOST;
fetch(host);
Envify is a Browserify transform, meaning you'd need to run your code through a command like this.
# define some environment variables (export so the build tool can see them)
export APP_HOST="localhost:3000"
# build the code
browserify app.js -t envify -o bundle.js
What comes out the other side would be:
// bundle.js
var host = "localhost:3000";
fetch(host);
If you use Webpack, there's a similar alternative in the form of envify-loader.
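For reference, a comparable build-time substitution can be sketched with Webpack's built-in DefinePlugin; this is only an illustration (the entry/output file names are assumptions, and the original poster ultimately used string-replace-webpack-plugin instead):

```js
// webpack.config.js (sketch)
const webpack = require('webpack');

module.exports = {
  entry: './app.js',
  output: { filename: 'bundle.js' },
  plugins: [
    // Replaces every occurrence of process.env.APP_HOST in the source
    // with a literal string at build time, much like envify does for
    // Browserify.
    new webpack.DefinePlugin({
      'process.env.APP_HOST': JSON.stringify(
        process.env.APP_HOST || 'localhost:3000'
      ),
    }),
  ],
};
```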

Related

How to get generated content in Apache module and save it to file

I have been playing around with Apache module development and got the module working. However, I ran into the issue of where to hook properly to get all the data I need.
I am making a simple caching module that needs to hook at the beginning of the request and check if the file for this URL exists on disk and if it does then serve that file and stop content generation of Apache.
Currently, the module still continues into content generation. Let's say I have a long-running PHP script that takes 5s to generate. I would like to omit calling the script altogether and just serve the static file from disk.
Furthermore, if the local file does not exist, I would like Apache to execute content generation (actually executes the PHP script) and before sending that data to the client I would like to have a proper hook that somehow gets this data and saves it to a local file.
I have tried ap_hook_fixups and ap_hook_handler with APR_HOOK_LAST and all the variations, but no luck.
It always executes at the start of the request.
I also do not want to use any existing Apache modules. I want this to be a self-contained module.
Is there a way to do this kind of thing?
Based on the information you have provided, it sounds like you want your module to execute first, not last.
From what I understand of your issue, you want to check whether the file that would otherwise be generated already exists on disk, and if it does, serve that file instead of letting your PHP script generate it.
In this case, you'll want to use APR_HOOK_FIRST or APR_HOOK_REALLY_FIRST.
Then, assuming your file is on disk, you serve it, and at the end of your module's work you return OK;
If the file does not exist, you return DECLINED;
Returning DECLINED tells Apache that your module should not be the handler for this request, and it will continue down the list of modules until it finds one that will.
The goal here is to get your module to run before the PHP module does, to prevent your generation code from running, and to fall back onto the PHP module if the requested file was not found.
Note:
The APR_HOOK_* priorities are just numbers, from -10 to 30.
You should be able to fine-tune this if you find your module executing a little too soon, for example before mod_ssl.
Also, I am terrible at following documents, but the official Apache module development docs are amazingly assembled. Please try to use them if you haven't already.
I have spent the last 6 months messing around with Apache module development, working on a Telegram bot.
I have had to do this song and dance a few times now.
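To make the shape of this concrete, here is a minimal, untested sketch of such a handler. The module name, cache directory, and URL-to-path mapping are all placeholder assumptions, and the usual httpd build boilerplate is omitted:

```c
#include "httpd.h"
#include "http_config.h"
#include "http_protocol.h"
#include "apr_strings.h"

static int cache_handler(request_rec *r)
{
    apr_finfo_t finfo;
    /* Placeholder mapping from URL to a cached file on disk. */
    char *cached = apr_pstrcat(r->pool, "/var/cache/mymod", r->uri, NULL);

    if (apr_stat(&finfo, cached, APR_FINFO_SIZE, r->pool) == APR_SUCCESS) {
        apr_file_t *f;
        apr_size_t sent;
        /* File exists: serve it ourselves and stop content generation. */
        if (apr_file_open(&f, cached, APR_READ, APR_OS_DEFAULT,
                          r->pool) == APR_SUCCESS) {
            ap_send_fd(f, r, 0, finfo.size, &sent);
            apr_file_close(f);
            return OK;
        }
    }
    /* No cached copy: let mod_php (or whatever comes next) handle it. */
    return DECLINED;
}

static void register_hooks(apr_pool_t *p)
{
    /* Run before other handlers so PHP never starts for cached URLs. */
    ap_hook_handler(cache_handler, NULL, NULL, APR_HOOK_REALLY_FIRST);
}

module AP_MODULE_DECLARE_DATA mycache_module = {
    STANDARD20_MODULE_STUFF,
    NULL, NULL, NULL, NULL, NULL,
    register_hooks
};
```

The second half of the question, capturing generated content before it is sent, is a separate problem and is usually done with an output filter rather than a handler hook.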

Can REACT read windows environment variables?

Hi guys,
Can React read Windows environment variables? I know it wouldn't be best practice and I should be using the .env file, which is how I am going to do it in the end. But this seems like something I should be able to make my application do, and unfortunately everything on Google is about using the .env file, so I would still like to know the answer here.
Thanks guys.
React doesn't know anything about environment variables; it's a library that runs in an HTML page in your browser. (HTML and JavaScript don't know anything about environment variables, for that matter, and that's only a good thing.)
The bundler/toolchain you use may schlep some environment variables over to the JavaScript code, e.g.
If you're using create-react-app, see https://create-react-app.dev/docs/adding-custom-environment-variables/
If you're using vite, see https://vitejs.dev/guide/env-and-mode.html
If you're using Next.js, see https://nextjs.org/docs/basic-features/environment-variables
If you're using something else, see that toolchain's documentation.
All of the above-linked tools support reading environment variables from, well, the environment, as well as .env files (commonly known as envfiles).
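For instance, with create-react-app the convention looks like this; the variable name REACT_APP_API_HOST and the fallback URL are assumptions for illustration:

```javascript
// Hypothetical create-react-app code: at build time, CRA inlines any
// environment variable whose name starts with REACT_APP_, whether it
// came from the shell environment or from a .env file.
const apiHost = process.env.REACT_APP_API_HOST || "http://localhost:3000";
console.log("API host:", apiHost);
```

On Windows, setting such a variable before the build (e.g. `set REACT_APP_API_HOST=...` in cmd) works the same way as exporting it on Linux; the bundler, not React, is what reads it.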

How to register a local Julia package in a local registry?

I have a Julia package in my home directory under the name Foo. Foo is a directory with a Project.toml file that describes the package dependencies, name, etc. I want to be able to use this package from another folder in a particular manner, as follows.
Build the package Foo
Register Foo in a local Julia repository
Run Pkg.add("Foo") from anywhere on the system, such as script.jl which would have the following:
using Pkg
Pkg.add("Foo")
using Foo
# Now use Foo
Foo.bar()
Here is what I've tried so far.
Navigate to Baz directory where I want to run Baz/script.jl
From the REPL, hit ] and run dev --local PATH_TO_FOO
Back in the REPL, run using Foo
Foo is now accessible in the current repl session (or script)
Summary: I have managed to import a package Foo in another directory Baz.
Basically, I want to be able to write a script.jl that can make use of the local registry instead of this dev --local . method.
I assume that this is for personal use and you are not a system administrator for a large group.
Maintaining a registry is hard work. I would avoid creating a registry if at all possible. If your goal is to make the use of scripts less painful, a lighter solution exists: shared environments.
A shared environment is a regular environment, created in a canonical location. This means that Pkg will be able to locate it by name: you do not have to specify an explicit path.
To set up a shared environment:
give it a name: Pkg.activate("ScriptEnvironment"; shared=true)
populate it: Pkg.add(Pkg.PackageSpec(; path="/path/to/Foo")) (depending on your use case, you can also use Pkg.develop, add registered packages, etc)
that's all!
Now all your scripts can use the shared environment:
using Pkg
Pkg.activate("ScriptEnvironment"; shared=true)
using Foo
If other sets of scripts require different sets of packages, repeat the procedure with a different shared name. Pkg environments are really cheap so feel free to make liberal use of this approach.
Note: I would discourage beginning scripts with Pkg.add("Foo") because this carries the risk of inadvertently polluting the active environment.
There are some instructions on registry creation and maintenance at https://github.com/HolyLab/HolyLabRegistry.
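If a local registry really is required, the LocalRegistry.jl package automates most of the bookkeeping. A rough sketch, where the registry name and paths are assumptions:

```julia
using LocalRegistry

# Create the registry once (backed by a git repository; path is an
# assumption for illustration).
create_registry("MyRegistry", "/home/user/MyRegistry"; push=false)

# Register the local package so that Pkg.add("Foo") can resolve it.
using Foo
register(Foo; registry="MyRegistry")
```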

xinetd does not load environment variables set in /etc/profile.d

I am using xinetd to serve the output of check_mk_agent. I have custom check_mk_agent scripts, some of which are configured with environment variables. These environment variables are set in /etc/profile.d/set_env.sh. When I run check_mk_agent manually, the environment variables are found and the custom checks succeed. When I do telnet myhost 6556, the environment variables are not found and the custom checks fail.
My question is, what is a good way to ensure that set_env.sh gets run in the xinetd context? I would rather not use env and passenv variables in xinetd configuration, because it would be annoying to unnecessarily maintain environment variables in multiple places on the same host.
Thanks!
Edit the check_mk_agent file and add the following line just after #!/bin/bash:
source /etc/profile.d/set_env.sh
Save this, and retry.
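If editing check_mk_agent directly is undesirable (for example, because package updates would overwrite the change), another option is to point xinetd's `server` entry at a small wrapper instead. This is only a sketch, and all paths are assumptions:

```shell
# Write a hypothetical wrapper that loads the profile environment and
# then hands off to the real agent; xinetd would be configured with
# server = /tmp/check_mk_agent_wrapper instead of the agent itself.
cat > /tmp/check_mk_agent_wrapper <<'EOF'
#!/bin/bash
source /etc/profile.d/set_env.sh
exec /usr/bin/check_mk_agent "$@"
EOF
chmod +x /tmp/check_mk_agent_wrapper
```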

Getting proxy information on Linux programmatically

I am currently using libproxy to get the proxy information (if any) on RedHat and Debian Linux. It doesn't work all that well, but it's the only way I know I can use to get the proxy information from my code.
I need to stop using the lib since in most cases it doesn't recognize the proxy.
Is there any way to acquire the proxy information? What I mean is: is there a file (or group of files) I can read, an environment variable, an API, or a system call that I can use to get the information?
GNOME-based code is OK, KDE might help as well, but I am looking for something more generic.
The code is C.
Now, before anyone asks: I don't want to use libproxy anymore. Period. I don't want to start investigating why it doesn't work. I don't really want to know whether there is a new version of that lib. I know it might work, I just don't want to use it. I can't use it (just because). So please don't point me that way.
Code is appreciated.
Thanks.
In Linux, the "global proxy setting" is typically just environment variables that are usually set in /etc/profile. You can examine those variables to see what proxy is set.
The variables are:
http_proxy - the proxy for HTTP connections
ftp_proxy - the proxy for FTP connections
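A minimal C sketch of checking these variables; the lowercase-first lookup order is an assumption based on the convention honored by tools like curl and wget:

```c
#include <stdlib.h>

/* Return the first proxy-related environment variable that is set and
   non-empty, or NULL if none is configured. */
const char *find_proxy(void)
{
    const char *names[] = { "http_proxy", "HTTP_PROXY",
                            "all_proxy",  "ALL_PROXY", NULL };
    for (int i = 0; names[i] != NULL; i++) {
        const char *value = getenv(names[i]);
        if (value != NULL && value[0] != '\0')
            return value;
    }
    return NULL; /* no proxy configured in the environment */
}
```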
Using the Network Proxy Preferences tool under GNOME saves the information in the GConf database. The paths to the keys are /system/http_proxy and /system/proxy. You can read about the details of those trees at this page.
You can access the GConf database using the library API. Note that GConf is based on GObject. To examine the contents of this tree using the command line, try the following:
gconftool-2 -R /system/http_proxy
This will provide a "name = value" listing of the tree, which may be usable in your application. Note that this requires a system() call, so it's not recommended for a deployed application, but it might help you get started.
GNOME has its own place to store the proxy settings, and I am sure KDE or any other DE (desktop environment) has its own place too. Maybe you can look for any mention of where proxy settings should be stored in the Linux Standard Base. That could point you to a standard way of doing it, irrespective of distro or DE.
char *proxy = getenv("all_proxy"); /* declared in <stdlib.h>; returns NULL if unset */
This statement puts the value of the environment variable called all_proxy, which the system uses as a global proxy, into your C variable.
To print it in bash, try env | grep 'all_proxy' | cut -d= -f 2.
