Is there a way to not use the tilde character in the create-react-app build js file names?
After I run npm run build on my create-react-app, I get a couple of runtime~main.[hash].js files in the /static/js/ folder. My destination file system does not allow tilde characters in file or folder names.
Can I change the config somewhere to output runtime-main.[hash].js, with a - instead of a tilde?
https://facebook.github.io/create-react-app/docs/production-build
Solved without ejecting:
- Install rescripts in your project: npm install @rescripts/cli --save-dev
- Create a file .rescriptsrc.js in the root directory of your project and put the code below inside:
module.exports = config => {
  config.optimization.splitChunks.automaticNameDelimiter = '_';
  if (config.optimization.runtimeChunk) {
    config.optimization.runtimeChunk = {
      name: entrypoint => `runtime_${entrypoint.name}`
    };
  }
  return config;
};
In package.json, in the scripts section, add a new entry: "build-re": "rescripts build"
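For reference, the scripts section would then look something like this (the other entries shown are just the usual create-react-app defaults and may differ in your project):
  "scripts": {
    "start": "react-scripts start",
    "build": "react-scripts build",
    "build-re": "rescripts build",
    "test": "react-scripts test"
  }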
Now when running npm run build-re it will generate _ instead of ~ in file names.
https://github.com/harrysolovay/rescripts
I made a React app that I want to build for different servers / customers.
I created a folder configurations with different .env files. E.g.: .env.foo, .env.bar, .env.tree, .env.mirror, .env.backup.
To point out three values as an example, I have:
REACT_OWNER_NAME=Malcom X
REACT_OWNER_MAIL=mx@mail.com
REACT_FTP_FOLDER_PATH=foo.domain.com
In my App.js I have
const App = () => {
  return (
    <>
      {process.env.REACT_OWNER_NAME}<br />
      {process.env.REACT_OWNER_MAIL}
    </>
  );
}
Now I want to place each configurations/.env* file in the root, build the app and deploy it to the different servers.
To automate this, I built the following builder.sh script:
mv .env env
for CURR in foo bar tree mirror backup
do
  cp configurations/.env.$CURR .env
  export $(cat .env | grep -v '#' | awk '/=/ {print $1}')
  if [ -z ${REACT_FTP_FOLDER_NAME+x} ]; then foldername=$CURR; else foldername=$REACT_FTP_FOLDER_NAME; fi
  yarn build
  mkdir apps/${foldername}
  mv build/* apps/${foldername}/.
  rm -rf build
  mv .env apps/${foldername}/.env
done
mv env .env
Now my issue: when I run the script with for CURR in foo, then with for CURR in bar, and so on (each file one by one), I have no problem; I get all the apps built into apps/.
But when I run it with for CURR in foo bar tree mirror backup, the output is the same, except that when I log process.env, all variables containing a space get cut off at the space.
When I just yarn build the apps, there is no issue. When I add "" around the strings, I get REACT_OWNER_NAME: "\Malcom".
In every case, when I compare configurations/.env.foo with apps/foo.domain.com/.env, there is no difference between the files.
And the funniest thing... the first app (in this example foo) is correct.
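A likely culprit (an assumption on my part, not confirmed above) is the export $(cat .env | ...) line: awk '{print $1}' keeps only the first whitespace-separated field, so a value such as Malcom X is truncated to Malcom before it is exported. A minimal sketch of a more robust replacement for that line, assuming plain KEY=VALUE lines in the .env files:
  # Export each non-comment KEY=VALUE line as-is, preserving spaces in values.
  while IFS= read -r line; do
    case "$line" in ''|'#'*) continue ;; esac   # skip blank lines and comments
    export "$line"
  done < .env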
So I have this codegen.yml
overwrite: true
schema:
  - ${REACT_APP_GRAPHQL_URL}:
      headers:
        'x-hasura-admin-secret': ${REACT_APP_GRAPHQL_ADMIN_SECRET}
documents: "./src/**/*.{ts,tsx}"
generates:
  src/generated/graphql.tsx:
    plugins:
      - "typescript"
      - "typescript-operations"
      - "typescript-react-apollo"
    config:
      withHooks: true
My .env looks like this:
REACT_APP_GRAPHQL_URL=https://somesite.com/graphql
REACT_APP_GRAPHQL_ADMIN_SECRET=abcde1234
but it fails every time I run codegen, with both npm run codegen and npm run codegen -r dotenv/config. I've tried changing the quote marks, spaces, etc., but it still doesn't work. When I replace the environment variables with the literal URL and admin secret, it runs fine. What did I do wrong?
Not the best solution, but it worked for me.
Add your variables before calling the script, like:
REACT_APP_GRAPHQL_URL=$(grep REACT_APP_GRAPHQL_URL .env | cut -d '=' -f2)
so the script should look like:
"scripts": {
"graphql-codegen": "REACT_APP_GRAPHQL_URL=$(grep REACT_APP_GRAPHQL_URL .env | cut -d '=' -f2) REACT_APP_GRAPHQL_ADMIN_SECRET=$(grep REACT_APP_GRAPHQL_ADMIN_SECRET .env | cut -d '=' -f2) graphql-code-generator --config ./codegen.yml",
},
Then just run
npm run graphql-codegen
The goal I am trying to achieve is to build a Docker image (with a React app inside) that uses environment variables from the host.
Planned workflow:
Build the docker image locally
Upload the docker image
Call command docker-compose up
I want the environment variable REACT_APP_SOME_ENV_VARIABLE of the system (where the image is hosted) to be usable by the react app.
Current solution:
// App.js
function App() {
  return (
    <p>SOME_ENV_VARIABLE = {process.env.REACT_APP_SOME_ENV_VARIABLE}</p>
  );
}
# Dockerfile
FROM node:13.12.0-alpine as build-step
# Install the app
RUN mkdir /app
WORKDIR /app
COPY package.json /app
RUN npm install --silent
# Build the app
COPY . /app
RUN npm run-script build
# Create nginx server and copy build there
FROM nginx:1.19-alpine
COPY --from=build-step /app/build /usr/share/nginx/html
# docker-compose.yml
version: '3.5'
services:
  react-env:
    image: react-env
    ports:
      - 80:80/tcp
    environment:
      - REACT_APP_SOME_ENV_VARIABLE=FOO
What am I doing wrong and how do I fix it?
This was solved by using an NGINX Docker image, injecting the compiled React production code into the NGINX html folder, and then modifying the docker-entrypoint.sh file.
FROM nginx:1.19-alpine
COPY --from=build-step /app/build /usr/share/nginx/html
COPY ./docker/docker-entrypoint.sh /docker-entrypoint.sh
Then, in that file, add the following code at the end of the old script (the full modified script is shown below):
#!/bin/sh
# vim:sw=4:ts=4:et

set -e

if [ -z "${NGINX_ENTRYPOINT_QUIET_LOGS:-}" ]; then
    exec 3>&1
else
    exec 3>/dev/null
fi

if [ "$1" = "nginx" -o "$1" = "nginx-debug" ]; then
    if /usr/bin/find "/docker-entrypoint.d/" -mindepth 1 -maxdepth 1 -type f -print -quit 2>/dev/null | read v; then
        echo >&3 "$0: /docker-entrypoint.d/ is not empty, will attempt to perform configuration"
        echo >&3 "$0: Looking for shell scripts in /docker-entrypoint.d/"
        find "/docker-entrypoint.d/" -follow -type f -print | sort -n | while read -r f; do
            case "$f" in
                *.sh)
                    if [ -x "$f" ]; then
                        echo >&3 "$0: Launching $f";
                        "$f"
                    else
                        # warn on shell scripts without exec bit
                        echo >&3 "$0: Ignoring $f, not executable";
                    fi
                    ;;
                *) echo >&3 "$0: Ignoring $f";;
            esac
        done
        echo >&3 "$0: Configuration complete; ready for start up"
    else
        echo >&3 "$0: No files found in /docker-entrypoint.d/, skipping configuration"
    fi
fi
# Set up endpoint for env retrieval
echo "window._env_ = {" > /usr/share/nginx/html/env_config.js
# Collect environment variables for React
eval environment_variables="$(env | grep REACT_APP.*=)"
# Loop over variables
env | grep REACT_APP.*= | while read -r line;
do
    printf "%s',\n" "$line" | sed "s/=/:'/" >> /usr/share/nginx/html/env_config.js
    # Notify the user
    printf "Env variable %s' was injected into React App. \n" "$line" | sed "0,/=/{s//:'/}"
done
# End the object creation
echo "}" >> /usr/share/nginx/html/env_config.js
echo "Environment Variable Injection Complete."

exec "$@"
Functionality:
This will scan all environment variables passed to the Docker container running the frontend, extract the ones starting with REACT_APP, and add them to a file named env_config.js.
All you need to do in the React app is load that script file, then access the environment variables using window._env_.<property>.
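For example (a sketch, assuming a create-react-app-style public/index.html and the REACT_APP_SOME_ENV_VARIABLE name from the question):
  <!-- public/index.html: load the generated config before the app bundle -->
  <script src="%PUBLIC_URL%/env_config.js"></script>

  // App.js: read the injected value from the global object
  function App() {
    return (
      <p>SOME_ENV_VARIABLE = {window._env_.REACT_APP_SOME_ENV_VARIABLE}</p>
    );
  }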
DISCLAIMER
Environment variables injected with this method are fully readable by anyone using the site. This is not a secure method for sensitive information. Only use it for things such as "where is the backend API endpoint" or other non-sensitive information that could be extracted just as easily.
In your approach, the environment variables are injected when the container starts, but by that time your app is already built and the Docker image is created; you also cannot access process.env on the client side. Therefore, to access them on the client side, we have to do the steps below.
You must be using webpack in your React app for bundling and other build steps.
So in your webpack.config.js, declare your environment variable REACT_APP_SOME_ENV_VARIABLE using the DefinePlugin, which will make it available as a global variable in the app.
Your webpack config should look something like this:
const path = require("path");
const webpack = require("webpack");

module.exports = {
  target: "web",
  performance: {
    hints: false,
  },
  node: {
    fs: "empty"
  },
  entry: "./src/index.js",
  output: {
    path: path.join(__dirname, "/build"),
    filename: "[name].[contenthash].js"
  },
  module: {
    rules: [
      // your rules
    ]
  },
  plugins: [
    new webpack.DefinePlugin({
      "envVariable": JSON.stringify(process.env.REACT_APP_SOME_ENV_VARIABLE),
    }),
  ],
};
And in your App, you can use the variable like this
// App.js
function App() {
  return (
    <p>SOME_ENV_VARIABLE = {envVariable}</p>
  );
}
NOTE: Make sure that your environment variables are injected into the Docker container before the RUN npm run-script build command is run.
For that, you should declare your environment variables in the Dockerfile using ENV before the RUN npm run-script build step.
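A minimal sketch of that Dockerfile change, assuming the value is passed in at image build time as a build argument (the --build-arg usage is illustrative, not part of the original setup):
  FROM node:13.12.0-alpine as build-step
  # Supplied with: docker build --build-arg REACT_APP_SOME_ENV_VARIABLE=FOO .
  ARG REACT_APP_SOME_ENV_VARIABLE
  # Make it visible to the npm build step below
  ENV REACT_APP_SOME_ENV_VARIABLE=$REACT_APP_SOME_ENV_VARIABLE
  WORKDIR /app
  COPY package.json /app
  RUN npm install --silent
  COPY . /app
  RUN npm run-script build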
My Jenkins pipeline runs on a slave node using agent { node { label 'slave_node1' } }.
I use a Jenkins file parameter named uploaded_file and upload a file called hello.pdf.
My pipeline contains the following code
stage('Precheck') {
    steps {
        sh "echo ${WORKSPACE}"
        sh "echo ${uploaded_file}"
        sh "ls -ltr ${WORKSPACE}/*"
    }
}
Output:
/web/jenkins/workspace/MYCOPY
hello.pdf
ls: cannot access /web/jenkins/workspace/MYCOPY/* No such file or directory
As you can see, no files were found in the slave's WORKSPACE.
Can you help me understand whether I'm checking for the uploaded file in the correct location, i.e. under the WORKSPACE directory?
How can I get the file uploaded to the slave's WORKSPACE?
I'm on Jenkins version 2.249.1.
Can I get this to work at least on the latest version of Jenkins?
So do you have a fixed file that is copied in every build, i.e. is it the same file each time?
In that case, you can save it as a secret file in Jenkins and do the following:
environment {
    FILE = credentials('my_file')
}
stages {
    stage('Preparation') {
        steps {
            // Copy your file to the workspace
            sh "cp ${FILE} ${WORKSPACE}"
            // Verify the file is copied.
            sh "ls -la"
        }
    }
}
I'm running a conf file in Logstash which gives the required output, but when I run ls to list the file, it has not been created. I need help resolving this issue.
conf file:
input {
  elasticsearch {
    hosts => "localhost:9200"
    index => "index1"
  }
}
output {
  file {
    path => "/opt/elk/logstash-6.5.0/example.json"
  }
  stdout {}
}
Result of execution:
[2019-10-13T14:07:47,503][INFO ][logstash.outputs.file ] Opening file {:path=>"/opt/elk/logstash-6.5.0/example.json"}
{
"#timestamp" => 2019-10-13T08:37:46.715Z,
"example" => [
//data within example//
}
[2019-10-13T16:44:19,982][INFO ][logstash.pipeline] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x4f8848b run>"}
But when I run ls to look for example.json, it doesn't show up.
ls command:
elastic@es-VirtualBox:~/opt/elk/logstash-6.5.0$ ls
bin CONTRIBUTORS fetch.conf Gemfile.lock lib logs logstash-core-plugin-api NOTICE.TXT throw.conf vendor config data Gemfile grokexample.conf LICENSE.txt logstash-core modules output.json tools x-pack
So I was wondering: does the conf only create a temporary file?
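One observation (from the output above, offered as a guess rather than a confirmed answer): the file output writes to the absolute path /opt/elk/logstash-6.5.0/example.json, while the ls shown was run in ~/opt/elk/logstash-6.5.0, which is a different directory. A quick check, for example:
  # List the absolute path reported by the file output plugin, not the copy under $HOME
  ls -l /opt/elk/logstash-6.5.0/example.json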