How to compile Elm with React in Browserify

I want to start using Elm in a React project, but they are not using Webpack, they are using Browserify with Gulp instead. How can I get Elm to be compiled with Browserify?

In the end, the best way around it I found was to compile the Elm code first, and only then compile the JS. This means that, unlike with elm-webpack-loader, where you can require .elm files directly from JS, I have to require the compiled Elm code.
Here is a Gulp task to compile all Main.elm files in a directory and any of its subdirectories into a single .js file. You can then import the modules with:
import { Widget1, Widget2 } from "../compilation-folder/myapp.js"
Here is the task:
const path = require("path");
const shell = require("gulp-shell");
const glob = require("glob");

// Compiles all Main.elm files in the specified src folder and its
// subfolders into the single output file specified.
module.exports = (src, dest, outname) => {
  const output = path.join(dest, outname);
  return done => {
    glob(src + "/**/Main.elm", {}, (err, filesArray) => {
      if (err) return done(err);
      const files = filesArray.join(" ");
      shell.task(`elm-make --yes ${files} --output=${output}`)(done);
    });
  };
};
You can use it like this:
const buildElm = require("./fileWithCodeAbove.js")
gulp.task("build-elm", buildElm("./elm-directory", "./build-directory", "myapp.js");


react-scripts build: generate a new hash even if the code has not changed

I built a React app with create-react-app (without ejecting).
I want to generate a new hash on every build even if the code has not changed (because of a caching issue).
I installed react-app-rewired to use config overrides and changed package.json to
"build": "react-app-rewired build",
In config-overrides.js I'm trying to create a new hash for each build output (minified JS, CSS and so on), but I'm not sure I'm doing it the right way:
require('dotenv').config();
var uniqid = require('uniqid');
const FileManagerPlugin = require('filemanager-webpack-plugin');
const CopyPlugin = require('copy-webpack-plugin');
const HtmlWebPackPlugin = require('html-webpack-plugin');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');

module.exports = {
  webpack: function (config, env) {
    console.log('output config before', config.output);
    const buildHash = uniqid();
    config.output.filename = `static/js/[name].${buildHash}.js`;
    config.output.chunkFilename = `static/js/[name].${buildHash}.chunk.js`;
    console.log('output config after', config.output);
    return config;
  },
};
When I deploy it to production, the build hash looks the same if the code has not changed. I'm not sure if I configured config-overrides.js correctly; maybe I need to add something to the webpack config.
I want every build to generate a new unique name for the JS, CSS and HTML files.
If the input is the same, that is, if you did not change anything in the codebase, webpack will ALWAYS produce the same hash. This is the main property of a hash function. So you have to make sure something in the codebase always changes. To achieve this you can write a random value to a JS file and import it in app.js, so webpack will see that the file's input has changed. In your webpack.config.js:
const crypto = require("crypto");
const fs = require("fs");

// Generate a random value on every build and write it out as a valid
// JavaScript module (synchronously, so it is guaranteed to be on disk
// before webpack starts reading modules).
const content = crypto.randomBytes(8).toString("hex");
const value = JSON.stringify(content);
fs.writeFileSync("src/test.js", `export const a = ${value};`);
This test.js has no effect on your code, so you can safely import it in app.js:
import "./test.js";
If you don't import it, this will not work, because webpack only processes code that is imported.
In webpack.config.js I defined the filename like this:
output: {
  filename: "[name].[hash].js",
  path: path.join(__dirname, "dist"),
  publicPath: "/",
},
Every time I run npm run build it creates a different hash.
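For completeness: a similar effect is possible without writing into src/ at all, by injecting a per-build value with webpack's DefinePlugin. This is a sketch rather than part of either answer, and __BUILD_ID__ is a made-up name; note that some module in the bundle still has to reference it for its content, and therefore its hash, to change:

// webpack.config.js
const webpack = require("webpack");

module.exports = {
  // ...the rest of your config
  plugins: [
    new webpack.DefinePlugin({
      // substituted at compile time into every module that references it
      __BUILD_ID__: JSON.stringify(Date.now().toString(36)),
    }),
  ],
};

// somewhere in app.js
console.log("build:", __BUILD_ID__);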

TypeError: feature is not a function in discord.js using WOK tutorials

I'm coding a Discord.js bot using the WOK YouTube tutorials as a base.
I refactored all my code and made a load-features file to load many of my commands into index.js automatically.
Here is the code for that file:
const path = require("path");
const fs = require("fs");

module.exports = (client) => {
  const readFeatures = (dir) => {
    const files = fs.readdirSync(path.join(__dirname, dir));
    for (const file of files) {
      const stat = fs.lstatSync(path.join(__dirname, dir, file));
      if (stat.isDirectory()) {
        readFeatures(path.join(dir, file));
      } else if (file !== "load-features.js") {
        const feature = require(path.join(__dirname, dir, file));
        console.log(`Enabling feature "${file}"`);
        feature(client);
      }
    }
  };
  readFeatures(".");
};
The error says TypeError: feature is not a function when calling feature(client). It was working fine yesterday but suddenly stopped working.
If you need more examples of the code itself, here is a link to the tutorial repository, where this file is identical to mine, along with how it is connected to index.js: https://github.com/AlexzanderFlores/Worn-Off-Keys-Discord-Js/tree/master/43-Refactoring
Please advise.
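By definition, this error means that one of the require() calls returned something that is not a function: some file picked up by readFeatures does not export a function (a stray .json file, an empty file, or a module exporting an object will all trigger it). One way to find the culprit is to guard the call inside the else if branch, e.g. (a debugging sketch, not from the tutorial):

const feature = require(path.join(__dirname, dir, file));
if (typeof feature !== "function") {
  console.warn(`Skipping "${file}": it does not export a function`);
  continue;
}
console.log(`Enabling feature "${file}"`);
feature(client);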

How to reuse the "public" folder in a monorepo with multiple CRAs?

I use create-react-app in multiple packages in a monorepo. There is a fair amount of duplicated code and files in the "public" folder of each app, since they all have the same icons, descriptions, fonts etc.
Is there a way to move some or all of the files in the "public" folder to their own package, run them through a tool like handlebars.js and finally bundle them with create-react-app, without ejecting?
I couldn't find any open-source tool for this, so I wrote my own scripts:
prepare_cra_files.sh
#!/usr/bin/env bash

for app in apps/*/ ; do
  # Flush the files if they already exist
  if [ -d "${app}public" ]; then
    rm -r "${app}public"
  fi
  # Copy over the template files
  cp -r template "${app}public"
done

node templatify.js
templatify.js
const Handlebars = require("handlebars");
const fs = require("fs-extra");
const path = require("path");

const APPS_PATH = path.join(__dirname, "..", "apps");
const INDEX_HTML_TEMPLATE_PATH = path.join(__dirname, "template", "index.handlebars");

(async () => {
  const dirs = await fs.readdir(APPS_PATH);
  const indexHtmlTemplate = Handlebars.compile(
    await fs.readFile(INDEX_HTML_TEMPLATE_PATH, "utf-8")
  );
  // A plain loop rather than forEach(async ...) so errors propagate
  // and the apps are processed one at a time.
  for (const appName of dirs) {
    const indexHtmlContextPath = path.join(APPS_PATH, appName, "handlebars", "index.json");
    if (!fs.existsSync(indexHtmlContextPath)) {
      throw new Error(`Please provide the index.html context for the ${appName} app`);
    }
    const indexHtmlContext = JSON.parse(await fs.readFile(indexHtmlContextPath, "utf-8"));
    const indexHtml = indexHtmlTemplate(indexHtmlContext);
    await fs.writeFile(path.join(APPS_PATH, appName, "public", "index.html"), indexHtml);
    await fs.remove(path.join(APPS_PATH, appName, "public", "index.handlebars"));
  }
})();
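To keep the generated public folders fresh, the preparation step can be hooked in before the builds; for example, at the monorepo root (a hypothetical package.json snippet; the script names and workspace runner depend on your setup):

{
  "scripts": {
    "prebuild": "./prepare_cra_files.sh",
    "build": "yarn workspaces run build"
  }
}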

webpack separate build files

I have a nested directory structure with jsx modules, like
app/js/header/index.jsx
app/js/task/runner.jsx
and so on
Is it possible to have webpack transpile each one of them and output the result in the same directory as the .jsx file?
Regards
If I understand you correctly, you want to put the resulting module next to each source module. It seems that you can achieve this with a plugin:
var fs = require('fs');

function MyPlugin() {}

MyPlugin.prototype.apply = function (compiler) {
  compiler.plugin('emit', function (compilation, callback) {
    compilation.modules.forEach(m => {
      // test the filename to exclude node_modules
      if (/filename/.test(m.resource)) {
        fs.writeFileSync(m.resource + '.transpiled', m._source._value);
      }
    });
    callback();
  });
};
and in the webpack config:
{
  ...
  plugins: [ new MyPlugin() ],
  ...
}
Is that what you are trying to do?
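One caveat: compiler.plugin(...) is the webpack 1-3 plugin API. On webpack 4+ the same idea would be expressed with the hooks API; a sketch (note that m._source._value is internal, underscore-prefixed API either way):

MyPlugin.prototype.apply = function (compiler) {
  compiler.hooks.emit.tapAsync('MyPlugin', (compilation, callback) => {
    compilation.modules.forEach(m => {
      if (/filename/.test(m.resource)) {
        fs.writeFileSync(m.resource + '.transpiled', m._source._value);
      }
    });
    callback();
  });
};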

Dynamically loading multiple entry points and splitting output

I have several different Angular modules that I want to dynamically concatenate so that each module ends up as its own single output file. So if the resources/assets/js/angular/modules directory contains:

Role
  - controllers
    - roleIndexController
  - app.js
Descriptions
  - controllers
    - descriptionIndexController
  - app.js

I want this to end up as two files:

RoleModule.js
DescriptionsModule.js

And I don't want to have to explicitly add each module to the gulp file every time I add a new one.
I've tried adding glob characters using webpack in laravel elixir:
require('laravel-elixir-webpack');

elixir(function (mix) {
  mix.webpack(
    ['angular/app.js', './angular/modules/**/app.js'],
    {
      output : {
        filename : './angular/app.js'
      }
    });
});
and also just using gulp-webpack and glob_entries:
gulp.task('compileAngularScripts', function () {
  return gulp.src('angular/app.js')
    .pipe(webpack({
      entry : glob_entries('./resources/assets/js/angular/modules/**/app.js'),
      output : {
        filename : 'app.js'
      }
    }))
    .pipe(gulp.dest('test/'));
});
I can't even seem to get the glob wildcard characters to work to dynamically concatenate files, much less break each module into its own file.
Is there a way to do this using gulp?
I accomplished this by using the glob npm package with webpack:
var glob = require('glob');
var webpack = require('gulp-webpack');

elixir(function (mix) {
  mix.task('compileScripts');
});

gulp.task('compileScripts', function () {
  return gulp.src('angular/app.js')
    .pipe(webpack({
      entry : entries(),
      output : {
        // "[name]" is replaced with each key of the entries object,
        // producing RoleModule.js, DescriptionsModule.js, etc.
        filename : "[name]Module.js"
      }
    }))
    .pipe(gulp.dest('public/js/angular/modules'));
});

// Build an entry object keyed by module name, one entry per app.js
// found under the modules directory.
function entries() {
  var entries = {};
  glob.sync('./modules/**/Resources/assets/angular/app.js').forEach(function (url) {
    var moduleName = url.split("/")[2]; // e.g. "Role" from "./modules/Role/..."
    entries[moduleName] = url;
  });
  return entries;
}
