How to use a Service Worker to cache a virtual file? - reactjs

I am attempting to implement a service worker for a boilerplate project I'm working on (https://github.com/jonnyasmar/gravity-bp feedback welcome!), but I've hit a snag :(
Problem:
I'm serving the index.html for this boilerplate virtually, as an interpreted Twig template, via ExpressJS. However, because the service worker is generated at build time, and that is where I point it to the cacheable static assets, I can't figure out how to tell it to cache the virtual index.html that ExpressJS serves at runtime.
My most successful attempts cache all static assets (including the asset-manifest.json generated at build time), but they will not cache the virtual index.html.
If I convert it to a static HTML file, the service worker does cache it successfully.
Questions:
Is there a way to correct my code to accomplish this?
Is there anything wrong with doing it this way?
If yes to #2, how would you recommend handling this and why?
See the full source on GitHub.
Relevant code:
webpack.config.js:
output: {
  filename: '[name].js',
  chunkFilename: '[chunkhash].js',
  path: path.resolve(__dirname, 'public'),
  publicPath: '/'
},
plugins: [
  new ManifestPlugin({
    fileName: 'asset-manifest.json',
  }),
  new SWPrecacheWebpackPlugin({
    cacheId: 'gravity-bp',
    dontCacheBustUrlsMatching: /\.\w{8}\./,
    filename: 'sw.js',
    minify: true,
    navigateFallback: 'index.html',
    stripPrefix: 'public/',
    swFilePath: 'public/sw.js',
    staticFileGlobs: [
      'public/index.html',
      'public/**/!(*map*|*sw*)',
    ],
  })
]
sw.ts:
const swUrl: string = 'sw.js';

export const register = (): void => {
  if (process.env.NODE_ENV === 'production' && 'serviceWorker' in navigator) {
    const sw: ServiceWorkerContainer = navigator.serviceWorker;

    sw.register(swUrl).then(registration => {
      registration.onupdatefound = (): any => {
        const installer: ServiceWorker = registration.installing;

        installer.onstatechange = (): any => {
          if (installer.state === 'installed') {
            if (sw.controller) {
              console.log('New content available.');
            } else {
              console.log('Content cached for offline use.');
            }
          }
        };
      };
    }).catch((error) => {
      console.error('Failed to register service worker:', error);
    });
  }
};

export const unregister = (): void => {
  if ('serviceWorker' in navigator) {
    navigator.serviceWorker.ready.then(registration => {
      registration.unregister();
    });
  }
};
server.ts:
import * as path from 'path';

const twig = require('twig').__express;
const express = require('express');
const compression = require('compression');
const pkg = require('../../package.json');
const version = pkg.version;

let app = express(),
    ip = '0.0.0.0',
    port = 3000,
    views = path.resolve('./src/views');

app.use(compression());
app.use(express.static('public'));
app.set('view engine', 'twig');
app.engine('.twig', twig);
app.set('views', views);

// Routes
app.get("*", function(req: any, res: any, next: any) {
  // vars
  res.locals.version = version;
  res.render('index');
});

let server = app.listen(port, ip, function() {
  let host = server.address().address;
  let port = server.address().port;
  console.log('Gravity Boilerplate ready at http://%s:%s', host, port);
});

The sw-precache-webpack-plugin documentation covers the underlying sw-precache options. The one you should investigate is the dynamicUrlToDependencies setting. See this link for more info:
https://github.com/GoogleChromeLabs/sw-precache/issues/156
dynamicUrlToDependencies [Object<String, Buffer|Array<String>>]
For example, maybe start with this to test:
dynamicUrlToDependencies: {
  '/': 'MAGIC_STRING_HERE'
},
So really, you need to configure the sw-precache webpack plugin to treat the server-rendered page as the navigateFallback route.
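For the setup in the question, a hedged sketch of that configuration (the src/views/index.twig path is assumed from server.ts; adjust it to the actual template file):
new SWPrecacheWebpackPlugin({
  cacheId: 'gravity-bp',
  filename: 'sw.js',
  stripPrefix: 'public/',
  swFilePath: 'public/sw.js',
  // Fall back to the server-rendered route instead of a static index.html.
  navigateFallback: '/',
  // Re-cache '/' whenever the Twig template it is rendered from changes.
  dynamicUrlToDependencies: {
    '/': ['src/views/index.twig'], // assumed path to the Twig view
  },
  staticFileGlobs: [
    'public/**/!(*map*|*sw*)',
  ],
})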

Related

Fetching from localhost server in Create React App during development vs fetching from deployment during production

This is a personal portfolio page that I'm implementing a contact form within, using nodemailer.
The nodemailer part is all set on the server side. I just need some advice on pointing the client POST request to the right place with regard to development and deployment.
I figured I would set an environment variable for production vs development and base the fetch URL on that. Now I'm just wondering how to find what I should put in the fetch for production.
Would it just be pointing back into my own app, something like:
fetch('https://www.mydomain.com/send-email', data) ...
I'm in the Heroku docs trying to figure this out.
Basically, I have a huge blind spot: hitting a server API from Create React App when the server isn't running independently on localhost:3000. I have yet to hit a server route from my client that wasn't served locally on localhost. When I push this to Heroku I need the right route or config, so what I need is advice on how to do this.
I understand proxying somewhat. I'm just wondering what the steps are to properly hit my server route from a client/server deployed on Heroku, as opposed to localhost:3000 during development.
When I'm in development I pretty much always axios.post to a server that I've spun up on localhost:3000, which I then hit with something like this from my client:
axios.post('localhost:3000/send-email', data)
  .then(() => {
    setSent(true)
  })
  .then(() => {
    resetForm()
  })
  .catch((err) => {
    console.log('Message not sent', err)
  })
...which is then handled by an endpoint on the express server listening on localhost:3000, that looks somewhat like what I've pasted below.
const express = require('express'),
      bodyParser = require('body-parser'),
      nodemailer = require('nodemailer'),
      cors = require('cors'),
      path = require('path'),
      port = process.env.PORT || 3000,
      publicPath = path.join(__dirname, '..', 'build');

require('dotenv').config();

const app = express();
app.use(cors());
app.use(express.static(publicPath));
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));

app.get('*', (req, res) => {
  res.sendFile(path.join(publicPath, 'index.html'));
});

app.post('/send-email', (req, res) => {
  console.log('request: ', req.body)
  let data = req.body;

  let transporter = nodemailer.createTransport({
    service: 'gmail',
    port: 465,
    auth: {
      user: process.env.EMAIL,
      pass: process.env.PASSWORD
    }
  });

  let mailOptions = {
    from: data.email,
    to: process.env.EMAIL,
    subject: `${data.subject}`,
    html: `<p>${data.name}</p>
           <p>${data.email}</p>
           <p>${data.message}</p>`
  };

  // Note: the callback parameter is named `info` so it doesn't shadow the
  // Express `res` object used to reply to the client.
  transporter.sendMail(mailOptions, (err, info) => {
    if (err) {
      res.send(err)
    } else {
      res.send('Success')
    }
    transporter.close();
  });
})

app.listen(port, () => {
  console.log(`Server is up on port ${port}!`);
});
The folder structure is like this:
main
|-server
|-server.js
|-src
|-components
|-Contact.js
Use the process.env.NODE_ENV variable to tell the environments apart.
When you run npm start, it is always equal to 'development', when you run npm test it is always equal to 'test', and when you run npm run build to make a production bundle, it is always equal to 'production'. You cannot override NODE_ENV manually.
Therefore, you can create and export a function like
export function apiDomain() {
  const production = process.env.NODE_ENV === 'production'
  return production ? 'anotherDomain' : 'localhost:3000'
}
or maybe, depending on your requirements
export function apiDomain() {
  const { protocol, hostname, origin } = window.location
  return hostname === 'localhost' ? `${protocol}//${hostname}` : origin
}
For more details, take a look at https://create-react-app.dev/docs/adding-custom-environment-variables/
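For example, a usage sketch in the component that sends the form (the import path for the helper is assumed; data and setSent come from the question's code):
import axios from 'axios';
import { apiDomain } from './apiDomain'; // wherever the helper above lives

// Build the endpoint from the environment-aware domain helper.
axios.post(`${apiDomain()}/send-email`, data)
  .then(() => setSent(true))
  .catch((err) => console.log('Message not sent', err));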

plugging nextjs with keystonejs

I am trying to plug nextjs into keystonejs so that I can use reactjs for the frontend and keystonejs as the CMS, but it's not working. I was following a tutorial to do so; although it worked in the tutorial, I don't know why it's not working in my case.
tutorial website
This is how I tried to configure it.
keystone.js
// Simulate config options from your production environment by
// customising the .env file in your project's root folder.
require('dotenv').config();

// Require keystone
const keystone = require('keystone');
//var handlebars = require('express-handlebars');

// Next app
const next = require('next');
const dev = process.env.NODE_ENV !== 'production';
const app = next({ dev });

// Initialise Keystone with your project's configuration.
// See http://keystonejs.com/guide/config for available options
// and documentation.
keystone.init({
  'name': 'cmsblog',
  'brand': 'cmsblog',
  // 'sass': 'public',
  // 'static': 'public',
  // 'favicon': 'public/favicon.ico',
  // 'views': 'templates/views',
  // 'view engine': '.hbs',
  // 'custom engine': handlebars.create({
  //   layoutsDir: 'templates/views/layouts',
  //   partialsDir: 'templates/views/partials',
  //   defaultLayout: 'default',
  //   helpers: new require('./templates/views/helpers')(),
  //   extname: '.hbs',
  // }).engine,
  'auto update': true,
  'session': true,
  'auth': true,
  'user model': 'User',
});

// Load your project's Models
keystone.import('models');

// Setup common locals for your templates. The following are required for the
// bundled templates and layouts. Any runtime locals (that should be set uniquely
// for each request) should be added to ./routes/middleware.js
app.prepare()
  .then(() => {
    // keystone.set('locals', {
    //   _: require('lodash'),
    //   env: keystone.get('env'),
    //   utils: keystone.utils,
    //   editable: keystone.content.editable,
    // });

    // Load your project's Routes
    keystone.set('routes', require('./routes'));

    // Configure the navigation bar in Keystone's Admin UI
    keystone.set('nav', {
      posts: ['posts', 'post-categories'],
      galleries: 'galleries',
      enquiries: 'enquiries',
      users: 'users',
    });

    // Start Keystone to connect to your database and initialise the web server
    keystone.start();
  })
Routes.js
const keystone = require('keystone');

exports = module.exports = nextApp => keystoneApp => {
  // Setup Route Bindings
  const handle = nextApp.getRequestHandler();

  keystoneApp.get('/api/posts', (req, res, next) => {
    const Post = keystone.list('Post');

    Post.model
      .find()
      .where('state', 'published')
      .sort('-publishedDate')
      .exec(function (err, results) {
        if (err) throw err;
        res.json(results);
      });
  });

  keystone.get('*', (req, res) => {
    return handle(req, res);
  });
};
Folder structure: (screenshot omitted)
Result after visiting localhost:3000: (screenshot omitted)
https://medium.com/@victor36max/how-to-build-a-react-driven-blog-with-next-js-and-keystonejs-cae3cd9fb804
The tutorial website says:
"A Next.js 404 page should show up instead of the KeystoneJS page. Let's try to make a page with Next.js. In the pages folder, make a new file index.js."
So you can keep going with the tutorial from there.
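As a minimal sketch of that step (the page content is just a placeholder):
// pages/index.js
import React from 'react';

const Home = () => <h1>Hello from Next.js</h1>;

export default Home;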

How to structure multiple react apps and still share code + assets

I've built a react app which uses the following structure:
node_modules
src/
app/
index.ts
index.html
...
server/
index.ts
...
node_modules/ // using the Alle pattern here
@<custom_packages>
api
components
Now I need to add a new app, which runs on a different domain but should be able to use as much shared code as it can, including our custom packages. My first attempt was the following:
node_modules
src/
app1/
index.ts
index.html
...
app2/
index.ts
index.html
...
server/
index.ts // Uses ENV_VAR to serve a different bundle
...
node_modules/
@<custom_packages>
api
components
The problem I'm now running into is that both apps generate their own assets. But I would like to share assets between apps so the client can cache them. I could decide not to use webpack to build the assets and just put them in a static folder, but then I lose the support of webpack's offline-plugin.
Also, we decided to use a mono-repo in this case. This makes CI significantly harder, but managing shared code a lot easier. I'm wondering if there are any seasoned developers who have faced this situation before.
Basically, how would you structure 2 react apps that should share as much code as possible?
I needed a similar setup but for Angular. Here is the Webpack config that I have (I left only relevant parts):
const path = require('path');
const CommonsChunkPlugin = require('webpack/lib/optimize/CommonsChunkPlugin');
const HtmlWebpackPlugin = require('html-webpack-plugin');

const apps = [
  {
    name: 'app1',
    baseUrl: '/app1'
  },
  {
    name: 'app2',
    baseUrl: '/app2'
  }
];

module.exports = function (args = {}) {
  const isDev = !args.PROD;
  const distPath = 'dist';

  var config = {};

  config.entry = {};
  apps.forEach(function (app) {
    config.entry[getAppBundleName(app)] = './src/apps/' + app.name + '/main.ts';
  });

  config.output = {
    path: root(distPath),
    filename: '[name].js',
    chunkFilename: '[name].[chunkhash].js',
    publicPath: '/dist/'
  };

  config.resolve = {
    extensions: ['.ts', '.js', '.json'],
    modules: [root('src'), root('node_modules')]
  };

  config.module = {
    rules: [
      // Add your loaders here
    ]
  };

  config.plugins = [
    // Add your plugins here

    // This enables tree shaking of the vendor modules
    new CommonsChunkPlugin({
      name: 'vendor',
      chunks: ['admin'].concat(apps.map(getAppBundleName)),
      minChunks: module => /node_modules/.test(module.resource)
    }),
    new CommonsChunkPlugin({
      name: 'shared',
      chunks: ['admin'].concat(apps.map(getAppBundleName)),
      minChunks: module => /src(\\|\/)shared/.test(module.resource)
    })
  ];

  apps.forEach(function (app) {
    var otherApps = apps.slice();
    var appItemIndex = otherApps.indexOf(app);
    if (appItemIndex > -1) {
      otherApps.splice(appItemIndex, 1);
    }

    config.plugins.push(new HtmlWebpackPlugin({
      template: 'index_template.html',
      title: app.name,
      filename: getAppDevServerHtmlFileName(app),
      excludeChunks: otherApps.map(getAppBundleName),
      chunksSortMode: 'manual',
      chunks: ['vendor', 'shared', getAppBundleName(app)],
      inject: 'head',
      metadata: {
        baseUrl: app.baseUrl
      }
    }));
  });

  config.devServer = {
    port: 4200,
    stats: stats,
    historyApiFallback: {
      rewrites: apps.map(function (app) {
        return {
          from: new RegExp('^' + app.baseUrl),
          to: '/dist/' + getAppDevServerHtmlFileName(app)
        }
      }),
    },
  };

  return config;
}

function getAppBundleName(app) {
  return app.name;
}

function getAppDevServerHtmlFileName(app) {
  return app.name + '_index.html';
}

function root(args) {
  args = Array.prototype.slice.call(arguments, 0);
  return path.join.apply(path, [__dirname].concat(args));
}
In my case, I have the following folder structure, which is similar to yours:
node_modules
src/
apps/
app1/
...
main.ts
app2/
...
main.ts
shared/
shared-module1/
...
shared-module2/
...
index.ts
...
webpack.config.js
And here is the output after compilation in the dist folder:
dist/
app1.js
app1_index.html
app2.js
app2_index.html
vendor.js
shared.js
As you can see, vendor.js and shared.js contain the shared code between the apps.
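For example (the module and export names here are hypothetical), anything both entry points import from src/shared is pulled into shared.js by the second CommonsChunkPlugin, so both apps load the same cacheable file:
// src/apps/app1/main.ts and src/apps/app2/main.ts
// 'shared' resolves through resolve.modules = [root('src'), ...] above.
import { SomeWidget } from 'shared/shared-module1';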

webpack-dev-server not working

Update, current problem:
It seems that the webpack hot loading goes wrong: when I run the command webpack, the project builds as usual, but when I run "dev": "webpack-dev-server --color --hot --progress && node ./server.js", webpack does not generate the built files for me.
My webpack.config is as follows:
module.exports = {
  entry: getEntries(),
  // .....
};

function getEntries() {
  var routeDir = path.join(SRC_DIR, "javascripts", "routes");
  var routeNames = routeDir ? fs.readdirSync(routeDir) : [];
  var nameMaps = {};
  routeNames.forEach(function (routeName) {
    var filename = routeName.match(/(.+)\.js$/)[1];
    console.log("filename in entry ", filename);
    if (filename) {
      var devEntryPath = [
        'webpack-dev-server/client?http://127.0.0.1:3001', // WebpackDevServer host and port
        'webpack/hot/only-dev-server',
        path.join(routeDir, filename)
      ];
      nameMaps[filename] = devEntryPath;
    }
  });
  return nameMaps;
}
server.js
var server = new WebpackDevServer(webpack(config), {
  publicPath: config.output.publicPath,
  hot: true,
  historyApiFallback: true
}).listen(3001, 'localhost', function (err, result) {
  if (err) console.log(err);
  console.log("webpack listening at port 3001");
});

var app = express();

app.get("/monitor/index", function (req, res) {
  res.sendFile(__dirname + "/src/views/" + "page1.html");
});

app.get("/monitor/category/*", function (req, res) {
  res.sendFile(__dirname + "/src/views/" + "page2.html");
});

app.use(express.static(__dirname))
  .listen(9090, 'localhost', function (err, result) {
    if (err) console.log(err);
    console.log('Listening at localhost:9090');
  });
Finally, I found where the problem was, and now I understand the relationship between webpack-dev-server and my express server.
When using the hot loader with webpack-dev-server:
Step 1: webpack builds the entry files to the publicPath (which is designated in the "output" section of webpack.config.js).
Step 2: the node server sends the HTML to the browser, which then requests the related assets (js, images, etc.). But from where? We can point the script paths in the HTML at webpack-dev-server (which serves what was generated in step 1), so the node server effectively asks webpack-dev-server for the assets.
To sum up, I modified 3 places (see the sketch below):
the publicPath of WebpackDevServer
the webpack output publicPath, equal to the above
the script paths in the HTML.
That's all. Now my project runs as expected.
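A hedged sketch of those three changes, assuming the ports from the snippets above (the entry, asset prefix, and on-disk path are placeholders):
// webpack.config.js — changes 1 & 2: output.publicPath points at
// webpack-dev-server on port 3001; server.js already passes
// config.output.publicPath to WebpackDevServer, so the two stay equal.
var path = require('path');

module.exports = {
  entry: { page1: './src/javascripts/routes/page1.js' }, // placeholder entry
  output: {
    path: path.join(__dirname, 'build'),          // placeholder on-disk path
    filename: '[name].js',
    publicPath: 'http://127.0.0.1:3001/assets/'   // assumed asset prefix
  }
};

// Change 3: the HTML served by express on port 9090 loads its scripts
// from that same prefix, e.g.
//   <script src="http://127.0.0.1:3001/assets/page1.js"></script>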

React Hot Loader not updating as expected

I am using react-hot-loader and webpack. I also use webpack-dev-server together with an express backend.
This is my relevant webpack config for development:
var frontendConfig = config({
  entry: [
    './src/client/app.js',
    'webpack-dev-server/client?http://localhost:3000',
    'webpack/hot/dev-server'
  ],
  output: {
    path: targetDir,
    publicPath: PROD ? '/build/assets/' : 'http://localhost:3000/build/assets/',
    filename: 'app.js'
  },
  module: {
    loaders: [
      { test: /\.js$/,
        exclude: /node_modules/,
        loaders: PROD ? [babelLoader] : ['react-hot', babelLoader] }
    ]
  },
  plugins: [
    new webpack.HotModuleReplacementPlugin({ quiet: true })
  ]
});
With this config I start webpack and webpack-dev-server:
gulp.task('frontend-watch', function() {
  new WebpackDevServer(webpack(frontendConfig), {
    publicPath: frontendConfig.output.publicPath,
    hot: true,
    stats: { colors: true }
  }).listen(3000, 'localhost', function (err, result) {
    if (err) {
      console.log(err);
    }
    else {
      console.log('webpack dev server listening at localhost:3000');
    }
  });
});
So webpack-dev-server is running at localhost:3000 and receives app.js from the webpack watcher (the bundle is no longer written to the file system).
My express server serves as a backend/API and has the following config:
var express = require('express');

// proxy for react-hot-loader in dev mode
var httpProxy = require('http-proxy');
var proxy = httpProxy.createProxyServer({
  changeOrigin: true,
  ws: true
});

var isProduction = process.env.NODE_ENV === 'production';

// It is important to catch any errors from the proxy or the
// server will crash. An example of this is connecting to the
// server when webpack is bundling
proxy.on('error', function(e) {
  console.log('Could not connect to proxy, please try again...');
});

module.exports = function (app) {
  // We only want to run the workflow when not in production
  if (!isProduction) {
    console.log('setting up proxy for webpack-dev-server..');

    // Any requests to localhost:4200/build is proxied
    // to webpack-dev-server
    app.all('assets/app.js', function (req, res) {
      proxy.web(req, res, {
        target: 'http://localhost:3000'
      });
      console.log('request proxied to webpack-dev!');
    });
  }

  var server = require('http').createServer(app);

  app.use(express.static(homeDirectory + '/build'));
  app.use(express.static(homeDirectory + '/files'));

  server.listen(4200);
};
That's all good so far: the proxying works for app.js and I see successful hot update messages in the browser console.
Now, while it looks fine, it does not work as I expected:
When I change a component's render() method it updates as expected, but when I change a helper method (that is used in render()) I don't get any hot update. Is that normal?
Another thing that bugs me: if I work like this and do a 'hard' browser reload at some point, all changes I made are reverted to the point where I started my webpack-dev-server; the hot updates in between have not been persisted somehow. Is that normal as well? I would expect to lose my state, but not any changes I made to the code in the meantime. That probably has something to do with my app.js not being written to the file system.
For your question #2, that's not normal. I have a template repo with working HMR available here, and it works just fine: https://github.com/briandipalma/wp-r-template
For question #1, usually render methods display or format data; they don't grab it from somewhere. But if you need to format data, use a function outside of the component.
The parent component would render the following once you retrieve the price:
<ChildComponent price={this.state.price} />
ChildComponent's render function would use props (or better yet a parameter of the function). Remember: the whole point of React is composition and data flow:
return (
  <div>{this.props.price}</div>
);
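To illustrate that suggestion, a minimal sketch (the file, component, and helper names are made up for the example):
// PriceTag.js — formatting lives in a plain function outside the component,
// and the component only renders what it receives via props.
import React from 'react';

export const formatPrice = (price) => `$${Number(price).toFixed(2)}`;

export const ChildComponent = (props) => (
  <div>{formatPrice(props.price)}</div>
);

// Parent usage: <ChildComponent price={this.state.price} />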
