Precached content does not load from cache - reactjs

In my React SSR application I have implemented a service worker (via Workbox).
It's working fine: every time I change some piece of code, rebuild, run the server, and go to the browser, I can see that my cache was updated successfully.
But there is one thing I can't understand. When I delete some asset (JS or CSS) from my local server and then trigger an action in the browser that requests that asset, I get a chunk error saying the file is not available.
The main question: if that asset is already in cache storage, shouldn't it be loaded from that cache, or have I missed something?
The components I have used are:
Node/Express (for the server)
@loadable/component (for code splitting), combined with webpack
the Google Workbox plugin
// my sw.js file
import { skipWaiting } from 'workbox-core';
import { precacheAndRoute } from 'workbox-precaching';
declare const self: Window & ServiceWorkerGlobalScope;
precacheAndRoute(self.__WB_MANIFEST);
skipWaiting();
// my workbox setup
const serviceWorkerRegistration = async (): Promise<void> => {
  const { Workbox } = await import('workbox-window');
  const wb = new Workbox('./service-worker.js');
  wb.addEventListener('activated', (event: any) => {
    if (event.isExternal) {
      window.location.reload();
    }
  });
  wb.register();
};

This is due to your usage of skipWaiting() inside your service worker. When the waiting service worker activates, it deletes all of the outdated precached entries that are no longer associated with the new service worker deployment. The safer pattern is to keep the waiting service worker inactive until the user opts in to the update; a sketch follows the links below.
There is more background information in these two closely related answers, as well as a presentation:
What are the downsides to using skipWaiting and clientsClaim with Workbox?
Workbox: the danger of self.skipWaiting()
Paying Attention while Loading Lazily
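
A rough sketch of that opt-in pattern (assuming workbox-window v6+; the confirm prompt is just illustrative):
// In the page: prompt before activating the waiting service worker
const serviceWorkerRegistration = async (): Promise<void> => {
  const { Workbox } = await import('workbox-window');
  const wb = new Workbox('./service-worker.js');
  wb.addEventListener('waiting', () => {
    // An updated service worker is waiting; activating it would delete the
    // old precache, so ask the user first and reload once it takes control.
    if (window.confirm('A new version is available. Reload now?')) {
      wb.addEventListener('controlling', () => window.location.reload());
      wb.messageSkipWaiting();
    }
  });
  wb.register();
};

// In sw.js: call skipWaiting() only when the page asks for it
self.addEventListener('message', (event) => {
  if (event.data && event.data.type === 'SKIP_WAITING') {
    self.skipWaiting();
  }
});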

Related

Script load failed service worker on PWA

On my PWA I have successfully registered a service worker. Some users on Safari are experiencing this error: Script https://app.com/service-worker.js load failed. Note that I was still able to view the service worker in the dev tools despite the error (it was in the activated status).
It is not clear what this error means. It is not impacting their experience, but I am afraid it might cause issues later on.
Here is my service-worker.js. Note that we cannot have any caching in our application; we enabled the service worker just so we can have add-to-home-screen capability on Android.
/* eslint-disable no-restricted-globals */
// This service worker can be customized!
// See https://developers.google.com/web/tools/workbox/modules
// for the list of available Workbox modules, or add any other
// code you'd like.
// You can also remove this file if you'd prefer not to use a
// service worker, and the Workbox build step will be skipped.
import { clientsClaim } from 'workbox-core';
import { precacheAndRoute } from 'workbox-precaching';

clientsClaim();

// Precache all of the assets generated by your build process.
// Their URLs are injected into the manifest variable below.
// This variable must be present somewhere in your service worker file,
// even if you decide not to use precaching. See https://cra.link/PWA
precacheAndRoute(self.__WB_MANIFEST.filter(() => false));

self.addEventListener('install', (event) => {
  self.skipWaiting();
  event.waitUntil(caches.open('app').then((cache) => cache.addAll([])));
});

self.addEventListener('message', (event) => {
  if (event.data && event.data.type === 'SKIP_WAITING') {
    self.skipWaiting();
  }
});

self.addEventListener('activate', (event) => {
  event.waitUntil(
    caches
      .keys()
      .then((cacheNames) => Promise.all(cacheNames.map((cacheName) => caches.delete(cacheName)))),
  );
});

Next.js dynamic routing question about slugs

I have a question about Next.js.
I'm structuring my site like this:
pages
  [slug]
    index.jsx
  index.jsx
So in my [slug]/index.jsx I'm doing this:
export async function getStaticPaths() {
  const resProducts = await fetch(`${process.env.PRIVATE_ENDPOINT}/products`);
  const products = await resProducts.json();
  const paths = products.data.map((p) => ({
    params: {
      slug: p.slug, // the params key must match the [slug] directory name
    },
  }));
  return {
    // this should be dynamic
    paths,
    fallback: true,
  };
}
My question is: what happens if I add a new product in my back office?
Do I have to rebuild with next build?
The short answer is NO. If the requested page has not been generated at build time, Next.js will serve a "fallback" version of the page and statically generate the requested path's HTML and JSON in the background. When the static generation completes, the browser receives the JSON for the generated path. Subsequent requests to the same path will be served the generated page, just like other pages pre-rendered at build time.
Don't forget to use router.isFallback to detect that the request is in the fallback state.
You can find the relevant documentation here:
https://nextjs.org/docs/basic-features/data-fetching#getstaticpaths-static-generation
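
For illustration, a minimal sketch of that check inside the dynamic page component (the Product component and its props are placeholders, not code from the question):
// pages/[slug]/index.jsx
import { useRouter } from 'next/router';

export default function Product({ product }) {
  const router = useRouter();
  // While Next.js generates the page in the background, render a placeholder
  if (router.isFallback) {
    return <div>Loading...</div>;
  }
  return <h1>{product.name}</h1>;
}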

Integration testing fetch calls in a node module

Goal: Call a function that invokes a fetch call to validate that it works with my backend REST API (end-to-end testing, basically).
Project: a node module built to be imported into several React web applications. The module contains only fetch calls and minimal logic. It's basically a glorified wrapper for URLs and settings, created to cut down the work required to implement common endpoints used in applications.
Setup: I have a docker-compose file building a docker test container and pulling in my REST API docker image (built in a different system). The test container pulls in the packed module and installs it with its dependencies. Then it brings up the tests alongside the backend + other images the backend needs (DB, login system, etc.).
Problem: How to implement the tests to handle the calls.
Currently I've tried calling the fetch methods directly. This works for my login fetch, but any additional call fails to send the cookie. As far as I understand, the code I have depends on the browser for the cookie. I've tried several solutions to get said cookie, but I've been unable to get fetch or node-fetch to send it properly. My best guess is that each test was creating a new cookie, but I lack the knowledge to fully debug this solution path.
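For context, the kind of manual cookie forwarding I attempted looks roughly like this (the URLs and credentials are placeholders, not my actual code):
// Log in, then replay the Set-Cookie header on subsequent requests
const loginResponse = await fetch('http://backend:8080/api/login', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ user: 'user', password: 'password' }),
});
const cookie = loginResponse.headers.get('set-cookie');
const response = await fetch('http://backend:8080/api/endpoint', {
  headers: { cookie },
});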
My second solution path was to attempt to use Puppeteer to load a fake page and then evaluate the function in the page, following examples like:
https://github.com/puppeteer/puppeteer/issues/2579
How to use imported function inside page.evaluate in Puppeteer with Jest?
The problem with this is that the tests kept failing to load required libraries, or ran into some other problem.
Code:
Here is the call I'm trying to test, for the most part. Each function I have wraps around this, providing {url: "api/endpoint/path", method: "GET"}, with some passing in a body for larger data posts.
export function request(options) {
  // Build options
  options = {
    headers: {
      'Content-Type': 'application/json'
    },
    ...options
  };
  return fetch(options.url, options)
    .then(response => {
      //...
      // Handle errors
      if (!response.ok) {
        return Promise.reject(`${response.status} - ${response.statusText}`);
      }
      try {
        return response.json();
      } catch (e) {
        if (e.name === 'SyntaxError') {
          return Promise.reject(response.text());
        }
      }
    });
}
Test example I've tried:
import puppeteer from "puppeteer";
import { myCall } from "my-rest-bridge";

it('Check response', async () => {
  // Load browser
  const browser = await puppeteer.launch({
    headless: true,
    args: ['--no-sandbox']
  });
  const page = await browser.newPage();
  // Load page
  await page.goto('http://docker:8888/index.html');
  // Do login
  expect(await page.evaluate(() => login('user', 'password'))).toBe(expectedUserResponseJson);
  // Make call
  expect(await page.evaluate(() => myCall(input1, input2))).toBe(expectedCallResponseJson);
  // Close page and browser
  await page.close();
  await browser.close();
})
Took me a while, but I built a solution to my own question. It's not perfect, so if anyone has a better idea, please answer.
My solution works as follows. I built an additional git project that creates a shell React application inside a docker image. This application pulls in my node module, iterates through all of its exports, and then generates a component per function.
import React from 'react';
import * as magicNodeModule from "magic-node-module"; // obviously not the real name
import CallRest from "./CallRest";

/** Core component for the application */
export default class App extends React.Component {
  /** Renders the core application */
  render() {
    const restCalls = Object.keys(magicNodeModule);
    return (
      <div id={"App"}>
        <div>
          Functions found:
          <ul id={"function-list"}>
            {restCalls.map(restCall => <li key={restCall}>{restCall}</li>)}
          </ul>
          <hr/>
        </div>
        {
          restCalls.map(restCall => {
            return (
              <CallRest key={restCall} restName={restCall} restCall={magicNodeModule[restCall]}/>
            );
          })
        }
      </div>
    )
  }
}
This component (CallRest) contains an input box, a submit button, and an output div. A user, or in my use case Puppeteer, can enter data into the input. Clicking submit then runs the fetch call and inserts the resulting output into the div. It works very much like Swagger 2, for those familiar with that system.
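A minimal sketch of such a CallRest component, assuming it simply JSON-parses the input box and stringifies the result (the real component may differ; only the restName/restCall props come from the App component above):
// CallRest.jsx
import React from 'react';

export default class CallRest extends React.Component {
  state = { input: '', output: null };

  call = async () => {
    try {
      // Assumes each exported function accepts the parsed JSON from the input box
      const result = await this.props.restCall(JSON.parse(this.state.input || '{}'));
      this.setState({ output: JSON.stringify(result) });
    } catch (e) {
      this.setState({ output: String(e) });
    }
  };

  render() {
    return (
      <div id={this.props.restName}>
        <input
          value={this.state.input}
          onChange={(e) => this.setState({ input: e.target.value })}
        />
        <button onClick={this.call}>Submit</button>
        <div className={"output"}>{this.state.output}</div>
      </div>
    );
  }
}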
The solution is still built up as a series of docker images inside a compose file, though it did require setting up a reverse proxy to allow the React app to communicate with the backend API. It also pulls in a fresh copy of the node module as a packed archive and installs it into the docker image. This way I only have to rebuild the shell React docker image once in a blue moon.
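The proxy can take several forms; here is one minimal sketch using http-proxy-middleware inside the shell app itself rather than a separate proxy container (the /api path and backend hostname are assumptions):
// src/setupProxy.js: create-react-app picks this file up automatically
const { createProxyMiddleware } = require('http-proxy-middleware');

module.exports = function (app) {
  // Forward API calls from the React shell to the backend container
  app.use('/api', createProxyMiddleware({
    target: 'http://backend:8080',
    changeOrigin: true,
  }));
};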

React application with external plugins

I'm building a React application bundled using Parcel or Webpack.
The application should be able to embed external React components
developed by third parties and hosted elsewhere as modern JavaScript modules:
// https://example.com/scripts/hello-plugin.js
import React from 'react';

export default class HelloPlugin extends React.Component {
  render() {
    return "Hello from external plugin!";
  }
}
The host application loads these components using an asynchronous import, like this, for example:
// createAsyncComponent.tsx
import * as React from 'react';
import { asyncComponent } from 'react-async-component';

export default function createAsyncComponent(url: string) {
  return asyncComponent({
    resolve: () => import(url).then(component => component.default),
    LoadingComponent: () => <div>Loading {url}....</div>,
    ErrorComponent: ({ error }) => <div>Couldn't load {url}: {error.message}</div>,
  })
}
But it looks like bundlers don't allow importing arbitrary URLs as external JavaScript modules.
Webpack emits the build warning "the request of a dependency is an expression" and the import doesn't work. Parcel doesn't report any errors, but fails when import(url) occurs at runtime.
The Webpack author recommends using scriptjs or little-loader for loading external scripts.
There is a working sample that loads a UMD component from an arbitrary URL like this:
public componentDidMount() {
  // expose dependencies as globals
  window["React"] = React;
  window["PropTypes"] = PropTypes;
  // async load of remote UMD component
  $script(this.props.url, () => {
    const target = window[this.props.name];
    if (target) {
      this.setState({
        Component: target,
        error: null,
      })
    } else {
      this.setState({
        Component: null,
        error: `Cannot load component at ${this.props.url}`,
      })
    }
  });
}
Also, I saw a similar question answered a year ago where the suggested approach also involves passing variables via the window object.
But I'd like to avoid using globals, given that most modern browsers support modules out of the box.
I'm wondering if that's possible. Perhaps there is a way to instruct the bundler that my import(url) is not a request for a code-split chunk of the host application, but a request to load an external JavaScript module?
In the context of Webpack, you could do something like this:
import(/* webpackIgnore: true */ 'https://any.url/file.js')
  .then((response) => {
    response.main({ /* stuff the plugins need from the app... */ });
  });
Then your plugin file would have something like...
const main = (args) => console.log('The plugin was started.');
export { main };
export default main;
Notice you can send stuff from your app's runtime to the plugin at initialization (i.e. when invoking main in the plugin), so you don't end up depending on global variables.
You get caching for free, as Webpack remembers (caches) that the given URL has already been loaded, so subsequent calls to import that URL will resolve immediately.
Note: this seems to work in Chrome, Safari & Firefox, but not Edge. I never bothered testing in IE or other browsers.
I've tried doing this same sort of load with the UMD format on the plugin side, and that doesn't seem to work with the way Webpack loads stuff. In fact, it's interesting that variables declared as globals don't end up in the window object of your runtime; you'd have to explicitly do window.aGlobalValue = ... to get something onto the global scope.
Obviously you could also use RequireJS, or similar, in your app and then just have your plugins follow that API.
Listen to the Webpack author. You can't (yet) do what you're trying to do with Webpack.
You will have to follow his suggested route.

importScripts in Web Workers is undefined inside a React/Webpack environment

I am working on a React app created with create-react-app. I was having trouble creating a web worker in it, so I posted a question here on SO: Creating a web worker inside React
I've found a solution, described in the post above, that loads a worker without ejecting the app or messing with the Webpack config. This is the code:
// worker.js
const workercode = () => {
  self.onmessage = function(e) {
    console.log('Message received from main script');
    const workerResult = 'Received from main: ' + (e.data);
    console.log('Posting message back to main script');
    self.postMessage(workerResult);
  }
};

let code = workercode.toString();
code = code.substring(code.indexOf("{") + 1, code.lastIndexOf("}"));
const blob = new Blob([code], {type: "application/javascript"});
const worker_script = URL.createObjectURL(blob);

module.exports = worker_script;
and in the file that uses the worker:
import worker_script from './worker';

const myWorker = new Worker(worker_script);
myWorker.onmessage = (m) => {
  console.log("msg from worker: ", m.data);
};
myWorker.postMessage('im from main');
It works. However, I cannot seem to get importScripts to work, even when I do this (outside onmessage or inside onmessage):
if (typeof importScripts === 'function') {
  importScripts('myscript.js');
}
In that case, the if statement turns out to be true, but the actual import then fails with the same error message, 'importScripts' is not defined, as if the if statement were a false positive, which doesn't sound right. I'd say this is a context issue and that the worker probably isn't loading properly (although it seems to work), but that's just a guess.
Any ideas what's happening here?
importScripts in a worker created from a Blob works fine, at least in 2021 (react 17.0.2, react-scripts 4.0.3, Chrome 92). The imported script URL must be absolute, because the worker was created from a Blob.
The original issue might have been a bug in webpack, or the transpilation might have changed the code in a weird way.
const workercode = () => {
  importScripts("https://example.com/extra.js");
  console.log(self.extraValue); // 10
  self.onmessage = function(e) {
    console.log('Message received from main script');
    ...
  }
};

// extra.js
self.extraValue = 10;
Looks like this is still broken in 2022. There seems to be a regression coming down the dev pipeline (at least in Android WebView, and possibly some dev/canary Chrome versions):
https://bugs.chromium.org/p/chromium/issues/detail?id=1078821
