I'm building a React application bundled with Parcel or Webpack.
The application should be able to embed external React components
developed by third parties and hosted elsewhere as modern JavaScript modules:
// https://example.com/scripts/hello-plugin.js
import React from 'react';

export default class HelloPlugin extends React.Component {
  render() {
    return "Hello from external plugin!";
  }
}
The host application loads these components with an asynchronous import, for example:
// createAsyncComponent.tsx
import * as React from 'react';
import { asyncComponent } from 'react-async-component';

export default function createAsyncComponent(url: string) {
  return asyncComponent({
    resolve: () => import(url).then(component => component.default),
    LoadingComponent: () => <div>Loading {url}....</div>,
    ErrorComponent: ({ error }) => <div>Couldn't load {url}: {error.message}</div>,
  });
}
But it looks like bundlers don't allow importing arbitrary URLs as external JavaScript modules.
Webpack emits the build warning "the request of a dependency is an expression" and the import doesn't work. Parcel doesn't report any errors, but the import(url) fails at runtime.
The Webpack author recommends using scriptjs or little-loader for loading external scripts.
There is a working sample that loads a UMD component from an arbitrary URL like this:
public componentDidMount() {
  // expose dependencies as globals
  window["React"] = React;
  window["PropTypes"] = PropTypes;

  // async load of remote UMD component
  $script(this.props.url, () => {
    const target = window[this.props.name];
    if (target) {
      this.setState({
        Component: target,
        error: null,
      });
    } else {
      this.setState({
        Component: null,
        error: `Cannot load component at ${this.props.url}`,
      });
    }
  });
}
I also saw a similar question answered a year ago where the suggested approach likewise involves passing variables via the window object.
But I'd like to avoid using globals, given that most modern browsers support modules out of the box.
I'm wondering if this is possible. Is there any way to tell the bundler that my import(url) is not a request for a code-split chunk of the host application, but a request to load an external JavaScript module?
In the context of Webpack, you could do something like this:
import(/* webpackIgnore: true */ 'https://any.url/file.js')
  .then((response) => {
    response.main({ /* stuff from app plugins need... */ });
  });
Then your plugin file would have something like...
const main = (args) => console.log('The plugin was started.');
export { main };
export default main;
Notice that you can pass things from your app's runtime to the plugin at initialization (i.e. when invoking the plugin's main), so you don't end up depending on global variables.
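For example, the host could hand its own React and ReactDOM instances (plus anything else the plugin needs) to the plugin's main, and the plugin would use what it receives instead of reaching for globals. A rough sketch, with made-up URL and mount node:

// Host side (sketch)
import React from 'react';
import ReactDOM from 'react-dom';

import(/* webpackIgnore: true */ 'https://example.com/scripts/hello-plugin.js')
  .then((plugin) => {
    // hand the host's dependencies to the plugin instead of using globals
    plugin.main({
      React,
      ReactDOM,
      mountNode: document.getElementById('plugin-root'),
    });
  });

// Plugin side (hello-plugin.js, sketch)
const main = ({ React, ReactDOM, mountNode }) => {
  const HelloPlugin = () =>
    React.createElement('div', null, 'Hello from external plugin!');
  ReactDOM.render(React.createElement(HelloPlugin), mountNode);
};

export { main };
export default main;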
You get caching for free: Webpack remembers (caches) that the given URL has already been loaded, so subsequent calls to import that URL resolve immediately.
Note: this seems to work in Chrome, Safari and Firefox, but not Edge. I never bothered testing in IE or other browsers.
I've tried doing this same sort of load with the plugin in UMD format, and that doesn't seem to work with the way Webpack loads things. Interestingly, variables declared as globals in the plugin don't end up in the window object of your runtime; you'd have to explicitly do window.aGlobalValue = ... to get something onto the global scope.
Obviously you could also use requirejs - or similar - in your app and then just have your plugins follow that API.
Listen to the Webpack author. You can't do (yet) what you're trying to do with Webpack.
You will have to follow his suggested route.
Related
I am working on a documentation tool for a TypeScript library. The idea is to leverage Parcel's watch mode to continuously build the library, and use the output in a pre-built documentation app.
For that I need to load a module library (built in another project) dynamically via URL.
<script type="module">
const libraryModule = "http://localhost:8080/lib.module.js";
const promise = import(libraryModule);
promise.then(library => {
// do something with library
window.ComponentLibrary = library;
});
</script>
However, Parcel replaces the above import with require and the load fails. Using System.import throws a "System is not defined" error.
I tried to use dynamic-import-polyfill, initializing it and then using it as below:
dynamicImportPolyfill.initialize({
  modulePath: 'http://localhost:13090', // Defaults to '.'
  importFunctionName: '$$import' // Defaults to '__import__'
});

const promise = $$import(libPath);
This throws the following error:
TypeError: Failed to resolve module specifier "react/jsx-dev-runtime". Relative references must start with either "/", "./", or "../"
I have also tried using the script type text/javascript, but that doesn't work either.
I'm looking for guidance on the best way to get the component library loaded.
Figured it out: yes, we can load a component library as a module dynamically.
The issue was that the React UMD build is not a proper ES module. Also, with React 17, JSX elements are created via react/jsx-runtime. So first I had to wrap the React UMD build as an ES module - it's just a thin wrapper. Similarly, I added a wrapper for jsx-runtime. To make things work I had to use import maps, which are currently not supported in all browsers - see caniuse.com for the latest support.
This completes the setup, and a library compiled as an ES module will now work just fine. Below is what I used to get it working:
<script type="importmap">
{
"imports": {
"react/jsx-runtime": "/react-jsx-runtime.js",
"react": "/react-as-es-module.js"
}
}
</script>
<script type="module" src="/component-library.js"></script>
<script type="module">
import * as MyComponentLib from "/component-library.js";
window.ComponentLibrary = { ...MyComponentLib };
</script>
The code for react-jsx-runtime.js looks as follows:
import * as React from 'react';
export const jsx = React.createElement;
export const jsxs = React.createElement;
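Note that aliasing createElement was enough for my case, but the automatic runtime actually calls jsx(type, config, maybeKey) with children inside config. If that difference matters for your components, a slightly more faithful shim might look like this (a sketch, not what I shipped):

import * as React from 'react';

// map the automatic-runtime signature onto React.createElement
function toElement(type, config, maybeKey) {
  const { children, ...props } = config || {};
  if (maybeKey !== undefined) {
    props.key = maybeKey;
  }
  return Array.isArray(children)
    ? React.createElement(type, props, ...children)
    : React.createElement(type, props, children);
}

export const jsx = toElement;
export const jsxs = toElement;
export const Fragment = React.Fragment;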
The code for react-as-es-module.js goes as follows:
// loads the UMD build, which defines the React global
import 'https://unpkg.com/react@17.0.2/umd/react.production.min.js';

const {
  Children,
  Component,
  Fragment,
  // and all other exports
} = React || {};
export {
  Children,
  Component,
  Fragment,
  // and all other exports
};

export default React;
I compiled component-library.js with Parcel, with "type": "module" set in the package.json file. I will detail this in a blog post and demo GitHub repo soon.
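For reference, the relevant package.json bits looked roughly like this (a sketch assuming Parcel 2 and its "module" target; names and paths are illustrative):

{
  "name": "component-library",
  "version": "1.0.0",
  "type": "module",
  "source": "src/index.js",
  "module": "dist/component-library.js"
}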
Hope this helps.
Goal: call a function that invokes a fetch call to validate that it works with my backend REST API (end-to-end testing, basically).
Project: a Node module built to be imported into several React web applications. The module contains only fetch calls and minimal logic. It's basically a glorified wrapper for URLs and settings, created to cut down the work required to implement common endpoints used in applications.
Setup: I have a Docker Compose setup that builds a test container and pulls in my REST API Docker image (built in a different system). The test container pulls in the packed module and installs it with dependencies. Then it brings up the tests alongside the backend and the other images the backend needs (DB, login system, etc.).
Problem: How to implement the tests to handle the calls.
Currently I've tried calling the fetch methods directly. This works for my login fetch, but any additional call fails to send the cookie. As far as I understand, the code I have depends on the browser for the cookie. I've tried several solutions to get said cookie, but I've been unable to get fetch or node-fetch to send it properly. My best guess is that each test was creating a new cookie, but I lack the knowledge to fully debug this solution path.
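For reference, the kind of manual cookie forwarding I attempted looked roughly like this (simplified; endpoint names are placeholders):

import fetch from 'node-fetch';

let sessionCookie = null;

export async function login(user, password) {
  const response = await fetch('http://backend:8888/api/login', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ user, password }),
  });
  // capture the session cookie issued by the backend
  sessionCookie = response.headers.get('set-cookie');
  return response.json();
}

export async function authenticatedGet(url) {
  // forward the captured cookie on subsequent calls
  const response = await fetch(url, { headers: { Cookie: sessionCookie } });
  return response.json();
}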
My second solution path was to attempt to use Puppeteer to load a fake page and then evaluate the function in the page, following examples like:
https://github.com/puppeteer/puppeteer/issues/2579
How to use imported function inside page.evaluate in Puppeteer with Jest?
The problem with this is that the tests kept failing to load the required libraries, or ran into some other problem.
Code:
Here is the call I'm trying to test, for the most part. Each function I have wraps around this, providing {url: "api/endpoint/path", method: "GET"}, with some passing in a body for larger data posts.
export function request(options) {
  // Build options
  options = {
    headers: {
      'Content-Type': 'application/json'
    },
    ...options
  };

  return fetch(options.url, options)
    .then(response => {
      //...
      // Handle errors
      if (!response.ok) {
        return Promise.reject(`${response.status} - ${response.statusText}`);
      }
      try {
        return response.json();
      } catch (e) {
        if (e.name === 'SyntaxError') {
          return Promise.reject(response.text());
        }
      }
    });
}
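The wrappers themselves look roughly like this (endpoint paths are illustrative):

// example wrappers around request(); paths are made up
export const getProfile = () =>
  request({ url: 'api/profile', method: 'GET' });

export const createOrder = (order) =>
  request({
    url: 'api/orders',
    method: 'POST',
    body: JSON.stringify(order),
  });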
Test example I've tried:
import puppeteer from "puppeteer";
import { myCall } from "my-rest-bridge";

it('Check response', async () => {
  // Load browser
  const browser = await puppeteer.launch({
    headless: true,
    args: ['--no-sandbox']
  });
  const page = await browser.newPage();

  // Load page
  await page.goto('http://docker:8888/index.html');

  // Do login
  expect(await page.evaluate(() => login('user', 'password'))).toBe(expectedUserResponseJson);

  // Make call
  expect(await page.evaluate(() => myCall(input1, input2))).toBe(expectedCallResponseJson);

  // Close page
  await page.close();
})
Took me a while, but I built a solution to my own question. It's not perfect, so if anyone has a better idea please answer.
My solution works as follows: I built an additional Git project that creates a shell React application inside a Docker image. This application pulls in my Node module, iterates through all its exports, and then generates a component per function.
import React from 'react';
import * as magicNodeModule from "magic-node-module"; // obviously not the real name
import CallRest from "./CallRest";

/** Core component for the application */
export default class App extends React.Component {
  /** Renders the core application */
  render() {
    const restCalls = Object.keys(magicNodeModule);

    return (
      <div id={"App"}>
        <div>
          Functions found:
          <ul id={"function-list"}>
            {restCalls.map(restCall => <li key={restCall}>{restCall}</li>)}
          </ul>
          <hr/>
        </div>
        {
          restCalls.map(restCall => {
            return (
              <CallRest
                key={restCall}
                restName={restCall}
                restCall={magicNodeModule[restCall]}/>
            );
          })
        }
      </div>
    );
  }
}
This component (CallRest) contains an input box, a submit button, and an output div. A user, or in my use case Puppeteer, can enter data into the input; clicking submit runs the fetch call and inserts the resulting output into the div. It works very much like Swagger 2, for those familiar with that system.
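A rough sketch of what that CallRest component looks like (simplified; the real one has more plumbing, and the prop names are the ones used in App above):

import React from 'react';

/** Input box, submit button and output div for a single REST wrapper. */
export default class CallRest extends React.Component {
  state = { input: '{}', output: '' };

  handleSubmit = async () => {
    try {
      // parse the user-provided JSON and pass it to the wrapped fetch call
      const args = JSON.parse(this.state.input);
      const result = await this.props.restCall(args);
      this.setState({ output: JSON.stringify(result, null, 2) });
    } catch (error) {
      this.setState({ output: String(error) });
    }
  };

  render() {
    return (
      <div id={`call-${this.props.restName}`}>
        <h3>{this.props.restName}</h3>
        <input
          value={this.state.input}
          onChange={event => this.setState({ input: event.target.value })}
        />
        <button onClick={this.handleSubmit}>Submit</button>
        <div className="output">{this.state.output}</div>
      </div>
    );
  }
}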
The solution is still built up as a series of Docker images inside a Compose file, though it did require setting up a reverse proxy to allow the React app to communicate with the backend API. It also pulls in a fresh copy of the Node module as a packed archive and installs it into the Docker image; this way I only have to rebuild the shell React image once in a blue moon.
I recently read about React.lazy and how it "loads" components at run time, when they are required to be rendered. I assume that "loading" here means fetching the component from the server and then rendering it.
So my question is: how does React manage this fetching of components? How does it know the exact path from which to fetch a component (given that our code mentions only the relative path, but fetching requires the complete server path)? Does it depend on Webpack for this?
Let's look into the React code. React.lazy is defined as follows.
export function lazy<T, R>(ctor: () => Thenable<T, R>): LazyComponent<T> {
  let lazyType = {
    $$typeof: REACT_LAZY_TYPE,
    _ctor: ctor,
    // React uses these fields to store the result.
    _status: -1,
    _result: null,
  };

  if (__DEV__) {
    // ... additional code only in development mode
  }

  return lazyType;
}
As you can see, React.lazy requires a Promise which resolves to a module with a default export containing a React component (paraphrased from the React docs). This also means that it is not React that resolves the file, but import(). import() works as documented on MDN.
The dynamic import() is a relatively new language feature that is not available in all browsers, but it can be transpiled or polyfilled by Webpack and Babel/TypeScript/others.
What you often see is code like the following, which tells Webpack to split the imported file into its own chunk.
import(/* webpackChunkName: "xyz" */ './component/XYZ')
This creates a new JavaScript file xyz.js next to your bundle script.
If you don't use Webpack, you need to create those files yourself; Webpack just reduces the work required from you. So you don't strictly depend on Webpack. The approach might look like the following:
// ./component/xyz.js
export default function() { return <div>Component</div> }

// ./main.js
const OtherComponent = React.lazy(() => import('./component/xyz.js'));
export default function() {
  return <React.Suspense fallback={<div>Loading...</div>}><OtherComponent /></React.Suspense>;
}
And the file structure:
| public
|---| main.js
|---| component
|---|---| xyz.js
As you see, no Webpack is required; it just makes your life easier.
I want to use web workers in my app.
This is my worker:
import { processOrderFieldValue } from './utils';

export function colsAndRowsWorker() {
  this.onmessage = () => {
    processOrderFieldValue();
    // in this place the compiler is trying to convert this module to this:
    //   Object(_WEBPACK_IMPORTED_MODULE_0__["processOrderFieldValue"])
    // and I've got this:
    //   Uncaught ReferenceError: _WEBPACK_IMPORTED_MODULE_0__ is not defined
    postMessage('oops');
  };
}
I also tried importScripts('./utils'), but got an error as well.
How do I use modules in a worker inside a React app? Can I perhaps pass the function as a relative path to importScripts()?
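For context, the worker is created from that exported function by stringifying it into a Blob, roughly like this (simplified), which I suspect is why the Webpack import binding doesn't exist inside the worker at runtime:

import { colsAndRowsWorker } from './colsAndRowsWorker';

// the function body is stringified, so Webpack's module bindings are lost
const code = `(${colsAndRowsWorker.toString()})()`;
const blob = new Blob([code], { type: 'application/javascript' });
const worker = new Worker(URL.createObjectURL(blob));

worker.onmessage = event => console.log(event.data);
worker.postMessage('start');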
I am writing code with Viewer3D v2.11, React, ES6 and Webpack, but I don't know how to write an Autodesk.Viewing.Extension with Webpack and React. Can anyone show me some examples?
Using Webpack to write a viewer extension is no different from using Webpack to write any other JS application. Take a look at my extensions library repo; each extension is bundled into a separate .js or .min.js depending on whether you build the project in dev or prod: library-javascript-viewer-extensions.
It is designed this way so each extension can be loaded dynamically and independently. However, if you build an entire application around the viewer, you may simply include the code for each extension along with the rest of the app and generate a single Webpack bundle; a minimal extension skeleton is sketched below.
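For reference, an extension is just a class registered with the viewer's extension manager, and you can author and bundle it with Webpack like any other module. A minimal skeleton (the extension name is made up; it assumes the viewer script is loaded and Autodesk is available globally):

// MyAwesomeExtension.js (sketch)
export default class MyAwesomeExtension extends Autodesk.Viewing.Extension {
  constructor(viewer, options) {
    super(viewer, options);
  }

  load() {
    console.log('MyAwesomeExtension loaded');
    return true;
  }

  unload() {
    console.log('MyAwesomeExtension unloaded');
    return true;
  }
}

// register it so the viewer can load it by name:
// viewer.loadExtension('MyAwesomeExtension')
Autodesk.Viewing.theExtensionManager.registerExtension(
  'MyAwesomeExtension', MyAwesomeExtension);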
This React project contains multiple viewer extensions (some extracted from the library mentioned above) and bundles the extension code along with the rest of the app: forge-rcdb.
As far as React and the viewer are concerned, they are not closely related: the viewer is a 3D component that is created dynamically, and all the WebGL canvas and viewer 2D elements are generated after your app loads a model, whereas React lets you declare 2D components when the app is starting. I have been playing a bit with injecting React components dynamically into the viewer div; you can take a look in that same project here:
this.viewer = new Autodesk.Viewing.Private.GuiViewer3D(this.viewerContainer)

// Experimental!
// Use this to render dynamic components
// inserted after the viewer div has been created
const reactViewerNode = document.createElement('div')

this.viewer.container.appendChild(reactViewerNode)

this.viewer.react = {
  node: reactViewerNode,
  components: [],
  addComponent: (component) => {
    this.viewer.react.components.push(component)
  },
  render: (props) => {
    ReactDOM.render(
      <div>
        {
          React.Children.map(
            this.viewer.react.components,
            (child) => React.cloneElement(child, props))
        }
      </div>,
      reactViewerNode)
  }
}
I then render those dynamic components by overriding componentDidUpdate:
componentDidUpdate () {
  if (this.viewer && this.viewer.react) {
    if (this.viewer.react.components.length) {
      this.viewer.react.render(this.props)
    }
  }
}
And here is an example of usage:
viewer.react.addComponent(
  <DBDropdown key="test" className="react-div">
  </DBDropdown>
)
I actually haven't implemented any feature using that in the live version of the app, but it should give you an idea.
Hope that helps.