Goal: Call a function that invokes a fetch call to validate that it works with my backend REST API (end-to-end testing, basically).
Project: A Node module built to be imported into several React web applications. The module contains only fetch calls and minimal logic. It's basically a glorified wrapper for URLs and settings, created to cut down the work required to implement common endpoints used in applications.
Setup: I have a docker-compose file building a Docker test container and pulling in my REST API Docker image (built in a different system). The test container pulls in the packed module and installs it with dependencies. Then it brings up the tests alongside the backend and the other images the backend needs (DB, login system, etc.).
Problem: How do I implement the tests to handle these calls?
Currently I've tried calling the fetch methods directly. This works for my login fetch, but any additional call fails to send the cookie. As far as I understand, the code I have depends on the browser for the cookie. I've tried several solutions to get said cookie, but I've been unable to get fetch or node-fetch to send it properly. My best guess is that each test was creating a new cookie, but I lack the knowledge to fully debug this solution path.
My second solution path was to attempt to use Puppeteer to load a fake page and then evaluate the function in the page, following examples like:
https://github.com/puppeteer/puppeteer/issues/2579
How to use imported function inside page.evaluate in Puppeteer with Jest?
The problem with this is that the tests kept failing to load the required libraries, or hit some other problem.
Code:
Here is the call I'm trying to test, for the most part. Each function I have wraps around this, providing {url: "api/endpoint/path", method: "GET"}, with some passing in a body for larger data posts.
export function request(options) {
    //Build options, merging caller options over the defaults
    options = {
        headers: {
            'Content-Type': 'application/json'
        },
        ...options
    };
    return fetch(options.url, options)
        .then(response => {
            //...
            //Handle errors
            if (!response.ok) {
                return Promise.reject(`${response.status} - ${response.statusText}`);
            }
            //response.json() rejects asynchronously, so the original
            //synchronous try/catch never saw its SyntaxError; read the
            //body once as text and parse it instead
            return response.text().then(text => {
                try {
                    return JSON.parse(text);
                } catch (e) {
                    return Promise.reject(text);
                }
            });
        });
}
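For context on the cookie problem above: in a browser, fetch only attaches cookies when the credentials option is set ('include' for cross-origin calls), and in Node there is no cookie jar at all, so a test has to capture the Set-Cookie header from the login response and replay it by hand. A rough sketch of that idea, assuming node-fetch and a hypothetical login endpoint:

import fetch from 'node-fetch';

let sessionCookie = null;

// Hypothetical login helper: store the cookie the backend hands back
export async function login(user, password) {
    const response = await fetch('http://backend:8080/api/login', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ user, password })
    });
    // node-fetch exposes Set-Cookie directly; a browser would hide it
    sessionCookie = response.headers.get('set-cookie');
    return response.json();
}

// Wrap request() so later calls replay the captured cookie; Content-Type is
// repeated because request() replaces its default headers wholesale
export function authenticatedRequest(options) {
    return request({
        ...options,
        headers: {
            'Content-Type': 'application/json',
            Cookie: sessionCookie,
            ...options.headers
        }
    });
}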
Test example I've tried:
import puppeteer from "puppeteer";
import {myCall} from "my-rest-bridge";

it('Check response', async () => {
    //Load browser
    const browser = await puppeteer.launch({
        headless: true,
        args: ['--no-sandbox']
    });
    const page = await browser.newPage();
    //Load page
    await page.goto('http://docker:8888/index.html');
    //Do login (login must exist in the page's global scope)
    expect(await page.evaluate(() => login('user', 'password'))).toBe(expectedUserResponseJson);
    //Make call; arguments must be passed into evaluate explicitly
    expect(await page.evaluate((a, b) => myCall(a, b), input1, input2)).toBe(expectedCallResponseJson);
    //Close page and browser
    await page.close();
    await browser.close();
})
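(For anyone hitting the same wall: one likely reason the evaluate calls fail to find their libraries is that functions imported in the test file don't exist inside the page; a browser build of the module has to be injected first. A hedged sketch, where the bundle URL is an assumption:)

//Inject a browser bundle of the module before page.evaluate runs,
//so login/myCall exist in the page's global scope
await page.addScriptTag({ url: 'http://docker:8888/my-rest-bridge.browser.js' });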
Took me a while, but I built a solution to my own question. It's not perfect, so if anyone has a better idea, please answer.
So my solution works as follows: I built an additional Git project to create a shell React application inside a Docker image. This application pulls in my node module, iterates through all of the exports, and then generates a component per function.
import React from 'react';
import * as magicNodeModule from "magic-node-module"; //obviously not the real name
import CallRest from "./CallRest";

/** Core component for the application */
export default class App extends React.Component {
    /** Renders the core application */
    render() {
        const restCalls = Object.keys(magicNodeModule);
        return (
            <div id={"App"}>
                <div>
                    Functions found:
                    <ul id={"function-list"}>
                        {/* key props keep React from warning about unkeyed list items */}
                        {restCalls.map(restCall => <li key={restCall}>{restCall}</li>)}
                    </ul>
                    <hr/>
                </div>
                {
                    restCalls.map(restCall => {
                        return (
                            <CallRest key={restCall}
                                      restName={restCall}
                                      restCall={magicNodeModule[restCall]}/>
                        );
                    })
                }
            </div>
        )
    }
}
This component (CallRest) contains an input box, a submit button, and an output div. A user, or in my use case Puppeteer, can enter data into the input. Clicking submit then runs the fetch call and inserts the resulting output into the div. It works very much like Swagger 2 for those who are familiar with that system.
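For illustration, a minimal sketch of what such a CallRest component might look like (the prop names match the App component above; everything else is an assumption):

import React from 'react';

export default class CallRest extends React.Component {
    state = { input: '', output: '' };

    run = () => {
        // parse the textbox as JSON and hand it to the wrapped fetch call
        const args = this.state.input ? JSON.parse(this.state.input) : undefined;
        this.props.restCall(args)
            .then(result => this.setState({ output: JSON.stringify(result) }))
            .catch(error => this.setState({ output: String(error) }));
    };

    render() {
        return (
            <div id={this.props.restName}>
                <input onChange={e => this.setState({ input: e.target.value })}/>
                <button onClick={this.run}>Submit</button>
                {/* puppeteer reads the result out of this div */}
                <div className={"output"}>{this.state.output}</div>
            </div>
        );
    }
}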
The solution is still built up as a series of Docker images inside a compose file, though it did require setting up a reverse proxy to let the React app communicate with the backend API. It also pulls in a fresh copy of the node module as a packed archive and installs it into the Docker image. This way I only have to rebuild the shell React image once in a blue moon.
Related
I was trying the Next 13 beta, and I faced a strange problem. What I am trying to do is fetch data on the server side and display it on the page. However, the fetch operation fails on the server side. Below is the code for the Next.js page. It lives under the 'app' directory, as 'app/pageName/page.js'.
import React from 'react'

async function callApi() {
    const data = await fetch('https://marketplace-api.scistoneprojects.com/api/items/?search=yil');
    return data.json();
}

export default async function Page() {
    const data = await callApi();
    return (
        <main>
            {data.results && data.results.map((product, index) => (
                // a key prop so React can track the list items
                <h1 key={index}>{product.title}</h1>
            ))}
        </main>
    )
}
Error message: UND_ERR_CONNECT_TIMEOUT. The API itself responds normally when called directly (it's a Django REST backend), and the code follows the data-fetching pattern from the Next 13 docs.
Note: The fetch operation fails after ~ 10 seconds.
What I did:
I tried Axios, but it also fails.
I tried adding 'enableUndici: true' to the next config file. (Fails)
I tried other mock APIs, some work some don't. (Weird)
They all work normally on the client side.
They all work normally in the Next 12.
They all work normally on any other React app.
Versions:
node 18.12.0
next 13.1.0
react 18.2.0
react-dom 18.2.0
npm 9.2.0
Machine: Mac Mini M1 (Ventura 13.1)
In my React SSR application I have implemented a service worker (via Workbox).
It's working fine. Every time I change some piece of code, rebuild, run the server, and go to the browser, I see that my cache was updated successfully.
But one thing I can't understand: when I delete some asset (JS or CSS) from my local server and try to do some action in the browser (which invokes that asset), I get a chunk error saying that the file is not available.
The main question: if that asset is already in the cache storage, shouldn't it be loaded from that cache, or have I missed something?
The components I have used are:
Node/Express (for the server)
@loadable/component (for code splitting), combined with webpack
the Google Workbox plugin
// my sw.js file
import { skipWaiting } from 'workbox-core';
import { precacheAndRoute } from 'workbox-precaching';

declare const self: Window & ServiceWorkerGlobalScope;

// precache everything webpack placed in the build manifest
precacheAndRoute(self.__WB_MANIFEST);
// activate the new service worker immediately, replacing the old one
skipWaiting();
// my workbox setup
const serviceWorkerRegistration = async (): Promise<void> => {
const { Workbox } = await import('workbox-window');
const wb = new Workbox('./service-worker.js');
wb.addEventListener('activated', (event: any) => {
if (event.isExternal) {
window.location.reload();
}
});
wb.register();
};
This is due to your usage of skipWaiting() inside of your service worker. When the waiting service worker activates, it will delete all of the outdated precached entries that are no longer associated with the new service worker deployment.
There is more background information in these two closely related answers, as well as in a presentation:
What are the downsides to using skipWaiting and clientsClaim with Workbox?
Workbox: the danger of self.skipWaiting()
Paying Attention while Loading Lazily
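For reference, the usual alternative (covered in the links above) is to drop skipWaiting() from the service worker and only promote the waiting worker once the user agrees, so the old precache stays valid until the page reloads. A sketch using workbox-window's waiting/controlling events (the confirm prompt is just a placeholder):

import { Workbox } from 'workbox-window';

const wb = new Workbox('./service-worker.js');

wb.addEventListener('waiting', () => {
    if (window.confirm('A new version is available. Reload now?')) {
        // reload once the new worker has taken control of the page
        wb.addEventListener('controlling', () => window.location.reload());
        // asks the waiting worker to activate; the service worker must
        // listen for the SKIP_WAITING message for this to have any effect
        wb.messageSkipWaiting();
    }
});

wb.register();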
In my React app (built with the Create React App CLI, and not ejected) I have it set up so that if there is no REACT_APP_API_URL defined, it uses mocked data.
I do this by supplying a fakeFetch function to redux-api-middleware, like so:
import { apiMiddleware as aMl, createMiddleware } from 'redux-api-middleware'
import fakeFetch from 'fixtures/utils/fakeFetch'

// apiBase presumably holds process.env.REACT_APP_API_URL, resolved elsewhere
const apiMiddleware = apiBase ? aMl : createMiddleware({ fetch: fakeFetch })

// etc... configure the `redux` store with the middleware
That's fine when developing, but I'd like for that code to be completely detached from the build when actually building for deployment.
Is there any way I can do something along the lines of
<% if process.env.REACT_APP_API_URL %>
import { apiMiddleware } from 'redux-api-middleware'
<% else %>
import { createMiddleware } from 'redux-api-middleware'
import fakeFetch from 'fixtures/utils/fakeFetch'
const apiMiddleware = createMiddleware({ fetch: fakeFetch })
<% endif %>
// etc... configure the `redux` store with the middleware
to prevent webpack from including all my fixtures / fake data in the production build, while giving me a very simple way to switch between mock and live data?
I do not want to have to eject the app, but am open to using a webpack plugin that's injected using Create React App Configuration Overrides.
I think webpack code-splitting with dynamic imports could be your best bet. This way, your mock data is still bundled into a separate chunk, but that chunk is never sent to the client (which I think is the main goal here).
import { apiMiddleware, createMiddleware } from 'redux-api-middleware'
const getMiddleware = async () => {
if (process.env.REACT_APP_API_URL) {
return apiMiddleware;
} else {
// loads bundle over network
const { default: fakeFetch } = await import('fixtures/utils/fakeFetch');
return createMiddleware({ fetch: fakeFetch });
}
}
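The store setup then just awaits the helper before wiring the middleware in; a quick sketch (the reducer import and factory shape are assumptions):

import { createStore, applyMiddleware } from 'redux';
import rootReducer from './reducers'; // assumed location

export async function configureStore() {
    // resolves to either the real or the mock middleware
    const middleware = await getMiddleware();
    return createStore(rootReducer, applyMiddleware(middleware));
}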
I know this does not answer the question directly, but as a side note, I think the cleanest way would be to utilise a mock server such as mock-server.com. In development, you would just put the mock server URL in process.env.REACT_APP_API_URL. This way, the test data lives in a completely different environment and provides a clear separation of concerns. You could probably also just create a simple local Express app that returns hardcoded JSON if you don't want to use third-party tools.
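Such a local Express app can be a handful of lines; a sketch (the port, route, and fixture data are all made up):

const express = require('express');
const app = express();

app.get('/api/items', (req, res) => {
    // hardcoded fixture data instead of a real backend
    res.json({ results: [{ id: 1, title: 'Mock item' }] });
});

// then point REACT_APP_API_URL at http://localhost:4000
app.listen(4000, () => console.log('Mock API listening on :4000'));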
Is it possible to check if a file exists within the /public/ directory?
I have a set of images that correspond to some objects. When available, I would like to display them using an <img> tag. However, not all of the objects have a corresponding image, in which case I would like to perform a REST request to our server.
I could create a list of files as part of build process, but I would like to avoid that if possible.
I am using create-react-app if it matters (if I understand correctly fs doesn't work in client-side React apps).
EDIT: I guess I should have been more exact in my question - I know client-side JS can't access this information (except through HTTP requests). I was just hoping something saves information during the build about which files are available, in a way that is accessible to client-side JavaScript... Maybe webpack or some extension can do this?
You can do this with axios by requesting the relative path to the corresponding images folder. I have done this for getting a JSON file; you can try the same method for an image file. You may refer to the examples below.
Note: if you have already set up an axios instance with baseURL pointing at a server on a different domain, you will have to use the full path of the static file server where you deploy the web application.
axios.get('http://localhost:3000/assets/samplepic.png').then((response) => {
console.log(response)
}).catch((error) => {
console.log(error)
})
If the image is found the response will be 200 and if not, it will be 404.
Edit: Also, if the image file is present in the assets folder inside src, you can do a require, get the path, and make the above call with that path.
var SampleImagePath = require('./assets/samplepic.png');
axios.get(SampleImagePath).then(...)
First of all, you should keep in mind the client-server architecture of any web app. If you are using create-react-app, you are serving your app via webpack-dev-server in development, so you should think about how you will host your files in production. The most common ways are:
apache2 / nginx
nodejs
but there are a lot of other ways depending on your stack.
With webpack-dev-server, or with apache2 / nginx configured to allow direct file access, it is possible to make direct requests to files. For example, if your files are in the public path:
class MyImage extends React.Component {
    constructor(props) {
        super(props);
        this.state = {
            isExist: null
        }
    }

    componentDidMount() {
        fetch(MY_HOST + '/public/' + this.props.MY_IMAGE_NAME)
            .then(response => {
                // fetch resolves even for a 404, so check the status explicitly
                this.setState({ isExist: response.ok });
            })
            .catch(() => {
                // network-level failure (e.g. server unreachable)
                this.setState({ isExist: false });
            });
    }

    render() {
        if (this.state.isExist === true) {
            return <img src={MY_HOST + "/public/" + this.props.MY_IMAGE_NAME}/>
        }
        return <img src="/public/no-image.jpg"/>
    }
}
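A different trick worth naming: skip the existence check entirely and let the <img> tag report the failure itself via its onError handler, which avoids the extra request. A sketch along the same lines as the component above (the fallback path is an assumption):

class ObjectImage extends React.Component {
    state = { failed: false };

    render() {
        const src = this.state.failed
            ? "/public/no-image.jpg" // fall back when the file 404s
            : MY_HOST + "/public/" + this.props.MY_IMAGE_NAME;
        // the browser fires onError when the image fails to load
        return <img src={src} onError={() => this.setState({ failed: true })}/>;
    }
}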
I'm building a React application bundled using Parcel or Webpack.
The application should be able to embed external React components developed by third parties and hosted elsewhere as modern JavaScript modules:
// https://example.com/scripts/hello-plugin.js
import React from 'react';
export default class HelloPlugin extends React.Component {
render() {
return "Hello from external plugin!";
}
}
The host application loads these components using an asynchronous import, like this for example:
// createAsyncComponent.tsx
import * as React from 'react';
import { asyncComponent } from 'react-async-component';
export default function createAsyncComponent(url: string) {
return asyncComponent({
resolve: () => import(url).then(component => component.default),
LoadingComponent: () => <div>Loading {url}....</div>,
ErrorComponent: ({ error }) => <div>Couldn't load {url}: {error.message}</div>,
})
}
But it looks like bundlers don't allow importing arbitrary URLs as external JavaScript modules.
Webpack emits the build warning "the request of a dependency is an expression" and the import doesn't work. Parcel doesn't report any errors, but fails at runtime when import(url) is reached.
Webpack author recommends using scriptjs or little-loader for loading external scripts.
There is a working sample that loads a UMD component from an arbitrary URL like this:
// $script comes from the scriptjs package mentioned above (import $script from 'scriptjs')
public componentDidMount() {
    // expose dependencies as globals so the UMD bundle can resolve them
    window["React"] = React;
    window["PropTypes"] = PropTypes;
    // async load of remote UMD component
    $script(this.props.url, () => {
        // a UMD bundle registers itself on window under its library name
        const target = window[this.props.name];
        if (target) {
            this.setState({
                Component: target,
                error: null,
            })
        } else {
            this.setState({
                Component: null,
                error: `Cannot load component at ${this.props.url}`,
            })
        }
    });
}
Also, I saw a similar question answered a year ago where the suggested approach likewise involves passing variables via the window object.
But I'd like to avoid using globals, given that most modern browsers support modules out of the box.
I'm wondering if that's possible. Perhaps there is some way to instruct the bundler that my import(url) is not a request for a code-split chunk of the host application, but a request to load an external JavaScript module.
In the context of Webpack, you could do something like this:
import(/* webpackIgnore: true */'https://any.url/file.js')
.then((response) => {
response.main({ /* stuff from app plugins need... */ });
});
Then your plugin file would have something like...
const main = (args) => console.log('The plugin was started.');
export { main };
export default main;
Notice that you can send stuff from your app's runtime to the plugin at initialization (i.e. when invoking main in the plugin), so you don't end up depending on global variables.
You get caching for free as Webpack remembers (caches) that the given URL has already loaded so subsequent calls to import that URL will resolve immediately.
Note: this seems to work in Chrome, Safari & Firefox, but not Edge. I never bothered testing in IE or other browsers.
I've tried doing this same sort of load with the UMD format on the plugin side, and that doesn't seem to work with the way Webpack loads things. In fact, it's interesting that variables declared as globals don't end up in the window object of your runtime; you'd have to explicitly do window.aGlobalValue = ... to get something onto the global scope.
Obviously you could also use requirejs - or similar - in your app and then just have your plugins follow that API.
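Tying this back to the createAsyncComponent helper from the question, the only change needed should be the webpackIgnore comment inside its dynamic import (an untested sketch):

resolve: () => import(/* webpackIgnore: true */ url)
    .then(component => component.default),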
Listen to the Webpack author. You can't do (yet) what you're trying to do with Webpack.
You will have to follow his suggested route.