In my React app (built with Create React App, not ejected) I have it set up so that if REACT_APP_API_URL is not defined, the app uses mocked data.
I do this by supplying a fakeFetch function to redux-api-middleware, like so:
import { apiMiddleware as aMl, createMiddleware } from 'redux-api-middleware'
import fakeFetch from 'fixtures/utils/fakeFetch'
const apiBase = process.env.REACT_APP_API_URL
const apiMiddleware = apiBase ? aMl : createMiddleware({ fetch: fakeFetch })
// etc... configure the `redux` store with the middleware
That's fine when developing, but I'd like that code to be excluded from the build entirely when building for deployment.
Is there any way I can do something along the lines of
<% if process.env.REACT_APP_API_URL %>
import { apiMiddleware } from 'redux-api-middleware'
<% else %>
import { createMiddleware } from 'redux-api-middleware'
import fakeFetch from 'fixtures/utils/fakeFetch'
const apiMiddleware = createMiddleware({ fetch: fakeFetch })
<% endif %>
// etc... configure the `redux` store with the middleware
to prevent webpack from including all my fixtures / fake data in the production build, while giving me a very simple way to switch between mock and live data?
I do not want to have to eject the app, but am open to using a webpack plugin that's injected using Create React App Configuration Overrides.
I think webpack code splitting with dynamic imports could be your best bet. That way, your mock data is still bundled, but it is never sent to the client (which I think is the main goal here).
import { apiMiddleware, createMiddleware } from 'redux-api-middleware'
const getMiddleware = async () => {
if (process.env.REACT_APP_API_URL) {
return apiMiddleware;
} else {
// loads bundle over network
const { default: fakeFetch } = await import('fixtures/utils/fakeFetch');
return createMiddleware({ fetch: fakeFetch });
}
}
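Since getMiddleware is asynchronous, store creation has to wait for it; a rough sketch of the wiring (the store and reducer names are placeholders):
import { createStore, applyMiddleware } from 'redux';
import rootReducer from './reducers'; // placeholder for your root reducer

getMiddleware().then(apiMiddleware => {
  const store = createStore(rootReducer, applyMiddleware(apiMiddleware));
  // ...render the app with this store
});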
I know this does not answer the question directly, but as a side note I think the cleanest way would be to use a mock server such as mock-server.com. In development, you would just put the mock server URL in process.env.REACT_APP_API_URL. That way the test data lives in a completely different environment and provides a clear separation of concerns. If you don't want to use third-party tools, you could also just create a simple local Express app that returns hardcoded JSON.
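For example, a bare-bones local Express mock could look like this (the routes and port are just illustrative):
// mock-server.js – bare-bones local mock API (routes and port are illustrative)
const express = require('express');
const app = express();

app.get('/api/users', (req, res) => {
  res.json([{ id: 1, name: 'Mock User' }]);
});

app.listen(4000, () => console.log('Mock API listening on http://localhost:4000'));
// then point REACT_APP_API_URL at http://localhost:4000 during development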
I am looking through the Next.js documentation and trying to understand the suggested approach for setting URLs that change between environments. Mostly, I want to make sure I am pointing at the correct backend URLs in development versus production.
I suppose I could create a constants configuration file, but is there a supported best practice for this?
Open next.config.js and add a publicRuntimeConfig section with your constants:
module.exports = {
publicRuntimeConfig: {
// Will be available on both server and client
yourKey: 'your-value'
},
}
You can then read it from another .js file like this:
import getConfig from 'next/config'
const { publicRuntimeConfig } = getConfig()
console.log(publicRuntimeConfig.yourKey)
or even reference it in a view (JSX) like this:
{publicRuntimeConfig.yourKey}
You can configure your Next.js app using next-runtime-dotenv; it allows you to specify server-only / client-only values via Next's runtime config.
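The corresponding next.config.js would look roughly like this; the public/server option names are my recollection of the package's usage, so treat this as a sketch and verify against its README:
// next.config.js – sketch of wiring next-runtime-dotenv (option names assumed)
const nextRuntimeDotenv = require('next-runtime-dotenv');

const withConfig = nextRuntimeDotenv({
  public: ['MY_API_URL'],   // exposed to the client via publicRuntimeConfig
  server: ['GITHUB_TOKEN'], // kept server-side via serverRuntimeConfig
});

module.exports = withConfig({
  // ...the rest of your Next.js config
});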
Then in some component
import getConfig from 'next/config'
const {
publicRuntimeConfig: {MY_API_URL}, // Available both client and server side
serverRuntimeConfig: {GITHUB_TOKEN} // Only available server side
} = getConfig()
function HomePage() {
// Will display the variable on the server’s console
// Will display undefined in the browser’s console
console.log(GITHUB_TOKEN)
return (
<div>
My API URL is {MY_API_URL}
</div>
)
}
export default HomePage
If you don't need this separation, you can use the dotenv lib to load your .env file and configure Next's env property with it.
// next.config.js
require('dotenv').config()
module.exports = {
env: {
// Reference a variable that was defined in the .env file and make it available at Build Time
TEST_VAR: process.env.TEST_VAR,
},
}
Check this with-dotenv example.
Goal: Call a function that invokes a fetch call to validate that it works with my backend REST API (basically end-to-end testing).
Project: a node module built to be imported into several React web applications. The module contains only fetch calls and minimal logic. It's basically a glorified wrapper for URLs and settings, created to cut down the work required to implement common endpoints used in applications.
Setup: I have a docker-compose setup that builds a docker test container and pulls in my REST API docker image (built in a different system). The test container pulls in the packed module and installs it with its dependencies. Then it brings up the tests alongside the backend plus the other images the backend needs (DB, login system, etc.).
Problem: How to implement the tests to handle the calls.
Currently I've tried calling the fetch methods directly. This works for my login fetch, but any additional call fails to send the cookie. As far as I understand, the code I have depends on the browser for the cookie. I've tried several ways to get that cookie, but I've been unable to get fetch or node-fetch to send it properly. My best guess is that each test was creating a new cookie, but I lack the knowledge to fully debug this solution path.
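For context, the kind of manual cookie forwarding I was attempting with node-fetch looks roughly like this (the host, paths and fields are placeholders, not my real code):
// Illustrative only – placeholders for host, paths and credentials
const fetch = require('node-fetch');

async function loginAndCall() {
  const loginRes = await fetch('http://backend:8888/api/login', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ user: 'user', password: 'password' }),
  });
  // grab the session cookie from the login response...
  const cookie = loginRes.headers.get('set-cookie');
  // ...and forward it manually on the next request
  const res = await fetch('http://backend:8888/api/endpoint/path', {
    headers: { 'Content-Type': 'application/json', cookie },
  });
  return res.json();
}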
My second solution path was to attempt to use Puppeteer to load a fake page and then evaluate the function in the page, following examples like:
https://github.com/puppeteer/puppeteer/issues/2579
How to use imported function inside page.evaluate in Puppeteer with Jest?
The problem with this approach is that the tests kept failing to load the required libraries, or hit some other problem.
Code:
Here is the call I'm trying to test, for the most part. Each function I have wraps around this, providing {url: "api/endpoint/path", method: "GET"}, with some passing in a body for larger data posts.
export function request(options) {
//Build options
options = {
headers: {
'Content-Type': 'application/json'
},
...options
};
  return fetch(options.url, options)
    .then(response => {
      //...
      //Handle errors
      if (!response.ok) {
        return Promise.reject(`${response.status} - ${response.statusText}`);
      }
      // response.json() rejects asynchronously, so a try/catch here would never
      // see the SyntaxError; read the body as text and parse it instead
      return response.text().then(text => {
        try {
          return JSON.parse(text);
        } catch (e) {
          return Promise.reject(text);
        }
      });
    });
}
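For example, one of those wrapper functions is little more than this (the function name is made up; the path matches the placeholder above):
// Illustrative wrapper – name and endpoint path are placeholders
export function getEndpoint() {
  return request({ url: 'api/endpoint/path', method: 'GET' });
}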
Test example I've tried:
import puppeteer from "puppeteer";
import {myCall} from "my-rest-bridge";
it('Check response', async () => {
//Load browser
const browser = await puppeteer.launch({
headless: true,
args: ['--no-sandbox']
});
const page = await browser.newPage();
//Load page
await page.goto('http://docker:8888/index.html');
//Do login
expect(await page.evaluate(() => login('user', 'password'))).toBe(expectedUserResponseJson);
//Make call
expect(await page.evaluate(() => myCall(input1, input2))).toBe(expectedCallResponseJson);
//Close page and browser
await page.close();
await browser.close();
})
It took me a while, but I built a solution to my own question. It's not perfect, so if anyone has a better idea, please answer.
My solution works as follows. I built an additional git project that creates a shell ReactJS application inside a docker image. This application pulls in my node module, iterates through all of its exports, and then generates a component per function.
import React from 'react';
import * as magicNodeModule from "magic-node-module"; //obviously not the real name
import CallRest from "./CallRest";
/** Core component for the application */
export default class App extends React.Component {
/** Renders the core application */
render() {
const restCalls = Object.keys(magicNodeModule);
return (
<div id={"App"}>
<div>
Functions found:
<ul id={"function-list"}>
{restCalls.map(restCall => <li key={restCall}>{restCall}</li>)}
</ul>
<hr/>
</div>
{
restCalls.map(restCall => {
return (
<CallRest key={restCall} restName={restCall} restCall={magicNodeModule[restCall]}/>
);
})
}
</div>
)
}
}
This component (CallRest) contains an input box, a submit button, and an output div. A user, or in my use case Puppeteer, can type data into the input. Clicking submit then runs the fetch call and inserts the resulting output into the div. It works very much like Swagger 2, for those familiar with that system.
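A stripped-down sketch of what such a CallRest component can look like (illustrative, not my exact code):
// CallRest.js – illustrative sketch, not the exact component
import React from 'react';

export default class CallRest extends React.Component {
  state = { input: '', output: '' };

  run = () => {
    // restCall is one of the exported fetch wrappers from the node module
    const args = this.state.input ? JSON.parse(this.state.input) : undefined;
    this.props.restCall(args)
      .then(result => this.setState({ output: JSON.stringify(result) }))
      .catch(error => this.setState({ output: String(error) }));
  };

  render() {
    return (
      <div id={`call-${this.props.restName}`}>
        <h3>{this.props.restName}</h3>
        <input
          value={this.state.input}
          onChange={e => this.setState({ input: e.target.value })}
        />
        <button onClick={this.run}>Submit</button>
        <div className="output">{this.state.output}</div>
      </div>
    );
  }
}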
The solution is still built up as a series of docker images inside a compose file, though it did require setting up a reverse proxy so the React app could talk to the backend API. It also pulls in a fresh copy of the node module as a packed archive and installs it into the docker image. This way I only have to rebuild the shell React image once in a blue moon.
I'm building a React application bundled with Parcel or Webpack.
The application should be able to embed external React components
developed by third parties and hosted elsewhere as modern JavaScript modules:
// https://example.com/scripts/hello-plugin.js
import React from 'react';
export default class HelloPlugin extends React.Component {
render() {
return "Hello from external plugin!";
}
}
The host application loads these components using an asynchronous import, for example:
// createAsyncComponent.tsx
import * as React from 'react';
import { asyncComponent } from 'react-async-component';
export default function createAsyncComponent(url: string) {
return asyncComponent({
resolve: () => import(url).then(component => component.default),
LoadingComponent: () => <div>Loading {url}....</div>,
ErrorComponent: ({ error }) => <div>Couldn't load {url}: {error.message}</div>,
})
}
But it looks like bundlers don't allow importing arbitrary URLs as external JavaScript modules.
Webpack emits a build warning ("the request of a dependency is an expression") and the import doesn't work. Parcel doesn't report any errors, but fails at runtime when import(url) is executed.
The Webpack author recommends using scriptjs or little-loader for loading external scripts.
There is a working sample that loads a UMD component from an arbitrary URL like this:
public componentDidMount() {
// expose dependencies as globals
window["React"] = React;
window["PropTypes"] = PropTypes;
// async load of remote UMD component
$script(this.props.url, () => {
const target = window[this.props.name];
if (target) {
this.setState({
Component: target,
error: null,
})
} else {
this.setState({
Component: null,
error: `Cannot load component at ${this.props.url}`,
})
}
});
}
Also, I saw a similar question answered a year ago where the suggested approach likewise involves passing variables via the window object.
But I'd like to avoid using globals, given that most modern browsers support modules out of the box.
I'm wondering whether this is possible. Is there any way to instruct the bundler that my import(url) is not a request for a code-split chunk of the host application, but a request to load an external JavaScript module?
In the context of Webpack, you could do something like this:
import(/* webpackIgnore: true */'https://any.url/file.js')
.then((response) => {
response.main({ /* stuff from app plugins need... */ });
});
Then your plugin file would have something like...
const main = (args) => console.log('The plugin was started.');
export { main };
export default main;
Notice that you can pass things from your app's runtime to the plugin at initialization (i.e. when invoking the plugin's main), so you don't end up depending on global variables.
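For example, the host side of that hand-off can look roughly like this (loadPlugin is just an illustrative name):
// Host side – rough sketch of loading a plugin and handing it shared dependencies
import React from 'react';

async function loadPlugin(url) {
  // webpackIgnore keeps Webpack from treating this as one of its own code-split chunks
  const plugin = await import(/* webpackIgnore: true */ url);
  // pass the host's React instance (or anything else the plugin needs) instead of relying on globals
  return plugin.main({ React });
}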
You get caching for free: a given URL is only fetched once, so subsequent calls to import that URL resolve immediately.
Note: this seems to work in Chrome, Safari and Firefox, but not Edge. I never bothered testing in IE or other browsers.
I've tried doing this same sort of load with the UMD format on the plugin side, and that doesn't seem to work with the way Webpack loads things. Interestingly, variables declared as globals in the plugin don't end up in the window object of your runtime; you'd have to explicitly do window.aGlobalValue = ... to put something on the global scope.
Obviously you could also use RequireJS (or similar) in your app and then just have your plugins follow that API.
Listen to the Webpack author. You can't (yet) do what you're trying to do with Webpack.
You will have to follow his suggested route.
I would like to create a custom page using React, but I cannot find documentation for this. The SonarQube documentation only covers creating a custom page with plain JavaScript, and I don't understand how the example plugin works with React.
Can you tell me if there is documentation I can use?
Short answer: there isn't. Barely anyone (no one, in fact, as far as I've seen) is using custom pages currently.
However, it IS possible. You need to create a React project with Webpack (or a similar JS bundler).
I also recommend using Create React App, which handles a lot of the setup for you. After that, in your index.js you use the example code from the SonarQube wiki.
Here is an example:
/*
PRODUCTION ENTRYPOINT
*/
import React from 'react';
import ReactDOM from 'react-dom';
import Project from './components/Project';
import './main.css';
window.registerExtension('myplugin/coverage', function (options) {
appendCustomCSS();
let isDisplayed = true;
window.SonarRequest.getJSON('/api/measures/component', {
component: options.component.key,
metricKeys: 'coverage'
}).then(function (response) {
if (isDisplayed) {
let obj = JSON.parse(response.component.measures[0].value);
let div = document.createElement('div');
render(obj, div);
options.el.appendChild(div);
}
});
return function () {
isDisplayed = false;
};
});
function appendCustomCSS() {
let fileref = document.createElement("link");
fileref.setAttribute("rel", "stylesheet");
fileref.setAttribute("type", "text/css");
fileref.setAttribute("href", "/static/myplugin/coverage.css");
document.head.append(fileref);
}
function render(objectArray, container) {
ReactDOM.render(<div className="Coverage"><Project objects={objectArray}/></div>, container);
}
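The Project component itself is not part of the wiki snippet; a minimal placeholder for it could be as simple as this (purely illustrative):
// components/Project.js – purely illustrative placeholder
import React from 'react';

export default function Project({ objects }) {
  // just dump whatever measure data was parsed; replace with your real rendering
  return <pre className="Project">{JSON.stringify(objects, null, 2)}</pre>;
}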
Is it possible to set up a project that has code for both React Native (mobile app) and React (web), with the code shared between platforms except for the UI part?
I have done something similar with Angular + NativeScript using this seed, which enables code sharing between the native app and the web application (except for the UI layer). I'm looking for something similar for React + React Native.
Please also share if you know of any such seed for React Native + Angular, if available.
Jonathan Kaufman has a good article on how to set this up: http://jkaufman.io/react-web-native-codesharing/
The basic strategy is to have a different entry point (index.js) for each platform (Android/iOS/web). The majority of your non-rendering code can then live in a shared app or common folder. You'll still need to segregate your rendering code (i.e. uses of View, div, etc.), though, as that differs by platform.
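A minimal sketch of that entry-point split (file and folder names are illustrative):
// index.web.js – web entry point (illustrative)
import React from 'react';
import ReactDOM from 'react-dom';
import App from './app/App.web'; // web-specific UI built on the shared code in ./app

ReactDOM.render(<App />, document.getElementById('root'));

// index.ios.js / index.android.js – native entry points (illustrative)
// import { AppRegistry } from 'react-native';
// import App from './app/App.native';
// AppRegistry.registerComponent('MyApp', () => App);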
Pay attention to the comments on that article as well, as there's some good discussion on the pitfalls of this approach. Example:
By sharing a common package.json between native and web, you've glued them together by their common dependencies, the most important one being react. Let's say you upgrade to a version of react-native that depends on >= react#16, but your web app depends on some other library which depends on =< react#15. --timtas
You can give React-Native-Web a try, but IMHO you should create two different projects and isolate and copy what can be used in both (like API requests and utility functions). Your code will be easier to debug and maintain.
Yes, absolutely possible. We've done it before using the react-native-web lib: https://github.com/necolas/react-native-web
Besides index.ios.js and index.android.js, you will need to create index.web.js; its content should be similar to this:
import { AppRegistry } from 'react-native';
import App from './app/Containers/App/App.container';
AppRegistry.registerComponent('ReactNativeWeb', () => App);
AppRegistry.runApplication('ReactNativeWeb', { rootTag: document.getElementById('react-app') });
You also need to write your own Node.js code to serve up the bundle (see the full reference).
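A minimal sketch of such a server, assuming the web bundle is emitted to a dist/ folder (names and paths are illustrative):
// server.js – minimal static server for the web bundle (illustrative paths)
const express = require('express');
const path = require('path');

const app = express();
// serve the bundled JS plus the HTML page that contains <div id="react-app">
app.use(express.static(path.join(__dirname, 'dist')));
app.listen(3000, () => console.log('Web build served at http://localhost:3000'));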
I do it this way.
1) Create a React Native project.
2) Add react-dom and react-scripts dependencies to package.json and install them.
3) Separate all the component code as follows:
My regular React Native component:
// MyComponent.js
class MyComponent extends React.Component {
constructor(props) {
...
}
someMethod() {
...
}
render() {
return (
<View>
...
</View>
)
}
}
Changed for use on the web:
// MyComponentController.js
class MyComponentController extends React.Component {
constructor(props) {
...
}
someMethod() {
...
}
}
// MyComponent.js
const MyComponentController = require('./MyComponentController')
class MyComponent extends MyComponentController {
render() {
return (
<div>
...
</div>
)
}
}
// MyComponent.native.js
const MyComponentController = require('./MyComponentController')
class MyComponent extends MyComponentController {
render() {
return (
<View>
...
</View>
)
}
}
And then I use it on all the platforms:
const MyComponent = require('./MyComponent')
For this to work nicely with an old project I had to implement some dummies, but it can all be done better with your own layer of abstraction. Part of my example:
const ReactNative = {
Platform: {
OS: 'web'
},
AppRegistry: {
registerComponent: (name, provider) => {
const Component = provider()
ReactDOM.render(
<Component />,
document.getElementById('root')
);
}
},
AsyncStorage: {
    setItem: (key, value, callback) => {
      localStorage.setItem(key, value)
      // callback is optional in the real AsyncStorage API
      if (callback) callback()
      return Promise.resolve()
    },
getItem: key => {
const val = localStorage.getItem(key) || null
return new Promise(ok => ok(val))
}
},
StyleSheet: {
create: dict => dict
},
Dimensions: {
get: function() {
// http://stackoverflow.com/questions/3437786/get-the-size-of-the-screen-current-web-page-and-browser-window
const w = window
const d = document
const e = d.documentElement
const g = d.getElementsByTagName('body')[0]
const x = w.innerWidth || e.clientWidth || g.clientWidth
const y = w.innerHeight || e.clientHeight || g.clientHeight
return {
width: x,
height: y
}
}
},
Linking: {
openURL: (url) => {
window.open(url)
}
},
// etc, I add dummies as soon as I need them
}
But, as I said, this was only necessary because I did not have much time and had not known in advance that I would have to port to the web.
You can try the repo that I prepared:
https://mehmetkaplan.github.io/react-spa-jwt-authentication-boilerplate/
It has a step-by-step guideline for sharing common logic between React and React Native applications.
It aims to differ only in the presentation layer; all other logic is built to be shared between the applications.
It also comes with Facebook and Google logins, database (MySQL) integration, WebView task generation, etc.
It also covers the fundamental know-how on single-page applications, JWT (JSON Web Token) based security, and so on.
Once you have read the README, you can simply clone the repo, set up your environment (database), and start developing business logic on top of the shared code structure and security baseline.
You can create three stand-alone applications: React, React Native and the server.
Both the React and React Native apps will use the same services from your back-end app.
Otherwise, go with a single app that detects the device when the home page loads and renders the React or React Native code accordingly.