I have a running GraphQL-based server and I am able to test it properly using GraphiQL. But I am unable to understand how Relay is implemented on top of it.
Since GraphQL lives on the server side, how is its schema transferred to Relay over the network layer at the client end? Or how does it point at the same GraphQL schema?
It takes a little extra work. Basically you have to dump a serialized version of the schema to a file on the server side, then move that file over to the client and include it during Babel compilation. Facebook has a Babel plugin that takes this file and builds it into the bundle so that Relay knows about the schema.
EDIT: here is a snippet for how to dump the schema file to JSON
import { graphql } from 'graphql'
import { introspectionQuery, printSchema } from 'graphql/utilities'

/*
  Generates JSON of our schema for use by Relay
*/
export default function (schema) {
  return new Promise((resolve, reject) => {
    graphql(schema, introspectionQuery).then(result => {
      if (result.errors) {
        console.error(`ERROR introspecting schema: ${result.errors}`)
        reject(new Error(result.errors))
      } else {
        resolve({ json: result, graphql: printSchema(schema) })
      }
    })
  })
}
After you obtain that, you have to npm install babel-relay-plugin and include it in your Babel plugins (probably in the webpack config, if you're using that).
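For reference, here is a minimal sketch of how that plugin is typically wired up, assuming the introspection result from the snippet above was written out to schema.json (file names and paths are illustrative; check the plugin's README for your setup):

// babelRelayPlugin.js
const getBabelRelayPlugin = require('babel-relay-plugin')
// Adjust the property access to match however you saved the introspection result
const schemaData = require('./schema.json').data
module.exports = getBabelRelayPlugin(schemaData)

Then reference the wrapper from your Babel config:

// .babelrc
{
  "plugins": ["./babelRelayPlugin"]
}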
I am currently stuck on a problem trying to fetch GitHub repo data using the octokit npm package.
I use Vite to run a dev server, and when I try to make a request, the error that I get is:
Uncaught Error: Module "stream" has been externalized for browser compatibility and cannot be accessed in client code.
My React .tsx file looks like this:
import { Octokit, App } from 'octokit'
import React from 'react'

const key = import.meta.env.GITHUB_KEY

const octokit = new Octokit({
  auth: key
})

await octokit.request('GET /repos/{owner}/{repo}', {
  owner: 'OWNER',
  repo: 'REPO'
})

export default function Repos() {
  return (
    <>
    </>
  )
}
I have redacted the information for privacy purposes.
If anyone knows how to resolve this issue with vite, please let me know!
Check first if this is similar to octokit/octokit.js issue 2126
I worked around this problem by aliasing node-fetch to isomorphic-fetch. No idea if it works for all usages within octokit, but it works fine for my project.
You'll need to install the isomorphic-fetch dependency before making this config change.
// svelte.config.js
const config = {
  // ...
  kit: {
    // ...
    vite: {
      resolve: {
        alias: {
          'node-fetch': 'isomorphic-fetch',
        },
      },
    },
  },
};

export default config;
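Since the question uses plain Vite with React rather than SvelteKit, the same alias presumably belongs directly in vite.config.js. An untested sketch of that idea (the react plugin import assumes @vitejs/plugin-react is installed):

// vite.config.js
import { defineConfig } from 'vite'
import react from '@vitejs/plugin-react'

export default defineConfig({
  plugins: [react()],
  resolve: {
    alias: {
      // Same workaround as the SvelteKit config above
      'node-fetch': 'isomorphic-fetch',
    },
  },
})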
Note: there are still questions about the support/development of octokit: issue 620.
For ApolloClient we can use either SchemaLink or HttpLink.
Is there any advantage to using one or the other?
// SchemaLink
function createLink() {
  const { SchemaLink } = require('@apollo/client/link/schema')
  const { schema } = require('../server/schema')
  return new SchemaLink({ schema })
}

// HttpLink
function createLink() {
  const { HttpLink } = require('@apollo/client/link/http')
  return new HttpLink({
    uri: '/api/graphql',
    credentials: 'same-origin'
  })
}

// ApolloClient
function createApolloClient() {
  return new ApolloClient({
    link: createLink(),
    cache: new InMemoryCache()
  })
}
The HttpLink forwards GraphQL queries through an HTTP POST request to the provided endpoint uri.
The SchemaLink, well, the documentation explains what it does:
The schema link provides a graphql execution environment, which allows you to perform GraphQL operations on a provided schema. This type of behavior is commonly used for server-side rendering (SSR) to avoid network calls and mocking data.
So while the HttpLink is used to hit a remote GraphQL server over the network (usually in a client application context), the SchemaLink is used when the server is available locally and the schema can be imported from the same codebase, as in a server-side project like your example snippet.
Is there any advantage to using one or the other?
They're meant for different use cases. While nothing stops you from making HTTP calls on the server to an endpoint on that same server (e.g. localhost), using the SchemaLink to query a local schema might be better depending on the project's architecture (is the GraphQL server deployed to an external host where the schema wouldn't be available locally? etc.).
They could also both be used, e.g. HttpLink on the browser side and SchemaLink for server-side rendering of the same React application.
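To illustrate that combined setup, here is a sketch that picks the link based on where the code runs, reusing the paths from the snippets above (the require inside the branch keeps server-only code out of the client bundle):

import { ApolloClient, InMemoryCache, HttpLink } from '@apollo/client'
import { SchemaLink } from '@apollo/client/link/schema'

function createIsomorphicLink() {
  if (typeof window === 'undefined') {
    // Server-side rendering: execute operations directly against the local
    // schema, skipping the network round-trip
    const { schema } = require('../server/schema')
    return new SchemaLink({ schema })
  }
  // Browser: send operations over HTTP to the GraphQL endpoint
  return new HttpLink({ uri: '/api/graphql', credentials: 'same-origin' })
}

function createApolloClient() {
  return new ApolloClient({
    ssrMode: typeof window === 'undefined',
    link: createIsomorphicLink(),
    cache: new InMemoryCache()
  })
}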
I am looking through the Next.js documentation and trying to understand the suggested approach for setting URLs that change between environments. Mostly, I want to ensure that I'm pointing at backend URLs correctly in development versus production.
I suppose I could create a constants configuration file, but is there a supported best practice for this?
Open next.config.js and add a publicRuntimeConfig section with your constants:
module.exports = {
  publicRuntimeConfig: {
    // Will be available on both server and client
    yourKey: 'your-value'
  },
}
You can then read it from another .js file like this:
import getConfig from 'next/config'
const { publicRuntimeConfig } = getConfig()
console.log(publicRuntimeConfig.yourKey)
or even reference it from a view (inside JSX) like this:
{publicRuntimeConfig.yourKey}
You can configure your Next app using next-runtime-dotenv, which allows you to specify serverOnly / clientOnly values using Next's runtime config.
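A sketch of the corresponding next.config.js, following the pattern from the package's README (the exact option names should be double-checked there):

// next.config.js
const nextRuntimeDotenv = require('next-runtime-dotenv')

const withConfig = nextRuntimeDotenv({
  public: ['MY_API_URL'],   // exposed to the client via publicRuntimeConfig
  server: ['GITHUB_TOKEN']  // kept server-side via serverRuntimeConfig
})

module.exports = withConfig({
  // ...the rest of your Next config
})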
Then, in some component:
import getConfig from 'next/config'
const {
  publicRuntimeConfig: { MY_API_URL },   // Available both client and server side
  serverRuntimeConfig: { GITHUB_TOKEN }  // Only available server side
} = getConfig()

function HomePage() {
  // Will display the variable on the server's console
  // Will display undefined in the browser's console
  console.log(GITHUB_TOKEN)
  return (
    <div>
      My API URL is {MY_API_URL}
    </div>
  )
}

export default HomePage
If you don't need this separation, you can use the dotenv lib to load your .env file and configure Next's env property with it.
// next.config.js
require('dotenv').config()

module.exports = {
  env: {
    // Reference a variable that was defined in the .env file and make it available at build time
    TEST_VAR: process.env.TEST_VAR,
  },
}
Check this with-dotenv example.
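Because the env property inlines values at build time, the variable can then be read directly from process.env anywhere in the app; a minimal illustrative page:

// pages/index.js
// TEST_VAR was inlined at build time via the env property above
export default function Home() {
  return <div>TEST_VAR is {process.env.TEST_VAR}</div>
}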
Goal: call a function that invokes a fetch call, to validate that it works with my backend REST API (end-to-end testing, basically).
Project: a Node module built to be imported into several React web applications. The module contains only fetch calls and minimal logic; it's basically a glorified wrapper for URLs and settings, created to cut down the work required to implement common endpoints across applications.
Setup: I have a Docker Compose file building a test container and pulling in my REST API Docker image (built in a different system). The test container pulls in the packed module and installs it with dependencies. Then it brings up the tests alongside the backend and the other images the backend needs (DB, login system, etc.).
Problem: how to implement the tests that exercise these calls.
Currently I've tried calling the fetch methods directly. This works for my login fetch, but any additional call fails to send the cookie. As far as I understand, the code I have depends on the browser for the cookie. I've tried several solutions to get said cookie, but I've been unable to get fetch or node-fetch to send it properly. My best guess is that each test was creating a new cookie, but I lack the knowledge to fully debug this solution path.
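For reference, a common workaround in Node is to capture the Set-Cookie header from the login response and replay it by hand, since Node has no browser cookie jar. A hypothetical sketch, with made-up endpoints:

// node-fetch has no cookie jar, so carry the session cookie manually
import fetch from 'node-fetch'

let sessionCookie

async function login(user, password) {
  const res = await fetch('http://docker:8888/api/login', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ user, password })
  })
  // Save the cookie for later requests; for multiple cookies,
  // node-fetch v2 exposes res.headers.raw()['set-cookie']
  sessionCookie = res.headers.get('set-cookie')
  return res.json()
}

async function authedFetch(url, options = {}) {
  return fetch(url, {
    ...options,
    headers: { ...options.headers, cookie: sessionCookie }
  })
}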
My second solution path was to attempt to use Puppeteer to load a fake page and then evaluate the function in the page, following examples like:
https://github.com/puppeteer/puppeteer/issues/2579
How to use imported function inside page.evaluate in Puppeteer with Jest?
The problem with this was that the tests kept failing to load the required libraries, or hit some other issue.
Code:
Here is the call I'm trying to test, for the most part. Each function I have wraps around this, providing {url: "api/endpoint/path", method: "GET"}, with some passing in a body for larger data posts.
export function request(options) {
  // Build options
  options = {
    headers: {
      'Content-Type': 'application/json'
    },
    ...options
  };

  return fetch(options.url, options)
    .then(response => {
      // ...
      // Handle errors
      if (!response.ok) {
        return Promise.reject(`${response.status} - ${response.statusText}`);
      }
      try {
        return response.json();
      } catch (e) {
        if (e.name === 'SyntaxError') {
          return Promise.reject(response.text());
        }
      }
    });
}
Test example I've tried:
import puppeteer from "puppeteer";
import { myCall } from "my-rest-bridge";

it('Check response', async () => {
  // Load browser
  const browser = await puppeteer.launch({
    headless: true,
    args: ['--no-sandbox']
  });
  const page = await browser.newPage();

  // Load page
  await page.goto('http://docker:8888/index.html');

  // Do login
  expect(await page.evaluate(() => login('user', 'password'))).toBe(expectedUserResponseJson);

  // Make call
  expect(await page.evaluate(() => myCall(input1, input2))).toBe(expectedCallResponseJson);

  // Close page
  await page.close();
})
Took me a while, but I built a solution to my own question. It's not perfect, so if anyone has a better idea, please answer.
My solution works as follows. I built an additional git project that creates a shell ReactJS application inside a Docker image. This application pulls in my Node module, iterates through all the exports, and then generates a component per function.
import React from 'react';
import * as magicNodeModule from "magic-node-module"; // obviously not the real name
import CallRest from "./CallRest";

/** Core component for the application */
export default class App extends React.Component {
  /** Renders the core application */
  render() {
    const restCalls = Object.keys(magicNodeModule);

    return (
      <div id={"App"}>
        <div>
          Functions found:
          <ul id={"function-list"}>
            {restCalls.map(restCall => <li key={restCall}>{restCall}</li>)}
          </ul>
          <hr/>
        </div>
        {
          restCalls.map(restCall => {
            return (
              <CallRest key={restCall}
                        restName={restCall}
                        restCall={magicNodeModule[restCall]}/>
            );
          })
        }
      </div>
    )
  }
}
This component (CallRest) contains an input box, a submit button, and an output div. A user, or in my use case Puppeteer, can enter data into the input; clicking submit then runs the fetch call and inserts the resulting output into the div. It works very much like Swagger 2, for those who are familiar with that system.
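The answer doesn't show CallRest itself; a hypothetical sketch of what such a component might look like, based on that description (prop names match the App component above, everything else is invented):

import React from 'react';

/** Hypothetical sketch: input box, submit button, and an output div */
export default class CallRest extends React.Component {
  state = { input: '', output: null };

  handleSubmit = () => {
    // this.props.restCall is one of the module's exported fetch wrappers;
    // what it accepts depends on the wrapper, so the raw input is passed as-is
    Promise.resolve(this.props.restCall(this.state.input))
      .then(result => this.setState({ output: JSON.stringify(result) }))
      .catch(error => this.setState({ output: String(error) }));
  };

  render() {
    return (
      <div>
        <h3>{this.props.restName}</h3>
        <input value={this.state.input}
               onChange={e => this.setState({ input: e.target.value })}/>
        <button onClick={this.handleSubmit}>Submit</button>
        <div className="output">{this.state.output}</div>
      </div>
    );
  }
}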
The solution is still built up as a series of Docker images inside a Compose file, though it did require setting up a reverse proxy so the React app could communicate with the backend API. It also pulls in a fresh copy of the Node module as a packed archive (npm pack) and installs it into the Docker image. This way I only have to rebuild the shell React image once in a blue moon.
Is it possible to check if a file exists within the /public/ directory?
I have a set of images that correspond to some objects. When available, I would like to display them using an <img> tag. However, not all of the objects have a corresponding image, in which case I would like to perform a REST request to our server.
I could create a list of files as part of the build process, but I would like to avoid that if possible.
I am using create-react-app if it matters (if I understand correctly fs doesn't work in client-side React apps).
EDIT: I guess I should have been more exact in my question. I know client-side JS can't access this information (except through HTTP requests); I was just hoping something saves information during the build about which files are available, in a way that is accessible to client-side JavaScript. Maybe webpack or some extension can do this?
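For what it's worth, webpack can bake a list of files into the bundle at build time with require.context, though only for files under src, not /public/. A hedged sketch (the directory and extensions are assumptions):

// imageIndex.js
// Webpack resolves require.context at build time, so the bundle ships
// with a static list of the images that existed when it was built
const imageContext = require.context('./assets/images', false, /\.(png|jpe?g)$/);
const availableImages = new Set(imageContext.keys()); // keys look like './foo.png'

export function hasLocalImage(fileName) {
  return availableImages.has(`./${fileName}`);
}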
You can do this with axios by setting a relative path to the corresponding images folder. I have done this for getting a JSON file; you can try the same method for an image file. You may refer to these examples:
Note: if you have already set up an axios instance with a baseURL pointing to a server in a different domain, you will have to use the full path of the static file server where you deploy the web application.
axios.get('http://localhost:3000/assets/samplepic.png').then((response) => {
  console.log(response)
}).catch((error) => {
  console.log(error)
})
If the image is found, the response will be 200; if not, it will be 404.
Edit: also, if the image file is present in an assets folder inside src, you can do a require, get the path, and make the above call with that path.
var SampleImagePath = require('./assets/samplepic.png');
axios.get(SampleImagePath).then(...)
First of all, you should remember the client-server architecture of any web app. If you are using create-react-app, you are serving your app via webpack-dev-server, so you should think about how you will host your files in production. The most common ways are:
apache2 / nginx
nodejs
but there are a lot of other ways, depending on your stack.
With webpack-dev-server, or with apache2 / nginx configured to allow direct file access, it is possible to make direct requests to files. For example, if your files are under the public path:
class MyImage extends React.Component {
  constructor(props) {
    super(props);
    this.state = {
      isExist: null
    };
  }

  componentDidMount() {
    fetch(MY_HOST + '/public/' + this.props.MY_IMAGE_NAME)
      .then(response => {
        // fetch resolves even for 404s, so check the status explicitly
        this.setState({ isExist: response.ok });
      })
      .catch(() => {
        // network-level failure
        this.setState({ isExist: false });
      });
  }

  render() {
    if (this.state.isExist === true) {
      return <img src={ MY_HOST + "/public/" + this.props.MY_IMAGE_NAME }/>
    }
    return <img src="/public/no-image.jpg"/>
  }
}