For ApolloClient we can use either SchemaLink or HttpLink.
Is there any advantage to using one or the other?
// SchemaLink
function createLink() {
  const { SchemaLink } = require('@apollo/client/link/schema')
  const { schema } = require('../server/schema')
  return new SchemaLink({ schema })
}

// HttpLink
function createLink() {
  const { HttpLink } = require('@apollo/client/link/http')
  return new HttpLink({
    uri: '/api/graphql',
    credentials: 'same-origin'
  })
}

// ApolloClient
function createApolloClient() {
  return new ApolloClient({
    link: createLink(),
    cache: new InMemoryCache()
  })
}
The HttpLink forwards GraphQL queries through an HTTP POST request to the provided endpoint uri.
The SchemaLink, well, the documentation explains what it does:
The schema link provides a graphql execution environment, which allows you to perform GraphQL operations on a provided schema. This type of behavior is commonly used for server-side rendering (SSR) to avoid network calls and mocking data.
So while the HttpLink is used to reach a remote GraphQL server over the network (usually in a client application context), the SchemaLink is used when the server is available locally, i.e. when the schema can be imported from the same codebase, as in a server-side project like your example snippet.
Is there any advantage to using one or the other?
They're meant for different use cases. While nothing stops you from making HTTP calls from the server to an endpoint on the same server (e.g. localhost), using the SchemaLink to query a local schema might be better depending on the project's architecture (is the GraphQL server deployed to an external host where the schema wouldn't be available locally? etc.)
They could also both be used, e.g. HttpLink on the browser side and SchemaLink for server-side rendering of the same React application.
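As a rough sketch of that hybrid setup (assuming an isomorphic React app; the '../server/schema' module path and '/api/graphql' endpoint are illustrative), the link can be chosen based on the environment:

import { ApolloClient, InMemoryCache, HttpLink } from '@apollo/client'
import { SchemaLink } from '@apollo/client/link/schema'

function createIsomorphicLink() {
  if (typeof window === 'undefined') {
    // Server-side rendering: execute queries against the local schema, no network hop
    const { schema } = require('../server/schema')
    return new SchemaLink({ schema })
  }
  // Browser: send queries over the network to the GraphQL endpoint
  return new HttpLink({ uri: '/api/graphql', credentials: 'same-origin' })
}

function createApolloClient() {
  return new ApolloClient({
    ssrMode: typeof window === 'undefined',
    link: createIsomorphicLink(),
    cache: new InMemoryCache()
  })
}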
I was trying to upload a file to my Apollo server, but no file gets uploaded and no error is thrown; I only receive an empty object in the resolver. It works fine when I use a GraphQL client like Altair.
Output on the server when using the Altair client:
{
filename: 'Copy of Massive Orange Summer Sale Animated-300x250px-MediumRectangle.png',
mimetype: 'image/png',
encoding: '7bit',
createReadStream: [Function: createReadStream]
}
Output on the server when using the @apollo/client package:
{}
Client code:
Server code:
Uploads are not supported in Apollo Client by default. If you want to enable them, you need to use createUploadLink from apollo-upload-client as shown here.
import { ApolloClient, InMemoryCache } from '@apollo/client';
import { createUploadLink } from 'apollo-upload-client';

const client = new ApolloClient({
  cache: new InMemoryCache(),
  link: createUploadLink(),
});
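For completeness, a minimal sketch of how a file can then be passed as a mutation variable from the client; the singleUpload mutation name here is an assumption based on the usual graphql-upload examples, so match it to your server's schema:

import { gql, useMutation } from '@apollo/client';

// Assumed mutation; your server's schema may name it differently.
const UPLOAD_FILE = gql`
  mutation ($file: Upload!) {
    singleUpload(file: $file) {
      filename
    }
  }
`;

function UploadInput() {
  const [uploadFile] = useMutation(UPLOAD_FILE);
  return (
    <input
      type="file"
      onChange={({ target: { files } }) =>
        files[0] && uploadFile({ variables: { file: files[0] } })
      }
    />
  );
}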
I have a .ts (not .tsx) file which just exports a JSON object like
const obj = {
img1: gql_img1,
img2: gql_img2
}
I want gql_img1 and gql_img2 to be the results of a GraphQL query.
I found a solution which uses Apollo Client, but it doesn't look like they're using Gatsby and I don't think Gatsby uses a client.
The problem with using useStaticQuery is that it's a hook. If I try to use it like in the snippet below, I get "Error: Invalid hook call. Hooks can only be called inside of the body of a function component. This could happen for one of the following reasons:"
const gql = () => {
  const data = useStaticQuery(graphql`
    query adQuery {
      invoiceNinja300x250: file(
        extension: { regex: "/(jpg)|(jpeg)|(png)/" }
        name: { eq: "IN-300x250-2" }
      ) {
        childImageSharp {
          fluid(maxWidth: 250) {
            ...GatsbyImageSharpFluid_withWebp_noBase64
          }
        }
      }
      invoiceNinja600x300: file(
        extension: { regex: "/(jpg)|(jpeg)|(png)/" }
        name: { eq: "IN-600x300-2" }
      ) {
        childImageSharp {
          fluid(maxWidth: 250) {
            ...GatsbyImageSharpFluid_withWebp_noBase64
          }
        }
      }
    }
  `)
  return data
}

const GQL = gql()
Like I mentioned in your Reddit post, if you're not using a page query or a static query, you'll need Apollo Client or some other GraphQL client.
I found a solution which uses Apollo Client, but it doesn't look like they're using Gatsby and I don't think Gatsby uses a client.
Gatsby and GraphQL clients are different things. Gatsby is a React framework for building static websites and uses GraphQL to fetch data in various ways.
A GraphQL client is much like fetch or axios: just as those are used to request, post, update, and delete data from a REST API, a GraphQL client does the same against a GraphQL endpoint.
Can you explain your use case a bit? Maybe there is a more Gatsby way of doing it.
Have you considered the React Context API? On the production Gatsby app I work on, that's what we use for global variables like some JSON/object data. It lets a high-level layout/data-layer component take values from a different file and make them available to other components across your app.
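A minimal sketch of combining the two ideas, using an abbreviated version of the query from the question; useStaticQuery is legal here because it runs inside a component body, and the results are then shared through context:

import React, { createContext, useContext } from 'react'
import { useStaticQuery, graphql } from 'gatsby'

const AdImagesContext = createContext(null)

// Provider component: hooks are allowed here because this is a component body
export function AdImagesProvider({ children }) {
  const data = useStaticQuery(graphql`
    query adQuery {
      invoiceNinja300x250: file(name: { eq: "IN-300x250-2" }) {
        childImageSharp {
          fluid(maxWidth: 250) {
            ...GatsbyImageSharpFluid_withWebp_noBase64
          }
        }
      }
    }
  `)
  return (
    <AdImagesContext.Provider value={data}>
      {children}
    </AdImagesContext.Provider>
  )
}

// Any component rendered under the provider can read the query results
export const useAdImages = () => useContext(AdImagesContext)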
Goal: Call a function that invokes a fetch call to validate that it works with my backend REST API (end-to-end testing, basically).
Project: A node module built to be imported into several React web applications. The module contains only fetch calls and minimal logic; it's basically a glorified wrapper for URLs and settings, created to cut down the work required to implement common endpoints used in applications.
Setup: I have a docker-compose file building a docker test container and pulling in my REST API docker image (built in a different system). The test container pulls in the packed module and installs it with dependencies, then brings up the tests alongside the backend + other images needed for the backend (DB, login system, etc.).
Problem: How to implement the tests to handle the calls.
Currently I've tried calling the fetch methods directly. This works for my login fetch, but any additional call fails to send the cookie. As far as I understand, the code I have depends on the browser for the cookie. I've tried several solutions to get said cookie, but I've been unable to get fetch or node-fetch to send it properly. My best guess is that each test was creating a new cookie, but I lack the knowledge to fully debug this solution path.
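For reference, one way to work around the missing browser cookie jar with node-fetch is to capture the Set-Cookie header from the login response and replay it manually on later calls; a rough sketch, with the login endpoint and credential shape assumed:

const fetch = require('node-fetch');

// Log in once, capture the session cookie, and reuse it for later calls.
// The '/api/login' endpoint and payload shape here are illustrative.
async function loginAndGetCookie(baseUrl) {
  const response = await fetch(`${baseUrl}/api/login`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ user: 'user', password: 'password' })
  });
  // node-fetch exposes raw headers, including repeated Set-Cookie values
  const cookies = response.headers.raw()['set-cookie'] || [];
  // Keep only the name=value pairs, dropping attributes like Path and HttpOnly
  return cookies.map(c => c.split(';')[0]).join('; ');
}

async function authedRequest(baseUrl, path, cookie) {
  return fetch(`${baseUrl}${path}`, {
    headers: { 'Content-Type': 'application/json', Cookie: cookie }
  });
}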
My second solution path was to attempt to use puppeteer to load a fake page and then evaluate the function in the page, following examples like:
https://github.com/puppeteer/puppeteer/issues/2579
How to use imported function inside page.evaluate in Puppeteer with Jest?
The problem with this was that the tests kept failing to load the required libraries, or hit some other problem.
Code:
Here is the call I'm trying to test, for the most part. Each function I have wraps around this, providing {url: "api/endpoint/path", method: "GET"}, with some passing in a body for larger data posts.
export function request(options) {
  // Build options, merging in default headers
  options = {
    headers: {
      'Content-Type': 'application/json'
    },
    ...options
  };
  return fetch(options.url, options)
    .then(response => {
      //...
      // Handle errors
      if (!response.ok) {
        return Promise.reject(`${response.status} - ${response.statusText}`);
      }
      try {
        return response.json();
      } catch (e) {
        if (e.name === 'SyntaxError') {
          return Promise.reject(response.text());
        }
      }
    });
}
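For context, the per-endpoint wrappers described above might look like the following; the endpoint paths and function names are made up for illustration:

// Hypothetical wrapper mirroring the {url, method} pattern described above
export function getUserProfile(userId) {
  return request({
    url: `/api/users/${userId}`,
    method: 'GET'
  });
}

// A POST variant passing a body for larger data posts
export function createUser(userData) {
  return request({
    url: '/api/users',
    method: 'POST',
    body: JSON.stringify(userData)
  });
}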
Test example I've tried:
import puppeteer from "puppeteer";
import { myCall } from "my-rest-bridge";

it('Check response', async () => {
  // Load browser
  const browser = await puppeteer.launch({
    headless: true,
    args: ['--no-sandbox']
  });
  const page = await browser.newPage();
  // Load page
  await page.goto('http://docker:8888/index.html');
  // Do login
  expect(await page.evaluate(() => login('user', 'password'))).toBe(expectedUserResponseJson);
  // Make call
  expect(await page.evaluate(() => myCall(input1, input2))).toBe(expectedCallResponseJson);
  // Close page and browser
  await page.close();
  await browser.close();
})
It took me a while, but I built a solution to my own question. It's not perfect, so if anyone has a better idea, please answer.
My solution works as follows. I built an additional git project to create a shell React application inside a docker image. This application pulls in my node module, iterates through all of its exports, and then generates a component per function.
import React from 'react';
import * as magicNodeModule from "magic-node-module"; // obviously not the real name
import CallRest from "./CallRest";

/** Core component for the application */
export default class App extends React.Component {
  /** Renders the core application */
  render() {
    const restCalls = Object.keys(magicNodeModule);
    return (
      <div id={"App"}>
        <div>
          Functions found:
          <ul id={"function-list"}>
            {restCalls.map(restCall => <li key={restCall}>{restCall}</li>)}
          </ul>
          <hr/>
        </div>
        {
          restCalls.map(restCall => {
            return (
              <CallRest
                key={restCall}
                restName={restCall}
                restCall={magicNodeModule[restCall]}
              />
            );
          })
        }
      </div>
    );
  }
}
This component (CallRest) contains an input box, a submit button, and an output div. A user, or in my use case Puppeteer, can enter data into the input; clicking submit then runs the fetch call and inserts the resulting output into the div. It works very much like Swagger 2, for those familiar with that system.
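For illustration, a minimal sketch of what such a CallRest component might look like; the exact markup and state handling in the real project may differ:

import React from 'react';

/** Renders an input, a submit button, and a div showing the call's result */
export default class CallRest extends React.Component {
  constructor(props) {
    super(props);
    this.state = { input: '', output: '' };
    this.runCall = this.runCall.bind(this);
  }

  /** Runs the wrapped fetch call with the (optional) JSON input */
  runCall() {
    const args = this.state.input ? JSON.parse(this.state.input) : undefined;
    this.props.restCall(args)
      .then(result => this.setState({ output: JSON.stringify(result) }))
      .catch(error => this.setState({ output: `ERROR: ${error}` }));
  }

  render() {
    const { restName } = this.props;
    return (
      <div>
        <h3>{restName}</h3>
        <input
          id={`${restName}-input`}
          value={this.state.input}
          onChange={e => this.setState({ input: e.target.value })}
        />
        <button id={`${restName}-submit`} onClick={this.runCall}>Submit</button>
        <div id={`${restName}-output`}>{this.state.output}</div>
      </div>
    );
  }
}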
The solution is still built up as a series of docker images inside a compose file, though it did require setting up a reverse proxy to allow the React app to communicate with the backend API. It also pulls in a fresh copy of the node module as a packed zip and installs it into the docker image, so I only have to rebuild the shell React image once in a blue moon.
Where to add custom configuration for external libraries (npm modules) in React, to be used across the entire app?
My problem statement is:
I am doing API calls using axios. I don't want to include the authentication header in each call separately, so I am thinking of creating a file that imports axios and does something like this:
const customAxios = axios;
customAxios.defaults.headers.common['Authorization'] = store.getState().session.token;
export default customAxios;
Now I will import customAxios in any file where I need to make an API call.
Is this the correct way? If it's not, how should I handle this situation?
BTW, I am new to React.
You can create an API class whose constructor creates an axios instance and assigns the header information, then use that instance in all your calls.
import axios from 'axios';

export default class Api {
  constructor(auth_token) {
    this.customAxios = axios.create({
      baseURL: 'https://myserverurl.com'
    });
    this.customAxios.defaults.headers.common['Authorization'] = auth_token;
  }

  getInfo() {
    return this.customAxios.get('/info'); // ... do further processing and return data
  }
}
And when you want to call the API, you can create an instance and use it for your calls.
let apiInstance = new Api(store.getState().session.token);
apiInstance.getInfo();
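One caveat: the token is captured once at construction time. If the session token can change, a request interceptor that reads the latest token on every call is a common alternative; a sketch, assuming the same Redux-style store (the './store' path is illustrative):

import axios from 'axios';
import { store } from './store'; // assumed location of your store

const customAxios = axios.create({ baseURL: 'https://myserverurl.com' });

// Read the token at request time so a refreshed token is picked up automatically
customAxios.interceptors.request.use(config => {
  config.headers['Authorization'] = store.getState().session.token;
  return config;
});

export default customAxios;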
I have a running GraphQL-based server, and I am able to test it properly using GraphiQL, but I am unable to understand how to implement Relay.
Since GraphQL lives on the server side, how is its schema transferred to Relay over the network layer at the client end? Or how does Relay point to the same GraphQL schema?
It takes a little extra work. Basically, you have to dump a serialized version of the schema to a file on the server side, then move that file over to the client and include it during Babel compilation. Facebook has a Babel plugin that takes this file and builds it into the bundle so that Relay knows about the schema.
EDIT: here is a snippet for how to dump the schema file to JSON
import { graphql } from 'graphql'
import { introspectionQuery, printSchema } from 'graphql/utilities'

/*
 * Generates JSON of our schema for use by Relay
 */
export default function (schema) {
  return new Promise((resolve, reject) => {
    graphql(schema, introspectionQuery).then(result => {
      if (result.errors) {
        console.error(`ERROR introspecting schema: ${result.errors}`)
        reject(new Error(result.errors))
      } else {
        resolve({ json: result, graphql: printSchema(schema) })
      }
    })
  })
}
After you obtain that, you have to npm install babel-relay-plugin and include that in your babel plugins (probably in the webpack config, if you're using that).
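A sketch of what that wiring might look like in a webpack config of that era; the ./schema.json path is wherever you wrote the introspection result from the snippet above, and the presets are illustrative:

// Rough sketch, assuming webpack with babel-loader; paths and presets are illustrative.
var getBabelRelayPlugin = require('babel-relay-plugin');
var schemaData = require('./schema.json').data; // the introspection result's data field

module.exports = {
  module: {
    loaders: [
      {
        test: /\.js$/,
        loader: 'babel-loader',
        exclude: /node_modules/,
        query: {
          presets: ['react', 'es2015'],
          plugins: [getBabelRelayPlugin(schemaData)]
        }
      }
    ]
  }
};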