API caching for Next.js - reactjs

I'm building an app with Next.js... we have 100k+ pages and content changes daily, so we're using SSR and getServerSideProps.
Some of our data comes from a headless CMS provider that charges by the request. I'd like to cache the API responses from this provider for 24 hours.
What is the best way of going about this?
Is there a common library most folks use to do this?
Just looking for suggestions of approaches I should investigate (or great examples of how to do this).

I used this npm package:
https://www.npmjs.com/package/memory-cache
And then something like this:
import cacheData from "memory-cache";

async function fetchWithCache(url, options) {
  const value = cacheData.get(url);
  if (value) {
    return value;
  } else {
    const hours = 24;
    const res = await fetch(url, options);
    const data = await res.json();
    cacheData.put(url, data, hours * 1000 * 60 * 60);
    return data;
  }
}
Then, whenever you want to fetch something using the cache, just call this function; it can also be used as middleware in your requests. It checks whether the data is already in the cache and returns it; if not, it fetches the data and puts it into the cache under a key. The key can be anything - I am using the URL, for instance.
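For example, a minimal sketch of calling it from getServerSideProps (the CMS URL and prop name below are made up for illustration):

// pages/posts/[slug].js - hypothetical page using the cached fetch helper above
export async function getServerSideProps({ params }) {
  // Returns the cached JSON if present, otherwise fetches and caches it for 24h
  const post = await fetchWithCache(
    `https://cms.example.com/api/posts/${params.slug}`
  );
  return { props: { post } };
}

Keep in mind that memory-cache is an in-process cache, so on serverless deployments each function instance keeps its own copy and the hit rate will be lower than on a long-running server.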

In addition to Tobias Lins' answer:
At least if deploying on Vercel, you can set Cache-Control headers in getStaticProps, getServerSideProps, API routes, etc. to cache responses on Vercel's edge network. This solution requires no additional dependencies and very little code.
API route example - source: Vercel
// pages/api/user.js
export default function handler(req, res) {
  res.setHeader('Cache-Control', 's-maxage=86400');
  res.status(200).json({ name: 'John Doe' });
}
Example in getServerSideProps - Source NextJS
// This value is considered fresh for ten seconds (s-maxage=10).
// If a request is repeated within the next 10 seconds, the previously
// cached value will still be fresh. If the request is repeated before 59 seconds,
// the cached value will be stale but still render (stale-while-revalidate=59).
//
// In the background, a revalidation request will be made to populate the cache
// with a fresh value. If you refresh the page, you will see the new value.
export async function getServerSideProps({ req, res }) {
  res.setHeader(
    'Cache-Control',
    'public, s-maxage=10, stale-while-revalidate=59'
  )

  return {
    props: {},
  }
}
For a 24-hour cache I believe you'd want to use:
res.setHeader('Cache-Control', 's-maxage=86400')
(s-maxage is specified in seconds, and 24 hours is 86,400 seconds.)
Here are some other useful links for caching on Vercel:
https://vercel.com/docs/concepts/functions/edge-caching
https://vercel.com/docs/concepts/edge-network/overview
https://vercel.com/docs/concepts/edge-network/caching
https://vercel.com/docs/concepts/edge-network/headers
For your specific case, you also may want to look into using getStaticPaths with getStaticProps. You can use fallback: true on getStaticPaths to only build pages when they're visited (you can still build your most popular pages at initial build time); a sketch follows after the link below.
https://nextjs.org/docs/basic-features/data-fetching#the-fallback-key-required
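A rough sketch of that combination, assuming a hypothetical CMS endpoint and slug-based routes:

// pages/posts/[slug].js - build popular pages up front, the rest on first visit
export async function getStaticPaths() {
  const popular = await fetch('https://cms.example.com/api/popular-posts').then((r) => r.json());
  return {
    paths: popular.map((post) => ({ params: { slug: post.slug } })),
    // Pages not listed above are rendered on their first request and then cached
    fallback: true,
  };
}

export async function getStaticProps({ params }) {
  const post = await fetch(`https://cms.example.com/api/posts/${params.slug}`).then((r) => r.json());
  return { props: { post } };
}

With fallback: true, the page component also needs to handle router.isFallback while a page is being generated for the first time.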
I know this is an old post, but for others googling (at least those deploying on Vercel), these solutions should help where revalidate in getStaticProps does not.

You could use getStaticProps from Next.js for SSG.
It currently supports a revalidate property that you can return, which defines how often the content should be re-fetched.
Take a look here:
https://nextjs.org/blog/next-9-5#stable-incremental-static-regeneration
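A minimal sketch of what that looks like in practice (the endpoint is a placeholder; the 86400-second interval matches the 24-hour requirement from the question):

// pages/posts/[slug].js - Incremental Static Regeneration
export async function getStaticProps({ params }) {
  const res = await fetch(`https://cms.example.com/api/posts/${params.slug}`);
  const post = await res.json();

  return {
    props: { post },
    // The page is re-generated in the background at most once every 24 hours,
    // so the CMS is hit at most once per day per page.
    revalidate: 86400,
  };
}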

This is how we did it without any 3rd-party libraries, as in our use case we only had to cache a relatively small amount of global data (header/footer menus) which was shared across the site.
The data was coming from a CMS via GraphQL.
We ran an async method getGlobalData on each page from the getStaticProps method and then returned the cached data to the page component via props.
import fs from 'fs';
import path from 'path';

// Cache files are stored inside ./next folder
const CACHE_PATH = path.join(__dirname, 'globalData.json');

export default async function getGlobalData() {
  let cachedData;

  // #1 - Look for cached data first
  try {
    cachedData = JSON.parse(fs.readFileSync(CACHE_PATH, 'utf8'));
  } catch (error) {
    console.log('❌ CACHE NOT INITIALIZED');
  }

  // #2 - Create the cache file if it doesn't exist
  if (!cachedData) {
    // Call your to-be-cached APIs here
    const data = await fetchGlobalData();

    // Store data in the cache file
    // (this always overwrites the previous file)
    try {
      fs.writeFileSync(CACHE_PATH, JSON.stringify(data));
      console.log('💾 CACHE FILE WRITTEN SUCCESSFULLY');
    } catch (error) {
      console.log('❌ ERROR WRITING GLOBAL DATA CACHE TO FILE\n', error);
    }

    cachedData = data;
  }

  return cachedData;
}
Call the getGlobalData method from getStaticProps.

export async function getStaticProps({ preview = false }) {
  const globalData = await getGlobalData();

  // call other page-specific/non-shared APIs here
  // ...

  return { props: { globalData } };
}
References
https://flaviocopes.com/nextjs-cache-data-globally/
Note: if you get an error saying fs or path is unknown or invalid, understand that the above code is supposed to run (or be referenced) server-side only, i.e. inside getStaticProps or getServerSideProps. If you import and reference it browser-side, say somewhere inside your components or on the page (other than the methods mentioned above), you will get an error, as there are no fs or path modules in the browser; they are only available in Node.

Related

Using ApolloClient pagination API results in requests, even if all page content is already in cache

I am using the ApolloClient core pagination API approach to accumulate paginated requests in a merge function and then repaginate them with a read function: https://www.apollographql.com/docs/react/pagination/core-api
This all works, but now there is a request for each page, even the ones that are already in the cache.
Which defeats the whole purpose when I'm repaginating!
I'm using the default fetchPolicy, cache-first.
If all requested data is present in the cache, that data is returned. Otherwise, Apollo Client executes the query against your GraphQL server and returns that data after caching it.
I wonder how ApolloClient checks that all requested data is in the cache with the pagination implementation.
Because right now (and the docs seems to rely on this) it always does the request, even when the keyArgs match and the data is in the cache.
Does someone know what causes this and how I can customize this cache-first strategy to check if all the items of the requested page are already in the cache?
Here is my code, in case that helps for context or if I'm just doing something wrong:
typePolicies: {
  Query: {
    fields: {
      paginatedProductTracking: {
        // Include everything except 'skip' and 'take' to be able to use `fetchMore`
        // and repaginate when reading cache
        // (essential for switching between desktop pagination and mobile lazy loading
        // without having to refetch)
        keyArgs: (args) => JSON.stringify(omit(args, ['query.skip', 'query.take'])),
        merge: (existing, incoming, { args }) => {
          if (!existing) {
            return incoming;
          }
          if (!incoming) {
            return existing;
          }
          const data = existing.paginatedData;
          const newData = incoming.paginatedData;
          return {
            ...existing,
            // conservative merge that is robust against pages being requested out of order
            paginatedData: [
              ...data.slice(0, args?.query.skip || 0),
              ...newData,
              ...data.slice((args?.query.skip || 0) + newData.length),
            ],
          };
        },
      },
    },
  },
},
const [pageSize, setPageSize] = useState(100);
const [page, setPage] = useState(0);
const skip = page * pageSize;

const query = {
  filter,
  aggregationInterval,
  order,
  skip,
  take: pageSize,
  search: search ? values : null,
  locations: currentLocations.length > 0 ? currentLocations.map((location) => location.id) : undefined,
};

const { data, loading, fetchMore } = useProductTrackingAggregatedDataQuery({
  variables: {
    query,
  },
});
onPageChange={async (newPage) => {
  await fetchMore({
    variables: {
      query: {
        ...query,
        skip: newPage * pageSize,
      },
    },
  });
  setPage(newPage);
}}
I was recently faced with the exact same issue and had everything implemented the way the official documentation illustrates, until I stumbled upon this issue, which is still open, so I'm guessing this is still how the fetchMore function behaves to date. There, #benjamn says that:
The fetchMore method sends a separate request that always has a fetch policy of no-cache, which is why it doesn't try to read from the cache first.
This being the case, fetchMore is only useful if you are implementing an endless scroll sort of pagination where you know beforehand that the new data is not in the cache.
In the pagination documentation it also states that:
If you are not using React and useQuery, the ObservableQuery object returned by client.watchQuery has a method called setVariables that you can call to update the original variables.
If you change the variables of your query, it will trigger your read function implementation. If the read function finds the data within existing, it can return it; if it returns undefined instead, that triggers a network request to your GraphQL server for the missing data. The response then triggers your merge function to merge the data in the desired way, which triggers the read function again; this time it will be able to slice the data you requested according to your { args } out of existing and return it, which finally causes your watched ObservableQuery to fire and your UI to update.
Now, this approach is counterintuitive and goes against the "recommended" way of implementing pagination, but contrary to the recommended way, this approach actually works.
I was unable to find anything that would prove my conclusions about fetchMore to be wrong, so if any Apollo client guru happens to stumble upon this please do shed some light into this. Until then the only solution I can offer is working with setVariables instead of fetchMore.
Keep in mind that you will need to implement a read function along with your merge. It will be responsible for slicing your cached data and triggering a network request by returning undefined if it was unable to find a full slice.
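A rough sketch of what such a read function could look like for the paginatedProductTracking field above (it mirrors the question's shape and argument names; treat it as an illustration under those assumptions, not a drop-in implementation):

read: (existing, { args }) => {
  if (!existing) {
    // Nothing cached for these keyArgs yet - returning undefined forces a network request
    return undefined;
  }
  const skip = args?.query.skip || 0;
  const take = args?.query.take || 0;
  const page = existing.paginatedData.slice(skip, skip + take);
  // Only answer from the cache when the full requested page is present;
  // any hole triggers a fetch, whose result is then combined by the merge function.
  if (page.length < take || page.some((item) => item === undefined)) {
    return undefined;
  }
  return { ...existing, paginatedData: page };
},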

SvelteKit - /routes/a/+page.server.ts fetch('/b/') url confusion, version #sveltejs/kit#1.0.0-next.512

When I try to fetch('/b/') within the load function of /routes/a/+page.server.ts it refuses to accept local URL references.
Instead of being able to do
/b/
I have to use the full URL:
http://localhost:3000/b/
because the fetch() call refuses to accept the relative URL (error: "Failed to parse URL"). I'm trying to consume my own API to reuse code. I thought SvelteKit fetch was supposed to support these local routes for API calls?
The example in documentation: https://kit.svelte.dev/docs/routing
Shows +page.svelte calling url '/api/add/' from fetch(), but not from within +page.server.ts - is there some reason they would not allow the same convention?
Thank you.
SvelteKit developers got back to me and indicated that there are two choices of fetch(url) function.
// /src/routes/test/[param0]/[param1]/+page.server.ts
import type { PageServerLoad } from './$types';

export const load: PageServerLoad = async ({ params }) => {
  // ERROR: this will fail with URL parsing, because the global fetch
  // doesn't know how to resolve relative routes on the server
  let fetchResult = await fetch('/api/target/');
};

SvelteKit-aware fetch passed as a load function parameter:

export const load: PageServerLoad = async ({ params, fetch }) => {
  // NOTE: fetch was passed in as a parameter to this function
  let fetchResult = await fetch('/api/target/');
};
Confusing!
When I have an internal API route I want to hit within my sveltekit application, I structure it as so:
├src
|├routes
||├api
|||├example
||||├+server.js
Now, elsewhere in the app, you can hit the route like so using the fetch api:
let res = await fetch("/api/example")
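For completeness, a minimal sketch of what that endpoint and a consumer could look like (the response body here is just a placeholder):

// src/routes/api/example/+server.js
import { json } from '@sveltejs/kit';

export async function GET() {
  // Return whatever data the endpoint should expose
  return json({ message: 'hello from /api/example' });
}

// src/routes/some-page/+page.server.js
// Use the fetch provided to load() so relative URLs also resolve on the server
export async function load({ fetch }) {
  const res = await fetch('/api/example');
  return { example: await res.json() };
}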
refer to this section of the SvelteKit docs for a better understanding:
https://kit.svelte.dev/docs/routing

Failed to load resource: the server responded with a status of 500 () when deployed to Vercel - Next.js

I am using Next.js and I added getServerSideProps to my project. When I redeployed the project to Vercel I am getting the following error, but on localhost it is working perfectly.
export async function getServerSideProps() {
  // Fetch data from external API
  const res = await fetch(`https://ask-over.herokuapp.com/questapi`);
  const data = await res.json();
  // console.log(data);

  // Pass data to the page via props
  return { props: { data } };
}

module.exports = {
  reactStrictMode: true,
}
This is the result of an async call not completing before the next line of code that uses that variable runs. This can happen randomly, because that is the nature of async code. The fix is to replace the line where you have data.map(...) with data ? data.map(...) : [], which will return an empty array until data gets its value; then the map function will run and your app should be fine.
In JavaScript, pretty much any time you're using a variable that is the result of an awaited activity, you should have null checks in place. The code above checks whether data has a value; if it does, it runs data.map, otherwise it returns [].
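For instance, a rough sketch of what that guard could look like in the page component (the component and field names here are hypothetical):

// pages/index.js - component receiving `data` from getServerSideProps
export default function QuestionsPage({ data }) {
  // Guard against `data` being null/undefined before mapping over it
  const items = data ? data.map((q) => <li key={q.id}>{q.title}</li>) : [];
  return <ul>{items}</ul>;
}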

What's the best way to store a HTTP response in Ionic React?

I'm developing an app with Ionic React, which performs some HTTP requests to an API. The problem is I need to store the response of the request in a local storage so that it is accessible everywhere. The way I'm currently doing it uses #ionic/storage:
let body = {
  username: username,
  password: password
};

sendRequest('POST', '/login', "userValid", body);
let response = await get("userValid");

if (response.success) {
  window.location.href = "/main_tabs";
} else if (!response.success) {
  alert("Incorrect password");
}
import { set } from './storage';

// Handles all API requests
export function sendRequest(type: 'GET' | 'POST', route: string, storageKey: string, body?: any) {
  let request = new XMLHttpRequest();
  let payload = JSON.stringify(body);
  let url = `http://localhost:8001${route}`;

  request.open(type, url);
  request.send(payload);

  request.onreadystatechange = () => {
    if (request.readyState === 4 && storageKey) {
      set(storageKey, request.response);
    }
  }
}
The problem is that when I get the userValid key the response hasn't come back yet, so even awaiting will return undefined. Because of this I have to send another identical request each time in order for Ionic to read the correct value, which is actually the response from the first request. Is there a correct way of doing this other than just setting timeouts every time I perform a request?
You are checking the result in storage before it has been set. Your sendRequest method fires an asynchronous XMLHttpRequest, and you read from storage before that request has completed. This can be fixed by making sendRequest async (returning a promise that resolves once the response has been stored) and restructuring your code a bit.
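A rough sketch of that restructuring, wrapping the XMLHttpRequest in a Promise so the caller can await both the response and the storage write (the names mirror the question's code; treat this as an illustration, not a drop-in):

import { set } from './storage';

// Resolves with the parsed response only after it has been written to storage
export function sendRequest(type: 'GET' | 'POST', route: string, storageKey: string, body?: any): Promise<any> {
  return new Promise((resolve, reject) => {
    const request = new XMLHttpRequest();
    request.open(type, `http://localhost:8001${route}`);
    request.onreadystatechange = async () => {
      if (request.readyState === 4) {
        const data = JSON.parse(request.response);
        if (storageKey) {
          await set(storageKey, data); // persist first, then resolve
        }
        resolve(data);
      }
    };
    request.onerror = () => reject(new Error('Request failed'));
    request.send(JSON.stringify(body));
  });
}

The login flow can then await the call directly instead of re-reading storage:

const response = await sendRequest('POST', '/login', 'userValid', { username, password });
if (response.success) {
  window.location.href = '/main_tabs';
}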
I would suggest you instead look for examples of ionic react using hooks or an API library - like fetch or Axios. This will make your life much easier, and you should find lots of examples and documentation. Check out some references below to get started:
Example from the Ionic Blog using Hooks
Example using Fetch using React
Related Stack Overflow leveraging Axios

NextJS + Redux Saga + SSR

We have a project with a Next.js, Redux-Saga, and TypeScript setup.
SSR is very important for our web app, but we also have a lot of relevant widgets on every page.
This is why our page is structured in a modular way, where every widget (1-2 per page) loads the data it needs.
These widgets are relevant for SEO reasons now.
My problem is, that the API requests are not made on the serverside though. Right now it only returns the defaultState of every widget on the server and only loads them on the client.
I have searched and found a lot of instructions on how to do it, but they all rely on the fact that nextjs waits for the "getInitialProps" method until it returns the result from the server.
Since that lifecycle method is only available in the "pages" folder, that doesn't work for me.
Also if I block the "getInitialProps" component, the component is never really rendered.
Our pages are structured like below:
- pages/Home
- <HomeContainer ...> (fetches data for the main page)
- <HomeComponent >
- <Widget1Container ...> (fetches data)
- <Widget2Container ...> (fetches data)
What I want is for the serverside to wait for all the requests provided, before it returns the page to the user.
Because of the complex nature of different widgets on a page, it is not possible to create "combined" endpoint where we get the data in the "pages/Home" folder.
I know it's not ideal, but how could we make sure that the server actually makes all 3 requests (HomeContainer, Widget1Container, Widget2Container) and awaits their responses before returning?
I would like to have it like angular-universal does it. Just wait until there are not open requests or promises and then just render.
Any ideas?
Any help or ideas are deeply appreciated.
thanks
Since you are using redux-saga, in getServerSideProps send the start signal to the saga:
export const getServerSideProps = wrapper.getServerSideProps(
  (store) => async (context) => {
    store.dispatch(
      fetchWidgetsStart("add payload")
    );
    store.dispatch(END);
    await (store as SagaStore).sagaTask.toPromise();

    const state = store.getState();
    // I am making up the reducer name
    const widgetListState = state.widgetList ? state.widgetList : null;

    return { props: { widgetListState } };
  }
);
Inside the saga:

export function* fetchWidgetsStart() {
  yield takeLatest(
    WidgetListActionTypes.WIDGET_LIST_START,
    fetchWidgetsAsync
  );
}
function* fetchWidgetsAsync(action: IFetchWidgetsStart) {
  try {
    // Fire all widget requests in parallel and wait for all of them
    const res: AxiosResponse<IWidget[]>[] = yield Promise.all([
      axios.get(fetchWidget1),
      axios.get(fetchWidget2),
      axios.get(fetchWidget3),
    ]);
    yield put(fetchWidgetSuccess(res));
  } catch (e: any) {
    yield put(
      fetchWidgetFailure(
        e.response && e.response.data.detail
          ? e.response.data.detail
          : e.message
      )
    );
  }
}
The Promise.all() method takes an iterable of promises as an input, and returns a single Promise that resolves to an array of the results of the input promises. This returned promise will resolve when all of the input's promises have resolved, or if the input iterable contains no promises. It rejects immediately upon any of the input promises rejecting or non-promises throwing an error, and will reject with this first rejection message / error.
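As a tiny illustration of that behavior (hypothetical endpoints, plain fetch instead of the saga code above):

// The three requests run in parallel; the awaited value is an array of results
// in the same order as the input, and a single rejection rejects the whole thing.
const [widgets1, widgets2, widgets3] = await Promise.all([
  fetch('/api/widgets/1').then((r) => r.json()),
  fetch('/api/widgets/2').then((r) => r.json()),
  fetch('/api/widgets/3').then((r) => r.json()),
]);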
