I am storing 40,000+ images in Cache Storage using cache.put(). I can see that all the images are stored successfully. But when I use my React website offline, some images display and some do not; the browser seems to decide by itself whether to show an image. I am unable to find the reason. Can anyone help me?
I found a solution. We just need a fetch event listener in the service worker. For every GET request, it will check the cache first and return the response from there, falling back to the network otherwise.
self.addEventListener('fetch', event => {
  // Let the browser do its default thing
  // for non-GET requests.
  if (event.request.method !== 'GET') return;

  // Prevent the default, and handle the request ourselves.
  event.respondWith(async function() {
    // Try to get the response from a cache.
    const cache = await caches.open('images');
    const cachedResponse = await cache.match(event.request);

    if (cachedResponse) {
      // If we found a match in the cache, return it, but also
      // update the entry in the cache in the background.
      event.waitUntil(cache.add(event.request));
      return cachedResponse;
    }

    // If we didn't find a match in the cache, use the network.
    return fetch(event.request);
  }());
});
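For completeness, the worker also has to be registered from the page. Here is a minimal registration sketch, assuming the script above is served as /sw.js at the site root:
// Register the service worker once the page has loaded.
if ('serviceWorker' in navigator) {
  window.addEventListener('load', () => {
    navigator.serviceWorker
      .register('/sw.js')
      .then(reg => console.log('Service worker registered, scope:', reg.scope))
      .catch(err => console.error('Service worker registration failed:', err));
  });
}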
I have a page in Next.js. I want to call the API in getServerSideProps only when the page is loaded the first time. Example: you visit the xyz page, and I want to call the API on that first visit. After that, if you switch to any other page and come back to xyz, I want to check in my Redux store whether I already have data for this page; if so, I don't want to call the API again.
Server-side rendering is great for fetching data that changes on every render. Here it seems to me like your data doesn't change that often. The best solution, in my opinion, would be to use incremental static regeneration: generate a static page and revalidate it every minute, for example.
Read more at: https://nextjs.org/docs/basic-features/data-fetching/incremental-static-regeneration
// This function gets called at build time on the server side.
// It may be called again, on a serverless function, if
// revalidation is enabled and a new request comes in.
export async function getStaticProps() {
  const response = await apiCall('route', 'get', null)
  return {
    props: {
      data: response.data ?? null
    },
    // Next.js will attempt to re-generate the page:
    // - When a request comes in
    // - At most once every 10 seconds
    revalidate: 10, // In seconds
  }
}
You may also want to set caching headers. Example below. Read more at: https://nextjs.org/docs/going-to-production#caching
// This value is considered fresh for ten seconds (s-maxage=10).
// If a request is repeated within the next 10 seconds, the previously
// cached value will still be fresh. If the request is repeated before 59 seconds,
// the cached value will be stale but still render (stale-while-revalidate=59).
//
// In the background, a revalidation request will be made to populate the cache
// with a fresh value. If you refresh the page, you will see the new value.
export async function getServerSideProps({ req, res }) {
  const response = await apiCall('route', 'get', null)
  res.setHeader(
    'Cache-Control',
    'public, s-maxage=10, stale-while-revalidate=59'
  )
  return {
    props: {
      data: response.data ?? null
    }
  }
}
Since you are using Redux, you can make use of a library called next-redux-wrapper. The way to accomplish your goal: once you fetch your data, store it in the server instance of Redux, and from then on check, every time before the request is made, whether that particular data already exists in the store. You can learn more about this in the README and this answer.
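To illustrate, here is a minimal sketch of wiring next-redux-wrapper with a HYDRATE case and an in-store check before fetching (assuming the v7-style API; the reducer shape, action type, and endpoint are illustrative, not from the question):
import { createStore } from 'redux';
import { createWrapper, HYDRATE } from 'next-redux-wrapper';

// Merge the server-generated state into the client store on hydration.
const reducer = (state = { pageData: null }, action) => {
  switch (action.type) {
    case HYDRATE:
      return { ...state, ...action.payload };
    case 'SET_PAGE_DATA':
      return { ...state, pageData: action.payload };
    default:
      return state;
  }
};

const makeStore = () => createStore(reducer);
export const wrapper = createWrapper(makeStore);

// In the xyz page: fetch only if the store doesn't already hold the data.
export const getServerSideProps = wrapper.getServerSideProps(
  (store) => async () => {
    if (!store.getState().pageData) {
      const res = await fetch('https://example.com/api/xyz'); // hypothetical endpoint
      store.dispatch({ type: 'SET_PAGE_DATA', payload: await res.json() });
    }
    return { props: {} };
  }
);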
You can set shallow: true, which will skip the getServerSideProps() call after the initial one. See shallow-routing.
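For reference, shallow routing is opted into per navigation. A small sketch (the URL and component are illustrative):
import { useRouter } from 'next/router';

function Counter() {
  const router = useRouter();
  // Updates the URL without re-running data-fetching methods.
  // Note: shallow routing only applies to URL changes on the same page.
  const increment = () =>
    router.push('/xyz?counter=10', undefined, { shallow: true });
  return <button onClick={increment}>Update URL shallowly</button>;
}

export default Counter;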
I’m using Next.js to build static HTML webpages.
One of my webpages needs data from a third-party API, which I’d like to fetch at build time and bake into the resulting HTML.
I don’t want this call to ever happen on the client, because:
CORS prevents the request from succeeding anyway
I would have to expose an API key on the client (no thank you)
I thought getInitialProps was the answer, because the fetched data is indeed baked in during the build/export process, but when I navigate away from the page and return to it, getInitialProps gets triggered on the client, breaking everything.
My current code in getInitialProps is something like:
static async getInitialProps() {
  // Get Behance posts
  const behanceEndpoint = `https://www.behance.net/v2/users/${process.env.BEHANCE_USERNAME}/projects?api_key=${process.env.BEHANCE_API_KEY}`
  const behanceRes = await fetch(behanceEndpoint)
  let behancePosts = await behanceRes.json()
  // Return only the required number of posts
  return {
    behancePosts: behancePosts
  }
}
Any advice or best practice on how to handle this? I know Gatsby.js does it out of the box.
One possibility, if you just want to execute this once on the server, would be to check whether the req parameter is present in getInitialProps (documentation):
req - HTTP request object (server only).
One dirty approach:
static async getInitialProps({ req }) {
  if (req) {
    // Only executed on the server
    // Get Behance posts
    const behanceEndpoint = `https://www.behance.net/v2/users/${process.env.BEHANCE_USERNAME}/projects?api_key=${process.env.BEHANCE_API_KEY}`
    const behanceRes = await fetch(behanceEndpoint)
    let behancePosts = await behanceRes.json()
    // Return only the required number of posts
    return {
      behancePosts: behancePosts
    }
  } else {
    // Client context
  }
}
Hope this helps a little bit.
I am making a PWA where users can answer forms. I want it to work offline as well, so that when a user fills out a form without an internet connection, the reply is uploaded once they are back online. For this, I want to catch the requests and send them when online. I wanted to base it on the following tutorial:
https://serviceworke.rs/request-deferrer_service-worker_doc.html
I have managed to implement localStorage and the service worker, but it seems the POST requests are not caught correctly.
Here is the core function:
function tryOrFallback(fakeResponse) {
  // Return a handler that...
  return function(req, res) {
    // If offline, enqueue and answer with the fake response.
    if (!navigator.onLine) {
      console.log('No network availability, enqueuing');
      return;
      // return enqueue(req).then(function() {
      //   // As the fake response will be reused but Response objects
      //   // are one use only, we need to clone it each time we use it.
      //   return fakeResponse.clone();
      // });
    }
    console.log("LET'S FLUSH");
    // If online, flush the queue and answer from network.
    console.log('Network available! Flushing queue.');
    return flushQueue().then(function() {
      return fetch(req);
    });
  };
}
I use it with:
worker.post("mypath/add", tryOrFallback(new Response(null, {
  status: 212,
  body: JSON.stringify({
    message: "HELLO"
  }),
})));
The path is correct. It detects when the actual POST event happens. However, I can't access the actual request (the req passed to the handler is basically empty), and the response, when displayed, has the custom status but does not contain the message (the body is empty). So somehow I can detect when the POST is happening, but I can't get the actual message.
How to fix it?
Thank you in advance,
Grzegorz
Regarding your sample code, the way you're constructing your new Response is incorrect: the body is the first argument to the constructor, and you're supplying null for it. If you change it to the following, you're more likely to see what you're expecting:
new Response(JSON.stringify({message: "HELLO"}), {
  status: 212,
});
But, for the use case you describe, I think the best solution would be to use the Background Sync API inside of your service worker. It will automatically take care of retrying your failed POST periodically.
Background Sync is currently only available in Chrome, so if you're concerned about that, or if you would prefer not to write all the code for it by hand, you could use the background sync library provided as part of the Workbox project. It will automatically fall back to explicit retries whenever the real Background Sync API isn't available.
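As a rough sketch of the Workbox approach (the queue name, retention time, and route match below are assumptions, not taken from your code):
import { BackgroundSyncPlugin } from 'workbox-background-sync';
import { registerRoute } from 'workbox-routing';
import { NetworkOnly } from 'workbox-strategies';

// Failed POSTs are queued in IndexedDB and replayed when
// connectivity returns (retained for up to 24 hours here).
const bgSyncPlugin = new BackgroundSyncPlugin('form-replies', {
  maxRetentionTime: 24 * 60, // minutes
});

registerRoute(
  ({ url }) => url.pathname.endsWith('/add'), // hypothetical route match
  new NetworkOnly({ plugins: [bgSyncPlugin] }),
  'POST'
);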
Quick question about using the Baqend SDK in React: I'm saving profile images using the id of an object saved in the database as the file name.
But in order to get the image to update in the user's browser after it is uploaded, I'm modifying state and adding &updated=true to the end of the file.url returned by Baqend.
The save image code:
uploadLogo(event) {
  event.preventDefault();
  const name = this.props.match.params.id + "logo";
  const file = event.target.files[0];
  const img = new db.File({ name: name, data: file, type: 'blob' });
  img.upload({ force: true }).then((file) => {
    db.Companies.load(this.props.match.params.id).then(company => {
      this.setState({
        logo: file.url + "?updated=true"
      });
      company.logo = file.url;
      return company.update();
    },
    (error) => {
      alert(error);
    });
  });
}
Is this the correct approach with React and the Baqend SDK? Are there going to be any side effects if I'm loading a bunch of images by URL that look like this: https://remarkable-apple-95.app.baqend.com/v1/file/www/cce9830b-48eb-422e-830d-72ae28571480logo?BCB&updated=true
I would imagine URL parameters like this are just ignored? The only person who is going to load the image with ?updated=true appended is the one who updated the logo, and only immediately after updating it.
Also, what is the ?BCB that is added to file.url doing?
Your example looks good so far.
But you should not add additional query parameters at all, as they cause cache misses in the CDN.
The BCB (Baqend Cache Buster) actually does what you are trying to achieve with the ?updated=true parameter. The SDK adds those cache busters automatically if an image was changed previously.
The BCB ensures that the fresh image is fetched from the server and is only cached with revalidation headers until the old image has expired in the browser cache. Our CDN caches are aware of this special cache buster and rewrite the image request back to the original URL to ensure cache hits in the CDN.
Note that our CDN caches are instantly invalidated if the content is changed.
This staleness information is propagated to other clients as well via a Bloom filter. That ensures that other clients won't take the image out of their local cache and therefore see the new image too.
Okay, I'm kind of new to React and I'm having a major issue. I can't really find any solution out there.
I've built an app that renders a list of objects. The list comes from my mock API for now. The objects are stored inside a store, and the store action that fetches them is triggered by the components.
My issue is with showing these objects. When a user clicks show, it renders a page with details on the object. Store-wise this means firing a getSpecific function that retrieves the object from the store based on an ID.
This is all fine while the store still has the objects. But when I reload the page, the store gets wiped and a new instance is created (this is my guess). The store is now empty, and getting that specific object is now impossible (in my current implementation).
So, I read somewhere that this is by design. Is the solution to:
Save the store in local storage, to keep the data?
Make the API call again and get all the objects once again?
And in case 2, when/where is this supposed to happen?
How should a store make sure it always has the expected data?
Any hints?
Some of the implementation:
//List.js
componentDidMount() {
  //The fetchOffers action will trigger a change event
  //which will trigger the listener in componentWillMount
  OfferActions.fetchOffers();
}
componentWillMount() {
  //Listen for changes in the store
  offerStore.addChangeListener(this.retrieveOffers);
}
retrieveOffers() {
  this.setState({
    offers: offerStore.getAll()
  });
}
.
//OfferActions.js
fetchOffers() {
  let url = 'http://localhost:3001/offers';
  axios.get(url).then(function (data) {
    dispatch({
      actionType: OfferConstants.RECIVE_OFFERS,
      payload: data.data
    });
  });
}
.
//OfferStore.js
var _offers = [];

receiveOffers(payload) {
  _offers = payload || [];
  this.emitChange();
}

handleActions(action) {
  switch (action.actionType) {
    case OfferConstants.RECIVE_OFFERS:
      {
        this.receiveOffers(action.payload);
      }
  }
}

getAll() {
  return _offers;
}

getOffer(requested_id) {
  var result = this.getAll().filter(function (offer) {
    return offer.id == requested_id;
  });
  // Return the first matching offer
  return result[0];
}
.
//Show.js
componentWillMount() {
  this.state = {
    offer: offerStore.getOffer(this.props.params.id)
  };
}
That is correct: Redux stores, like any other JavaScript objects, do not survive a refresh. During a refresh you are resetting the memory of the browser window.
Both of your approaches would work; however, I would suggest the following:
Save to local storage only information that is semi-persistent, such as an authentication token, the user's first/last name, UI settings, etc.
During app start (or component load), load any auxiliary information such as sales figures, message feeds, and offers. This information generally changes quickly and it makes little sense to cache it in local storage.
For 1. you can utilize the redux-persist middleware. It lets you save to and retrieve from your browser's local storage during app start. (This is just one of many ways to accomplish this; see the sketch after these two points.)
For 2. your approach makes sense. Load the required data on componentWillMount asynchronously.
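A minimal redux-persist sketch for point 1 (the auth slice and the placeholder reducer here are illustrative):
import { createStore } from 'redux';
import { persistStore, persistReducer } from 'redux-persist';
import storage from 'redux-persist/lib/storage'; // defaults to localStorage on the web

const persistConfig = {
  key: 'root',
  storage,
  whitelist: ['auth'], // persist only the semi-persistent slice, not offers
};

// Placeholder root reducer; yours would combine auth, offers, etc.
const rootReducer = (state = { auth: {}, offers: [] }, action) => state;

export const store = createStore(persistReducer(persistConfig, rootReducer));
export const persistor = persistStore(store); // wrap the app in <PersistGate> with this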
Furthermore, regarding being "up-to-date" with data: this entirely depends on your application needs. A few ideas to help you get started exploring your problem domain:
With each request to get offers, also send or save a timestamp. Have the application decide when a timestamp is "too old" and request again (see the sketch after this list).
Implement real-time communication, for example with socket.io, which pushes the data to the client instead of the client requesting it.
Request the data at an interval suitable to your application. You could pass along the last time you requested the information, and the server could decide whether there is new data available or return an empty response, in which case you display the existing data.
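A sketch of the timestamp idea from the first point, adapted to the store above (the five-minute threshold is an arbitrary assumption, and axios is assumed to be available as in OfferActions.js):
const MAX_AGE_MS = 5 * 60 * 1000; // decide what "too old" means for your app

var _offers = [];
var _fetchedAt = 0;

function fetchOffersIfStale() {
  if (Date.now() - _fetchedAt <= MAX_AGE_MS) {
    return Promise.resolve(_offers); // cached copy is still fresh
  }
  return axios.get('http://localhost:3001/offers').then(function (data) {
    _offers = data.data;
    _fetchedAt = Date.now();
    return _offers;
  });
}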