I have a Gatsby blog, and after I create a new post, build the static files, and upload them to my hosting, every user has to do a hard refresh on my blog to see the changes.
How can I make the site refresh automatically on the next visit after uploading a new build?
In my case, another reason for this behavior was the service worker installed by gatsby-plugin-offline.
Service workers are programmed to update while the user navigates. The problem is that when a user visits the home page and does not navigate any further, no update will become visible. You essentially never see an update on a one-page website because there is nowhere to navigate to!
If you want the page to refresh automatically and invalidate the old cache, you need to trigger it yourself. If you have gatsby-plugin-offline in your gatsby-config.js, add this line to your gatsby-browser.js:
// trigger an immediate page refresh when an update is found
export const onServiceWorkerUpdateReady = () => window.location.reload();
Here is some background information about this issue from the official GitHub repository.
As @coreyway pointed out, doing this automatically can be problematic. I'd argue that this behavior is still better than being stuck with an outdated version of the website. If you are concerned about the UX, the linked GitHub issue discusses a solution that lets the user trigger the update by clicking an update notice.
You likely have HTTP Cache-Control headers on your .html files that are telling the browser they're safe to cache. You want to remove those Cache-Control headers, or at least configure your HTTP caching to require revalidation (must-revalidate). You'll want to do the same for your page-data.json files if you're using Gatsby v2.9.0+.
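For illustration, here's a minimal sketch of such headers, assuming an nginx server (adapt the syntax to whatever host you use):

# Force the browser to revalidate HTML and page-data on every request.
location ~* \.html$ {
  add_header Cache-Control "public, max-age=0, must-revalidate";
}
location /page-data/ {
  add_header Cache-Control "public, max-age=0, must-revalidate";
}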
I'm a little surprised I haven't found anything out there about this. But just like the title says: I have a React SPA deployed to Netlify. It goes live without error. The only issue is that if the end user has been to the site before, they have to refresh the page to see any changes I have pushed out.
Is there something I need to add to the index file perhaps?
The browser caches the compiled js bundle.
You can read more here: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control
One of your options would be to disable caching, or to set the cache expiration to a lower value during intense development and increase it if/when you deploy less often.
Another option would be to implement some kind of API method to check whether a newer version has been deployed and trigger a page refresh. (Please be careful not to discard the user's work, like data filled into forms, during a refresh.)
Each approach has its pros and cons.
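For illustration, here's a minimal sketch of the second option. The /version.json endpoint and the APP_VERSION constant are assumptions (e.g. a file you rewrite and a value you inject on every deploy):

// Poll a version endpoint and reload when a new deploy is detected.
const APP_VERSION = '1.0.0'; // assumed to be injected at build time

async function checkForUpdate() {
  const res = await fetch('/version.json', { cache: 'no-store' });
  const { version } = await res.json();
  if (version !== APP_VERSION) {
    // Consider warning the user first so unsaved form data isn't lost.
    window.location.reload();
  }
}

setInterval(checkForUpdate, 60 * 1000); // check once a minute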
I am looking for a pattern that would allow me to improve the UX for my users. I have a REST server running behind CloudFront, consumed by a plain React application on the frontend.
I'll simplify my example to illustrate my issue.
I have an endpoint called GET /posts/<id>. When the browser asks for it, the response comes with max-age=180, which means it gets stored in the browser's cache, and any subsequent call to GET /posts/<id> will be served from the browser's cache for the duration of those 180 seconds, after which the browser hits the CDN again to try to obtain a fresh copy.
That is okay for most users. I don't mind if updates to any post take up to 3 minutes to propagate to all users. But there is one user who is the author of the post. That user can make changes to the post using PATCH /posts/<id>. Let's call that user the Editor.
Here's a scenario I have right now:
The Editor loads up the post page which then calls GET /posts/5
The CDN serves the latest copy to the front end.
The Editor then makes a change to the post and submits it to the back end via PATCH /posts/5.
The Editor then refreshes his browser tab using Command-R (or Ctrl-R).
As a result, the front end requests GET /posts/5 again -- but gets the stale copy from before the changes, because 180 seconds haven't passed between the last GET and the GET issued after the PATCH.
What I'd like the experience to be is:
The Editor loads up the post page which then calls GET /posts/5
The CDN serves the latest copy to the front end.
The Editor then makes a change to the post and submits it to the back end via PATCH /posts/5.
After a Command-R browser tab refresh, GET /posts/5 brings back a copy of the data with the Editor's PATCH changes right away, regardless of the 180-second TTL before a fresh copy can be obtained.
As for the rest of the users, it's perfectly okay for them to wait up to 180 seconds before the change in the post propagates to them when they GET /posts/5.
I am using Axios, but I do know that SWR and React Query support mutations. To my understanding, this would allow the Editor to declare a mutation for the object he just PATCH'ed on the server, so that any subsequent calls he makes to GET /posts/5 will be served from that local copy until a fresher version can be obtained from the backend.
My questions are:
Can SWR with "mutations" serve the mutated object via the GET /posts/5 transparently?
Will the mutation survive a hard browser tab refresh? Or a browser closure, re-opening, and a subsequent GET /posts/5?
Is there another pattern/best practice to solve that?
TL;DR: Just append a harmless, gibberish query string to the request: GET /posts/<id>?version=whatever
Good question. I must admit I don't know the full answer to this problem, but I want to share one well-known technique among frontend devs.
The technique is called cache busting. I'm not sure if it's a best practice, but I'm pretty sure it's widely practiced, since it's so straightforward to understand.
The idea is simple: when you append a changing query string, you effectively change the URL, so no cache is hit, and you evade the whole cache problem.
The detailed steps for your particular use case would go like this:
Normally you'll just request GET /posts/<id> for all users
When a user logs in, generate a key from whatever algorithm you like. For simplicity, let's just use an increasing integer and call it version. Store this version in localStorage so it survives page refreshes.
Now you need to distinguish whether the user is viewing his own posts or someone else's. When he is viewing his own, always request GET /posts/<id>?version=n.
Whenever the user edits his post and hits the save button, bump version from n to n+1.
Next time he goes to the post view page, the app requests GET /posts/<id>?version=n+1, which is not cached, and so retrieves the up-to-date content.
One last thing: make sure your server safely ignores the ?version=n query string.
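Here's a minimal sketch of that flow, assuming Axios (which the question mentions); the localStorage key and helper names are made up for illustration:

import axios from 'axios';

const getVersion = () => Number(localStorage.getItem('postsVersion') || 0);
const bumpVersion = () =>
  localStorage.setItem('postsVersion', String(getVersion() + 1));

// For the author's own posts: the ?version param busts browser and CDN caches.
async function fetchOwnPost(id) {
  const { data } = await axios.get(`/posts/${id}`, {
    params: { version: getVersion() },
  });
  return data;
}

async function saveOwnPost(id, changes) {
  await axios.patch(`/posts/${id}`, changes);
  bumpVersion(); // the next GET uses a new URL and skips the stale cache
}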
I'm sure there are other solutions to this problem. I'm no expert on server config and HTTP headers, so I won't get into that topic, but there must be something to look for there.
As for a pure frontend solution, there's the Service Worker API to consider. The main point of this API is to let devs programmatically control caching strategies.
With this API, you could leave your current app code as-is: just install a service worker, then use the same cache-busting technique in the background to fetch new content, delete the cache (using the Cache API) when the user edits, or even fake a response to the GET /posts/<id> from the PATCH /posts/<id> the user just sent.
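For example, a service-worker sketch of the "delete the cache when the user edits" idea (the cache name and URL pattern are assumptions):

self.addEventListener('fetch', (event) => {
  const { request } = event;
  if (request.method === 'PATCH' && request.url.includes('/posts/')) {
    event.respondWith(
      fetch(request).then(async (response) => {
        // Drop the cached GET for this post so the next read is fresh.
        const cache = await caches.open('posts-cache');
        await cache.delete(request.url);
        return response;
      })
    );
  }
});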
Depending on which CDN you use, you can invalidate the cache manually when publishing updates to a post. CloudFront, for example, lets you specify which paths should be fetched fresh on the next request.
For sites with lots of traffic but few updates this works pretty well and is quite simple to implement. For sites with many authors and frequently changing content, you would need to get more creative, though.
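As a sketch, with the AWS SDK for JavaScript (v2), the invalidation call on the backend looks roughly like this (the distribution ID is a placeholder):

const AWS = require('aws-sdk');
const cloudfront = new AWS.CloudFront();

// Evict the edited post from CloudFront so the next request hits the origin.
cloudfront.createInvalidation({
  DistributionId: 'YOUR_DISTRIBUTION_ID', // placeholder
  InvalidationBatch: {
    CallerReference: Date.now().toString(), // must be unique per request
    Paths: { Quantity: 1, Items: ['/posts/5'] },
  },
}).promise();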
One strategy I've used in the past is a technique called object versioning: instead of invalidating the cache for an object, you publish a new version of it with a timestamp. This also means publishing a manifest file that your frontend loads. The manifest contains the latest timestamps of all the content the page needs and is served with a much shorter TTL than the rest of the content. When you publish a new version of a post, you update its timestamp in the manifest, and the frontend pulls the latest version the next time the page loads.
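A rough sketch of how the frontend might consume such a manifest (the file name and key format are illustrative):

async function loadPost(id) {
  // The manifest is on a short TTL, so it reflects edits quickly,
  // e.g. { "posts/5": "2021-03-04T12:00:00Z" }
  const manifest = await fetch('/manifest.json').then((r) => r.json());
  const ts = manifest[`posts/${id}`];
  // A new timestamp means a new URL, and therefore a guaranteed fresh fetch.
  return fetch(`/posts/${id}?ts=${encodeURIComponent(ts)}`)
    .then((r) => r.json());
}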
I'm wondering how WhatsApp Web delivers updates.
Have you ever noticed a green card appearing on the left, asking you to click a link to refresh the page and run the freshly updated WhatsApp Web code?
I'm almost sure they use webpack, service workers, etc.
Chances are you've already had cache problems with webpack where the bundle stays cached even after refreshing the page.
So how did WhatsApp Web solve this issue with a single refresh link?
They use a service worker. If the service worker gets updated, they trigger something in the React app; it's easy to do:
serviceWorker.register({
  onUpdate: () => { console.log('new service worker'); }
});
Just dispatch something instead of the console.log.
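As a sketch, assuming Create React App's serviceWorker helper (showUpdateNotice is a hypothetical function in your app):

serviceWorker.register({
  onUpdate: (registration) => {
    // Show a WhatsApp-style "refresh to update" card.
    showUpdateNotice(() => {
      // Newer CRA worker templates listen for this message and skip waiting.
      if (registration.waiting) {
        registration.waiting.postMessage({ type: 'SKIP_WAITING' });
      }
      window.location.reload();
    });
  },
});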
Webpack is a build tool and isn't involved anywhere on a live site. While it offers Hot Module Replacement for the development server, you will not get that in the production build.
Unlike traditional desktop applications, delivering updates for websites is as straightforward as updating the files on your server (and invalidating any browser caches). You don't need to ask the user to download anything; a simple refresh will get the new pages.
If you really want instantaneous updates (without waiting for the user to refresh the page), you can set up some sort of WebSocket communication where receiving a message triggers a browser refresh. Nothing special, and no deployment mechanisms involved.
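A minimal sketch of that idea (the endpoint and message format are made up for illustration):

const socket = new WebSocket('wss://example.com/deploys');
socket.addEventListener('message', (event) => {
  if (event.data === 'new-deploy') {
    window.location.reload(); // pull in the freshly deployed files
  }
});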
I'm using a Gatsby setup hosted on Netlify.
I'm building a recipe website where users can register, log in, and add recipes to their own cookbook page. To add a new recipe, a user fills in a form with some data (like the recipe name, ingredients, etc.). On submit, the data is stored in a database. I then also want a dynamic page to be created for that specific recipe. (This is where you come in.)
I know I can add pages via the gatsby-node.js file, but I would prefer if this page could be created on the client side and exist immediately after the form submit, without rebuilding the project. Is this possible in Gatsby, and if so... how?
If this is not possible, is my best option to call the Netlify webhook to rebuild the project after the form submit and simply wait for the build to complete before I can show the recipe detail page? Any thoughts on this?
Hope one of you coders can help me out here!
The way I would approach this is to first show a dynamic version of the new recipe page (directly after submitting), which is only visible to the authenticated user (based on their user ID and post ID). Maybe make this clear to the author with a note on that page, something like: "This is a preview. Your recipe is being cooked up right now and will be ready in a few minutes."
At the same time, trigger an incremental build using a webhook from the backend. Both Gatsby Cloud and Netlify support this now, so in theory the build should be fast.
Triggering the webhook depends on your backend solution, which you didn't mention. When using Drupal, for example, you can use the build_hooks module, which can be configured to trigger a build when a recipe is posted back to Drupal.
I'm sure there would be a number of technical challenges, but I think it should be possible. The trick is not to generate too many dependencies in your new content, so the incremental build stays as small as possible.
Okay, so I figured out there isn't a way in Gatsby to create the page directly on the client side. I decided to go with Albert's idea of first showing a dynamic page with a message saying this is a temporary page and the real recipe is "being cooked up".
On form submit, I also simply call the webhook to trigger a Netlify deploy. This kicks off a new build, and in gatsby-node.js I create the new page based on the data I saved in my database on the form submit.
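For reference, triggering a Netlify build hook is just a POST request; here's a sketch (the hook ID is a placeholder from the Netlify UI, and saveRecipeToDatabase is a hypothetical helper):

async function handleRecipeSubmit(recipe) {
  await saveRecipeToDatabase(recipe); // hypothetical: persist the form data
  // Kick off a fresh Netlify deploy that will build the real recipe page.
  await fetch('https://api.netlify.com/build_hooks/YOUR_HOOK_ID', {
    method: 'POST',
  });
}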
I have a Create React App SPA, and I have deployed it with a registered service-worker.js (Cache-Control: max-age=0).
Everything works totally fine: when I update my code, redeploy, navigate away from my site (or close the page), and return, the service-worker.js file is downloaded again. It caches my index.html file, which contains the URL for my app.HASH.js file. It notifies me that new content is available; I refresh the browser page, and now I am running my latest app version.
What doesn't work: when I navigate to different parts inside my SPA, react-router changes the URL. These URL changes don't trigger a re-download of my service-worker.js file (it's served with max-age=0, and I have confirmed this with curl -I). Therefore a new worker is never downloaded to eventually inform me that new content is available and that I need to refresh.
I am not sure what the expected behaviour is supposed to be. If react-router changes the URL, shouldn't that trigger a reload of service-worker.js when it's set not to cache itself?
In order to get a new version of the SW.js file while the user is using your app, you have to manually call ServiceWorkerRegistration.update().
You could, for instance, do:
navigator.serviceWorker.getRegistrations()
  .then((registrationsArray) => {
    registrationsArray[0].update();
  });
This code can then be called whenever you like! It forces the browser to check for updates to the SW.js file and then handle the situation in whatever way you've configured your script to.
This update() call should be made in any place you want to check for updates. You could call it after every URL change / new route visit or once a minute or whatever. You decide.
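For example, a sketch of checking on every route change, assuming react-router-dom v5.1+ (for useLocation) and a component rendered inside your router:

import { useEffect } from 'react';
import { useLocation } from 'react-router-dom';

function ServiceWorkerUpdater() {
  const location = useLocation();
  useEffect(() => {
    // Ask the browser to re-fetch SW.js whenever the route changes.
    navigator.serviceWorker.getRegistrations()
      .then((registrations) => registrations.forEach((reg) => reg.update()));
  }, [location.pathname]);
  return null; // renders nothing; only triggers update checks
}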
Check out the MDN documentation. It also shows reference code for storing a reference to the registered SW, which gives you the option of calling update() on it directly.
https://developer.mozilla.org/en-US/docs/Web/API/ServiceWorkerRegistration/update