I am building a ReactJS app with create-react-app and deploying the build to Amazon S3 and CloudFront. Deploying is easy enough; when I create a new version, I sync the build directory to S3, hit reload, and everything works well. The index.html references the latest build assets via unique hash keys, so I always get the latest files.
But what about active users that are using the old version(s) of index.html? Their browsers will reference "chunks" that no longer exist. How should I update to my latest build without disturbing these users?
A blunt solution might be to keep the files from the older builds, but then the user would not be automatically updated to the new version. And cleanup of the old files would be messy.
In my CloudFront setup, when an active user requests a missing chunk, the 404/403 error responses redirect the browser to index.html. Perhaps the upgrades are already handled automatically, forcing a reload of index.html (and therefore references to the new files)?
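One pattern worth noting here (a sketch, not something this setup gives you out of the box): on the client, a failed dynamic import usually means the tab is running a stale index.html that references deleted chunks, so you can catch that and force one reload. The lazyWithReload helper name is hypothetical:

```js
// Sketch: reload once when a lazily loaded chunk 404s after a deploy.
// React.lazy and dynamic import are standard; lazyWithReload is a made-up helper.
import React from 'react';

function lazyWithReload(importFn) {
  return React.lazy(() =>
    importFn().catch((err) => {
      // A failed chunk fetch usually means this tab is running a stale build.
      // Reload once so the browser fetches the new index.html and chunk names;
      // the sessionStorage flag prevents an infinite reload loop.
      if (!sessionStorage.getItem('chunk-reload')) {
        sessionStorage.setItem('chunk-reload', '1');
        window.location.reload();
      }
      throw err;
    })
  );
}

// Usage: const Settings = lazyWithReload(() => import('./Settings'));
```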
My website changes every day: I run a news website that publishes new stories daily. I want Google to index my site as often as possible, and I want/need to autogenerate the sitemap.
I use Google App Engine (with Node.js) to run my site. With GAE, I do not have write access to the root directory, so to publish the sitemap I would need to re-deploy my whole site after generating it. That is an unnecessarily complex step.
I have searched far and wide and cannot see how to save my sitemap. So I considered using a static one with a dynamically generated child sitemap stored in another location where I do have write access. However, Google says it wants all linked sitemaps in the same directory, so that appears to be a dead end.
Can I use gcloud app deploy in such a way that only the sitemap is uploaded? Any other possibilities? I appreciate any and all suggestions. It seems unlikely that Google didn't provide some way to solve this.
For a site where new URLs are created regularly (such as a news or blog site), don't 'store' your sitemap at all. It should be generated on demand, i.e. your app should include code that generates the content whenever <your_website>/sitemap.xml is requested.
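A minimal sketch of what that on-demand route could look like in Express; getRecentStories is a hypothetical function standing in for however you query your article store:

```js
// Sketch: generate sitemap.xml on every request instead of writing it to disk.
const express = require('express');
const app = express();

app.get('/sitemap.xml', async (req, res) => {
  const stories = await getRecentStories(); // hypothetical: [{ url, updatedAt }, ...]
  const urls = stories
    .map(
      (s) =>
        `<url><loc>https://example.com${s.url}</loc>` +
        `<lastmod>${s.updatedAt.toISOString()}</lastmod></url>`
    )
    .join('');
  res.type('application/xml');
  res.send(
    '<?xml version="1.0" encoding="UTF-8"?>' +
      `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">${urls}</urlset>`
  );
});

app.listen(process.env.PORT || 8080); // GAE provides PORT
```

Because the XML is built per request, a new story appears in the sitemap as soon as it exists in your data store, with no redeploy and no filesystem write access needed.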
Separately, you should note that gcloud app deploy doesn't always deploy all your files. It usually deploys only the files that have changed. You can easily confirm this by running the deploy command, changing a single file, and then running the deploy command again. You will see that the logs say something like Uploading 1 files to Google Cloud Storage and the deploy will be faster. You can change X files, deploy again, and the message will indicate that it is only deploying X files.
However, I'm not sure what it uses to compute the diff. Maybe it compares against the files currently in your staging bucket, and if the files in the staging bucket have been deleted (they have a default lifespan of 15 days) it will deploy all the files again (but, as I said, I'm not sure of this).
I am trying to alert users when there is a new version deployed in my React app. I am using webpack to bundle our modular application, which yields a deployable /dist directory. Once the contents of /dist have been deployed to a server, clients (typically browsers) hit that server to grab the site and its assets. I am using the caching techniques described in https://webpack.js.org/guides/caching/, and for each deployment I get a new content hash.
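For reference, the output configuration that guide describes looks roughly like this (webpack 5 syntax):

```js
// webpack.config.js - content-hashed bundle names per the webpack caching guide
const path = require('path');

module.exports = {
  entry: './src/index.js',
  output: {
    // [contenthash] changes only when a chunk's content changes, so unchanged
    // chunks keep their URLs and stay valid in long-term browser caches
    filename: '[name].[contenthash].js',
    path: path.resolve(__dirname, 'dist'),
    clean: true, // remove stale files from /dist before each build
  },
};
```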
I need something similar to the screenshot below, where the user gets a notification to refresh the page. Is there any way to achieve this?
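One common approach (a sketch under assumptions, not the only way): write a small version file next to your bundles at build time, poll it from the running app, and show a notification when it changes. Here /meta.json and showRefreshToast are hypothetical names you would supply yourself:

```js
// Sketch: detect a new deployment by polling a build-time version file.
const POLL_INTERVAL_MS = 60 * 1000;
let runningVersion = null;

async function checkForNewVersion() {
  // cache: 'no-store' bypasses the browser cache so we see the deployed file
  const res = await fetch('/meta.json', { cache: 'no-store' });
  const { version } = await res.json();
  if (runningVersion === null) {
    runningVersion = version; // remember the version this tab loaded with
  } else if (version !== runningVersion) {
    // hypothetical UI hook that renders a "new version available" banner
    showRefreshToast(() => window.location.reload());
  }
}

setInterval(checkForNewVersion, POLL_INTERVAL_MS);
```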
I am new to the development world. I am trying to understand how the Heroku filesystem works.
I did an Express project using multer to upload images.
In production, everything worked well including fetching the images from my static folder.
However, when I did the same with React (frontend = React, backend = Express), the images do not display even though the console shows no error.
According to my research, Heroku says:
"Heroku filesystem is ephemeral - that means that any changes to the filesystem whilst the dyno is running only last until that dyno is shut down or restarted"
and that I should use a dedicated file storage service such as AWS S3 (for static files).
How does this apply to my React project, given that I didn't use S3 in the Express project?
"In production, everything worked well including fetching the images from my static folder."
In fact, it probably didn't.
Files can be saved to Heroku's ephemeral filesystem, and even loaded, but, as you have seen, this isn't permanent. Whenever your dyno restarts, the filesystem resets. This happens frequently (at least once per day).
Express vs. React is irrelevant. You should always use something like S3 for user uploads on Heroku.
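A minimal sketch of what that could look like with the multer-s3 storage engine; the bucket name and route are placeholders:

```js
// Sketch: stream multer uploads straight to S3 instead of the ephemeral disk.
const express = require('express');
const multer = require('multer');
const multerS3 = require('multer-s3');
const AWS = require('aws-sdk');

const app = express();
const s3 = new AWS.S3(); // credentials come from env vars / Heroku config vars

const upload = multer({
  storage: multerS3({
    s3,
    bucket: 'my-upload-bucket', // placeholder bucket name
    key: (req, file, cb) => cb(null, `uploads/${Date.now()}-${file.originalname}`),
  }),
});

app.post('/images', upload.single('image'), (req, res) => {
  // multer-s3 sets req.file.location to the object's S3 URL
  res.json({ url: req.file.location });
});

app.listen(process.env.PORT || 3000);
```

The uploaded files then survive dyno restarts, and your React frontend just renders the returned S3 URLs.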
I use CloudFront from AWS and I have an S3 static website.
I use ReactJS and I changed some text on most pages.
The problem I have now is that I use
npm run build
to produce the production application. I want to update the content in the S3 bucket on AWS (I previously uploaded the same files); however, two things happen:
- When I access the site in incognito mode, everything works fine; I am served the updated version of the website.
- When I access it in normal mode with web browsers that I used before, I am still served the old version of the files.
I checked the AWS documentation and found two solutions:
- Wait up to 24 hours for the files cached at CloudFront edge locations to expire
- Use versioned file names (for example, change image.jpg to image_1.jpg, image_2.jpg, etc.)
I would definitely go with the second option, which is time-consuming but certainly takes less than 24 hours. Should I change the name of EACH file in the build, or just those under static?
Any other solutions?
Something I haven't tried: before uploading to AWS S3, create a folder such as V1 and upload my React files there. When I make a change, I call the folder V2, and so on.
Using version names is the most robust method. It gives you full control over the cache behavior without messing around with CloudFront.
So yes, each time there's a new version, update your file names.
Btw, if you bootstrapped your React app using create-react-app, then the build process does that by default: it names each bundle with a unique hash every time the bundle changes. This way you can utilize long-term caching in the browser and in CloudFront for your files.
You'd probably still have to invalidate the root index.html on each deployment, as its name doesn't change between versions.
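For example, a deploy script could invalidate just those paths with the AWS SDK for JavaScript (v2); the distribution ID is a placeholder:

```js
// Sketch: invalidate only index.html (and the root) after a deploy.
const AWS = require('aws-sdk');
const cloudfront = new AWS.CloudFront();

cloudfront
  .createInvalidation({
    DistributionId: 'YOUR_DISTRIBUTION_ID', // placeholder
    InvalidationBatch: {
      CallerReference: `deploy-${Date.now()}`, // must be unique per request
      Paths: { Quantity: 2, Items: ['/index.html', '/'] },
    },
  })
  .promise()
  .then((data) => console.log('Invalidation started:', data.Invalidation.Id));
```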
I have a React/Redux app deployed on CloudFront + S3. There is no static hosting enabled on the bucket. I understand that invalidating the cache on a new deployment clears the cache in all the edge locations, and the new changes will be served up. But what happens to active prod users when the cache is invalidated? Are they able to continue using the app without any errors? Does it get worse for active users if the Redux store structure changed in the new version?
Clearing the CloudFront cache will bring up the fresh content from your origin. However, that would not affect your existing production users; they would continue to be served the cached content as long as their session continues.
That being said, they would be served the fresh content when their session restarts.
There would be no errors whatsoever.
Hope this helps.
I've been wondering the same thing for my React website, which is made up of many chunks. I wouldn't worry about your Redux state unless you're saving it to a cookie or localStorage and loading it again. In that case, you could write a migration check during loading, or even version the state in some way.
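A rough sketch of that versioning idea, assuming you persist the store to localStorage yourself:

```js
// Sketch: only rehydrate persisted Redux state written by a compatible build.
const STATE_VERSION = 3; // bump whenever the store structure changes

function loadPersistedState() {
  try {
    const raw = localStorage.getItem('app-state');
    if (!raw) return undefined;
    const { version, state } = JSON.parse(raw);
    // Discard (or migrate) state saved by an older version of the app
    return version === STATE_VERSION ? state : undefined;
  } catch {
    return undefined; // corrupt entry: fall back to reducer defaults
  }
}

function persistState(state) {
  localStorage.setItem(
    'app-state',
    JSON.stringify({ version: STATE_VERSION, state })
  );
}

// e.g. const store = createStore(rootReducer, loadPersistedState());
```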
Regarding caching, I recommend keeping old files around for up to a year rather than deleting them. That way, active users can still download chunks while they're on your website.
During deployment, I upload all the new files and invalidate the cache on all *.html files to pick up the latest references to the JS and CSS files.
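For example, with the AWS SDK (v2) you could set Cache-Control per file type during the upload; the bucket name is a placeholder and content types are omitted for brevity:

```js
// Sketch: long-lived caching for hashed assets, always-revalidate for HTML.
const AWS = require('aws-sdk');
const fs = require('fs');

const s3 = new AWS.S3();

function uploadFile(key) {
  const isHtml = key.endsWith('.html');
  return s3
    .putObject({
      Bucket: 'my-site-bucket', // placeholder
      Key: key,
      Body: fs.createReadStream(key),
      CacheControl: isHtml
        ? 'no-cache' // browsers and CloudFront revalidate HTML on every request
        : 'public, max-age=31536000, immutable', // hashed files never change
    })
    .promise();
}
```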