I use AWS CloudFront in front of an S3 static website. The site is built with ReactJS, and I recently changed some text on most pages.
The problem: I run
npm run build
to produce the production application, and I want to update the content in the S3 bucket (I previously uploaded the same files). However, two things happen:
- When I access the site in incognito mode, everything works fine: I am served the updated version of the website.
- When I access it in normal mode with a browser I used before, I am still served the old files.
According to the AWS documentation, I have two options:
- Wait up to 24 hours for the cached files to expire from CloudFront's edge locations.
- Use versioned names for the files (for example, rename image.jpg to image_1.jpg, image_2.jpg, etc.).
I would definitely go with the second option; it is time-consuming, but certainly takes less than 24 hours. Should I rename EACH file in my build output, or just those under static?
Any other solutions?
Something I haven't tried yet: before uploading to S3, create a folder such as V1 and upload my React files into it. When I make a change, I upload to a folder called V2, and so on.
Using versioned names is the most robust method. It gives you full control over cache behavior without messing around with CloudFront.
So yes, update your file names each time there is a new version.
By the way, if you bootstrapped your React app with create-react-app, the build process already does this by default: it names each bundle with a unique hash whenever the bundle changes, so you can take advantage of long-term caching in both the browser and CloudFront.
You'd probably still have to invalidate the root index.html on each deployment, as its name doesn't change between versions.
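For reference, a minimal sketch of that invalidation using the AWS SDK for JavaScript (v3); the distribution ID is a placeholder you would replace with your own:

```js
// Minimal sketch: invalidate only index.html after a deploy.
// Uses @aws-sdk/client-cloudfront (AWS SDK v3).
const {
  CloudFrontClient,
  CreateInvalidationCommand,
} = require("@aws-sdk/client-cloudfront");

const client = new CloudFrontClient({ region: "us-east-1" });

async function invalidateIndexHtml() {
  await client.send(
    new CreateInvalidationCommand({
      DistributionId: "YOUR_DISTRIBUTION_ID", // placeholder
      InvalidationBatch: {
        // CallerReference must be unique for each invalidation request
        CallerReference: `deploy-${Date.now()}`,
        Paths: { Quantity: 1, Items: ["/index.html"] },
      },
    })
  );
}

invalidateIndexHtml().catch(console.error);
```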
My website changes every day: I run a news website with new stories daily. I want Google to index my site as often as possible, so I want/need to autogenerate the sitemap.
I use Google App Engine (with Node.js) to run my site. With GAE, I do not have write access to the root directory, so to publish the sitemap I would have to redeploy my whole site after generating the map. That is an unnecessarily complex step.
I have searched far and wide and cannot see how to save my sitemap. I considered using a static sitemap with a dynamically generated child stored in another location where I do have write access, but Google says it wants all linked sitemaps in the same directory, so that appears to be a dead end.
Can I use gcloud app deploy in such a way that only the sitemap is uploaded? Any other possibilities? I appreciate any and all suggestions. It seems unlikely that Google didn't provide some way to solve this.
For a site where new URLs are created regularly (news, blogs, etc.), don't 'store' your sitemap at all. It should be generated on demand, i.e. your app should include code that produces the content whenever <your_website>/sitemap.xml is requested.
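A minimal sketch of such a route in Express; getRecentStories() and the example.com domain are stand-ins for your own data access and hostname:

```js
const express = require("express");
const app = express();

// Hypothetical data-access stub; replace with your real datastore query.
async function getRecentStories() {
  return [{ url: "/stories/example", lastModified: "2023-01-01" }];
}

app.get("/sitemap.xml", async (req, res) => {
  const stories = await getRecentStories();
  const urls = stories
    .map(
      (s) =>
        `<url><loc>https://example.com${s.url}</loc>` +
        `<lastmod>${s.lastModified}</lastmod></url>`
    )
    .join("");
  res
    .type("application/xml")
    .send(
      '<?xml version="1.0" encoding="UTF-8"?>' +
        `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">${urls}</urlset>`
    );
});

app.listen(process.env.PORT || 8080);
```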
Separately, note that gcloud app deploy doesn't always deploy all your files; it usually deploys only the files that have changed. You can easily confirm this by running the deploy command, changing a single file, and running the deploy command again: the logs will say something like "Uploading 1 files to Google Cloud Storage" and the deploy will be faster. Change X files, deploy again, and the message will indicate it is only deploying X files.
However, I'm not sure what it uses to compute the diff. It may compare against the files currently in your staging bucket, and if the files in the staging bucket have been deleted (they have a default lifespan of 15 days), it will deploy all the files again (but, as I said, I'm not sure of this).
Our site is served with the following structure:
Azure Static Blob Container > Azure CDN > Cloudflare > User.
The React app build is placed in an Azure Static Blob Container that is fronted by an Azure CDN. When we access the app via the CDN URL, we never have a cache problem. We also use Cloudflare to manage the DNS and, supposedly, to improve caching. But when we access the app through Cloudflare, we have a serious cache problem: extremely old versions are returned to users who have visited the site before.
Even after turning off every cache option available in Cloudflare's dashboard (and its graphs show that cache usage has dropped), the bug persists. We have been unable to identify where in the structure above our problem lies.
The problem is that a CDN serves content from multiple nodes. The proper way to 'solve' this is to append a version to the filename or path; that way, whenever you change something, the CDN will fetch the latest version. Just reusing a plain app.js is not enough.
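As a minimal sketch of the idea, assuming a Node build step and a build/app.js output file (both assumptions, adjust to your layout):

```js
// Copy app.js to a content-hashed name, e.g. app.3f2a9c1b.js, so each
// release gets a brand-new URL that no CDN node can have cached.
const fs = require("fs");
const crypto = require("crypto");

const source = fs.readFileSync("build/app.js"); // assumed output path
const hash = crypto.createHash("md5").update(source).digest("hex").slice(0, 8);
const versioned = `build/app.${hash}.js`;

fs.writeFileSync(versioned, source);
console.log(`Reference ${versioned} from index.html instead of app.js`);
```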
More info:
How to force the browser to reload cached CSS and JavaScript files
https://stackoverflow.com/a/34604256/1384539
I am new to the development world and am trying to understand how the Heroku filesystem works.
I built an Express project that uses multer to upload images.
In production, everything worked well, including fetching the images from my static folder.
However, when I did the same with React (frontend = React, backend = Express), the images do not display, even though the console shows no errors.
According to my research, Heroku says:
"Heroku filesystem is ephemeral - that means that any changes to the filesystem whilst the dyno is running only last until that dyno is shut down or restarted"
and that I should use a dedicated file storage service such as AWS S3 for static files.
How does this apply to my React project, given that I didn't use S3 in the Express project?
"In production, everything worked well, including fetching the images from my static folder."
In fact, it probably didn't.
Files can be saved to Heroku's ephemeral filesystem, and even read back, but, as you have seen, this isn't permanent. Whenever your dyno restarts, the filesystem resets. This happens frequently (at least once per day).
Express vs. React is irrelevant. You should always use something like S3 for user uploads on Heroku.
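A minimal sketch of what that looks like with multer-s3; the bucket name and key scheme are placeholders:

```js
const express = require("express");
const multer = require("multer");
const multerS3 = require("multer-s3");
const { S3Client } = require("@aws-sdk/client-s3");

const app = express();

// Store uploads in S3 instead of the dyno's ephemeral filesystem.
const upload = multer({
  storage: multerS3({
    s3: new S3Client({ region: "us-east-1" }),
    bucket: "my-bucket", // placeholder
    key: (req, file, cb) =>
      cb(null, `uploads/${Date.now()}-${file.originalname}`),
  }),
});

app.post("/images", upload.single("image"), (req, res) => {
  // multer-s3 exposes the final S3 URL as req.file.location
  res.json({ url: req.file.location });
});

app.listen(process.env.PORT || 3000);
```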
I am building a ReactJS app with create-react-app and deploying to Amazon S3 and CloudFront. Deploying is easy enough: when I create a new version, I sync the build directory to S3, hit reload, and everything works. The index.html references the latest build via unique hash keys, so I always get the latest files.
But what about active users who are still on an old version of index.html? Their browsers will reference "chunks" that no longer exist. How should I roll out my latest build without disturbing these users?
A blunt solution might be to keep the files from older builds, but then those users would never be automatically moved to the new version, and cleaning up the old files would be messy.
In my CloudFront setup, when an active user requests a missing chunk, the browser is sent to index.html via the 404/403 error responses. Perhaps upgrades are therefore already handled automatically, since the error forces a reload of index.html (and thereby references to the new files)?
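On top of that fallback, one belt-and-braces pattern is to catch failed chunk loads in the app itself and force a single reload; lazyWithReload below is a hypothetical helper, not a React API:

```js
import React from "react";

// Hypothetical helper: wrap a dynamic import so a failed chunk fetch
// (typically an old index.html referencing deleted hashed files)
// triggers one full reload to pick up the new index.html.
function lazyWithReload(importFn) {
  return React.lazy(() =>
    importFn()
      .then((mod) => {
        sessionStorage.removeItem("chunk-reloaded"); // load succeeded
        return mod;
      })
      .catch((error) => {
        if (!sessionStorage.getItem("chunk-reloaded")) {
          sessionStorage.setItem("chunk-reloaded", "1");
          window.location.reload();
        }
        throw error; // already reloaded once; don't loop
      })
  );
}

// Used exactly like React.lazy:
const SettingsPage = lazyWithReload(() => import("./SettingsPage"));
```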
I am new to React + Redux. I recently got a task and have no idea how to approach it.
Is it possible to run my React + Redux project without any web server?
After building my project, I get a folder of static files.
Is it possible to place those on a CDN without a web server such as Node.js or Tomcat?
Thanks.
If you do not need to save data on a server, you do not need one. Take a look at GitHub Pages for an example.
Of course you can, but the browser still needs an HTML page to parse, in which you include the necessary SCRIPT tags to load and run your React app.
I recommend you give Surge (https://surge.sh/) a try; it's currently free, and you can easily host your React project with it (including the .html files and every asset generated by your build). You can even create an npm script that builds and then deploys from the CLI with a single surge call. Works like a charm!
You can even use a custom domain name, or choose a specific Surge subdomain if it's available, like gibbok.surge.sh. ;)
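Such a script might look like this in package.json; the surge subdomain is a placeholder:

```json
{
  "scripts": {
    "build": "react-scripts build",
    "deploy": "npm run build && surge ./build my-app.surge.sh"
  }
}
```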
No, it is not possible without a web server.
If your app is a static website, for example, with no server-side functionality (such as Node.js code), you can host your built application (static files) on any simple HTTP server, including GitHub Pages, without setting up a Node.js server.
If your app has dynamic functionality, for example it uses Node.js to work with the server's file system or a database, you need a Node server.
In both cases an HTTP server is necessary to deliver the JS/HTML and assets to the browser when requested.
On a CDN you can store static files to be used in your project, which is great if you are building a JS library.
If you do not want to deal with the HTTP server yourself, you can use any simple hosting solution, such as GitHub Pages, or any host that can serve a website made of static files.
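For completeness, the HTTP server can be as small as this Express sketch, assuming create-react-app's default build folder:

```js
const express = require("express");
const path = require("path");

const app = express();

// Serve the static build output (JS, CSS, images, index.html).
app.use(express.static(path.join(__dirname, "build")));

// SPA fallback: return index.html for any route not matched above,
// so client-side routing keeps working on a hard refresh.
app.use((req, res) => {
  res.sendFile(path.join(__dirname, "build", "index.html"));
});

app.listen(3000, () => console.log("Listening on http://localhost:3000"));
```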