Isomorphic/universally-rendered React 15 app breaks with Cloudflare HTTP Proxy ("orange cloud")

I have an isomorphic/universal React app, meaning it is rendered on the server by the same JS that powers the client-side, Single Page App user experience.
I configure this app's DNS using Cloudflare, and use their "orange cloud" feature to accelerate and protect my site's traffic, as explained in their support article:
From the linked-to article: "Cloudflare can operate in two modes - DNS only (unproxied; 'grey cloud') and as a HTTP proxy ('orange cloud') with our security, CDN & performance features."
I have discovered that running my app with React 15 and Cloudflare's "orange cloud" HTTP Proxy feature results in an error:
reactProdInvariant.js:31 Uncaught Error: Minified React error #32; visit http://facebook.github.io/react/docs/error-decoder.html?invariant=32&args[]=2 for the full message or use the non-minified dev environment for full errors and additional helpful warnings.
The text of the linked-to bug:
Unable to find element with ID 2.
This ID probably refers to the head tag of my page:
<!doctype html>
<html lang="en-us" data-reactroot="" data-reactid="1" data-react-checksum="-1233854461"><head data-reactid="2">...
I do NOT get this issue with React 14.
I do NOT get this bug when using the raw, un-DNS-ed address of my app, nor do I get it when I switch to the "grey cloud" to use Cloudflare as only a DNS service.
I do NOT get this issue when I disable server-side rendering.
When I google the text of the linked-to "actual" error, I find this Github thread that confirms that this has something to do with server-side rendering.
However, my situation differs slightly: that thread's author encounters the error with an "unable to find element" ID referring to a style tag, whereas in my case the unfound element ID refers to the head tag.
When I google the text of the production "wrapped" error, I find this Github thread that confirms that this has something to do with Cloudflare's HTTP Proxy. This comment says: "If you're using CloudFlare, please disable auto-minification of HTML."
So far, I can't figure out how to do that. It's hard for me to find good information on what exactly Cloudflare does with their HTTP Proxy, and how I can configure it.
Questions:
Why exactly does this bug happen?
Where can I find information about what exactly the "orange cloud" does?
What is the best way to fix this problem while maintaining the benefits of Cloudflare's HTTP Proxy?

React on the server needs to render the application to a string of HTML that makes sense to the browser. At the same time, React in the browser needs to read this HTML and understand it in relation to your JS code: it needs to identify which DOM tree came from which React component, in a very detailed way. Thus, in effect, the intermediate HTML is a serialization format between React on the server and React on the client with an additional requirement for it to make sense for the browser even in the absence of React.
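A rough sketch of that round trip with React 15-era APIs (App here is a placeholder component):

// --- server side ---
var React = require('react');
var ReactDOMServer = require('react-dom/server');
var App = require('./App');

// renderToString() output carries the data-reactid and data-react-checksum
// attributes seen in the question; client-side React uses them to adopt the
// server markup instead of re-rendering it from scratch.
var html = ReactDOMServer.renderToString(React.createElement(App));

// --- client side (separate bundle) ---
// ReactDOM.render() walks the received DOM by data-reactid; if a proxy has
// altered the markup in transit, React can no longer match the tree and you
// get errors like the "Unable to find element with ID 2" above.
// ReactDOM.render(React.createElement(App), document.getElementById('root'));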
When you enable advanced Cloudflare functionality, it treats the HTML as “regular” HTML, not as fancy server-side rendered components. My (admittedly baseless) speculation on one thing that could be going wrong is the stripping of HTML comments. In general, this is a natural thing to do for minification. But React uses HTML comments such as <!-- react-empty: 3 --> as placeholders where a React component returns null. Naturally, stripping these breaks React.
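For illustration, a minimal sketch (React 15) of where those comment nodes come from; the exact marker text in the output below is approximate:

var React = require('react');
var ReactDOMServer = require('react-dom/server');

// A component that renders nothing: React 15 still reserves a spot for it
// in the DOM, and it does so with a comment node.
function Nothing() {
  return null;
}

console.log(ReactDOMServer.renderToString(
  React.createElement('div', null, React.createElement(Nothing))
));
// Prints roughly: <div data-reactroot="" data-reactid="1"
//   data-react-checksum="..."><!-- react-empty: 2 --></div>
// A minifier that strips comments deletes the very node React
// expects to find on the client.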
Cloudflare is there to make serving websites faster at a lower cost. They have a wide variety of tools to achieve that; see their introduction guide. Minifying HTML is completely natural, and it is unfortunate that it breaks your use case, but this is what we get when the meaning of nodes and attributes in our HTML, and the flexibility to change them, is not strictly defined.
I think the most straightforward way for you to move forward for now is to disable HTML minification in Cloudflare's settings (at the time of writing, the Auto Minify checkboxes live under the Speed section of the dashboard; uncheck HTML).
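If you prefer to script it rather than click through the dashboard, Cloudflare's v4 API exposes the same setting; a sketch follows (the zone ID and credentials are placeholders, and you should double-check the endpoint against their current docs):

var https = require('https');

// Turn HTML minification off while leaving CSS/JS minification on.
var body = JSON.stringify({ value: { css: 'on', html: 'off', js: 'on' } });

var req = https.request({
  method: 'PATCH',
  hostname: 'api.cloudflare.com',
  path: '/client/v4/zones/YOUR_ZONE_ID/settings/minify',
  headers: {
    'X-Auth-Email': 'you@example.com',
    'X-Auth-Key': 'YOUR_API_KEY',
    'Content-Type': 'application/json',
    'Content-Length': Buffer.byteLength(body)
  }
}, function (res) {
  res.pipe(process.stdout); // Cloudflare echoes the resulting setting back
});
req.end(body);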

Related

Image violates the following Content Security Policy directive - Create React App

I'm getting the following Content Security Policy error in chrome when running my React app. I tried googling this for a long time, but I couldn't find enough information about how to fix this when using create-react-app. I would appreciate any help very much.
After a bit of searching about your issue, I ended up here at MDN. I will briefly define what the problem is, but for more information I strongly suggest you read the provided link.
So what is happening here exactly?
This is because the website is configured to use a Content Security Policy (CSP) to protect against someone maliciously loading code from a third party. The Content-Security-Policy meta tag lets you reduce the risk of XSS attacks by defining where resources can be loaded from, preventing browsers from loading data from any other location. This makes it harder for an attacker to inject malicious code into your site.
How to solve this then?
According to the MDN link that I provided, we should solve this by adding the following meta tag to our index.html.
<meta http-equiv="Content-Security-Policy" content="default-src 'self' *.trusted.com">
NOTE: *.trusted.com should be replaced with your trusted site, or a list of them.
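The same policy can also be delivered as an HTTP response header, which is the more common production setup; a minimal sketch in plain Node (the img-src host is a placeholder, matching the asker's image error):

var http = require('http');

http.createServer(function (req, res) {
  // img-src is the directive that governs where images may be loaded from.
  res.setHeader('Content-Security-Policy',
    "default-src 'self' *.trusted.com; img-src 'self' images.trusted.com");
  res.setHeader('Content-Type', 'text/html');
  res.end('<!doctype html><html><body>Hello</body></html>');
}).listen(3000);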
Then why does this happen on your own localhost?
I have faced several issues like yours, and found that when this error shows up in your console it does not necessarily mean you have this exact problem; other problems in your main code can cause such an error. Here are some similar issues I found:
Same issue on angular
Same issue on ionic
Same issue on react
and so on
So what do you have to do?
First of all, please check all the existing code and paths in your project and make sure there are no errors in any of them. Once you get rid of all your errors, this message should usually be gone; but if the problem persists, make sure to disable all your extensions in your browser (you can safely test in incognito without disabling anything), then run the project and see whether the error is gone.
So there are two steps to get rid of it:
Get rid of all your project and pathing errors
Make sure all your extensions are disabled

Sharing on social media, the URL does not render any meta data

We have built a project (a web application) with a .NET Core backend, using React for client-side rendering.
We've used react-helmet for dynamically assigning meta tags.
The issue arises when the app renders in the browser: the browser gets only the static HTML on initial load, which does not include the dynamic meta tags we have set. However, on inspecting the page you can see those meta tags under "Elements".
Also, if we use these URLs for sharing on any social media, like WhatsApp or Facebook, the URL does not render any metadata as it should.
We tried searching for solutions to our problem; the most obvious answer we came across was to try server-side rendering instead. We get that, but it is not a solution to try out at this juncture, when we're ready to roll the app out.
Others we came across were "react-snap" and "react-snapshot", but no luck:
With react-snap, React has to be upgraded to version 16+, which we did, but I guess not all dependencies were upgraded; there was an error saying "hydrate is not a function" (hydrate comes from react-dom).
With react-snapshot, we could not find the necessary type definitions, which our React/.NET Core setup requires to function properly.
Please guide us toward the next probable step (excluding paid services like Prerender, etc.).
Main goal: Social Applications should render the meta data when we paste/share the URL within them.
Prerender is the only solution.
I used a node dependency called "prerender" -> https://github.com/prerender/prerender
It works by running a web server which makes HTTP requests and renders your pages. Setting a boolean, window.prerenderReady = true;, in your website tells the Prerender server when the page is ready to "take the photo", and it returns the HTML at that point. You then need to write an easy script that walks all the site's URLs and saves those HTML contents to files. Upload them to your server and, using .htaccess or similar, target the crawler user agents (facebookexternalhit, twitterbot, googlebot, etc.) to show them the prerendered version, and "the real site" to the rest of the user agents.
It worked for me.
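A sketch of that last user-agent routing step in plain Node instead of .htaccess (the ./prerendered folder and its file naming are assumptions):

var http = require('http');
var fs = require('fs');
var path = require('path');

// Crawlers that should receive the prerendered snapshots.
var CRAWLERS = /facebookexternalhit|twitterbot|googlebot/i;

http.createServer(function (req, res) {
  var ua = req.headers['user-agent'] || '';
  if (CRAWLERS.test(ua)) {
    // e.g. GET /about -> ./prerendered/about.html (naming is an assumption)
    var name = req.url === '/' ? 'index' : req.url.slice(1).replace(/\//g, '_');
    var file = path.join(__dirname, 'prerendered', name + '.html');
    if (fs.existsSync(file)) {
      res.setHeader('Content-Type', 'text/html');
      return res.end(fs.readFileSync(file));
    }
  }
  // Everyone else gets "the real site": the normal SPA entry point.
  res.setHeader('Content-Type', 'text/html');
  res.end(fs.readFileSync(path.join(__dirname, 'index.html')));
}).listen(8080);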
The meta tags for Open Graph need to be present in the HTML which is sent back to the client when fetching a URL. Browsers or bots will not wait until the app is rendered on the client side to determine what the meta tags are; they will only look at the initially loaded HTML.
If you need the content of your Open Graph metadata to be dynamic (showing different content depending on the URL, device, browser etc.) you need to add something like react-meta-tags into your server code.
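Since the question already uses react-helmet, here is a sketch of what the server-side half looks like with that library (App and the surrounding request handler are placeholders):

var React = require('react');
var renderToString = require('react-dom/server').renderToString;
var Helmet = require('react-helmet').Helmet;
var App = require('./App'); // placeholder: the app that renders <Helmet> tags

var appHtml = renderToString(React.createElement(App));
// Must run after renderToString: collects whatever tags the render declared.
var helmet = Helmet.renderStatic();

var page = '<!doctype html>\n' +
  '<html ' + helmet.htmlAttributes.toString() + '>\n' +
  '<head>' + helmet.title.toString() + helmet.meta.toString() + '</head>\n' +
  '<body><div id="root">' + appHtml + '</div></body>\n' +
  '</html>';
// Send `page` as the HTTP response; crawlers now see the meta tags
// in the initial HTML instead of waiting for client-side rendering.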
There are no type definitions available for any of the React meta tag libraries, but you can add your own. It can be a bit tricky, but check out the official documentation and the templates they have provided to get started.
If you don't need it to be dynamic, you could add the tags into the static part of the <head> tag in your index.html.
I had the same issue today. I had two React Web applications that need this. Here is how I solved it:
Put your preview image in the public folder.
Still in the public folder, open index.html and add the line <meta property="og:image" content="preview.png"/>
or <meta property="og:image" content="%PUBLIC_URL%/preview.png"/>.
Go to https://www.linkedin.com/post-inspector/ to check if it works.
I hope this helps!

Equivalent of an edge side include in a universal React application

I have a universal React application. Within one of the pages I want to include some HTML from another server - Edge side includes (ESI) have been mentioned, but this methodology doesn't seem to be compatible with Universal React applications, as:
we may not be able to recreate the functionality of an Edge side include client side, at least not without revealing the external URL to the browser. I guess we could create a proxy page on our server to do this, and load the HTML snippet via AJAX when doing this client side, but that still leaves us with the second issue...
Using an ESI means injecting non-React DOM content into the already (server-side) rendered DOM of the React application when it is processed on the CDN server. I am pretty sure this would invalidate the data-react-checksum, and I can't think of a way to avoid this.
Is there an alternative, React-friendly, universal rendering approach that could be used?

Trying to understand how an isomorphic react app is supposed to do client-side routing

Pardon my English; it is a second language. The whole point of an isomorphic app, as opposed to a regular client-side SPA, is that the client doesn't have to download the whole JS file initially, which results in really slow initial load time.
I've been trying to teach myself server-side rendered React, and after watching countless videos around the concept and following countless tutorials on the actual implementation, I still can't get my head around this (at least this is how I understand it):
Despite the server conditionally rendering pages and sending props to the client on URL change, the client side still uses a router that includes all the entry points for the app (by requiring all of them, and then loading the file based on the URL location). Doesn't that mean all the files are included in the main client JS file anyway, since they've already been required by the client-side router? Doesn't that defeat the whole purpose of server-rendered React? Or am I thinking about this the wrong way?
In short, how does an isomorphic React app really work with a client-side router that includes (by requiring them) all of the app's entry points?
I'm not sure that "The whole point of an isomorphic app [...] is so the client doesn't have to download the whole JS file initially which results in really slow initial load time" is necessarily true. I think the primary reason people do this is for SEO reasons and to improve perceived load time. You still get the benefit of showing the users the page before they have to load all the JavaScript (e.g. yes, they have to load all the JS, but it's OK because they already have most/all of the content). The app upgrades to an SPA transparently, providing a seamless experience for the user.
That said, you can implement a system where you don't have to load all the JS at once with something like webpack's code splitting. There's even a simple React Router example that does this.
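A sketch of that idea with the React Router v3-era API (the component paths are placeholders):

import React from 'react';
import { Router, Route, browserHistory } from 'react-router';

// getComponent lets a route resolve its component asynchronously; webpack
// compiles require.ensure() into a separate chunk that is only downloaded
// the first time the user navigates to /dashboard.
export default (
  <Router history={browserHistory}>
    <Route path="/" component={require('./pages/Home').default} />
    <Route
      path="/dashboard"
      getComponent={(nextState, callback) => {
        require.ensure([], (require) => {
          callback(null, require('./pages/Dashboard').default);
        });
      }}
    />
  </Router>
);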

Will Googlebot correctly handle <link rel="canonical" href="{{url}}"> for an AngularJS page

Is it safe to use <link rel="canonical" href="{{canonical_url}}" /> in an AngularJS page, or is there a preferred way to handle this?
Google is now automatically crawling and rendering JavaScript, which is great for AngularJS sites. But I'm concerned that Googlebot may not wait for the rendering when deciding about canonical pages, and I don't want to mess up our site by having Googlebot think every page's canonical URL is the literal, unrendered "url" template.
Using Google's Webmaster Tools, I can see that Google's crawler can render the pages just fine, but I'm not sure how to tell how it's dealing with a canonical tag. Other reading implies that Googlebot stops reading/rendering a page if it sees a canonical tag for another page it has already processed.
As @JB Nizet mentioned, and as I have witnessed myself, Google as of Sep 25, 2015 executes JS and renders it properly in Webmaster Tools, but more as a "demo". In mass production, links are still not executed RELIABLY by Google (sometimes not at all, sometimes quite well).
So we still have to use prerender.io (which is quite good). But be careful: use the version with PhantomJS 2; PhantomJS 1 is not good at parsing/reading Angular apps.
IMPORTANT things: they do support httpCode, and all your double curly braces.
Good luck!
Source: my painful days setting up SEO for our angular platform
Google will render and index a plain SPA without any static HTML snapshots.
BUT: I have a project with a changeover to HTTPS. Usually I don't redirect with a 301 but just put the HTTPS URL in the canonical tag. On this site, Google has not yet recognized the change.
At the same time, on another project with the same changeover to HTTPS and the same code, but with HTML snapshots included,
Google has recognized the change to HTTPS, and the right version is in the search results.
I wrote my own directive to have a nice SEO header with canonical tags. It's better to use your page's head tag as the directive call.
https://github.com/w11k/w11k-angular-seo-header
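Not the linked library, just a sketch of the idea (it assumes ngRoute's $routeChangeSuccess event; the module name is a placeholder):

// Usage: <head seo-canonical> ... </head>
angular.module('app').directive('seoCanonical', ['$rootScope',
  function ($rootScope) {
    return {
      restrict: 'A',
      link: function () {
        $rootScope.$on('$routeChangeSuccess', function () {
          // Replace any previous canonical tag on each route change.
          var old = document.querySelector('link[rel=canonical]');
          if (old) { old.parentNode.removeChild(old); }
          var link = document.createElement('link');
          link.rel = 'canonical';
          link.href = window.location.origin + window.location.pathname;
          document.head.appendChild(link);
        });
      }
    };
  }
]);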
If you want to prerender your site, do not bother with a service like Prerender; simply add a task to your build process. Setup with Grunt/Gulp takes about 15 minutes... once!
