Why is og:image not rendered with React?

I'm trying to get a thumbnail to appear when sending my website link in a Facebook message. I started using s-yadav/react-meta-tags and followed the tutorial, but the image is not present after the link is sent.
Link: https://ticket-44-portal-staging.herokuapp.com/buyTicket?status=1959165690885328
I applied the following code in my React component:
return (
  <div className="container">
    <MetaTags>
      <title>{searchedEventTime.name}</title>
      <meta name="description" content={searchedEventTime.description} />
      <meta property="og:title" content={searchedEventTime.name} />
      <meta
        property="og:image"
        content={`https://ticket-t01.s3.eu-central-1.amazonaws.com/${eventId}.cover.jpg`}
      />
    </MetaTags>
  </div>
);
I can see the rendered meta tags in the HTML, so why isn't it working?

It is because the website is a single-page app. Before the JavaScript is loaded, nothing rendered by React is there yet (including those meta tags). You can verify this by right-clicking the page and selecting "View page source": inside the body there is only a <div id="root"></div>. The problem is that many search engines and crawlers don't actually run JavaScript when they crawl; instead, they look at what's in the initial HTML file. That's why Facebook cannot find the og:image tag. There are two ways to solve this problem.
TL;DR: Host your app on Netlify if you can. They offer a prerendering service.
First, you may look at prerendering, a service that renders your JavaScript in a browser, saves the static HTML, and returns it when it detects that a request is coming from a crawler. If you can host your React app on Netlify, you can use their free prerendering service (which caches prerendered pages for between 24 and 48 hours). Or you can check out prerender.io. This is the solution if you don't want to move to another framework and just want to get the SEO stuff working.
The other, more common way to deal with this problem is to do static site generation (SSG) or server-side rendering (SSR). This means the HTML content is generated using React DOM's server-side APIs. When the static HTML reaches the client side, the app calls the hydrate() method to turn it back into a functioning React app (see the sketch after the links below). The two most popular frameworks are Gatsby.js and Next.js. With these frameworks you write React and JSX code like you already do, but they offer more to power your app: SSG, SSR, API routes, plugins, etc. I'd recommend checking out their "Get started" tutorials. Moving from a create-react-app to one of these frameworks can take less than a day, depending on the size of your project.
Next.js Tutorials
Gatsby.js Tutorials
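As a sketch of the hydration step mentioned above (assuming React 18 and create-react-app conventions; App and the "root" element id are placeholders):

import React from "react";
import { hydrateRoot } from "react-dom/client";
import App from "./App";

// Instead of rendering from scratch, attach React to the HTML that was
// already generated on the server (or at build time).
// Before React 18 this was ReactDOM.hydrate(<App />, rootElement).
hydrateRoot(document.getElementById("root"), <App />);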
After implementing one of these solutions, visit the Facebook Sharing Debugger to ask Facebook to scrape your page again.

Related

Set dynamic meta tags in ReactJS when JS is disabled

I'm running into problems with Google crawlers and meta tags. I'm using ReactJS with react-helmet (no SSR).
React-helmet does work, but Google search does not seem to find the tags I added. I know that crawlers run the website without JS enabled; when testing this, I can see that react-helmet does not render the tags when JS is disabled (it works fine when JS is enabled).
The fact that react-helmet does not render tags when JS is disabled might be the reason why the description and titles are wrong in Google search, so finding a way to render the proper tags when JS is disabled might fix the problem.
Any idea on how to do that?
Thanks for the help.
You have two options if you want to make your React pages discoverable by search engines: server-side rendering (SSR) or prerendering.
Server-side rendering:
React can render pages on the server side, making it easier for search engine crawlers to discover and index your content. However, it requires a more advanced setup using a framework like Next.js, along with an additional server, which can be time-consuming and expensive.
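As an illustration, a server-rendered Next.js page can set its meta tags with the built-in next/head component. This is a minimal sketch, not a definitive implementation; the route, the fetchEvent helper, and the event fields are hypothetical:

import Head from "next/head";

// Hypothetical fetch helper; replace with your real data source.
async function fetchEvent(id) {
  const res = await fetch(`https://api.example.com/events/${id}`);
  return res.json();
}

// Runs on the server for every request, so the HTML a crawler
// receives already contains the meta tags.
export async function getServerSideProps({ params }) {
  const event = await fetchEvent(params.id);
  return { props: { event } };
}

// Assumes a dynamic route such as pages/events/[id].js
export default function EventPage({ event }) {
  return (
    <>
      <Head>
        <title>{event.name}</title>
        <meta property="og:title" content={event.name} />
        <meta property="og:image" content={event.coverUrl} />
      </Head>
      <h1>{event.name}</h1>
    </>
  );
}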
Prerendering
By contrast, using Prerender is painless and straightforward. Their software creates fully rendered, static HTML versions of your website for social media, search engine crawlers and more.
However, you have to install the Prerender.io middleware on your server.
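For a Node/Express server, wiring up that middleware follows the prerender-node README; a minimal sketch (the token, port, and build folder are placeholders):

const express = require("express");
const prerender = require("prerender-node");

const app = express();

// Intercepts requests from known crawler user agents and serves
// a prerendered snapshot from Prerender.io instead of the SPA shell.
app.use(prerender.set("prerenderToken", "YOUR_PRERENDER_TOKEN"));

app.use(express.static("build"));
app.listen(3000);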

Helmet with decoupled CMS (Drupal)

I have built a site on a CMS (Drupal) with React apps. I use Helmet to generate metadata (title / description) from my components and child components.
In the components I use this code:
<Helmet>
  <title>{...my custom title...}</title>
  <meta name="description" content={...my custom description...} />
</Helmet>
If I inspect the page in Chrome's dev tools, I see that the metadata is updated.
If I look at the source code of the page, the metadata isn't updated: it is generated outside the React app, in an HTML section, so the title and description keep their default values.
What is the right way to "update" the metadata in the head section? I only need to update specific metadata (title, description, canonical...); the other elements of the head section are generated by the CMS.
Thanks for your recipes and help.
You are confusing client-side code and server-side code.
Unless you SSR (server-side render) your pages, Helmet runs client-side, with JavaScript updating the DOM.
So when you view the source of the page, you will not see the title and description set by Helmet.
The good news is that most crawlers today can run JavaScript and read your Helmet metadata.
So to answer your question: you will need to server-side render your React app and hydrate it on the front end to see the Helmet output in the source of the HTML.
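A rough sketch of the server-usage pattern from the react-helmet docs (App and the HTML template here are placeholders):

import React from "react";
import { renderToString } from "react-dom/server";
import { Helmet } from "react-helmet";
import App from "./App";

// Render the app first; Helmet collects title/meta tags during this pass.
const body = renderToString(<App />);
const helmet = Helmet.renderStatic();

// Inject the collected tags into the <head> of the HTML you send back.
const html = `<!doctype html>
<html>
  <head>
    ${helmet.title.toString()}
    ${helmet.meta.toString()}
    ${helmet.link.toString()}
  </head>
  <body>
    <div id="root">${body}</div>
  </body>
</html>`;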
Another solution is to server-side render just the shell of the page (head, body, and one div with an id for React) and control the <head> section outside of React. That is a bit easier than server-side rendering the whole React app.

Sharing on social media: the URL does not render any metadata

We have built a project (web application) in React with .NET Core, using React for client-side rendering.
We've used react-helmet for dynamically assigning meta tags.
The issue is that the browser gets only the static HTML on the initial load, which does not include the dynamic meta tags we have set. However, on inspecting the rendered page, you do see those meta tags under "Elements".
Also, if we use these URLs for sharing on any social media, like WhatsApp or Facebook, the URL does not render any metadata as it should.
We tried searching for solutions to our problem; the most obvious answer we came across was to try server-side rendering instead. We get that, but it is not a solution we can try at this juncture, when we're ready to roll the app out.
Others we came across were react-snap and react-snapshot, but no luck:
with react-snap, React needs to be upgraded to version 16+, which we did, but I guess not all dependencies were upgraded; there was an error saying "hydrate is not a function" (hydrate concerns react-dom).
With react-snapshot, we could not find the necessary type definitions, which are required for it to function properly in React with .NET Core.
Please guide us toward the next probable step (except the paid ones like Prerender, etc.).
Main goal: social applications should render the metadata when we paste/share the URL within them.
Prerender is the only solution.
I used a Node dependency called "prerender" -> https://github.com/prerender/prerender
It works by running a web server that makes HTTP requests. Setting a boolean, window.prerenderReady = true;, in your website tells that server when the page is ready to "take the photo", and it returns the HTML at that point. You need to write a simple script that walks all of the site's URLs and saves their HTML contents to files. Upload them to your server and, using .htaccess or similar, target the crawler user agents (facebookexternalhit, twitterbot, googlebot, etc.) so they are shown the prerendered version, while 'the real site' is served to the rest of the user agents.
It worked for me.
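For illustration, the crawler targeting described above (done there with .htaccess) can be sketched as Express middleware; the user-agent list, snapshot paths, and port are assumptions:

const express = require("express");
const path = require("path");

const app = express();

// User-agent substrings for the crawlers we care about (extend as needed).
const BOTS = /facebookexternalhit|twitterbot|googlebot|linkedinbot/i;

app.get("*", (req, res, next) => {
  if (BOTS.test(req.headers["user-agent"] || "")) {
    // Crawlers get the saved snapshot for this URL.
    const page = req.path === "/" ? "index" : req.path;
    return res.sendFile(path.join(__dirname, "snapshots", `${page}.html`));
  }
  next(); // everyone else gets the normal SPA
});

app.use(express.static("build"));
app.listen(3000);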
The meta tags for Open Graph need to be present in the HTML which is sent back to the client when fetching a URL. Browsers or bots will not wait until the app is rendered on the client side to determine what the meta tags are; they only look at the initially loaded HTML.
If you need the content of your Open Graph metadata to be dynamic (showing different content depending on the URL, device, browser etc.) you need to add something like react-meta-tags into your server code.
There are no type definitions available for any of the React meta tag libraries, but you can add your own. It can be a bit tricky, but check out the official documentation and the templates they have provided to get started.
If you don't need it to be dynamic, you could add the tags into the static part of the <head> tag in your index.html.
I had the same issue today, with two React web applications that needed this. Here is how I solved it:
put your preview image in the public folder
still in the public folder, open index.html and add the line <meta property="og:image" content="preview.png"/>
or <meta property="og:image" content="%PUBLIC_URL%/preview.png"/>.
Go to https://www.linkedin.com/post-inspector/ to check if it works.
I hope this helps!

AngularJS ngMeta title/description is not crawled by Google

In my AngularJS app, I am using https://github.com/vinaygopinath/ngMeta.
<title ng-bind="ngMeta.title"></title>
<meta property="og:title" content="{{ngMeta.title}}" />
<meta property="og:description" content="{{ngMeta.description}}" />
My controller code is:
app.controller('controller1', function($scope, $location, $http, $routeParams, ngMeta, $route) {
  $scope.$route = $route;
  ngMeta.setTitle('site title');
  ngMeta.setTag('description', 'this is description');
  ngMeta.setTag('keywords', 'tag1, tsg2, tag3');
});
After the page loads everything works fine, but Google search results show the raw {{ngMeta.description}} and {{ngMeta.title}} bindings.
Any help solving this?
SEO in SPA apps does not work the traditional way. You have to tweak some code so that your website is crawled correctly. There are two steps to do that (a sketch of the server side follows below):
Add a meta tag in your head to tell the crawler "this is a highly JS-dependent site; you have to request this page differently."
Like: <meta name="fragment" content="!">
Pre-render your site (meaning: capture a static snapshot once the JS has fully loaded and run, with the correct title and description in the head) and serve it for URLs carrying the ?_escaped_fragment_= parameter.
E.g.: if you have www.example.com, your server needs to return the pre-rendered site for requests like www.example.com?_escaped_fragment_=
Why?
When the crawler bot sees the meta tag, it will request the page with the ?_escaped_fragment_= parameter, expecting to get the pre-rendered page with the real title and description hard-coded in the head instead of the raw {{ngMeta.title}} and {{ngMeta.description}} bindings.
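Concretely, the serving side of step 2 could look something like this Express sketch (the snapshot location, app folder, and port are assumptions):

const express = require("express");
const path = require("path");

const app = express();

app.get("*", (req, res, next) => {
  // The crawler re-requests the page with ?_escaped_fragment_= once it
  // has seen <meta name="fragment" content="!"> in the original page.
  if (req.query._escaped_fragment_ !== undefined) {
    // Serve the pre-rendered static snapshot for this URL.
    return res.sendFile(path.join(__dirname, "snapshots", "index.html"));
  }
  next(); // normal users get the regular AngularJS app
});

app.use(express.static("app"));
app.listen(3000);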
So, how do you pre-render your site?
Use https://prerender.io/ as Stepan Suvorov said, or
check out http://phantomjs.org/screen-capture.html or http://htmlunit.sourceforge.net/
Resources:
https://builtvisible.com/javascript-framework-seo/
https://developers.google.com/webmasters/ajax-crawling/docs/getting-started
I was facing a similar issue with React applications and SEO.
I would suggest the best way to handle this is to use prerendering.
I say this because if your single-page application makes API calls, the Google crawler does not wait for AJAX calls to finish, so technically it does not read any of your content that comes through API calls. It took me a few days to figure this out using Webmaster tools.
So you can use PhantomJS or prerender.io (which uses PhantomJS internally). What this does is render your page and replace the JavaScript-generated content with static content, thus serving proper and full content to the bots.
Alternatively, you can serve the page directly from the backend when it is loaded for the first time; this also helps.

Handle meta tags and document title in React for crawlers

I have a React (Redux) app that uses client-side rendering. I wanted my site's description and title to be crawlable by Google (they seem to crawl async stuff, because my site shows up fine in Google with text from my h1 tags), so I found a library called react-helmet, which builds on react-document-title. This library allows me to change the document title and description depending on what route I am currently on.
So here are my questions:
Currently (1 week later) my Google search results are unchanged, which makes me think either Google hasn't crawled my site, or Google crawled it but didn't notice the dynamic change of description and just used my h1 tags. How can I check which one has happened?
I notice Instagram has a client-side rendered app, yet somehow when I check the page source they have already changed the title and description on each page, even though the body tag is just an empty div, as is typical of a client-side rendered app. I don't get how they can do that.
For react-helmet, follow the React Helmet server rendering documentation: https://github.com/nfl/react-helmet#server-usage.
Use Google Search Console to see how Google crawls your site, and to initiate a crawl/index of your pages: https://www.google.com/webmasters/tools/
As for how Instagram can show meta tags in a client-side app: they probably render and serve static content server-side when they detect that a crawler or bot is viewing the page. You can do this yourself for your content without converting your entire app to server-side rendering; google prerendering services. (I won't mention any examples because I don't want to boost their SEO without having an opinion on them.)
Another option is to render your React app statically and serve it when necessary. See Graphcool's prep (seems slightly outdated), react-snap, and react-snapshot. All of them render the site on a local server and download the rendered HTML files. If all you need is the <head>, you should be good!
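For reference, react-snap's documented setup is small: add "postbuild": "react-snap" to your package.json scripts, and switch your entry point to hydrate when prerendered markup is present. A sketch assuming a create-react-app index.js:

import React from "react";
import { hydrate, render } from "react-dom";
import App from "./App";

// After react-snap runs as a postbuild step, the built HTML already
// contains rendered markup, so attach to it instead of re-rendering.
const rootElement = document.getElementById("root");
if (rootElement.hasChildNodes()) {
  hydrate(<App />, rootElement);
} else {
  render(<App />, rootElement);
}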
