I am using prerender for an ExpressJS app. I've kept <meta name="fragment" content="!"> in the index.html page. It works fine when I use mywebsite.com/?_escaped_fragment_=, but when I use Fetch as Google for mywebsite.com/ I see no data. For the ugly format with ?_escaped_fragment_= it works like a charm.
Will Googlebot crawl mywebsite.com as mywebsite.com/?_escaped_fragment_= when it is being indexed, or do I need to do something for that?
Fetch as Google has a known issue where it doesn't automatically check for the fragment meta tag. If you enter ?_escaped_fragment_= at the end of the URL in Fetch as Google, it should see the prerendered page correctly.
The real Googlebot does not have that bug: it will see the fragment meta tag and successfully crawl the ?_escaped_fragment_= URL on its own.
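For reference, hooking prerender.io into Express is typically a one-line middleware via the prerender-node package. A minimal sketch (the token value is a placeholder):

// server.js: a minimal Express setup with prerender-node (sketch).
const express = require('express');
const prerender = require('prerender-node');

const app = express();

// The middleware detects ?_escaped_fragment_= requests and known crawler
// user-agents, and proxies those requests to the prerender service.
app.use(prerender.set('prerenderToken', 'YOUR_PRERENDER_TOKEN')); // placeholder token

app.use(express.static('public')); // regular browsers get the normal SPA
app.listen(3000);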
I have a project in ReactJS with multiple pages. I use React routing to handle the different URL paths.
I have already deployed the app on Tomcat 10 and made the following configuration in web.xml:
<error-page>
<error-code>404</error-code>
<location>/index.html</location>
</error-page>
I'm able to open any of the website's URLs directly in the address bar. However, I have a problem with meta tags, because the server first redirects any request to index.html and only afterwards loads my sub-page (for example /contact). If I put the /contact URL into Facebook, I see the meta tags from index.html, but I want to see the meta tags from the Contact.jsx page. I think Facebook takes the meta tags from the first page the server returns and does not follow the redirect to the second page (/contact).
How can I resolve this?
I could update the meta tags using JS in index.html, but I don't like this idea, because I have already put meta tags into Contact.jsx using react-meta-tags.
Maybe somebody knows a better solution?
I'm trying to get a thumbnail when sending a website link in a Facebook message. I started to use s-yadav/react-meta-tags and followed the tutorial, but the image is not present after the link is sent.
Link: https://ticket-44-portal-staging.herokuapp.com/buyTicket?status=1959165690885328
I applied the following code in a React component:
return (
  <div className="container">
    <MetaTags>
      <title>{searchedEventTime.name}</title>
      <meta name="description" content={searchedEventTime.description} />
      <meta property="og:title" content={searchedEventTime.name} />
      <meta
        property="og:image"
        content={`https://ticket-t01.s3.eu-central-1.amazonaws.com/${eventId}.cover.jpg`}
      />
    </MetaTags>
  </div>
);
I can see the rendered meta tags in the HTML, so why isn't it working?
It is because the website is a single-page app. Before the JavaScript is loaded, everything rendered by React is not there yet (including those meta tags). You can verify this by right-clicking the page and selecting "view source": you will see that inside the body there is only a <div id="root"></div>. The problem is that many search engines and crawlers don't actually run JavaScript when they crawl. Instead, they look at what's in the initial HTML file. That's why Facebook cannot find that og:image tag. There are two ways to solve this problem.
TL;DR: Host your app on Netlify if you can. They offer a prerendering service.
First, you may look at prerendering, a service that renders your JavaScript in a browser, saves the static HTML, and returns that when it detects that the request is coming from a crawler. If you can host your React app on Netlify, you can use their free prerendering service (which caches prerendered pages for between 24 and 48 hours). Or you can check out prerender.io. This is the solution if you don't want to move to another framework and just want to get the SEO stuff working.
The other, more common way to deal with this problem is to do static site generation (SSG) or server-side rendering (SSR). Both mean that the HTML content is generated ahead of time using React DOM's server-side APIs. When the static HTML content reaches the client side, it calls the hydrate() method to turn it back into a functioning React app. The two most popular frameworks are Gatsby.js and Next.js. With these frameworks, you will be writing React and JSX code like you already do, but they offer more to power your app, including SSG, SSR, API routes, plugins, etc. I'd recommend checking out their "Get started" tutorials; a minimal Next.js sketch follows the tutorial links below. Transferring from a create-react-app to one of these frameworks can take less than a day, depending on the size of your project.
Next.js Tutorials
Gatsby.js Tutorials
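For instance, with Next.js each page declares its own tags through next/head, and they are present in the server-rendered HTML that crawlers receive. A minimal sketch (the page name and content here are made up):

// pages/contact.js: per-page meta tags in Next.js (hypothetical page).
import Head from 'next/head';

export default function Contact() {
  return (
    <>
      <Head>
        <title>Contact us</title>
        <meta name="description" content="How to reach us" />
        <meta property="og:title" content="Contact us" />
        <meta property="og:image" content="https://example.com/contact-cover.jpg" />
      </Head>
      <h1>Contact</h1>
    </>
  );
}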
After implementing one of these solutions, visit the Facebook Sharing Debugger to ask Facebook to scrape your page again.
I have a page at the moment with different kinds of "items", with a unique URL for each item.
www.example.com/mysite/items/banana-split/images
www.example.com/mysite/items/apple-pie/images
I'm using AngularJS with ngRoute and ngMeta + JS + HTML5 + CSS only and I host my site at one.com.
It works fine at the moment: when I click on an item, the title changes, and I try to change the metadata as well.
But when I link my products on WhatsApp or Facebook or similar services, it only displays:
{{ngMeta.title}} {{ngMeta.description}}
I know it has to do with prerendering, and my page isn't prerendered. I've looked at prerender.io etc., but since it runs on NodeJS I don't think I can host it on one.com. Maybe on AWS, but one.com is what I pay for.
Is there any way for me to fix this from the .htaccess file or from the pure frontend?
In my AngularJS app, I am using https://github.com/vinaygopinath/ngMeta.
<title ng-bind="ngMeta.title"></title>
<meta property="og:title" content="{{ngMeta.title}}" />
<meta property="og:description" content="{{ngMeta.description}}" />
My controller code is:
app.controller('controller1', function ($scope, $location, $http, $routeParams, ngMeta, $route) {
  $scope.$route = $route;
  ngMeta.setTitle('site title');
  ngMeta.setTag('description', 'this is description');
  ngMeta.setTag('keywords', 'tag1, tag2, tag3');
});
After the page loads everything works fine, but Google is showing {{ngMeta.title}} {{ngMeta.description}} literally.
Any help solving this?
SEO in SPA apps does not work the traditional way; you have to tweak some code so that crawlers can index your website correctly. There are two steps to do that:
Add a meta tag in your head to tell the crawler that "this is a highly JS-dependent site, you have to request this page differently."
Like: <meta name="fragment" content="!">
You have to pre-render your site (that is, produce a static snapshot of the page as it looks after the JS has fully loaded and run, with the correct title and description in the head) for URLs with the ?_escaped_fragment_= parameter on them.
I.e.: if you have www.example.com, your server needs to return the pre-rendered site for requests like www.example.com?_escaped_fragment_= (a minimal serving sketch appears below).
Why:
When the crawler bot sees the meta tag, it will request the page with the ?_escaped_fragment_= parameter, expecting to get the pre-rendered page where {{ngMeta.title}} and {{ngMeta.description}} have been replaced by their hard-coded values.
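To illustrate step 2, here is a minimal Express sketch that answers such requests from a directory of saved snapshots (the snapshots directory and routes are assumptions for illustration):

// server.js: serve pre-rendered snapshots for ?_escaped_fragment_= requests (sketch).
const express = require('express');
const path = require('path');

const app = express();

app.use((req, res, next) => {
  if ('_escaped_fragment_' in req.query) {
    // e.g. www.example.com/?_escaped_fragment_=/contact
    const fragment = req.query._escaped_fragment_ || '/index';
    // A real server would sanitize this path before using it.
    return res.sendFile(path.join(__dirname, 'snapshots', fragment + '.html'));
  }
  next(); // regular browsers get the normal SPA
});

app.use(express.static('public'));
app.listen(3000);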
So, how to pre-render your site?
Use https://prerender.io/ as Stepan Suvorov said, or
check out http://phantomjs.org/screen-capture.html or http://htmlunit.sourceforge.net/
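As a starting point, a PhantomJS capture script can be as small as this (a sketch; the 2-second wait is a crude stand-in for properly detecting when Angular has finished rendering):

// snapshot.js: run with `phantomjs snapshot.js http://localhost:8080/#!/contact`
var page = require('webpage').create();
var system = require('system');
var url = system.args[1];

page.open(url, function (status) {
  if (status !== 'success') {
    phantom.exit(1);
  }
  // Wait for the SPA's JS to run, then dump the fully rendered HTML,
  // with the real title and description instead of {{ngMeta.*}} placeholders.
  setTimeout(function () {
    console.log(page.content);
    phantom.exit();
  }, 2000);
});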
Resources:
https://builtvisible.com/javascript-framework-seo/
https://developers.google.com/webmasters/ajax-crawling/docs/getting-started
I was facing a similar issue with React applications and SEO.
I would suggest that the best way to handle this is to use prerendering.
I am saying this because if your single-page application makes API calls, the Google crawler does not wait for the AJAX calls to finish, so it effectively does not read any of your content that comes through API calls. It took me a few days to figure this out using Webmaster tools.
So, you can use PhantomJS or prerender.io (which uses PhantomJS internally). What this does is run your JavaScript in a headless browser and capture the resulting HTML as static content, thus serving proper and full content to the bots.
Alternatively, you can serve the page directly from the backend when it is loaded for the first time; that also helps.
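As a sketch of that last idea, the backend can check the User-Agent header and hand bots a pre-rendered snapshot while regular visitors get the normal app (the bot list and snapshot path are assumptions; Express is used here for illustration):

// A minimal user-agent check in Express (sketch; a real bot list would be longer).
const express = require('express');
const path = require('path');

const app = express();
const BOTS = /googlebot|bingbot|facebookexternalhit|twitterbot|whatsapp/i;

app.use((req, res, next) => {
  if (BOTS.test(req.headers['user-agent'] || '')) {
    // Bots get a static snapshot rendered ahead of time (path is hypothetical).
    return res.sendFile(path.join(__dirname, 'snapshots', 'index.html'));
  }
  next(); // everyone else gets the client-side app
});

app.use(express.static('build'));
app.listen(3000);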
I have a React (Redux) app that uses client-side rendering. I wanted my site's description and title to be crawlable by Google (they seem to crawl async stuff, because my site shows up fine in Google with text from my h1 tags), so I found a library called react-helmet, which builds on react-document-title. This library allows me to change the document title and description depending on what route I am currently on.
So here are my questions:
Currently (1 week later) my Google search results are unchanged, which makes me think either Google hasn't crawled my site, or Google crawled it but didn't notice the dynamic change of description and just used my h1 tags. How can I check which one has happened?
I notice Instagram has a client-side rendered app, but somehow when I check the page source they have already changed the title and description on each page, even though the body tag is just an empty div, as is typical of a client-side rendered app. I don't get how they can do that.
Follow the React Helmet server rendering documentation: https://github.com/nfl/react-helmet#server-usage.
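The core of the server-side usage looks roughly like this (a sketch based on that documentation; App stands in for your root component):

// server.jsx: collect Helmet's tags during server rendering (sketch).
import React from 'react';
import { renderToString } from 'react-dom/server';
import { Helmet } from 'react-helmet';
import App from './App'; // your root component (assumption)

const body = renderToString(<App />);
const helmet = Helmet.renderStatic(); // call right after renderToString

const html = `<!doctype html>
<html ${helmet.htmlAttributes.toString()}>
  <head>
    ${helmet.title.toString()}
    ${helmet.meta.toString()}
  </head>
  <body><div id="root">${body}</div></body>
</html>`;
// send `html` as the response from your server route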
Use Google Search Console to see how Google crawls your site, and to initiate a crawl/index of your pages: https://www.google.com/webmasters/tools/
As for how Instagram can show meta tags in a client-side app: they probably render and serve static content server-side when they detect that a crawler or bot is viewing the page. You can do this yourself for your content without converting your entire app to server-side rendering. Google "prerendering services". (I won't mention any examples because I don't want to boost their SEO without having an opinion on them.)
Another option is to render your React app statically and serve it when necessary. See Graphcool's prep (seems slightly outdated), react-snap, and react-snapshot. They all render the site on a local server and save the rendered HTML files. If all you need is the <head>, then you should be good!