Is a sitemap alone good enough for SEO with an Angular 1 app? - angularjs

Options for server-side rendering Angular apps:
1 - prerender.io
Free for up to 250 pages per month.
Here is a tutorial: https://scotch.io/tutorials/angularjs-seo-with-prerender-io
2 - https://github.com/steeve/angular-seo
Open source and free.
Uses the _escaped_fragment_ scheme, i.e.:
localhost/app.html#!/route becomes:
localhost/app.html?_escaped_fragment_=/route
On the server side, PhantomJS picks up this translated request and returns a fully
rendered HTML file for Google to crawl.
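As a sketch of the scheme above (this is illustrative code, not the angular-seo project's actual implementation), the hashbang-to-_escaped_fragment_ translation and its server-side inverse can be written as two small functions:

```javascript
// Crawler side: "#!/route" is re-requested as "?_escaped_fragment_=/route"
function toEscapedFragment(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url;            // no hashbang, nothing to translate
  var base = url.slice(0, i);
  var route = url.slice(i + 2);
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + route;
}

// Server side: recover the route so PhantomJS can render the right page
function fromEscapedFragment(url) {
  var m = url.match(/[?&]_escaped_fragment_=([^&]*)/);
  return m ? decodeURIComponent(m[1]) : null;
}

console.log(toEscapedFragment('http://localhost/app.html#!/route'));
// -> http://localhost/app.html?_escaped_fragment_=/route
console.log(fromEscapedFragment('http://localhost/app.html?_escaped_fragment_=/route'));
// -> /route
```

A real server would hand the recovered route to PhantomJS, wait for the app to finish rendering, and return the resulting HTML snapshot to the bot.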
However, has anyone tried running a plain Angular app with just a sitemap? I am building an API onto Kirby CMS, which has nice sitemap functionality... but I don't know if this will be enough for SEO...

It depends. Google will try to crawl AJAX calls:
https://developers.google.com/webmasters/ajax-crawling/docs/learn-more
However, the same can't be said for other search engines or crawlers.
A sitemap will not help you much, because it does not contain your site's content. A sitemap is basically there to give crawlers a map of what they should crawl/index.
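To make that concrete: per the sitemaps.org protocol, a sitemap entry is just a URL plus optional metadata, with no page content in it (the URLs below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/app.html#!/route</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
</urlset>
```

So a sitemap can tell a crawler *where* your Angular routes are, but the crawler still has to be able to render each route to index anything.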

Related

Could prerender.io slow down page load?

I just launched a site, bluesophy, for which I am using a self-hosted prerender instance to deliver HTML pages of my AngularJS app to Google bots. PageSpeed Insights scores from Google are rather poor: 33/100 for mobile and 27/100 for desktop. I know a couple of things I could do right now, but I wanted to know whether using prerender.io could affect page load speed. Has someone faced a similar issue, and how did they deal with it?

AngularJS application problems appearance in Google search

I have a personal project which has consumed my free time and effort for about a year without significant profit. I have problems with its appearance in Google and would really appreciate some help here.
This project (http://yuppi.com.ua - similar to Craigslist in the US) is a web-based AngularJS 1.2 application that uses a PHP REST API hosted on GoDaddy. To make this application popular, it has to be very visible on the internet, very searchable in Google, and users have to be able to share pages via social networks or Skype.
According to Google's specification, Google's crawlers don't run JavaScript to get the content of a web page before indexing, so I've added an _escaped_fragment_ page that displays the content of a web page without JavaScript. For example:
Page: http://yuppi.com.ua/#!/items/sub/18/_
Dirty: yuppi.com.ua/?_escaped_fragment_=/items/sub/18/_
This dirty page is redirected here, where Google will see the content:
http://yuppi.com.ua/server/crawler_proxy/routee.php?path=/items/sub/18/
So basically I have two versions of the HTML file for that page. One version is the one available to users, which has styles, a lot more HTML tags, etc. The second is the version for the Google crawler - very lightweight, without any styles. And I am expecting to see the clean link to my site in Google, not the dirty one.
However, if you search Google for all links to the site, you will see that one of the links displays its "dirty" state.
Another problem is sharing links in Skype.
When I send a link to someone, I expect that link to be turned into a thumbnail image, but that doesn't happen. Instead I see an ugly link to my web site.
Please help me understand how to make everyone happy: users, the Google crawler, GoDaddy, and me.
I ran into the same problems last year on a big project, and we ended up using https://prerender.io/.
It's a prerendering system that detects bot requests and uses a PhantomJS browser to render a full HTML template. It also maintains a cache so it doesn't re-render a template that hasn't changed.
Hope it helps.

Why is my angularjs site not completely crawlable?

I have created my first AngularJS website. I have set up pushState (HTML5 mode), added the fragment meta tag, created a sitemap in Google, and tested the "Fetch as Google" functionality. After a few days, my website is still not completely indexed by Google. Google indexed only 1 URL instead of 4 (my sitemap contains 4 URLs). My website is Tom IT. The main page is indexed, but a subpage that is also in the sitemap (you can find my sitemap at sitemap.xml in the root of my domain tom-it.be) does not appear in search results. I also added a robots.txt.
Google's crawlers can parse pages generated by an SPA and show them in the SERPs, but not immediately; it may take several days. In my experience, AngularJS may take 3 days, and EmberJS 7 days.
If you want your website crawled completely, the important information should be put in the HTML, or you should use other techniques, for example: prepare a separate page for crawlers, server pre-rendering, or PhantomJS.
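For a pushState (HTML5 mode) app like the one described above, the AJAX crawling scheme is opted into with a meta tag rather than a "#!" in the URL. This is the "fragment meta tag" the question mentions:

```html
<!-- In the <head> of an html5Mode page: tells the crawler to re-request
     this page as  /some/path?_escaped_fragment_=  so the server can
     return a pre-rendered snapshot instead of the empty app shell -->
<meta name="fragment" content="!">
```

Without a server that actually answers the `?_escaped_fragment_=` request with rendered HTML, the tag alone does nothing, which is one common reason only the static landing page gets indexed.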

Crawling a website that uses angular routes

I have a personal website that I use for some of my motorbike racing. I created it recently using Node and Angular. I decided to try Angular routes for my page navigation, etc. I think it worked well, but I'm annoyed that my website isn't showing up in Google search.
When I've looked into how to get Google to find a website, I've followed many suggestions (meta names, etc.), but when it came to a sitemap, I discovered that most crawlers have problems finding any links on my website to other pages.
You can see my website here - MPC Racing
I have tried using this automatic sitemap creator and it can't find anything apart from my main page - XML Sitemap
Do you have any suggestions on how I can make my website more easily found by search engines?
For example, a design company designed all the graphics for my bike, and if I type "Webstep Racing Team" into Google, I get the link to their website as the first hit but nothing at all for my website. What are they doing that I'm not? - Webstep Racing Team
In Google Webmaster Tools there is an option to 'Fetch as Google', which gives you an image of what Google sees when it crawls your Angular app.
However, for me the problem is that the crawler does not crawl the Angular links within the app.
By default, hashes are ignored by search engines, because normally they refer to parts of the same page.
You can follow Google's guidelines for AJAX-crawling URLs to get the hashed URLs indexed by Google. The same standard is also supported by Bing, according to a SearchEngineLand post.
And because you are using AngularJS, you might find Matias Niemelä's post on how to have your AngularJS application indexed very useful. Demo and source code.
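As a minimal sketch (assuming a standard AngularJS 1.x setup, not code from any of the linked posts), opting a hash-routed app into that AJAX crawling scheme means configuring the "#!" hash prefix so crawlers know the fragment carries a route:

```javascript
// Routes then appear as /app.html#!/route instead of /app.html#/route,
// which participating crawlers rewrite to ?_escaped_fragment_=/route
angular.module('app', ['ngRoute'])
  .config(['$locationProvider', function ($locationProvider) {
    $locationProvider.hashPrefix('!');
  }]);
```

This only makes the URLs crawlable in principle; the server still has to answer the escaped-fragment requests with rendered HTML.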

How do I create an image sitemap for a Drupal 7 site?

I am maintaining a Drupal 7 site with more than 500 pages. Most of the pages contain images.
I want to create an image sitemap for my site.
Example image sitemap reference from Google
That is, I want to list the images under each URL. I have tried curl to retrieve the images from each URL, but it takes too much time to execute for each URL.
Is there any easier and faster way to implement it?
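For reference, Google's image sitemap format extends an ordinary sitemap with an image namespace, listing images under each page URL (the URLs below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://example.com/some-page</loc>
    <image:image>
      <image:loc>http://example.com/images/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

This is the output a Drupal module would need to generate, which is usually much faster than crawling your own pages with curl, since the module can read image references straight from the database.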
You can use the contributed module Google Image Sitemap:
Google Image Sitemap
I'll do some tests, and in a few days I hope I can tell you something more.
