AngularJS and Google

I have a PHP website which currently does not use AngularJS. I'd like to change its design and, while I'm at it, start using AngularJS to make it a SPA. The website contains programming tutorials, so its position in Google search results is crucial. I've been searching recently to see whether AngularJS can be used in that case, and the most interesting post I found is this one: Google bot crawling on AngularJS site with HTML5 Mode routes
I'd like to know if my website will still be indexed the same way if I use AngularJS. I'll make sure the URLs stay the same, but will the Google bot be able to crawl my website the same way it does now? I need to be sure, because I don't want to lose all my traffic because Google can no longer crawl my site.

Google now runs JavaScript and can index AngularJS sites.
Some caveats, though: most top-ranked search results are still server-rendered, so how Google's ranking algorithm will treat your site is unclear. Also, other search engines such as Bing may not be able to read your site.
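For reference, the HTML5-mode routing that the linked question covers is enabled roughly like this. This is a minimal sketch; the module, route, and template names are illustrative, not taken from the question:

angular.module('tutorialApp', ['ngRoute'])
  .config(['$locationProvider', '$routeProvider',
    function ($locationProvider, $routeProvider) {
      // Serve clean URLs such as /tutorials/php instead of /#/tutorials/php,
      // which lets existing URLs be preserved when migrating to a SPA.
      $locationProvider.html5Mode(true);
      $routeProvider
        .when('/tutorials/:slug', {
          templateUrl: 'partials/tutorial.html',
          controller: 'TutorialCtrl'
        })
        .otherwise({ redirectTo: '/' });
    }]);

Note that html5Mode(true) also needs a <base href="/"> tag in the page and a server rewrite rule that returns the app shell for every route, otherwise deep links will 404.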

Related

Angular URLs in sitemap.xml with hashbang or not?

How should I put Angular URLs in the sitemap.xml?
Now I've added this:
https://www.domain.com/user-area#!/logon
but I think Google doesn't like it.
I've read about the _escaped_fragment_ but I don't understand what that means.
The _escaped_fragment_ was recently deprecated by Google. See their blog post here:
http://googlewebmastercentral.blogspot.com/2015/10/deprecating-our-ajax-crawling-scheme.html
Anyway, from what you wrote, it seems like your server structure isn't prepared for the _escaped_fragment_. I won't go into too much detail about it here, since it was deprecated after all.
For background: Google's bots weren't always able to process websites with AJAX content (content rendered via JavaScript). To create a workaround, Google proposed adding the hashbang #! to all AJAX sites. Bots would detect the hashbang and know that the website's content was rendered through AJAX. The bots would then request a pre-rendered version of the AJAX pages by replacing the hashbang with the _escaped_fragment_. However, this required the server hosting the AJAX pages to know about the _escaped_fragment_ and be able to serve up a pre-rendered page, which was a difficult process to set up and execute.
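For illustration only, since the scheme is deprecated: the server-side mapping looked roughly like this, sketched here as Express middleware (servePrerenderedPage is a hypothetical helper, not a real API):

var express = require('express');
var app = express();

// Crawlers rewrote /user-area#!/logon into /user-area?_escaped_fragment_=/logon.
// The server had to spot that query parameter and answer with a
// pre-rendered HTML snapshot instead of the empty AJAX shell.
app.use(function (req, res, next) {
  var fragment = req.query._escaped_fragment_;
  if (fragment !== undefined) {
    // Hypothetical helper that looks up the snapshot for the
    // original hashbang URL.
    return servePrerenderedPage(req.path + '#!' + fragment, res);
  }
  next(); // regular visitors get the normal AngularJS page
});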
Now, according to the deprecation blog post, the URL you entered in your sitemap.xml should be fine, since Google's bots should be able to "crawl, render, and index the #! URLs". If you really want to know whether Google can understand your website, I'd recommend using their webmaster console, located at https://www.google.com/intl/en/webmasters/ . Using that tool, you can register your site with Google, observe how Google indexes your site, and be notified if any problems arise.
In general though, getting Google to index an AJAX site is a pain. I'd strongly recommend using the webmaster console and referring to it frequently. It does help.

Fetch as Google in Webmaster Tools

I have an AngularJS SPA site which I wanted to test using Google's "Fetch as Google" feature in Webmaster Tools. I am a little confused about the results. The screenshot from Googlebot looks correct; however, the response doesn't include any of the contents inside the "ui-view" (ui-router). Can someone explain what is happening here? Is Google indexing the site properly, since the screenshot is correct? Or is Google not able to execute the JS properly for indexing?
This is a mixed bag. From some tests I've seen, the GoogleBot is able to index some AJAX-fetched content in some cases. A safer bet, though, to make all the search engines happy, is to use prerender.io or download their open-source stack (which uses PhantomJS) so your site is easily indexable. Basically, this saves the version of your site after async operations have completed for a given URL, and you then set up a redirect on your server that points any of the potential search-engine bots to the pre-processed page. It sounds pretty complicated, but following the instructions on the site it's not too hard to set up, and if you don't want to pay prerender.io to serve cached copies of your pages to search engines, you can run the server component yourself too.
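If you go the prerender.io route with a Node server, the wiring is roughly this (a sketch; the token is a placeholder, and prerender-node is their Express middleware package):

var express = require('express');
var app = express();

// prerender-node checks the User-Agent and, for known crawler bots,
// proxies the request to a prerender server that returns the page
// after its async operations have completed.
app.use(require('prerender-node').set('prerenderToken', 'YOUR_TOKEN'));
// Or, if you self-host the open-source PhantomJS-based server:
// app.use(require('prerender-node').set('prerenderServiceUrl', 'http://localhost:3000'));

app.use(express.static('public')); // the AngularJS app itself
app.listen(8080);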

Rejected request to participate in Google AdSense (Phase 1)

I am trying to register my web app for Google AdSense.
After five attempts I am still stuck in the first phase with the error: insufficient content.
My web app is a search engine for YouTube, developed with AngularJS and Node.js.
AngularJS fully embraces the asynchronous model, and this is what creates problems for Google's crawlers. So I integrated the prerender.io library and set up the meta tag to deal with Angular's #:

<meta name="fragment" content="!">
I also prepared a sitemap.xml on the site, and indexed the pages through Google's webmaster portal.
Since the site is a search engine, it has no static content as such, so I also added some text and descriptions to the main page, which hosts the other partial views.
Despite these changes, I could not pass the first phase. How can I fix this? Am I missing some necessary steps for an Angular web app?
To be approved for Google AdSense you need roughly 10 topics of about 400 words each, and the content should be unique. According to the Google AdSense TOS, it is not allowed to put ads on pages with copied content, so if you provide search results you will be providing copied content.

appspot.com url shows up in google search results instead of custom domain name

I have set up http://www.footballverdict.com and it's hosted on Google App Engine. Everything works fine. You can visit the custom domain without problems. For some reason when I do a search on Google for "football verdict", the results show startorsit.appspot.com/ask and startorsit.appspot.com/about. There is no footballverdict.com in sight for the main site! It's been at least two months since I hooked up the custom domain. The blog sub-domain does show up in the search results, but that's because it's not hosted on Google App Engine.
Does anyone know how to get the custom domain into the search results and remove the appspot.com sub-domain?
The easiest way to handle this is to have your app detect whether it's being requested on appspot.com and, if it is, send a 301 redirect to your canonical domain. Search engines will pick up on this and start listing your canonical site instead.
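The question doesn't say which App Engine runtime the app uses, so this is just the idea, sketched in Express style with the host names from the question:

var express = require('express');
var app = express();

app.use(function (req, res, next) {
  if (req.hostname === 'startorsit.appspot.com') {
    // A permanent (301) redirect tells search engines which
    // domain is the canonical one.
    return res.redirect(301, 'http://www.footballverdict.com' + req.originalUrl);
  }
  next();
});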
The answer? Canonical URLs.
Google Webmaster Tools has a great little blurb about it here, and Yoast has another one here.
I hope this gets you pointed in a good direction.
Best of Luck! ~Isaac
Are you using Google Webmaster Tools? I think this might help: http://www.google.com/support/webmasters/bin/answer.py?answer=83106
It will still take some time for the updates to get into Google, though (up to 180 days).

Can Google be used for site search on a database backed website?

I'm developing a website with Google App Engine, and I want to have a search feature for user-submitted content. Since this project is just a toy and I don't control the server, I'd like to just use Google to handle search. However, since the content is stored in the database, I don't think Google can discover the dynamic URLs. Unless maybe I create a page that links to the last N submissions and hope it gets crawled frequently. Thoughts?
Absolutely. As long as the database is exposed in a web page which can be crawled, Google will crawl it (unless told not to).
The best way to make it all accessible is decent navigation between pages. However, lacking that, a site map page linked from the home page should suffice.
This is an excellent candidate for a sitemap.
You can generate the XML any way you want, and give it to Google. The best part is, it is a "private" XML file; no need to have ugly listings of dynamic URLs for users to see.
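As a sketch of that idea in Node (getRecentSubmissions and the URL scheme are hypothetical; adapt them to your datastore, and app is assumed to be an Express app):

var express = require('express');
var app = express();

// Build a minimal sitemap.xml from a list of URLs.
function buildSitemap(urls) {
  var entries = urls.map(function (u) {
    return '  <url><loc>' + u + '</loc></url>';
  }).join('\n');
  return '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    entries + '\n</urlset>';
}

// Serve it dynamically so new submissions appear without manual edits.
app.get('/sitemap.xml', function (req, res) {
  getRecentSubmissions().then(function (rows) {
    var urls = rows.map(function (row) {
      return 'https://example.com/item/' + row.id; // illustrative URL scheme
    });
    res.type('application/xml').send(buildSitemap(urls));
  });
});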
