I have defined robots.txt as shown below to block robots from the login URL:
user-agent: *
disallow: /#!/login
Sitemap: http://mysite.ir/sitemap.xml
but Google does not recognize the # sign and ignores the whole site. How can I fix that problem?
As far as I know, AngularJS is not great for SEO, but you can still do some basic SEO.
Anyway, this post may help you to create a robots.txt for AngularJS: Robots.txt in AngularJS
I resolved this problem with $locationProvider.html5Mode(true); as you can see below:
(function() {
    'use strict';

    angular
        .module('myapp')
        .config(stateConfig);

    function stateConfig($locationProvider) {
        $locationProvider.html5Mode(true);
    }
})();
and set this in the head section of index.html:
<base href="/">
With these changes AngularJS removes the # from the URLs. Keep in mind that you also need to make some changes on the server side so that deep links are rewritten to index.html.
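Once the hash is gone, the login URL from the question becomes a plain /login path, so the robots.txt can target it directly. A minimal sketch based on the file in the question:

User-agent: *
Disallow: /login
Sitemap: http://mysite.ir/sitemap.xml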
I am trying to remove the hashtags from my AngularJS app's URLs using $locationProvider, and it works well until I refresh a page manually. When I refresh the page, my AngularJS JS and CSS files are not loading.
Please help me with that.
In index.jsp
App.js
myapp.config(function($routeProvider, USER_ROLES, IdleProvider, KeepaliveProvider, $locationProvider) {
    $locationProvider.html5Mode(true);
});
urlrewrite.xml
<urlrewrite>
    <rule>
        <from>/login</from>
        <to>resources/partials/login.html</to>
    </rule>
    <rule>
        <from>/login/brandname</from>
        <to>resources/partials/login.html</to>
    </rule>
    <rule>
        <from>/home</from>
        <to>resources/partials/home.html</to>
    </rule>
</urlrewrite>
web.xml:
<web-app xmlns="http://java.sun.com/xml/ns/javaee"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_3_0.xsd"
         version="3.0">
    <display-name>Archetype Created Web Application</display-name>
    <filter>
        <filter-name>UrlRewriteFilter</filter-name>
        <filter-class>org.tuckey.web.filters.urlrewrite.UrlRewriteFilter</filter-class>
        <init-param>
            <param-name>confPath</param-name>
            <param-value>/WEB-INF/urlrewrite.xml</param-value>
        </init-param>
    </filter>
    <filter-mapping>
        <filter-name>UrlRewriteFilter</filter-name>
        <url-pattern>/*</url-pattern>
        <dispatcher>REQUEST</dispatcher>
        <dispatcher>FORWARD</dispatcher>
    </filter-mapping>
</web-app>
Adding a <base> tag could also help you solve the # problem. Also, have you checked this link? https://scotch.io/tutorials/pretty-urls-in-angularjs-removing-the-hashtag
We have a catalog that loads an AngularJS application dynamically according to the defined subdomain. For example: http://subdomain.billiving.biz
We need to support a dynamic title that will be set by AngularJS. I saw in Google's documentation that we need to add this tag:
<meta name="fragment" content="!">
Anything else we should do to set this correctly?
Thanks
You have to follow either the hashbang URL convention or HTML5 URLs.
Something like:
angular.module('app', []).config(['$locationProvider', function($locationProvider) {
    $locationProvider.hashPrefix('!');
}]);
Here is a good guide to follow: Angular SEO
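For the dynamic title part of the question, one common pattern (a sketch only; ngRoute is assumed, and the title property, route path, and template name are illustrative) is to attach a title to each route and update document.title on $routeChangeSuccess:

angular.module('app', ['ngRoute'])
    .config(['$routeProvider', '$locationProvider', function($routeProvider, $locationProvider) {
        $locationProvider.hashPrefix('!');
        // 'title' is a custom property we attach to the route definition.
        $routeProvider.when('/products', { templateUrl: 'products.html', title: 'Products' });
    }])
    .run(['$rootScope', function($rootScope) {
        // Update the document title whenever the route changes.
        $rootScope.$on('$routeChangeSuccess', function(event, current) {
            if (current && current.title) {
                document.title = current.title;
            }
        });
    }]);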
I eventually ended up using Push-State. This includes two changes:
Enabling Html5Mode:
.config(['$locationProvider', function($locationProvider) {
    $locationProvider.html5Mode({enabled: true});
}]);
Providing a base href in your app (when working locally you may need the full localhost path):
<base href="/">
That's it.
I want to remove the # from my AngularJS app. I am developing a website and I think removing it may help to make the website SEO friendly. Maybe I am wrong. Please help.
Thanks
I agree with Joe Lloyd on the way to remove the # from your URL, but on its own that won't make your AngularJS website crawlable.
Check out the following steps:
Configure html5Mode:
app.config(function ($routeProvider, $locationProvider) {
    $locationProvider.html5Mode(true);
});
Add <meta name="fragment" content="!"> to your pages so the Google bot knows to append ?_escaped_fragment_= to its requests.
Handle ?_escaped_fragment_= on the server side and serve a static snapshot of the requested HTML page (PhantomJS can do the job) so the bot indexes the rendered version of the page (see the sketch after these steps).
Populate your sitemap.xml and submit it to Google via www.google.com/webmasters/
Have fun!
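For step 3, here is a minimal Express sketch of the ?_escaped_fragment_= handling. The snapshots/ directory and its file naming are assumptions for illustration; the snapshots themselves would be generated ahead of time (for example with PhantomJS):

var express = require('express');
var path = require('path');
var app = express();

// Crawler requests carry ?_escaped_fragment_=...; serve a pre-rendered snapshot for them.
app.use(function(req, res, next) {
    var fragment = req.query._escaped_fragment_;
    if (fragment === undefined) return next(); // regular visitor: fall through to the SPA
    var snapshot = (fragment.replace(/\//g, '_') || 'index') + '.html';
    res.sendFile(path.join(__dirname, 'snapshots', snapshot));
});

// Everyone else gets the AngularJS app.
app.use(express.static(path.join(__dirname, 'public')));
app.listen(3000);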
Config
To remove the # from your URL you need to add this to your main module; it puts the app into HTML5 mode.
// This config is used to remove the # from the URL
app.config(["$locationProvider", function($locationProvider) {
    $locationProvider.html5Mode({
        enabled: true,
        requireBase: false
    });
}]);
Server Side (NodeJS)
HTML5 mode also has a server-side effect: when you hit refresh, the browser asks the server for the deep URL directly, so you need an extra route that can answer it. The following works in your Express app.js file, where the public folder contains your Angular app and its index.html.
// ### CATCH REFRESH TO INDEX ###
app.all('/*', function(req, res, next) {
    // Just send the index.html for other files to support HTML5Mode
    res.sendFile('public/index.html', { root: __dirname });
});
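One caveat (an assumption about a typical Express setup, not part of the original answer): register the static asset middleware before this catch-all, otherwise requests for the JS and CSS files will also get index.html back:

// Serve static assets first, then fall back to the SPA shell for everything else.
app.use(express.static(__dirname + '/public'));
app.all('/*', function(req, res) {
    res.sendFile('public/index.html', { root: __dirname });
});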
I'm using $routeProvider and $locationProvider to handle pushstate URLs in a single-page app (SPA), something like this:
angular.module('pets', [])
    .config(function($routeProvider, $locationProvider) {
        $locationProvider.html5Mode(true);
        $routeProvider.when('/pet/:petId', {
            controller: 'petController'
        });
    })
    .controller('petController', function($scope, petService, $routeParams) {
        petService.get('/api/pets/' + $routeParams.petId).success(function(data) {
            $scope.pet = data;
        });
    });
The URL is used to pull content from the server which may or may not exist.
If this was an ordinary multipage website, a request for missing content would trigger a 404 header response from the server, and a request for moved content would trigger a 301. This would alert Google to the missing or moved content.
Say for example I hit a URL like this:
http://example.com/pet/123456
and say there is no such pet in the database, how can my SPA return a 404 for that content?
Failing this, is there some other way to correctly alert the user or search engine that the requested URL doesn't exist? Is there some other solution I'm not considering?
The real question is: does http://example.com/pet/123456 return anything at all?
If your starting point is http://example.com/ and there's a link to http://example.com/pet/123456 then Angular will call the petController which in turn makes an AJAX call to http://example.com/api/pet/123456.
A crawler wouldn't do that but instead would try to call http://example.com/pet/123456 directly.
So your server must be able to handle that call. If there is no pet with the id 123456 then it should return 404. Problem solved. If there is then it should return the SPA. The application should then handle the situation accordingly.
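A minimal sketch of that server-side check, assuming Express and a hypothetical petStore.exists() lookup (the names are illustrative, not from the answer):

app.get('/pet/:petId', function(req, res) {
    petStore.exists(req.params.petId, function(err, found) {
        if (err) return res.sendStatus(500);
        // No such pet: the crawler (and the user) gets a real 404 response.
        if (!found) return res.status(404).sendFile('404.html', { root: __dirname });
        // Pet exists: serve the SPA shell and let Angular render the detail view.
        res.sendFile('index.html', { root: __dirname });
    });
});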
According to this answer to How do search engines deal with AngularJS applications?, you should use a headless browser to process crawler requests and serve back snapshots of the page with the appropriate response code. https://developers.google.com/webmasters/ajax-crawling/docs/html-snapshot
The Google example does not cover the 301, 302 or 404 cases. However, their code could be modified to analyze the content of the snapshot and change the response code.
I found that prerender.io offers this service, but it is not free. However, they have a free plan if you have fewer than 250 pages. Prerender asks that, in case of a 404 or 301, you add a meta tag to the DOM:
<meta name="prerender-status-code" content="404">
This meta tag is then detected by their headless browser and the response code is changed.
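A sketch of how the controller from the question could set that tag when the pet is missing; appending the meta element straight to document.head is our own simplification, and petService exposing an .error() callback alongside .success() is an assumption:

.controller('petController', function($scope, petService, $routeParams) {
    petService.get('/api/pets/' + $routeParams.petId)
        .success(function(data) { $scope.pet = data; })
        .error(function() {
            // Tell the prerendering service to return a 404 for this snapshot.
            var meta = document.createElement('meta');
            meta.name = 'prerender-status-code';
            meta.content = '404';
            document.head.appendChild(meta);
        });
});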
Try this:
angular.module('pets', [])
    .config(function($routeProvider, $locationProvider) {
        $locationProvider.html5Mode(true);
        $routeProvider.when('/pet/:petId', {
            controller: 'petController'
        }).otherwise({ templateUrl: '/404.html' }); // Render the 404 view for unknown routes
    });
I am trying to get my AngularJS app, with html5Mode enabled, indexed by Google.
In my app.js I have the following snippet:
$locationProvider.html5Mode(true);
In my index.html head I have the following:
<base href="/"/>
<meta name="fragment" content="!"/>
I expected to get requests on URLs of the format /?_escaped_fragment_=support. Instead I'm getting requests on /support?_escaped_fragment_=.
Is there something wrong with my config, or were my expectations wrong?
According to the yearofmoo SEO article you should get this:
http://yourwebsite.com/?_escaped_fragment_=/some/page/with/ajax/content
Note, though, that this format applies to hashbang (#!) URLs. For html5Mode pages that opt in with <meta name="fragment" content="!">, Google's AJAX crawling scheme appends an empty ?_escaped_fragment_= to the pretty URL, so /support?_escaped_fragment_= is the expected request.
YearofMoo - SEO