Nowadays Google keeps making changes to its products almost every day, but after a long time a big update has arrived in Google Webmaster Tools. Earlier there was a lot of confusion about what to write in robots.txt to block a particular page, but now you can test each and every URL of your website to see whether it is blocked or not.
Let's welcome the robots.txt testing tool in Webmaster Tools.
How it works: We already had a feature to check crawling for our website (Fetch as Google). Right after it, Webmaster Tools now shows the robots.txt Tester option. Click on that tab and you will see your existing robots.txt code there. At the bottom you will find an option to test any page of your website: enter any subpage, or leave the box blank to test your home page. If the page is accessible to the crawler, the result will show as Allowed and the rule that lets the page through will be highlighted in green.
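For example, with a robots.txt like the short sketch below (the /private/ folder and the sitemap URL are just placeholder values for illustration), testing a page such as /about.html would come back as Allowed, since no Disallow rule matches it:

User-agent: *
Disallow: /private/

Sitemap: http://www.example.com/sitemap.xml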
In the same way, if you test a blocked page, the result will show as Blocked and the rule that blocks the page will be highlighted in red.
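If you would like to double-check the same result outside Webmaster Tools, Python's standard urllib.robotparser module can answer the same Allowed/Blocked question for a given URL. The rules and URLs below are made up just for illustration:

from urllib import robotparser

# A made-up robots.txt: everything under /private/ is blocked for every crawler.
rules = """User-agent: *
Disallow: /private/"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# can_fetch() returns True (Allowed) or False (Blocked) for a user agent and URL.
print(rp.can_fetch("Googlebot", "http://www.example.com/about.html"))         # True  -> Allowed
print(rp.can_fetch("Googlebot", "http://www.example.com/private/page.html"))  # False -> Blocked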
To read more about this, visit the official blog post at http://goo.gl/etNDar