robots.txt restriction error
2011-08-22, 06:47 AM
I've recently run into a "restricted by robots.txt" error. Because of it, Google couldn't crawl my URL. What should I do?
2011-08-23, 11:10 AM
URLs restricted by robots.txt errors
Google was unable to crawl the URL due to a robots.txt restriction. This can happen for a number of reasons. For instance, your robots.txt file might prohibit the Googlebot entirely; it might prohibit access to the directory in which this URL is located; or it might prohibit access to the URL specifically. Often, this is not an error. You may have specifically set up a robots.txt file to prevent us from crawling this URL. If that is the case, there's no need to fix this; we will continue to respect robots.txt for this file.
URLs restricted by robots.txt errors - Webmaster Tools Help (http://www.google.com/support/webmasters/bin/answer.py?answer=35235)
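If you're not sure which rule is blocking the crawl, you can replay your robots.txt against the affected URL with Python's standard-library robot parser. The rules, domain, and paths below are placeholders; substitute your own robots.txt contents and URL:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt contents (placeholder rules, not your actual file):
# this blocks Googlebot from the /private/ directory only.
rules = """User-agent: Googlebot
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A URL inside the disallowed directory is blocked for Googlebot...
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False

# ...while URLs outside it remain crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

If `can_fetch` returns False for a URL you want indexed, loosen or remove the matching `Disallow` line; if it returns False and that's intentional, the Webmaster Tools message needs no fix, as the help article above says.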