News
Here is another PSA from Gary Illyes of Google: in short, if your robots.txt file is served with a 4xx status code, Google will ignore the rules specified in that file.
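To make the behavior concrete, here is a minimal sketch (the function name is hypothetical) that maps a robots.txt HTTP status code to how Google documents handling it: 2xx means parse and obey the rules, 4xx other than 429 is treated as if no robots.txt exists (everything is crawlable), and 429 or 5xx is treated as a temporary complete disallow.

```python
def robots_txt_crawl_policy(status_code: int) -> str:
    """Map a robots.txt HTTP status code to Google's documented handling.

    Sketch based on Google's robots.txt documentation:
      - 2xx: fetch succeeded, parse the file and apply its rules
      - 4xx (except 429): treated as if no robots.txt exists, so crawl is allowed
      - 429 and 5xx: treated as a temporary complete disallow
    """
    if 200 <= status_code < 300:
        return "apply_rules"
    if 400 <= status_code < 500 and status_code != 429:
        return "allow_all"  # any rules in the file are ignored
    if status_code == 429 or 500 <= status_code < 600:
        return "disallow_all_temporarily"
    return "undefined"

print(robots_txt_crawl_policy(404))
print(robots_txt_crawl_policy(503))
```

The takeaway: a 403 or 404 on robots.txt does not block crawling; it removes your crawl restrictions entirely.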
A new document from Google explains how HTTP status codes impact a site’s appearance in search results.
The 4xx range of status codes functions largely as expected. Google appropriately processes standard codes like 404 (not found) and 410 (gone), which remain essential for proper crawl management.
Pages that serve 4xx HTTP status codes (except 429) don't waste crawl budget: Google attempted to crawl the page but received only the status code and no content. Forum discussion at Mastodon.
Google is warning against using 404 and other 4xx client error status codes, such as 403, to try to set a crawl rate limit for Googlebot. “Please don’t do that,” said Gary Illyes.
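If the goal is to slow Googlebot down during overload, the signal Google actually responds to is a 429 or 503 response, ideally with a `Retry-After` header, not a client error. A minimal hypothetical handler sketch (function name and threshold are assumptions for illustration):

```python
def throttle_response(server_overloaded: bool) -> tuple[int, dict]:
    """Hypothetical sketch: signal rate limiting in a way crawlers understand.

    Returning 403/404 tells Google the URL itself is broken or forbidden
    (and may get it dropped from the index); 503 with Retry-After tells
    Googlebot the problem is temporary and it should back off.
    """
    if server_overloaded:
        return 503, {"Retry-After": "3600"}  # ask the crawler to retry later
    return 200, {}

print(throttle_response(True))
```

Note this is only a short-term pressure valve; sustained 5xx responses on a site will eventually cause Google to slow crawling and can affect indexing.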