robots.txt is not valid
The robots.txt file tells search engines which pages of your site they can crawl. An invalid robots.txt configuration can cause two general types of problems:

- Search engines may not crawl public pages, causing your relevant content to show up less often in search results.
- Search engines may crawl private pages, exposing private information in search results.
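For reference, here is a minimal well-formed robots.txt. The paths and sitemap URL are placeholders, not recommendations for your site:

```text
# Applies to all crawlers
User-agent: *
# Block everything under /private/ ...
Disallow: /private/
# ...but allow the rest of the site
Allow: /
Sitemap: https://example.com/sitemap.xml
```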
Expand the robots.txt is not valid audit in your report to learn what's wrong with your robots.txt file.

Here is an explanation of common errors:
- No user-agent specified. Put a User-agent directive before your Allow or Disallow directive.
- Pattern should either be empty, start with "/" or "*". Start your Allow or Disallow directive with one of these characters, or leave it empty.
- Unknown directive. The directive name listed in the Content column is not part of the robots.txt specification.
- Invalid sitemap URL. The sitemap URL should begin with http, https, or ftp.
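The checks above can be sketched in a few lines of Python. This is a hypothetical validator that illustrates the listed rules, not Lighthouse's actual implementation; the function name and the set of recognized directives are assumptions:

```python
# Directives this sketch treats as known; real parsers may accept more.
KNOWN_DIRECTIVES = {"user-agent", "allow", "disallow", "sitemap", "crawl-delay"}

def validate_robots_txt(text):
    """Return a list of (line_number, error_message) tuples."""
    errors = []
    seen_user_agent = False
    for lineno, raw in enumerate(text.splitlines(), start=1):
        # Strip comments and surrounding whitespace.
        line = raw.split("#", 1)[0].strip()
        if not line or ":" not in line:
            continue
        name, value = (part.strip() for part in line.split(":", 1))
        name = name.lower()
        if name not in KNOWN_DIRECTIVES:
            errors.append((lineno, "Unknown directive"))
        elif name == "user-agent":
            seen_user_agent = True
        elif name in ("allow", "disallow"):
            if not seen_user_agent:
                errors.append((lineno, "No user-agent specified"))
            if value and not value.startswith(("/", "*")):
                errors.append(
                    (lineno, 'Pattern should either be empty, start with "/" or "*"')
                )
        elif name == "sitemap":
            if not value.startswith(("http:", "https:", "ftp:")):
                errors.append((lineno, "Invalid sitemap URL"))
    return errors
```

Feeding it a file with a Disallow before any User-agent, an unrecognized directive, or a scheme-less sitemap URL flags each of the errors described above.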