Robots.txt Checker

Check if robots.txt is accessible on your site, whether it blocks important pages, and if it links to a sitemap

Check Results

This only checks robots.txt. For a comprehensive analysis, use the full page check.

You can also audit your entire site. Duplicate titles and descriptions, orphan pages, broken links between sections, and other site-wide issues can only be found with a full site audit.

If you don't have an SEO specialist, we can help fix the errors found.


What Is robots.txt and Why It Matters

The robots.txt file is a text file in the site root that tells search bots which pages can be crawled and which cannot. Google checks this file before starting to crawl the site.

What This Tool Checks

  • File presence — whether robots.txt is accessible at /robots.txt
  • Site blocking — whether the file blocks the entire site from indexing (Disallow: /)
  • Sitemap link — whether the file contains a Sitemap directive with the sitemap URL
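These three checks can be sketched in plain Python. This is a minimal illustration, not the tool's actual implementation; `parse_robots`, `check_robots`, and the keys of the result dict are names chosen here for the example.

```python
import urllib.error
import urllib.request

def parse_robots(body: str) -> dict:
    """Inspect robots.txt text for a full-site block and a Sitemap directive."""
    result = {"blocks_all": False, "has_sitemap": False}
    agent_is_all = False
    for raw in body.splitlines():
        line = raw.split("#", 1)[0].strip()      # drop comments and whitespace
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            agent_is_all = (value == "*")
        elif field == "disallow" and agent_is_all and value == "/":
            result["blocks_all"] = True          # Disallow: / under User-agent: *
        elif field == "sitemap" and value:
            result["has_sitemap"] = True
    return result

def check_robots(site: str) -> dict:
    """Fetch /robots.txt and run the checks (does network I/O)."""
    url = site.rstrip("/") + "/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read().decode("utf-8", errors="replace")
    except urllib.error.URLError:
        # File missing or site unreachable
        return {"accessible": False, "blocks_all": False, "has_sitemap": False}
    return {"accessible": True, **parse_robots(body)}
```

A real checker would also handle redirects, non-200 status codes, and per-bot `User-agent` groups; this sketch only covers the three checks listed above.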

What robots.txt Should Look Like

Minimum correct robots.txt file:

User-agent: *
Disallow:
Sitemap: https://example.com/sitemap.xml

An empty Disallow value means crawling is allowed for all pages. The Sitemap directive helps search engines find the sitemap faster.
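You can confirm this behavior with Python's standard-library `urllib.robotparser` (the domain and bot name below are placeholders):

```python
from urllib import robotparser

# Parse the minimal robots.txt shown above.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow:",
    "Sitemap: https://example.com/sitemap.xml",
])

# An empty Disallow allows every URL for every bot.
print(rp.can_fetch("Googlebot", "https://example.com/any-page"))  # True
# The Sitemap directive is exposed separately (Python 3.8+).
print(rp.site_maps())  # ['https://example.com/sitemap.xml']
```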

Common robots.txt Mistakes

  • File missing — search engines get no instructions
  • Disallow: / — the entire site is blocked from crawling
  • No sitemap link — harder for search engines to discover all pages
  • Important resources blocked (CSS, JS, images) — search engines can't properly render the page
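The last mistake is easy to reproduce with the same standard-library parser. Here `/assets/` is a hypothetical directory serving CSS and JS: the bot may fetch the page itself, but not the stylesheet it needs to render it.

```python
from urllib import robotparser

# A common mistake: blocking the directory that serves CSS and JS.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /assets/",   # /assets/ is a placeholder path
])

print(rp.can_fetch("Googlebot", "https://example.com/page.html"))       # True
print(rp.can_fetch("Googlebot", "https://example.com/assets/app.css"))  # False
```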

Frequently Asked Questions

Is robots.txt required?
Not formally — the site will work without it, and Google's documentation treats a missing file as "crawl everything". But without robots.txt, crawlers will scan all accessible pages, including utility pages, so having one is recommended.
Does robots.txt prevent indexing?
No. Robots.txt only prevents crawling. A page blocked in robots.txt can still be indexed if other sites link to it. To prevent indexing, use the noindex meta tag.
How quickly will Google see robots.txt changes?
Google generally caches robots.txt for up to 24 hours, so changes are usually picked up within a day. To speed things up, request a recrawl via the robots.txt report in Google Search Console.

Other Checks