Check whether robots.txt is accessible on your site, whether it blocks important pages, and whether it links to a sitemap.
This only checks robots.txt. For a comprehensive analysis, use the full page check.
You can also audit your entire site. Duplicate titles and descriptions, orphan pages, broken links between sections, and other site-wide issues can only be found with a full site audit.
If you don't have an SEO specialist, we can help fix the errors found.
The robots.txt file is a plain-text file in the site root that tells search bots which pages may be crawled and which may not. Google checks this file before it starts crawling the site.
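The accessibility check described above can be sketched in Python with the standard library alone; the function names here are illustrative, not part of any particular tool:

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def robots_txt_url(site: str) -> str:
    """Build the canonical robots.txt location for a site root."""
    return site.rstrip("/") + "/robots.txt"

def robots_txt_status(site: str, timeout: float = 10.0):
    """Return the HTTP status code for /robots.txt, or None if unreachable."""
    try:
        with urlopen(robots_txt_url(site), timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code   # e.g. 404: no robots.txt, so crawling is unrestricted
    except URLError:
        return None     # DNS or connection failure: the site itself is unreachable
```

A 200 response means the file exists and bots will obey it; a 404 is also harmless (crawling is then unrestricted), while a 5xx response can make Google postpone crawling.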
Minimum correct robots.txt file:
User-agent: *
Disallow:
Sitemap: https://example.com/sitemap.xml
An empty Disallow value means crawling is allowed for all pages. The Sitemap directive helps search engines find the sitemap faster.
Disallow: / blocks crawling of the entire site.
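Both rules can be verified with Python's standard `urllib.robotparser` module; this is a minimal sketch, assuming the example.com URLs above:

```python
from urllib.robotparser import RobotFileParser

# An empty Disallow value: everything may be crawled.
allow_all = RobotFileParser()
allow_all.parse([
    "User-agent: *",
    "Disallow:",
    "Sitemap: https://example.com/sitemap.xml",
])

# Disallow: / blocks crawling of every URL on the site.
block_all = RobotFileParser()
block_all.parse([
    "User-agent: *",
    "Disallow: /",
])

print(allow_all.can_fetch("Googlebot", "https://example.com/any/page"))  # True
print(block_all.can_fetch("Googlebot", "https://example.com/"))          # False
```

`can_fetch` applies the `User-agent: *` group to any bot name that has no group of its own, which matches how search engines interpret the file.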