Test Your Robots.txt
Robots.txt Tips
The User-agent: * directive applies to every crawler that is not matched by a more specific User-agent group.
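A minimal sketch of how grouping works; the /drafts/ path is a hypothetical example, and Googlebot stands in for any crawler you want to single out:

```
# Applies to every crawler without a more specific group
User-agent: *
Disallow: /drafts/

# Googlebot matches this group instead of the wildcard group above
User-agent: Googlebot
Disallow:
```

An empty Disallow: value means nothing is blocked for that group.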
Use Disallow: / to block crawling of your entire website.
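For example, a site that should not be crawled at all might serve:

```
# Block compliant crawlers from the whole site
User-agent: *
Disallow: /
```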
The Allow directive lets you carve exceptions out of broader Disallow rules.
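For instance, you can close off a directory while leaving one file inside it crawlable; the paths here are hypothetical:

```
User-agent: *
Disallow: /private/
# Exception: this one file inside /private/ stays crawlable
Allow: /private/press-kit.pdf
```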
When an Allow rule and a Disallow rule both match a URL, the more specific rule (the one with the longer path) wins; if the paths are the same length, the Allow rule takes precedence.
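Under that tie-breaking rule, a crawler reading the following hypothetical pair may fetch URLs under /shop, because both rules match with equal path length and the less restrictive rule applies:

```
User-agent: *
# Both rules match /shop with the same path length,
# so the Allow rule wins the tie
Allow: /shop
Disallow: /shop
```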
Include a Sitemap: directive to help search engines discover your content.
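The sitemap location should be a fully qualified URL, and the directive can appear anywhere in the file, outside any User-agent group; example.com below stands in for your own domain:

```
Sitemap: https://www.example.com/sitemap.xml
```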
Remember that robots.txt is advisory: reputable crawlers honor it, but malicious bots may ignore your rules entirely, so use access controls rather than robots.txt to protect sensitive content.