Robots.txt definition

Robots.txt is a plain-text file placed at the root of a website that tells search engine crawlers which parts of the site they may request, helping you manage crawl access and server load. It does not secure content or remove pages from search results; use noindex directives or password protection for that.
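For illustration, a minimal robots.txt might look like the lines below; the /private/ and /public/ paths and the example.com sitemap URL are placeholders, not recommendations for any particular site:

User-agent: *
Disallow: /private/
Allow: /public/
Sitemap: https://www.example.com/sitemap.xml

The User-agent line says which crawlers the group of rules applies to (here, all of them), Disallow and Allow are matched against URL paths, and the Sitemap line points crawlers to the XML sitemap. Compliant crawlers follow these rules voluntarily, which is why robots.txt is not a security or deindexing mechanism.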


What is robots.txt?

When to use it and when to avoid it

[Diagram: robots.txt explained in relation to related concepts]

How to create and check a robots.txt file
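As a rough sketch of the checking step, the snippet below uses Python's standard-library urllib.robotparser to parse a set of rules and test whether a crawler may fetch particular URLs; the rules, the MyBot user-agent name, and the example.com URLs are placeholder values for this example:

from urllib import robotparser

# Rules as they might appear in a robots.txt file (placeholder paths).
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /public/
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch() reports whether the named user agent may request a given URL.
for path in ("/public/page.html", "/private/report.html"):
    allowed = parser.can_fetch("MyBot", "https://www.example.com" + path)
    print(path, "->", "allowed" if allowed else "disallowed")

To check a live file instead of an inline string, the same parser can be pointed at a site's robots.txt with set_url() followed by read().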

Explore Sanity Today

Now that you've learned about robots.txt, why not start exploring what Sanity has to offer? Dive into our platform and see how it can support your content needs.
