Glossary

robots.txt

A text file at the root of a website that tells search engine crawlers which pages they can and cannot access. It is the basic traffic directive for web crawlers.


Technical

What Is robots.txt?

The robots.txt file is a standard protocol for communicating with web crawlers. It uses "Allow" and "Disallow" directives to control which parts of a site different crawlers can access. For dealerships, proper robots.txt configuration matters for several reasons:

Blocking low-value pages from crawling (internal search results, admin pages, filtered SRP variations)

Allowing AI crawlers (GPTBot, ChatGPT-User, PerplexityBot) for GEO visibility

Including a reference to the XML sitemap

Preventing crawl budget waste

A common mistake is blocking too much. Some dealership website providers accidentally block important pages, or entire sections, from crawling.
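As an illustration of these directives, a minimal robots.txt along these lines might look like the sketch below. The specific paths and filter parameter are hypothetical examples, not a recommendation for any particular website platform; the crawler names match those mentioned above.

```
# Explicitly allow AI crawlers for GEO visibility
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: PerplexityBot
Allow: /

# Rules for all other crawlers
User-agent: *
# Block admin and internal search pages (example paths)
Disallow: /admin/
Disallow: /search/
# Block filtered SRP variations (example query parameter)
Disallow: /inventory/*?filter=

# Point crawlers to the XML sitemap (example URL)
Sitemap: https://www.example-dealership.com/sitemap.xml
```

Note that wildcard (`*`) support in Disallow paths is an extension honored by major crawlers such as Googlebot, not part of the original standard, so behavior can vary by crawler.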
