llms.txt is a markdown file placed at your website's root (/llms.txt) that tells large language models which pages matter most. Proposed by Jeremy Howard (fast.ai) in September 2024, it is now implemented by 784+ sites, including Anthropic, Vercel, and Cloudflare. Unlike robots.txt, which blocks crawlers, llms.txt guides AI toward your best content. Vercel reports that 10% of its signups now come from ChatGPT.
What Is llms.txt?
Think of llms.txt as a sitemap for AI. While XML sitemaps list every page for Google, llms.txt curates your most valuable content for large language models like ChatGPT, Claude, and Perplexity.
The file uses Markdown format—readable by both humans and AI. It was proposed in September 2024 by Jeremy Howard, co-founder of fast.ai and Answer.AI, as a way to bypass the noise of HTML, ads, and JavaScript.
llms.txt vs robots.txt vs Sitemap
These three files serve different purposes:
- robots.txt — Tells crawlers what NOT to access (blocking)
- sitemap.xml — Lists ALL pages for search engines (comprehensive)
- llms.txt — Highlights BEST content for AI (curated)
They're complementary. Your robots.txt blocks sensitive areas, your sitemap lists everything, and your llms.txt guides AI to what truly matters.
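To make the division of labor concrete, here are hypothetical minimal versions of all three files, written to a scratch directory. The domain, paths, and contents are placeholders, not a real deployment:

```shell
# Illustrative only: minimal robots.txt, sitemap.xml, and llms.txt
# for a hypothetical example.com site.
mkdir -p site

# robots.txt: blocks crawlers from sensitive areas
cat > site/robots.txt <<'EOF'
User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
EOF

# sitemap.xml: lists every page exhaustively
cat > site/sitemap.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/docs/start</loc></url>
</urlset>
EOF

# llms.txt: curates only the highlights
cat > site/llms.txt <<'EOF'
# Example Co
> Developer tools, explained for both humans and AI.

## Docs
- [Getting Started](https://example.com/docs/start): Quick setup guide
EOF
```

Note how each file plays a distinct role: robots.txt only excludes, sitemap.xml only enumerates, and llms.txt only curates.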
Who's Using It?
The standard has been adopted by major tech companies:
- Anthropic — Uses llms.txt for Claude documentation (docs.claude.com/llms-full.txt)
- Vercel — Reports 10% of signups from ChatGPT; has also proposed an inline-HTML variant of llms.txt
- Cloudflare — Multiple product-specific llms-full.txt files
- Supabase, ElevenLabs, Hugging Face — Developer documentation
"Mintlify originally developed llms-full.txt in a collaboration with Anthropic, who needed a cleaner way to feed their entire documentation into LLMs without parsing HTML." — Mintlify
File Format & Structure
An llms.txt file follows a specific markdown structure:
# Your Company Name
> Brief description of what your site offers.
> Include key value propositions here.
## Docs
- [Getting Started](https://example.com/docs/start): Quick setup guide
- [API Reference](https://example.com/docs/api): Complete API docs
- [Tutorials](https://example.com/docs/tutorials): Step-by-step guides
## Blog
- [Best Practices](https://example.com/blog/best): Industry standards
- [Case Studies](https://example.com/blog/cases): Real examples
## Optional
- [Changelog](https://example.com/changelog): Version history
- [Roadmap](https://example.com/roadmap): Upcoming features
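Because the format is plain markdown, a consumer can pull the curated URLs out with standard Unix tools. A minimal sketch, using an abbreviated copy of the template above (example.com URLs are placeholders):

```shell
# Write an abbreviated copy of the template to a scratch file.
cat > llms.txt <<'EOF'
# Your Company Name
> Brief description of what your site offers.

## Docs
- [Getting Started](https://example.com/docs/start): Quick setup guide
- [API Reference](https://example.com/docs/api): Complete API docs

## Optional
- [Changelog](https://example.com/changelog): Version history
EOF

# List every curated URL, one per line.
grep -oE '\(https?://[^)]+\)' llms.txt | tr -d '()'
```

This prints the three example.com URLs, one per line, which is roughly what an AI agent needs in order to decide which pages to fetch next.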
Implementation Steps
Creating your llms.txt takes about 10 minutes:
# 1. Create the file (public/ is served at the site root in most frameworks)
$ touch public/llms.txt
# 2. Add your content (use template above)
$ nano public/llms.txt
# 3. Verify it's accessible
$ curl -I https://yoursite.com/llms.txt
# Should return: HTTP/2 200
# 4. Optional: expand the links into a single context file
$ pip install llms-txt
$ llms_txt2ctx llms.txt > llms-ctx.txt
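Before deploying, a few format checks catch the most common mistakes: a missing H1 title, no blockquote summary, or malformed link lines. A hedged sketch that validates a small inline sample; to check your real file, point f at public/llms.txt instead:

```shell
# Illustrative sanity checks; the sample file stands in for your real one.
f=sample-llms.txt
cat > "$f" <<'EOF'
# Example Co
> Brief description of what the site offers.

## Docs
- [Getting Started](https://example.com/docs/start): Quick setup guide
EOF

# Exactly one H1 title ("# " but not "## ")
[ "$(grep -c '^# ' "$f")" -eq 1 ] && echo "ok: single H1 title"

# A blockquote summary is present
grep -q '^> ' "$f" && echo "ok: summary blockquote"

# Every bullet line follows "- [text](url)"
! grep -E '^- ' "$f" | grep -Evq '^- \[[^]]+\]\(https?://[^)]+\)' \
  && echo "ok: all links well-formed"
```

All three checks pass on the sample, printing an "ok" line each; a failing check simply prints nothing, which makes the script easy to drop into CI.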
File Types: llms.txt vs llms-full.txt
There are two common variants:
- llms.txt — Navigation overview with links (small, fast to parse)
- llms-full.txt — All documentation compiled into one file (comprehensive)
Cloudflare even offers product-specific files (e.g., /workers/llms-full.txt) so AI can fetch only relevant context.
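A crude way to approximate an llms-full.txt is to concatenate your existing markdown docs into one file (dedicated tooling instead fetches each page linked from llms.txt). A sketch with made-up doc files; the filenames and contents are placeholders:

```shell
# Illustrative only: stitch local markdown docs into one llms-full.txt.
mkdir -p docs
printf '# Getting Started\nInstall the CLI.\n' > docs/start.md
printf '# API Reference\nGET /v1/items\n' > docs/api.md

{
  echo '# Example Co: Full Documentation'
  for f in docs/*.md; do
    echo              # blank line between documents
    cat "$f"
  done
} > llms-full.txt

wc -l < llms-full.txt   # prints 7
```

Keeping llms.txt small and llms-full.txt comprehensive lets an AI client choose: fetch the cheap overview first, then pull the full file only when it needs deep context.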
Reality Check: Current Adoption
An OtterlyAI study found that, across three months and 60,000+ AI bot visits, only 0.1% requested llms.txt. The standard is still early.
However, AI coding assistants (Cursor, Claude Code) already use it. And with AI search reportedly growing 527% year over year, early adopters stand to benefit most.
Should You Implement It?
Yes, if you have:
- Documentation or help centers
- Blogs or media content
- Product pages or FAQs
- eCommerce with structured content
It takes about 10 minutes and positions you for the AI traffic wave. As Vercel's 10% ChatGPT signup share shows, the ROI is real.