In September 2024, Jeremy Howard — the guy behind fast.ai and a former president of Kaggle — published a short proposal. He called it llms.txt, and the idea was almost annoyingly simple: give AI models a plain-text map of your website so they don't have to guess what's important.
No one paid much attention at first. Then Cloudflare added one. Then Stripe. Then Anthropic, Vercel, Supabase, and Zapier. Today, over 780 websites have published an llms.txt file, and three separate community directories track who's adopted it.
The question isn't really whether llms.txt will matter. It's whether you'll set yours up before or after your competitors do.
What llms.txt actually is (in plain English)
Think of your website from an AI model's perspective for a second.
When ChatGPT or Perplexity needs to answer a question about your product, it sends a crawler to your site. That crawler sees HTML — headers, footers, navigation menus, cookie banners, newsletter popups, sidebar widgets. Buried somewhere in that noise is the content that actually matters.
Your XML sitemap doesn't help much here. It lists every URL on the site equally — the homepage, the privacy policy, that test page from 2019 you forgot to delete. There's no hierarchy, no context, no "start here."
llms.txt solves this by giving AI models exactly what they need: a structured Markdown file at yoursite.com/llms.txt that says "here's what this site is about, and here are the pages that matter most."
That's it. A curated table of contents for machines that read.
The format: simpler than you'd expect
An llms.txt file is just Markdown with a specific structure:
```
# Your Site Name

> A one-line summary of what your site does.

Some optional context about your project, company, or product.

## Documentation

- [Getting Started](https://yoursite.com/docs/start): Setup guide for new users
- [API Reference](https://yoursite.com/docs/api): Complete API documentation

## Blog

- [Latest Release](https://yoursite.com/blog/v2): What's new in version 2.0
```
The rules are minimal:
- One H1 heading — your site or project name (the only required element)
- A blockquote — brief summary with key facts
- H2 sections — groups of links, each with an optional description
- Markdown links — `[Page Name](URL): optional description`
That's the entire spec. Jeremy Howard intentionally kept it simple because "websites already have sitemaps, robots.txt, and meta tags. What they don't have is a curated overview written specifically for language models."
llms.txt vs llms-full.txt — yes, there are two
The standard actually defines two files:
llms.txt is the summary — your site's elevator pitch plus links to key pages. Think of it as the table of contents. Most llms.txt files are a few hundred lines at most.
llms-full.txt is the entire book. It contains the full Markdown content of your key pages, concatenated into one massive file. Cloudflare's llms-full.txt is 3.7 million tokens. Anthropic's is 481,000 tokens. Vercel's has been described as "a 400,000-word novel."
Why would you want that? Because an AI model with a large enough context window can ingest your entire documentation in one shot — no crawling, no parsing HTML, no missing pages.
For most WordPress sites, llms.txt alone is enough. But if you run a documentation-heavy site or want AI to have deep access to your content, llms-full.txt is worth considering.
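To make "the entire book" concrete, here's a rough sketch of how a WordPress site could assemble llms-full.txt from its published pages. It's a generic illustration, not any particular plugin's code; the pages-only query and root output path are assumptions you'd adjust:

```php
// A rough sketch: concatenate published pages into llms-full.txt.
// Generic WordPress, written to the site root; adjust post types and path.
$pages = get_posts( array(
    'post_type'   => 'page',
    'post_status' => 'publish',
    'numberposts' => -1,
) );

$out = '# ' . get_bloginfo( 'name' ) . "\n\n";
foreach ( $pages as $page ) {
    $out .= '## ' . $page->post_title . "\n\n";
    $out .= wp_strip_all_tags( apply_filters( 'the_content', $page->post_content ) ) . "\n\n";
}

file_put_contents( ABSPATH . 'llms-full.txt', $out );
```

Run something like this on publish or on a schedule and the full file stays in sync without manual copying.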
Who's actually using it?
The short answer: mostly developer-facing companies. The long answer is more interesting.
Stripe organizes their llms.txt by product category, with an "Optional" section for niche tools like Stripe Climate. Their file tells AI agents exactly how to retrieve plain-text versions of documentation pages.
Cloudflare structures theirs by product line — Workers, Pages, R2, and so on. Each entry has a description that helps AI models decide which page to fetch based on the user's question.
Anthropic (the company behind Claude) maintains both files. Their llms.txt is 8,364 tokens; the full version is 481,349 tokens covering their entire API documentation.
Vercel went a step further and published a proposal for inline LLM instructions in HTML — extending the llms.txt idea into individual pages.
Other adopters include Supabase, Zapier, Modal, Coinbase, and hundreds of smaller companies tracked across three community directories.
The honest truth: does it actually work?
Here's where most articles about llms.txt get vague. We won't.
The data is mixed. Search Engine Land tracked 10 websites for 180 days — 90 days before adding llms.txt, 90 days after. The results:
- 2 out of 10 sites saw AI traffic increases (12.5% and 25%)
- 7 sites saw no measurable change
- 1 site actually declined by 19.7%
But here's the catch: the two sites that grew had also launched new content and PR campaigns around the same time. The llms.txt file alone didn't drive the increase — it was part of a broader effort.
Adoption is still early. Rankability scanned the top 1,000 most-visited websites globally. As of mid-2025, only 0.3% had an llms.txt file. That's 3 out of 1,000.
No AI provider has officially committed to reading llms.txt. OpenAI, Anthropic, Google — none have confirmed their crawlers consistently follow llms.txt instructions. Google's John Mueller compared the standard to the deprecated keywords meta tag.
So why bother?
Three reasons:
Low effort, no downside. Setting up llms.txt takes 5 minutes. There's zero risk. It won't hurt your rankings, slow your site, or cause conflicts with anything.
The direction is clear. Even if no one reads llms.txt today, AI search is growing at 34% CAGR. Gartner predicts 25% of search volume will shift away from traditional engines by late 2026. When AI providers do formalize how they read websites, llms.txt (or something very similar) will be the standard.
First-mover advantage is real. If only 0.3% of top sites have llms.txt, that means 99.7% of your competitors don't. Early adopters set the baseline that later standards build on.
How llms.txt differs from robots.txt and XML sitemaps
This trips people up, so here's a clear comparison:
| | robots.txt | XML Sitemap | llms.txt |
|---|---|---|---|
| Purpose | Controls crawler access | Lists all URLs for indexing | Curated overview for AI understanding |
| Audience | Search engine crawlers | Search engine crawlers | Language models (LLMs) |
| Format | Custom syntax | XML | Markdown |
| Content | Allow/Disallow rules | URLs + metadata | Descriptions + key page links |
| Analogy | Security guard | Phone book | Tour guide |
They serve completely different purposes. You need all three — robots.txt controls who gets in, the sitemap lists what exists, and llms.txt explains what matters.
Setting up llms.txt on WordPress
You have two options: manual or plugin. Manual works, but it's tedious to maintain. Here's why.
Option 1: Manual setup
Create a file called llms.txt in your WordPress root directory (same place as wp-config.php):
```
# Your Site Name

> Brief description of your website and what it offers.

## Pages

- [Home](https://yoursite.com/): Main landing page
- [About](https://yoursite.com/about/): Company information
- [Services](https://yoursite.com/services/): What we offer

## Blog

- [Latest Post](https://yoursite.com/blog/latest/): Description
```
Upload it via FTP or your hosting file manager.
The problem: Every time you publish a new post, update a page, or change your site structure, you need to manually edit this file. For a site with 50+ posts, maintaining llms.txt by hand gets old fast.
You also need to handle URL rewrites so WordPress serves the file correctly, which means editing .htaccess or adding a custom rewrite rule in functions.php.
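If you stick with manual setup and WordPress intercepts the request instead of serving the static file, a sketch like this in functions.php can route /llms.txt explicitly (the uploads path is an assumption; point readfile() at wherever your file actually lives):

```php
// Route /llms.txt through WordPress when the static file isn't reachable.
add_action( 'init', function () {
    add_rewrite_rule( '^llms\.txt$', 'index.php?llms_txt=1', 'top' );
} );

add_filter( 'query_vars', function ( $vars ) {
    $vars[] = 'llms_txt';
    return $vars;
} );

add_action( 'template_redirect', function () {
    if ( get_query_var( 'llms_txt' ) ) {
        header( 'Content-Type: text/plain; charset=utf-8' );
        readfile( WP_CONTENT_DIR . '/uploads/llms.txt' ); // assumed location
        exit;
    }
} );
```

After adding the rule, re-save your permalinks under Settings → Permalinks so WordPress flushes its rewrite cache.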
Option 2: Automatic generation with a plugin
This is where things get practical. Several WordPress plugins now generate llms.txt automatically. The setup typically looks like:
- Install the plugin
- Go to settings and choose which post types to include (posts, pages, products, etc.)
- Click "Generate" — the plugin creates llms.txt based on your actual content
- It updates automatically when you publish or edit content
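Under the hood, that last step usually hangs off WordPress's save_post hook. Here's a simplified sketch of the idea, not Prime SEO's (or any other plugin's) actual implementation; the 20-item cap and root path are illustrative:

```php
// Sketch: regenerate llms.txt whenever content is published or updated.
add_action( 'save_post', function ( $post_id, $post ) {
    if ( wp_is_post_revision( $post_id ) || 'publish' !== $post->post_status ) {
        return;
    }

    $out  = '# ' . get_bloginfo( 'name' ) . "\n\n";
    $out .= '> ' . get_bloginfo( 'description' ) . "\n\n";
    $out .= "## Pages\n\n";

    $items = get_posts( array(
        'post_type'   => array( 'post', 'page' ),
        'numberposts' => 20, // illustrative cap
    ) );
    foreach ( $items as $item ) {
        $out .= '- [' . $item->post_title . '](' . get_permalink( $item ) . ")\n";
    }

    file_put_contents( ABSPATH . 'llms.txt', $out );
}, 10, 2 );
```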
What Prime SEO does specifically:
Prime SEO includes llms.txt generation as part of its AI Settings module. Here's what it offers beyond basic generation:
- Auto-generation of both llms.txt and llms-full.txt
- Post type selection — choose which content types appear in the file
- Section management — organize links by custom sections, not just post type
- AI Bots Manager — control which AI crawlers can access your site (16 bots including GPTBot, ClaudeBot, PerplexityBot)
- AI Crawler Stats — see which AI bots actually visit your site and how often
- AI Visibility Score — a quick check of how AI-ready your site is
The llms.txt and AI bots features are free — no Pro license required.
Install Prime SEO on WordPress.org — takes about 2 minutes.
What to include in your llms.txt (and what to skip)
Not every page belongs in your llms.txt. The whole point is curation — helping AI focus on what matters.
Include:
- Homepage
- Core product/service pages
- Documentation and how-to guides
- Key blog posts (evergreen content, not news)
- Pricing page
- FAQ or knowledge base
Skip:
- Privacy policy, terms of service (AI doesn't need these for answering questions)
- Category and tag archives
- Author pages
- Thin pages with little content
- Pages marked `noindex` (a programmatic check follows this list)
- Test/staging pages
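If you generate the file programmatically, the noindex exclusion is easy to enforce with a small check. Here's a sketch assuming Yoast's meta key; other SEO plugins store the flag under different keys, and the helper name is made up:

```php
// Hypothetical helper: true if a post belongs in llms.txt.
// The meta key is Yoast's noindex flag (an assumption; adjust per SEO plugin).
function llms_should_include( $post_id ) {
    return '1' !== get_post_meta( $post_id, '_yoast_wpseo_meta-robots-noindex', true );
}
```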
Good descriptions matter. Don't just list URLs — add context:
```
## Documentation

- [Getting Started](https://yoursite.com/docs/start): Step-by-step installation and configuration guide for new users
- [API Reference](https://yoursite.com/docs/api): Complete REST API documentation with code examples in Python, JavaScript, and PHP
```
That description tells an AI model exactly what it'll find at each URL, so it can pick the right source for the user's question.
A real-world example
Here's what our own llms.txt looks like at prime-seo-plugin.com:
```
# Prime SEO

> Free WordPress SEO plugin with AI search optimization.
> Manage LLMs.txt, control AI crawlers, generate XML sitemaps,
> add schema markup, and optimize for both Google and AI search engines.

## Documentation

- [Getting Started](https://prime-seo-plugin.com/docs/getting-started-with-prime-seo/): Installation and initial setup guide
- [AI Settings](https://prime-seo-plugin.com/docs/ai-settings-module/): LLMs.txt editor, AI Bots Manager, and AI Crawler Stats
- [XML Sitemap](https://prime-seo-plugin.com/docs/xml-sitemap-module/): Sitemap configuration and post type settings
- [Schema Markup](https://prime-seo-plugin.com/docs/schema-markup-module/): JSON-LD structured data for posts and pages

## Features

- [Compare](https://prime-seo-plugin.com/compare/): Prime SEO vs Yoast vs Rank Math vs AIOSEO
- [AI SEO](https://prime-seo-plugin.com/ai-seo/): AI search optimization features overview
- [Pro](https://prime-seo-plugin.com/pro.html): AI Generator, Bulk Generator, Video Sitemap
```
Notice how each link has a description that tells the AI model what the page contains. An AI answering "How do I set up schema markup in Prime SEO?" knows exactly which link to pull from.
Checklist: your llms.txt in 15 minutes
Here's the quick version for people who skim (no judgment — we do it too):
- Decide on content — list 10-20 most important pages on your site
- Choose your method — manual file or plugin (we recommend plugin for WordPress)
- Write descriptions — every link needs a 1-line explanation
- Organize sections — group by type (Docs, Blog, Product, etc.)
- Test the URL — visit `yoursite.com/llms.txt` and verify it loads
- Set up llms-full.txt (optional) — if you have documentation-heavy content
- Monitor AI traffic — check if AI bots are reading your new file (a tracking sketch follows this list)
- Update regularly — or use a plugin that updates automatically
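For the monitoring step, server access logs are the most reliable source, but you can also tally hits from WordPress itself. A rough sketch follows; the option name and bot list are illustrative, and page caching can hide bot visits from PHP:

```php
// Sketch: count visits from known AI crawlers in a WordPress option.
// Cached pages can be served to bots without running PHP, so treat
// these numbers as a rough signal, not a complete count.
add_action( 'init', function () {
    $bots = array( 'GPTBot', 'ClaudeBot', 'PerplexityBot', 'CCBot' );
    $ua   = isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '';

    foreach ( $bots as $bot ) {
        if ( false !== stripos( $ua, $bot ) ) {
            $stats         = get_option( 'llms_ai_bot_hits', array() );
            $stats[ $bot ] = isset( $stats[ $bot ] ) ? $stats[ $bot ] + 1 : 1;
            update_option( 'llms_ai_bot_hits', $stats );
            break;
        }
    }
} );
```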
What's next for the standard?
llms.txt is still a proposal, not an official web standard. It doesn't have W3C backing or formal support from major AI providers.
But the momentum is real. Three community directories track adoption. GitHub discussions are active. Major tech companies are implementing it on their documentation sites. And the underlying problem — AI models struggling to understand website structure — isn't going away.
Whether the final standard is called llms.txt or something else, the concept of "a curated, machine-readable summary of your website" is becoming essential. Setting it up now costs nothing and positions your site for whatever comes next.
The sites that prepared for mobile-first indexing early didn't regret it. The same pattern is playing out with AI search readiness.
FAQ
Is llms.txt required for WordPress? No. It's optional, and your site will function perfectly without it. But adding it improves your chances of being correctly cited when AI search engines reference your content.
Will llms.txt improve my Google rankings? No. Google does not use llms.txt for ranking. It specifically targets AI language models like ChatGPT, Claude, and Perplexity. Your traditional SEO setup (sitemap, meta tags, schema) remains separate.
Do I need both llms.txt and llms-full.txt? For most sites, llms.txt alone is sufficient. llms-full.txt is useful if you have extensive documentation and want AI models to access the full content without crawling each page individually.
How often should I update llms.txt? Every time you add or remove important content from your site. If you use a plugin like Prime SEO, this happens automatically when you publish or update posts.
Is llms.txt the same as robots.txt? No. robots.txt controls which crawlers can access your site. llms.txt tells AI models what your site is about and which pages are most important. They serve different purposes, and you need both.
Ready to optimize for AI search?
Prime SEO is the first WordPress plugin built for AI search engines. Free forever.
Install Free Plugin