How to Create an llms.txt File
llms.txt is a markdown file you place at the root of your website to help AI systems understand what your site is about. This guide covers everything: the spec, how to write your file, what good and bad implementations look like, and how to deploy it on any platform.
What is llms.txt?
llms.txt is a plain markdown file hosted at yoursite.com/llms.txt that gives AI systems a structured summary of your website. Think of it as a curated introduction -- written for machines, readable by humans -- that tells AI assistants like ChatGPT, Claude, and Perplexity where to find your most important content.
The concept is straightforward: instead of making AI crawl your entire site and guess what matters, you give it a clean summary. One file. Plain text. No complex setup.
The spec was created by Jeremy Howard, co-founder of fast.ai and a respected figure in the AI community. You can read the full specification at llmstxt.org.
The llms.txt File Format
An llms.txt file is structured markdown with a specific format. Here is an annotated example showing every element:
```
# Acme Corp

> Acme Corp builds project management tools for remote teams.

## Core Pages

- [Product Overview](https://acme.com/product): Core features and capabilities
- [Pricing](https://acme.com/pricing): Plans from free to enterprise
- [About Us](https://acme.com/about): Company mission and team

## Documentation

- [Getting Started](https://acme.com/docs/start): Setup guide for new users
- [API Reference](https://acme.com/docs/api): REST API documentation

## Optional

- [Blog](https://acme.com/blog): Product updates and industry insights
- [Careers](https://acme.com/careers): Open positions
```

Here is what each element does:
| Element | Syntax | Required? | Purpose |
|---|---|---|---|
| Title | # Your Name | Required | The H1 heading -- your brand or site name |
| Description | > One-line summary | Required | A blockquote summarizing what your site is about |
| Sections | ## Section Name | Required | H2 headings that group your links by topic |
| Links | - [Title](URL): Description | Required | Annotated links to your key pages |
| Optional section | ## Optional | Optional | Lower-priority pages AI can skip if context is limited |
Hosting requirements: The file must return a 200 status code, use text/plain or text/markdown MIME type, be encoded in UTF-8, and be publicly accessible without authentication.
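These hosting requirements can be checked programmatically. A minimal sketch using Python's standard library (the function names here are placeholders, not part of the spec):

```python
from urllib.request import urlopen

# MIME types the llms.txt spec considers acceptable
ALLOWED_TYPES = ("text/plain", "text/markdown")

def mime_ok(content_type_header):
    """True if a Content-Type header value is acceptable for llms.txt.
    The header may carry parameters, e.g. 'text/plain; charset=utf-8'."""
    mime = content_type_header.split(";")[0].strip().lower()
    return mime in ALLOWED_TYPES

def check_llms_txt(url):
    """Fetch url and verify the hosting requirements: a 200 status
    plus an allowed MIME type. Requires network access."""
    with urlopen(url) as resp:
        return resp.status == 200 and mime_ok(resp.headers.get("Content-Type", ""))
```

Calling `check_llms_txt("https://yoursite.com/llms.txt")` should return `True` once your file is deployed correctly.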
llms.txt vs. robots.txt vs. sitemap.xml
These three files serve different purposes. You likely need all of them.
| | robots.txt | sitemap.xml | llms.txt |
|---|---|---|---|
| Purpose | Controls which pages crawlers can access | Lists all pages for search engines to index | Summarizes key content for AI understanding |
| Audience | Search engine bots | Search engine indexers | AI language models |
| Format | Plain text directives | XML | Markdown |
| Location | /robots.txt | /sitemap.xml | /llms.txt |
| Tells machines... | "You can or cannot visit these pages" | "These pages exist on my site" | "Here is what matters and why" |
robots.txt is access control. sitemap.xml is a map. llms.txt is a guide. They complement each other -- llms.txt does not replace either of the others. If you already have a sitemap, you can use it as a starting point for generating your llms.txt. For a deeper comparison, see our SEO vs. AI Optimization guide.
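If you do start from a sitemap, extracting its URLs is straightforward. A minimal sketch using Python's standard library (the XML namespace below is the standard sitemaps.org one; your sitemap's layout is an assumption):

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace from the sitemaps.org protocol
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml):
    """Extract <loc> URLs from a sitemap.xml document, as a raw list
    to curate down into llms.txt links."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]
```

The output is a flat URL list; the curation step (picking the 5-20 pages that matter) is still manual.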
Does llms.txt Actually Work?
This is the question most articles dodge. Here is what we know as of early 2026.
What is confirmed
- The spec was created by Jeremy Howard, a widely respected figure in AI research
- 840+ websites have published llms.txt files, including companies like Anthropic and Cloudflare
- Documentation platforms like GitBook and Mintlify generate llms.txt automatically
- AI coding tools like Cursor reference llms.txt when indexing project documentation
What is not yet confirmed
None of the major AI chat providers -- OpenAI (ChatGPT), Anthropic (Claude), or Google (Gemini) -- have publicly stated that their models specifically parse llms.txt during web browsing or search.
Why we still recommend implementing it:
- Near-zero cost. Creating an llms.txt file takes minutes and requires no infrastructure.
- Significant upside. If AI providers begin honoring llms.txt -- and the trajectory suggests this is likely -- sites with files already in place have a head start.
- Structural benefit. The process forces you to think clearly about your site's most important content and how it should be presented. That exercise has value regardless.
- Adoption creates momentum. The more sites implement llms.txt, the stronger the signal to AI providers that the standard is worth supporting.
Step 1: Audit Your Website Content
Before writing your file, identify the 5-20 pages that best represent your site. These are not necessarily your most-visited pages -- they are the pages that answer the question: "What does this site do, and what content is most valuable?"
Content audit checklist
What to include: Your homepage, core product or service pages, key documentation, pricing, and any page that directly answers "what does this site do?"
What to exclude: Login pages, admin dashboards, utility pages (terms, privacy -- unless they are important to your business identity), paginated archive pages, and duplicate content.
Not sure where your site stands? Run our free AI SEO check to see how AI-ready your site is right now.
Step 2: Plan Your Section Structure
Organize your pages into logical groups using ## headings. The section names should be descriptive and obvious. Common patterns include:
SaaS / Product
- Core Pages
- Documentation
- Resources
- Optional
E-commerce
- Shop
- Product Categories
- Customer Service
- Optional
Documentation
- Getting Started
- API Reference
- Guides
- Optional
The ## Optional section is a special convention from the spec. Pages listed under it signal to AI systems that this content can be skipped when context is limited. Use it for content that is useful but not essential -- blog archives, career pages, press releases.
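Once you have settled on a section plan like the ones above, assembling the file itself can be scripted. A minimal sketch (the section names and page tuples are placeholders):

```python
def build_llms_txt(title, description, sections):
    """Assemble a spec-shaped llms.txt document from a title, a one-line
    description, and an ordered mapping of section name -> list of
    (page title, url, description) tuples."""
    out = [f"# {title}", "", f"> {description}"]
    for section, links in sections.items():
        out += ["", f"## {section}"]
        for page_title, url, desc in links:
            out.append(f"- [{page_title}]({url}): {desc}")
    return "\n".join(out) + "\n"
```

Because Python dicts preserve insertion order, listing `"Optional"` last in the mapping puts it last in the file, as the spec convention expects.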
See how real companies structure their files in our examples directory.
Step 3: Write Your llms.txt File
Here is a complete template you can copy and adapt:
```
# [Your Company Name]

> [One or two sentences describing what your company does.]

## Core Pages

- [Homepage](https://yoursite.com): [What the page covers]
- [Product](https://yoursite.com/product): [Features and capabilities]
- [Pricing](https://yoursite.com/pricing): [Plans and pricing details]

## Documentation

- [Getting Started](https://yoursite.com/docs): [Setup and onboarding]
- [API Reference](https://yoursite.com/api): [Technical documentation]

## Optional

- [Blog](https://yoursite.com/blog): [Updates and insights]
- [About](https://yoursite.com/about): [Company background]
```

Writing good descriptions
Do this
```
- [Pricing](https://acme.com/pricing): Plans from free to enterprise, billed monthly or annually
- [API Docs](https://acme.com/api): REST API reference with authentication, endpoints, and rate limits
```
Not this
```
- [Pricing](https://acme.com/pricing): Click here to see our amazing prices!
- [API Docs](https://acme.com/api): Documentation
```
Descriptions should be factual and specific. Avoid marketing language, superlatives, and vague labels. AI systems need facts, not pitches.
Skip the manual work
Our free generator creates a spec-compliant llms.txt from your sitemap in under a minute.
Try the Generator
Step 4: Create llms-full.txt (Optional)
llms-full.txt is a companion file that includes the full content of your key pages, not just links and descriptions. It is useful for documentation sites, knowledge bases, and any site where AI systems would benefit from reading complete page content rather than just summaries.
When to use it:
- Your site is primarily documentation or educational content
- You want AI systems to have deep knowledge of your product
- Your pages have substantial text content worth indexing in full
When to skip it:
- Your site is primarily visual (e-commerce product pages with images)
- Your content changes frequently (the file would constantly be outdated)
- The standard llms.txt file is sufficient for your needs
File size: Keep llms-full.txt under the context window limits of current AI models. A practical limit is roughly 100,000 tokens (~75,000 words). For most sites, this is more than enough.
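A quick way to sanity-check that size, assuming the common rough heuristic of about 4 characters per token (real tokenizers vary by model):

```python
def estimate_tokens(text):
    """Rough token estimate using the ~4-characters-per-token heuristic.
    This is an approximation, not a real tokenizer."""
    return len(text) // 4

def within_budget(text, max_tokens=100_000):
    """True if the text fits the practical ~100,000-token limit."""
    return estimate_tokens(text) <= max_tokens
```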
Step 5: Validate Your File
Before deploying, check your file for common issues:
Broken or relative URLs
All URLs must be absolute (https://yoursite.com/page, not /page)
Missing description blockquote
The > line after your H1 title is required by the spec
Wrong MIME type
Server must return text/plain or text/markdown, not text/html
Links to non-existent pages
Test every URL in your file -- 404s erode trust with AI systems
Including too many pages
This is a curated summary, not a sitemap. 5-20 key pages is the sweet spot
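Most of these checks can be automated before you deploy. A minimal validator sketch in Python (the function name and error messages are mine; link liveness still needs a separate HTTP pass):

```python
import re

# Matches "- [Title](URL)" at the start of a line and captures the URL
LINK_RE = re.compile(r"^- \[[^\]]+\]\((\S+)\)")

def validate_llms_txt(text):
    """Return a list of problems found in an llms.txt document:
    missing H1 title, missing blockquote description, non-absolute
    URLs, and an uncurated link count (outside 5-20)."""
    lines = [l.rstrip() for l in text.splitlines() if l.strip()]
    problems = []
    if not lines or not lines[0].startswith("# "):
        problems.append("missing H1 title on the first line")
    if len(lines) < 2 or not lines[1].startswith("> "):
        problems.append("missing blockquote description after the title")
    urls = [m.group(1) for l in lines if (m := LINK_RE.match(l))]
    for url in urls:
        if not url.startswith(("https://", "http://")):
            problems.append(f"relative or malformed URL: {url}")
    if not 5 <= len(urls) <= 20:
        problems.append(f"{len(urls)} links found; aim for 5-20 curated pages")
    return problems
```

An empty result means the file passes these structural checks; you should still request every URL to catch 404s.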
Step 6: Deploy to Your Website
Your llms.txt file needs to be accessible at https://yoursite.com/llms.txt. How you get it there depends on your platform.
WordPress
Option A (simplest): Upload llms.txt to your WordPress root directory via FTP/SFTP (the same directory that contains wp-config.php).
Option B: Use a plugin like Yoast SEO or Rank Math that supports custom file serving, or a file manager plugin to place the file in your root.
Verify: Visit yoursite.com/llms.txt in your browser. You should see the raw markdown text.
For a detailed walkthrough, see our WordPress implementation guide.
Shopify
Option A: Create a new page with the handle llms-txt and paste your content as plain text. Then set up a URL redirect from /llms.txt to /pages/llms-txt.
Option B: Use a Liquid template to serve the file with the correct content type. Create a new template page.llmstxt.liquid with content_type: "text/plain".
Verify: Visit yourshop.com/llms.txt and confirm the content loads as plain text.
Webflow
Upload your llms.txt file via the Asset Manager, then create a 301 redirect from /llms.txt to the asset URL in Project Settings > Hosting > 301 Redirects.
Note: Webflow serves assets from a CDN subdomain. The redirect ensures the file is accessible at your root domain.
Next.js
Static: Place llms.txt in your /public directory. It will be served automatically at the root.
Dynamic: Create an API route or route handler at /llms.txt that returns the content with Content-Type: text/plain.
Static hosting / custom servers
Upload llms.txt to your web root via FTP, SFTP, or your hosting file manager. The file should sit alongside your index.html, robots.txt, and sitemap.xml.
Verify: Run curl -I https://yoursite.com/llms.txt and confirm the status is 200 and the content type is text/plain.
Step 7: Monitor and Maintain
An llms.txt file is not a set-it-and-forget-it asset. Review it when:
- You add or remove key pages -- new products, deprecated features, removed content
- You restructure your site -- navigation changes, URL migrations, new sections
- Quarterly review -- check links still work, descriptions are accurate, nothing important is missing
- Your business focus shifts -- new target audience, new product lines, pivoted messaging
You can monitor AI bot activity by checking your server logs for requests to /llms.txt. Look for user agents containing "GPTBot", "ClaudeBot", "Google-Extended", or "PerplexityBot". For more on tracking AI traffic, see our measuring AI traffic guide.
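A minimal sketch of that log check, assuming combined-format access logs where the user agent appears in the line (the function name and path default are mine):

```python
# User-agent substrings for the major AI crawlers named above
AI_BOTS = ("GPTBot", "ClaudeBot", "Google-Extended", "PerplexityBot")

def ai_bot_hits(log_lines, path="/llms.txt"):
    """Count requests to `path` per AI bot, given raw access log lines.
    A simple substring match; refine with a real log parser if needed."""
    counts = {bot: 0 for bot in AI_BOTS}
    for line in log_lines:
        if path not in line:
            continue
        for bot in AI_BOTS:
            if bot in line:
                counts[bot] += 1
    return counts
```

Run it over a day's log file (`ai_bot_hits(open("access.log"))`) to see which AI crawlers are already fetching your file.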
Real-World llms.txt Examples
Here are three approaches from real implementations in our examples directory:
Documentation sites typically organize by content type: getting started, API reference, guides, tutorials. The structure mirrors the documentation sidebar, making it intuitive for AI to navigate.
Sections: Getting Started, API Reference, Guides, Tutorials, Optional
SaaS companies often organize by what visitors are looking for: product info, pricing, support, and resources. This approach works well when your audience ranges from evaluators to existing customers.
Sections: Product, Pricing, Documentation, Resources, Optional
E-commerce sites benefit from listing top product categories rather than individual products. Include your shopping policies, shipping info, and customer service pages so AI can answer buyer questions accurately.
Sections: Shop, Categories, Customer Service, About, Optional
Browse all 840+ real implementations to find examples in your industry.
Common Mistakes to Avoid
| Mistake | Why it matters | Fix |
|---|---|---|
| Including every page | Dilutes importance signals; AI cannot tell what matters | Curate to 5-20 key pages |
| Marketing language in descriptions | AI needs facts, not sales copy | Use factual, specific descriptions |
| Relative URLs | AI systems may not resolve relative paths | Always use full https:// URLs |
| Never updating | Stale file links to removed pages, misses new content | Review quarterly or after major changes |
| Wrong MIME type | AI parsers may not process HTML-served markdown | Serve as text/plain |
| Confusing with robots.txt | llms.txt provides content guidance, not access control | Use both files for their intended purposes |
For optimization techniques beyond avoiding mistakes, see our llms.txt best practices guide.
Frequently Asked Questions
Do AI systems actually use llms.txt?
As of early 2026, no major AI chat provider has publicly confirmed they parse llms.txt during web browsing. However, several AI coding tools and documentation platforms actively support it, and adoption is growing steadily with 840+ implementations tracked in our examples directory. The low cost of implementation makes it a reasonable bet on the future of AI search.
How long should an llms.txt file be?
Keep it concise -- typically 20-50 lines covering your 5-20 most important pages. This is a curated summary, not a complete site index. If you need to provide more detail, use llms-full.txt as a companion file.
Do I need both llms.txt and llms-full.txt?
No. llms.txt alone is sufficient for most websites. llms-full.txt is an optional companion file that includes full page content -- primarily useful for documentation sites or knowledge bases where AI systems would benefit from reading complete articles rather than just summaries.
Will llms.txt affect my SEO?
No. llms.txt is a separate file that does not affect your existing search engine rankings. It complements your SEO efforts by providing AI-specific context alongside your existing robots.txt and sitemap.xml. There is no downside risk to adding one.
How often should I update my llms.txt file?
Review it quarterly or whenever you make significant site changes: adding new products, removing pages, restructuring navigation, or changing your business focus. If your site content is relatively stable, a quarterly check is sufficient.
What is the difference between llms.txt and robots.txt?
robots.txt controls which pages crawlers can access -- it sets boundaries. llms.txt provides a structured summary of your content for AI understanding -- it gives context and priority. They serve different purposes and you should use both. See our SEO vs. AI Optimization guide for a deeper comparison.
Does llms.txt work on my platform?
Yes. llms.txt is a plain text file that works on any web platform. See the deployment section above for platform-specific instructions. For a detailed WordPress guide, see our WordPress implementation guide.
Can I generate an llms.txt file automatically?
Yes. Our free llms.txt generator creates a spec-compliant file from your sitemap. It analyzes your site structure and generates a properly formatted file you can download and deploy. No credit card required.
What format and technical requirements does llms.txt have?
llms.txt must be a plain text file using markdown formatting. It should be served with a text/plain or text/markdown MIME type, encoded in UTF-8, and publicly accessible without authentication at yoursite.com/llms.txt.
Who created the llms.txt standard?
The llms.txt specification was created by Jeremy Howard, co-founder of fast.ai and a widely respected figure in the AI and machine learning community. The spec is open and maintained at llmstxt.org.
Start Creating Your llms.txt
Join 840+ companies that have already prepared for AI search. Generate your file automatically or validate an existing one.