Tristan Watson, Founder · March 29, 2026 · 12 min read

How to Do GEO: A Step-by-Step Guide

Most GEO guides tell you what generative engine optimization is. This one tells you how to do it -- step by step, with technical details, so you can start today.

Generative engine optimization (GEO) is the practice of making your website citable by AI search engines. ChatGPT, Claude, Perplexity, Gemini -- they are all answering questions about your industry right now. GEO determines whether they mention you or your competitors.

If you have read the explainers and want the actionable playbook, this is it. We will walk through every technique, in order, with the technical details you need to implement each one.


The GEO Lifecycle: 5 Phases

GEO is not a single tactic. It is a lifecycle. At llmstxt.studio, we have catalogued 840+ sites with llms.txt files in our Examples Directory and tracked what separates sites that get cited from sites that do not.

The pattern is consistent. Effective GEO follows five phases: Scan, Build, Understand, Monitor, and Measure. The techniques below map to each phase.


Phase 1: Scan -- Audit Your AI Readiness

Before you optimize anything, you need to know where you stand. A GEO audit checks five things:

  1. llms.txt presence -- Does your site have an llms.txt file at the root?
  2. Structured data -- Does your site use schema.org markup that AI can parse?
  3. Robots.txt AI directives -- Can AI crawlers actually access your site?
  4. Content structure -- Is your content organized with clear headings, lists, and definitions?
  5. Sitemap health -- Does your XML sitemap reflect your current pages?

Run a free AI Readiness Check to get scored on all five factors in 30 seconds. No signup required. This gives you a baseline before you start optimizing.
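For a rough sense of what such an audit does, here is a minimal sketch in Python. The heuristics are deliberately crude placeholders, not llmstxt.studio's actual scoring, and the function assumes you have already fetched each artifact as text (with None where the request 404'd):

```python
import re

def ai_readiness(llms_txt, robots_txt, sitemap_xml, homepage_html):
    """Score the five audit factors with simple presence heuristics."""
    checks = {
        # 1. Is there an llms.txt at all?
        "llms.txt present": llms_txt is not None,
        # 2. Any JSON-LD structured data on the homepage?
        "structured data": bool(homepage_html) and "application/ld+json" in homepage_html,
        # 3. No blanket "Disallow: /" line (per-agent rules are checked in Phase 2).
        "AI crawler access": robots_txt is None
            or not re.search(r"(?m)^Disallow:\s*/\s*$", robots_txt),
        # 4. At least some heading structure to parse.
        "content structure": bool(homepage_html) and "<h2" in homepage_html,
        # 5. A sitemap that actually contains a <urlset>.
        "sitemap health": bool(sitemap_xml) and "<urlset" in sitemap_xml,
    }
    return checks, sum(checks.values())
```

A passing site scores 5; anything lower tells you which factor to fix first.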

Why this matters

Most sites fail on at least 2 of these 5 factors. The audit tells you exactly where to focus your effort instead of guessing.


Phase 2: Build -- Create Your AI Profile

This is the highest-impact phase. You are building the structured foundation that AI uses to understand your site.

Technique 1: Generate and deploy an llms.txt file

llms.txt is to AI what robots.txt is to search engines. It is a plain-text file that lives at yoursite.com/llms.txt and gives AI a structured summary of your website.

The llms.txt specification defines the format. Here is what a well-structured file looks like:

# Your Business Name

> A one-line description of what your business does.

## About
- [About Us](https://yoursite.com/about): Who you are, your expertise, your history.

## Services
- [Web Design](https://yoursite.com/services/web-design): Custom website design for small businesses.
- [SEO Services](https://yoursite.com/services/seo): Search engine optimization and content strategy.

## Resources
- [Blog](https://yoursite.com/blog): Industry insights and how-to guides.
- [Case Studies](https://yoursite.com/case-studies): Real results from real clients.

The key structural elements:

  • H1 heading (#) -- Your business or site name
  • Blockquote (>) -- A concise description of what you do
  • H2 sections (##) -- Logical groupings of your content
  • Markdown links with descriptions -- Each page with context about what it contains

You can write this by hand, or use a tool like the llmstxt.studio generator to create it automatically from your sitemap. The generator reads your sitemap, categorizes your pages, and produces a spec-compliant file in seconds.

Deploy it by uploading the file to your site's root directory -- the same place your robots.txt lives. On WordPress, that is usually /public_html/. On Vercel or Netlify, put it in your /static or /public folder.
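To illustrate the sitemap-to-llms.txt approach, here is a minimal sketch. The function name is hypothetical, and it only derives placeholder titles from URL slugs; a real generator would visit each page and write descriptive summaries:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def llms_txt_from_sitemap(sitemap_xml, site_name, tagline):
    """Build a minimal, spec-shaped llms.txt from a sitemap's <loc> entries."""
    root = ET.fromstring(sitemap_xml)
    urls = [loc.text.strip() for loc in root.findall(".//sm:loc", SITEMAP_NS)]
    lines = [f"# {site_name}", "", f"> {tagline}", "", "## Pages"]
    for url in urls:
        # Derive a placeholder title from the URL slug.
        slug = urlparse(url).path.strip("/").rsplit("/", 1)[-1]
        title = slug.replace("-", " ").title() or "Home"
        lines.append(f"- [{title}]({url})")
    return "\n".join(lines) + "\n"
```

Running it over a two-page sitemap yields the H1, blockquote, and linked sections described above, ready for hand-editing.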

Technique 2: Add structured data (schema.org)

Structured data gives AI explicit context about your content. Instead of inferring that a page is a product page from the HTML, the AI reads @type: Product and knows immediately.

The highest-impact schema types for GEO:

Schema Type   | Use When             | GEO Impact
------------- | -------------------- | -----------------------------------
Organization  | Homepage, About page | Establishes entity identity
LocalBusiness | Any local business   | Critical for "near me" AI queries
FAQPage       | FAQ sections         | Direct answer extraction by AI
HowTo         | Tutorial content     | Step-by-step answer formatting
Product       | Product pages        | Price, availability, review signals
Article       | Blog posts           | Authority and freshness signals

Add structured data as JSON-LD in the <head> of each page. Most CMS platforms have plugins for this -- Yoast for WordPress, or built-in support in Shopify and Squarespace.
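If you prefer to add it by hand, a minimal Organization snippet looks like the sketch below; the name, URLs, and social profile are placeholders to replace with your own:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Your Business Name",
  "url": "https://yoursite.com",
  "description": "A one-line description of what your business does.",
  "sameAs": ["https://www.linkedin.com/company/yourbusiness"]
}
</script>
```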

Technique 3: Configure robots.txt for AI crawlers

Your robots.txt file might be blocking AI from reading your site. Many default configurations block crawlers like GPTBot, ClaudeBot, and PerplexityBot.

Check your robots.txt for these directives and make sure they are not blocking the crawlers you want:

# Allow AI crawlers (add these to your robots.txt)
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

If your robots.txt contains Disallow: / for any of these user agents, AI cannot crawl your site -- and it cannot cite what it cannot read. Our AI Readiness Check tests 8 AI crawlers against your robots.txt automatically.
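You can run a quick version of this check yourself with Python's built-in robots.txt parser. The helper name and agent list below are my own, not a standard API:

```python
from urllib.robotparser import RobotFileParser

AI_AGENTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

def blocked_ai_crawlers(robots_txt, site_url="https://yoursite.com/"):
    """Return the AI crawler user-agents this robots.txt would block."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [agent for agent in AI_AGENTS if not parser.can_fetch(agent, site_url)]
```

Feed it the contents of your live robots.txt; any agent it returns cannot read your site.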


Phase 3: Understand -- Optimize Your Content for AI

With the technical foundation in place, the next phase is making your content easier for AI to understand, extract, and cite.

Technique 4: Write citation-worthy content

AI cites content that directly and authoritatively answers questions. The pattern is specific:

  • Lead with definitions. If someone asks "what is [your topic]," the AI needs a clear, quotable sentence. Put it in the first paragraph -- not buried in paragraph four.
  • Use concrete data. Numbers, statistics, and specific claims are more citable than vague assertions. "We serve 200+ clients in Austin" beats "We serve many clients."
  • Structure with headings. AI parses H2 and H3 tags as topic boundaries. A well-structured page with clear headings is easier for AI to extract answers from.
  • Answer questions directly. FAQ sections, how-to steps, and definition paragraphs give AI clean extraction points. These are the passages that end up in AI-generated answers.

Technique 5: Build topical authority

AI search engines prefer to cite authoritative sources. Authority in GEO comes from:

  • Content depth. Cover your topic thoroughly. Multiple related articles on the same subject signal expertise.
  • Internal linking. Link related pages to each other. This helps AI understand the relationships between your content.
  • External references. When authoritative sources link to you or mention you, AI treats that as a trust signal.
  • Freshness. Updated content with recent dates signals that the information is current. AI deprioritizes stale content.

This is where GEO and SEO overlap significantly. The content work that builds authority for Google also builds authority for AI. The difference is that GEO requires the structural layer (llms.txt, schema, crawler access) on top of the content layer.

Technique 6: Enhance your llms.txt with AI descriptions

A basic llms.txt file lists your pages with short descriptions. An enhanced one includes AI-generated summaries that give the language model rich context about each page's content.

The difference:

# Basic
- [Blog](https://yoursite.com/blog): Our blog.

# Enhanced
- [Blog](https://yoursite.com/blog): Industry analysis, how-to guides, and case studies on small business web design. Published weekly since 2019. Topics include responsive design, conversion optimization, and CMS selection for non-technical founders.

The enhanced version gives AI far more to work with. When someone asks about "small business web design guides," the AI knows exactly what your blog covers. llmstxt.studio's AI Enhancement feature visits each page on your site and writes these descriptions automatically.


Phase 4: Monitor -- Keep Your AI Profile Current

Technique 7: Set up Sitemap Monitoring

Your website changes. You add blog posts, update services, launch new products. If your llms.txt file does not reflect those changes, AI is working with outdated information.

Sitemap Monitoring detects when your sitemap changes and alerts you that your llms.txt needs updating. This prevents a common failure mode: you do GEO once, forget about it, and six months later your AI profile describes a business that no longer matches reality.

Manual check: compare your sitemap's page count to your llms.txt page count quarterly. Automated: use a monitoring tool that watches your sitemap for changes. llmstxt.studio checks daily on Pro plans and hourly on Premium.
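The manual comparison can be scripted in a few lines. This sketch (function name and output shape are illustrative) diffs the URLs in your sitemap against the markdown links in your llms.txt:

```python
import re
import xml.etree.ElementTree as ET

def llms_txt_drift(sitemap_xml, llms_txt):
    """Compare sitemap URLs against the URLs linked from llms.txt."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    sitemap_urls = {loc.text.strip()
                    for loc in ET.fromstring(sitemap_xml).findall(".//sm:loc", ns)}
    llms_urls = set(re.findall(r"\]\((https?://[^)\s]+)\)", llms_txt))
    return {
        "missing": sorted(sitemap_urls - llms_urls),  # pages AI does not know about
        "stale": sorted(llms_urls - sitemap_urls),    # entries no longer in the sitemap
    }
```

Anything in "missing" needs an llms.txt entry; anything in "stale" should be pruned.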


Phase 5: Measure -- Track AI Citations

This is where most GEO guides stop -- at implementation. But without measurement, you are optimizing blind. The critical question: is AI actually citing you?

Why citation tracking matters

You can implement every technique above and still not get cited. Maybe a competitor has stronger authority. Maybe AI does not consider your industry a good fit for citation. Maybe your content is structured but not authoritative enough.

Citation tracking answers these questions with data:

  • Are you being cited? Yes or no, for specific queries in your industry.
  • Who is being cited instead? Competitor intelligence shows you exactly who AI recommends.
  • Is it improving? Trend data shows whether your GEO work is moving the needle.

Run an AI Citation Check to see where you stand. The check generates industry-specific queries from your llms.txt content, runs them against AI search engines, and reports whether your domain appears in the citations -- along with every competitor that does.


GEO Best Practices Checklist

Here is a summary of every technique, organized as a checklist you can work through:

Scan

  • Run an AI Readiness Check to establish your baseline
  • Check robots.txt for blocked AI crawlers
  • Verify your XML sitemap is current

Build

  • Generate a spec-compliant llms.txt file
  • Deploy it to your site root (yoursite.com/llms.txt)
  • Add schema.org structured data to key pages
  • Allow GPTBot, ClaudeBot, and PerplexityBot in robots.txt

Understand

  • Lead pages with clear definitions and direct answers
  • Use concrete data and specific claims
  • Structure content with descriptive H2/H3 headings
  • Add FAQ sections with question-and-answer format
  • Enhance llms.txt descriptions with rich page summaries

Monitor

  • Set up Sitemap Monitoring to detect content changes
  • Update llms.txt when you add or remove pages
  • Review and refresh llms.txt descriptions quarterly

Measure

  • Run AI Citation Checks to track who AI recommends
  • Review competitor data to understand the competitive landscape
  • Track citation trends over time to measure GEO impact


Common GEO Mistakes

After analyzing 840+ sites in our Examples Directory, these are the patterns we see in sites that fail to get cited:

  • Generating an llms.txt file but never deploying it. The file needs to be live at your site root. An undeployed file does not exist to AI.
  • Blocking AI crawlers in robots.txt. Many WordPress security plugins block unknown bots by default. Check your directives.
  • Writing vague descriptions. "Our blog" tells AI nothing. "Weekly guides on commercial real estate investing for first-time buyers" tells it everything.
  • Optimizing once and forgetting. GEO is a lifecycle. Your site changes, and your AI profile needs to keep up.
  • Not measuring. Without citation tracking, you have no idea if your work is paying off. You could be investing time in techniques that are not moving the needle for your specific industry.

Frequently Asked Questions

How long does GEO take to show results?

The technical foundation -- llms.txt, structured data, robots.txt -- takes under an hour. Content optimization is ongoing. Most sites see changes in AI citation behavior within 2-4 weeks after deploying structured improvements.

Do I need a developer to do GEO?

No. Generating an llms.txt file, adding structured data via CMS plugins, and configuring robots.txt are all non-technical tasks. Tools like llmstxt.studio automate the file generation entirely -- you just review and deploy.

What is the most important GEO technique?

Creating and deploying an llms.txt file. It gives AI a structured summary of your site instead of forcing it to guess from raw HTML. Combine it with citation tracking to measure results, and you have the GEO foundation covered.

Can I do GEO and SEO at the same time?

Yes -- and you should. They are complementary. Structured data helps both Google and AI. Quality content ranks on Google and gets cited by AI. The GEO-specific additions (llms.txt, AI crawler access, citation tracking) do not conflict with any SEO work.


Start With Your AI Readiness Check

AI is recommending someone in your industry right now. The techniques above work -- but only if you know where to start.

Run a free AI Readiness Check to see how your website scores across 5 AI-readiness factors. 30 seconds. No signup required. Then follow the lifecycle: Scan, Build, Understand, Monitor, Measure.
