Your Website Has a New Visitor, and It’s Not Human
I’ve been optimizing websites for search engines for over two decades. The visitor changed once when Google replaced directories. It changed again when mobile overtook desktop. Now it’s changing a third time, and most people aren’t paying attention.
AI agents are browsing the web on behalf of users. Not theoretically. Not “someday.” Right now. And when they hit your site, they’re choking on the same HTML soup that browsers love: nav bars, JavaScript bundles, cookie consent modals, CSS frameworks. All of it useless to an agent that just needs your content.
David Cramer, co-founder of Sentry, published his company’s implementation on March 12, 2026. Cloudflare is already doing it at scale. This isn’t a whitepaper. It’s production code.
The 80% Problem
When an AI agent requests your homepage, it gets everything a Chrome browser would get. Every script tag, every stylesheet, every tracking pixel. Cloudflare ran the benchmark: serving markdown instead of HTML reduces token consumption by 80%. Checkly’s testing showed 99.7% reduction in some cases.
That’s not optimization. That’s the difference between an agent being able to use your site and giving up.
The agents aren’t going to tell you they bounced. There’s no analytics dashboard for “AI agent couldn’t parse your navigation menu.” You just don’t show up when someone asks their assistant to find a service like yours.
Four Standards Are Emerging
I’ve been tracking four approaches that are gaining real traction:
1. Content Negotiation
This one’s almost funny. The HTTP Accept header has existed since 1997. A client sends Accept: text/markdown, and your server can respond with clean markdown instead of HTML. The plumbing was always there. We just never had a visitor that needed it.
Cloudflare is implementing this across their infrastructure. Cramer documented exactly how Sentry handles it: detect the agent, strip the browser chrome, serve structured content. Redirect auth pages to programmatic alternatives like APIs or CLI tools.
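The negotiation itself is only a few lines. Here's a minimal sketch in plain Node (no framework); the route table, page content, and site name are hypothetical placeholders, not Sentry's or Cloudflare's actual implementation:

```javascript
// Minimal content-negotiation sketch for a Node server.
// Pass handleRequest to http.createServer(handleRequest).listen(3000).

// Does the Accept header ask for markdown?
function wantsMarkdown(acceptHeader) {
  return (acceptHeader || "").toLowerCase().includes("text/markdown");
}

// Markdown versions of pages: content and links only.
// No nav, no scripts, no stylesheets, no consent modals.
const markdownPages = {
  "/": "# Acme Co\n\nWe build widgets.\n\n- [Pricing](/pricing)\n- [Docs](/docs)",
};

function handleRequest(req, res) {
  const md = markdownPages[req.url];
  if (md && wantsMarkdown(req.headers.accept)) {
    // Agent path: clean markdown.
    res.writeHead(200, { "Content-Type": "text/markdown; charset=utf-8" });
    res.end(md);
  } else {
    // Browser path: fall through to the normal HTML response.
    res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
    res.end("<!doctype html><title>Acme Co</title><p>Full browser experience here.</p>");
  }
}
```

Same URL, two representations. Browsers never notice; agents skip the 80% of bytes they couldn't use anyway.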
2. llms.txt
Think of it as robots.txt for language models. A curated markdown file in your root directory that tells LLMs what your site is about and where to find the important stuff. Jeremy Howard from fast.ai proposed it, and over 600 sites have adopted it, including Anthropic, Stripe, and Cloudflare.
I’ll be honest: no major LLM search engine officially supports llms.txt yet. Google’s John Mueller has pushed back on the idea of serving markdown to crawlers. But 600+ implementations from serious companies tell me the direction is clear, even if official support hasn’t caught up. Two competitive sites I track are already using it: SeatGeek and Seer Interactive.
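The format is deliberately simple: an H1 with the site name, a blockquote summary, then H2 sections listing your key pages as markdown links with one-line descriptions. A hypothetical example (site name, URLs, and descriptions are placeholders):

```markdown
# Acme Co

> Acme builds project-management widgets for small teams.

## Docs

- [Quickstart](https://example.com/docs/quickstart.md): Install and create a first project
- [API reference](https://example.com/docs/api.md): REST endpoints and authentication

## Optional

- [Blog](https://example.com/blog.md): Product announcements
```

The "Optional" section is part of the proposal: it marks content an LLM can skip when context is tight.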
3. Agent Skills
This goes beyond content. agentskills.io provides packaged, executable instructions that tell agents what they can actually DO on your site. Not just “here’s our pricing page” but “here’s how to check availability and book a consultation.” It started at Anthropic and is now an open standard.
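A skill is a folder containing a SKILL.md file: YAML frontmatter with a name and description, followed by step-by-step instructions the agent can execute. A hedged sketch, with the skill name and endpoints invented for illustration:

```markdown
---
name: book-consultation
description: Check availability and book a consultation on example.com
---

# Booking a consultation

1. GET https://example.com/api/availability to list open slots.
2. POST https://example.com/api/bookings with { "slot_id": ..., "email": ... }.
3. Report the booking reference back to the user.
```

The frontmatter is what agents scan first; the body only loads when the skill is actually needed, which keeps token cost low.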
4. WebMCP
This is the one with institutional weight behind it. WebMCP is a W3C proposal from Google and Microsoft that lets websites expose structured tools directly to AI agents through the browser. Instead of an agent screenshotting your page and guessing where to click, your site tells the agent exactly what it can do. Register a JavaScript function with navigator.modelContext.registerTool(), give it a description and schema, and agents can call it natively. It shipped in Chrome Canary in February 2026. Think of it as the difference between an agent fumbling through your UI and your site handing it a menu.
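Here's roughly what that registration looks like. This is a sketch against the draft proposal, so the exact field names may shift as the spec evolves; the tool name, schema, and availability endpoint are my own illustrative assumptions:

```javascript
// Hypothetical WebMCP tool registration sketch.
// Field names follow the draft proposal and may change.
const availabilityTool = {
  name: "check_availability",
  description: "List open consultation slots for a given date (YYYY-MM-DD).",
  inputSchema: {
    type: "object",
    properties: { date: { type: "string" } },
    required: ["date"],
  },
  // Called by the agent with arguments matching the schema.
  async execute({ date }) {
    const res = await fetch(`/api/availability?date=${encodeURIComponent(date)}`);
    return await res.json();
  },
};

// Register only where the browser actually exposes the API
// (Chrome Canary behind a flag, at the time of writing).
if (typeof navigator !== "undefined" && navigator.modelContext?.registerTool) {
  navigator.modelContext.registerTool(availabilityTool);
}
```

The schema is the point: the agent never guesses what your site can do, because you declared it.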
What the Market Looks Like
The industry is already splitting. Traditional SEO is one track. AI Search Optimization, which some are calling GEO (Generative Engine Optimization), is another. Agencies are charging $1,500 to $50,000+ per month for GEO services. The numbers back it up: AI search traffic is up 527% year over year, and Semrush found that the average AI search visitor is worth 4.4x more than a traditional organic visitor.
The terminology is new but the game is the same: make your content findable by whatever’s doing the finding. Twenty years ago that meant link building. Today it means making your site parseable by machines that don’t have eyes.
The Accessibility Connection
Here’s something that surprised me. UC Berkeley and University of Michigan researchers found that AI agents perform 36% better on accessible websites. The accessibility tree, originally built for screen readers, is becoming the primary interface agents use to understand your site.
Build for humans who need help seeing your site, and you’ve built for agents too. That’s not a coincidence. That’s good architecture paying compound interest.
What to Do About It
If you’re running a business site in 2026, here’s the short list:
Serve markdown when agents request it. Implement content negotiation on your server. When the Accept header asks for text/markdown, give it clean content.
Create an llms.txt file. Put it in your root directory. List your key pages with brief descriptions.
Strip the browser junk from agent responses. No nav, no JavaScript, no CSS. Just content and links.
Restructure for hierarchy. Important information first. Links organized by relevance.
Audit your accessibility. It’s doing double duty now, and it was always the right thing to do anyway.
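You can spot-check the first item yourself by requesting your own pages the way an agent would. A small audit helper, assuming Node 18+ for the global fetch (the classification heuristic is mine, not any standard):

```javascript
// Classify what a server returned to an Accept: text/markdown request.
function classifyResponse(contentType, body) {
  const type = (contentType || "").toLowerCase();
  if (type.includes("text/markdown")) return "markdown";
  if (/^\s*</.test(body)) return "html"; // starts with a tag: browser soup
  return "unknown";
}

// Request a URL the way an agent would and report what came back.
async function auditUrl(url) {
  const res = await fetch(url, { headers: { Accept: "text/markdown" } });
  const body = await res.text();
  return classifyResponse(res.headers.get("content-type"), body);
}

// Usage: auditUrl("https://example.com/").then(console.log);
// "html" means agents are still getting the full browser payload.
```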
The visitor changed. The optimization follows. This is what we’re building at Adapt Marketing: agent-ready content optimization for businesses that want to be found by the next generation of search.

