AI Strategy
Matt Gifford · 10 min read

Your Website Can’t Talk to AI — Here’s Why That’s About to Matter

AI agents are becoming the new browsers. Six emerging standards are defining how websites communicate with them — and the early steps take minutes, not months.


Right now, your website has one job: talk to humans.

A person visits. They read. They click around. Maybe they fill out a form. That model has worked for 30 years.

It's about to break.

The next wave of traffic to your site won't come from people typing URLs or clicking Google results. It will come from AI agents — software that browses, reads, compares, and takes action on behalf of the people who used to do that themselves.

And here's the problem: your website doesn't know how to talk to them.

Not because it's broken. Because it was never designed to. Nobody's website was. But the standards that will change that are being written right now — and the businesses that move early will be the ones AI agents can actually find, understand, and recommend.

AI agents are the new browsers

Think about how you find a restaurant in a city you're visiting. Five years ago, you Googled it. Two years ago, you might have asked ChatGPT. Today, you might tell an AI agent: “Find me a quiet Italian place near the hotel, check if they have outdoor seating, and book a table for 7pm.”

That agent doesn't browse the web the way you do. It doesn't admire your hero image or read your mission statement. It needs structured, machine-readable information: what you offer, where you are, how to interact with your services. If your site can't provide that, the agent moves on — silently, without a bounce rate to warn you.

This isn't a theoretical shift. Google's A2A protocol is already stable. WebMCP is in Chrome Canary. The infrastructure is being built. The question isn't whether AI agents will interact with websites. It's whether yours will be one of them.

“If your site can't talk to AI agents, you're invisible to the fastest-growing channel of discovery. And there's no analytics dashboard to warn you.”

For SMBs, this matters more than it does for enterprise. Large companies have API teams and integration budgets. You don't. But the good news is: the early steps are surprisingly simple, and the playing field is wide open. Fewer than 1% of websites have even added an llms.txt file. The other standards are even less adopted. Moving now puts you ahead of almost everyone.

Six standards. One goal. Here's the map.

The race to define how websites talk to AI agents is happening across multiple fronts. Some of these standards compete. Some complement each other. None of them require you to rebuild anything — they're additive layers on top of what you already have.

Schema.org / JSON-LD: The Foundation (W3C; mature)

This is the one you should already have. Schema.org provides a standardised vocabulary for describing your content — what your business does, where it is, what you sell, what your blog posts are about. JSON-LD is the format Google recommends for embedding it.

If structured data is the language AI agents understand, Schema.org is the dictionary. It's been around for over a decade, but its importance is about to spike — AI agents use structured data as their primary source of truth about your site, which means the businesses with proper structured data will surface in AI recommendations. The ones without it won't.

llms.txt: The Welcome Mat (community-driven: Anthropic, Vercel, Cloudflare; ~1% adoption)

Think of llms.txt as a robots.txt for AI models. It's a simple text file at the root of your site that tells AI systems what your site is about, what content is available, and how it should be attributed.

Implementation takes about five minutes. Adoption is still under 1% — mostly because most businesses don't know it exists. That's an opportunity, not a warning.

WebMCP: The Big One (Microsoft + Google, via W3C; Chrome 146 Canary)

This is the standard to watch. WebMCP turns your website's interactive elements — contact forms, booking systems, product catalogues, search functions — into structured tools that AI agents can invoke directly.

Instead of an AI agent trying to fill out your form by simulating mouse clicks (fragile, slow, error-prone), WebMCP lets it call your form as a structured function with defined inputs and outputs. It's the difference between an AI agent scraping your site and using your site.

Chrome's early preview landed in version 146 Canary in February 2026. It supports both a declarative API (works with existing HTML forms) and an imperative JavaScript API for more complex interactions.
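The API surface is still moving while WebMCP sits in Canary, so treat the following as a conceptual sketch rather than working spec code. The registration call (`navigator.modelContext.registerTool`) and the exact descriptor shape are assumptions for illustration; what matters is the idea: a named tool, a description an agent can read, a typed input contract, and a function to invoke.

```javascript
// Hypothetical sketch of WebMCP's imperative API. The method name
// `registerTool` and the descriptor shape below are assumptions for
// illustration only; the real surface is still changing in Chrome Canary.

// A tool descriptor: name, human-readable description, a JSON-Schema-style
// input contract, and the function an agent would invoke.
const bookConsultationTool = {
  name: "book_consultation",
  description: "Book a free consultation slot with our team.",
  inputSchema: {
    type: "object",
    properties: {
      date: { type: "string", description: "ISO date, e.g. 2026-03-14" },
      email: { type: "string" },
    },
    required: ["date", "email"],
  },
  // On a real site this would call the same backend the booking form uses.
  execute(input) {
    return { status: "confirmed", date: input.date, contact: input.email };
  },
};

// Register the tool only if an agent-facing API is present (hypothetical name).
if (typeof navigator !== "undefined" && navigator.modelContext?.registerTool) {
  navigator.modelContext.registerTool(bookConsultationTool);
}
```

The key design point survives whatever the final API looks like: the agent never touches your DOM. It calls a function with validated inputs and gets structured output back.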

agents.json: The API Contract (OpenAPI extension; v0.1.0)

If your business exposes data through APIs — product feeds, availability, pricing — agents.json makes that data AI-agent-ready with standardised descriptions and discovery.

A2A Protocol: The Agent Network (Google → Linux Foundation; stable)

Not about websites talking to agents, but agents talking to each other. Think of it as the infrastructure that lets your booking agent communicate with a customer's scheduling agent.

MCP-B: The Browser Shortcut (community MCP extension; early)

In about 50 lines of code, your website can become an MCP server — exposing capabilities to AI agents without any backend infrastructure.

How we're making ShipsMind AI-ready

We don't write about things we haven't done. Here's what we've already implemented on this site — and why.

01. llms.txt (~10 minutes to implement)

Our llms.txt file went live as part of our SEO/GEO optimisation sprint. It describes what ShipsMind does, lists key pages, and provides attribution guidance. Total implementation time: about ten minutes, including the time it took to write the content description.

02. Structured data: Schema.org / JSON-LD (part of every page build)

Every blog post on this site includes Article schema and BreadcrumbList schema. The homepage carries Organisation schema. This isn't new — Google has recommended it for years — but its value is compounding. AI agents now use structured data to decide which businesses to recommend. Not having it is no longer a minor SEO gap. It's a visibility gap.
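To make the Article markup concrete, here is a minimal sketch of the kind of JSON-LD a blog post carries. The values are placeholders for illustration, not a copy of our production markup:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Your Website Can't Talk to AI — Here's Why That's About to Matter",
  "author": { "@type": "Person", "name": "Matt Gifford" },
  "datePublished": "2026-02-01",
  "publisher": { "@type": "Organization", "name": "ShipsMind" }
}
```

The block lives in a `<script type="application/ld+json">` tag in the page's HTML, invisible to human readers but the first thing a machine looks for.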

03. SEO/GEO Optimisation (ongoing discipline)

GEO — Generative Engine Optimisation — is SEO's younger sibling: optimising content for AI-generated search results, not just traditional rankings. We've structured our content, metadata, and internal linking specifically for both human readers and AI systems that summarise and recommend.

“The work we're doing now is additive — it makes our site better for humans and readable for AI. There's no trade-off. That's rare in technology, and it won't last.”

Why WebMCP changes everything

Here's the part where it gets concrete.

Imagine a potential client asks their AI agent: “Find an AI consulting firm that works with small businesses and can start this month.” Today, that agent would scrape search results, read websites, and try to piece together an answer from unstructured text.

With WebMCP, that agent could directly query our site's capabilities as structured tools: what services we offer, our availability, how to book a consultation — all without rendering a single pixel of HTML.

Now scale that to your business:

  • Your contact form becomes a structured function any AI agent can call
  • Your product catalogue becomes a queryable database, not a page to scroll
  • Your booking system becomes an API any agent can invoke on a user’s behalf

That's not a small change. That's a new distribution channel.

We're tracking WebMCP's progress through Chrome's release cycle. When it reaches stable, we'll implement it — and we'll write about the process when we do. For now, the right move is awareness and planning, not premature implementation.

Three things you can do this week

You don't need to understand every standard. You don't need a developer on retainer. Here's what matters right now:

01. Add llms.txt (5 minutes)

Create a plain text file at yoursite.com/llms.txt. Describe what your business does, list your key pages, and state how you'd like content attributed. That's it. You've just made your site more legible to every AI system that checks for it.
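A minimal llms.txt might look like the sketch below. The business, URLs, and wording are invented for illustration; the format is simple Markdown, a heading, a short summary, and a list of key pages:

```markdown
# Acme Widgets

> Acme Widgets sells handmade garden tools to UK hobbyists.
> Please attribute quoted content to "Acme Widgets" with a link.

## Key pages

- [Products](https://www.acmewidgets.example/products): full catalogue with pricing
- [About](https://www.acmewidgets.example/about): company history and contact details
- [Blog](https://www.acmewidgets.example/blog): care guides and how-tos
```

Save it as plain text at the root of your domain and you're done.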

02. Add Schema.org JSON-LD (1–2 hours)

If you're on WordPress, Shopify, or Squarespace, there are plugins that generate this automatically. If you have a custom site, your developer can add Organisation, LocalBusiness, and Article schemas in an afternoon. This is the single highest-impact change for AI readability.
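For a small local business, the markup your developer (or plugin) generates looks something like this sketch. All details here are placeholder values, swap in your own:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Widgets",
  "url": "https://www.acmewidgets.example",
  "telephone": "+44 20 7946 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 High Street",
    "addressLocality": "London",
    "postalCode": "SW1A 1AA",
    "addressCountry": "GB"
  },
  "openingHours": "Mo-Fr 09:00-17:30"
}
```

Drop it into a `<script type="application/ld+json">` block in your page head, then check it with a structured data validator before shipping.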

03. Watch WebMCP (ongoing)

You don't need to implement it yet. But you should know it exists, understand what it will enable, and start thinking about which of your website's interactive features would benefit from being AI-agent-accessible. When Chrome ships it in stable, you'll want to be ready — not scrambling.


Your website is about to need a second language. We can help you learn it.

Making your site AI-readable isn't a one-time project — it's an ongoing capability. We help businesses identify which standards matter for their situation, implement them correctly, and stay ahead as the landscape evolves. If you're not sure where to start, that's exactly the conversation we're built for.

See Where Your Site Stands

No jargon, no pressure, no commitment. Just a clear picture of where your site stands and what to do next.