The new standard for AI-ready websites

Your Website Is
Invisible to AI.
Fix It in Minutes.

AI Website Map crawls your site and generates every file that LLMs, AI search engines, and AI agents need to understand, reference, and work with your content — automatically.

Free Score — No Sign-Up Required

Instant scan · 100% free · no account needed

15
AI files generated
0–10
readiness score
<2 min
average generation time
The Problem

The AI Revolution Is Happening Without Your Website

Billions of queries are answered by AI every day — but most websites are completely invisible to these systems. Not because the content isn't good, but because it isn't in a format AI can read.

🤖

LLMs can't understand your site

ChatGPT, Claude, and Gemini summarize the web using structured files — not raw HTML. Without llms.txt or knowledge.json, these models either misrepresent your content or ignore it entirely.

🕵️

AI agents don't know what you offer

Autonomous AI agents need an agent-manifest.json to discover your API endpoints, capabilities, and workflows. Without it, agents cannot integrate with or recommend your product.

📉

You're losing AI search traffic

Perplexity, SearchGPT, and AI Overviews now answer millions of queries directly. Sites with proper AI-readable files are cited 3–5× more often. Your competitors are already adapting.

The Solution

One Tool. 15 Files. Complete AI Visibility.

AI Website Map crawls your site — with or without JavaScript — and generates the complete suite of AI-readable files in under two minutes. Download a single zip and deploy to your root domain.

🗺️

Smart crawling

Auto-detects JavaScript-heavy sites and uses headless Chromium when needed. Discovers pages via sitemap.xml, internal links, and subdomains.

📦

All 15 files, ready to deploy

Every file the AI ecosystem expects — from the LLM-optimized llms.txt to the agent-ready agent-manifest.json — bundled in one zip with a deployment guide.

AI Readiness Score

Get a 0–10 score showing exactly which files are missing and what impact deploying them would have on your AI discoverability.

How It Works

Up and Running in Three Steps

No technical setup. No scraping scripts. Just your URL and two minutes.

1

Enter your URL

Paste in any public website URL. AI Website Map handles the rest — no login, no code changes, no configuration needed.

2

We crawl & analyze

Our crawler reads your pages, discovers your structure, extracts headings, content, API endpoints, and metadata — with full JavaScript support for React, Next.js and SPA sites.

3

Download your AI files

Get a zip of all 15 AI-readable files plus a deployment guide. Upload them to your root domain. Your website is now fully visible to every major AI platform within minutes.

What You Get

Every File the AI Ecosystem Needs From You

Each file serves a specific purpose in the emerging AI web. Together they give your site complete coverage across LLMs, AI search, and autonomous agents.

LLM & AI Search
.txt
llms.txt
/llms.txt

The primary summary file for LLMs. Explains your site's purpose and key pages in plain language — directly cited by ChatGPT, Claude, Perplexity and Gemini.

LLM Summary
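The proposed llms.txt format (llmstxt.org) is Markdown: an H1 with the site name, a blockquote summary, then H2 sections listing key pages as links. A minimal sketch for a hypothetical site — all names and URLs below are placeholders:

```markdown
# Example Corp

> Example Corp makes scheduling software for small teams.

## Docs

- [Getting started](https://example.com/docs/start): Install and first steps
- [API reference](https://example.com/docs/api): REST endpoints and authentication

## Optional

- [Blog](https://example.com/blog): Product updates and announcements
```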
.txt
llms-full.txt
/llms-full.txt

Complete content dump of every crawled page — for LLMs that need to deeply understand your full site to answer detailed questions.

Full Content
.json
knowledge.json
/knowledge.json

Structured machine-readable site data — organization, pages, navigation, contact, and metadata. The go-to file for AI integrations and agent workflows.

Machine Data
.md
knowledge.md
/knowledge.md

Human-readable Markdown summary of your site's knowledge base — ideal for RAG pipelines, embedded AI chatbots, and developer documentation.

Human + AI
Agent Discovery
.json
agent-manifest.json
/agent-manifest.json

Capability card for autonomous AI agents — declares supported actions, API endpoints, authentication, and integration workflows so agents can discover and use your product.

AI Agents
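There is no ratified schema for agent manifests yet, so the shape below is illustrative only — the field names (name, capabilities, endpoint, auth) are assumptions showing the kind of capability card the description refers to, not a fixed specification:

```json
{
  "name": "Example Corp",
  "description": "Scheduling software for small teams",
  "capabilities": [
    {
      "action": "create_booking",
      "method": "POST",
      "endpoint": "https://api.example.com/v1/bookings",
      "auth": "bearer"
    }
  ],
  "contact": "https://example.com/contact"
}
```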
.json
capabilities.json
/capabilities.json

Machine-readable list of your site's capabilities (search, contact, pricing, blog, etc.) — used by AI orchestration systems to route queries to the right service.

Capabilities
.txt
ai-agents.txt
/ai-agents.txt

Agent interaction descriptor following the emerging ai-agents.txt standard — describes capabilities and interaction rules for autonomous AI systems.

Agent Standard
AI Governance
.txt
ai.txt
/ai.txt

Training consent and usage directives — signals to AI companies whether your content may be used for model training, indexing, summarization, and citation.

IP & Consent
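ai.txt is not yet standardized; proposals such as Spawning's borrow robots.txt-style directives to express training consent. The sketch below is illustrative only — exact directive names vary by proposal:

```text
# ai.txt — AI training permissions (illustrative; syntax varies by proposal)
User-Agent: *
Allow: /blog/
Disallow: /private/
```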
.txt
trust.txt
/trust.txt

Ownership declaration and content policy — verifies your domain identity and sets AI crawl, summarization, and training permissions following the trust.txt standard.

Trust & Ownership
Web Standards
.txt
robots.txt
/robots.txt

Updated crawl directives with explicit rules for AI bots — GPTBot, Claude-Web, PerplexityBot and more — plus sitemap references for all crawlers.

Crawler Control
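In practice this means per-bot sections in robots.txt naming the AI crawlers' user agents, plus a Sitemap line. A sketch (example.com and the path rules are placeholders):

```text
# Explicit rules for AI crawlers
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Claude-Web
Allow: /

# Default rules for all other crawlers
User-agent: *
Allow: /
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```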
.xml
sitemap.xml
/sitemap.xml

Freshly generated XML sitemap with priority and frequency metadata — helps search engines and AI crawlers discover and index every page efficiently.

SEO + AI
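The sitemaps.org protocol defines the priority and frequency metadata mentioned above via the optional priority and changefreq elements. A minimal two-page example with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/pricing</loc>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```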
.json
openapi.json
/openapi.json

Auto-generated OpenAPI 3.1 spec describing your site's pages as structured API endpoints — used by AI coding assistants and developer tools.

API Spec
Semantic & Structured Data
.json
context.jsonld
/context.jsonld

JSON-LD semantic context using Schema.org vocabulary — bridges your site's data with AI knowledge graphs, Wikidata, and semantic search engines.

Semantic Web
.json
schema.json
/schema.json

JSON-LD structured data blocks for every page (Organization, WebPage, Article, Product) — enables Google rich results and AI knowledge graph ingestion.

Structured Data
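A typical Organization block uses Schema.org vocabulary in JSON-LD, for example (all values are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Corp",
  "url": "https://example.com",
  "logo": "https://example.com/logo.png",
  "sameAs": ["https://twitter.com/example"]
}
```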
.json
pricing.json
/pricing.json

Structured pricing data extracted from your plans page — enables AI shopping assistants and comparison tools to answer pricing questions accurately.

Pricing Data
Why It Matters

The Numbers Behind AI Discoverability

AI-powered discovery is the fastest-growing traffic channel on the web. Sites with proper AI files are getting ahead — now.

3–5×

More AI citations

Sites with llms.txt are cited significantly more often in AI-generated answers on Perplexity and ChatGPT Browse.

1B+

AI search queries / month

AI-powered search now handles billions of queries monthly and is growing faster than any previous web technology.

<5%

Sites are AI-ready today

Fewer than 1 in 20 websites have deployed even basic AI-readable files — giving early adopters a massive first-mover advantage.

2 min

To full AI visibility

From entering your URL to downloading your complete AI file package — the average time on AI Website Map is under two minutes.

Free Tool

See Your AI Readiness Score in 5 Seconds

Enter any URL and we'll check which AI-readable files exist at your root domain, score your site from 0–10, and show you exactly what's missing — completely free, no account required.
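To illustrate the idea, here is a minimal sketch of how such a 0–10 score could be derived from which well-known files exist at a site's root. The file list and weights are assumptions for illustration, not AI Website Map's actual scoring model:

```python
# Hypothetical readiness scorer: weights each well-known AI file and
# maps the weighted coverage onto a 0-10 scale. Weights are assumptions.
AI_FILES = {
    "llms.txt": 2.0,
    "llms-full.txt": 1.0,
    "knowledge.json": 1.0,
    "agent-manifest.json": 1.0,
    "ai.txt": 1.0,
    "robots.txt": 1.0,
    "sitemap.xml": 1.0,
    "schema.json": 1.0,
    "context.jsonld": 1.0,
}

def readiness_score(found: set[str]) -> float:
    """Return a 0-10 score based on which AI-readable files were found."""
    total = sum(AI_FILES.values())
    got = sum(weight for name, weight in AI_FILES.items() if name in found)
    return round(10 * got / total, 1)
```

A site with only robots.txt and sitemap.xml would score low under these weights, while deploying the full set would reach 10.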

FAQ

Frequently Asked Questions

What is llms.txt and why do I need it?

llms.txt is a plain-text file placed at the root of your website (e.g. example.com/llms.txt) that summarizes your site's content, purpose, and key pages in a format optimized for Large Language Models like ChatGPT, Claude, and Gemini. LLMs increasingly use this file to understand and accurately represent your website in AI-generated answers. Without it, these models must guess — often incorrectly — or ignore your site entirely.

How is this different from a sitemap.xml?

A sitemap.xml lists your URLs for search engine crawlers. AI Website Map generates a complete ecosystem of files: natural-language summaries (llms.txt), structured data (knowledge.json), agent capability cards (agent-manifest.json), training consent (ai.txt), and more. These files speak different languages to different AI systems — a sitemap alone doesn't give LLMs or AI agents the context they need.

Does this improve my Google SEO?

AI Website Map directly targets AI-powered discovery — ChatGPT, Perplexity, Claude, Gemini, and AI agents — rather than traditional Google ranking. Indirectly, it helps: a complete, accurate sitemap.xml and an improved robots.txt let Google crawl your site more efficiently, and being cited in AI-generated answers drives high-quality referral traffic. As Google integrates more AI features (AI Overviews, etc.), AI-readable content becomes increasingly relevant to search ranking as well.

Does it work on JavaScript-heavy sites?

Yes. AI Website Map automatically detects whether your site requires JavaScript to render meaningful content. For React, Next.js, Vue, Angular, and other SPA frameworks, it uses a headless Chromium browser (Playwright) to render pages before extracting content — the same approach used by Google's crawler. You don't need to configure anything; the detection is automatic.

How often should I regenerate the files?

Regenerate whenever your site's content or structure changes significantly — after a product launch, major content update, pricing change, or new section. Most teams regenerate monthly. AI Website Map makes this a one-click process: re-run the generation, download the new zip, and overwrite the files on your server. A scheduled regeneration feature is on the roadmap.

Is the AI Readiness Score free?

Yes. The AI Readiness Score is completely free — enter any URL and get a 0–10 score with a per-file breakdown in 5 seconds, no account required. Full file generation (the complete zip with all 15 AI-readable files) requires a paid plan.