Enter any website URL and get a ready-made llms.txt file in seconds — so ChatGPT, Perplexity, and Claude can find and cite you accurately.
Paste any website address — COVEN fetches the page and sitemap to understand your content.
Claude analyses your site and writes a well-structured llms.txt following the standard proposed by Jeremy Howard at llmstxt.org.
Copy the file, add it to your site root, and AI engines can start using it the next time they fetch your site.
Place the llms.txt file in your website root so it's accessible at yourdomain.com/llms.txt.
For Cloudflare Pages, drop it in your public/ folder. For WordPress, upload to your site root via FTP.
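Before uploading, you can sanity-check a draft locally. A minimal sketch in Python, assuming the llmstxt.org format of a single H1 title, a blockquote summary, and H2 link sections (the function name and checks here are illustrative, not part of any official tooling):

```python
import re

def check_llms_txt(text: str) -> list[str]:
    """Return a list of problems found in an llms.txt draft."""
    problems = []
    lines = text.splitlines()
    # The format expects exactly one H1 title line.
    h1s = [l for l in lines if re.match(r"^# \S", l)]
    if len(h1s) != 1:
        problems.append("expected exactly one '# Title' line")
    # A short '> summary' blockquote after the title is recommended.
    if not any(l.startswith("> ") for l in lines):
        problems.append("missing '> summary' blockquote")
    # Content is organised under H2 sections containing Markdown links.
    if not any(re.match(r"^## \S", l) for l in lines):
        problems.append("no '## Section' headings found")
    return problems

draft = """# Example Site
> A short summary of what the site offers.

## Docs
- [Getting started](https://example.com/start): setup guide
"""
print(check_llms_txt(draft))  # → []
```

An empty list means the draft has the basic structure in place; anything it returns is worth fixing before you upload.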
Fewer than 5% of sites have one — learn more at llmstxt.org.
See how this site scores across all 8 GEO signals — AI crawlers, schema, citability, IndexNow and more — with a free scan or a full GEO & AI Readiness audit.
About llms.txt
An llms.txt file is a plain-text Markdown file at /llms.txt on your website that tells AI models what your site is about and where to find your most important content.
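In practice it's a short Markdown document. A minimal example following the llmstxt.org format, where the site name, summary, and links are placeholders you'd replace with your own:

```markdown
# Example Site

> One-sentence summary of what the site is and who it serves.

## Docs

- [Getting started](https://example.com/docs/start): installation and first steps
- [API reference](https://example.com/docs/api): full endpoint documentation

## Optional

- [Blog](https://example.com/blog): long-form articles, lower priority for AI readers
```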
The llms.txt standard was proposed on 3 September 2024 by fast.ai and Answer.AI co-founder Jeremy Howard at llmstxt.org. It's becoming the AI equivalent of robots.txt.
Sites with a well-formed llms.txt are cited more often and more accurately in AI responses: fewer hallucinations about your brand, more correct citations.
As of 2026, llms.txt adoption is still extremely low. Sites that implement it now gain a first-mover advantage in AI search before it becomes a standard expectation.
LLMs have limited context windows. Modern websites contain 500k+ tokens of HTML noise. llms.txt gives AI a clean, curated summary — maximising the signal in every query.
Go further
A complete GEO profile covers AI crawler access, schema markup, citability scoring, IndexNow, and brand authority signals. COVEN audits all eight dimensions.