PRODUCT

llms.txt builder.

Generate /llms.txt and /llms-full.txt files for any site with one API call.

What llms.txt is

A markdown file served from a site's root at /llms.txt that gives AI systems a curated overview of the site: a title, a short summary, and links to the pages worth reading. It sits alongside robots.txt, but the two do different jobs: robots.txt tells AI crawlers like GPTBot, ClaudeBot, and PerplexityBot which paths they may fetch, while llms.txt tells them what the content is and where to find it.
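
For reference, a minimal llms.txt following the llmstxt.org proposal looks like this (the site name, sections, and links are illustrative):

```markdown
# Example Corp

> Example Corp makes widgets. This file lists the pages most useful to AI assistants.

## Docs

- [Getting started](https://example.com/docs/start): install and first steps
- [API reference](https://example.com/docs/api): all endpoints and parameters
```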

Generate for any site

POST /v1/llms-txt-build with {url, max_pages}. The API crawls the site (up to max_pages pages) and returns the generated llms.txt and llms-full.txt content, along with the number of pages used to build them.

curl -X POST \
  https://api.crawlcrawl.com/v1/llms-txt-build \
  -H 'Authorization: Bearer crk_...' \
  -H 'Content-Type: application/json' \
  -d '{"url": "https://example.com", "max_pages": 1000}'
{
  "llms_txt": "# Example\n\n> Short summary of the site.\n\n## Docs\n\n- [Getting started](https://example.com/docs/start)",
  "llms_full_txt": "# Example\n\n## Getting started\n\nFull page content...",
  "pages_used": 42
}

What it returns

The concise overview file (llms.txt) and the full-content file (llms-full.txt). You choose which to publish.
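
Publishing comes down to writing the two JSON fields to disk. A minimal sketch using jq (assumed to be installed); the response is hardcoded here for illustration, but in practice it comes from the curl call above, and the files would be written into your site's web root:

```shell
# Stand-in for the real API response, saved to response.json.
cat > response.json <<'EOF'
{"llms_txt": "# Example\n\n> Demo site.", "llms_full_txt": "# Example\n\nFull page text here.", "pages_used": 2}
EOF

# Extract each field (-r emits the raw string, so \n becomes real newlines)
# and write it to a file. In production, target the web root instead.
jq -r '.llms_txt'      response.json > llms.txt
jq -r '.llms_full_txt' response.json > llms-full.txt
```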

Common use cases

Generate once when launching a new site. Refresh weekly via cron to keep AI crawlers updated.
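
The weekly refresh is a single cron entry. A sketch, assuming the curl call above is wrapped in a script at /usr/local/bin/refresh-llms-txt (a hypothetical path) that writes both files into the web root:

```shell
# Hypothetical crontab entry: rebuild every Monday at 03:00.
0 3 * * 1  /usr/local/bin/refresh-llms-txt >> /var/log/llms-txt.log 2>&1
```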

Generate an llms.txt file in 30 seconds.

Get an API key and start building.

Get an API key — free