Crawl any site. Find orphan pages and broken links, and track changes week over week. Run on cron; send results to your stack via webhook.
Crawls respect robots.txt and meta tag directives.

GET https://api.crawlcrawl.com/v1/crawls/{id}/orphans
GET https://api.crawlcrawl.com/v1/crawls/{id}/links
GET https://api.crawlcrawl.com/v1/crawls/{old}/diff/{new}
GET https://api.crawlcrawl.com/v1/robots-policy?url=https://example.com
{
  "orphans": [
    {
      "url": "https://example.com/orphan-page",
      "title": "Orphan Page"
    }
  ]
}
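The orphans response above is plain JSON, so it takes only a few lines to consume. A minimal sketch: the payload shape is taken from the example, and `orphan_urls` is a hypothetical helper name, not part of any official client.

```python
import json

# Sample orphans response, matching the shape shown above.
sample = """
{
  "orphans": [
    {"url": "https://example.com/orphan-page", "title": "Orphan Page"}
  ]
}
"""

def orphan_urls(payload: str) -> list[str]:
    """Extract the URL of each orphan page from an orphans response."""
    data = json.loads(payload)
    return [page["url"] for page in data.get("orphans", [])]

print(orphan_urls(sample))  # → ['https://example.com/orphan-page']
```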
Schedule weekly crawls and receive audit results via webhook.
POST https://api.crawlcrawl.com/v1/crawls
Authorization: Bearer crk_...
Content-Type: application/json

{
  "url": "https://example.com",
  "cron": "0 6 * * 1",
  "webhook_url": "https://your-stack.com/webhook"
}
Compare results against the previous run using /diff.
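The exact shape of the /diff response is not shown here, but conceptually a diff between two crawls reduces to set operations over their URL lists, which you can also compute client-side. A sketch under that assumption (`crawl_diff` and its `added`/`removed`/`kept` keys are illustrative, not the API's documented schema):

```python
def crawl_diff(old_urls: list[str], new_urls: list[str]) -> dict[str, list[str]]:
    """Pages added, removed, and kept between two crawl runs."""
    old_set, new_set = set(old_urls), set(new_urls)
    return {
        "added": sorted(new_set - old_set),
        "removed": sorted(old_set - new_set),
        "kept": sorted(old_set & new_set),
    }

old = ["https://example.com/", "https://example.com/about"]
new = ["https://example.com/", "https://example.com/pricing"]
print(crawl_diff(old, new))
```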
One API key per client project.