Robots.txt Serving

Yercekimsiz can serve a custom robots.txt per domain at the edge, reducing origin load and enabling per-domain crawler policies.

How it works

Rather than forwarding /robots.txt requests to the origin, the WAF answers them directly at the edge using the domain-scoped content stored in its configuration. Each request is logged for analytics and crawler diagnostics.
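
The Go sketch below illustrates the idea under stated assumptions; it is not Yercekimsiz's actual implementation. The map name robotsByDomain, the fallback policy, and the log format are placeholders for whatever the real config and logging pipeline provide.

package main

import (
	"log"
	"net/http"
	"strings"
)

// robotsByDomain stands in for the per-domain robots.txt content that the
// WAF would load from its configuration (hypothetical name).
var robotsByDomain = map[string]string{
	"example.com": "User-agent: *\nDisallow: /api/\nDisallow: /_next/\n",
}

func robotsHandler(w http.ResponseWriter, r *http.Request) {
	// Normalize the Host header (strip any port) so it matches config keys.
	host := strings.Split(r.Host, ":")[0]

	body, ok := robotsByDomain[host]
	if !ok {
		// No per-domain policy configured: fall back to an allow-all policy (assumption).
		body = "User-agent: *\nDisallow:\n"
	}

	// Log the request for crawler analytics before answering at the edge.
	log.Printf("robots.txt served host=%s ua=%q", host, r.UserAgent())

	w.Header().Set("Content-Type", "text/plain; charset=utf-8")
	w.Write([]byte(body))
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/robots.txt", robotsHandler)
	// Any other path would be proxied to the origin in the real WAF.
	log.Fatal(http.ListenAndServe(":8080", mux))
}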

Key Features

  • Per-domain robots.txt content
  • Edge serving to reduce origin load
  • Request logging and crawler metrics
  • API and dashboard for editing policies in real time (see the sketch after this list)
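
The exact API surface is not documented here; as a rough sketch, assuming a hypothetical PUT /api/domains/{domain}/robots endpoint with bearer-token authentication, pushing a new policy from a client could look like the following. Treat the URL, path, and auth header as placeholders for your own deployment.

package main

import (
	"bytes"
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Hypothetical endpoint and token; the real API path and auth scheme
	// are defined by your Yercekimsiz deployment.
	endpoint := "https://waf.example.internal/api/domains/example.com/robots"
	policy := "User-agent: *\nDisallow: /api/\nDisallow: /_next/\n"

	req, err := http.NewRequest(http.MethodPut, endpoint, bytes.NewBufferString(policy))
	if err != nil {
		log.Fatal(err)
	}
	req.Header.Set("Content-Type", "text/plain")
	req.Header.Set("Authorization", "Bearer <api-token>")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}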

Example robots.txt

User-agent: *
Disallow: /api/
Disallow: /_next/
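
This policy asks all crawlers to skip everything under /api/ and /_next/ while leaving the rest of the site crawlable.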

Notes

From the origin's perspective, robots.txt serving is purely passive: the WAF answers /robots.txt requests itself, which reduces origin traffic and centralizes crawler logging.