# Advanced robots.txt for AI crawlers and search engines

# Search engine crawlers
# Note: Googlebot ignores Crawl-delay; Google's crawl rate is managed via Search Console.
User-agent: Googlebot
Allow: /
Crawl-delay: 1

User-agent: Bingbot
Allow: /
Crawl-delay: 1

# AI crawlers
User-agent: ChatGPT-User
Allow: /

User-agent: GPTBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Claude-Web
Allow: /

User-agent: ClaudeBot
Allow: /

# Social media link-preview bots
User-agent: Twitterbot
Allow: /

User-agent: facebookexternalhit
Allow: /

User-agent: LinkedInBot
Allow: /

User-agent: WhatsApp
Allow: /

User-agent: Slackbot
Allow: /

# All other crawlers
# Note: a crawler obeys only its most specific matching group, so the rules
# below do not apply to the bots named above.
User-agent: *
Crawl-delay: 2
# Disallow areas not meant for crawling
Disallow: /private/
Disallow: /admin/
Disallow: /api/internal/
Disallow: /*.json$
Disallow: /.well-known/
# Allow important files (under Google's precedence rules, the more specific Allow wins)
Allow: /.well-known/security.txt
Allow: /robots.txt
Allow: /sitemap.xml
Allow: /

# Sitemap location (applies file-wide, independent of any group)
Sitemap: https://outwrite.ai/sitemap.xml