# robots.txt — crawl directives per bot. Paired with /ai.txt (training directives).

User-agent: *
Allow: /

Sitemap: https://researchvials.netlify.app/sitemap.xml

# AI / LLM crawlers — per-bot crawl directives.

# OpenAI
User-agent: GPTBot
Allow: /

# OpenAI
User-agent: ChatGPT-User
Allow: /

# OpenAI
User-agent: OAI-SearchBot
Allow: /

# Anthropic
User-agent: ClaudeBot
Allow: /

# Anthropic
User-agent: Claude-Web
Allow: /

# Anthropic
User-agent: anthropic-ai
Allow: /

# Perplexity
User-agent: PerplexityBot
Allow: /

# Google
User-agent: Google-Extended
Allow: /

# Google
User-agent: Googlebot
Allow: /

# Common Crawl
User-agent: CCBot
Allow: /

# Apple
User-agent: Applebot-Extended
Allow: /

# Apple
User-agent: Applebot
Allow: /

# ByteDance
User-agent: Bytespider
Allow: /

# Meta
User-agent: Meta-ExternalAgent
Allow: /

# Meta
User-agent: FacebookBot
Allow: /

# DuckDuckGo
User-agent: DuckAssistBot
Allow: /

# Cohere
User-agent: cohere-ai
Allow: /

# Amazon
User-agent: Amazonbot
Allow: /

# You.com
User-agent: YouBot
Allow: /

# Diffbot
User-agent: Diffbot
Allow: /

# Timpi
User-agent: Timpibot
Allow: /

# Mistral
User-agent: MistralAI-User
Allow: /