Allowing Sensei to read your site

Sensei tried to read your site and got a 403.

Our scraper identifies itself honestly. If you’re here, Cloudflare, Vercel Firewall, or another WAF in front of your site is blocking requests from `Sensei/1.0 (+https://asksensei.dev/bot)`. To get a real Presence reading, allowlist the bot by User-Agent — the same way you’d allow Googlebot.

We fetch your site once per weekly reading, obey robots.txt, and cache the result inside Sensei, so we never hammer your origin. If allowlisting isn’t an option, you can always point Sensei at a preview URL that isn’t behind the WAF.
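Since Sensei obeys robots.txt, a blanket `Disallow: /` will also stop it, independently of any WAF rule. A sketch of a robots.txt stanza that lets it through; this assumes Sensei matches on a `Sensei` user-agent token, which is an assumption based on its `Sensei/1.0` User-Agent string:

```text
# robots.txt: assumes the crawler matches on the "Sensei" token
User-agent: Sensei
Allow: /
```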

  1. Cloudflare

    In your Cloudflare dashboard, go to Security → Bots → Configure. Add a custom rule that allows requests where User-Agent contains `Sensei`. If you're on the free plan and don't have custom rules, disable the 'Block AI Scrapers and Crawlers' toggle for your marketing subdomain.

    # Cloudflare WAF custom rule expression
    (http.user_agent contains "Sensei/1.0") or (http.user_agent contains "+https://asksensei.dev/bot")
  2. Vercel Firewall

    Vercel Dashboard → your project → Firewall → Custom Rules. Add a rule that matches our User-Agent and set the action to Allow. If you're using an allowlist-mode firewall (default-deny), include our UA string in the allowlist.

    # Vercel Firewall custom rule
    when   user_agent contains "Sensei/1.0"
    action allow
  3. AWS WAF / CloudFront

    Create a Web ACL rule with a string-match condition on the User-Agent header. Match `Sensei/1.0` (case-insensitive) and set the action to Allow — place it above any Block rules for bots.

    # AWS WAF custom allow rule (CloudFront)
    Priority: 10 (before any default-deny rules)
    Match:    Header "User-Agent" contains "Sensei/1.0"
    Action:   Allow
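    If you manage the Web ACL as code (for example via the AWS CLI's `wafv2` commands), the rule above can be sketched as WAF v2 rule JSON. The rule name, metric name, and priority are placeholders; pick a priority lower than your blocking rules so it evaluates first. Note the search string is lowercased to match the LOWERCASE text transformation:

```json
{
  "Name": "allow-sensei",
  "Priority": 10,
  "Statement": {
    "ByteMatchStatement": {
      "SearchString": "sensei/1.0",
      "FieldToMatch": { "SingleHeader": { "Name": "user-agent" } },
      "TextTransformations": [{ "Priority": 0, "Type": "LOWERCASE" }],
      "PositionalConstraint": "CONTAINS"
    }
  },
  "Action": { "Allow": {} },
  "VisibilityConfig": {
    "SampledRequestsEnabled": true,
    "CloudWatchMetricsEnabled": true,
    "MetricName": "allow-sensei"
  }
}
```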
  4. nginx / Generic reverse proxy

    If you're self-hosting behind nginx or a custom reverse proxy that blocks unknown bots, add an exception for our User-Agent in your server block.

    # nginx — allow Sensei ahead of any bot-blocking rules
    map $http_user_agent $blocked_bot {
        default                        0;
        "~*Sensei/1\.0"                0;  # allow Sensei; regexes are checked in order
        "~*(crawler|spider|scraper)"   1;  # your existing bot patterns
    }

    server {
        if ($blocked_bot) { return 403; }
        # ... the rest of your server block
    }

Verifying the allowlist

Check it with one curl.

After you ship the allowlist rule, run this from any terminal. If it returns 200, Sensei can read the site. If it still returns 403, the rule didn’t match — double-check the User-Agent substring.

curl -I -A "Sensei/1.0 (+https://asksensei.dev/bot)" https://YOUR-SITE.com
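For a slightly richer check, this small POSIX shell sketch compares the status code with and without the Sensei User-Agent (YOUR-SITE.com stays a placeholder for your domain). If the default UA is blocked while the Sensei UA gets a 200, the rule is matching on User-Agent as intended:

```shell
#!/bin/sh
# Compare responses with and without the Sensei User-Agent.
UA="Sensei/1.0 (+https://asksensei.dev/bot)"

# Print just the HTTP status code for a URL, optionally with a User-Agent.
status_for() {
    if [ -n "${2:-}" ]; then
        curl -s -o /dev/null -w '%{http_code}' -A "$2" "$1"
    else
        curl -s -o /dev/null -w '%{http_code}' "$1"
    fi
}

# Usage (uncomment and substitute your domain):
# echo "default UA: $(status_for https://YOUR-SITE.com)"
# echo "Sensei UA:  $(status_for https://YOUR-SITE.com "$UA")"
```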