Documentation Index
Fetch the complete documentation index at: https://docs.searchable.com/llms.txt
Use this file to discover all available pages before exploring further.
What this does
Searchable’s Agent Analytics dashboard shows you, in near real time, which AI assistants — ChatGPT, Claude, Perplexity, Google’s AI Overviews, and the rest — are crawling your site, which pages they’re hitting, and which of those crawls turn into referred users. To do that, Searchable needs to see your inbound HTTP requests. The setup below points your hosting platform’s request logs at our ingest endpoint. We classify the AI bots, drop everything else, and never store query strings or other PII.

Setup takes about 5 minutes. There are no agents to install and no code to change in your application.
Pick your platform
Vercel
One-click Log Drain. Works on Pro and Enterprise.
Cloudflare Worker
Edge Worker — works on any Cloudflare plan, including Free.
Cloudflare Logpush
Native Logpush job — Enterprise plan only.
Amazon CloudFront
Standard Logging V2 via Firehose — any CloudFront distribution.
Netlify Edge Function
Edge Function — works on every Netlify plan, including Free.
Netlify Log Drain
HTTP Log Drain — Enterprise plan only.
Fastly
Real-Time Log Streaming via HTTPS endpoint.
Google Cloud Platform
HTTPS Load Balancer / Cloud CDN logs via Pub/Sub push.
Other platforms
Custom REST API or middleware for everything else.
Which one should I use?
I host on Vercel
Use Vercel Log Drain. It’s the most direct path — Vercel streams request logs to Searchable with one entry in your project settings. No code changes.
I'm on Cloudflare (any plan)
Use the Cloudflare Worker. It works on every Cloudflare plan including Free, and the Free tier easily covers ~100k requests/day. The Worker forwards AI-bot traffic to Searchable from the edge — no latency added to real users.
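The Worker path can be sketched roughly as below. The user-agent patterns and the ingest URL (`ingest.searchable.com/v1/events`) are illustrative assumptions, not Searchable's published values — copy the real Worker script and endpoint from your Searchable dashboard. The key idea is that the Worker serves the response first and ships the log inside `ctx.waitUntil()`, so visitors never wait on the logging call:

```typescript
// Minimal stand-in for the Workers runtime's ExecutionContext type
// (normally provided by @cloudflare/workers-types).
interface ExecutionContext {
  waitUntil(promise: Promise<unknown>): void;
}

// Illustrative AI-bot user-agent patterns; Searchable's server-side
// classifier is the authoritative list.
const AI_BOT_PATTERN = /GPTBot|ClaudeBot|PerplexityBot|Google-Extended|CCBot/i;

export function isAIBot(userAgent: string): boolean {
  return AI_BOT_PATTERN.test(userAgent);
}

export default {
  async fetch(
    request: Request,
    env: { SEARCHABLE_TOKEN: string },
    ctx: ExecutionContext,
  ): Promise<Response> {
    // Pass the request through untouched so real users see no change.
    const response = await fetch(request);

    const ua = request.headers.get("user-agent") ?? "";
    if (isAIBot(ua)) {
      const url = new URL(request.url);
      // waitUntil lets the log ship after the response returns:
      // no latency added for the visitor.
      ctx.waitUntil(
        fetch("https://ingest.searchable.com/v1/events", { // hypothetical endpoint
          method: "POST",
          headers: {
            "content-type": "application/json",
            authorization: `Bearer ${env.SEARCHABLE_TOKEN}`,
          },
          body: JSON.stringify({
            method: request.method,
            path: url.pathname, // query string deliberately dropped
            host: url.hostname,
            userAgent: ua,
            status: response.status,
            timestamp: new Date().toISOString(),
          }),
        }),
      );
    }
    return response;
  },
};
```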
I'm on Cloudflare Enterprise
You can use either path. Cloudflare Logpush is the most “native” option — Cloudflare batches and ships logs directly with no Worker involved. If you’d rather avoid the Enterprise-only Logpush feature, the Worker path works on Enterprise too.
I'm on Amazon CloudFront
Use Amazon CloudFront → Firehose. CloudFront’s free Standard Logging V2 feature ships records to Amazon Data Firehose, which forwards them to Searchable. Works on any CloudFront distribution — you only pay for the Firehose ingestion.
I host on Netlify
Use the Netlify Edge Function on any plan (including Free), or Netlify Log Drain if you’re on the Enterprise plan. The Edge Function path runs at Netlify’s edge with no added latency for real users.
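The Edge Function follows the same pattern as the Worker path: serve the page first, log afterwards. The sketch below is an assumption-laden outline, not Netlify's or Searchable's published code — the context type normally comes from `@netlify/edge-functions`, the bot regex is illustrative, and the ingest URL is hypothetical:

```typescript
// Minimal shape of Netlify's edge context for this sketch; the real type
// is exported by @netlify/edge-functions.
interface EdgeContext {
  next(): Promise<Response>;
  waitUntil?(promise: Promise<unknown>): void;
}

// Illustrative bot check; Searchable's classifier is authoritative.
const BOT_RE = /GPTBot|ClaudeBot|PerplexityBot|Google-Extended/i;

export default async function handler(
  request: Request,
  context: EdgeContext,
): Promise<Response> {
  // Serve the page as normal; the visitor never waits on logging.
  const response = await context.next();

  const ua = request.headers.get("user-agent") ?? "";
  if (BOT_RE.test(ua)) {
    const url = new URL(request.url);
    const ship = fetch("https://ingest.searchable.com/v1/events", { // hypothetical endpoint
      method: "POST",
      headers: { "content-type": "application/json" },
      body: JSON.stringify({
        method: request.method,
        path: url.pathname, // no query string
        host: url.hostname,
        userAgent: ua,
        status: response.status,
      }),
    }).catch(() => {}); // never let logging break the page
    context.waitUntil?.(ship);
  }
  return response;
}
```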
I host on Fastly
Use Fastly Real-Time Log Streaming. Configure an HTTPS Logging endpoint inside the Fastly UI — no code changes — and Fastly streams request logs to Searchable.
I host on Google Cloud Platform
Use Google Cloud → Pub/Sub push. Cloud Logging captures every HTTPS Load Balancer or Cloud CDN request; a Pub/Sub topic with a push subscription forwards those logs to Searchable.
I'm somewhere else (other platforms, custom)
Use the custom integration. Searchable accepts a small REST POST from your application middleware or any platform that can ship structured request logs.
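For middleware, the POST body is just structured request metadata. Below is a hedged sketch under assumptions: the `AgentEvent` field names and the ingest URL are illustrative, and your Searchable dashboard has the real endpoint, token, and schema. The point it demonstrates is building the event from a request while dropping the query string:

```typescript
// Hypothetical event shape; check your dashboard for the real schema.
interface AgentEvent {
  method: string;
  path: string; // no query string
  host: string;
  userAgent: string;
  referer: string | null;
  status: number;
  timestamp: string;
}

export function buildEvent(
  req: { method: string; url: string; headers: Record<string, string | undefined> },
  status: number,
): AgentEvent {
  const url = new URL(req.url);
  return {
    method: req.method,
    path: url.pathname, // url.pathname excludes the query string
    host: url.hostname,
    userAgent: req.headers["user-agent"] ?? "",
    referer: req.headers["referer"] ?? null,
    status,
    timestamp: new Date().toISOString(),
  };
}

export async function shipEvent(event: AgentEvent, token: string): Promise<void> {
  await fetch("https://ingest.searchable.com/v1/events", { // hypothetical endpoint
    method: "POST",
    headers: {
      "content-type": "application/json",
      authorization: `Bearer ${token}`,
    },
    body: JSON.stringify(event),
  });
}
```

Call `buildEvent` from whatever hook your framework gives you after the response status is known, and ship the result asynchronously so logging never blocks the request.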
What gets sent to Searchable
For each request that matches an AI-bot user agent, we receive:
- HTTP method, path, and host (no query strings)
- User agent
- Referer
- Country code (geo-IP, no precise location)
- Response status and bytes
- Timestamp
Query strings and full IP addresses are never sent.
Verifying the connection
After completing setup, return to Agent Analytics → Setup in your Searchable dashboard. The status strip at the bottom updates as events arrive:
- Waiting for first event — your platform hasn’t sent anything yet. AI bots typically hit any indexed site within a few hours.
- Connected — events are flowing. The strip shows the count from the last 24 hours.
Next steps
Add a domain
Agent Analytics is scoped to your project’s primary domain.
Connect Search Console
Layer in keyword data so you can correlate AI crawls with search demand.