What this does
Cloudflare Logpush is Cloudflare's native, batched log-delivery system. You point a Logpush job at Searchable's ingest endpoint, Cloudflare ships HTTP request logs to us, we classify the AI bots, and we drop everything else.

Logpush requires the Cloudflare Enterprise plan. If you're on a different plan, use the Cloudflare Worker path instead — it works on every plan, including Free.
Logpush vs. Worker — which one?
| | Logpush | Worker |
|---|---|---|
| Plan required | Enterprise | Any plan, including Free |
| Latency to dashboard | ~30–60 seconds (batched) | Real-time (per-request) |
| Setup complexity | One Logpush job, no code | Paste a script, add a route |
| Sites per zone | All sites on the zone | Per-route |
Prerequisites
A Cloudflare zone on the Enterprise plan with Logpush enabled
Permission to create Logpush jobs on the zone
A Searchable project with your domain confirmed
Setup (dashboard)
Generate the destination URL in Searchable
- Open your Searchable dashboard
- Go to Agent Analytics → Setup
- Pick Cloudflare Logpush as your crawler source
- Click Generate token
Open the Logpush UI on your zone
Open dash.cloudflare.com → your zone → Analytics & Logs → Logpush. Click Create a Logpush job.
Configure the job
| Field | Value |
|---|---|
| Dataset | HTTP Requests |
| Destination | HTTP destination |
| Destination URL | the full URL you copied from Searchable (auth is already in the query string) |
| Timestamp format | Unix |
| Fields | see below |
`RayID`, `ClientRequestMethod`, `ClientRequestURI`, `ClientRequestPath`, `ClientRequestHost`, `ClientRequestScheme`, `ClientRequestUserAgent`, `ClientRequestReferer`, `ClientIP`, `ClientCountry`, `EdgeResponseStatus`, `EdgeStartTimestamp`, `OriginResponseTime`, `EdgeColoCode`, `CacheStatus`, `EdgeResponseBytes`
Setup (API)
If you'd rather create the job via the Cloudflare API, here's the equivalent call. Replace `{zone_id}`, `{cf_api_token}`, and the `sa_…` placeholder with your own values.
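A plausible shape for that call, using Cloudflare's v4 Logpush jobs endpoint (the job `name` is illustrative; the field list and `timestamps=unix` mirror the dashboard setup above):

```shell
# Create the Logpush job on the zone. Replace {zone_id}, {cf_api_token},
# and the sa_… token placeholder before running.
curl -s -X POST "https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs" \
  -H "Authorization: Bearer {cf_api_token}" \
  -H "Content-Type: application/json" \
  --data '{
    "name": "searchable-ai-crawlers",
    "dataset": "http_requests",
    "enabled": true,
    "destination_conf": "https://searchable-tracker.searchable.workers.dev/?header_Authorization=Bearer%20sa_…",
    "logpull_options": "fields=RayID,ClientRequestMethod,ClientRequestURI,ClientRequestPath,ClientRequestHost,ClientRequestScheme,ClientRequestUserAgent,ClientRequestReferer,ClientIP,ClientCountry,EdgeResponseStatus,EdgeStartTimestamp,OriginResponseTime,EdgeColoCode,CacheStatus,EdgeResponseBytes&timestamps=unix"
  }'
```

Cloudflare validates the destination URL when the job is created, so a malformed `header_Authorization` query string will fail at this step rather than silently dropping logs later.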
Already have a token? Build the destination URL manually
If you have an existing `sa_…` token (saved from a previous Generate flow, stored in 1Password, etc.), you can construct the destination URL yourself:
- The literal space between `Bearer` and the token must be percent-encoded as `%20`
- Our token alphabet (`sa_<base64url>.<hex>`) is already URL-safe, so nothing else needs encoding
- Cloudflare's `header_*` query-parameter convention is documented here
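The rules above can be sketched in a few lines (a minimal illustration; the endpoint host is the `searchable-tracker.searchable.workers.dev` ingest endpoint mentioned in Troubleshooting, and the token value is a placeholder):

```python
from urllib.parse import quote

def build_destination_url(token: str) -> str:
    """Build a Logpush destination URL from a saved sa_ token.

    The space in "Bearer <token>" must become %20; the token itself
    uses only URL-safe characters, so nothing else is encoded.
    """
    auth = quote(f"Bearer {token}", safe="")  # "Bearer%20sa_..."
    return (
        "https://searchable-tracker.searchable.workers.dev/"
        f"?header_Authorization={auth}"
    )

# Placeholder token, not a real credential:
print(build_destination_url("sa_EXAMPLE.abc123"))
```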
Verifying the connection
Cloudflare batches Logpush deliveries every ~30–60 seconds, so events take a moment to appear after the first AI bot hit. In Searchable:

- Go to Agent Analytics → Setup
- Look at the status strip at the bottom of the page
- Click Check if it still shows “Waiting for first event”
Multiple zones
Each Logpush job is scoped to a single Cloudflare zone. If your domain is split across multiple zones (e.g. example.com and example.io):
- Generate one integration token per zone, or reuse one — both work
- Create one Logpush job per zone with the same destination URL
- Searchable will tag events by host, so you’ll still see them per-domain in the dashboard
Troubleshooting
Logpush job shows delivery errors
Open the job in Cloudflare and check the error message:
- `401 Unauthorized` — the `header_Authorization` query parameter is missing or malformed. Re-copy the destination URL from Searchable, or rebuild it manually using the format above.
- `400 Bad Request` — most often `timestamps=unix` is missing. Edit the job and confirm the timestamp format is Unix.
- `Invalid destination URL` — Cloudflare's destination validation pings the endpoint. If validation fails, double-check the host (`searchable-tracker.searchable.workers.dev`) and that the URL has the `?header_Authorization=Bearer%20…` query string attached.
Status stays on 'Waiting for first event'
Logpush only delivers logs for traffic that’s actually flowing through Cloudflare. Things to check:
- The Cloudflare zone is proxied (orange cloud) for the routes AI bots are hitting, not just DNS-only (grey cloud)
- The Logpush job is enabled (jobs can be paused)
- Your domain in Searchable matches the zone’s hostname (check Agent Analytics → Setup → Confirm your domain)
I see double-counted events
If you also run the Cloudflare Worker path on the same zone, both will send each AI-bot request. Searchable de-duplicates server-side using the Cloudflare Ray ID, so the dashboard shows each request once. If you’d rather not run both, pick one and disable the other.
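The de-duplication idea can be sketched as keeping a set of Ray IDs already ingested (a simplified illustration of the concept, not Searchable's actual implementation, which would also need expiry):

```python
seen: set[str] = set()

def ingest(event: dict) -> bool:
    """Accept an event only the first time its Cloudflare Ray ID appears.

    The Worker and the Logpush job both report the same RayID for a
    given request, so the second copy is recognized and dropped.
    """
    ray_id = event["RayID"]
    if ray_id in seen:
        return False  # duplicate from the other integration path
    seen.add(ray_id)
    return True

# The same request arriving via both paths counts once:
assert ingest({"RayID": "8a1b2c3d4e5f6789"}) is True
assert ingest({"RayID": "8a1b2c3d4e5f6789"}) is False
```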
My plan isn't Enterprise — can I still use Logpush?
No. Logpush is Enterprise-only on Cloudflare. Use the Cloudflare Worker path instead — it works on Free, Pro, Business, and Enterprise.
Removing the integration
- Cloudflare → Analytics & Logs → Logpush → disable or delete the job
- Searchable → Agent Analytics → Setup → Tokens → revoke the token (any further deliveries using it will be rejected with 401)
Next steps
Cloudflare Worker
Edge Worker option — works on every Cloudflare plan.
See the data
Open Agent Analytics to see which assistants are crawling your site.