

What this does

Cloudflare Logpush is Cloudflare’s native, batched log delivery system. We point a Logpush job at Searchable’s ingest endpoint: Cloudflare ships HTTP request logs to us, we classify the AI bots, and we drop everything else.
Logpush requires the Cloudflare Enterprise plan. If you’re on a different plan, use the Cloudflare Worker path instead — it works on every plan including Free.

Logpush vs. Worker — which one?

                       Logpush                     Worker
Plan required          Enterprise                  Any plan, including Free
Latency to dashboard   ~30–60 seconds (batched)    Real-time (per-request)
Setup complexity       One Logpush job, no code    Paste a script, add a route
Sites per zone         All sites on the zone       Per-route
If you’re on Enterprise, Logpush is the lowest-overhead option. If you’d rather avoid an Enterprise feature dependency, the Worker path works on Enterprise too.

Prerequisites

A Cloudflare zone on the Enterprise plan with Logpush enabled
Permission to create Logpush jobs on the zone
A Searchable project with your domain confirmed

Setup (dashboard)

Step 1: Generate the destination URL in Searchable

  1. Open your Searchable dashboard
  2. Go to Agent Analytics → Setup
  3. Pick Cloudflare Logpush as your crawler source
  4. Click Generate token
Searchable returns a single ready-to-paste destination URL with the auth header pre-encoded as a query parameter. It looks like this:
https://searchable-tracker.searchable.workers.dev/v1/cloudflare-logs?header_Authorization=Bearer%20sa_…
Copy it now — the token portion won’t be shown again.
Cloudflare Logpush doesn’t have a separate header field — auth has to be encoded into the destination URL itself using header_* query parameters. Searchable handles that encoding for you.
Step 2: Open the Logpush UI on your zone

Open dash.cloudflare.com → your zone → Analytics & Logs → Logpush. Click Create a Logpush job.
Step 3: Configure the job

Field              Value
Dataset            HTTP Requests
Destination        HTTP destination
Destination URL    The full URL you copied from Searchable (auth is already in the query string)
Timestamp format   Unix
Fields             See below
Pick exactly these fields:
  • RayID
  • ClientRequestMethod
  • ClientRequestURI
  • ClientRequestPath
  • ClientRequestHost
  • ClientRequestScheme
  • ClientRequestUserAgent
  • ClientRequestReferer
  • ClientIP
  • ClientCountry
  • EdgeResponseStatus
  • EdgeStartTimestamp
  • OriginResponseTime
  • EdgeColoCode
  • CacheStatus
  • EdgeResponseBytes
The timestamp format must be Unix, not RFC3339. Logpush jobs default to RFC3339, and Searchable will reject batches that don’t include a Unix timestamp.
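To illustrate what this configuration produces, here is a hypothetical NDJSON record using a subset of the field list and Unix timestamps. The field values below are invented for illustration, not captured output:

```python
import json
from datetime import datetime, timezone

# Hypothetical Logpush record (one NDJSON line from a batch). Values are
# illustrative only; real batches contain every field selected above.
line = ('{"RayID":"8b1c2d3e4f5a6b7c","ClientRequestMethod":"GET",'
        '"ClientRequestURI":"/docs","ClientRequestHost":"example.com",'
        '"ClientRequestUserAgent":"GPTBot/1.0 (+https://openai.com/gptbot)",'
        '"EdgeResponseStatus":200,"EdgeStartTimestamp":1700000000}')

record = json.loads(line)

# With timestamps=unix the timestamp is plain seconds, so it converts
# directly; an RFC3339 string here would fail this int() conversion,
# which is essentially why the wrong timestamp format breaks ingestion.
ts = datetime.fromtimestamp(int(record["EdgeStartTimestamp"]), tz=timezone.utc)
print(record["RayID"], ts.isoformat())
```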
Step 4: Save and verify

Click Save. Cloudflare will validate the destination immediately — Searchable accepts the validation ping. Return to Agent Analytics → Setup in Searchable. The status strip should show Connected within a few minutes once an AI bot hits your site.

Setup (API)

If you’d rather create the job via the Cloudflare API, here’s the equivalent call. Replace {zone_id}, {cf_api_token}, and the sa_… placeholder with your own values.
curl -X POST "https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs" \
  -H "Authorization: Bearer {cf_api_token}" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "searchable-ai-logs",
    "logpull_options": "fields=RayID,ClientRequestMethod,ClientRequestURI,ClientRequestPath,ClientRequestHost,ClientRequestScheme,ClientRequestUserAgent,ClientRequestReferer,ClientIP,ClientCountry,EdgeResponseStatus,EdgeStartTimestamp,OriginResponseTime,EdgeColoCode,CacheStatus,EdgeResponseBytes&timestamps=unix",
    "destination_conf": "https://searchable-tracker.searchable.workers.dev/v1/cloudflare-logs?header_Authorization=Bearer%20sa_YOUR_INTEGRATION_TOKEN",
    "dataset": "http_requests",
    "enabled": true
  }'
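If you script the API call, the long logpull_options string can be assembled from the field list itself, which keeps it in sync with the table above. A sketch, not an official client:

```python
# The exact field list from the setup table above, in order.
FIELDS = [
    "RayID", "ClientRequestMethod", "ClientRequestURI", "ClientRequestPath",
    "ClientRequestHost", "ClientRequestScheme", "ClientRequestUserAgent",
    "ClientRequestReferer", "ClientIP", "ClientCountry", "EdgeResponseStatus",
    "EdgeStartTimestamp", "OriginResponseTime", "EdgeColoCode", "CacheStatus",
    "EdgeResponseBytes",
]

# logpull_options carries the field list plus the required Unix timestamp flag.
logpull_options = "fields=" + ",".join(FIELDS) + "&timestamps=unix"
print(logpull_options)
```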

Already have a token? Build the destination URL manually

If you have an existing sa_… token (saved from a previous Generate flow, stored in 1Password, etc.) you can construct the destination URL yourself:
https://searchable-tracker.searchable.workers.dev/v1/cloudflare-logs?header_Authorization=Bearer%20<YOUR_TOKEN>
Notes:
  • The literal space between Bearer and the token must be percent-encoded as %20
  • Our token alphabet (sa_<base64url>.<hex>) is already URL-safe, so nothing else needs encoding
  • Cloudflare documents the header_* query-parameter convention in its Logpush docs (HTTP destinations)
You can also use the “Already have a token? Build the URL” form in Searchable’s setup UI to do this for you.
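The manual construction amounts to a single percent-encoding step. A minimal sketch (the token below is a placeholder, not a real token):

```python
from urllib.parse import quote

# Placeholder token for illustration; substitute your real sa_... token.
token = "sa_EXAMPLEonly.deadbeef"

base = "https://searchable-tracker.searchable.workers.dev/v1/cloudflare-logs"

# Percent-encode the header value so the space in "Bearer <token>" becomes %20.
# safe="" would also encode any reserved characters, although the sa_ token
# alphabet is already URL-safe, as the notes above point out.
destination = f"{base}?header_Authorization={quote(f'Bearer {token}', safe='')}"
print(destination)
```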

Verifying the connection

Cloudflare batches Logpush deliveries every ~30–60 seconds, so events take a moment to appear after the first AI bot hit. In Searchable:
  1. Go to Agent Analytics → Setup
  2. Look at the status strip at the bottom of the page
  3. Click Check if it still shows “Waiting for first event”
You can also confirm in Cloudflare: Analytics & Logs → Logpush → your job. The job’s metrics show successful and failed delivery counts.

Multiple zones

Each Logpush job is scoped to a single Cloudflare zone. If your domain is split across multiple zones (e.g. example.com and example.io):
  1. Generate one integration token per zone, or reuse one — both work
  2. Create one Logpush job per zone with the same destination URL
  3. Searchable will tag events by host, so you’ll still see them per-domain in the dashboard

Troubleshooting

Open the job in Cloudflare and check the error message:
  • 401 Unauthorized — the header_Authorization query parameter is missing or malformed. Re-copy the destination URL from Searchable, or rebuild it manually using the format above.
  • 400 Bad Request — most often timestamps=unix is missing. Edit the job and confirm timestamp format is Unix.
  • Invalid destination URL — Cloudflare’s destination validation pings the endpoint. If validation fails, double-check the host (searchable-tracker.searchable.workers.dev) and that the URL has the ?header_Authorization=Bearer%20… query string attached.
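Before re-saving the job, a quick local sanity check can catch the two most common destination-URL mistakes. This mirrors the bullets above; the host and parameter names are taken from this page:

```python
from urllib.parse import urlsplit, parse_qs

def check_destination(url: str) -> list:
    """Return a list of problems found in a Searchable destination URL.
    Illustrative pre-flight check only; it mirrors the troubleshooting
    bullets above rather than Searchable's actual validation logic."""
    problems = []
    parts = urlsplit(url)
    if parts.hostname != "searchable-tracker.searchable.workers.dev":
        problems.append("unexpected host")
    # parse_qs decodes %20 back to a space, so a well-formed value
    # reads "Bearer sa_..." here.
    auth = parse_qs(parts.query).get("header_Authorization", [""])[0]
    if not auth.startswith("Bearer "):
        problems.append("header_Authorization missing or malformed")
    return problems
```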
Logpush only delivers logs for traffic that’s actually flowing through Cloudflare. Things to check:
  • The Cloudflare zone is proxied (orange cloud) for the routes AI bots are hitting, not just DNS-only (grey cloud)
  • The Logpush job is enabled (jobs can be paused)
  • Your domain in Searchable matches the zone’s hostname (check Agent Analytics → Setup → Confirm your domain)
If everything looks right, hit your site with a known AI user agent and wait ~60 seconds for the next Logpush batch:
curl -H "User-Agent: GPTBot/1.0 (+https://openai.com/gptbot)" https://yourdomain.com/
If you also run the Cloudflare Worker path on the same zone, both will send each AI-bot request. Searchable de-duplicates server-side using the Cloudflare Ray ID, so the dashboard shows each request once. If you’d rather not run both, pick one and disable the other.
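The de-duplication described here can be sketched as follows. This is illustrative only; Searchable's actual server-side implementation is not public, but the Cloudflare Ray ID uniquely identifies a request, so keying on it is the natural approach:

```python
def dedupe_by_ray_id(events):
    """Keep the first event seen for each Cloudflare Ray ID.

    When Logpush and the Worker both report the same request, the two
    events share a RayID, so only the first to arrive is kept.
    """
    seen = set()
    unique = []
    for event in events:
        ray_id = event["RayID"]
        if ray_id in seen:
            continue  # already recorded via the other source
        seen.add(ray_id)
        unique.append(event)
    return unique
```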
Logpush itself is Enterprise-only on Cloudflare; there is no way to enable it on other plans. If you are not on Enterprise, use the Cloudflare Worker path instead, which works on Free, Pro, Business, and Enterprise.

Removing the integration

  1. Cloudflare → Analytics & Logs → Logpush → disable or delete the job
  2. Searchable → Agent Analytics → Setup → Tokens → revoke the token
Both sides are independent — revoking the token alone is enough to stop ingestion immediately, even if the job stays configured (its deliveries will start returning 401).

Next steps

Cloudflare Worker

Edge Worker option — works on every Cloudflare plan.

See the data

Open Agent Analytics to see which assistants are crawling your site.