Quickstart
Sign up, create an endpoint, paste the URL into Stripe (or any source), watch the row appear in BigQuery. About 15 minutes.
What you'll have at the end
A live Hooktopus endpoint that accepts webhook POSTs, archives every event to R2, and writes typed rows into a BigQuery table called events partitioned by day and clustered by endpoint name. Plus your first dbt staging model, downloaded as a ZIP and ready to drop into your repo.
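Nothing below needs to be run by hand; it's just a sketch of the layout being described, so you know what to expect when you query the table. Project and dataset names match the query in step 5, and the real table may carry extra columns.

-- Illustrative only: the events table is provisioned for you.
CREATE TABLE `yourproject.hooktopus_raw.events` (
  event_id      STRING,
  received_at   TIMESTAMP,
  endpoint_name STRING,
  payload       JSON
)
PARTITION BY DATE(received_at)
CLUSTER BY endpoint_name;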
1 · Sign up for Hooktopus
Head to app.hooktopus.io/sign-up and sign up with Google or a magic link. You'll be redirected to a setup wizard that creates your first workspace. The workspace ID looks like ws_abc123 and is part of every endpoint URL.
2 · Create an endpoint
Endpoints are URLs that accept webhook POSTs. Each one belongs to a workspace. Hit the New endpoint button on /w/[slug]/endpoints and give it a name — something like stripe or shopify_orders. Snake case is the convention.
You'll get a URL that looks like:
https://in.hooktopus.io/ws_abc123/stripe
The name becomes the endpoint_name column in BigQuery. Pick something you'll be happy with in dbt sources.
3 · Paste it into your source
For Stripe, that's Developers → Webhooks → Add endpoint. For Shopify, it's Settings → Notifications → Webhooks. For your own application, just POST to the URL when an event happens. Hooktopus handles signature verification for the ~30 sources we support out of the box.
Or test with curl first
curl -X POST "https://in.hooktopus.io/ws_abc123/stripe" \
  -H 'content-type: application/json' \
  -d '{
    "id": "evt_test_1",
    "type": "charge.succeeded",
    "data": { "object": { "amount": 12999, "currency": "usd" } }
  }'
# → 202 {"ok":true,"event_id":"019e326c-96f5-7a5f-..."}
The Inspector at /w/[slug]/inspector shows the event arriving in real time. (Inspector ships in Phase C of our build plan; for now you can verify in R2.)
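Since the Inspector isn't live yet, a quick way to peek at the archive is any S3-compatible client, because R2 speaks the S3 API. The bucket name, prefix layout, and account ID below are placeholders, not guaranteed values; you'll need S3-style credentials for the archive bucket.

# Placeholder bucket, prefix, and account ID; substitute your own values.
aws s3 ls "s3://your-archive-bucket/ws_abc123/stripe/" \
  --endpoint-url "https://<account-id>.r2.cloudflarestorage.com"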
4 · Add a BigQuery destination
Hit Destinations → Add destination. The five-step wizard walks through:
- Pick destination type (BigQuery is GA; Postgres is coming in v1.2)
- Copy a gcloud snippet that creates a service account (a rough equivalent is sketched below the list)
- Grant bigquery.dataEditor on your dataset
- Paste the SA JSON (we encrypt it with AES-GCM)
- Pick the dataset and click Test connection — we write/read/delete a test row
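Steps 2 and 3 boil down to roughly the gcloud calls below. The service-account name and project ID are placeholders, and the grant is shown at project level for brevity; the wizard's snippet scopes it to your dataset.

# Create a dedicated service account for Hooktopus (the name is an example).
gcloud iam service-accounts create hooktopus-writer --project yourproject

# Project-level grant for brevity; prefer the dataset-scoped grant from the wizard.
gcloud projects add-iam-policy-binding yourproject \
  --member "serviceAccount:hooktopus-writer@yourproject.iam.gserviceaccount.com" \
  --role "roles/bigquery.dataEditor"

# Key file to paste into step 4 of the wizard.
gcloud iam service-accounts keys create hooktopus-sa.json \
  --iam-account hooktopus-writer@yourproject.iam.gserviceaccount.com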
Full IAM steps live at /docs/bigquery-setup.
5 · Watch the row land
SELECT event_id, received_at, endpoint_name, payload
FROM `yourproject.hooktopus_raw.events`
ORDER BY received_at DESC
LIMIT 5;
Within 30 seconds of the curl above, you'll see a row with endpoint_name = 'stripe' and the payload as a native BigQuery JSON column.
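Because payload is a native JSON column, you can pull fields out of it with BigQuery's JSON functions before doing any modeling. The paths below match the curl test event from step 3:

SELECT
  event_id,
  JSON_VALUE(payload, '$.type') AS event_type,
  SAFE_CAST(JSON_VALUE(payload, '$.data.object.amount') AS INT64) AS amount
FROM `yourproject.hooktopus_raw.events`
WHERE endpoint_name = 'stripe'
ORDER BY received_at DESC
LIMIT 5;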
6 · Generate your first dbt staging model
After ~50 events land, head to /w/[slug]/dbt-export. Pick the endpoint, scan the observed fields (each shows the percentage of events it appears in and an inferred type badge), toggle off the ones you don't want, then hit Download ZIP. Drop the contents into models/staging/hooktopus/ in your dbt project, commit, and run:
dbt run --select stg_stripe
dbt test --select stg_stripe
You now have a typed table feeding the rest of your dbt project. Schema drift alerts are on by default — if Stripe quietly adds a field next week, you'll hear about it in Slack within the hour.
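For reference, the exported model follows the usual dbt staging pattern. A rough sketch of what stg_stripe.sql might contain (the source name and extracted columns here are illustrative; the real file reflects the fields you kept in the export step):

-- Illustrative sketch; your exported model will reflect the fields you selected.
with source as (
    select * from {{ source('hooktopus', 'events') }}
),

renamed as (
    select
        event_id,
        received_at,
        json_value(payload, '$.id') as stripe_event_id,
        json_value(payload, '$.type') as event_type,
        safe_cast(json_value(payload, '$.data.object.amount') as int64) as amount,
        json_value(payload, '$.data.object.currency') as currency
    from source
    where endpoint_name = 'stripe'
)

select * from renamed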
Next steps
- dbt integration — how regeneration works without overwriting your edits
- Schema drift alerts — configure Slack and the heartbeat threshold
- Replay from R2 — re-write any time range to recover from outages