Webhooks to your data warehouse.
Connect any source. We handle the pipeline and write the dbt staging models, so you can skip straight to analysis.
No card · 15 minutes from signup to first chart
01 · Your sources
Any webhook source.
Paste your endpoint URL into Stripe, Shopify, GitHub, or any custom source. Signature verification (HMAC and 30+ provider schemes) handled for you.
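Most of those provider schemes boil down to an HMAC over the raw request body. A minimal sketch of the common pattern; this is illustrative, not Hooktopus's implementation, and real providers encode the header differently:

```python
import hashlib
import hmac

def verify_signature(raw_body: bytes, header_sig: str, secret: str) -> bool:
    """Constant-time check of an HMAC-SHA256 webhook signature.

    Sketch of the common core only: Stripe folds a timestamp into the
    signed string, GitHub prefixes the header with 'sha256=', and so on,
    but the HMAC-over-raw-body check underneath is the same.
    """
    expected = hmac.new(secret.encode(), raw_body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, header_sig)
```

Note the `compare_digest` call: comparing signatures with `==` leaks timing information, which is why every provider's docs insist on a constant-time comparison.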
02 · Hooktopus
Catches every event.
- Never drops an event, even during outages.
- Cleans the payload into typed columns.
- Slack alert when payloads change shape.
03 · Your BigQuery + dbt
Rows + a staging model.
⌁ stg_stripe.sql · generated
Catches webhooks from anywhere — 30+ sources at launch
The pipe
Three things, each done well.
No transforms. No identity resolution. No workflow builder. We're a webhook ingestor that speaks BigQuery and dbt — and that's deliberate.
Catch every event
One URL per source. Your events are safely stored before Hooktopus even acknowledges them — so a downstream outage never costs you data.
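The ordering is the whole trick: acknowledge only after a durable write. A few lines sketch it, assuming an append-only NDJSON file standing in for whatever storage Hooktopus actually runs on:

```python
import json
import os

class EventLog:
    """Append-only event log: the event is fsync'd to disk before the
    HTTP 200 goes back, so a crash or downstream outage after the ack
    can never lose data. Sketch only; Hooktopus's storage layer is not
    public and is surely more elaborate than a flat file."""

    def __init__(self, path: str):
        self.path = path

    def ingest(self, endpoint_name: str, payload: dict) -> int:
        """Durably append one event, then return the HTTP status to send."""
        line = json.dumps({"endpoint": endpoint_name, "payload": payload})
        with open(self.path, "a") as f:
            f.write(line + "\n")
            f.flush()
            os.fsync(f.fileno())  # on disk *before* we acknowledge
        return 200  # the source only ever sees success
```

Because the source gets its 200 from the log write, not from the warehouse, a BigQuery outage just means the replay cursor falls behind; the events themselves are already safe.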
Land in BigQuery
Clean, typed columns in your BigQuery dataset — not a JSON blob you have to parse downstream. Apostrophes, quotes, emojis all survive intact.
Speak dbt
Auto-generates a typed stg_<source>.sql and a sources YAML from the events you've already received. Drop the ZIP into your dbt project, run it, ship it.
The wedge
The webhook tool that speaks dbt.
Other tools dump JSON in a table and leave you to write 200 lines of JSON_VALUE casts. Hooktopus walks the payload, observes types, and ships you a typed staging model — ready for your repo.
1 · We observe.
A sampled walker scans every payload, recording each JSON path → {observed_types, count, sample_values}.
2 · You threshold.
Pick the fields seen in ≥X% of events. Default is 1% — keeps the 200 optional Stripe fields out of your model.
3 · We render.
Always-numeric paths get SAFE_CAST, strings get JSON_VALUE, objects get JSON_QUERY. Plus raw_payload as the escape hatch.
4 · You ship.
Download the ZIP, drop it into models/staging/hooktopus/, run dbt run --select stg_stripe. Done.
{{ config(materialized='incremental', unique_key='event_id') }}

with source as (

    select
        event_id,
        received_at,
        json_value(payload, '$.id') as id,
        json_value(payload, '$.type') as event_type,
        cast(json_value(payload, '$.created') as int64) as created_unix,
        json_value(payload, '$.data.object.id') as object_id,
        safe_cast(json_value(payload, '$.data.object.amount') as int64) as amount_cents,
        json_value(payload, '$.data.object.currency') as currency,
        payload as raw_payload
    from {{ source('hooktopus', 'events') }}
    where endpoint_name = 'stripe'
    {% if is_incremental() %}
        and received_at > (select max(received_at) from {{ this }})
    {% endif %}

)

select * from source
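The observe → threshold → render loop behind a model like that can be sketched end to end. Everything here is illustrative: the real generator also handles arrays, sampling, name collisions, and the sources YAML, and the function names are ours, not Hooktopus's:

```python
from collections import defaultdict

def walk(obj, prefix="$"):
    """Yield (json_path, type_name) for every node in a payload.
    Sketch only: arrays and sampling are ignored here."""
    if isinstance(obj, dict):
        yield prefix, "object"
        for key, value in obj.items():
            yield from walk(value, f"{prefix}.{key}")
    else:
        yield prefix, type(obj).__name__

def observe(payloads):
    """Step 1: record observed types and counts per JSON path."""
    stats = defaultdict(lambda: {"types": set(), "count": 0})
    for payload in payloads:
        for path, type_name in walk(payload):
            stats[path]["types"].add(type_name)
            stats[path]["count"] += 1
    return stats

def render(stats, total, threshold=0.01):
    """Steps 2 and 3: keep paths seen in >= threshold of events, then pick
    an extractor per observed type: always-numeric gets SAFE_CAST, nested
    objects get JSON_QUERY, everything else gets JSON_VALUE."""
    columns = []
    for path, s in sorted(stats.items()):
        if path == "$" or s["count"] / total < threshold:
            continue
        name = path[2:].replace(".", "_")  # '$.data.object.id' -> 'data_object_id'
        if s["types"] <= {"int", "float"}:
            columns.append(f"safe_cast(json_value(payload, '{path}') as int64) as {name}")
        elif "object" in s["types"]:
            columns.append(f"json_query(payload, '{path}') as {name}")
        else:
            columns.append(f"json_value(payload, '{path}') as {name}")
    return columns
```

Feeding a few hundred Stripe events through `observe` and `render` yields select-list lines shaped exactly like the generated model above; the 1% threshold is what keeps the long tail of optional fields out.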
The early-warning system
Tells you when your source changes its mind.
Stripe quietly adds a field. Shopify renames one. Your stg model breaks at 3am. Hooktopus watches every payload — when the shape drifts, you hear about it in Slack within the hour.
slack · #data-alerts
New field in stripe.charge.succeeded
Field data.object.payment_method_details.card.iin appeared in 92% of events over the last hour. Sample value: "424242".
- Three drift classes: new field, type change, field disappearing. Each gets its own copy and severity.
- Aggregated, not noisy: 1-hour window, minimum 10 events with the change before firing. One Slack message, not 10,000.
- Three delivery channels: Slack webhook, Resend email, in-app dashboard. Pick the ones you'll actually read.
- Daily digest: a Tuesday-morning rollup of every change Hooky noticed, so you don't miss the slow ones.
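The aggregation behind the "new field" drift class can be sketched in a few lines. The function names and the flat set of known paths are assumptions for the sketch, not Hooktopus internals:

```python
from collections import Counter

def paths_of(obj, prefix="$"):
    """Flatten a payload into its set of JSON paths (sketch: dicts only)."""
    paths = set()
    if isinstance(obj, dict):
        for key, value in obj.items():
            paths.add(f"{prefix}.{key}")
            paths |= paths_of(value, f"{prefix}.{key}")
    return paths

def new_field_drift(window_events, known_paths, min_count=10):
    """Aggregate the new-field drift class over a window: a path must show
    up in at least min_count events before anything fires, so one burst of
    changed payloads yields one alert, not thousands."""
    seen = Counter()
    for payload in window_events:
        for path in paths_of(payload) - known_paths:
            seen[path] += 1
    return {path: n for path, n in seen.items() if n >= min_count}
```

Run once per window against the schema observed so far, anything the function returns becomes one aggregated Slack message, like the card.iin example above.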
Meet your hook-catcher
Hi, I'm Hooky.
Hooky's the reason this product feels less like infrastructure and more like a colleague. He shows up worried when your destination errors out, excited when you hit a milestone, and sleepy when you haven't sent events in a while. Eight arms keep the data moving. One job keeps us focused.
Quiet wins
Built for analysts who already pay for their stack.
Hooktopus isn't trying to be your CDP, your iPaaS, or your reverse ETL. It does the one thing in between — the part nobody loves writing.
“We replaced a Cloud Run service and 600 lines of dbt boilerplate. The dbt generator saves us a full day every time a new webhook source goes live.”
Analytics Engineer
Series B fintech
“Honest pricing, dbt-native, doesn't pretend to be a workflow tool. Took 20 minutes from signup to first chart in Hex.”
Head of Data
DTC apparel
“Drift alerts caught a Stripe schema change two weeks before it would've broken our finance models. Worth $19 a month forever.”
Senior Data Analyst
B2B SaaS, ~80 people
Pricing
An order of magnitude cheaper than the workflow tools.
Same volume on Zapier or n8n runs ~$8/1k events. Hooktopus starts at $0.19/1k and drops from there.
Free
$0/mo
up to 10k events
Most popular
Starter
$19/mo
up to 100k events
Pro
$49/mo
up to 500k events
Business
$99/mo
up to 2M events
Scale
$299/mo
up to 10M events
FAQ
Questions we get on day one.
Can I try it free?
Yes. The free tier covers 10,000 events / month forever — no card, no trial period. Plenty to evaluate the whole product end-to-end.
How long does setup take?
About fifteen minutes from signup to your first chart. The destination wizard walks you through the warehouse setup with the exact commands to copy-paste.
Which sources does it work with?
Anything that POSTs JSON to a URL. Signature verification is handled out of the box for Stripe, Shopify, GitHub, HubSpot, Twilio, Linear, Intercom, and 25+ more. For your own application, just POST to the endpoint.
Is my data safe if your service has an issue?
Yes. Every event is backed up on our side before we acknowledge receipt, so your source never sees a failure. If your warehouse is down we retry; once it's back, you can replay any time range with one click. Nothing is lost.
Can I export and leave whenever I want?
Yes. Your data lives in your own warehouse — we don't hold it hostage. Close your account from settings and we delete every trace of you within 24 hours. Your warehouse stays untouched.
Do I have to use dbt?
No. If you don't use dbt you still get clean typed rows in your warehouse — just skip the staging-model export. dbt is the deepest integration but it's not required.
Ship today
Webhook → BigQuery in five minutes.
Free for the first 10k events per month. No card. No "talk to sales." Just paste a URL.