How to verify a conversion is actually firing — without trusting the dashboard
The platform dashboard is the last place to check whether a conversion is firing correctly. Here's the four-layer verification stack that catches the failures dashboards hide — pixel-side, network-side, server-side, and platform-side.
The Google Ads dashboard says "Recording conversions." The GA4 conversion event report shows daily volume. Both look healthy.
Then the client asks why ROAS dropped 30% last month and you discover the conversion has been firing without the value parameter for six weeks. Optimization has been blind. The dashboard never told you because the dashboard counts the firings, not the contents.
Conversion tracking has four independent layers, and you have to check all of them. The dashboard is layer four — the last one to falsify, the first one to confidently lie.
The four layers, in order
- Pixel layer (browser): the tag actually fires when the user converts.
- Network layer (browser): the request leaves the browser intact.
- Server layer (your backend or the platform's edge): the request is received and processed.
- Platform layer (Google Ads, GA4, Meta Events Manager): the conversion is attributed correctly.
A break at layer 1 takes layers 2-4 down with it. A break at layer 2 (consent mode, ad blocker, network failure) lets layer 1 succeed but breaks the rest. A break at layer 3 (CAPI failure, server-side validation rejecting the event) lets 1+2 succeed but the event never lands. A break at layer 4 (parameter rename, deduplication misconfig, attribution model change) lets 1+2+3 succeed but the conversion is mis-counted or invisible.
Verification works upward. You can't trust layer 4 unless you've verified 1-3 first.
Layer 1: pixel verification (the GTM Preview check)
In GTM Preview mode, trigger the conversion event on the live site. Watch the Tags tab.
For each conversion-firing tag, confirm:
- Tag fired? Should appear in "Tags Fired" with a timestamp matching the user action (within 1-2 seconds).
- Right trigger? Click the tag — see which trigger event matched. If "All Pages" matched but you expected "Purchase Completed", the tag is firing on every page view (over-counting).
- Right values? Click the tag → "Properties" → confirm `value`, `currency`, `transaction_id`, `event_id`, and any custom parameters are present and non-empty. Empty `value` parameters are the most common silent failure.
Common gotcha: `value` is set via a dataLayer variable that reads `dataLayer.transaction_value`, but the page actually pushes `dataLayer.value`. The variable resolves to `undefined`, the tag fires, and the value is missing. The dashboard shows the conversion but reports $0.
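A minimal sketch of that gotcha (event and key names here are illustrative, not from a real container): the page pushes the conversion payload, and a simulated Data Layer Variable lookup shows why a mismatched key resolves to `undefined` while the tag still fires.

```javascript
// Page-side push — the key names must match what the GTM variable reads.
const window = globalThis.window ?? globalThis;
window.dataLayer = window.dataLayer || [];

function trackPurchase(order) {
  window.dataLayer.push({
    event: 'purchase_completed',       // hypothetical trigger event name
    value: order.total,                // GTM must read `value`, not `transaction_value`
    currency: order.currency,
    transaction_id: order.id,
    event_id: `purchase-${order.id}`   // shared browser/server ID for deduplication
  });
}

// Simulate what a GTM Data Layer Variable resolves for a given key.
function resolveVariable(key) {
  const latest = window.dataLayer[window.dataLayer.length - 1];
  return latest ? latest[key] : undefined;
}

trackPurchase({ total: 49.99, currency: 'USD', id: 'T-1001' });
console.log(resolveVariable('value'));             // 49.99
console.log(resolveVariable('transaction_value')); // undefined — the silent $0 failure
```

If the variable in GTM were configured as `transaction_value`, the tag would still fire (passing the layer-1 "did it fire" check) while carrying no value — exactly the failure the Properties inspection catches.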
Layer 2: network verification (Network tab)
Open DevTools → Network tab → filter by domain. Trigger the conversion. Look for:
- Pixel image requests: GA4 hits `https://www.google-analytics.com/g/collect`, Meta Pixel hits `https://www.facebook.com/tr/`, etc. Confirm the request was sent (not blocked or pending).
- Status code 2xx: not 4xx (rejected), not pending (still in flight when the page navigates away).
- Response body (when applicable): some endpoints return validation feedback. GA4's `/g/collect` responds 204 No Content on success.
Three common layer-2 failures:
- Ad blocker — request never leaves the browser. uBlock Origin / Adblock Plus / Brave shields all block analytics.com and connect.facebook.net by default. Test in both states (with + without).
- Consent mode — the request is blocked before send because consent is `denied`. Look for the `ga-disable-{measurement_id}` window flag or for the request to silently not appear.
- Page navigation cancels the request — common on form submits where the page redirects before the pixel finishes. Fix: use `transport_type: 'beacon'` on GA4, or `fetch` with `keepalive: true` on custom pixels.
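The `keepalive` fix for the third failure can be sketched like this (the endpoint is a placeholder, not a real pixel URL) — the flag asks the browser to finish the request even after the page unloads, which a default `fetch` does not guarantee:

```javascript
// Build the outbound conversion request; `keepalive` is the important flag.
function conversionRequest(payload) {
  return {
    method: 'POST',
    keepalive: true,   // browser completes the send even if the page navigates away
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload)
  };
}

// Hypothetical custom-pixel sender; GA4's transport_type: 'beacon'
// achieves the same survival-across-navigation via navigator.sendBeacon.
function sendConversion(url, payload) {
  return fetch(url, conversionRequest(payload));
}
```

Note `keepalive` requests have a small body-size budget in browsers (roughly 64 KB in-flight), so keep custom-pixel payloads lean.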
Layer 3: server-side verification
If the conversion uses Conversions API (CAPI), Enhanced Conversions for Web (ECW), or your own server-side event forwarding, the request goes through your backend or the platform's edge. Verify it landed.
For Meta CAPI:
- Events Manager → Test Events: send a test event with a known `test_event_code`. It should appear in the test panel within seconds with a "Server" badge. Delete it after verifying.
- Match Quality score: should be >7 for the dataset. Lower means your CAPI payload is sparse on user identifiers (`em`, `ph`, `fbc`, `fbp`). See Meta Pixel + CAPI without double-counting for the dedup contract.
For Google Ads Enhanced Conversions:
- Conversion action diagnostics → Diagnostic column: should show "Recording conversions" with no warnings. Common warnings: "Few enhanced conversions" (ECW user-data isn't being sent), "Tag misconfigured."
- Diagnostics tab → "Enhanced conversions match rate": should be >50%. Lower means your hashed-user-data payload is missing fields.
For your own server-side relay (e.g. server-side GTM):
- Log every outbound conversion request with status + response body for 7 days post-deploy. Sample 10 — confirm payloads look right.
- Sentry-instrument the relay so failures surface immediately.
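A minimal sketch of that relay wrapper, assuming an in-memory `logStore` for illustration (a real relay would write to durable logs, and the Sentry call is shown as a comment since the SDK isn't imported here):

```javascript
// Every outbound conversion request is logged with status + response body,
// and failures surface immediately rather than vanishing silently.
const logStore = [];

async function forwardConversion(url, payload, fetchImpl = fetch) {
  const entry = { url, payload, ts: new Date().toISOString() };
  try {
    const res = await fetchImpl(url, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(payload)
    });
    entry.status = res.status;
    entry.responseBody = await res.text();  // keep validation feedback for sampling
  } catch (err) {
    entry.status = 'network_error';
    entry.error = String(err);
    // Sentry.captureException(err);        // surface the failure immediately
  }
  logStore.push(entry);
  return entry;
}
```

Sampling 10 entries from `logStore` after a deploy is then a matter of reading `status` and `responseBody` — the layer-3 equivalent of the Network-tab check at layer 2.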
Layer 4: platform verification (the dashboard)
Now check the platform dashboard — but with the bias that you've already verified 1-3, so any discrepancy here is likely an attribution / configuration issue, not a tracking failure.
For each conversion type:
- Google Ads: Conversion actions list → "Last conversion" timestamp. Should be recent (within hours for active campaigns). Also check the "Source" column — direct from website tag vs. imported from GA4 vs. offline upload. Expecting one source while the action is configured with another is a frequent miss.
- GA4: Reports → Realtime → Events. Trigger the conversion live, watch it appear. Then Reports → Engagement → Events → confirm the event is marked as a key event. Then Reports → Monetization → Ecommerce purchases → confirm value attributed.
- Meta: Events Manager → Overview → "Last received" per event. Match Quality + dedup % per event.
When the dashboard disagrees with layers 1-3, the diagnosis is usually:
- Attribution model differences: dashboard counts differently than you expect (e.g. data-driven vs. last-click). See GA4 vs Google Ads conversions for the four reasons numbers don't match.
- Lookback window: the conversion fired today but is credited to a click 12 days ago; the dashboard reports it on that historical date.
- Counting setting: "every" vs. "one" per click. A purchase action set to "one" deduplicates per click and drops repeat purchases.
Make the four-layer check a recurring rhythm
A one-time verification is worth doing. A monthly recurring verification per client is worth running. The structure:
| Client | Conversion | Layer 1 | Layer 2 | Layer 3 | Layer 4 | Last verified |
|---|---|---|---|---|---|---|
| Acme Bakery | purchase | ✅ | ✅ | ✅ | ✅ | 2026-04-15 |
| Acme Bakery | add_to_cart | ✅ | ✅ | n/a | ⚠️ value drift | 2026-04-15 |
| Beta Co | lead_submit | ❌ ad blocker | — | — | — | 2026-04-08 |
The columns are independent — each gets verified separately. The "last verified" column is the operating discipline; without it the table rots into a snapshot.
What this looks like in a tracking map
The four-layer check is the operational form of what every client-side conversion node SHOULD have as metadata: a `lastVerifiedAt` timestamp, a `healthStatus` (working / broken / missing / unverified), and a `notes` field describing the layer that broke if it did.
That's exactly the tracking infrastructure map shape: typed nodes with health state and verification recency, queryable across all clients in one view.
Without that structure, the four-layer check is a Google Sheet that goes stale the day after you close the audit. With it, the question "show me every Google Ads conversion that's been unverified for 30+ days, across every client" is one filter.
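As a sketch, the node shape and that one-filter query might look like this — the field names `lastVerifiedAt`, `healthStatus`, and `notes` come from the text above, while the record values and the `platform` field are illustrative:

```javascript
// Typed-ish conversion nodes, mirroring the verification table above.
const nodes = [
  { client: 'Acme Bakery', platform: 'google_ads', event: 'purchase',
    healthStatus: 'working', lastVerifiedAt: '2026-04-15', notes: '' },
  { client: 'Beta Co', platform: 'google_ads', event: 'lead_submit',
    healthStatus: 'broken', lastVerifiedAt: '2026-04-08',
    notes: 'layer 2: ad blocker' }
];

// "Every Google Ads conversion unverified for 30+ days, across clients"
// as a single filter over the node list.
function staleGoogleAdsNodes(nodes, asOf, maxAgeDays = 30) {
  const cutoff = new Date(asOf) - maxAgeDays * 86_400_000; // days → ms
  return nodes.filter(n =>
    n.platform === 'google_ads' &&
    new Date(n.lastVerifiedAt).getTime() < cutoff
  );
}
```

The point is not this particular data structure — it's that once `lastVerifiedAt` is a queryable field rather than a spreadsheet column, recency checks stop depending on anyone remembering to look.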
The takeaway
Conversion dashboards are the last place to check whether tracking is healthy, not the first. The four-layer verification works bottom-up: pixel fires, request leaves, server receives, platform attributes. Any layer can break independently of the others, and a higher layer succeeding tells you nothing about a lower one.
Run the check monthly, document per-client, and the conversation "are we tracking everything we should be?" goes from days of investigation to one filter on a dashboard you already own.