
2026-04-22 / 9 MIN READ

Reconciling GA4, Meta, and Shopify purchase counts

The short, full, and nuanced answer on GA4/Meta/Shopify reconciliation. Five sources of variance, and the one number to trust per question asked.

"My GA4 purchase count does not match Shopify, and neither of them matches Meta. Who is right?"

I hear this question at least once a month from DTC operators. The answer is all of them and none of them, depending on which question you are trying to answer. Each platform is counting something slightly different, and a healthy analytics stack shows some variance between them. Zero variance almost always means something is broken, not that everything is clean.

This is the short, full, and nuanced answer on GA4/Meta/Shopify reconciliation, drawn from a multi-platform analytics engine I shipped earlier this year, where this exact variance was the first thing every stakeholder noticed.

[Figure: healthy variance bands. Shopify 100%, GA4 91%, Meta 78% on a 0-130% axis.]

[healthy] GA4 lands 9% below Shopify; Meta lands 22% below (Meta only sees Meta-attributed).

Shopify anchors at 100%; healthy GA4 and Meta land inside the 70-100% band. Outside it, the variance is a bug.

The short answer

Trust Shopify for total orders and revenue. Trust GA4 for the web sessions and user journeys that produced purchases. Trust Meta for conversions the ad algorithm is optimizing against, within its own attribution window.

All three numbers will disagree. That is expected, not a defect. The variance between them is the signal worth watching.

The fuller answer: what each platform actually counts

GA4, Meta, and Shopify are three different tools measuring three overlapping slices of the same commercial reality. Shopify reports orders that completed. Meta reports conversions the ad algorithm is willing to attribute to ad exposures. GA4 reports purchase events from sessions it managed to cookie. They rarely agree on the denominator, let alone the numerator.

Shopify is the closest thing to ground truth. It counts every completed order, including POS, wholesale channels, B2B, draft orders manually converted, and subscription rebills from Recharge or Bold. It carries no attribution and no filtering. If the order processed a payment, Shopify counted it.

GA4 counts purchase events from web sessions it could observe. Cookie loss matters a lot here. An iPhone user in Safari who cleared their ITP-gated cookies between the ad click and the purchase will show up in Shopify but not in GA4's attribution model. GA4 also filters bot traffic and obeys consent-mode gating, so numbers move around quietly as consent rates shift.

Meta counts conversions within its own attribution window (typically 7-day click or 1-day view post-iOS, though you can tune this). Meta's count is a superset of the conversions GA4 would attribute to Meta (because Meta uses probabilistic modeling to fill in unreported conversions) and a subset of the Shopify total (because not every order came from a Meta ad).

Five sources of variance

Every GA4/Meta/Shopify reconciliation I have run has decomposed the disagreement into some mix of these five sources.

Consent and bot filtering. GA4 respects consent-mode v2; a user who declined analytics cookies is invisible. Bots are filtered heuristically by each platform with different thresholds. Meta's bot filter is opaque; Shopify has no equivalent because payment processors drop most bot traffic before checkout anyway.

Attribution window mismatch. GA4's default attribution is 30-day click, 1-day view, using data-driven attribution. Meta's default is 7-day click, 1-day view. If a customer sees a Meta ad on day 1, clicks nothing, and buys on day 10, Meta misses it and GA4 catches it. Reverse the window lengths and the miss flips to the other platform. Before reconciling, confirm the windows on both sides.
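The window arithmetic is worth making concrete. A minimal sketch (a hypothetical helper, not from any platform SDK) shows how the same day-10 purchase falls inside a 30-day window and outside a 7-day one:

```python
from datetime import date, timedelta

def within_click_window(click: date, purchase: date, window_days: int) -> bool:
    """True if the purchase lands inside the click-attribution window."""
    return click <= purchase <= click + timedelta(days=window_days)

# Day-1 click, day-10 purchase: inside a 30-day window, outside a 7-day one.
click = date(2026, 3, 1)
purchase = date(2026, 3, 10)
print(within_click_window(click, purchase, 30))  # True
print(within_click_window(click, purchase, 7))   # False
```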

POS, draft orders, and non-web commerce in Shopify. Shopify totals include orders that never touched your website. If you run POS at retail events or have a B2B channel placing draft orders, those show up in Shopify totals but never appear in GA4 or Meta. Filter them out before comparing.

Dedup failures. If your CAPI implementation is double-counting, Meta's number will be inflated by 30 to 60 percent relative to Shopify. If the browser pixel and the server event disagree on event_id, Meta counts both. I covered the fix in the event_id strategy across Shopify Pixel, CAPI, and GTM. A dedup failure looks exactly like an attribution problem until you grep the payloads.
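When grepping payloads for a dedup failure, the telling symptom is two events for the same order with different event_ids: Meta can only dedup events that share one. A hedged sketch (the `event_id` field name follows Meta's CAPI payload convention; the helper itself is hypothetical):

```python
from collections import Counter

def find_duplicate_event_ids(events: list[dict]) -> dict[str, int]:
    """Return event_ids that appear more than once across pixel + CAPI payloads."""
    counts = Counter(e.get("event_id") for e in events)
    return {eid: n for eid, n in counts.items() if eid and n > 1}

events = [
    {"event_id": "order_1001", "source": "pixel"},
    {"event_id": "order_1001", "source": "capi"},  # shared id: Meta dedups this pair
    {"event_id": "srv_1002", "source": "capi"},
    {"event_id": "px_1002", "source": "pixel"},    # same order, different ids: counted twice
]
print(find_duplicate_event_ids(events))  # {'order_1001': 2}
```

Orders that appear only under mismatched ids (like order 1002 above) are the double-count; pairs that share an id are the healthy case.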

Currency and refund handling. Shopify reports gross order value in store currency. GA4 reports purchase_value based on what the event sent. Meta reports conversion value in the ad account currency. If any of these do not agree on decimal handling, the currency conversion rate at event time, or whether refunds have been offset, you will see percentage-scale disagreement that looks like a data bug and is actually a bookkeeping artifact.
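Before diffing revenue, it helps to force every platform's value through one explicit conversion and rounding step. A sketch with `Decimal` (the rates and field handling here are illustrative, not from any platform API):

```python
from decimal import Decimal, ROUND_HALF_UP

def normalize_value(amount: str, currency: str,
                    rates_to_usd: dict[str, Decimal]) -> Decimal:
    """Convert a platform-reported value to USD with explicit decimal handling."""
    usd = Decimal(amount) * rates_to_usd[currency]
    return usd.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

rates = {"USD": Decimal("1"), "EUR": Decimal("1.08")}
shopify_value = normalize_value("99.00", "EUR", rates)  # store currency
meta_value = normalize_value("106.92", "USD", rates)    # ad-account currency
print(shopify_value == meta_value)  # True: same order, two currencies
```

Using floats instead of `Decimal` here is exactly the kind of decimal-handling drift the paragraph above describes.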

The nuanced answer: when discrepancy is a bug versus a feature

Healthy variance bands for a DTC Shopify store with working CAPI and GA4:

  • GA4 purchase count usually lands 5 to 15 percent below Shopify (cookie loss + consent + bot filter)
  • Meta's Shopify-attributable conversion count lands 10 to 30 percent below Shopify total (Meta only gets credit for Meta-attributed orders; other channels own the rest)
  • Meta's revenue total after a good CAPI rebuild lands within 5 to 10 percent of what GA4 reports for Meta as a source

If you see GA4 reporting 40 percent below Shopify, something is eating sessions (likely a broken GA4 implementation or a page-load tag not firing). If Meta is reporting above Shopify total, you have a dedup problem, not an attribution miracle.
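The bands above can be encoded as a quick triage check. The thresholds come from this post; the function itself is a hypothetical helper:

```python
def triage(platform: str, count: int, shopify_web_orders: int) -> str:
    """Classify a platform's purchase count against the healthy variance bands."""
    ratio = count / shopify_web_orders
    # GA4 healthy at 5-15% below Shopify; Meta healthy at 10-30% below.
    bands = {"ga4": (0.85, 0.95), "meta": (0.70, 0.90)}
    lo, hi = bands[platform]
    if ratio > 1.0:
        return "dedup problem"        # reporting above Shopify total
    if lo <= ratio <= hi:
        return "healthy"
    return "instrumentation bug"      # e.g. GA4 at 40% below Shopify

print(triage("ga4", 910, 1000))    # healthy
print(triage("ga4", 600, 1000))    # instrumentation bug
print(triage("meta", 1100, 1000))  # dedup problem
```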

A healthy stack shows some variance between platforms. Zero variance almost always means something is broken, not that everything is clean.

How I run GA4/Meta/Shopify reconciliation in practice

The reconciliation itself is a one-afternoon spreadsheet. I still do it manually the first time on every new engagement because the act of lining up the numbers reveals where the instrumentation is broken.

Anchor on Shopify. Export orders for the window you care about. Filter to web orders only (exclude POS, draft, wholesale). This is your denominator.
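The web-only filter can be a one-liner over the export. This sketch assumes the `source_name` field from the Shopify Admin API (values like `web`, `pos`, `shopify_draft_order`); confirm the exact values against your own export before trusting the filter:

```python
def web_orders(orders: list[dict]) -> list[dict]:
    """Keep storefront orders only; drop POS and draft-order channels."""
    # source_name values follow the Shopify Admin API convention (assumption).
    excluded = {"pos", "shopify_draft_order"}
    return [o for o in orders if o.get("source_name") not in excluded]

orders = [
    {"id": 1, "source_name": "web"},
    {"id": 2, "source_name": "pos"},
    {"id": 3, "source_name": "shopify_draft_order"},
]
print([o["id"] for o in web_orders(orders)])  # [1]
```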

Pull GA4 purchase events for the same window. Use the GA4 API (Google Analytics Data API) and query for purchase events with session-scoped source / medium. Group by source / medium and count.
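Rather than depend on the client library here, this sketch flattens a runReport JSON response into per-source counts. The response shape follows the Google Analytics Data API v1beta (`rows` with `dimensionValues`/`metricValues`); the sample dict is trimmed and illustrative:

```python
def flatten_ga4_report(report: dict) -> dict[str, int]:
    """Flatten a runReport response into {source/medium: purchase_count}."""
    out: dict[str, int] = {}
    for row in report.get("rows", []):
        source_medium = row["dimensionValues"][0]["value"]
        purchases = int(row["metricValues"][0]["value"])
        out[source_medium] = out.get(source_medium, 0) + purchases
    return out

# Trimmed response for dimension sessionSourceMedium, metric ecommercePurchases.
report = {
    "rows": [
        {"dimensionValues": [{"value": "google / cpc"}],
         "metricValues": [{"value": "412"}]},
        {"dimensionValues": [{"value": "facebook / paid"}],
         "metricValues": [{"value": "188"}]},
    ]
}
print(flatten_ga4_report(report))
```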

Pull Meta purchases for the same window via the Marketing API, filtered to your pixel. Include both 7-day click and 1-day view windows so you can see how much of the gap is window-related.
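Extracting per-window purchase counts from an insights row can be sketched like this. The `actions` array with an `action_type` of `purchase` and per-window keys (`7d_click`, `1d_view`) follows the Marketing API Insights shape when `action_attribution_windows` is requested, but treat the exact field names as assumptions and verify against a live response:

```python
def purchase_counts(insights_row: dict) -> dict[str, int]:
    """Pull purchase counts per attribution window from one insights row."""
    for action in insights_row.get("actions", []):
        if action.get("action_type") == "purchase":
            return {
                "7d_click": int(action.get("7d_click", 0)),
                "1d_view": int(action.get("1d_view", 0)),
            }
    return {"7d_click": 0, "1d_view": 0}

# Trimmed insights row (field names assumed from the Insights API).
row = {"actions": [
    {"action_type": "purchase", "value": "240", "7d_click": "210", "1d_view": "30"},
]}
print(purchase_counts(row))  # {'7d_click': 210, '1d_view': 30}
```

Seeing the click and view windows side by side is what lets you attribute part of the gap to window length rather than to lost signal.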

Join on date, not on order_id (most platforms will not expose order_id uniformly). Diff column by column. Any row with more than a 30 percent cross-platform gap gets flagged for investigation.
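The date join and the 30-percent flag look like this in miniature (a hypothetical helper over daily counts, standing in for the spreadsheet diff):

```python
def diff_by_date(shopify: dict[str, int], ga4: dict[str, int],
                 flag_pct: float = 30.0) -> list[dict]:
    """Join daily counts on date and flag rows with a large cross-platform gap."""
    rows = []
    for day in sorted(shopify):
        s, g = shopify[day], ga4.get(day, 0)
        gap_pct = abs(s - g) / s * 100 if s else 0.0
        rows.append({"date": day, "shopify": s, "ga4": g,
                     "gap_pct": round(gap_pct, 1),
                     "flag": gap_pct > flag_pct})
    return rows

shopify = {"2026-03-01": 100, "2026-03-02": 80}
ga4 = {"2026-03-01": 91, "2026-03-02": 40}
for r in diff_by_date(shopify, ga4):
    print(r)  # 03-01 sits at a healthy 9% gap; 03-02's 50% gap gets flagged
```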

The investigation itself usually points at one of the five variance sources above. Fix the root cause, re-pull, and the numbers converge into the healthy bands.

If your numbers agree suspiciously well, dedup is often the culprit. The debugging postmortem on CAPI payload mismatch covers the single most common failure mode that makes this reconciliation look clean when it is not.

If the variance feels seasonal or keeps widening, attribution windows are doing it. I laid out the current window behavior on iOS and Android privacy in the field notes on attribution windows after iOS and Android privacy updates.

For the full CAPI context this variance lives inside, start with the field guide to Meta CAPI for DTC operators. Most reconciliation problems are downstream of a CAPI configuration choice.

If the variance keeps costing you optimization decisions, a DTC Stack Audit runs the same reconciliation method against your live stores and tells you which of the five sources is eating the most signal.

FAQ

Which platform should I report to the board as my 'real' revenue?

Shopify. It is the only platform that reports completed transactions; the others report models of completed transactions. GA4 and Meta numbers belong on marketing dashboards, not on the P&L.

How much variance is too much?

For a DTC Shopify store with working CAPI, GA4 should land 5 to 15 percent below Shopify and Meta should land 10 to 30 percent below Shopify for Meta-attributed conversions. Outside those bands, you have an instrumentation problem worth diagnosing. Inside them, the variance is the real behavior of the stack.

Why does Meta sometimes report more conversions than Shopify?

Dedup failure. The browser pixel and the server CAPI event are firing for the same order with different event_ids, so Meta counts both. The fix is in the event_id strategy, not in Meta. Check Test Events for duplicate entries, then fix the hash.

Should I change my attribution windows to make the numbers match?

No. Change windows only when the window genuinely misrepresents the customer journey, not to improve visual alignment on a dashboard. Matching windows between platforms is a reasonable exercise for like-for-like comparison, but aligning windows to hide a broken implementation is a common way to ship worse attribution.

How often should I run this reconciliation?

Monthly at minimum, weekly if the brand is spending seven figures per year on Meta. A silent variance drift is the earliest signal that something shifted in the tracking stack, often before the symptoms show up on ROAS dashboards. A quarterly reconciliation is reactive; monthly catches problems before they cost a full budget cycle.

Sources and specifics

  • The reconciliation method above is drawn from a multi-source analytics engine I shipped in Q1 2026; see the analytics engine case study for production details.
  • Healthy variance bands are based on diff patterns across multiple DTC Shopify engagements, not on a public benchmark.
  • GA4 default attribution as of this writing is 30-day click and 1-day view using data-driven attribution; Meta's default is 7-day click and 1-day view.
  • The fifth variance source (currency and refund handling) is the one most often missed in a first-pass reconciliation and the one that looks most like a data bug.
  • Test Events confirmation is from Meta's public Events Manager tooling.


Let us talk

If something in here connected, feel free to reach out. No pitch deck, no intake form. Just a direct conversation.

Get in touch