Your ROAS went up 40% the week you turned on CAPI. Then three weeks later it quietly corrected itself back down. If you've seen that pattern, you weren't recovering lost conversions. You were double-counting existing ones, and Meta's optimizer spent three weeks training on bad data.
This is the part most guides skip when they sell you on server-side tracking.
The conventional wisdom says CAPI fixes everything
The pitch makes sense on the surface. iOS 14.5 killed browser cookie tracking for a significant share of your traffic. Meta's browser pixel can't see those users, so it can't attribute their purchases. Server-side CAPI runs on your infrastructure, outside the browser, so iOS restrictions don't apply. More signal recovered, better attribution, better optimization.
Every agency selling CAPI implementation leads with this framing. Most of the time they're not wrong about the problem. They are often wrong about what their implementation actually does. (This is also part of why I've written about what a productized audit actually delivers versus an agency retainer - the implementation details matter more than the engagement model.)
The gap isn't in the theory. It's in deduplication - the mechanism that tells Meta "this Purchase event from the browser and this Purchase event from your server are the same purchase, don't count it twice." When that mechanism isn't wired correctly, you're not adding a recovery layer. You're adding a duplication layer.
Why double-counting happens in practice
Meta uses a field called event_id as its deduplication key. When a browser pixel fires a Purchase event, it should emit an event_id. When your server-side CAPI fires the same event, it should emit the identical event_id. Meta sees both events, matches them on that key, and counts only one.
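Conceptually, the pairing looks like this. The `eventID` option on `fbq` and the `event_id` field in the CAPI payload are real parts of Meta's API; the payload values and the counting helper below are illustrative, not Meta's internals.

```typescript
// Browser side: the pixel passes the dedup key via fbq's fourth argument.
// fbq('track', 'Purchase', { value: 89.0, currency: 'USD' }, { eventID: 'purchase_1042_1718000000' });

// Server side: the CAPI event must carry the identical string.
interface CapiEvent {
  event_name: string;
  event_time: number;    // unix seconds
  event_id: string;      // the deduplication key
  action_source: string;
}

// Sketch of what dedup does: events sharing (event_name, event_id)
// collapse into one counted conversion.
function countedConversions(events: CapiEvent[]): number {
  const seen = new Set<string>();
  for (const e of events) seen.add(`${e.event_name}:${e.event_id}`);
  return seen.size;
}

const browserEvent: CapiEvent = {
  event_name: 'Purchase',
  event_time: 1718000000,
  event_id: 'purchase_1042_1718000000',
  action_source: 'website',
};
const serverEvent: CapiEvent = { ...browserEvent };

// Same event_id on both sides: one counted purchase, not two.
const counted = countedConversions([browserEvent, serverEvent]); // 1
```

If the server generates its own ID instead of reusing the browser's, the same two events count as two conversions.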
That's the design. In practice, three things go wrong.
First, Shopify's native CAPI integration doesn't share event_id between the browser pixel and the server layer by default. The web pixel fires with one ID generated in the browser. The server event fires with a different ID generated on the server. Meta sees two unique events and counts both.
Second, if you have a GTM web container firing the pixel and a GTM server container (usually on Stape) firing CAPI, the two containers need to share the same event_id through a mechanism like a first-party cookie or shared variable. A lot of implementations set this up with a hardcoded prefix plus a random string, regenerated independently on each side. The strings don't match. Two unique events, two counted conversions.
Third, if you have Klaviyo, Triple Whale, or Northbeam installed, there's a good chance each of them has their own Meta tag configured. Klaviyo alone can fire a Purchase event through its own Meta integration while your GTM stack fires another. You end up with three sources emitting separate Purchase signals for the same transaction.
Where the double-counting actually lives
Meta surfaces the deduplication rate in Events Manager under the Deduplication tab for each event. The healthy band is 85 to 95 percent. That range means roughly 5 to 15 percent of events are unique server-only events (real CAPI recovery, the thing you wanted), and the rest are properly deduplicated browser/server pairs.
Below 85 percent is confirmed double-counting. Above 95 percent often means your server events are being rejected entirely - Meta can't match them and discards them - which is the opposite failure mode.
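The two failure modes reduce to a simple threshold check. The bands are the ones above; the function name and labels are mine.

```typescript
type DedupDiagnosis = 'double_counting' | 'healthy' | 'likely_rejected';

// Interpret the Events Manager dedup rate (expressed 0-1) against the
// 85-95 percent healthy band described above.
function diagnoseDedupRate(rate: number): DedupDiagnosis {
  if (rate < 0.85) return 'double_counting'; // browser/server event_ids not matching
  if (rate > 0.95) return 'likely_rejected'; // server events discarded, no real recovery
  return 'healthy';                          // genuine server-only recovery plus matched pairs
}

const verdict = diagnoseDedupRate(0.61); // 'double_counting'
```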
I rebuilt the CAPI stack for a Shopify DTC brand in Q2 2024. When I ran the first diagnostic, their event_id dedup rate for Purchase events was sitting at 61 percent. Their match quality score was 4.2, low enough that Meta's optimizer was effectively running blind. They had been running this configuration for 11 months, celebrating the ROAS numbers without checking the dedup rate once.
After fixing the event_id wiring across both containers, adding external_id (hashed Shopify customer ID) on every authenticated event, and removing the duplicate Klaviyo meta tag, the dedup rate moved into the healthy band. Match quality went to 9.1 within 48 hours. The conversion counts dropped - which was the right outcome. Fewer phantom conversions meant Meta had accurate signal to optimize against.
The full account of that rebuild is in the tracking gap write-up if you want the step-by-step sequence.
What actually works: the three-step dedup fix
The fix has three parts. They're not complicated. The sequence matters.
Step 1: Wire a shared event_id between browser and server.
The pixel and the CAPI event need to emit the same string for the same user action. The reliable way is to generate the event_id in the browser, store it in a first-party cookie before the pixel fires, then read it server-side when building the CAPI payload. On Stape, this is available as a shared memory variable. In a custom GTM setup, it's a cookie you set via a Custom HTML tag before the conversion tag fires.
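A minimal sketch of that handoff. The `fbq` eventID option is Meta's documented dedup mechanism; the cookie name `_meta_event_id` and the helper are my examples, not a standard.

```typescript
// Browser side (e.g. a GTM Custom HTML tag that fires BEFORE the pixel's
// conversion tag): generate the ID once and persist it in a first-party
// cookie so the server container can read the same value.
const orderNumber = '1042'; // would come from the dataLayer in practice
const eventId = `purchase_${orderNumber}_${Math.floor(Date.now() / 1000)}`;
// document.cookie = `_meta_event_id=${eventId}; path=/; max-age=1800`;
// fbq('track', 'Purchase', { value: 89.0, currency: 'USD' }, { eventID: eventId });

// Server container side: recover the identical string from the incoming
// request's Cookie header when building the CAPI payload.
function readEventIdFromCookies(cookieHeader: string, name = '_meta_event_id'): string | undefined {
  for (const pair of cookieHeader.split(';')) {
    const [key, ...rest] = pair.trim().split('=');
    if (key === name) return rest.join('=');
  }
  return undefined;
}
```

The order matters: if the cookie is written after the pixel fires, the server reads a stale or missing ID and you're back to two unique events.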
If you're on Shopify's native integration, verify in Events Manager that Purchase events have a dedup rate in the 85-95 percent band. The native integration is supposed to handle this for standard checkout events. Check the actual rate before assuming it works.
Step 2: Pass external_id on every authenticated event.
external_id is a hashed identifier for the customer, typically the Shopify customer ID. When a logged-in user triggers ViewContent, AddToCart, or InitiateCheckout, that hashed ID goes with the event. It's the highest-leverage single fix for match quality. I've seen it move match quality by 2 to 3 points on its own.
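The hashing itself is one line. SHA-256 hex is what Meta expects for hashed customer information parameters; the trim/lowercase normalization shown is a common convention for those parameters - verify it against Meta's current external_id documentation before shipping.

```typescript
import { createHash } from 'node:crypto';

// Hash a Shopify customer ID into the external_id format: SHA-256 over a
// normalized (trimmed, lowercased) string, emitted as lowercase hex.
function hashExternalId(shopifyCustomerId: string | number): string {
  const normalized = String(shopifyCustomerId).trim().toLowerCase();
  return createHash('sha256').update(normalized).digest('hex');
}

const externalId = hashExternalId(7344120033); // 64-char hex digest
```

The normalization step is the part that actually moves match quality: if the browser hashes `"7344120033"` and the server hashes `"7344120033 "`, Meta sees two different users.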
This isn't a dedup fix directly, but it raises match quality enough that Meta's model becomes more confident in the events it does count, which partially compensates for the signal you may still be losing.
Step 3: Audit every Meta tag source in your stack.
Pull up your GTM web and server containers and list every tag that fires a Meta event. Then check Klaviyo's integrations, your attribution tool's settings, and any other marketing platform installed on the store. For each one, confirm whether it's firing a duplicate Purchase or not. If it is, disable the redundant one. One authoritative source per event is the target.
The event-source inventory check in the DTC Stack Audit covers exactly this - including what to look for in each tool's settings panel.
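The inventory reduces to a grouping exercise: list every (source, event) pair you find, then flag events with more than one source. The source names below are illustrative; note that a matched GTM web/server pair is expected - it's the third-party extras that are usually redundant.

```typescript
interface TagSource {
  source: string; // e.g. 'GTM web container', 'Klaviyo Meta integration'
  event: string;  // e.g. 'Purchase'
}

// Return every event that more than one tool is sending to Meta.
function findDuplicateEventSources(inventory: TagSource[]): Map<string, string[]> {
  const byEvent = new Map<string, string[]>();
  for (const { source, event } of inventory) {
    byEvent.set(event, [...(byEvent.get(event) ?? []), source]);
  }
  return new Map([...byEvent].filter(([, sources]) => sources.length > 1));
}

const inventory: TagSource[] = [
  { source: 'GTM web container', event: 'Purchase' },    // expected dedup pair...
  { source: 'GTM server container', event: 'Purchase' }, // ...when event_ids match
  { source: 'Klaviyo Meta integration', event: 'Purchase' }, // the redundant third source
  { source: 'GTM web container', event: 'ViewContent' },
];
const duplicated = findDuplicateEventSources(inventory); // Purchase -> 3 sources
```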
Objections answered
"Meta says the native Shopify integration handles dedup automatically."
It does, for standard Shopify checkout events when you're using the native integration and nothing else. The moment you add a custom event, a GTM overlay, or a third-party attribution tool, the native integration's dedup coverage ends at the boundary of what it controls. Your custom events are on their own.
"My ROAS went up and my orders match Shopify's numbers."
The ROAS number is what Meta reports, calculated against the events Meta received. If those events are double-counted, the denominator (your ad spend) stays fixed while the numerator (attributed revenue) inflates. Shopify order numbers don't help because Shopify doesn't know how many times you fired a Meta event for each order. The dedup rate in Events Manager is the right diagnostic, not ROAS.
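The arithmetic is small enough to write out. The figures below are illustrative, not from the case study.

```typescript
// ROAS as Meta reports it: attributed revenue / ad spend.
const adSpend = 10_000;
const trueAttributedRevenue = 30_000; // one Purchase event per real order
const trueRoas = trueAttributedRevenue / adSpend; // 3.0

// If 40% of purchases are counted twice, attributed revenue inflates by
// 1.4x while spend is unchanged.
const reportedRevenue = trueAttributedRevenue * 1.4;
const reportedRoas = reportedRevenue / adSpend; // 4.2 - a 40% "lift" from nothing
```

Shopify's order count stays at the true number the whole time, which is why reconciling against it never surfaces the problem.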
"Triple Whale / Northbeam gives me accurate cross-channel attribution, so I don't need to worry about this."
Attribution tools reconcile which channel gets credit for a conversion. They don't fix the event count going into Meta's Conversions API. Even if Northbeam correctly attributes a purchase to a Facebook ad, if your CAPI sent three Purchase events for that one order, Meta's training data is still corrupted. The optimizer learns from the raw events, not from the attribution reconciliation you see in a dashboard.
When Shopify native CAPI actually is enough
There's a version of your business where the native integration handles everything correctly: standard Shopify checkout only, a single Meta pixel source without any GTM overlay, and no third-party attribution tools or custom events in the stack.
Few DTC stores with any marketing complexity fit that description, but it exists. If that's your setup, your dedup rate in Events Manager will tell you whether native is doing the job. If it's in the 85-95 percent band across all events, leave it alone.
The audit work becomes necessary when your stack has grown through the typical DTC lifecycle - pixel added, then GTM, then Klaviyo, then an attribution tool, then CAPI on top - and nobody mapped how these systems interact at the event level. That's the pattern I see most often when I run a full Operator's Stack diagnostic. The tracking layer is almost always the last thing someone audits, and by then the stack has four overlapping sources sending data to the same place.
Frequently asked questions
How do I check if my Meta CAPI is double-counting right now?
Open Events Manager, select the Purchase event, and look at the Deduplication tab. If the dedup rate is below 85 percent, you have confirmed double-counting. If you don't see a Deduplication tab, you likely don't have server-side CAPI enabled at all - only the browser pixel is running.
What is `event_id` and where does it come from?
event_id is a string you generate. It just needs to be unique per user action and consistent between the browser and server events for that same action. A common pattern is purchase_{orderNumber}_{timestamp}, generated once in the browser and passed to the server. The exact format doesn't matter as long as both events send the same string.
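One plausible generator for the pattern described above - the exact format is your convention, not a Meta requirement; Meta only matches on string equality.

```typescript
// Unique per purchase (order number) and stable enough to reuse on the
// server side, as long as it is generated once and passed along.
function purchaseEventId(orderNumber: string | number): string {
  return `purchase_${orderNumber}_${Math.floor(Date.now() / 1000)}`;
}

const id = purchaseEventId(1042); // e.g. 'purchase_1042_1718000000'
// Both the pixel call and the CAPI payload must carry this exact string.
```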
Does deduplication work for non-purchase events?
Yes, but it matters most for Purchase because that's the event Meta's optimizer weights most heavily. ViewContent and AddToCart double-counting is still a problem for training signal quality, but the business impact is largest on Purchase. Start there.
How long does it take for dedup to fix after I correct the configuration?
Meta's deduplication processes retroactively for events in a recent rolling window, but new data starts resolving within 24-48 hours of the configuration change. Conversion history from before the fix remains as-is. You'll see the dedup rate normalize in Events Manager within a couple days.
Will fixing dedup lower my reported conversion numbers?
Yes, if you were double-counting. The reported conversion count will drop, sometimes significantly. That's the correct outcome. The events that disappear were phantom events. Your actual business performance didn't change; your measurement of it became more accurate.
Can I fix this without touching GTM?
It depends on your stack. If you're on Shopify's native CAPI integration only, the fix may be as simple as verifying the native integration's settings. If you have a GTM web container and a server container, the fix requires a GTM change to wire event_id correctly. There's no way around modifying wherever the events are generated.
Sources and specifics
- The 85-95 percent healthy dedup rate band is from Meta's Events Manager Deduplication documentation for the Conversions API (Meta Business Help Center, 2024).
- The Q2 2024 rebuild referenced in this article tracked a Shopify store's match quality moving from 4.2 to 9.1 and dedup rate from 61 percent to the healthy band within 48 hours of configuration changes. Specific client is anonymized.
- Shopify's native CAPI integration handles dedup for standard checkout events; custom events require separate event_id wiring as of Shopify Checkout Extensibility (2024).
- Klaviyo, Triple Whale, and Northbeam can each send independent Meta events if their Meta integrations are enabled alongside GTM CAPI implementations.
- The 48-hour correction window for Meta's deduplication processing is from observed Events Manager behavior across multiple store configurations, not from Meta's published documentation.
