GA4 and GTM are separate products, but for most implementations they're not independent systems. GTM is the delivery mechanism for the data GA4 receives. When the container has problems — duplicate tags, misconfigured triggers, zombie conversion tags — those problems don't stay in GTM. They show up as inflated event counts, phantom conversions, and attribution errors in GA4 reports that are very difficult to diagnose if you don't know where to look.
The frustrating part is that GA4 has no way to signal that the data it received was problematic. It processes what it gets. If GTM sends the same purchase event twice, GA4 records two purchases. If GTM fires a conversion tag on every page instead of just the thank-you page, GA4 reports a conversion on every page. GA4 isn't broken — it's doing exactly what it's supposed to do with bad inputs.
Duplicate events — the most common GTM data quality issue
The most common duplication pattern: a GA4 Configuration tag firing a page_view event alongside a separate GA4 Event tag also configured to fire page_view. Or a purchase event set up twice — once via Enhanced Ecommerce on the GA4 Config tag and once as a standalone GA4 Event tag on the order confirmation page. Both send the same hit to GA4.
GA4 can't deduplicate these because from its perspective they're two separate events that happened to arrive at the same time with similar parameters. Your session counts, event counts, and any conversion based on those events will be inflated by as much as 2x.
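Since GA4 won't deduplicate for you, the guard has to live upstream. The sketch below is illustrative only — `pushPurchaseOnce` and `seenTransactions` are hypothetical names, not GTM or GA4 APIs — but it shows the shape of a client-side guard that drops a repeat purchase push for a `transaction_id` it has already seen:

```javascript
// Hypothetical client-side guard: drop repeat purchase pushes for the
// same transaction_id before they ever reach the dataLayer. This is an
// illustration of the dedup GA4 cannot do for you, not a GTM feature.
const seenTransactions = new Set();

function pushPurchaseOnce(dataLayer, event) {
  if (event.event === 'purchase') {
    if (seenTransactions.has(event.transaction_id)) {
      return false; // duplicate: swallow it instead of double-counting
    }
    seenTransactions.add(event.transaction_id);
  }
  dataLayer.push(event); // non-purchase events pass through untouched
  return true;
}
```

The same idea can be implemented inside GTM with a Custom JavaScript variable used as a blocking condition; the point is that some component has to remember what was already sent.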
Why it happens: Multiple people implementing tracking over time, each unaware of what the others added. Agencies layering tracking on top of an existing implementation. A migration from UA to GA4 in which the old event structure was rebuilt without removing the original.
Phantom conversions — zombie tags still counting
This one is specifically damaging because conversion data drives decisions. A campaign ran two years ago that required a custom GA4 conversion event. The campaign ended. The conversion event tag was never removed. It's still firing on whatever trigger was set at the time — possibly something broad like all form submissions or all button clicks.
In GA4's conversion report, this shows up as a conversion event with a name nobody recognizes. If GA4 is connected to Google Ads for Smart Bidding, phantom conversions are actively distorting your bidding strategy based on events that no longer mean anything. And if someone has unmarked the event as a conversion in GA4 but the tag is still firing, the events are still in the raw data — they just no longer appear in the conversion column.
The insidious version: the zombie conversion tag fires on a trigger that's broader than intended. A tag meant to fire only on a specific thank-you page is instead firing on all pages that contain the word "thank" in the URL. Every user who lands on any of those pages is recorded as a conversion.
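The over-match is easy to reproduce on paper. In this sketch, `matchesTrigger` is a hypothetical stand-in for GTM's trigger condition evaluation (not a GTM API), comparing a "contains" condition against an "equals" condition:

```javascript
// Hypothetical model of two GTM trigger conditions. A "contains"
// condition on the word "thank" matches far more than the thank-you page.
function matchesTrigger(pagePath, condition) {
  if (condition.type === 'contains') return pagePath.includes(condition.value);
  if (condition.type === 'equals') return pagePath === condition.value;
  return false;
}

const broad = { type: 'contains', value: 'thank' };
const exact = { type: 'equals', value: '/checkout/thank-you' };

matchesTrigger('/blog/thank-you-note-etiquette', broad); // true: a phantom conversion
matchesTrigger('/checkout/thank-you', broad);            // true: the intended page
matchesTrigger('/blog/thank-you-note-etiquette', exact); // false: exact match holds
```

The paths here are invented, but the pattern is real: the tighter the match type, the less room a trigger has to drift into pages it was never meant for.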
Staging and developer traffic — polluting your production data
A GTM container published to a staging environment that uses the same GA4 Measurement ID as production is sending developer and QA sessions into your GA4 data. This inflates session counts, reduces conversion rates (developers browsing the site rarely convert), and can introduce test events and purchase events from QA testing into your revenue data.
The problem is that GA4's internal traffic filter only works if it's defined with the right IP ranges and then switched from its default Testing state to Active — and in many implementations it's either not set up, left in Testing, set up with the wrong ranges, or doesn't cover all the environments that are generating traffic.
The GTM fix: Add a hostname condition to every trigger in the container — only fire on the production hostname. This is more reliable than IP-based filters because it works regardless of where the developer is connecting from.
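The check itself is trivial. In GTM it would be a trigger condition such as "Page Hostname equals www.example.com"; in the sketch below, `isProductionHost` and the hostnames are illustrative stand-ins:

```javascript
// Illustrative hostname gate: tags fire only when the page is served
// from the production host, so staging and local sessions never reach GA4.
function isProductionHost(hostname, productionHost) {
  return hostname === productionHost;
}

isProductionHost('www.example.com', 'www.example.com');     // true: fire tags
isProductionHost('staging.example.com', 'www.example.com'); // false: suppress
isProductionHost('localhost', 'www.example.com');           // false: suppress
```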
Trigger scope errors — events firing in the wrong place
A scroll depth event meant to fire on blog posts only, configured with an All Pages trigger. A form submission event meant to fire on contact form completions, configured on a CSS selector that also matches newsletter signups and checkout forms. An outbound click event firing on internal links because the URL match condition wasn't specific enough.
In GA4, these show up as inflated event counts, funnel steps that don't make sense, and engagement metrics that overstate actual user intent. The data isn't wrong in the sense that those events fired — they did fire. But they fired in the wrong context, so the signal they're supposed to carry is diluted or misleading.
This is often invisible without deliberate testing. The event name looks right. The count looks plausible. You only discover the problem when you look at which pages the event is firing on and find it on pages where it has no business being.
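That page-level check can be scripted against an export of event and page dimensions from GA4. The helper below is hypothetical (`unexpectedPages` is not part of any GA4 API), and the `{ event, page }` row shape is an assumption — adapt it to however you export your data:

```javascript
// Hypothetical audit helper: list the pages where an event fired outside
// the path prefix it was built for. The input row shape is an assumption.
function unexpectedPages(rows, eventName, allowedPrefix) {
  return rows
    .filter(r => r.event === eventName && !r.page.startsWith(allowedPrefix))
    .map(r => r.page);
}

const rows = [
  { event: 'scroll_75', page: '/blog/post-1' }, // expected
  { event: 'scroll_75', page: '/pricing' },     // misfire: not a blog post
  { event: 'form_submit', page: '/contact' },
];

unexpectedPages(rows, 'scroll_75', '/blog/'); // ['/pricing']
```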
Consent gaps — missing data that should be there
This one works in reverse — instead of too much data, you get too little, or data that's inconsistent across your user base. If your GA4 Configuration tag is correctly gated on analytics consent but some of your GA4 Event tags are not, you'll see a mismatch: page_view events for users who declined consent, but no custom events for the same users.
The larger issue is with Consent Mode v2 and modeled conversions. GA4 uses consent signals to model behavior for users who decline tracking. If your consent mode implementation is incomplete — tags not correctly mapped to consent types — GA4's modeling doesn't work correctly, and your reported conversion numbers for EEA traffic will be understated.
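For reference, the Consent Mode v2 default call looks like the sketch below, pushed before any tag fires. The four consent types are Google's documented signals, with `ad_user_data` and `ad_personalization` being the two v2 added; the denied defaults assume an EEA-style opt-in banner, and the update flow depends on your CMP:

```javascript
// Consent Mode v2 defaults, set before any tag fires. The four keys are
// Google's documented consent types; "denied" defaults here assume an
// opt-in (EEA-style) consent banner.
var dataLayer = dataLayer || [];
function gtag() { dataLayer.push(arguments); }

gtag('consent', 'default', {
  ad_storage: 'denied',
  analytics_storage: 'denied',
  ad_user_data: 'denied',       // added in Consent Mode v2
  ad_personalization: 'denied', // added in Consent Mode v2
});

// When the CMP reports acceptance, grant the relevant types, e.g.:
// gtag('consent', 'update', { analytics_storage: 'granted' });
```

If any GA4 tag fires outside this gating, it sends full hits for users who declined, and the consent mismatch described above is the result.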
Starting the diagnosis
If your GA4 data has problems and you suspect GTM is the source, the most efficient starting point is GTM Preview mode combined with GA4 DebugView. Load your site in Preview, trigger the key user journeys, and watch what fires in GTM alongside what lands in GA4 DebugView. The combination shows you exactly what's being sent and when.
Specifically look for:
- More than one GA4 tag firing on the same user action
- GA4 tags firing on pages or events where you didn't expect them
- Event names in DebugView that you don't recognize or can't account for
- Events firing in DebugView that should be gated by consent but aren't
What you find in Preview mode is diagnostic. Fixing it means going back to the container — correcting triggers, removing duplicates, adding consent configuration, and running a full container audit to surface everything systematically rather than issue by issue.
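One piece of that audit can be automated against a container export (Admin > Export Container in GTM). The sketch below assumes the export format in which GA4 event tags carry the type `"gaawe"` and an `"eventName"` parameter — check those keys against your own export before relying on it:

```javascript
// Hedged sketch: scan a GTM container export for GA4 event tags that
// share an event name, the classic source of duplicates. The "gaawe"
// type and "eventName" parameter key are assumptions about the export
// format; verify them against your own container JSON.
function duplicateGa4Events(containerExport) {
  const byName = {};
  const tags = (containerExport.containerVersion || {}).tag || [];
  for (const tag of tags) {
    if (tag.type !== 'gaawe') continue; // GA4 event tag
    const param = (tag.parameter || []).find(p => p.key === 'eventName');
    if (!param) continue;
    (byName[param.value] = byName[param.value] || []).push(tag.name);
  }
  // Keep only event names claimed by more than one tag.
  return Object.fromEntries(
    Object.entries(byName).filter(([, names]) => names.length > 1)
  );
}
```

Run against an export, anything this returns is a candidate for the duplicate-event problem described at the top of this piece.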
