When Universal Analytics was switched off on July 1, 2023, millions of Google Analytics properties were forced into GA4 whether they were ready or not. Many teams had been running GA4 in parallel for months. Some had completed a careful migration. Most had done the minimum to keep data flowing and planned to sort out the details later.
Two-plus years on, a large proportion of those properties still have configuration problems inherited directly from the migration. Data retention set to 2 months because nobody changed the default. Goals that didn't survive the import. Attribution models that shifted without anyone noticing. Conversion events that were set up once, assumed to be working, and never checked again.
This post is for everyone who migrated and hasn't fully audited what broke. It covers the most significant differences between UA and GA4, which ones cause active problems in your data today, and — where possible — how to fix them.
First: GA4 and UA are fundamentally different products
Most migration problems trace back to a single misunderstanding: GA4 is not a new version of Universal Analytics. It is a different product with a different data model, different terminology, different default settings, and a different philosophy about how user behaviour should be measured. Treating it as an upgrade — rather than a replacement — is what causes most of the configuration mistakes that persist in migrated properties.
| | Universal Analytics | GA4 |
|---|---|---|
| Data model | Session-based (hits within sessions) | Event-based (every interaction is an event) |
| Primary unit | Session | Event |
| Conversions | Goals (up to 20 per view) | Conversion events (up to 30 per property) |
| Bounce rate | Single-page sessions with no interaction | Replaced by Engagement Rate (sessions ≥10s, 2+ pages, or conversion) |
| Audience definition | View-level | Property-level |
| Data retention | Up to 50 months, or set to never expire (26-month default) | 2-month default, 14-month maximum (longer only via BigQuery export or GA4 360) |
| Attribution model default | Last non-direct click | Data-driven (or last click if insufficient data) |
| Views | Yes — multiple views per property | No views — data filters only |
| Sampling | Common in standard reports at high traffic | Less sampling in standard reports; Explorations still sample |
| Historical data | Available in UA property until sunset | Not transferred — GA4 starts fresh |
The core implication of this table is that a property migrated from UA to GA4 by copying configuration — importing goals as conversion events, recreating audiences, replicating report structures — will have a GA4 property that looks like UA but behaves like GA4. That mismatch is the source of most ongoing confusion.
What actually broke
The following are the most common GA4 migration failures — issues that were introduced during the transition and persist as active data quality problems today. Broadly, each falls into one of three categories: fixable (the damage is recoverable), partially fixable (you can stop further damage but can't undo what's already lost), and data lost (the window for correction has passed).
In Universal Analytics, standard properties retained data for 26 months by default. GA4's default is 2 months — and the migration did not change it automatically. Until someone manually changed this setting, every migrated property was permanently deleting user-level and event-level data on a rolling 60-day cycle.
For many properties, this has been happening since 2022 or earlier. Any GA4 property that launched without immediately setting retention to 14 months has a gap in its event-level history. The aggregate report data (sessions, pageviews, conversions) is still available in standard reports, but Explorations, cohort analysis, path reports, and any segment-based analysis drawing on event-level data cannot reach beyond whatever retention window was in effect when that data was collected.
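The effect of the retention setting on what Explorations can reach is easy to reason about as a date calculation. The sketch below illustrates it; the dates and values are hypothetical examples, not real property data.

```javascript
// Sketch: is event-level data from a given collection date still reachable
// in Explorations, given the retention setting in force at collection time?
// Dates and retention values below are hypothetical examples.
function eventDataReachable(collectedOn, retentionMonths, today) {
  const cutoff = new Date(today);
  cutoff.setMonth(cutoff.getMonth() - retentionMonths);
  return new Date(collectedOn) >= cutoff;
}

// A property left on the 2-month default:
eventDataReachable("2025-01-15", 2, "2025-06-01");  // false — already deleted
// The same data under the 14-month maximum:
eventDataReachable("2025-01-15", 14, "2025-06-01"); // true — still reachable
```

The point of the sketch: the reachable window is fixed at collection time, so raising the setting today protects future data but recovers nothing.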
UA historical data was similarly not transferred. Google does not provide a mechanism to import UA data into GA4. If you had years of UA data you were hoping to reference inside GA4's interface, that is not possible. The UA data existed in a separate property that stopped receiving new data in July 2023.
UA Goals and GA4 Conversion Events are fundamentally different things. UA Goals were session-level metrics — a session either contained the goal or it didn't. GA4 conversion events are event-level — an event fires and is marked as a conversion. The migration wizard offered to import UA goals as GA4 conversion events, but the conversion logic is different enough that many imported goals don't fire accurately, and some don't fire at all.
The most common failure: destination goals (UA's "user visited this URL") imported as page_view events with a page path filter. This works in some setups and fails silently in others, particularly on sites with dynamic URL parameters, AJAX-loaded confirmation pages, or single-page app routing. The conversion event appears in the list, the filter looks correct, but the event never fires for the target page.
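One way to see the failure mode: a path-equality filter is an exact string comparison, and modern routing breaks exact comparisons. The sketch below uses hypothetical URLs; GA4's page_path dimension excludes the query string, but fragments and trailing slashes still diverge from the filter value.

```javascript
// Sketch of why an exact page-path filter can fail silently.
// URLs below are hypothetical examples.
function exactPathMatch(url, targetPath) {
  return new URL(url).pathname === targetPath;
}

// A plain confirmation page matches:
exactPathMatch("https://shop.example/thank-you", "/thank-you");           // true
// Query parameters are not part of pathname, so they don't break it:
exactPathMatch("https://shop.example/thank-you?order=123", "/thank-you"); // true
// A trailing slash does break it:
exactPathMatch("https://shop.example/thank-you/", "/thank-you");          // false
// SPA hash routing breaks it completely — the route lives in the
// fragment, so pathname is "/" and the conversion never fires:
exactPathMatch("https://shop.example/#/thank-you", "/thank-you");         // false
```

The safer pattern is to fire a dedicated event from the confirmation logic itself rather than inferring the conversion from the URL.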
A secondary problem: UA allowed 20 goals per view and teams often had a mix of important and trivial goals. The migration imported all of them as conversions, creating properties with 15–20 conversion events where only 3–4 represent meaningful business outcomes. This dilutes the conversion signal that Smart Bidding uses and makes conversion reports difficult to interpret.
Universal Analytics used last non-direct click attribution by default. GA4 uses data-driven attribution for properties with sufficient conversion volume, and last click (including direct) for those that don't. This is not a minor difference — data-driven attribution distributes credit across multiple touchpoints based on a machine learning model, while last non-direct click assigns 100% of credit to the final non-direct touchpoint before conversion.
The practical consequence: if you migrated to GA4 and compared channel performance against your UA baseline, some channels will appear to have significantly changed performance not because their actual performance changed, but because the rules for assigning them credit changed. Channels that were often the first touch in a longer journey — display, social, content — typically gain credit in data-driven models. Channels that were the last non-direct touch before conversion — branded search, email — typically lose some credit.
Very few teams noticed this change. Even fewer documented which attribution model was in effect when they were making budget decisions. The result is channel performance benchmarks that are not comparable across the UA-to-GA4 transition boundary.
UA's bounce rate measured sessions where the user viewed only one page and left without triggering any interaction. GA4 replaced it with Engagement Rate — the percentage of sessions that were "engaged," defined as sessions lasting 10 or more seconds, containing two or more pageviews, or containing a conversion event. Bounce rate is now the inverse: 100% minus Engagement Rate.
These metrics measure different things. A user who lands on a long-form article, reads it for eight minutes, and leaves without clicking anything is a bounce under UA's definition — one page, no interaction — but an engaged session under GA4's, because the session lasted well past the 10-second threshold. The same visit flips from "failure" to "success" depending on which product is measuring it.
The real problem is the comparison. Teams that tracked UA bounce rate as a KPI and imported that benchmark into GA4 are comparing incompatible metrics. A GA4 "bounce rate" of 35% and a UA bounce rate of 35% are not the same thing and cannot be directly compared.
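The two definitions can be written side by side. This is a sketch using GA4's documented engagement thresholds (10+ seconds, 2+ page views, or a conversion); the session object shape is my own illustration, not a GA4 API.

```javascript
// Sketch: GA4's engaged-session test vs UA's bounce test.
// The session object shape is hypothetical, for illustration only.
function ga4Engaged(session) {
  return session.durationSeconds >= 10
      || session.pageViews >= 2
      || session.hasConversion;
}

function uaBounce(session) {
  return session.pageViews === 1 && !session.hasInteraction;
}

// An eight-minute, single-page read of an article:
const longRead = { durationSeconds: 480, pageViews: 1,
                   hasConversion: false, hasInteraction: false };

uaBounce(longRead);    // true  — UA counts this as a bounce
ga4Engaged(longRead);  // true  — GA4 counts it as engaged
```

A single session can satisfy both tests at once, which is exactly why a UA bounce-rate benchmark cannot be carried into GA4.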
Universal Analytics supported multiple views per property. Most well-configured UA setups had at least three: a raw, unfiltered view kept as an untouched backup; a working view with internal-traffic and bot filters applied; and sometimes additional views for specific subdomains or regions. GA4 eliminated views entirely. There is one property with its data streams and data filters — which work differently and offer significantly less functionality.
Properties that relied on UA views for filtering internal traffic, separating staging from production, or creating region-specific reporting segments had to rebuild all of that logic in GA4 using data filters and Explorations. Many didn't. The result is production properties receiving internal traffic, staging traffic, and all the contamination that UA's filtered view was quietly removing.
Additionally: UA views had their own historical data. If you were running analysis in a filtered UA view, that filtered data is not accessible in GA4. The raw UA property data remained in the UA property only until Google began deleting UA properties in July 2024, and filtered view data was never separately exportable.
GA4 and UA count sessions differently. In UA, a session reset at midnight and after 30 minutes of inactivity. In GA4, a session resets only after 30 minutes of inactivity — it does not reset at midnight. A user who arrives at 11:45pm and is still active at 12:15am generates two sessions in UA but one in GA4. Traffic-source changes also counted differently: UA started a new session whenever the campaign source changed mid-session; GA4 does not.
The result is that GA4 typically reports fewer sessions than UA for the same real-world traffic. The difference varies by site — for properties with a lot of overnight activity or cross-midnight sessions, the gap can be meaningful. For typical business-hours-skewed properties, it's usually 5–15% fewer sessions in GA4.
This matters because any team that set session-based targets in UA — traffic goals, session benchmarks, acquisition targets — and then carried those targets into GA4 is measuring against an incompatible baseline. The target was calibrated against UA's counting methodology. GA4 will never hit it, not because performance has declined, but because the ruler changed.
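The counting difference can be demonstrated directly. The sketch below applies both rulesets to the same hit timestamps; the timestamps are hypothetical, and this models only the midnight-reset difference, not the campaign-change rule.

```javascript
// Sketch: count sessions over identical hits under UA rules (30-minute
// inactivity timeout AND a reset at midnight) vs GA4 rules (timeout only).
// Timestamps are hypothetical examples.
const TIMEOUT_MS = 30 * 60 * 1000;

function countSessions(hitTimestamps, resetAtMidnight) {
  let sessions = 0;
  let last = null;
  for (const t of hitTimestamps.map(ts => new Date(ts))) {
    const crossedMidnight = last && t.toDateString() !== last.toDateString();
    if (!last || t - last > TIMEOUT_MS || (resetAtMidnight && crossedMidnight)) {
      sessions++; // a new session starts at this hit
    }
    last = t;
  }
  return sessions;
}

// A visit straddling midnight: hits at 11:45pm, 11:58pm, and 12:15am.
const hits = ["2025-03-01T23:45:00", "2025-03-01T23:58:00", "2025-03-02T00:15:00"];

countSessions(hits, true);   // 2 — UA starts a new session at midnight
countSessions(hits, false);  // 1 — GA4 keeps the session open
```

Same traffic, different totals — which is why GA4 session counts sit below UA's for the same period.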
UA's Enhanced Ecommerce and GA4's ecommerce tracking use different event schemas. The migration did not transfer ecommerce implementation — even if you had perfectly configured Enhanced Ecommerce in UA, GA4 started from scratch. Teams that set up GA4 in a hurry often implemented only the purchase event and skipped the full funnel. The result: GA4 shows purchase totals but cannot show the checkout funnel, product list performance, or cart behaviour that UA's Enhanced Ecommerce provided.
GA4's event schema is also stricter about required parameters. The purchase event requires transaction_id, value, currency, and an items array with at least one item. Missing any required parameter silently corrupts product-level reporting and revenue attribution. Many rush migrations have purchase events firing with incomplete parameter sets — technically tracking the conversion, but missing the data that makes the conversion useful for analysis.
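A minimal payload check makes the failure mode concrete. This sketch validates only the required parameters named above; the payload shape follows gtag's ecommerce schema, and the example values are hypothetical.

```javascript
// Sketch: check a GA4 purchase payload for the required parameters
// (transaction_id, value, currency, and a non-empty items array).
// Example values are hypothetical.
function missingPurchaseParams(params) {
  const missing = [];
  for (const key of ["transaction_id", "value", "currency"]) {
    if (params[key] === undefined) missing.push(key);
  }
  if (!Array.isArray(params.items) || params.items.length === 0) {
    missing.push("items");
  }
  return missing; // an empty array means the payload is complete
}

// A rushed migration often sends only the transaction id and total:
missingPurchaseParams({ transaction_id: "T-1001", value: 49.99 });
// → ["currency", "items"] — the conversion is recorded, but
//   product-level and revenue reporting are silently incomplete
```

Running a check like this against what your tag actually sends (for example, via GA4's DebugView) is the fastest way to catch a hollow purchase event.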
Consent Mode v2 became a requirement for EEA traffic in March 2024 — after most teams had already migrated to GA4. Many migration projects didn't include it because it wasn't required at the time, and a significant number of properties still don't have it. Without Consent Mode v2, GA4 cannot collect data from users who decline cookie consent, and cannot model their behaviour for conversion reporting. For sites with meaningful EU traffic, this means a structural gap in conversion data that has been compounding since the requirement came into force.
This is particularly damaging for Google Ads campaigns targeting EU audiences. Smart Bidding cannot optimise against conversions it can't see or model. Campaign performance in EU markets is systematically underreported, which leads to budget decisions based on incomplete data.
What you can't fix — and need to accept
Some migration consequences are permanent. It's worth being clear-eyed about these rather than spending time trying to solve problems that don't have solutions.
| What's lost | Why it can't be recovered | What to do instead |
|---|---|---|
| UA historical data in GA4 | Google never provided a mechanism to import UA data into GA4's interface or data model, and UA properties were deleted from July 2024 | If you exported UA data to BigQuery or CSV before deletion, treat pre-July 2023 as a separate data source; if not, that history is gone |
| Event-level GA4 data lost to 2-month retention | Deleted data cannot be restored. No backup exists once the retention window has passed | Change retention to 14 months now. Set up BigQuery export for permanent retention going forward |
| Direct UA-to-GA4 metric comparisons | Sessions, bounce rate, users, and conversions are all counted differently and are not methodologically comparable | Establish GA4-only baselines. Use a clear date cutoff in all historical comparisons. Document the methodology change for stakeholders |
| UA views and filtered view data | GA4 doesn't support views. Filtered UA view data wasn't exported separately and isn't accessible | Recreate filtering logic using GA4 data filters and internal traffic rules |
Check your property for migration problems now
The fastest way to determine which of these issues are active in your property is to run a systematic check across all of them. Manually, the process looks like this:
- Check data retention setting — Admin → Data Settings → Data Retention
- Review every conversion event for recent activity — Admin → Events
- Check current attribution model — Admin → Attribution Settings
- Run hostname report for contamination — Reports → Tech → Hostname
- Check acquisition report for payment processor referrals and UTM fragmentation
- Verify Consent Mode v2 status — Admin → Data Display → Consent Settings
- Cross-reference GA4 revenue against platform order data
That's a 45-minute to 2-hour process if you know exactly where to look. GA4 Health Check automates the entire audit — all 47 checks, including every item above — and returns a scored PDF report in 60 seconds. The report identifies every active issue in your property, classifies it by severity, and provides specific remediation steps for each one.
If you migrated to GA4 and haven't run a systematic audit since, there's a high probability you have at least three of the issues described in this post actively affecting your data right now.
