
Late April Google Search Volatility and the Search Console / GA4 Data Lag

Ikesan

The late-April turbulence in Google Search is not an officially announced core update.
Search Engine Roundtable reported rising ranking volatility on April 27-28, but even in their own coverage they noted that “Google usually doesn’t confirm these.”

On the Google Search Status Dashboard, the most recent major ranking event is the March 2026 core update.
It started March 27, 2026 at 02:00 PT and finished April 8, 2026 at 06:00 PT.
The same dashboard also lists the March 2026 spam update, a short rollout from March 24 at 12:00 to March 25 at 07:30.

So reading the April 27-28 movement as “the core update that ended April 8 is still going” is sloppy.
Officially, it’s done.
If search results are still shifting afterward, it makes more sense to separate unconfirmed ranking changes, normal re-evaluation, site-side changes, and measurement-side lag.

The fact that Search Console hasn’t caught up to the latest date is a separate layer.
GSC’s Performance report typically takes 2 to 3 days before collected data becomes visible.
On top of that, the most recent data is treated as preliminary and may change later.
Daily data in Search Console (except the 24-hour view) is bucketed by Pacific Time, so it can look a day off from JST.
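When scripting around this, it helps to judge "recent" in Pacific Time and flag the last few days as preliminary rather than trusting them. A minimal JavaScript sketch — the 3-day window is my own conservative choice based on the 2-3 day lag above, not an official figure:

```javascript
// Format an instant as YYYY-MM-DD in a given IANA timezone.
function dateInZone(instant, timeZone) {
  return new Intl.DateTimeFormat('en-CA', {
    timeZone, year: 'numeric', month: '2-digit', day: '2-digit',
  }).format(instant);
}

// Flag Search Console daily rows that fall inside the preliminary window.
// Buckets are Pacific Time, so "recent" must be computed in
// America/Los_Angeles, not the viewer's local zone (e.g. JST, which can
// look a day ahead).
function flagPreliminary(rows, now = new Date(), windowDays = 3) {
  const cutoffPT = dateInZone(
    new Date(now.getTime() - windowDays * 86400000),
    'America/Los_Angeles');
  return rows.map(r => ({ ...r, preliminary: r.date > cutoffPT }));
}
```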

GA4 has the same vibe: standard reports are not real-time.
On this site I previously wrote about setting up GA4 with Partytown on Astro, but the reporting side has its own wait times.
Google’s help docs say intraday data for standard properties takes 2-6 hours, daily takes 12+ hours, and full processing can take 24-48 hours.
Numbers in reports shift during that window.
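Encoding those windows as a rough rule of thumb makes it easy to annotate a dashboard. The labels below are my own shorthand; Google's docs describe processing windows, not a state machine:

```javascript
// Rough freshness labels for GA4 standard-property report data, using the
// windows cited above (2-6 h intraday, 12+ h daily, 24-48 h full processing).
function ga4Freshness(hoursSinceCollection) {
  if (hoursSinceCollection < 6) return 'may-not-be-visible-yet';
  if (hoursSinceCollection < 48) return 'preliminary';
  return 'likely-final';
}
```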

When both Search Console and GA4 are missing the latest dates, it looks like search itself is broken.
But GSC measures clicks and impressions on search results, while GA4 measures events on your site.
Even when you’re looking at the same date, timezone differences, JavaScript execution, consent management, bot filtering, anonymized queries, and canonical URL aggregation mean the numbers won’t match.

If you’re tracking the late-April volatility, start by not mixing the core update period through April 8 with the unconfirmed movement on April 27-28 on the same graph.
Treat the last 2 days in Search Console as soft.
Don’t treat GA4’s real-time or last-few-hours numbers as finalized daily values.
That alone makes it easier to tell whether “search dropped” or “the reports haven’t caught up yet.”

Search Console API Pitfalls

The Search Console API has the same constraints as the UI when pulling daily data.
Google’s API docs explain that due to internal limits in Search Console, not all rows are guaranteed to be returned; it returns the top rows.
When you query by date dimension, dates with no data are simply omitted from the results.

In other words, a missing row for the latest date does not prove zero search traffic for that day.
It’s far more common that the data simply hasn’t appeared yet, it’s still in the preliminary window, or it got dropped by the filter/dimension combination.
Start by querying date alone with no filters to see which dates Search Console is actually returning.
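A sketch of that merge step, assuming rows shaped like the searchanalytics.query response with `dimensions: ['date']` (each row carries its date in `keys[0]`) — dates the API omitted are labeled as missing instead of silently becoming zero:

```javascript
// Merge Search Console searchanalytics.query rows against the full
// requested date range. Dates the API omitted are marked 'missing'
// rather than being treated as zero traffic.
function mergeDateRows(startDate, endDate, rows) {
  const byDate = new Map(rows.map(r => [r.keys[0], r]));
  const out = [];
  for (let d = new Date(startDate + 'T00:00:00Z'); ; d.setUTCDate(d.getUTCDate() + 1)) {
    const key = d.toISOString().slice(0, 10);
    if (key > endDate) break;
    const row = byDate.get(key);
    out.push(row ? { date: key, clicks: row.clicks, status: 'returned' }
                 : { date: key, clicks: null, status: 'missing' });
  }
  return out;
}
```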

What Actually Moved the Rankings

Almost certainly, delayed re-evaluation after the March core update.

“Rollout complete” for a core update means the new ranking criteria have propagated to all data centers.
It does not mean every page has been re-evaluated under the new criteria.
After rollout, Googlebot re-crawls each site and re-scores pages against the new criteria.
Sites with lower crawl frequency or fewer updates get this re-evaluation later.
As a result, rankings shifting in bulk 2-4 weeks after rollout completion is not unusual.

April 27 is about 3 weeks after the core update finished.
Sites that thought “we weren’t affected” right after the update see rankings suddenly shift 3 weeks later. Classic pattern.
The timing matches the volatility heat that Search Engine Roundtable picked up.

A few unnamed minor adjustments stacking up is possible too, but since the 3-week post-core-update timing explains it, there’s no reason to look for other causes.

GA4 Event Tracking Issues on This Site

Separate from GA4’s reporting delays, this is something that actually tripped me up on this site.

I’d been running GA4 using the method from the Partytown setup article, but internal link clicks weren’t being captured as events.
I’d configured link events exactly per the GA4 docs, yet no numbers showed up.
After digging into the cause, switching to a direct script embed (no Partytown) fixed it.
Partytown runs scripts outside the main thread, and in combination with Astro’s client-side navigation, event firing can get dropped.

Another thing I noticed: GA4’s engagement time alone can’t distinguish “actually reading” from “just sitting there.”
GA4 counts a session as “engaged” if it lasted 10+ seconds in the foreground, had a key event, or included two or more page views.
Leave a tab open while you go eat lunch? Still counts if it’s in the foreground.
Long dwell time does not equal actually read.

Without scroll events tracking how far down the page users get, you can’t measure read-through rate.
GA4’s enhanced measurement includes a scroll event, but by default it only fires at 90% page depth.
To catch mid-page drop-offs, set up custom events at 25%, 50%, 75%, or configure scroll distance triggers in Google Tag Manager.
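One way to get the intermediate depths without GTM is a small hand-rolled tracker. The `scroll_depth` event name and `percent` parameter below are names I made up, not GA4 built-ins, and the wiring assumes a browser:

```javascript
const THRESHOLDS = [25, 50, 75, 90]; // percent depths to report

// Given how far down the page the user has scrolled (0-100) and which
// thresholds already fired this pageview, return the ones to fire now.
function thresholdsToFire(scrolledPercent, alreadyFired) {
  return THRESHOLDS.filter(t => scrolledPercent >= t && !alreadyFired.has(t));
}

// Browser wiring (sketch): send each threshold once per pageview via the
// provided sender, e.g. (name, params) => gtag('event', name, params).
function attachScrollTracking(send) {
  const fired = new Set();
  window.addEventListener('scroll', () => {
    const max = document.documentElement.scrollHeight - window.innerHeight;
    const pct = max > 0 ? (window.scrollY / max) * 100 : 100;
    for (const t of thresholdsToFire(pct, fired)) {
      fired.add(t);
      send('scroll_depth', { percent: t });
    }
  }, { passive: true });
}
```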

Sending Language as a Custom Dimension in GA4

This site publishes articles in Japanese and English.
To know how much each language gets read, I send a custom dimension for language to GA4.

You can also pull by path in GA4’s Explore reports, but the moment your path structure changes, consistency breaks.
With a custom dimension, regardless of how URL design evolves, “which language are they reading?” always has an answer.
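As a sketch of how this can be sent, assuming an `/en/` URL prefix for English posts — the prefix and the `content_language` parameter name are this example's assumptions, and the parameter must also be registered as a custom dimension in GA4's admin UI before it shows up in reports:

```javascript
// Derive the article language from the URL path.
// The '/en/' prefix is this sketch's assumption about the URL scheme.
function langFromPath(pathname) {
  return pathname === '/en' || pathname.startsWith('/en/') ? 'en' : 'ja';
}

// Attach it to subsequent GA4 hits. gtag('set', ...) applies the parameter
// to events sent afterward; this is a no-op outside the browser.
function tagLanguage(pathname) {
  if (typeof gtag === 'function') {
    gtag('set', { content_language: langFromPath(pathname) });
  }
}
```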

No AdSense Auto-Ad Tag in the Head

Related to measurement, I don’t put the AdSense auto-ad tag in the head because of page speed.
Ads are handled through individual embeds.

When auto-ad tuning kicks in, full-screen interstitials appear and ads get auto-inserted into the body text.
I’d lose it if I saw that on someone else’s blog, so doing it on my own site is out of the question.

I want to cover the blog’s operating costs.
But tanking UX for that defeats the purpose.
Ad-heavy sites are everywhere, which is exactly why I keep placement minimal; doing the same thing as everyone else would defeat the point.

GSC Data Was Stuck at April 20

When I started writing this article, Search Console’s performance data was stuck at April 20.
It’s since moved up to the 24th.

Index coverage status was also acting up.
URLs that URL Inspection was showing as “not found” or “not indexed” quietly flipped to indexed after some time passed.
They hadn’t actually dropped from the index; the status display just hadn’t caught up.

Core Update Numbers Look Inflated Right After

When measuring the impact of a core update, taking the numbers at face value right afterward makes the effect look bigger than it actually is.

During rollout, old and new ranking criteria coexist across data centers.
The same query can return different rankings depending on which data center you hit.
Search Console aggregates this mixed state as-is, so numbers swing significantly by day.
Cherry-picking a high-swing day and saying “we dropped X%” doesn’t represent the settled value.

Speee’s core update explainer likewise advises not panicking during rollout and waiting 2-3 weeks.
Google used to say you might need to wait until the next core update to recover, but a 2025 documentation update acknowledged that recovery through ongoing smaller updates is also possible.
The era of “you fell in one core update, now wait six months” is over.

Preliminary data settling into final values also shifts the numbers.
When rollout volatility, preliminary data corrections, and GA4 reporting delays stack up, the first week’s numbers can appear to swing 2-3x more than reality.
“I thought we crashed” only to see half of it bounce back two weeks later is a report that surfaces every core update cycle.

This Site Went Up

For what it’s worth, this site’s numbers actually went up after the March 2026 core update re-evaluation arrived.

No clear reason why.
Core updates are zero-sum: someone goes down, someone goes up.
Probably just inherited traffic from competitors in the same space who went down.
E-E-A-T tightening reportedly favors first-party information and experience-based content.
A personal blog documenting hands-on work may have benefited. But post-hoc explanations like this are easy to construct, so take it with a grain of salt.

Sites that went up tend to stop at “we’re safe” and move on, but the next core update can reverse it just as easily.
The fact that this time was a plus means nothing.

The Thread from December 2025

March 2026 sits on the same trajectory as the December 2025 core update (December 11-29, 18-day rollout).

According to Speee’s analysis, December 2025 increased the weight on freshness and recency in E-E-A-T evaluation.
It also shifted from evaluating individual page quality to looking at topical consistency across the entire site.
Stuffing a variety of topics into one site is less favored than focusing on a specific domain.
Search result diversity was redefined from “number of sources” to “diversity of perspectives.”
Ten articles recycling the same facts matter less than three articles offering different angles.

March 2026 pushed this direction further. The pattern tightens quarter by quarter.
Strategizing based on a single core update’s results is futile if the criteria shift again next quarter.

Excessive “(not set)” in GA4

The suspicion that update-period aggregation was still messy is supported by an abnormal amount of “(not set)” across various GA4 dimensions.
The share of “(not set)” has been gradually decreasing, which points to measurement dropouts rather than a labeling quirk: in my case, as “(not set)” shrinks, other dimension values grow correspondingly, suggesting values that should have been recorded under proper labels simply weren’t captured at the time.

References