AI overviews have started to roll out in Google’s search results, yet many elements of the SERP feature remain unclear to the SEO community.

Similar to an experiment I ran in the past on Twitter carousels, I developed a new experiment focused on how AI overviews are tracked in Google Search Console, to gain insight into how they operate.

While there were several obstacles that made the experiment difficult, I was able to trigger 876 impressions in Google Search Console for a query that would yield an AI overview result, with the help of the SEO community on Twitter and LinkedIn.

Here’s what I learnt from running this experiment and how this knowledge can be applied to your own understanding of AI overviews within Google’s search results.

The Problem

It is currently not possible to track URL inclusions within AI overviews in Google Search Console, and I suspect dedicated reporting may never happen, or at least not in the near future.

There is also an assertion that Google has now launched AI overviews in the US, with the rollout soon to become global. This hasn’t exactly been the case, as I explore in more detail throughout this write-up.

Experiment Outline

The experiment outline was quite simple. All that was required of participants was to click a link containing a pre-filled Google search query, then click on the citation for my website within the AI overview.

[Image: Social media posts from Twitter and LinkedIn related to my AI overview experiment.]

Details were also provided around the need to use a VPN if located outside of the US in order to trigger an AI overview within Google’s search results. The query itself was one that isn’t normally searched, based on my GSC data, which allowed the experiment traffic to be clearly distinguishable.

The “who is brodie seo” query was chosen because I needed a query where my website had a high chance of being referenced consistently, in an attempt to limit the variables outside of my control in an experiment like this.
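For illustration, the participant link would have taken roughly the following form: a standard Google search URL with the query pre-filled. The exact link and any additional parameters used in the experiment aren’t reproduced here, so treat this as an approximation.

```
https://www.google.com/search?q=who+is+brodie+seo
```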

Limitations

A major limitation of the experiment is that AI overviews currently only trigger for users who are logged in, with non-logged-in users (or those using incognito) not seeing AI overviews at all.

Another limitation is the lack of corroborating external data. Because individual users need to be logged in to see AI overviews, reliable external tools like Semrush and Ahrefs are unable to track inclusions.

While it would be possible to track inclusions within a logged-in experience, that would only cover a single account, and the results would be unlikely to be representative of what users see more broadly within search results. A minimal sketch of what that single-user approach could look like is below.
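Here is a rough sketch of that single-user check using Playwright against a logged-in browser profile. The profile path, the domain, and the CSS selector for the AI overview container are all placeholders and assumptions; Google’s actual markup is obfuscated and changes frequently, so treat this as illustrative rather than a working scraper.

```python
# Minimal single-user check: does a query show an AI overview, and is my site cited?
# Assumptions: a persistent, logged-in Chrome profile; a placeholder CSS selector.
from playwright.sync_api import sync_playwright

QUERY = "who is brodie seo"
MY_DOMAIN = "example.com"  # replace with your own domain

with sync_playwright() as p:
    # Reuse a real, logged-in browser profile (the path is an assumption).
    ctx = p.chromium.launch_persistent_context("/path/to/chrome-profile", headless=False)
    page = ctx.new_page()
    page.goto("https://www.google.com/search?q=" + QUERY.replace(" ", "+"))
    # 'div[data-aio]' is a stand-in selector, not Google's real markup.
    overview = page.query_selector("div[data-aio]")
    if overview:
        cited = MY_DOMAIN in overview.inner_html()
        print(f"AI overview shown; {MY_DOMAIN} cited: {cited}")
    else:
        print("No AI overview returned for this query/profile.")
    ctx.close()
```

Even if the selector were stable, this only ever tells you what one account sees, which is exactly the representativeness problem described above.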

Results Interpretation

As a starting point for the results interpretation, we can safely exclude any data within GSC that originates outside of the US. At this point in time, I have not seen any evidence of AI overviews appearing in other regions, outside of a limited mobile capacity.

[Image: AI overview experiment data when filtering by users from the US only.]

Based on this data, we can effectively use the 362 of the 876 impressions in the dataset that originate from the US. This subset had 49 clicks, a CTR of 13.5% (above the overall average) and an average position of 1.7.

Because the instructions were to click the result for my website within the AI overview, the 13.5% CTR is quite low, which does pose questions around GSC’s tracking capabilities for this feature. For comparison, the Twitter carousels test I ran in the past had a CTR of 28.1% with similar instructions.
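For anyone wanting to reproduce this US-only, single-query filter outside of the GSC interface, here is a minimal sketch against the Search Console API (via google-api-python-client). The property URL, date range, and credential setup are assumptions; the query and the ‘usa’ country code mirror the filter used above.

```python
from googleapiclient.discovery import build

# Assumes OAuth credentials with the Search Console scope are already loaded as `creds`.
service = build("searchconsole", "v1", credentials=creds)

body = {
    # The date range is an assumption; set it to cover your experiment window.
    "startDate": "2024-05-01",
    "endDate": "2024-05-31",
    "dimensions": ["country", "device"],
    "dimensionFilterGroups": [{
        "filters": [
            # Restrict to the experiment query and US traffic (ISO 3166-1 alpha-3 code).
            {"dimension": "query", "operator": "equals", "expression": "who is brodie seo"},
            {"dimension": "country", "operator": "equals", "expression": "usa"},
        ]
    }],
}

# The siteUrl is a placeholder; use your own verified property.
resp = service.searchanalytics().query(siteUrl="https://www.example.com/", body=body).execute()

for row in resp.get("rows", []):
    ctr = row["clicks"] / row["impressions"]  # e.g. 49 / 362 ≈ 13.5% for my US subset
    print(row["keys"], row["clicks"], row["impressions"], f"{ctr:.1%}", row["position"])
```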

Labs Opt-in Users

A major question mark within the dataset relates to how Google is reporting on users who have opted in to labs (via the Search Generative Experience) and how AI answers are represented there compared to the logged-in experience in normal search results.

As highlighted by Lily Ray, the labs experience is signified by the beaker icon that appears above the answer. More recently, this changed to include the text ‘Search Labs’, making it clearer to users that they are seeing the opt-in experience.

[Image: AI overview result within the Search Labs experience.]

Is this data recorded in GSC? I’m not entirely sure. When SGE was first introduced, my understanding was that AI answers data wouldn’t be recorded within GSC. Now that users outside of the opt-in experience can see AI answers, it is unclear whether that opt-in data is also being recorded in GSC.

Based on my test results and the comments I received on the post describing the experiment, a reasonable number of US-based participants were actually in labs and were generating results that looked different to the logged-in experience. When looking at the CTR for my experiment, I feel there is a reasonable chance that the opt-in labs data was also being recorded in GSC.

I suspect the low CTR is more likely related to users clicking the experiment search link within incognito, or while using a signed-in account that doesn’t yet have access to AI overviews (there’s still a portion that don’t). These users would have realised this on their first attempt and decided not to click anything as a result.

Snippet Testing

Something that became immediately apparent during the test was that while most people located in the US or using a VPN were able to see a similar result to the one I had shared, some people didn’t get an AI overview at all for the query, or saw something different from the result I’d shared.

While featured snippets operate in a similar way (the result isn’t always the same for different users within a region), that experience at least involves a single URL being featured. In the case of AI overviews, many URLs are featured, which inherently makes external tracking non-representative.

How GSC Tracking (Likely) Works

Since the introduction of AI overviews, their helpfulness in sending more traffic to websites has remained unclear, with recent communication from Google’s CEO being that:

“If you put content and links within AI overviews, they get higher click through rates than if you put it outside of AI overviews.” – Sundar Pichai

While this statement could in fact be true at face value, I believe that without context it doesn’t tell the full story, specifically in relation to how data is recorded in GSC across the various ways that AI overviews can appear for websites. There are three core situations for how citations to websites can appear within AI overviews.

Tier 1: Visible Citations

This situation is essentially the same as how featured snippets are recorded within GSC, with the primary differences being the number of links that can appear (featured snippets have one, AI overviews can have multiple) and the use of a carousel for the citations.

[Image: AI overview citations are visible by default on page load.]

Because everything visible on page #1 of Google has the ability to trigger an impression once the search is made (even if not scrolled into view), most AI overview formats would trigger an impression by default.

Only citations that are not within view on page load would need to be scrolled to before an impression is triggered. This is a common aspect of AI overviews because of the breadth of references that can be used, with a carousel of citations being a common way for links to appear. Tracking becomes more complicated within the ‘Partially Visible’ and ‘Hidden’ citation formats.

Tier 2: Partially Visible Citations

The second tier of AI overview tracking relates to answers that are only partially visible. In terms of GSC reporting, it is likely that the CTR would be higher, but only because an impression would be triggered once ‘show more’ is clicked.

[Image: Citations in AI overviews are only partially visible to users.]

Because the citations are technically visible by default for the user, there is a reasonable likelihood that a citation may be clicked. The reason the citations are only partially shown in this instance relates to the length of the AI overview answer, which would normally be capped at a specific length for a standard featured snippet result.

Tier 3: Hidden Citations

The final tier relates to citations that are hidden by default. While both tiers 1 and 2 have elements of hidden citations, tier 3 seems to be the one that Sundar Pichai was referring to in terms of “higher click through rates”.

[Image: Citations in AI overviews are hidden by default and only show once expanded.]

There are other features within Search that operate in a similar way, such as product grids, where a click already needs to happen in order for an impression to be triggered in GSC. Because interest has already been shown, the likelihood of an additional click then becomes exceptionally high.
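To make the effect of these tiers concrete, here is a toy model (every number below is an assumption, not measured data) showing how gating impressions behind an expansion click can produce the “higher click through rates” Pichai describes without any extra traffic:

```python
# Toy model: the same 20 clicks out of 1,000 searches, under three impression rules.
searches = 1000
clicks = 20  # assumed constant across tiers for illustration

# Tier 1 (visible): every search counts as an impression by default.
tier1_impressions = searches
# Tier 2 (partially visible): assume an impression only once ~30% click 'show more'.
tier2_impressions = int(searches * 0.30)
# Tier 3 (hidden): assume an impression only once ~5% expand the citation block.
tier3_impressions = int(searches * 0.05)

for tier, imps in [(1, tier1_impressions), (2, tier2_impressions), (3, tier3_impressions)]:
    print(f"Tier {tier}: {clicks} clicks / {imps} impressions = {clicks / imps:.1%} CTR")
```

The clicks never change across the three scenarios; only the impression denominator shrinks, which is why a high CTR from tiers 2 and 3 doesn’t translate into more traffic.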

Findings Summary

While my experiment didn’t go as smoothly as I would have liked, given the obstacles related to the rollout method and design of AI overviews, it still clarified several areas for me:

  • Data from the public AI overview experience is clearly being recorded within GSC. It is unclear whether Search Labs AI overview data is also being recorded.
  • Even in the US, maybe only half of Google users have the capacity to see AI overviews (when factoring in non-logged-in users and accounts that still don’t have access).
  • Answers and citations within AI overviews are about as consistent between users as featured snippets. This, alongside them not being shown to non-logged-in users, can make tracking difficult.
  • Similar to product grid results, tiers 2 and 3 mean that while CTR is high, traffic from AI overviews will likely remain quite low.
  • While there have been calls to include AI overview filtering within GSC, similar to featured snippets, it is unlikely that this data will ever be included.

It will only be possible to gain true insight into AI overviews at scale once Google decides to show them outside of the logged-in experience. This could be the next stage of the phased approach, so keep an eye out, and I’ll make sure to update this post if anything changes.