Google’s limited use of links to publisher websites within its AI chatbot is hypocritical. Google has long held publishers to a high standard, but it isn’t playing by the same rules or contributing back to the ecosystem that has fueled its success.
Google has always pushed the boundaries of attribution for publisher content within its search engine. A key milestone in the publisher-versus-Google battle was the introduction of featured snippets, and the situation appears to be getting worse.
In this article, I’ll cover the early previews of Bard, the current state of the experimental product now available to the public, Google’s own advice on linking externally, and a more balanced approach that Google could take.
An early preview of Bard
Going back to February 2023, the race was on between Google and Microsoft to see who could release an AI chatbot first. Microsoft had already partnered with OpenAI and appeared to be moving at a faster pace than Google.
Google’s first preview of Bard on February 6 displayed some concerning attributes. Within the chatbot itself, there were no visible citations showing how content was sourced, along with what amounted to a direct replacement of a featured snippet, with minimal attribution.
This preview was met with contempt from publishers. Google had always been known to push the envelope on content attribution, but Bard’s preview was the clearest example yet. Still, Google proceeded with its plans.
Bard then became available for the public to test on March 21 at a separate subdomain, bard.google.com, as a standalone AI chatbot tool, with Search integration on Google’s roadmap for the near future.
The issue was that Bard rarely referenced content, even when its output was effectively a direct quote from a website. And in the best-case scenario for a publisher, Google mentions a URL (or several, in some instances) in a section at the bottom of chat results titled “Sources – Learn more.”
Here’s what a more common Bard response looks like:
Even when using information taken directly from pages, no citations were provided for the review I wrote or my LinkedIn posts. When I asked the AI chat why it wasn’t using citations, its answer was that “it is a large language model,” which clearly isn’t a sufficient reason to operate this way.
Based on recent communication from Google, it doesn’t appear to be rushing to address this specific problem with Bard. Bing’s AI Chat is doing an excellent job of referencing its sources, so why isn’t Google?
Google’s guidelines for external linking
To add insult to injury for publishers, Google currently maintains webmaster documentation that recommends using links to establish trust.
Google’s link best practices documentation has a complete section on external links, which states the following:
“Linking to other sites isn’t something to be scared of; in fact, using external links can help establish trustworthiness (for example, citing your sources).” – Google
So Google, what are you scared of? And why aren’t you putting “trustworthiness” at the forefront of Bard? It appears that Google’s own years of advice about building trust with readers has gone out the window.
Ever since broad core Google updates came into play a few years back, Google has been stressing the importance of Expertise, Authoritativeness and Trustworthiness (E-A-T), recently expanded with an additional “E” for Experience (E-E-A-T).
The frustrating part of these quality-focused Google algorithm updates is that they can have a severe impact on both traffic and revenue for publishers, especially if Google suddenly decides that a site isn’t trustworthy.
Google has long held publishers to a high standard for being trustworthy sources of information. It would only make sense for Google to follow its own advice here, especially if it is trying to build the best product in the space.
Bing AI Chat as a more trustworthy alternative
Many users of AI chat tools are flocking to Bing because of its ability to cite where the information in its answers comes from, a known weakness of ChatGPT.
While Google’s Bard provided no referencing in many instances during my own testing, Bing is setting the standard for how AI chat should be done, with clear citations for where information is sourced.
The approach Bing is using seems quite advanced, but it is an approach to AI-powered snippets that the company has used as far back as 2018. If Bing could do this in 2018, why can’t Google do it in 2023?
This is the frustrating part in all of this. Google is clearly capable of providing citations for how information is sourced, but there appear to be roadblocks that are preventing them from doing so.
Is it because Google wants to reduce the overlap between its Search and AI chat products? Likely so, but holding back doesn’t seem to be an option given how many users are already on ChatGPT.
What I do know is that Google is not currently acting in a way that supports its own ecosystem. The lack of adequate citations for publishers makes me lose a lot of trust in Bard, and in my eyes it prevents Bard from being comparable to Bing.
Bard’s road to redemption
Despite its lack of support for content creators, Bard shows a lot of potential. In my testing, there have been various instances where Bard performed better than both Bing and ChatGPT.
As publishers, we should hold Google to the same standard it has applied to us. Bard needs to start giving more credit to the publishers fueling the AI chatbot.
Microsoft has even mentioned that it is exploring a revenue-share system for partners whose content contributes to Bing’s AI Chat. If Bing introduces this, Bard will have fallen behind by a country mile.
As publishers, we will be watching closely to see Google’s next moves. Based on recent communication, I’m not expecting a significant change in its approach to citations, but who knows what could happen.