Google tested out AI overviews for months before releasing them nationwide last week, but clearly, that wasn’t enough time. The AI is hallucinating answers to several user queries, creating a less-than-trustworthy experience across Google’s flagship product. In the last week, Gizmodo received AI overviews from Google that reference glue-topped pizza and suggest Barack Obama was Muslim.
The hallucinations are concerning, but not entirely surprising. As we've seen before with AI chatbots, this technology seems to confuse satire with journalism: several of the incorrect AI overviews we found appear to reference The Onion. The problem is that this AI offers an authoritative answer to the millions of people who turn to Google Search daily just to look something up. Now, at least some of these people will be presented with hallucinated answers.
“The vast majority of AI Overviews provide high quality information, with links to dig deeper on the web,” said a Google spokesperson in an emailed statement to Gizmodo, noting many of the examples the company has seen have been from uncommon queries. “We’re taking swift action where appropriate under our content policies, and using these examples to develop broader improvements to our systems, some of which have already started to roll out.”
In my experience, AI overviews are more often right than wrong. However, every wrong answer makes me question my entire experience on Google Search even more; I have to assess each answer carefully. Google notes that the AI is "experimental," but it has opted everyone into this experiment by default.
“The thing with Search — we handle billions of queries,” Google CEO Sundar Pichai told The Verge on Monday when asked about the AI overview rollout. “You can absolutely find a query and hand it to me and say, ‘Could we have done better on that query?’ Yes, for sure. But in many cases, part of what is making people respond positively to AI Overviews is that the summary we are providing clearly adds value and helps them look at things they may not have otherwise thought about.”
Strangely, Google Search occasionally responds to a query with “An AI overview is not available for this search,” while other times, Google will just not say anything and show traditional search results. I got this answer when I searched “what ethnicity are most US presidents” and when I searched “what fruits end in me.”
A Google spokesperson says its systems occasionally start generating an AI overview, but stop it from appearing when it doesn’t meet its quality threshold. Notably, Google had to pause Gemini’s answers and image generation around racial topics for months after it upset large swaths of the country. It’s unclear if this “stop and start” AI overview generation is related.
What is clear is that Google felt pressured to put its money where its mouth is, and that means putting AI into Search. People are increasingly choosing ChatGPT, Perplexity, or other AI offerings as their main way to find information on the internet. Google views this race as existential, but it may have just jeopardized the Search experience by trying to catch up.
This week, Google Search has told people a lot of strange things through AI overviews. Here are some of the weirdest ones Gizmodo has found.