My New Online Hobby Is Asking Google AI What Made-Up Proverbs Mean

Google’s AI Overview is no stranger to an AI hallucination or two, and its latest is another classic to add to the list.

AI Overview Believes Everything Is an Idiom, and It’s Wonderful

In short, if you head over to Google Search and type in a random sentence that sounds vaguely like an idiom or proverb, the AI Overview will do its very best to assign some meaning to your empty words.

First spotted on Threads, though brought to my attention through Greg Jenner’s Bluesky account, these AI hallucinations are some of my favorites. There are some amazing examples of the lengths Google’s AI Overview will go to in explaining how something makes sense or fits into its vision of the input. One particular favorite came from MakeUseOf’s Editor-in-Chief, Ben Stegner: “There’s no irony like a wet golf course.”

[Image: Google AI Overview idiom hallucination example]

To which the AI Overview dug deep and responded, “The saying ‘there’s no irony like a wet golf course’ plays on the common understanding that golf, a sport often associated with sunny, well-maintained greens, can be surprisingly challenging and frustrating when conditions are wet.”

Another one I tried was “giant pandas always fall twice,” which had the AI Overview detailing how pandas are clumsy and enjoy rolling around instead of walking. But not content to stop there, it began delving into the metabolism and energy conservation efforts of pandas.

[Image: “giant pandas always fall twice” AI hallucination example]

AI Overview’s Latest Hallucination Is Why You Cannot Trust AI Chatbots

As amusing as these wonderfully weird, forced explanations are, they highlight the very real problem with AI chatbots (not just AI Overview). AI hallucination is real and very much an issue, especially when the output is taken at face value.

When AI hallucination was confined to folks deliberately using AI chatbots like ChatGPT, Claude, Gemini, and so on, the potential danger was somewhat limited. Sure, the AI hallucinated, and that was a problem, but those people had at least sought out an AI chatbot.

Google’s AI Overview and its next version, AI Mode, change the rules. Anyone running a regular Google search risks encountering fake, AI-slop responses presented as if they were fact. Without serious scrutiny, Google Search as we know it is on its way out, replaced by something far worse that demands greater media-literacy skills than ever before.

[Image: “never throw your poodle at a pig” AI hallucination example. Credit: The Sleight Doctor / Bluesky]

This latest round of AI hallucination is the perfect example of that. In one example from The Sleight Doctor, AI Overview went as far as to cite a Bible verse, from which this supposed idiom was derived. That phrase?

“Never throw your poodle at a pig.”
