Google’s new “AI Overview” feature, which uses artificial intelligence to aggregate search results into a summary, appears to be serving erroneous and sometimes harmful information above certain search results.
Users on social media were quick to note that the tool, which displaced organic search results when it rolled out to the public earlier this month, was producing some questionable answers. Asked how to keep cheese from slipping off a pizza, the search engine, which commands more than 90% of the market, suggested gluing it on.
But the worst of what the feature can produce isn’t merely food that tastes bad. According to one X user, a prompt about cleaning a washing machine was answered with a recipe for potentially lethal mustard gas (a danger confirmed by a follow-up search).
And although many large language models, including OpenAI’s ChatGPT and Meta’s Llama, scrape the internet for training data without permission, Google’s overview feature appears to be unusually blatant in its copying. One X user claims Google lifted the phrase “my kid’s favorite” to describe a smoothie recipe directly from a recipe page featuring a similar dish.
Google created the feature in response to Microsoft’s OpenAI-powered Bing Copilot, and says the tool helps smaller websites by surfacing their content in results and putting it in front of more searchers.
“People are visiting a greater diversity of websites for help with more complex questions with AI Overviews,” the feature announcement reads. “We notice that more people click on the links in AI Overviews than they would have if the page had looked like a standard web listing for that query.”
Critics fear that by suppressing click-throughs to the original content providers Google draws from, the AI-generated results may eventually starve out the very articles AI Overview depends on. The News Media Alliance, a group representing publishers, has demanded stronger action against AI models that train on content without permission.
“As shown by marketplace agreements, AI businesses understand the importance of this content and depend on high-quality material to train their algorithms,” Alliance president and CEO Danielle Coffey said in a statement on legal action against OpenAI. “It is illegal for AI companies to continue using publishers’ valuable content without paying due compensation.”
Google does disclose beneath the AI-generated results that “generative AI is experimental,” but those results nonetheless pose a problem at a time when reliance on computer-generated content is increasing.
If you’re looking for an alternative to Alphabet’s search giant, take the AI’s own advice and try a search engine that truly “prioritizes what you’re looking for without bias.”