Google’s AI Overviews Said to Suffer From AI Hallucination, Advises Using Glue on Pizza
Google’s brand-new AI-powered search tool, AI Overviews, is facing blowback for providing inaccurate and somewhat bizarre answers to users’ queries. In a recently reported incident, a user turned to Google for help with cheese not sticking to their pizza. While they were presumably expecting a practical fix for their culinary trouble, Google’s AI Overviews feature presented a rather unhinged solution. As per posts that have since surfaced on X, this was not an isolated incident, with the AI tool suggesting bizarre answers to other users as well.
Cheese, Pizza and AI Hallucination
The issue came to light when a user reportedly searched Google for “cheese not sticking to pizza”. Addressing the culinary problem, the search engine’s AI Overviews feature suggested a couple of ways to make the cheese stick, such as mixing the sauce and letting the pizza cool down. However, one of the solutions turned out to be downright bizarre. As per the screenshot shared, it suggested that the user “add ⅛ cup of non-toxic glue to the sauce to give it more tackiness”.
Google AI overview suggests adding glue to get cheese to stick to pizza, and it turns out the source is an 11 year old Reddit comment from user F*cksmith 😂 pic.twitter.com/uDPAbsAKeO
— Peter Yang (@petergyang) May 23, 2024
Upon further investigation, the source was reportedly traced to a Reddit comment from 11 years ago, which appeared to be a joke rather than expert culinary advice. However, Google’s AI Overviews feature, which still carries a “Generative AI is experimental” tag at the bottom, presented it as a serious suggestion in response to the original query.
Yet another inaccurate response from AI Overviews came to light a few days ago, when a user reportedly asked Google, “How many rocks should I eat”. Citing UC Berkeley geologists, the tool suggested that “eating at least one rock per day is recommended because rocks contain minerals and vitamins that are important for digestive health”.
The Issue Behind the False Responses
Issues like this have been surfacing regularly in recent years, especially since the artificial intelligence (AI) boom kicked off, giving rise to a problem known as AI hallucination, where a model confidently presents false or fabricated information as fact. While companies acknowledge that AI chatbots can make mistakes, instances of these tools twisting facts and providing inaccurate, even bizarre, responses have been increasing.
Google isn’t the only company whose AI tools have provided inaccurate responses, either. OpenAI’s ChatGPT, Microsoft’s Copilot, and Perplexity’s AI chatbot have all reportedly suffered from AI hallucinations.
In more than one instance, the source has turned out to be a Reddit post or comment made years ago. The companies behind these AI tools are aware of the problem too, with Alphabet CEO Sundar Pichai telling The Verge, “these are the kinds of things for us to keep getting better at”.
Speaking about AI hallucinations at an event at IIIT Delhi in June 2023, OpenAI CEO and co-founder Sam Altman said, “It will take us about a year to perfect the model. It is a balance between creativity and accuracy, and we are trying to minimise the problem. [At present,] I trust the answers that come out of ChatGPT the least out of anyone else on this Earth.”