A Google search for “miserable failure” in 2005 brought up the official biography of President George W. Bush on the White House website as the top result.
This wasn’t a political statement on Google’s part, but a prank: longtime internet users may remember the hilarious “Googlebombing” stunts of the 2000s, the most famous of which aimed insults at Bush.
Googlebombing worked when trolls linked to a target web page (such as Bush’s biography) from their own sites using a specific phrase as the link text (such as “miserable failure” in Bush’s case). With enough such links, the search engine’s algorithms came to associate the phrase with the page. This produced a variety of amusing results: a Google search for “liar” or “poodle” brought up the webpage of then-British Prime Minister Tony Blair, while “dangerous cults” returned the website of the Church of Scientology as the first result.
Google tweaked its algorithm to prevent further Googlebombs in 2007. But the specter of the Googlebomb is once again haunting the tech giant, this time due to Google’s own technical failings.
After Google rolled out its AI-powered search summaries in May, users quickly noticed the strange, erroneous results these summaries sometimes delivered. When asked about the health benefits of running with scissors, Google replied that the exercise is a great aerobic workout that “improves pores and builds strength.” In another query, Google touted the health benefits of eating rocks, seemingly drawing on a satirical article in The Onion.
“Eating the right rocks can potentially be beneficial to your health as they contain minerals that are important for your body’s health,” Google’s AI summary said in response to a reporter’s question.
“Pizza Glue” is still going strong
But nothing set the internet abuzz quite like this suggestion from AI Overview to “mix about 1/8 cup of non-toxic adhesive into your sauce” to stop cheese from sliding off your pizza slices.
A Google spokesperson dismissed the erroneous results, writing at the time that “the examples we saw were typically highly unusual search queries and not representative of most people’s experiences,” adding that the “vast majority” of AI summaries provide high-quality information along with links that allow searchers to dig deeper into their search.
But even as Google publicly expressed confidence in its new AI tools, it quietly began dialing back their visibility: according to research from content marketing platform BrightEdge, the share of search results showing AI summaries gradually fell from 84% to 15%.
A Google spokesperson disputed the data, saying the numbers differ from those the company has seen, likely because BrightEdge was looking at a narrow set of queries that wasn’t a representative sample of Google search traffic and included users who have opted out of AI summaries.
One example that still occasionally shows up in AI summaries is the internet sensation that resulted from mixing glue into pizza. The Verge recently reported that if you ask Google how much glue you should put on your pizza, you get the same answer, but this time citing a Business Insider article about the whole fiasco.
So it seems that the more journalists write about bungled AI summaries, the more they encourage the algorithms to reproduce the same wrong answers. It’s a kind of self-fulfilling feedback loop, reminiscent of the absurdity of the Googlebombing era, except this time the only party at fault is Google itself.
When Fortune journalists tried a variety of searches involving pizza, cheese, and glue, no AI-generated summaries appeared, which could mean Google noticed the continuing failures and quickly adjusted its platform.
A Google spokesperson said the query still surfaces AI summaries in many searches, but that updates to the technology are underway.
“We continue to improve how and when we show AI summaries to make them as helpful as possible, including technical updates to improve the quality of our responses,” they told Fortune.