OpenAI’s search tool has already made a mistake

OpenAI just announced SearchGPT, but there was something wrong with their demo.

A green SearchGPT screen covered in static
Illustration by The Atlantic

This is Atlantic Intelligence, a newsletter in which our writers help you understand artificial intelligence and a new era of machines. Sign up here.

Yesterday, OpenAI made what should have been a triumphant entry into the AI search wars: the startup announced SearchGPT, a prototype tool that can use the internet to answer questions of all kinds. But there was a problem, as I reported: even the demo got something wrong.

In a video accompanying the announcement, a user searches for “music festivals in Boone, North Carolina, in August.” SearchGPT’s top suggestion was a festival that ends in July. The dates the AI tool returned, July 29 to August 16, are not the dates of the festival but the dates when its box office is closed.

AI tools are supposed to reshape the web, the physical world, and our lives—in the context of internet search, by providing instant, direct, personalized answers to the most complex queries. Unlike a traditional Google search, which brings up a list of links, a search bot will answer your question directly. For that reason, websites and media publishers fear that AI search bots will eat up their traffic. But first, these programs need to work. SearchGPT is just the latest in a long line of AI search tools that exhibit all kinds of errors: making things up out of thin air, attributing information incorrectly, mixing up key details, and apparently plagiarizing. As I wrote, current AI “cannot properly copy and paste from a music festival website.”


A green SearchGPT screen covered in static
Illustration by Matteo Giuseppe Pani

OopsGPT

By Matteo Wong

Whenever AI companies present a vision for the role of artificial intelligence in the future of internet search, they tend to highlight the same points: instant summaries of relevant information, ready-made lists tailored to the needs of searchers. What they tend not to point out is that generative-AI models are prone to providing incorrect and sometimes outright made-up information — and yet it keeps happening. Earlier this afternoon, OpenAI, the creator of ChatGPT, announced a prototype of an AI tool that can search the web and answer questions, appropriately called SearchGPT. The release is designed to hint at how AI will transform the ways people navigate the internet — except that, before users have had a chance to test out the new program, it already seems prone to errors.

In a pre-recorded demo video accompanying the announcement, a fictional user types “music festivals in Boone, North Carolina, in August” into the SearchGPT interface. The tool then brings up a list of festivals it claims are happening in Boone this August, the first being An Appalachian Summer Festival, which the tool says will host a series of arts events from July 29 through August 16 this year. Someone in Boone hoping to buy tickets to one of those concerts would be out of luck, however: the festival actually started on June 29 and will hold its last concert on July 27. July 29 through August 16 are instead the dates when the festival box office will be officially closed. (I confirmed these dates with the festival box office.)

Read the full article.


What to read next

  • The real problem with AI hallucinations: “Boldness can quickly become a liability when builders become detached from reality,” Charlie Warzel wrote this week, “or when their arrogance leads them to believe they have the right to impose their values on the rest of us in exchange for building God.”
  • Generative AI cannot cite its sources: “It’s unclear whether OpenAI, Perplexity, or any other generative AI company will be able to build products that consistently and accurately cite their sources,” I wrote earlier this year, “let alone direct audiences to original sources like news outlets. Currently, they struggle to do so consistently.”

P.S.

You may have seen the viral video of Republican vice presidential candidate J.D. Vance suggesting that liberals think the diet drink Mountain Dew is racist. It sounded absurd, but the fizzy drink “retains a deep connection to Appalachia,” Ian Bogost wrote in a fascinating piece about why Vance might have been right.

- Matteo