This article is from Big Technology, a newsletter by Alex Kantrowitz.
The demo was beautiful. At Google’s I/O developer conference this week, the company showed an experimental version of its search engine handling an almost unimaginably difficult query. Asked whether a family with kids under three years old and a dog would prefer Arches National Park or Bryce Canyon, Google scoured the internet and returned a lengthy, detailed answer. It noted that while only Bryce had paths that allowed dogs, kids might love the rock formations at Arches, and that Arches still had plenty of dog-friendly campgrounds, pullouts, and roads.
“Now, search does the heavy lifting for you,” said Google Search VP Cathy Edwards. Behind her, an A.I.-generated search response took up the full browser window.
This new search product is undeniably appealing, but it'll likely come at a cost. When people search with Google today, they visit a bunch of websites, gather information, and synthesize it. The process is a bit of a pain, but the websites they visit depend on them to survive. By doing the "heavy lifting" itself, using a chatbot-like interface to save users another click, Google could leave these primary sources out of the equation, diminishing their ability to remain standalone entities, or even to exist at all.
Now, an already shaky digital publishing business will prepare for the fallout. Parse.ly, an analytics provider covering more than 2,500 digital publishers, found that 29 percent of their traffic came from search last year. And now that social platforms like Facebook and Twitter are moving away from timely news content, a decline in visitors from search engines could cause existential damage.
Rasmus Kleis Nielsen, director of the Reuters Institute for the Study of Journalism, told me that search, while not without some pain and friction, has been a more stable source of traffic and partnerships for publishers than other tech platforms. “The prospect of [search engines] following social media in featuring far fewer links must be daunting,” he said, “especially for those publishers who do not have a strong, direct relationship with a loyal, returning audience.”
In some ways, digital publishers brought this moment on themselves. They created content farms and published undifferentiated stories ("What Time Is The Super Bowl?") to win the never-ending Google traffic sweepstakes. Websites became impossible to navigate in the name of SEO. They buried recipes, wrote for amorphous audiences they didn't care about, and lost sight of their relationships with readers. Eventually, they let search engines dictate their product instead of shaping it themselves. And search engines went along with it.
When ChatGPT arrived last fall, it started delivering on Google's mission—"organize the world's information and make it universally accessible and useful"—better, at times, than Google itself. (The problem, of course, was its pesky habit of making stuff up.) It reshaped the information on websites instead of simply sending people there. And people loved it, making it the fastest-growing consumer application ever. After OpenAI and its benefactor Microsoft threatened Google's position with their generative-A.I. experiences, the search giant had to respond. And it did so in force this week. Its A.I.-powered "Search Generative Experience" is still in a "labs" preview, but it's almost certainly Google's future.
For publishers looking to adjust to this future, a likely reality is that their search traffic will diminish. Google and Bing do include links to their websites in their generative-A.I. products, but those links are no longer as crucial to click.
So what to do? Publishers that minimize their reliance on search traffic will be better off in the long run. Email, podcasts, and other subscription media seem poised to come out best. Ironically, the fears that ChatGPT would flood the web with crap websites seeking search traffic are likely a bit overblown since generative search would kill their business models anyway.
As for the search engines, their generative products should work exceptionally well as long as there's content on the web to draw from. Google's demo relied on content from the National Park Service, a tour guide website, and likely other sources across the web. It needs this content in order for it to work. And without ensuring that web publishers stay solvent—either by sending them readers or by some other means—it'll have less information to draw from as its A.I. generates responses. Absent its primary sources, the beautiful demo, however great a user experience, might become a hollow shell.