Newsweek’s April 3 cover story finally gives checkout-stand placement to “Web 2.0,” everyone’s favorite new tech buzzword. You’ve probably seen the phrase before—in the blogosphere, in the New York Times’ coverage of dot-com executives seeking a second act, or in Wired’s profile of Tim O’Reilly, the tech publisher who envisions a Net that entices us to contribute as well as consume. But like its predecessors, the Newsweek story pussyfoots around the most important question about Web 2.0: What the hell is it?
O’Reilly began touting the phrase in 2003. But even the man who holds an annual Web 2.0 Conference runs into trouble when pressed for a snappy, dictionary-style definition:
Web 2.0 is the network as platform, spanning all connected devices; Web 2.0 applications are those that make the most of the intrinsic advantages of that platform: delivering software as a continually-updated service that gets better the more people use it, consuming and remixing data from multiple sources, including individual users, while providing their own data and services in a form that allows remixing by others, creating network effects through an “architecture of participation,” and going beyond the page metaphor of Web 1.0 to deliver rich user experiences.
Got all that?
The problem isn’t that O’Reilly’s definition is hard to parse. It’s that other people use Web 2.0 to mean different, often conflicting things. There are at least three incompatible definitions floating around. For O’Reilly, Web 2.0 is a mishmash of tools and sites that foster collaboration and participation. Flickr, YouTube, MySpace, Wikipedia, and the entire blogosphere are examples. You don’t buy stuff from them—you use them to share digital assets and link up with other people. Podcasting is a Web 2.0 technology, because it’s almost as easy to create a podcast as to listen to one. The more time you put into a Web 2.0 site—tagging photos, posting comments, editing wiki entries—the better it works for everyone. One blogger wisecracked that, like Soylent Green, “Web 2.0 is made of people!”
Web developers use Web 2.0 a second way, to refer to the software and languages used to build the gee-whiz features of these sites. Ajax, tag clouds, and wikis are basic components of many collaborative sites. In general, Web 2.0 tools are free, easy to master, and easy to interconnect. Google Maps + Wikipedia = Placeopedia! But the definition runs aground when Web 2.0 technologies power Gap.com, an impressive but collaboration-free shopping experience. “Ajax without participation doesn’t make for Web 2.0,” O’Reilly explained to me. A pitch-weary investor told Newsweek, “When people say to me it’s a Web 2.0 application, I want to puke.”
A third definition gets thrown around in Silicon Valley. A “Web 2.0 play” is a bid to make money by funding a bring-your-own-content site. It’s a long-shot but low-risk investment that could become the next Google. Or at least the next thing Google buys. No warehouses full of inventory, no sprawling staff, no NASA-grade supply chain management systems. Dodgeball and Digg are good examples of popular sites started on a shoestring. Google snapped up Dodgeball last year; Digg’s imminent acquisition is a foregone conclusion among valley wags. Newsweek quotes a Yahoo! exec on why they bought Flickr: “With less than 10 people on the payroll, they had millions of users generating content. That’s a neat trick.” But buyout-hungry entrepreneurs now slap the 2.0 moniker willy-nilly on mobile services and browser applications that are neither built on Ajax nor made of people.
Beyond that, publicists and self-promoters invoke Web 2.0 whenever they want to tag something as new, cool, and undiscovered—”This could be a big story for you, Paul!” That kind of hucksterism is what sends editors reaching for their red pens. Prior to Newsweek’s feature, the term “Web 2.0” only appeared in national publications wrapped in protective quote marks or cordoned off behind phrases like “what some in Silicon Valley are calling.” Before Newsweek released the word, Kong-like, from its restraining quotes, they braced readers with a long disclaimer: “The generic term for this movement, especially among the hundreds of new companies jamming the waiting rooms of venture-capital offices, is Web 2.0, but that’s misleading. …”
The salesmanship that surrounds Web 2.0 is the key to understanding what the phrase really means. The new generation of dot-com entrepreneurs confers 2.0 status upon everything because they missed out on the boom times of Web 1.0. They want a new round of buzz and bling for themselves, and who can blame them? Crawling your way up the ladder at eBay is the loser track. A winner creates eBay 2.0. And they’re right to be stoked about the Web again. Investors are emerging from hibernation, tech jobs are coming back from Bangalore, and online services have evolved to the point where Wired’s most preposterous scenarios from 10 years ago now look mundane.
But there’s no way this bubble—if there is a new bubble—will be anything like the last one. No one will forecast that the Dow will reach 30,000. (Well, almost no one.) No one will claim that Web 2.0 makes the concept of the nation-state obsolete. And no matter how much money everyone makes, no one will throw a $10 million launch party for at least the next 50 years.
The only way that 2.0 fits the current Web is if you use the original meaning. It’s a technology upgrade, one that finally delivers what version 1.0 promised. For a contrived neologism, at least it’s catchy. Compare Web 2.0 to other attempts to brand the zeitgeist: “Do It Together,” “The Read/Write Web,” “Small Pieces, Loosely Joined,” or Newsweek’s pick, “the Living Web.” Imagine asking your boss for $3,000 to go to the Living Web Conference.
Still, the purpose of words is to convey meaning. Calling Technorati a “Web 2.0 search engine” sounds sharp but explains nothing. If you can only describe a word by examples, skip to the examples instead: “It’s a search engine for blogs that uses tags, like Flickr.” There’s an easy way to describe today’s online culture of participation without invoking Web 2.0 at all. Just call it the Internet. That way, everyone will know what you mean.