Are venture capitalists ruining Silicon Valley? Has Silicon Valley jumped the shark? Wall Street Journal tech writer Christopher Mims thinks so. “The entire Bay Area appears to have given up on solving anything but its own problems,” he writes. Instead of revolutionizing the world with basic research in safety, energy, and medicine, he argues, venture capitalists are unimaginatively chasing advertising dollars and focusing exclusively on the first-world segment of twentysomething yuppies. Taking exception, Netscape co-founder and venture capitalist Marc Andreessen tweeted that the tech press was ignoring the bigger story by focusing exclusively on the self-celebrating consumer tech scene in San Francisco. He also stressed that communication apps remain crucial: “Communication tech/apps including Internet are the foundation for everything else we’ll do for 100 years.”
Still, Mims has a point. Despite Andreessen’s enthusiasm, the sort of information revolution that the Internet spawned in the 1990s has leveled off. We may have moved to mobile and we may be on social networks, but these aren’t the same kinds of revolutionary shifts that occurred when the population first became networked in the 1990s. And by the numbers, Mims is right: VCs have taken over Silicon Valley, putting the focus on money and financing, seeking a quick buck on advertising. But VCs are not the disease afflicting Silicon Valley—they’re only a symptom. The bigger problem is our government and our culture. And the solution does not lie in the supposed golden days of the 1990s, but in the 1950s and ’60s.
VCs, and consequently the startups they fund, think it’s better for their purposes to advertise today rather than to innovate tomorrow. VC culture has made the numbers more important than the tech. Every smart CEO and CTO I’ve known has viewed VC money as a deal with the devil: In exchange for the money, you commit to constant interference and endless pressure to deliver the goods earlier rather than better. You should only take the money after you’re strong enough to hold your ground—or, more often, when you have no other choice. When I read about most startups in the press, it feels like reading descriptions of new network shows: They regurgitate variations on American Idol or Modern Family, and they get canceled within months if they haven’t generated enough buzz or ratings for advertisers.
The sort of world-improvement projects Mims wants, where they do exist, are coming not from Silicon Valley startups but from established giants who can afford to blow the cash on risky moonshots. You can think of Google or Amazon as HBO, which used the huge success of The Sopranos to take a chance on The Wire—a show any network would have canceled long before it finally caught a wave of hype in its fourth season and people realized it was the best and most uncompromising show around. Now, HBO also gave us garbage like John From Cincinnati and Lucky Louie, but HBO conceives of itself as a joint risk pool that finances projects unlikely to prove self-sufficient on their own. Gmail and Google Maps are The Wire: For years they made only a tiny fraction of Google’s overall revenues, yet they persisted and didn’t devolve into advertising spam because Google could fund them with its own Sopranos money—Web search. (In case you’re wondering, Google Buzz was Lucky Louie and Google Wave was John From Cincinnati. Which makes Google Glass True Detective—everyone talks about it, but no one actually likes it.)
Consequently, Google has a host of ambitious plans to connect the third world, develop wind power, and make self-driving cars. (Full disclosure: I used to work at Google, and my wife still does.) Google can self-finance these projects with the money it makes from advertising. Though less ambitious, Amazon occupies a similar position: It can afford to move into shipping and cloud hosting and delivery drones (or not) because its core business makes it well-nigh indestructible, much to publishers’ dismay. And the Bill & Melinda Gates Foundation qualifies as one of the more aggressive public works initiatives around, even if it is not focused on tech per se.
VCs do not follow the HBO model, and their vision is as blinkered as the networks’. Andreessen is right to criticize the tech press for encouraging the VC mentality: “Founders of non-consumer-tech startups routinely find same pundits mounting criticism have little interest in hearing about other domains,” he tweeted. Companies are profiled based on how much money they’ve raised, which billionaires are interested in them, how big they are, and, of course, their revenue. It’s much easier to report these numbers than to write up the possibilities and drawbacks of new ideas. Mims cites the drug-simulation company Cellworks as one example; Hampton Creek Foods, which is trying to replace the egg, is another. But these companies lack the buzz that comes out of San Francisco tech parties and sure-thing advertising optimizers. Crazy projects get attention only when they’re done by Google or Amazon.
It seems strange and stifling that the biggest innovations seem to be emerging from established giants instead of visionary startups. What happened to all the startup innovation of the 1990s? The first wave of startups in Silicon Valley, from Netscape to Yahoo to Hotmail to YouTube to Google, didn’t create the Internet and Web—they just commercialized it. They were standing on the shoulders of giants: universities, research labs, and the U.S. government, particularly the Defense Department.
Prior to the burst of consumer-oriented Internet startups in the mid- to late ’90s, the tech industry was much lower-profile and more tightly coupled to academia and government. A Cold War–era collaboration among government, academia, and industry designed and built the Internet through a long and often tedious process of meeting in rooms and debating standards. The pioneering Web browser Mosaic was developed at the federally funded National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign before being spun off into Netscape. This was the basic research, and it was over by the mid-1990s, before modern-day Silicon Valley had really begun. Atop that solid infrastructure, startups colonized and commercialized the wide-open Web and its unexploited resources. But the gold was already there.
In contrast, true startup companies like Apple and Microsoft, which lacked those ties to academia and government, innovated only in the consumer sector: They did things faster, cheaper, and more scrappily than IBM, DEC, HP, Intel, and research institutions such as Bell Labs and Xerox PARC. Apple’s main innovation—the graphical user interface of the Macintosh—was largely drawn from basic research done at Xerox PARC. Apple didn’t invent it; it designed hip hardware for it—an achievement, to be sure, but nothing that will network the world or solve world hunger. Silicon Valley never did make those sorts of innovations.
But we have not seen innovative technologies emerging from the academic research sector in recent years, either, for two reasons. First, there is a brain drain: Many of the top minds in computer science have headed to industry for cushier and often more interesting jobs at Google or Microsoft or wherever. Second, the U.S. has ceased to make the investment in education and research that it did during the Cold War. The groundwork for innovation has eroded. In the 1950s and 1960s, all sorts of crazy ideas could get funded by the Defense Department with few strings attached; now, researchers at universities and government contractors like Raytheon spend a huge percentage of their time filling out grant applications, begging for a piece of a shrinking government pie.
Look at the big picture, and it is no wonder that the conditions for fundamental innovation have cratered. The U.S. economy has been near-stagnant for years. Neither political party is advocating for wide-scale general research with uncertain applications. Our government is practically nonfunctional, wasting billions on shutdowns and debt-ceiling fights, while it can’t even build a simple health care website without an organizational crisis. Andreessen complains about the tech press, but consider the top-read business daily in America: On the same day that Mims’ piece ran, the Wall Street Journal carried an op-ed, “The Case for Crony Capitalism,” urging deregulation as the solution to all ills, while another celebrated cutting unemployment benefits. The front page is concerned with stock prices and Obamacare’s failings. With this dreary thinking the norm across government and media, blaming Silicon Valley VCs for not being visionaries is like blaming termites for infesting the rotten wood of your unmaintained house. They’re just doing their job.
As Mims suggests, the sort of widespread research and investment that brought us the Internet and the Web stands the greatest chance of restoring some of America’s reputation for technological innovation. But a look at the history shows that VCs are not the problem, nor are startups the solution. Those who forget the past are condemned to be unable to repeat it.