Why the Future Is Brighter Than You Think



There’s been a boomlet this year in books and articles suggesting that innovation in recent decades has slowed to a trickle and economic productivity is flattening out for the foreseeable future. Peter Thiel has been pushing this meme for a while, Tyler Cowen made a splash in January with his e-book, The Great Stagnation, and Neal Stephenson nearly took the World Policy Institute offline last week with his essay, “Innovation Starvation.” Talking about our innovation drought is suddenly all the rage. But is it really true? Or is it mostly just a product of discouragement born of several years of lousy economic performance?

Honestly, I’m not sure. But maybe it’s worth thinking out loud about this a little. The complaints mostly take two basic forms. The first I call “Where’s my jetpack?!?” and it’s pretty easily disposed of. The argument here is that back in the 1950s we thought the future would bring us flying cars, electricity too cheap to meter, and vacations on the moon. But none of that has happened. What gives?

The answer is prosaic: Forecasters in the ’50s were wrong. It’s not that the future never arrived—it’s that the future brought us different stuff than we thought we were going to get. Our lack of flying cars simply doesn’t tell us anything about the pace of innovation.

The second form of the innovation argument is more substantive. I call it the Great-Grandma Argument, and it compares innovation in the first half of the 20th century to innovation since then. Our Great-Grandma from 1900, we’re told, would be totally flabbergasted if she were whisked to the year 1950. So much new stuff! But our mothers and fathers from 1950? If they were magically transported to 2011, they’d recognize almost everything they saw. Yawn.

There’s obviously something to this. The end of the 19th century and the first half of the 20th made up an astonishingly fertile period: lightbulbs, radios, autos, airplanes, refrigerators, penicillin, TVs, air conditioners, the telephone, and much more. The period since then has seen the digital computer and… that’s about it. Things like cell phones and flat-screen TVs are mere technological improvements, not genuinely new inventions.

Which is true enough. But although I’ve often thought about innovation this way too, the more I’ve chewed it over the more I’ve decided that it misses something. Most of the best known inventions of the early 20th century were actually offshoots of two really big inventions: electrification and the internal combustion engine. By contrast, the late 20th century had one really big invention: digital computers. Obviously two is more than one, but still, looked at that way, the difference between the two periods becomes a bit more modest. The difference between the offshoots of those big inventions is probably more modest than we think too. Just as we once made better and better use of electrification, we’re now making better and better use of digital computing. And to call all these computing-inspired inventions mere “improvements” is like calling TV a mere improvement of radio. These are bigger deals than we often think. We have computers themselves, of course, plus smartphones, the internet, CAT scans, vastly improved supply chain management, fast gene sequencing, GPS, Lasik surgery, e-readers, ATMs and debit cards, video games, and much more.

Wait a second. Video games? Am I joking? No indeed. Give some thought to just what innovation and productivity gains are for.

Initially, of course, they help provide a better basic standard of living. But what happens after that? Once you have a certain level of food, shelter, sanitation, and so forth, you start adding nonessentials. Basically, luxuries, whether you call them that or not. Entertainment. Vacations. Restaurant meals. Fancier clothes, faster cars, and bigger houses.

That’s what the first half of the 20th century brought to the developed economies of the world. Can computers and their offspring make the same claim? I think they can, which is why video games are no joke. If, instead of bigger cars and better vacations, we get video games, Facebook, blogging, Hulu, and iTunes, is this any less of a productivity improvement?

I don’t see why. Above a basic level, the whole point of productivity improvements is to provide us with more fun. Facebook may show up as a smaller contribution to GDP than a nationwide chain of movie theaters, but so what? If you’d rather spend four hours a week on Facebook than four hours a week going to movies, then Facebook has improved your life as much as movie theaters improved your grandparents’. If you prefer Farmville to a week in Hawaii, then Zynga has improved your life as much as the 707 improved your parents’.

But that’s not all. There’s something else that frequently warps our view of innovation too: exaggerating the impact of flashy improvements and discounting the importance of boring ones. In “Innovation Starvation,” for example, Stephenson, who’s the same age as me, looks back nostalgically at the Apollo program and then says:

In early 2011, I participated in a conference called Future Tense, where I lamented the decline of the manned space program, then pivoted to energy, indicating that the real issue isn’t about rockets. It’s our far broader inability as a society to execute on the big stuff.

Maybe. But beware of rose-colored hindsight. In current dollars, the Apollo program cost less than $20 billion per year. The atomic bomb program cost less than $4 billion per year. Both of these were dramatic accomplishments, to be sure, but although they might have seemed like “big stuff,” they actually weren’t—neither by cost standards nor by productivity enhancement standards. Compared to replacing our fossil fuel infrastructure, they were trifles.

The flip side of this is that it’s all too easy to overlook backroom process improvements. Looking at the first half of the 20th century, cars and radios and TV get all the attention, but the moving assembly line was probably more important than any of them. In the second half, Facebook and smartphones are the attention-getters, but the containerization revolution was far more important than either one. Likewise, Walmart revolutionized the retail industry in the ’90s via its logistics and supply chain innovations, but hardly one person in a hundred knows it. You could put the recent revolution in global finance in this category as well (though we obviously still have a few wee wrinkles to iron out of that one). Computerization may be changing our daily lives, but it’s arguably changed backroom operations even more, and will continue to do so.

I’ve already gone on way too long, so I’ll wrap it up here. Just keep in mind three things when you read about innovation droughts. First: The key to innovation is the exploitation of really big inventions. Computerization is as big as it gets, and it has a much longer tail than electrification. We’re not even close to mining its full potential yet. Second: Above a certain level, the goal of productivity gains is to provide us with more fun. It doesn’t matter whether that fun comes in physical or virtual form, or how it shows up in national accounts. Third: Don’t exaggerate past innovations just because they were exciting or dramatic, and don’t discount current innovations just because they’ve happened behind the scenes or seem sort of prosaic. Hip replacements may not be as big a mobility improvement as the automobile, but they’re a bigger deal than you think—as you’ll realize someday if you have to get one because you can’t walk more than a hundred feet at a stretch with your original equipment.

We’re going through a tough stretch right now. But my best guess is that there are two big culprits here, and neither one of them is a fundamental slowdown in innovation. The first is that, even after 30 years, we still haven’t figured out how to effectively manage and regulate the post-union, post-globalization, post-Bretton Woods economy. This is a relatively short-term kind of problem, but there are still a lot of bumps left on that road. The other is that we’re simply working on some really hard problems—much harder than we anticipated when we first dived into them. Artificial intelligence is really hard. Finding a source of energy that’s cheaper per BTU than oil is really hard. Gene sequencing—along with a deep understanding of how human biology works—is really hard. But that doesn’t mean innovation has been snuffed out. It just means we’ve set our sights really high. That’s no bad thing.


Mother Jones was founded as a nonprofit in 1976 because we knew corporations and billionaires wouldn't fund the type of hard-hitting journalism we set out to do.

Today, reader support makes up about two-thirds of our budget, allows us to dig deep on stories that matter, and lets us keep our reporting free for everyone. If you value what you get from Mother Jones, please join us with a tax-deductible donation today so we can keep on doing the type of journalism 2023 demands.

