“We wanted flying cars, instead we got 140 characters.” – Peter Thiel
This may seem like an odd time to question the ingenuity of the technology sector. With news of artificial intelligence (A.I.), automation and machine-learning swirling around, the technology space looks like a bright spot in a dull economy. Mega venture funds and frothy startup valuations seem to bear out this optimism.
Yet, fifty years ago we imagined a very different future. Stanley Kubrick’s classic 2001: A Space Odyssey, released during the heyday of the space program, assumed that by the early part of this millennium, space travel would be routine; machines would be terrifyingly intelligent; and computers and humans would hold intelligent conversations.
By that yardstick, the present is decidedly underwhelming. Our phones have become smarter, interfaces slicker, and communication faster. But other predictions haven’t come to pass.
Most space programs are limited to unmanned expeditions. After the lull of the past few decades, space travel is in the news again. But even Elon Musk, the eternally confident founder of SpaceX, expects to send a manned expedition to the moon only in 2023, the first lunar journey by humans since 1972.
Driverless vehicles are not ready to replace humans yet. The vagaries of human traffic are just too much for these ordered systems. Robots, which are used extensively in manufacturing and distribution, haven’t been able to adapt to routine human tasks. Automated assistants like Siri are great for one-off tasks, but it’s near impossible to hold a conversation with them, especially if you speak with a thick accent.
Has the tech sector fallen short or were our collective expectations unrealistic?
Dog walking, not cars flying
There are around 290 unicorns, or unlisted startups valued at more than $1 billion, across the world. In theory, these unicorns represent the best of our ideas. Investors seem to agree, ploughing over $980 billion into these companies.
But an analysis of these companies shows that startups working on truly innovative problems constitute less than a tenth of the total.
Most of the funds (and noise) are soaked up by startups engaged in one of two things. The first, on-demand and e-commerce platforms, find new ways to order things without leaving your couch. The other class of startups, social media and entertainment firms, are focused on designing content that keeps you on that couch for as long as possible.
Take on-demand firms. Uber has improved the taxi-hailing experience; WeWork has made renting office space easy; and with Swiggy, it’s nice not to have to walk to the restaurant for food. But we are still riding in the same taxi, working in the same office, and eating from the same restaurant as before. The order-to-fulfilment process has been streamlined and the supply more closely matches demand. But this isn’t innovation that has fundamentally altered how we live, work, or play. It’s just made it incrementally better.
The desire to disrupt any activity with on-demand or online substitutes can sometimes veer into the absurd. Early last year, SoftBank’s Vision Fund invested $300 million into Wag, a startup that lets you book a dog walker. (The Vision Fund’s stated goal is to “invest in businesses and foundational platforms that will enable the next age of innovation”.)
With social media companies, the innovation argument is even weaker. Facebook and Google evangelised the cult of connectivity. But after the initial productivity gains from improved communication, watching YouTube videos, sharing Google photos or browsing Facebook are not growth-accelerating. On the contrary, the fatigue induced by aimless scrolling on Twitter or WhatsApp is a drain on productivity.
For an industry that fetishises disruption and innovation, this lack of creativity is disappointing. One reason for this state of affairs could be financial. The computer scientist Jaron Lanier has argued that the monetisation model driving the internet has created perverse incentives for tech companies.
From the early days of the internet, we assumed that most of the content and services should be free. But the desire to democratise access to information came at a cost—the advertising-supported business model. The largest tech companies today—Facebook and Google—rely on advertising for a majority of their revenue. Even Amazon has now got into the game, by selling pixels on its site to eager sellers.
In the push to generate more advertising revenue, these companies need to continually find ways to keep users addicted to (or, more charitably, engaged with) their platforms. And as users have become savvier, the algorithms have become even more intricate.
Today, some of the largest employers of A.I. and machine-learning talent are social media platforms. ByteDance, the creator of TikTok and the most valuable startup on the planet, employs legions of A.I. and machine-learning engineers. TikTok does a great job of curating and customising content, based on what’s most likely to appeal to a user. But peel away the glamour and the product is still a social network for amateur music videos.
A generation of our best engineers is spending its talents building behavioural nudges that manipulate users into staying on these platforms. Moving to a model where users pay for what they use could help us focus our efforts on ideas that meaningfully improve society.
Out of ideas?
But the innovation deficit could be more systemic. The economist Robert Gordon, a vocal proponent of this view, has argued that the golden age of technological innovation is behind us. To support his thesis, Gordon points out that before the 1750s, there was virtually no economic growth. Inventions like the steam engine, railroads and the telephone drove economic growth over the past 250 years. This growth spurt was an anomaly, and there is no reason to assume we’ll continue to find new ways to grow.
His analysis suggests that most of the technological changes that improved our quality of life had already occurred by the first half of the twentieth century. These include the internal combustion engine, electricity, air travel, indoor plumbing and kitchen appliances. We have just been incrementally improving them ever since.
The Boeing 707 was introduced in 1958. Since then, air travel hasn’t gotten much faster. The kitchen in an average Western home largely resembles the kitchen of the 1950s, albeit with more energy-efficient appliances. Using productivity data for the past few decades, Gordon concludes that the major breakthroughs of recent times—personal computing and the internet—did improve our lives by automating tedious manual tasks. But the gains aren’t nearly as dramatic as those earlier breakthroughs.
In particular, since the 2000s, most of the technological innovation has centered on entertainment and communication devices that are smaller and smarter. While our lives are certainly a lot richer, improving consumer utility isn’t on the same scale as inventing electricity. Finally, the growth of developing economies like India and China isn’t due to new innovation; these countries are merely catching-up and adopting technologies that have already been invented.
The “end of innovation” thesis is appealing, but it doesn’t explain why markers for innovation show an upward trend. The number of patents filed has continued to rise in the past few decades, and adoption time for newer technologies has gotten shorter.
Perhaps, rather than having run out of ideas, we are finding that ideas are just getting harder to find. As another economist, Tyler Cowen, suggests, we have spent the past few centuries plucking all the low-hanging fruit. Innovation from here on out will get harder.
There is good evidence to support this view. Nicholas Bloom, an economist at Stanford University, and his colleagues studied research productivity across a variety of sectors. Their findings suggest that the cost of finding new ideas has increased, both financially and in the number of researchers needed. In aggregate, they compute that research productivity falls by half every 13 years. To maintain constant growth in per capita GDP, the US must double its research effort every 13 years.
Simply put, coming up with new ideas is getting more expensive.
Moore’s law highlights this tension well. A central tenet of the computer industry for the past fifty years, the law states that the number of transistors on a chip doubles roughly every two years, even as costs fall. Despite predictions of its impending end, the trend has continued to hold—making computers faster and cheaper. But it comes at a cost. The number of researchers required to double chip density today is more than 18 times larger than the number required in the early 1970s.
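The arithmetic behind these figures can be sketched in a few lines. This is an illustration using only the numbers quoted above (the 13-year halving rate and the rough 45-year span from the early 1970s to today); the function name and the exact span are illustrative, not from Bloom’s paper.

```python
# Idea output = research productivity x research effort.
# If productivity halves every 13 years, effort must double every
# 13 years just to keep idea output constant.

HALVING_YEARS = 13

def required_effort(years: float) -> float:
    """Multiple of today's research effort needed after `years`
    to hold idea output constant."""
    return 2 ** (years / HALVING_YEARS)

# Over one halving period, effort must double.
print(required_effort(13))            # 2.0

# Over roughly 45 years, the rule implies about an 11-fold increase --
# the same order of magnitude as the 18x growth in semiconductor
# researcher head count cited above.
print(round(required_effort(45), 1))  # 11.0
```

The gap between the implied 11x and the observed 18x simply reflects that chip research productivity has fallen somewhat faster than the economy-wide average.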
The lack of any obvious white spaces is also evident in the startup world. Growing a business is getting more expensive and funding rounds are trending upwards. In 2008, less than 4% of series C rounds were over $50 million. By 2017, close to 20% of all series C rounds were above that number. Almost a third of all series D rounds in 2017 were above $50 million, up from just 4% in 2008. Startup founders are also skewing older, signalling that experience and specialization are needed to solve the harder problems (Chart 2).
The rise of mega funds, of which the Vision Fund is the most visible example, is another indication that investors have fewer ideas on which to place their bets. In a marketplace with limited ideas, the few startups that achieve scale can command inflated valuations.
Bye bye jet packs?
The futurist Roy Amara observed that we tend to overestimate the effect of a technology in the short run and underestimate it in the long run. The first industrial revolution gave us steam engines and railroads. The second industrial revolution was built on the commercialisation of electricity and the internal combustion engine. In both cases, it took 50-60 years for their effects to work through the system.
Perhaps something similar is at play. We are in the lull phase before the next wave. History suggests that a series of incremental innovations ultimately leads to major advances. With A.I., big data and automation, we are perhaps building the plumbing for tomorrow’s innovation. Extrapolating from today’s technology may limit our sense of what is possible.
But in the interim, we have to be content with the little things. As one commentator put it memorably, “SF [San Francisco] tech culture is focused on solving one problem: What is my mother no longer doing for me?”.
Mint short story
The past two decades have seen tech firms burst on the economic landscape, turning ‘innovative’ ideas into businesses. With new-age tech such as AI and ML at their disposal, they have garnered skyrocketing valuations as well.
The ‘innovation’ achieved in recent years appears modest compared with the giant leaps enterprises took some 200 years ago that fundamentally changed the way we live, travel, communicate and eat today.
We are probably in the lull phase before the next wave. History shows that a series of incremental innovations ultimately leads to major advances. Perhaps something similar is at play in the world of tech and innovation.
Shailesh Chitnis is head of product at Compile Inc., a data intelligence company