Why Technology Changes Slowly At First… And Then Very Fast



When Peter Drucker first met IBM’s visionary CEO, Thomas J. Watson, he was somewhat taken aback. “He began talking about something called data processing,” Drucker recalled, “and it made absolutely no sense to me. I took it back and told my editor, and he said that Watson was a nut, and threw the interview away.”

That was back in the early 1930s, when “computers” were usually teams of women who performed rote calculations. The idea that data could be a valuable commodity just wasn’t on anyone’s radar yet and, in fact, wouldn’t be for decades. That would take not only advances in technology, but also a transformation in business practices.

There were two major eras of innovation in the 20th century. The first hit its stride in the 1920s and the second, which led to the digital revolution, had its biggest impact in the 1990s. We’re now on the brink of a new era of innovation, and its impact will likely be profound. Yet much like Drucker back in the 1930s, we are still unable to fully grasp what is to come.

The First Wave – Internal Combustion And Electricity

The first era of 20th century innovation actually began in the 1880s, with the invention of the internal combustion engine in Germany and Thomas Edison’s opening of his Pearl Street Station, America’s first central power plant. These were, in a sense, mere curiosities, akin to tech gadgets today that gain a following among a devoted group of early adopters.

Over the next few decades, things began to gain steam. Hundreds of automobile companies sprang up, including Henry Ford’s first failed ventures and his ultimately successful Ford Motor Company, which pioneered the assembly line. The “war of the currents” broke out between Edison and Westinghouse, a rivalry that expanded electrical generation and drove costs down.

Still, until the 1920s, the impact on society was minimal. Cars needed infrastructure, like roads and gas stations, to be built. Electricity provided cleaner and better light, but factories needed to be redesigned, and work itself had to be reimagined, before it could begin to have a measurable impact on productivity.

After that, things moved quickly. The automobile transformed logistics, shifting factories from the urban north to the rural south, while corner stores were replaced by supermarkets and, eventually, shopping malls and big-box retailers. Electrical appliances, such as air conditioners, refrigerators and radios, transformed everyday life. Nothing was ever the same again.

The Second Wave – The Microbe, The Atom And The Bit

The second wave of innovation began around the 1950s, but had its roots long before that. Alexander Fleming discovered penicillin in 1928, Einstein’s theories led physicists to develop the principles of quantum mechanics in the 1920s, and David Hilbert’s mathematical program inspired Turing’s model of a universal computer in 1936.

Yet much like internal combustion and electricity, the implications of these discoveries weren’t clear at first. Fleming’s penicillin was not initially useful as a therapy and needed much further work before it became commercially available in 1945. Quantum mechanics and Turing’s “machine” were little more than theoretical constructs.

Then things began to accelerate. The first commercial computer, UNIVAC, burst into the public consciousness during the 1952 election, when its predictions outperformed those of human experts. That same decade saw the first nuclear power plants and the rise of nuclear medicine. Further research into antibiotics led to a “golden age” in the 1960s and 70s.

Today, these earlier revolutions have largely run their course. The standard model of physics has been largely complete since the 1960s. No new classes of antibiotics have been discovered for decades. Moore’s law, the regular doubling of transistors on a chip that long drove classical computing power, has slowed considerably and is nearing its end.

The New Era Of Innovation – Genomics, Nanotechnology And Robotics

Today we are entering a new era of innovation and, like those earlier eras, we can’t be sure what to make of it yet. We are, in a very real sense, like people a century ago who might enjoy electric light or a Sunday drive, but have no notion of things like modern retail, household appliances or the social revolutions they would set in motion, such as women entering the workplace.

The key technologies of this new era, as best I can tell, will be genomics, nanotechnology and robotics, which will revolutionize how we cure disease, make new products and power our economy. It’s even tougher to predict what the implications will be, but early indications are that they will be just as profound as those of the two earlier eras.

Much like the digital age was built on top of electricity, the new era of innovation will be built on top of computing. New computer chips specialized for artificial intelligence, as well as completely new architectures such as neuromorphic and quantum computing, will power how we engineer genes, proteins and other organic compounds, and design materials at the atomic and molecular level. Yet how exactly that will take place is far from clear.

That leaves us in something of a technological limbo. Productivity growth has slowed considerably in what some are calling the Great Stagnation. These new technologies offer the promise of a better future, but we cannot be sure how much better. The first era of innovation led to a 50-year boom in productivity between 1920 and 1970. The second resulted in only 10 years of measurable productivity gains, between 1995 and 2005.

What Will The Future Hold?

To understand why the future can be so murky, let’s look at quantum computing, which has the potential to be thousands, if not millions, of times more powerful than today’s computers. However, far more important than the ability to do old jobs faster is the potential to do new jobs that we could never have dreamed of before.

In the case of quantum computing, that job is to simulate quantum systems, like atoms and molecules, which can help us transform fields like drug development, materials science and manufacturing. Unfortunately, scientists don’t yet really know what to do with the data a quantum computer produces, because no one has seen anything like it before.
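To get a rough feel for why classical machines hit a wall here, consider a back-of-the-envelope calculation: representing an n-qubit quantum state exactly on a conventional computer means storing 2^n complex amplitudes, so the memory required explodes exponentially. The short Python sketch below is purely illustrative (it assumes brute-force state-vector simulation with double-precision amplitudes at 16 bytes each; real simulators use cleverer techniques, but the exponential wall remains):

```python
# Illustrative sketch: memory a brute-force classical simulation would
# need to hold a full n-qubit quantum state vector. Assumes one
# double-precision complex amplitude (2 x 8 bytes) per basis state.

BYTES_PER_AMPLITUDE = 16  # one complex number at double precision

def state_vector_bytes(n_qubits: int) -> int:
    """Bytes needed to store all 2**n amplitudes of an n-qubit state."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (20, 30, 40, 50):
    gib = state_vector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.2f} GiB")

# 20 qubits ->            0.02 GiB  -- a laptop handles this easily
# 30 qubits ->           16.00 GiB  -- a beefy workstation
# 40 qubits ->       16,384.00 GiB  -- a large cluster
# 50 qubits ->   16,777,216.00 GiB  -- beyond any machine on Earth
```

A quantum computer manipulates that state natively rather than storing it amplitude by amplitude, which is where the “thousands, if not millions, of times” comparisons come from.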

In time, scientists will learn to make sense of that data and produce new insights. Those, in turn, will allow engineers to design new products and entrepreneurs to create new business models. What will these look like? It’s too far down the causal chain for anyone to venture anything more than a wild guess, but the potential is truly staggering.

The truth is that the next big thing always starts out looking like nothing at all. Things that truly change the world always arrive out of context for the simple reason that the world hasn’t changed yet. They need to build up ecosystems around them and identify meaningful problems to solve. That takes time.

In the meantime we are mostly left to watch and wonder. Even those actively involved in creating this new future only see a small part of it. But what we can do is be open to it and connect to it. Peter Drucker may have thought Thomas J. Watson was a bit of a nut, but he kept talking to him. Today, both are considered visionaries.


