
Building Taller Ladders

Technology and science reinforce each other to take the global economy ever higher

Joel Mokyr

In recent years, many economists have questioned whether technological progress can keep propelling the economy forward in the face of declining population growth and rising dependency ratios (Gordon 2016). According to those in this camp, the low-hanging fruit have mostly been picked, and further advances will become increasingly difficult (Bloom and others 2017).

Others would counter that science allows us to build taller and taller ladders to reach ever-higher-hanging fruit. Based on rapidly improving scientific insights, technological breakthroughs still have the potential to change life in the foreseeable future as much as they did in the century and a half since the US Civil War, proponents of this view contend.

Why is it plausible that scientific progress will continue to advance? Technological progress does not just affect productivity directly; it also pulls itself up by the bootstraps by giving science more powerful tools to work with. Humans have limited ability to make highly accurate measurements, to observe extremely small objects, to overcome optical and other sensory illusions, and to process complex calculations quickly. Technology consists in part in helping us overcome the limitations that evolution has placed on us and learn of natural phenomena we were not meant to see or hear—what Derek Price (1984) has called “artificial revelation.” Much of the 17th century scientific revolution was made possible by better instruments and tools, as exemplified by Galileo’s telescope and Hooke’s microscope.

Scientific progress in the modern age was similarly dependent on the tools at the disposal of researchers. A combination of improved microscopy and better lab techniques made possible the discovery of the germ theory, arguably one of the greatest medical advances of all time. In the 20th century, the number of examples demonstrating the impact of better instruments and scientific techniques multiplied. One of the greatest heroes of modern science is X-ray crystallography. The technique has been instrumental in discovering the structure and function of many biological molecules, including vitamins, drugs, and proteins. Its most famous application was no doubt the discovery of the structure of the DNA molecule, but it has also been central to 29 other Nobel-Prize-winning projects.

Of the traditional tools in use in our age, the microscope is still one of the most prominent, as it is basic to the ubiquitous tendency toward miniaturization—that is, toward understanding and manipulating the world at smaller and smaller scales. Scanning tunneling microscopes, invented in the early 1980s, opened up research at the nanoscopic level. The more recent super-resolved fluorescence microscope, whose developers Betzig and Hell were awarded the Nobel Prize in Chemistry, is to Leeuwenhoek’s microscope what a thermonuclear device is to a firecracker. The same can be said for telescopy, where the revolutionary Hubble Space Telescope is soon to be replaced by the much more advanced James Webb Space Telescope.

Two powerful scientific tools that have only recently become available and that represent complete breaks with the past are fast computing (including practically unlimited data storage and search techniques) and laser technology. Both, of course, have found innumerable direct applications in the production of capital and consumer goods. The impact of computers on science has gone far beyond analyzing large-scale databases and standard statistical analysis: a new era of data science has arrived, in which models are replaced by powerful mega-data-crunching machines. Powerful computers employ machine-learning algorithms to detect patterns that human minds could not have dreamed up. Rather than working from models, these computers detect regularities and correlations directly, even if they are “so twisty that the human brain can neither recall nor predict them” (Weinberger 2017, 12).

But computers can do more than crunch data: they also simulate, and by so doing they can approximate the solutions of fiendishly complex equations, allowing scientists to study hitherto poorly understood physiological and physical processes, design new materials, and simulate mathematical models of natural processes that have so far defied attempts at closed-form solution. Such simulations have spawned entirely new “computational” fields of research, in which simulation and large-scale data processing are strongly complementary in areas of high complexity. Scientists have long dreamed of such a tool, but only in the most recent decade has computing reached a level that will inevitably affect our technological capabilities, and hence productivity and, presumably, economic welfare.

With the advent of quantum computing, computational power in many of these areas may increase by a substantial factor. By the same token, artificial intelligence, while still the source of much concern that it will replace educated knowledge workers and not just routinized jobs, could become the world’s most effective research assistant, even if it will never become the world’s best researcher (Economist 2016, 14).

Laser technology is an equally revolutionary scientific tool; when the first lasers were developed, their inventors reportedly regarded them as a technique “in search of an application.” But by the 1980s, lasers were already being used to cool micro samples to extraordinarily low temperatures, leading to significant advances in physics. Nowadays, the deployment of lasers in science has a dazzling range. One of the most important applications is laser-induced breakdown spectroscopy, an astonishingly versatile tool used in a wide range of fields that require quick chemical analysis at the atomic level without sample preparation. Lidar (light detection and ranging) is a laser-based surveying technique that creates highly detailed three-dimensional images used in geology, seismology, remote sensing, and atmospheric physics; it recently helped radically revise upward our estimates of the size and sophistication of pre-Columbian Maya civilization in Guatemala. Lasers are also a mechanical tool: laser ablation can remove material from virtually any type of solid sample for analysis, with no sample-size requirements and no sample preparation procedures. And laser interferometers have been used to detect the gravitational waves Einstein postulated, one of the most sought-after discoveries of modern physics.

Century of biology

Yet there is far more. As Freeman Dyson has remarked, if the 20th century was the century of physics, the 21st century will be the century of biology. Recent developments in molecular biology and genetics imply revolutionary changes in humans’ ability to manipulate other living beings. One that stands out is the decline in the cost of sequencing genomes, at a rate that makes Moore’s Law look sluggish by comparison: the sequencing cost has declined from $95 million per genome in 2001 to about $1,250 in 2015.
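A rough back-of-the-envelope check (assuming a steady exponential decline between the two cost figures just cited, from 2001 to 2015) makes the comparison with Moore’s Law concrete:

\[
\frac{\$95{,}000{,}000}{\$1{,}250} \approx 76{,}000\text{-fold over 14 years}
\quad\Longrightarrow\quad
76{,}000^{1/14} \approx 2.2\text{-fold per year},
\]

whereas a Moore’s Law halving of cost every two years would yield only a \(2^{14/2} = 128\)-fold reduction over the same period.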

Especially promising is the ability to edit individual base pairs in a genetic sequence, thanks to recent improvements in CRISPR-Cas9 techniques. Another is synthetic biology, which allows for the manufacture of organic products without the intermediation of living organisms. The idea of cell-free production of proteins has been around for about a decade, but only recently has its full potential become known to the public, even if its realization is still years away.

Symbiotic relationship

Ecclesiastes notwithstanding, there is much under the sun that is entirely new. If the history of the first two industrial revolutions was dominated by energy, the future may well witness truly radical progress in the evolution of new materials. Naming an economic epoch after its dominant raw material (“the Bronze Age”) is a time-honored habit among historians. Many technological ideas in the past could not be realized because the materials that inventors had available were simply not adequate to make their designs a reality. But recent science-driven advances in materials science allow scientists to design new synthetics that nature never had in mind. Such artificial materials, engineered at the nanotechnological level, promise custom-ordered properties in terms of hardness, resilience, elasticity, and so on. New resins, advanced ceramics, new solids, and carbon nanotubes are all in the process of development or perfection.

Artificial intelligence, lasers, and genetic engineering seem to qualify as general-purpose technologies (GPTs), with applications across a wide spectrum of uses in production and research. It is widely agreed that GPTs—such as machine learning—usually take time to affect the economy fully, because by definition they require complementary innovations and investments. But they promise transformative changes in the human condition across many dimensions.

None of those technological predictions can be made with any certainty, and it is inevitable that some advances will be made that no one is forecasting, while other promising advances will disappoint. But the case that technological progress will continue to advance at breakneck speed does not depend on one area of technology or another. It is based on the observation that technology and science coevolve in a symbiotic manner by giving scientific researchers vastly more powerful tools to work with. Some of those tools have been known in more primitive form for centuries; others are radical innovations that have no clear-cut precursors.

Much as the new instruments and tools of the 17th century ushered in the scientific revolution and the age of steam and electricity, the high-powered computers, lasers, and many other tools of our age will lead to technological advances that cannot be imagined today any more than Galileo could foresee the locomotive.

JOEL MOKYR is the Robert H. Strotz Professor of Economics at Northwestern University.

This article is based on the paper “The Past and the Future of Innovation: Some Lessons from Economic History,” forthcoming in Explorations in Economic History.

Opinions expressed in articles and other materials are those of the authors; they do not necessarily reflect IMF policy.
