Big innovations require big investment

Part of a weekly series on the economic choices facing the United States and its relations with the rest of the world.
 
Of all the purposes of government, one of the most important but most often neglected is to mobilize science and technology to solve critical challenges. Modern society depends on highly complex technological systems for its safety and prosperity. Without these advanced systems, we would have no chance of sustaining national prosperity, much less of meeting the basic material needs of a global population of 7.4 billion people. Yet managing and improving those technologies requires a large and sustained investment by government alongside business and academia.

The key idea here is “directed technological change,” meaning that scientists and engineers are pulled together to solve a complex challenge in the national interest. The challenge is not only important and solvable but also, by its nature, not one that the private sector alone will solve in a timely way on a for-profit basis.

Modern American history is replete with such endeavors. No doubt the most famous and consequential of all was the Manhattan Project during World War II. It is a stunning example of directed scientific and technological effort, showing how the most complex and cutting-edge scientific challenge can be met through targeted investments.

In 1938, on the eve of World War II, European physicists discovered the principle of nuclear fission (the splitting of an atom by a neutron, releasing enormous energy and more neutrons) and the possibility of a nuclear chain reaction. Within a year, physicists realized that this could lead to a new kind of atomic bomb, and that Nazi Germany might get there first. Albert Einstein and Leo Szilard wrote a world-changing letter to President Franklin Roosevelt advising him of this risk and urging a US effort to develop such a weapon before Germany did.

The Manhattan Project got underway intensively in 1942 and culminated in the atomic bomb in 1945. The project engaged many of the world’s leading physicists in the effort and led to countless scientific and technological breakthroughs in a three-year period. Thus was born the nuclear era.

The mobilization of great minds, national laboratories, and private companies in pursuit of well-defined objectives is therefore not a quixotic quest. The lessons of the Manhattan Project were taken up after World War II in many important areas of national security, public health, new technologies, and general science. From the birth of modern computing, to the polio vaccine and the space age, to the human genome and the Internet, directed technological change has repeatedly shaped and advanced the US economy and the modern world.

These efforts are all characterized by highly complex challenges; a sense of national urgency; and a mix of academic, philanthropic, commercial, and government organizations and financing.

Consider Jonas Salk’s polio vaccine, developed more than half a century ago. The main financing came through a nonprofit organization (popularly known as the March of Dimes) launched by FDR in 1938. Salk led a scientific team at the University of Pittsburgh for seven years, from 1947 to 1954, to develop the vaccine. In 1955, the vaccine was tested and massively disseminated in an unprecedented public health campaign that engaged governments at all levels. Salk’s vaccine, followed by Albert Sabin’s vaccine a few years later, ended the US epidemic. Polio is now on the verge of global eradication. When asked who owned the patent on the polio vaccine, Salk famously replied, “There is no patent. Could you patent the sun?”

As with the Manhattan Project, the US space effort was launched as a national security effort, part of the Cold War competition with the Soviet Union. After the Soviet Union successfully launched the Sputnik satellite, in 1957, the US ramped up its own efforts. In May 1961, President John F. Kennedy inspired the nation with his call for America to commit itself to “achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to Earth.” The United States thereafter spent around 0.5 percent of GDP each year for the balance of the decade to support NASA’s successful moon effort, mobilizing an estimated 500,000 workers and 20,000 companies. The spillovers in computing, semiconductors, aeronautics, telecommunications, materials sciences, and countless other areas of science and technology have of course been profound.

The Internet was similarly born of a US government effort linked to national security. In this case, the Defense Department wanted a network of computers that could support a resilient military command-and-control system, with geographically dispersed access to a few major computer centers around the country. The core building blocks of the Internet were developed as part of the Defense Department’s ARPANET project, which ran from the late 1960s until 1990. Building on ARPANET, the US National Science Foundation during the 1980s established and developed a network linking major US universities. From the 1990s onward, these publicly financed efforts became the foundation of the Internet and the vast commercial business world built upon it.

The list of such targeted technology efforts is long and inspiring. Moore’s Law, the repeated doubling of computer power roughly every two years since the late 1950s, builds on industry and government technology roadmaps. The Human Genome Project, which mapped the human genome between 1990 and 2003, was a breakthrough program that engaged leaders in academic biology, private biotech startups, and major government research centers in the United States and abroad; the continuing spillovers in public health, medicine, agronomy, archeology, anthropology, and many other disciplines remain vast. Even hydraulic fracturing (“fracking”) to produce oil and gas from shale rock depended on the early initiatives of the US Geological Survey.

For these reasons, the frequently heard political complaints about federal funding of early-stage technologies (e.g., the solar company Solyndra or the electric-car maker Tesla) seriously miss the point: publicly supported R&D plays a key role in delivering cutting-edge technologies, a point well documented by economist Mariana Mazzucato in her book “The Entrepreneurial State.” Not every R&D project bears fruit, to be sure; such is the nature of cutting-edge research. Yet the track record of public-private-academic-philanthropic partnerships in advancing science and technology in critical areas is a key pillar of America’s prosperity and technological excellence.

Indeed, one of the threats to America’s well-being is the current insufficiency of such efforts in areas of critical need. In many highly promising and crucially important areas of R&D, purely private, profit-driven efforts based on the incentives of patenting are falling far short of social needs.

Compare, for example, the $30 billion per year directed to biomedical science through the National Institutes of Health (itself too low a budget) with the mere $7 billion per year that the federal government currently spends on research into renewable and other low-carbon energy technologies. The threat of climate change is on the scale of trillions of dollars of damages per year, and the solutions depend on a rapid transition from fossil-fuel-based energy to zero-carbon alternatives.

Consider two promising areas of energy research. The successful ramp-up of renewable energy depends in part on low-cost, highly reliable batteries with higher energy density (energy per unit of weight). Battery technology is a major scientific and technological challenge that requires extensive research, much of it highly sophisticated trial and error. Yet federal spending on battery research is estimated at around $300 million per year, a small fraction of what could be usefully deployed by the nation’s laboratories and universities.

Another case is carbon capture and storage (CCS), the only climate-safe way to keep using fossil fuels in the future. Some CCS technologies are attracting private capital, but much of the underlying science (such as the geological research) is almost entirely a public good that requires public rather than private financing. Worldwide, public financing for CCS-related R&D remains minuscule, clouding the prospects for significant and timely deployment.

Other areas crying out for greater public investments in R&D include: smart grid systems to manage 21st-century infrastructure; fourth-generation nuclear energy; advanced materials sciences for environmental sustainability; the early identification and control of emerging epidemic diseases such as Zika and Ebola viruses; advanced agricultural technologies for crop resilience to climate change; improved nutrition; geriatric medicine (including the soaring costs of Alzheimer’s disease); and improved cyber-security, including for important e-governance functions such as online voting.

America’s chronic underfinancing of discretionary public spending is undermining its technological leadership. Yes, America is home to more of the world’s leading universities than any other country and still has the greatest depth of scientific and engineering capacity, yet persistent under-investment in cutting-edge science and technology puts that capacity at risk.

Measuring R&D as a share of national income, the United States now ranks ninth among the high-income countries of the Organization for Economic Cooperation and Development. US R&D outlays are around 2.7 percent of national income, compared with more than 4.0 percent of GDP in Korea and Israel, and more than 3.0 percent in Denmark, Finland, Germany, Japan, and Norway. In total dollars, China’s R&D spending is currently around three-fourths of US outlays, and on current trends it is very likely to overtake the United States in the coming decade.

As with past grand projects, new efforts should be guided by urgent public needs and by areas where public financing is vital because private financing alone will not suffice. That includes basic science (where patents simply make no sense); challenges where market-based approaches are inappropriate (such as the control of epidemic diseases); goals that depend on very rapid uptake of new technologies, where private patents would clog rather than accelerate deployment (smart power-grid protocols for integrating intermittent renewable energy); and goals that involve major social policies regarding risk and liability (nuclear energy and carbon capture and storage).

In many areas, such as disease control, crop productivity, and zero-carbon energy, much of the effort should be global, with costs and benefits shared across the world. Just as the moonshot eventually gave way to significant global cooperation in space, our global-scale challenges call for international as well as national frameworks for expanded R&D. The Department of Energy under Secretary Ernie Moniz is currently engaging 21 other countries, through the Mission Innovation initiative, to expand R&D on low-carbon energy. In this case, private-sector investors led by Bill Gates are stepping up alongside the government to invest “patient capital” in early-stage technologies.

Here is my recommendation for President-elect Trump and the incoming Congress. Turn to our glorious national scientific institutions, the National Academies of Sciences, Engineering, and Medicine, for a 2017 report to the nation on the most promising areas for directed research and development in the years to 2030. Ask the academies to recommend a strategy for ramping up the national R&D effort. Call on America’s research universities to add their own brainstorming alongside the National Academies. When the report is issued, late in 2017, the president and Congress should meet in a joint session to set forth a new technology vision for the nation and a new R&D strategy to achieve it.

Jeffrey D. Sachs is University Professor and Director of the Center for Sustainable Development at Columbia University, and author of “The Age of Sustainable Development.”
 

http://www.bostonglobe.com/opinion/2016/11/27/big-innovations-require-big-investment/d6nm8c4mVzo2NMSHQDFK7I/story.html