Moonshots are exciting, inspiring. But they are also expensive — and not all of them work out like Apollo 11. So are they worth it? Should we focus time, money and energy on these kinds of ambitious scientific projects?
The answer is yes.
Think of it this way. Imagine that the Apollo moon landing had not succeeded. Would it still have been worth it? Almost certainly. After all, the challenge of getting to the moon and back meant coming up with all kinds of technological innovations and solutions that have then had almost miraculous commercial applications.
For example, Apollo needed small computers for its trip. So NASA says it made a big purchase of a relatively new technology — the integrated circuit, the technology behind what we now know as the microchip.
The companies that pioneered this product — firms such as Fairchild Semiconductor, whose alumni founded Intel — partly had NASA to thank, the agency says, for the popularity of the technology that followed. The U.S. government, by buying the initial products, helped the computer revolution take off.
And it’s not just the microchip. GPS technology, which is now powering the next phase of the information revolution, was originally developed by the U.S. military. But it was only after the 1983 Soviet shooting down of a Korean Air Lines flight that the Reagan administration said it would share the technology so that civilian airplanes would not wander into restricted and dangerous territories. And it was only after the Cold War ended that the Clinton administration opened up the technology fully to commercial applications, which of course unleashed a flood of innovation that continues to this day.
Or consider the mapping of the human genome. According to the Battelle Memorial Institute, the federal government spent $3.8 billion on the massive project from 1990 to 2003, an amount few other entities could ever have afforded. But in leading the way, it encouraged others, and now a person’s DNA can be sequenced for as little as $1,000.
The impact on the economy of human genome sequencing from 1988 to 2010 was estimated by Battelle at almost $800 billion — enough to support more than 310,000 jobs in 2010 alone.
Reading all this, you might think the United States is on the right track. Alas, it is not.
According to figures from the National Science Foundation, federal funding for research and development has barely kept pace with inflation in recent years.
Since big entitlements such as Social Security are mandatory spending programs, it is discretionary spending, such as science, that often bears the brunt of the budget ax.
And this cutting comes at a time when others around the world are moving fast. The United States has dominated the world of basic science for years, even decades. But recently, its share of global research and development has been falling — from 37% of the total in 2001 to 30% in 2011, according to the National Science Foundation.
As scientist Neal Lane points out, China is on course to surpass the United States in the percentage of its gross domestic product it spends on research and development in just a few years.
It used to be that funding basic science was not a partisan issue. As Lane notes, a certain rock-ribbed Republican was a big proponent of basic research. Ronald Reagan said in a 1988 radio address:
“The remarkable thing is that although basic research does not begin with a particular practical goal, when you look at the results over the years, it ends up being one of the most practical things government does. This is why I’ve urged Congress to devote more money to research. It is an indispensable investment in America’s future.”
Americans used to understand that moonshots inspire us, but that they also power America’s future. Let’s hope that today’s politicians follow Reagan’s advice and invest in science, research and development.