Time Magazine
February 5, 2006

Are We Losing Our Edge?

The U.S. still leads the world in scientific innovation. But years of declining investment and fresh competition from abroad threaten to end our supremacy


Gabriel Aeppli was born in Switzerland, but when he was 1 year old, his father came to the U.S. to pursue a career as a mathematician. Back then, America was a scientific "city on a hill," a place where enormous resources, academic freedom, a tradition of skepticism and a history of excellence lured everyone from astronomers to zoologists from all over the world, and like Aeppli's father, many of them never had any interest in leaving.

Aeppli, now 48, attended M.I.T., where he got a Ph.D. in electrical engineering, and went on to work at Bell Labs, the legendary research arm of AT&T. Then he moved on to the NEC research laboratory, outside Princeton, N.J., as a senior research scientist. But while industrial labs used to be well-funded havens for freewheeling scientific inquiry, says Aeppli, "my career was limited because opportunities to lead were very few." So he left for an academic job in Britain. He now holds a chair in physics at University College London and also directs the London Center for Nanotechnology. "I've been able to start with a clean sheet of paper and create something unique in a world-class city," he says. "I doubt that could be done anywhere else."

Edison Liu is a Hong Kong native who studied in the U.S. and eventually rose to become director of the division of clinical sciences at the National Cancer Institute. But in 2001 the government of Singapore made him an offer he couldn't refuse: the directorship of the brand new Genome Institute along with a $25 million starting budget--part of a $288 million integrated network of life-science research centers and biotech start-ups called Biopolis. Says Liu: "I came because I saw that the entire leadership of the country, the fabric of the country was thirsting for biology."

If those were just isolated cases, they would be easy to dismiss. Such stories, though, have become disturbingly common. After more than a half-century of unchallenged superiority in virtually every field of science and technology, from basic research to product development, America is starting to lose ground to other nations. It's still on top for now; the U.S. continues to lead the world in economic performance, business and government efficiency and in the strength of its infrastructure. As recently as 2001, the U.S., with just 6% of the world's population, churned out 41% of its Ph.D.s. And its labs regularly achieve technological feats, as last month's rollout of a new, superpowerful Macintosh computer and the launch of a space probe to Pluto make clear.

But by almost any measure--academic prizes, patents granted to U.S. companies, the trade deficit in high-technology products--we're losing ground while countries like China, South Korea and India are catching up fast. Unless things change, they will overtake us, and the breathtaking burst of discovery that has been driving our economy for the past half-century will be over. In his 2005 best seller, The World Is Flat, Thomas Friedman argues that globalization has collapsed the old hierarchy of economic engine-nations into a world where the ambitious everywhere can compete across borders against one another, and he identifies the science problem as a big part of that development. Borrowing a phrase from Shirley Ann Jackson, president of Rensselaer Polytechnic Institute, he calls it America's "quiet crisis."

Some critics have tried to put the blame for the U.S.'s scientific decline on President George W. Bush, citing his hostility to stem-cell research, his downplaying of global warming, his statements in support of "intelligent design" as an alternative to evolution, and his Administration's appointment of nonscientists to scientific panels as well as its alleged quashing of dissenting scientists (see story on page 37). Although that record has certainly roiled the scientific community at home, experts in business and academia have been warning for decades that U.S. science was heading for trouble for three simple reasons. The Federal Government, beset by deficits for most of the past three decades, has steadily been cutting back on investment in research and development. Corporations, under increasing pressure from their stockholders for quick profits, have been doing the same and focusing on short-term products. And the quality of education in math and science in elementary and high schools has plummeted, leading to a drop in the number of students majoring in technical fields in college and graduate school. In the past, hungry immigrants pursuing America's prestigious Ph.D.s made up for that decline in the U.S. science and engineering labor force. Now students who come to America for Ph.D.s often take them back to gleaming labs in their homelands.

The warnings about those three forces have been largely ignored. In the aftermath of 9/11, for example, the political class complained that nobody had heeded a report issued nine months earlier by former Senators Gary Hart and Warren Rudman warning of a major terrorist attack on U.S. soil. The report also said "the inadequacies of our systems of research and education" posed a threat to U.S. national security greater "than any potential conventional war that we might imagine." Nobody paid attention to that part either.

People are paying attention now, though. Responding to an increasingly insistent drumbeat of lobbying over the past few months from industry leaders, scientists and legislators, Bush announced in his State of the Union address last week the launch of what he called the American Competitiveness Initiative. The plan: double federal funding of research in basic areas like nanotechnology, supercomputing and alternative energy; make permanent the R&D tax credit; and train 70,000 additional high school science and math teachers. Aboard Air Force One the next morning, the President told Lamar Alexander, the Tennessee Republican Senator who has been pushing the idea hard for the past year, that he's determined to make it happen. "I want to make sure that everyone knew I was taking this seriously," said Bush.

Lawmakers on both sides of the aisle are determined too--in stark contrast to the reception of Bush's dead-end proposal to reform Social Security. Last spring Alexander, along with Democratic Senator Jeff Bingaman of New Mexico, wrote to the prestigious National Academies, an umbrella group that includes the National Academy of Sciences and the National Academy of Engineering, and asked for a formal assessment of the U.S.'s eroding superiority in science and technology.

The result, produced in just three months: a 505-page report, co-authored by a team of distinguished scientists, CEOs, Nobel prizewinners and university presidents--including Texas A&M president Robert Gates, director of the CIA under President George H.W. Bush and a close friend of the Bush family. Titled "Rising Above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future," it outlined in detail just how bad the situation was in nearly every area of research and called for new government funding. At about the same time, the National Association of Manufacturers and the Chamber of Commerce were issuing reports with similar conclusions.

Democrats seized publicly on the issue first. In November, House minority leader Nancy Pelosi announced a series of proposals modeled on those in "Gathering Storm." Tellingly, though, she avoided criticizing the President, going so far as to remove some negative language at the last moment. The idea, said a party official, was to get something accomplished, not just score political points. Even so, Pelosi's opening shot made Republicans nervous. "The feeling," says an industry official who was involved in discussions with the White House, "was, 'We cannot let them have this issue.'" Indeed, top Bush aides, including Karl Rove and the Secretaries of Labor, Education and Commerce, began lobbying internally for some sort of presidential initiative. Bush aides say the indispensable player in moving the package to the presidential podium was Bush's workout partner and close friend former Commerce Secretary Donald Evans, who made it a crusade after a fellow Texan on the National Academies committee handed him the report. "This is like Sputnik," Evans tells TIME. "We need to give this the same focus and energy."

By Dec. 6, when Republican Representative Sherwood Boehlert, chairman of the House Science Committee, met with Office of Management and Budget (OMB) director Josh Bolten to press for more money for scientific research, Boehlert found Bolten unexpectedly receptive. Later that day, four Cabinet Secretaries showed up for a meeting on scientific research held at the Commerce Department. Energy Secretary Samuel Bodman surprised Boehlert by staying all day. Although he has advocated increased funding for scientific research for years, Boehlert says, "Now it was getting the visibility." Soon Bush himself was pushing the proposals through the OMB, which often rips such initiatives apart.

If all that was at stake was some sort of bragging rights--who has the most Nobels, who gets to look down its national nose at the rest of the world--none of that would have happened because it wouldn't really matter. After all, Americans may have invented the integrated circuit and the Internet and the light bulb, but people all over the world get to use them. Same goes for the statin drugs that lower cholesterol and the iPod. And we are obviously free to use inventions made elsewhere, such as Velcro and the ballpoint pen.

There's much more to it, though. "Imagine," says Stanford University president John Hennessy, "that the next round of innovations in networking is done in India or China. How many years is it before either Cisco relocates to India or China and grows most of its new jobs there or the next Cisco is actually created there?" That's not so farfetched, says Du Pont CEO Chad Holliday: "If the U.S. doesn't get its act together, Du Pont is going to go to the countries that do, and so are IBM and Intel. We'd much rather be here, but we have an obligation to our employees and shareholders to bring value where we can."

That means not only that Americans have to be better than the rest of the world at inventing things but also that we have to be better at the basic research that precedes invention. Back in the 19th and early 20th centuries, people like Edison, Morse and the Wright brothers proved that Americans were pretty good at creating useful technology. But all of it was based on fundamental science done in places like Britain, Germany and France, where the true intellectual action was.

If not for Hitler, it might still be, but his aggression drove scientists out of Europe, and the desperate need to defeat him galvanized the U.S. and Britain into pouring money into defense research, creating powerful new technologies--radar, sonar, the atom bomb. U.S. leaders learned that pure research like atomic and electromagnetic physics, combined with massive government funding, could lead to dramatic breakthroughs in military technology. Because the Soviet Union almost immediately became just as ominous a threat as Nazi Germany had been, Congress created the National Science Foundation in 1950 to fund basic and applied science, mostly at universities, "to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense ..." In 1958 it founded NASA in response to renewed fears of Soviet technical competition ignited by the launch of Sputnik the previous year. Also in 1958 and for the same reason, the Department of Defense started the Advanced Research Projects Agency (ARPA). And it established or beefed up national laboratories in New Mexico, California, Illinois, Washington and New York.

All those organizations focused in varying degrees on applied science--attempts to invent useful new technologies--but all of them put money into pure science as well. So did private corporations, including AT&T, IBM and Xerox, which hired not just engineers but also mathematicians, physicists, biologists and even astronomers and gave them free rein. The strategy led to utterly impractical but revolutionary discoveries. The Big Bang theory of the cosmos, to name just one example, got its first experimental proof at AT&T's Bell Labs.

But the strategy paid off in an avalanche of astonishing and profitable technologies as well, from computer chips to fiber-optic cables to lasers to gene splicing and more. According to a 2003 National Academies report, no fewer than 19 multibillion-dollar industries resulted from fundamental research in information technology alone. Yet, says David Patterson, president of the Association for Computing Machinery, "people have this idea of academic research as this fuzzy, ivory-tower stuff that probably doesn't pay off."

That extraordinary track record also made scientists and engineers into national heroes. They won the war, they got us to the moon, they protected us from polio and dozens of other illnesses, and they gave us a standard of living far higher than that of any other country. Young people were inspired to emulate their egghead heroes, and federal funding made that possible. Energy Secretary Bodman, for example, recalls that he went to graduate school on a National Science Foundation fellowship in 1960. "Without that fellowship," he says, "I can virtually guarantee I wouldn't have done it."

For nearly a half-century, the strategy of putting money into science guaranteed that the U.S. would lead the world by just about every measure of scientific and technological prowess. So, what changed? American business, for one thing. Competitive pressure and the need to prop up stock prices forced many companies to abandon research and focus mostly on short-term product development. Freewheeling corporate research labs that didn't contribute visibly to the bottom line--AT&T's Bell Labs, Xerox's Palo Alto Research Center, IBM's Thomas J. Watson Research Center--have been restructured.

Much the same happened to military-funded research. The Defense Advanced Research Projects Agency (DARPA, the successor agency to ARPA) halved its funding of academic information-technology research from 2001 to 2004. "They say that because we're in a war, we need to have a shorter-term focus," laments Patterson. "But during Vietnam," he says, DARPA-funded researchers "laid the technology, the underlying vocabulary, of the Internet. They were doing fundamental, important, long-term research."

Nonmilitary research grants, meanwhile, have been essentially flat for the past 15 years. The one exception: the National Institutes of Health, whose budget doubled from 1998 to 2003. "Unless there's an emotional appeal, basic research is well beyond the time span of the next election," says Craig Barrett, chairman of Intel. "There is a very emotional attachment to research on cancer or chronic illnesses. It's much more difficult to say, What will the structure of the transistor look like in the next 15 years?"

As the size of individual grants shrinks, university researchers have to win more of them to keep research going, which requires enormous amounts of extra paperwork. "It's decreased their quality of life," says Paul Jennings, provost of Caltech and a civil engineer. When students see how much time a professor spends on bureaucratic busywork, says Jennings, they say, "I don't want to do that." It's not just red tape either, says Paul Nurse, president of Rockefeller University and a 2001 Nobel laureate in physiology or medicine. "If we compare what our best undergraduates get paid as a graduate student vs. what they get paid in investment banking, there's no doubt that there's tremendous economic pressure to suck you away from what is perhaps your first academic love." As for teaching science at the precollege level, salaries and working conditions are even more dismal.

Students at élite universities are getting that message loud and clear. Melisa Gao, 20, is a senior majoring in chemistry at Princeton, but when recruiters from consulting firms and investment banks showed up on campus last fall, she went on several interviews, and she will take a job as a consultant after graduation. She says, "They love the fact that science majors can think analytically, that we're comfortable with numbers." Increasingly, science majors love those companies back. Gao says, "There are no guarantees if you go into science, especially as a woman. You have to worry about getting tenure. Or if you go into industry, it takes you a long time to work your way up the ladder." If you go into finance or consulting instead, "by the time your roommate is out of grad school, you've been promoted, plus you're making a lot more money, while they're stuck in lab."

Even at M.I.T., the U.S.'s premier engineering school, the traditional career path has lost its appeal for some students. Says junior Nicholas Pearce, a chemical-engineering major from Chicago: "It's marketed as--I don't want to say dead end but sort of 'O.K., here's your role, here's your lab, here's what you're going to be working on.' Even if it's a really cool product, you're locked into it." Like Gao, Pearce is leaning toward consulting. "If you're an M.I.T. grad and you're going to get paid $50,000 to work in a cubicle all day--as opposed to $60,000 in a team setting, plus a bonus, plus this, plus that--it seems like a no-brainer."

Another problem has been the tarnished image of science itself. Catchphrases that felt inspiring in the 1950s--"Better living through chemistry," "Atoms for peace"--have a darker connotation today. Du Pont, which invented nylon, became known as well for napalm. Chernobyl and Three Mile Island soured Americans on nuclear power. Shuttle crashes and a defective Hubble telescope made NASA look inept. Substances from DDT to PCBs to ozone-eating chlorofluorocarbons proved more dangerous than anyone realized. Drug disasters like the thalidomide scandal made some people nervous about the unintended consequences of new drug treatments. It's in that context of skepticism toward science that some reasonable questions have been raised lately about genetically modified foods and the scope of human embryonic stem-cell research.

Even so, the U.S. commitment to science might have remained strong if the Soviet Union hadn't collapsed in 1991. "We don't have this shadow of Sputnik or the cold war overhanging us," says Stanford's Hennessy, "and we need a different form of inspiration." In fact, says Robert Birgeneau, a physicist and chancellor of the University of California, Berkeley, it already exists, if only we would recognize it. "We have a different kind of war, an economic war," he says. "The importance of investing in long-term research for winning that war hasn't been understood."

Not in this country, anyway. But other nations, realizing how successful the U.S. model of scientific research has been, have begun to copy it in earnest. Finland decided back in the 1970s to focus on electronics and a handful of other high-tech industries, and now has the most research scientists per capita in the world. South Korea decided to concentrate on reproductive technology, and although the research of superstar Hwang Woo Suk has been exposed as mostly fraudulent, the country has plenty of other world-class experts in cloning and stem-cell research.

Singapore, meanwhile, with its Biopolis project, is pulling in top biomedical scientists--not just Edison Liu but Americans like geneticist Sydney Brenner and, most recently, husband-and-wife cancer researchers Neal Copeland and Nancy Jenkins, who are leaving the National Cancer Institute after two decades. They turned down competing offers from Stanford and the Memorial Sloan-Kettering Cancer Center because, Copeland says, "what's going on over there is amazing. There's plenty of funding and a lot less bureaucracy." Moreover, says Liu, "In the U.S. the state government says, Let's do one thing, while the Federal Government is trying to stamp it out." Singapore, by contrast, has a single set of reasonably permissive regulations.

Small, economically developed countries aren't the only ones that have created science-friendly cultures: 54% of the staff at the Chinese Academy of Engineering and an astonishing 81% of the scientists at the Chinese Academy of Sciences are people who have returned from abroad. Deng Hongkui's story is typical. When he went to the U.S. in 1989 for postgraduate study in virology, he thought he would go back to China only to visit family and friends. But in 2000 he returned as director of one of Peking University's newest research centers. Deng was promised his own team of students and faculty members and whatever state-of-the-art facilities he needed to pursue his research on stem cells. It clearly wasn't the same country he had left 11 years earlier. "It was more exciting, more dynamic," he says. "Before I never [thought] about doing research there because I needed resources, but it looked to me that resources were available. The whole environment was changing."

Those countries offer more than just funding. They're also determined to reproduce the spirit of wide-open inquiry that has made U.S. science so appealing and successful, says Steven Chu, director of the Lawrence Berkeley National Laboratory in Berkeley, Calif., and a 1997 Nobelist in physics. Wherever he goes, administrators at foreign universities ask him how to create an American-style learning and thinking environment. "They are catching up quickly," he says.

That is especially true in China, where the government has put its muscle behind an all-out effort to transform homegrown science. "Ten years ago in China, it was virtually all derivative stuff," says Chu. "Students would sit and listen and try to capture every word. Now they're asking lots of questions." During a 100th-anniversary celebration for Peking University a few years ago, Chu found himself seated next to China's Minister for Education. "She was asking for my autograph," he says, shaking his head. "It was totally topsy-turvy. Can you imagine in the U.S. the Secretary of Education fawning on a Nobel prizewinner? It just won't happen." In his book Thomas Friedman puts it another way: "In China today, Bill Gates is Britney Spears. In America today, Britney Spears is Britney Spears--and that is our problem."

Indeed, P. Roy Vagelos, a former CEO and chairman of Merck, traveled last fall to China, where he met a number of U.S.-educated Chinese scientists who had returned to work in their homeland. "The new labs are spectacular," he says. "Unbelievable. The equipment leaves nothing to be desired." The government is doling out generous research grants to academic scientists. In all, it invested nearly 110 billion yuan in science in 2004, up from less than 50 billion yuan in 1999. Chinese scientists also get cash awards that can run into thousands of dollars for getting papers published in scholarly journals.

The beefing up of research labs in China and elsewhere is not just luring natives back to their homeland. It is also retaining promising students who might once have gone to the U.S. to study. That matters because keeping U.S. universities the best in the world depends on luring the very best students. Tougher visa regulations put in place after 9/11 don't help either. Chu has plenty of horror stories. One former student went home to Taiwan for a brief vacation. When he applied for his re-entry visa, he said he was studying atomic physics. Even though that subject had nothing to do with nuclear-weapons work, 18 months passed before he could return. "These stories get passed around," says Chu. "If you're being courted all around the world, if you could go to graduate school anywhere you wanted, why would you come to the U.S.?"

In absolute terms, of course, the U.S. is still the world leader in scientific research. A half-century's worth of momentum is tough to derail. Yet, says Shirley Tilghman, president of Princeton and a molecular biologist, "there's still reason to feel some urgency. The world is not standing still while we take a pause."

For the first time in decades, however, there's hope that the pause may be ending. Given its bipartisan appeal, the Bush Competitiveness Initiative is likely to pass. Funding won't be easy, given the soaring deficit, but the people who dole out the money are enthusiastic. "I am very, very supportive," Representative Frank Wolf, the House Republican in charge of science funding, told TIME, "and I think the President is going to get what he requested." Sometimes, marvels Alexander, "these things sit for years and then suddenly come together in a big way."

With reporting by Matthew Cooper and Eric Roston/Washington, Bryan Walsh/Hong Kong, Amanda Bower/San Francisco and Hannah Beech/Shanghai