
Question  Jan Hendrik Schon’s success seemed too good to be true, and it was. In only four years as a physicist at Bell Laboratories, Schon, 32, had co-authored 90 scientific papers — one every 16 days, which astonished his colleagues, and made them suspicious. When one co-worker noticed that the same table of data appeared in two separate papers — which also happened to appear in the two most prestigious scientific journals in the world, Science and Nature — the jig was up. In October 2002, a Bell Labs investigation found that Schon had falsified and fabricated data. His career as a scientist was finished.
  If it sounds a lot like the fall of Hwang Woo Suk — the South Korean researcher who fabricated his evidence about cloning human cells — it is. Scientific scandals, which are as old as science itself, tend to follow similar patterns of hubris and comeuppance. Afterwards, colleagues wring their hands and wonder how such malfeasance can be avoided in the future. But it never is entirely. Science is built on the honor system; the method of peer-review, in which manuscripts are evaluated by experts in the field, is not meant to catch cheats. In recent years, of course, the pressure on scientists to publish in the top journals has increased, making the journals much more crucial to career success. The questions raised anew by Hwang’s fall are whether Nature and Science have become too powerful as arbiters of what science reaches the public, and whether the journals are up to their task as gatekeepers.
  Each scientific specialty has its own set of journals. Physicists have Physical Review Letters; cell biologists have Cell; neuroscientists have Neuron, and so forth. Science and Nature, though, are the only two major journals that cover the gamut of scientific disciplines, from meteorology and zoology to quantum physics and chemistry. As a result, journalists look to them each week for the cream of the crop of new science papers. And scientists look to the journals in part to reach journalists. Why do they care? Competition for grants has gotten so fierce that scientists have sought popular renown to gain an edge over their rivals. Publication in specialized journals will win the accolades of academics and satisfy the publish-or-perish imperative, but Science and Nature come with the added bonus of potentially getting your paper written up in The New York Times and other publications.
  Scientists are also trying to reach other scientists through Science and Nature, not just the public. Scientists tend to pay more attention to the Big Two than to other journals. When more scientists know about a particular paper, they’re more apt to cite it in their own papers. Being oft-cited will increase a scientist’s "Impact Factor", a measure of how often papers are cited by peers. Funding agencies use the Impact Factor as a rough measure of the influence of scientists they’re considering supporting.
  Whether the clamor to appear in these journals has any bearing on their ability to catch fraud is another matter. The fact is that fraud is terrifically hard to spot. Consider the process Science used to evaluate Hwang’s 2005 article. Science editors recognized the manuscript’s import almost as soon as it arrived. As part of the standard procedure, they sent it to two members of its Board of Reviewing Editors, who recommended that it go out for peer review (about 30 percent of manuscripts pass this test). This recommendation was made not on the scientific validity of the paper, but on its "novelty, originality, and trendiness", says Denis Duboule, a geneticist at the University of Geneva and a member of Science’s Board of Reviewing Editors, in the January 6 issue of Science.
  After this, Science sent the paper to three stem-cell experts, who had a week to look it over. Their comments were favorable. How were they to know that the data was fraudulent? "You look at the data and do not assume it’s fraud," says one reviewer, anonymously, in Science.
  In the end, a big scandal now and then isn’t likely to do much damage to the big scientific journals. What editors and scientists worry about more are the myriad smaller infractions that occur all the time, and which are almost impossible to detect. A Nature survey of scientists published last June found that one-third of all respondents had committed some form of misconduct. These included falsifying research data and having "questionable relationships" with students and subjects — both charges leveled against Hwang. Nobody really knows if this kind of fraud is on the rise, but it is worrying.
  Science editors don’t have any plans to change the basic editorial peer-review process as a result of the Hwang scandal. They do have plans to scrutinize photographs more closely in an effort to spot instances of fraud, but that policy change had already been decided when the scandal struck. And even if it had been in place, it would not have revealed that Hwang had misrepresented photographs from two stem cell colonies as coming from 11 colonies. With the financial and deadline pressures of the publishing industry, it’s unlikely that the journals are going to take markedly stronger measures to vet manuscripts. Beyond replicating the experiments themselves, which would be impractical, it’s difficult to see what they could do to take science beyond the honor system.
According to the passage, manuscripts of science are evaluated to______.

Options  A. find novelty
B. catch fraud
C. test scientific validity
D. detect suspicious scientific points

Answer  A

Explanation  This is a detail question. The last sentence of the fifth paragraph states that the screening Science applies to manuscripts at this stage assesses their "novelty, originality, and trendiness", not whether the research is genuine or the data show signs of fabrication, so A is correct.