The editors of Science recognize that they have a problem: they aren't seen as very transparent or trustworthy. This is true. These same editors have been guilty of publishing and promoting a lot of poor-quality science over the past few years. Three examples come to mind ...
- Arseniclife: Science published a ridiculous claim that arsenic could replace phosphorus in DNA. That paper has been refuted but never retracted.
- Ardipithecus ramidus: Science fell for the authors' hype.
- ENCODE: Science fell for the hype promoted by ENCODE leaders. Editorial and feature writers announced the death of junk DNA.
Don't worry. The editors have been working hard to fix the problem. After a year of study, they announced their solution in the June 3, 2016 issue in the lead editorial, "Taking up TOP." The author is the current Editor-in-Chief, Marcia McNutt.
She begins with ...
Nearly 1 year ago, a group of researchers boldly suggested that the standards for research quality, transparency, and trustworthiness could be improved if journals banded together to adopt eight standards called TOP (Transparency and Openness Promotion).* Since that time, more than 500 journals have been working toward their implementation of TOP. The editors at Science have held additional retreats and workshops to determine how best to adapt TOP to a general science journal and are now ready to announce our new standards, effective 1 January 2017.

So, what is TOP and how is it going to make Science more trustworthy? Does it involve firing some well-known writers and editors? Does it involve better reviewers?
Nope. TOP is just a way of making sure that raw data are available to other researchers.
... we believe the benefits of requiring the availability of data, code, and samples on which the authors' interpretations rest are worth the effort in compliance (and in some cases in adjusting data ownership expectations), while acknowledging that some special circumstances will require exemptions. This practice increases transparency, enables reproducibility, promotes data reuse, and is increasingly in line with funder mandates. We are also requiring the citation of all data, program code, and other methods not contained in the paper, using DOIs (digital object identifiers), journal citations, or other persistent identifiers, for the same reason. Citations reward those who originated the data, samples, or code and deposited them for reuse. Such a policy also allows accurate accounting for exactly which specific data, samples, or code were used in a given study.

That's not going to fix the main problem.