I recently posted some thoughts on the complexity of the scientific literature, noting that many papers are simply too difficult to understand. This includes papers that are well within my areas of interest [How to read the scientific literature? and The scientific literature is becoming more complex].
The journal Nature recognizes that there's a problem. A few weeks ago (June 16, 2016) they published a brief comment on Nature distilled. They begin by describing the problem ...
Any journal that tries to publish the most important results that it is sent, in all fields of science, will run into the same problem. Every bit of our output, we hope, is useful and interesting to somebody somewhere. But even the most optimistic of our editors would concede that the pool of readership for each of these specific advances is only a small subsection of our audience, professional researchers included. To the outside world, science is science. To those who read Nature, science is a multiplicity of specialisms — and specialists.

Let's make one thing clear. It's not just the complexity of a paper that's the problem, and it's not just that the science isn't explained in easy-to-understand sentences. There's also the more serious problem of content. Sometimes the papers are hard to understand because the significance of the results is exaggerated and their importance is not placed in proper context.
We know that most of you are specialists, and that you don’t read most of what we present to you. You’re busy people. It is hard enough to follow the literature that you need to read. Even the titles of research papers in an unfamiliar field can look incomprehensible. But if you’re anything like us, one reason you got into science in the first place was curiosity about the world — and not just the tiny piece of it that you now focus on. Wouldn’t it be useful and interesting to keep better track of the rest? Or at least, the rest that is published in Nature, and therefore already judged to be important?
The ENCODE papers are good examples of this problem. It wasn't easy to understand what they did but, more importantly, it wasn't easy to understand the significance of their results because the authors didn't explain them very well. They made unsubstantiated claims.
Here's how Nature hopes to fix the problems they identified.
We think so, and this week we begin an experiment to see how many of you agree. We have revisited 15 recently published Nature papers and asked the authors to produce two-page summaries of each. The summaries remain technical — these are not articles suitable for the popular press — but they try to communicate both the research advance and why it matters. The authors of these papers have been enthusiastic — they want the broadest possible readership — and we thank them for their cooperation. Now we want to know what you think. The first three summaries are published online this week (see go.nature.com/1uhcy3x). The rest will be released in the coming weeks. Please take a look. Be brave — pick a topic that you expect to struggle with — and then fill in the online survey to let us know what you think.

I looked at two papers that were about biology and I didn't think the summaries added anything to my understanding. That's partly because the papers weren't that hard to understand in the first place if you were just satisfied with knowing what they did.
Both papers raised lots of questions in my mind about the biological significance of the studies and whether they were accurate and reproducible. The author summaries didn't help much. [Non-coding recurrent mutations in chronic lymphocytic leukaemia and DNA-dependent formation of transcription factor pairs alters their binding specificity].
If the scientific literature is difficult to understand, and it is, then the problem lies with the authors. They aren't able to explain what they did in a reasonable manner and they aren't able to place their work in a proper context so we can evaluate the significance of the results. Asking them to try again (and doubling their citations) is probably not going to help.
The ENCODE authors couldn't do it.
It's a lot like asking the fox to guard the henhouse.