You can't understand the junk DNA debate unless you've read Michael Lynch's book The Origins of Genome Architecture. That means you have to understand modern population genetics and the role of random genetic drift in the evolution of genomes. There's no evidence in Parrington's book that he has read The Origins of Genome Architecture and no evidence that he understands modern evolutionary theory. The only evolution he talks about is natural selection (Chapter 1).
Here's an example where he demonstrates adaptationist thinking and the fact that he hasn't read Lynch's book ...
At first glance, the existence of junk DNA seems to pose another problem for Crick's central dogma. If information flows in a one-way direction from DNA to RNA to protein, then there would appear to be no function for such noncoding DNA. But if 'junk DNA' really is useless, then isn't it incredibly wasteful to carry it around in our genome? After all, the reproduction of the genome that takes place during each cell division uses valuable cellular energy. And there is also the issue of packaging the approximately 3 billion base pairs of the human genome into the tiny cell nucleus. So surely natural selection would favor a situation where both genomic energy requirements and packaging needs are reduced fiftyfold?1
Nobody who understands modern evolutionary theory would ask such a question. They would have read all the published work on the issue and they would know about the limits of natural selection and why species can't necessarily get rid of junk DNA even if it seems harmful.
People like that would also understand the central dogma of molecular biology.
1. He goes on to propose a solution to this adaptationist paradox. Apparently, most of our genome consists of parasites (transposons), an idea he mistakenly attributes to Richard Dawkins' concept of The Selfish Gene. Parrington seems to have forgotten that most of the sequence of active transposons consists of protein-coding genes so it doesn't work very well as an explanation for excess noncoding DNA.
Parrington addresses this issue on page 63 by describing experiments from the late 1960s showing that there was a great deal of noncoding DNA in our genome and that only a few percent of the genome was devoted to encoding proteins. He also notes that the differences in genome sizes of similar species gave rise to the possibility that most of our genome was junk. Five pages later (page 69) he reports that scientists were surprised to find only 30,000 protein-coding genes when the sequence of the human genome was published—"... the other big surprise was how little of our genomes are devoted to protein-coding sequence."
Contradictory stuff like that makes it very hard to follow his argument. On the one hand, he recognizes that scientists have known for 50 years that only 2% of our genome encodes proteins but, on the other hand, they were "surprised" to find this confirmed when the human genome sequence was published.
He spends a great deal of Chapter 4 explaining the existence of introns and claims that "over 90 per cent of our genes are alternatively spliced" (page 66). This seems to be offered as an explanation for all the excess noncoding DNA but he isn't explicit.
In spite of the fact that genome comparisons are a very important part of this debate, Parrington doesn't return to this point until Chapter 10 ("Code, Non-code, Garbage, and Junk").
We know that the C-Value Paradox isn't really a paradox because most of the excess DNA in various genomes is junk. There isn't any other explanation that makes sense of the data. I don't think Parrington appreciates the significance of this explanation.
The examples quoted in Chapter 10 are the lungfish, with a huge genome, and the pufferfish (Fugu), with a genome much smaller than ours. This requires an explanation if you are going to argue that most of the human genome is functional. Here's Parrington's explanation ...
Yet, despite having a genome only one eighth the size of ours, Fugu possesses a similar number of genes. This disparity raises questions about the wisdom of assigning functionality to the vast majority of the human genome, since, by the same token, this could imply that lungfish are far more complex than us from a genomic perspective, while the smaller amount of non-protein-coding DNA in the Fugu genome suggests the loss of such DNA is perfectly compatible with life in a multicellular organism.
Not everyone is convinced about the value of these examples though. John Mattick, for instance, believes that organisms with a much greater amount of DNA than humans can be dismissed as exceptions because they are 'polyploid', that is, their cells have far more than the normal two copies of each gene, or their genomes contain an unusually high proportion of inactive transposons.
In other words, organisms with larger genomes seem to be perfectly happy carrying around a lot of junk DNA! What kind of an argument is that?
Mattick is also not convinced that Fugu provides a good example of a complex organism with no non-coding DNA. Instead, he points out that 89% of this pufferfish's DNA is still non-protein-coding, so the often-made claim that this is an example of a multicellular organism without such DNA is misleading.
[Mattick has been] a true visionary in his field; he has demonstrated an extraordinary degree of perseverance and ingenuity in gradually proving his hypothesis over the course of 18 years.
HUGO Award Committee

Seriously? That's the best argument he has? He and Mattick misrepresent what scientists say about the pufferfish genome—nobody claims that the entire genome encodes proteins—then they ignore the main point; namely, why do humans need so much more DNA? Is it because we are polyploid?
It's safe to say that John Parrington doesn't understand the C-value argument. We already know that Mattick doesn't understand it and neither does Jonathan Wells, who also wrote a book on junk DNA [John Mattick vs. Jonathan Wells]. I suppose John Parrington prefers to quote Mattick instead of Jonathan Wells—even though they use the same arguments—because Mattick has received an award from the Human Genome Organization (HUGO) for his ideas and Wells hasn't [John Mattick Wins Chen Award for Distinguished Academic Achievement in Human Genetic and Genomic Research].
It's frustrating to see active scientists who think that most of our genome could have a biological function but who seem to be completely unaware of the evidence for junk. Most of the positive evidence for junk is decades old so there's no excuse for such ignorance.
I wrote a post in 2013 to help these scientists understand the issues: Five Things You Should Know if You Want to Participate in the Junk DNA Debate. It was based on a talk I gave at the Evolutionary Biology meeting in Chicago that year.1 Let's look at John Parrington's new book to see if he got the message [Hint: he didn't].
There's one post for each of the five issues that informed scientists need to address if they are going to write about the amount of junk in your genome.
The genetic load argument has been around for 50 years. It's why experts did not expect a huge number of genes when the genome sequence was published. It's why the sequence of most of our genome must be irrelevant from an evolutionary perspective.
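The genetic load argument is, at bottom, simple arithmetic. Here's a rough sketch of the calculation, with illustrative numbers of my own choosing (a mutation rate of about 100 new mutations per generation from parent-offspring sequencing, and a guess that roughly 10% of mutations hitting sequence-dependent DNA are deleterious); the function name and parameters are mine, not from any published model.

```python
# Illustrative sketch of the genetic load argument (my numbers, for illustration only).
# Assumptions: ~100 new mutations per genome per generation, and roughly 10% of
# mutations that land in sequence-dependent functional DNA are deleterious.

MUTATIONS_PER_GENERATION = 100   # approximate, from parent-offspring sequencing studies
DELETERIOUS_FRACTION = 0.1       # rough guess for mutations in functional DNA

def deleterious_per_generation(functional_fraction):
    """Expected new deleterious mutations per offspring, given the fraction
    of the genome whose sequence actually matters."""
    return MUTATIONS_PER_GENERATION * functional_fraction * DELETERIOUS_FRACTION

for f in (0.02, 0.10, 0.50, 0.80):
    print(f"functional fraction {f:.0%}: "
          f"{deleterious_per_generation(f):.1f} deleterious mutations per generation")
```

The point of the exercise: a species can only tolerate a load on the order of one or two new deleterious mutations per generation, so if 80% of the genome were sequence-sensitive the load would be unsustainable. Only a small functional fraction is compatible with our mutation rate.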
This argument does not rule out bulk DNA hypotheses but it does rule out all those functions that require specific sequences in order to confer biological function. This includes the speculation that most transcripts have a function and it includes the speculation that there's a vast amount of regulatory sequence in our genome. Chapter 5 of The Deeper Genome is all about the importance of regulatory RNAs.
So, starting from a failed attempt to turn a petunia purple, the discovery of RNA interference has revealed a whole new network of gene regulation mediated by RNAs and operating in parallel to the more established one of protein regulatory factors. ... Studies have revealed that a surprising 60 per cent of miRNAs turn out to be recycled introns, with the remainder being generated from the regions between genes. Yet these were parts of the genome formerly viewed as junk. Does this mean we need a reconsideration of this question? This is an issue we will discuss in Chapter 6, in particular with regard to the ENCODE project ...
The implication here is that a substantial part of the genome is devoted to the production of regulatory RNAs. Presumably, the sequences of those RNAs are important. But this conflicts with the genetic load argument unless we're only talking about an insignificant fraction of the genome.
But that's only one part of Parrington's argument against junk DNA. Here's the summary from the last Chapter ("Conclusion") ...
As we've discussed in this book, a major part of the debate about the ENCODE findings has focused on the question of what proportion of the genome is functional. Given that the two sides of this debate use quite different criteria to assess functionality it is likely that it will be some time before we have a clearer idea about who is the most correct in this debate. Yet, in framing the debate in this quantitative way, there is a danger that we might lose sight of an exciting qualitative shift that has been taking place in biology over the past decade or so. So a previous emphasis on a linear flow of information, from DNA to RNA to protein through a genetic code, is now giving way to a much more complex picture in which multiple codes are superimposed on one another. Such a viewpoint sees the gene as more than just a protein-coding unit; instead it can equally be seen as an accumulation of chemical modifications in the DNA or its associated histones, a site for non-coding RNA synthesis, or a nexus in a 3D network. Moreover, since we now know that multiple sites in the genome outside the protein-coding regions can produce RNAs, and that even many pseudo-genes are turning out to be functional, the very question of what constitutes a gene is now being challenged. Or, as Ed Weiss at the University of Pennsylvania recently put it, 'the concept of a gene is shredding.' Such is the nature of the shift that now we face the challenge of not just recognizing the true scale of this complexity, but explaining how it all comes together to make a living, functioning, human being.
Parrington doesn't cover the genetic load argument at all in his book. I don't know why since it seems very relevant. We could not survive as a species if the sequence of most of our genome was important for biological function.
The July 16th (2015) issue of Nature has a few articles devoted to science education [An Education]. The introduction to these articles in the editorial section is worth quoting. It emphasizes two important points that I've been advocating.
Evidence shows us that active learning (student-centered learning) is superior to the old memorize-and-regurgitate system with professors giving PowerPoint presentations to passive students.
You must deal with student misconceptions or your efforts won't pay off.
So many people have been preaching this new way of teaching that it's truly astonishing that it's not being adopted. It's time to change. It's time to stop rewarding and praising professors who teach the old way and time to start encouraging professors to move to the new system. Nobody says it's going to be easy.
We have professors whose main job is teaching. They should be leading the way.
One of the subjects that people love to argue about, following closely behind the ‘correct’ way to raise children, is the best way to teach them. For many, personal experience and centuries of tradition make the answer self-evident: teachers and textbooks should lay out the content to be learned, students should study and drill until they have mastered that content, and tests should be given at strategic intervals to discover how well the students have done.
And yet, decades of research into the science of learning has shown that none of these techniques is particularly effective. In university-level science courses, for example, students can indeed get good marks by passively listening to their professor’s lectures and then cramming for the exams. But the resulting knowledge tends to fade very quickly, and may do nothing to displace misconceptions that students brought with them.
Consider the common (and wrong) idea that Earth is cold in the winter because it is further from the Sun. The standard, lecture-based approach amounts to hoping that this idea can be displaced simply by getting students to memorize the correct answer, which is that seasons result from the tilt of Earth’s axis of rotation. Yet hundreds of empirical studies have shown that students will understand and retain such facts much better when they actively grapple with challenges to their ideas — say, by asking them to explain why the northern and southern hemispheres experience opposing seasons at the same time. Even if they initially come up with a wrong answer, to get there they will have had to think through what factors are important. So when they finally do hear the correct explanation, they have already built a mental scaffold that will give the answer meaning.
In this issue, prepared in collaboration with Scientific American, Nature is taking a close look at the many ways in which educators around the world are trying to implement such ‘active learning’ methods (see nature.com/stem). The potential pay-off is large — whether it is measured by the increased number of promising students who finish their degrees in science, technology, engineering and mathematics (STEM) disciplines instead of being driven out by the sheer boredom of rote memorization, or by the non-STEM students who get first-hand experience in enquiry, experimentation and reasoning on the basis of evidence.
Implementing such changes will not be easy — and many academics may question whether they are even necessary. Lecture-based education has been successful for hundreds of years, after all, and — almost by definition — today’s university instructors are the people who thrived on it.
But change is essential. The standard system also threw away far too many students who did not thrive. In an era when more of us now work with our heads, rather than our hands, the world can no longer afford to support poor learning systems that allow too few people to achieve their goals.
The old system is also wasteful because it graduates students who can't think critically and don't understand basic concepts.
After years of negotiation between the administration and the Faculty Association, the university has finally allowed full-time lecturers to call themselves "professors" [U of T introduces new teaching stream professorial ranks]. This brings my university into line with some other progressive universities that recognize the value of teaching.
Unfortunately, the news isn't all good. These new professors will have a qualifier attached to their titles. The new positions are: assistant professor (conditional), teaching stream; assistant professor, teaching stream; associate professor, teaching stream; and professor, teaching stream. Research and scholarly activity is an important component of these positions. The fact that the activity is in the field of pedagogy or the discipline in which they teach should not make a difference.
Meanwhile, current professors will not have qualifiers such as "professor: research," or "professor: administration," or "professor: physician," or "professor: mostly teaching."
The next step is to increase the status of these new professors by making searches more rigorous and more competitive, by keeping the salaries competitive with other professors in the university, and by insisting on high quality research and scholarly activity in the field of pedagogy. The new professors will have to establish a national and international reputation in their field just like other professors. They will have to publish in the pedagogical literature. They are not just lecturers. Almost all of them can do this if they are given the chance.
Some departments have to change the way they treat the new professors. The University of Toronto Faculty Association (UTFA) has published a guideline: Teaching Stream Workload. Here's the part on research and scholarly activity ....
In section 7.2, the WLPP offers the following definition of scholarship: “Scholarship refers to any combination of discipline-based scholarship in relation to or relevant to the field in which the faculty member teaches, the scholarship of teaching and learning, and creative/professional activities. Teaching stream faculty are entitled to reasonable time for pedagogical/professional development in determining workload.”
It is imperative that teaching stream faculty have enough time in their schedules, that is, enough “space” in their appointments, to allow for the “continued pedagogical/professional development” that the appointments policy (PPAA) calls for. Faculty teaching excessive numbers of courses or with excessive administrative loads will not have the time to engage in scholarly activity. Remember that UTFA fought an Association grievance to win the right for teaching stream faculty to “count” their discipline-based scholarship. That scholarship “counts” in both PTR review and review for promotion to senior lecturer.
And here's a rule that many departments disobey ...
Under 4.1, the WLPP reminds us of a Memorandum of Agreement workload protection: “faculty will not be required to teach in all three terms, nor shall they be pressured to volunteer to do so.” Any faculty member who must teach in all three terms should come to see UTFA.
I often observe that in discussions of evolution, both evolution skeptics and those who embrace neo-Darwinian evolution are prone to make one of two significant mistakes. Both stem from a failure to distinguish between microevolution and macroevolution.
The book was necessary because there has been so much criticism of Stephen Meyer's original book Darwin's Doubt. David Klinghoffer has an interesting way of turning this defeat into a victory because he declares,
... the new book is important because it puts to rest a Darwinian myth, an icon of the evolution debate, namely...that there is no debate, about evolution or intelligent design!
Intelligent Design Creationists often get upset when I refer to them as creationists. They think that the word "creationist" has only one meaning; namely, a person who believes in the literal truth of Genesis in the Judeo-Christian Bible. The fact that this definition applies to many (most?) intelligent design advocates is irrelevant to them since they like to point out that many ID proponents are not biblical literalists.
There's another definition of "creationist" that's quite different and just as common throughout the world. We've been describing this other definition to ID proponents for over two decades but they refuse to listen. We've been explaining why it's quite legitimate to refer to them as Intelligent Design Creationists but there's hardly any evidence that they are paying attention. This isn't really a surprise.
God Only Knows is one of my favorite pop songs.1 It's from the Pet Sounds album by the Beach Boys (1966).
Experts have admired Brian Wilson and the Beach Boys for decades but most people have forgotten (or never knew) about their best songs. (Good Vibrations was released as a single at the same time as Pet Sounds.)
I haven't yet seen the movie about Brian Wilson (Love & Mercy).
The first video is a BBC production from 2014 paying tribute to (and featuring) Brian Wilson. The second video is from 1966.
1. I will delete any snarky comments about God and atheism.
Opponents of junk DNA usually emphasize the point that they were surprised when the draft human genome sequence was published in 2001. They expected about 100,000 genes but the initial results suggested less than 30,000 (the final number is about 25,000).1 The reason they were surprised is that they had not kept up with the literature on the subject and they had not been paying attention when the sequence of chromosome 22 was published in 1999 [see Facts and Myths Concerning the Historical Estimates of the Number of Genes in the Human Genome].
The experts were expecting about 30,000 genes and that's what the genome sequence showed. Normally this wouldn't be such a big deal. Those who were expecting a large number of genes would just admit that they were wrong and they hadn't kept up with the literature over the past 30 years. They should have realized that discoveries in other species and advances in developmental biology had reinforced the idea that mammals only needed about the same number of genes as other multicellular organisms. Most of the differences are due to regulation. There was no good reason to expect that humans would need a huge number of extra genes.
That's not what happened. Instead, opponents of junk DNA insist that the complexity of the human genome cannot be explained by such a low number of genes. There must be some other explanation to account for the missing genes. This sets the stage for at least seven different hypotheses that might resolve The Deflated Ego Problem. One of them is the idea that the human genome contains thousands and thousands of nonconserved genes for various regulatory RNAs. These are the missing genes and they account for a lot of the "dark matter" of the genome—sequences that were thought to be junk.
Here's how John Parrington describes it on page 91 of his book.
The study [ENCODE] also found that 80 per cent of the genome was generating RNA transcripts having importance, many were found only in specific cellular compartments, indicating that they have fixed addresses where they operate. Surely there could hardly be a greater divergence from Crick's central dogma than this demonstration that RNAs were produced in far greater numbers across the genome than could be expected if they were simply intermediates between DNA and protein. Indeed, some ENCODE researchers argued that the basic unit of transcription should now be considered as the transcript. So Stamatoyannopoulos claimed that 'the project has played an important role in changing our concept of the gene.'
This passage illustrates my difficulty in coming to grips with Parrington's logic in The Deeper Genome. Just about every page contains statements that are either wrong or misleading and when he strings them together they lead to a fundamentally flawed conclusion. In order to critique the main point, you have to correct each of the so-called "facts" that he gets wrong. This is very tedious.
I've already explained why Parrington is wrong about the Central Dogma of Molecular Biology [John Avise doesn't understand the Central Dogma of Molecular Biology]. His readers don't know that he's wrong so they think that the discovery of noncoding RNAs is a revolution in our understanding of biochemistry—a revolution led by the likes of John A. Stamatoyannopoulos in 2012.
The reference in the book to the statement by Stamatoyannopoulos is from the infamous Elizabeth Pennisi article on ENCODE Project Writes Eulogy for Junk DNA (Pennisi, 2012). Here's what she said in that article ...
As a result of ENCODE, Gingeras and others argue that the fundamental unit of the genome and the basic unit of heredity should be the transcript—the piece of RNA decoded from DNA—and not the gene. “The project has played an important role in changing our concept of the gene,” Stamatoyannopoulos says.
I'm not sure what concept of a gene these people had before 2012. It appears that John Parrington is under the impression that genes are units that encode proteins and maybe that's what Pennisi and Stamatoyannopoulos thought as well.
If so, then perhaps the publicity surrounding ENCODE really did change their concept of a gene but all that proves is that they were remarkably uninformed before 2012. Intelligent biochemists have known for decades that the best definition of a gene is "a DNA sequence that is transcribed to produce a functional product."2 In other words, we have been defining a gene in terms of transcripts for 45 years [What Is a Gene?].
This is just another example of wrong and misleading statements that will confuse readers. If I were writing a book I would say, "The human genome sequence confirmed the predictions of the experts that there would be no more than 30,000 genes. There's nothing in the genome sequence or the ENCODE results that has any bearing on the correct understanding of the Central Dogma and there's nothing that changes the correct definition of a gene."
You can see where John Parrington's thinking is headed. Apparently, Parrington is one of those scientists who were completely unaware of the fact that genes could specify functional RNAs and completely unaware of the fact that Crick knew this back in 1970 when he tried to correct people like Parrington. Thus, Parrington and his colleagues were shocked to learn that the human genome had only 25,000 genes and that many of them didn't encode proteins. Instead of realizing that his view was wrong, he thinks that the ENCODE results overthrew those old definitions and changed the way we think about genes. He tries to convince his readers that there was a revolution in 2012.
Parrington seems to be vaguely aware of the idea that most pervasive transcription is due to noise or junk RNA. However, he gives his readers no explanation of the reasoning behind such a claim. Spurious transcription is predicted because we understand the basic concept of transcription initiation. We know that promoter sequences and transcription factor binding sites are short sequences and we know that they HAVE to occur at high frequency in large genomes just by chance. This is not just speculation. [see The "duon" delusion and why transcription factors MUST bind non-functionally to exon sequences and How RNA Polymerase Binds to DNA]
If our understanding of transcription initiation is correct then all you need is an activator transcription factor binding site near something that's compatible with a promoter sequence. Any given cell type will contain a number of such factors and they must bind to a large number of nonfunctional sites in a large genome. Many of these will cause occasional transcription giving rise to low abundance junk RNA. (Most of the ENCODE transcripts are present at less than one copy per cell.)
Different tissues will have different transcription factors. Thus, the low abundance junk RNAs must exhibit tissue specificity if our prediction is correct. Parrington and the ENCODE workers seem to think that the cell specificity of these low abundance transcripts is evidence of function. It isn't—it's exactly what you expect of spurious transcription. Parrington and the ENCODE leaders don't understand the scientific literature on transcription initiation and transcription factor binding sites.
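The "by chance" claim above is easy to check with a back-of-the-envelope calculation. Here's a minimal sketch (my own illustration, assuming equal base frequencies and a search of both strands; the function name is hypothetical) of how many exact matches to one specific short binding-site sequence we should expect in a 3 Gb genome by chance alone:

```python
# Rough sketch: expected chance occurrences of a specific k-bp sequence in a
# large genome, assuming equal base frequencies (each base has probability 1/4)
# and counting both strands. Illustrative only; real binding sites are degenerate,
# which makes matches even more frequent than this estimate.

GENOME_SIZE = 3_000_000_000  # ~3 Gb haploid human genome

def expected_chance_sites(motif_length, genome_size=GENOME_SIZE):
    """Expected number of exact matches to one specific k-bp sequence."""
    return 2 * genome_size * (0.25 ** motif_length)  # factor of 2 for both strands

for k in (6, 8, 10):
    print(f"{k} bp site: ~{expected_chance_sites(k):,.0f} chance occurrences")
```

Even a 10 bp site is expected to occur thousands of times by chance, and typical transcription factor recognition sequences are shorter and more degenerate than that. This is why nonfunctional binding, and hence spurious low-level transcription, is an unavoidable prediction.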
It takes me an entire blog post to explain the flaws in just one paragraph of Parrington's book. The whole book is like this. The only thing it has going for it is that it's better than Nessa Carey's book [Nessa Carey doesn't understand junk DNA].
1. There are about 20,000 protein-encoding genes and an unknown number of genes specifying functional RNAs. I'm estimating that there are about 5,000 but some people think there are many more.
2. No definition is perfect. My point is that defining a gene as a DNA sequence that encodes a protein is something that should have been purged from textbooks decades ago. Any biochemist who ever thought seriously enough about the definition to bring it up in a scientific paper should be embarrassed to admit that they ever believed such a ridiculous definition.
Intelligent Design Creationists are committed to the idea that most of our genome is functional. They oppose the idea that a significant proportion is junk. It's easy to see why a junky genome is incompatible with intelligent design but there's something more to their opposition than that.
I think they've painted themselves into a corner. They are so opposed to evolution and modern science that they will take any opportunity to discredit it. They saw a chance to do so about twenty years ago when they became aware of the controversy surrounding junk DNA. This was their chance to (pretend to) rely on real science to back their position. By taking a stance against junk DNA they could be seen to be supporting the latest evidence ... or so they thought.
Intelligent Design Creationists claim that they "predicted" that most of our genome would be functional. They claim that "Darwinists" predicted junk DNA. The second part isn't true since evolutionary theory is silent on whether some genomes could become bloated with junk DNA or not. However, the ID proponents are sticking to their guns in spite of the growing consensus that most of our genome is junk.
It's Sunday. I assume that most of you are at church but here's something for the rest of you. It's a sermon by William Lane Craig on The Moral Argument for the Existence of Gods. I found the link on Evolution News & Views (sic) a blog that's devoted to the "science" of Intelligent Design Creationism [Watch: Three from William Lane Craig]. I'm not sure what this has to do with evolution except that the word is mentioned a few times in the videos.
I'm also not sure what this has to do with intelligent design since, as we all know, the "science" of intelligent design has nothing to do with gods.
The National Academies of Sciences (USA) formed a committee to look into scientific integrity. A summary of the report was published in the June 26th issue of Science (Alberts et al., 2015).
I'd like to highlight two paragraphs of that report.
Like all human endeavors, science is imperfect. However, as Robert Merton noted more than half a century ago "the activities of scientists are subject to rigorous policing, to a degree perhaps unparalleled in any other field of activity." As a result, as Popper argued, "science is one of the very few human activities—perhaps the only one—in which errors are systematically criticized and fairly often, in time, corrected." Instances in which scientists detect and address flaws in work constitute evidence of success, not failure, because they demonstrate the underlying protective mechanisms of science at work.
All scientists know this, but some of us still get upset when other scientists correct our mistakes. We have learned to deal with such criticism—and dish it out ourselves—because we know that's how knowledge advances. Our standards are high.
I've just read Conceptual Breakthroughs in Evolutionary Genetics by John Avise. Avise is a Distinguished Professor of Ecology & Evolutionary Biology in the School of Biological Sciences at the University of California, Irvine (Irvine, California, USA). He has written a number of excellent books including, Inside the Human Genome: A Case for Non-Intelligent Design.
His latest book consists of 70 idiosyncratic "breakthroughs" that have changed the way we think about biology. Each one is introduced with a short paragraph outlining "The Standard Paradigm" followed by another paragraph on "The Conceptual Revolution." There are 70 chapters, one for each "breakthrough," and all of them are two pages in length.
Chapter 42 is entitled: "1970 The Flow of Information."
Here's the "standard paradigm" according to John Avise.
In biochemical genetics, the molecular direction of information flow is invariably from DNA → RNA → protein. In other words, DNA is first transcribed into RNA, which then may be translated into polypeptides that make up proteins. This view was so ensconced in the field that it had become known as the "central dogma" (Crick, 1970) of molecular biology.
It's true that the Watson version of the Central Dogma was "ensconced" by 1970 and it's true that the incorrect Watson version is still "ensconced" in the textbooks.
... once (sequential) information has passed into protein it cannot get out again (F.H.C. Crick, 1958)
The central dogma of molecular biology deals with the detailed residue-by-residue transfer of sequential information. It states that such information cannot be transferred from protein to either protein or nucleic acid. (F.H.C. Crick, 1970)
The version that John Avise refers to is the incorrect version promoted by Jim Watson.
I understand that many biologists have been taught an incorrect version of the Central Dogma but if you are going to write about it you are wise to read the original papers. In this case, Avise quotes the correct paper but he clearly has not read it.
Now let's look at the "conceptual revolution" according to John Avise.
Researchers showed that biochemical information could also flow from RNA → DNA. The key discovery came when Howard Temin and David Baltimore, working independently and on different viral systems, identified an enzyme (reverse transcriptase) that catalyzes the conversion of RNA into DNA, thus enabling the passage of genetic information in a direction contrary to the central dogma.
How do I know that John Avise has not read Crick's 1970 paper? Because here's what Crick says in that paper ...
"The central dogma, enunciated by Crick in 1958 and the keystone of molecular biology ever since, is likely to prove a considerable over-simplification."
This quotation is taken from the beginning of an unsigned article headed "Central dogma reversed", recounting the very important work of Dr Howard Temin and others showing that an RNA tumor virus can use viral RNA as a template for DNA synthesis. This is not the first time that the idea of the central dogma has been misunderstood, in one way or another. In this article I explain why the term was originally introduced, its true meaning, and state why I think that, properly understood, it is still an idea of fundamental importance.
Crick tells us that the discovery of reverse transcriptase did NOT conflict with the central dogma. Thus, John Avise's conceptual revolution never happened. What happened instead, at least for some biologists, is that the discovery of reverse transcriptase taught them that their view of the central dogma was wrong. Most biologists still haven't experienced that particular conceptual revolution.
Crick, F.H.C. (1958) On protein synthesis. Symp. Soc. Exp. Biol. XII:138-163.
Crick, F. (1970) Central Dogma of Molecular Biology. Nature 227, 561-563. [PDF file]