Friday, December 13, 2019

The "standard" view of junk DNA is completely wrong

I was browsing the table of contents of the latest issue of Cell and I came across this ....
For decades, the miniscule protein-coding portion of the genome was the primary focus of medical research. The sequencing of the human genome showed that only ∼2% of our genes ultimately code for proteins, and many in the scientific community believed that the remaining 98% was simply non-functional “junk” (Mattick and Makunin, 2006; Slack, 2006). However, the ENCODE project revealed that the non-protein coding portion of the genome is copied into thousands of RNA molecules (Djebali et al., 2012; Gerstein et al., 2012) that not only regulate fundamental biological processes such as growth, development, and organ function, but also appear to play a critical role in the whole spectrum of human disease, notably cancer (for recent reviews, see Adams et al., 2017; Deveson et al., 2017; Rupaimoole and Slack, 2017).

Slack, F.J. and Chinnaiyan, A.M. (2019) The Role of Non-coding RNAs in Oncology. Cell 179:1033-1055 [doi: 10.1016/j.cell.2019.10.017]
Cell is a high-impact, refereed journal so we can safely assume that this paper was reviewed by reputable scientists. This means that the view expressed in the paragraph above did not raise any alarm bells when the paper was reviewed. The authors clearly believe that what they are saying is true and so do many other reputable scientists. This seems to be the "standard" view of junk DNA among scientists who do not understand the facts or the debate surrounding junk DNA and pervasive transcription.

Here are some of the obvious errors in the statement.
  1. The sequencing of the human genome did NOT show that only ~2% of our genome consists of coding regions. That fact was known almost 50 years ago and the human genome sequence merely confirmed it.
  2. No knowledgeable scientist ever thought that the remaining 98% of the genome was junk—not in 1970 and not in any of the past fifty years.
  3. The ENCODE project revealed that much of our genome is transcribed at some time or another but it is almost certainly true that the vast majority of these low-abundance, non-conserved transcripts are junk RNA produced by accidental transcription.
  4. The existence of noncoding RNAs such as ribosomal RNA and tRNA was known in the 1960s, long before ENCODE. The existence of snoRNAs, snRNAs, regulatory RNAs, and various catalytic RNAs was known in the 1980s, long before ENCODE. Other RNAs, such as miRNAs, piRNAs, and siRNAs, were well known in the 1990s, long before ENCODE.
How did this false view of our genome become so widespread? It's partially because of the now highly discredited ENCODE publicity campaign orchestrated by Nature and Science but that doesn't explain everything. The truth is out there in peer-reviewed scientific publications but scientists aren't reading those papers. They don't even realize that their standard view has been seriously challenged. Why?


Monday, October 21, 2019

The evolution of de novo genes

De novo genes are new genes that arise spontaneously from junk DNA [De novo gene birth]. The frequency of de novo gene creation is important for an understanding of evolution. If it's a frequent event, then species with a large amount of junk DNA might have a selective advantage over species with less junk DNA, especially in a changing environment.

Last week I read a short Nature article on de novo genes [Levy, 2019] and I think the subject deserves more attention. Most new genes in a species appear to arise by gene duplication and subsequent divergence, but de novo genes are unrelated to genes in any other clade, so we can assume that they are created from junk DNA that accidentally becomes associated with a promoter, causing the DNA to be transcribed. A new gene is formed if the RNA acquires a function. If the transcript contains an open reading frame, it may be translated to produce a polypeptide, and if the polypeptide performs a new function then the resulting de novo gene is a new protein-coding gene.
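The "open reading frame" step in this scenario is easy to make concrete. Here is a minimal Python sketch (my own illustration, not from the article; the function name and minimum-length cutoff are arbitrary) that scans one strand of a transcript for ORFs, i.e. an ATG start codon followed by an in-frame stop codon:

```python
STOP_CODONS = {"TAA", "TAG", "TGA"}

def find_orfs(seq, min_codons=2):
    """Return (start, end) coordinates of open reading frames:
    an ATG followed, in the same frame, by a stop codon.
    Scans only the given strand, in all three reading frames."""
    seq = seq.upper()
    orfs = []
    for frame in range(3):
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if codon == "ATG" and start is None:
                start = i                      # first in-frame start codon
            elif codon in STOP_CODONS and start is not None:
                if (i - start) // 3 >= min_codons:
                    orfs.append((start, i + 3))  # include the stop codon
                start = None
    return orfs

print(find_orfs("ATGAAATAA"))   # one short ORF spanning the whole sequence
print(find_orfs("AAATTTGGG"))   # no ORF
```

A transcript with no hits here could still be a functional noncoding RNA, of course; the ORF is only relevant to the protein-coding route described above.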

The important question is whether the evolution of de novo genes is a common event or a rare event.

Tuesday, September 24, 2019

How many protein-coding genes in the human genome? (2)

It's difficult to know how many protein-coding genes there are in the human genome because there are several different ways of counting and the counts depend on what criteria are used to identify a gene. Last year I commented on a review by Abascal et al. (2018) that concluded there were somewhere between 19,000 and 20,000 protein-coding genes. Those authors discussed the problems with annotation and pointed out that the major databases don't agree on the number of genes [How many protein-coding genes in the human genome?].

Wednesday, September 11, 2019

Gerald Fink promotes a new definition of a gene

This is the 2019 Killian lecture at MIT, delivered in April 2019 by Gerald Fink. Fink is an eminent scientist who has done excellent work on the molecular biology of yeast. He was director of the prestigious Whitehead Institute at MIT from 1990 to 2001. With those credentials you would expect to watch a well-informed presentation of the latest discoveries in molecular genetics. Wouldn't you?



Sunday, September 08, 2019

Contingency, selection, and the long-term evolution experiment

I'm a big fan of Richard Lenski's long-term evolution experiment (LTEE) and of Zachary Blount's work in particular. [Strolling around slopes and valleys in the adaptive landscape] [On the unpredictability of evolution and potentiation in Lenski's long-term evolution experiment] [Lenski's long-term evolution experiment: the evolution of bacteria that can use citrate as a carbon source]

The results of the LTEE raise some interesting questions about evolution. The Lenski experiment began with 12 (almost) identical cultures and these have now "evolved" for 31 years and more than 65,000 generations. All of the cultures have diverged to some extent and one of them (and only one) has developed the ability to use citrate as a carbon source. Many of the cultures exhibit identical, or very similar, mutations that have reached significant frequencies, or even fixation, in the cultures.
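The generation count follows directly from the design of the experiment: each LTEE culture is diluted 1:100 into fresh medium every day, so the bacteria must double log2(100) ≈ 6.64 times per day to regrow. A quick back-of-the-envelope check (the daily 1:100 transfer is the standard LTEE protocol; interruptions for freezing and restarts are ignored here):

```python
import math

dilution = 100                       # daily 1:100 transfer into fresh medium
gens_per_day = math.log2(dilution)   # doublings needed to regrow after dilution
days_needed = 65_000 / gens_per_day  # days of transfers for 65,000 generations

print(f"{gens_per_day:.2f} generations per day")
print(f"{days_needed / 365.25:.1f} years of daily transfers")
```

That works out to roughly 27 years of actual transfer days, consistent with more than 65,000 generations accumulating over 31 calendar years once interruptions are allowed for.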

Several other laboratory evolution experiments have been completed or are underway in various labs around the world. The overall results are relevant to a discussion about the role of contingency and accident in the history of life [see Evolution by Accident]. Is it true that if you replay the tape of life the results will be quite different? [Replaying life's tape].

Friday, August 30, 2019

Evolution by Accident

Evolution by Accident
v1.43 ©2006 Laurence A. Moran

This essay has been transferred here from an old server that has been decommissioned.

Modern concepts of evolutionary change are frequently attacked by those who find the notions of randomness, chance, and accident to be highly distasteful. Some of these critics are intelligent design creationists and their objections have been refuted elsewhere. In this essay I'm more concerned about my fellow evolutionists who go to great lengths to eliminate chance and accident from all discussions about the fundamental causes of evolution. This is my attempt to convince them that evolution is not as predictable as they claim. I was originally stimulated to put my ideas down on paper when I read essays by John Wilkins [Evolution and Chance] and Loren Haarsma [Chance from a Theistic Perspective] on the TalkOrigins Archive.

The privilege of living beings is the possession of a structure and of a mechanism which ensures two things: (i) reproduction true to type of the structure itself, and (ii) reproduction equally true to type, of any accident that occurs in the structure. Once you have that, you have evolution, because you have conservation of accidents. Accidents can then be recombined and offered to natural selection to find out if they are of any meaning or not.
Jacques Monod (1974) p.394
The main conclusion of this essay is that a large part of ongoing evolution is determined by stochastic events that might as well be called "chance" or "random." Furthermore, a good deal of the past history of life on Earth was the product of chance events, or accidents, that could not have been predicted. When I say "evolution by accident" I'm referring to all these events. This phrase is intended solely to distinguish "accidental" evolution from that which is determined by non-random natural selection. I will argue that evolution is fundamentally a random process, although this should not be interpreted to mean that all of evolution is entirely due to chance or accident. The end result of evolution by accident is modern species that do not look designed.

Tuesday, August 27, 2019

First complete sequence of a human chromosome

A paper announcing the first complete sequence of a human chromosome has recently been posted on the bioRxiv server.

Miga, K.H., Koren, S., Rhie, A., Vollger, M.R., Gershman, A., Bzikadze, A., Brooks, S., Howe, E., Porubsky, D., Logsdon, G.A., et al. (2019) Telomere-to-telomere assembly of a complete human X chromosome. bioRxiv 735928 [doi: 10.1101/735928]

Abstract: After nearly two decades of improvements, the current human reference genome (GRCh38) is the most accurate and complete vertebrate genome ever produced. However, no one chromosome has been finished end to end, and hundreds of unresolved gaps persist. The remaining gaps include ribosomal rDNA arrays, large near-identical segmental duplications, and satellite DNA arrays. These regions harbor largely unexplored variation of unknown consequence, and their absence from the current reference genome can lead to experimental artifacts and hide true variants when re-sequencing additional human genomes. Here we present a de novo human genome assembly that surpasses the continuity of GRCh38, along with the first gapless, telomere-to-telomere assembly of a human chromosome. This was enabled by high-coverage, ultra-long-read nanopore sequencing of the complete hydatidiform mole CHM13 genome, combined with complementary technologies for quality improvement and validation. Focusing our efforts on the human X chromosome, we reconstructed the ∼2.8 megabase centromeric satellite DNA array and closed all 29 remaining gaps in the current reference, including new sequence from the human pseudoautosomal regions and cancer-testis ampliconic gene families (CT-X and GAGE). This complete chromosome X, combined with the ultra-long nanopore data, also allowed us to map methylation patterns across complex tandem repeats and satellite arrays for the first time. These results demonstrate that finishing the human genome is now within reach and will enable ongoing efforts to complete the remaining human chromosomes.

Sunday, August 25, 2019

How much of the human genome has been sequenced?

It's been more than seven years since I posted information on how much of the human genome has been sequenced [How Much of Our Genome Is Sequenced?]. At that time, the latest version of the human reference genome was GRCh37.p7 (Feb. 3, 2012) and 89.6% of the genome had been sequenced. It's time to update that information.

We have a pretty good idea of the size of the human genome based on quantitative Feulgen staining (1940-1980) and reassociation kinetic experiments from the 1970s (Morton, 1991). We can safely assume that the correct size of the human genome is close to 3,200,000,000 bp (3,200,000 kb, 3,200 Mb, 3.2 Gb) [How Big Is the Human Genome?]. That's the value cited most often in the literature. However, the actual values calculated by Morton (1991) were 3.227 Gb for the haploid female genome and less than that for the haploid male genome. The human reference genome contains all 22 autosomes plus one copy of the X chromosome and one copy of the Y chromosome. This gives a total of 3.286 Gb.
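The arithmetic connecting these two numbers is worth making explicit: the difference between the reference total and Morton's female haploid value is the Y chromosome, roughly 59 Mb. A two-line check (the Y-chromosome figure is simply the difference implied by the two totals quoted above):

```python
female_haploid = 3_227_000_000   # 22 autosomes + X, bp (Morton, 1991)
y_chromosome = 59_000_000        # approximate Y size implied by the totals
reference_total = female_haploid + y_chromosome  # 22 autosomes + X + Y

print(f"{reference_total / 1e9:.3f} Gb")  # 3.286 Gb
```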

Thursday, August 22, 2019

Reactionary fringe meets mutation-biased adaptation.
7. Going forward

This is the last of a series of posts by Arlin Stoltzfus on the role of mutation as a dispositional factor in evolution. Arlin has established that the role of mutation in evolution is much more important than most people realize. He has also built a strong case for the influence of mutation bias. How should we incorporate these concepts into modern evolutionary theory?

Click on the links in the box (below) to see the other posts in the series.



Reactionary fringe meets mutation-biased adaptation.
7. Going forward

by Arlin Stoltzfus

Haldane (1922) argued that, because mutation is a weak pressure easily overcome by selection, the potential for biases in variation to influence evolution depends on neutral evolution or high mutation rates. This theory, like the Modern Synthesis of 1959, depends on the assumption that evolution begins with pre-existing variation. By contrast, when evolution depends on the introduction of new variants, mutational and developmental biases in variation may impose biases on evolution, without requiring neutral evolution or high mutation rates.
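The contrast can be made concrete with a toy origin-fixation calculation (my own sketch with assumed parameter values, not Arlin's code). When evolution waits for new mutations, allele i arises and fixes at a rate of roughly N·u_i·2s_i per generation, so a 10-fold bias in mutation rate produces a roughly 10-fold bias in outcomes even when the selection coefficients are equal and nothing is neutral:

```python
import random

def first_fixation_freq(u1, u2, s1, s2, N, trials=100_000, seed=1):
    """Fraction of trials in which allele 1 fixes before allele 2 when
    evolution depends on the introduction of new variants.
    Origin-fixation rate for allele i: N * u_i * (2 * s_i) per generation,
    so waiting times are modeled as exponential with those rates."""
    rng = random.Random(seed)
    r1, r2 = N * u1 * 2 * s1, N * u2 * 2 * s2
    wins = sum(rng.expovariate(r1) < rng.expovariate(r2)
               for _ in range(trials))
    return wins / trials

# A 10-fold mutation bias toward allele 1, identical selection coefficients:
p = first_fixation_freq(u1=1e-6, u2=1e-7, s1=0.01, s2=0.01, N=10_000)
print(p)  # close to 10/11: the mutationally favored allele usually wins
```

The analytical expectation is r1/(r1 + r2), which reduces to u1/(u1 + u2) when the selection coefficients are equal; this is why neither neutrality nor a high mutation rate is needed for the bias in introduction to show through.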

Thursday, August 15, 2019

Reactionary fringe meets mutation-biased adaptation.
5.5 Synthesis apologetics

This is part of a continuing series of posts by Arlin Stoltzfus on the role of mutation as a dispositional factor in evolution. In this post, Arlin explains how defenders of the Modern Synthesis react in the face of serious challenges to the theory that was formulated in the 1940s and 50s. Rather than reject the theory, they engage in various forms of "synthesis apologetics."

Click on the links in the box (below) to see the other posts in the series.




Reactionary fringe meets mutation-biased adaptation. 5.5 Synthesis apologetics
by Arlin Stoltzfus

Tuesday, August 06, 2019

Reactionary fringe meets mutation-biased adaptation.
5.4. Taking neo-Darwinism seriously

This is part of a continuing series of posts by Arlin Stoltzfus on the role of mutation as a dispositional factor in evolution. In this post Arlin discusses his view of neo-Darwinism and why it is inconsistent with macromutations and lateral gene transfer. He equates neo-Darwinism with the Modern Synthesis (1959 version), a comparison that might be challenged. Click on the links in the box (below) to see the other posts in the series.




Reactionary fringe meets mutation-biased adaptation. 5.4. Taking neo-Darwinism seriously
by Arlin Stoltzfus

The Modern Synthesis is often described as the result of combining Darwinism and genetics. This description, in my opinion, is concise and historically accurate: the Modern Synthesis of 1959 is a sophisticated attempt to arrange the pieces of population genetics to justify a neo-Darwinian dichotomy in which variation merely supplies raw materials, and selection is the source of initiative, creativity and direction.

Monday, August 05, 2019

Religion vs science (junk DNA): a blast from the past

I was checking out the science books in our local bookstore the other day and I came across Evolution 2.0 by Perry Marshall. It was published in 2015 but I don't recall seeing it before.

The author is an engineer (The Salem Conjecture) who's a big fan of Intelligent Design. The book is an attempt to prove that evolution is a fraud.

I checked to see if junk DNA was mentioned and came across the following passages on pages 273-275. It's interesting to read them in light of what's happened in the past four years. I think that the view represented in this book is still the standard view in the ID community in spite of the fact that it is factually incorrect and scientifically indefensible.

Friday, August 02, 2019

Reactionary fringe meets mutation-biased adaptation.
6. What "limits" adaptation?

This is part of a continuing series of posts by Arlin Stoltzfus on the role of mutation as a dispositional factor in evolution. In this post Arlin discusses the role of adaptation and what determines the pathway that it will take over time. Is it true that populations will always adapt quickly to any change in the environment? (Hint: no it isn't!) Click on the links in the box (below) to see the other posts in the series.




Reactionary fringe meets mutation-biased adaptation.
6. What "limits" adaptation?

by Arlin Stoltzfus
According to the hatchet piece at TREE, theoretical considerations dictate that biases in variation are unlikely to influence adaptation, because this requires small population sizes and reciprocal sign epistasis.

Yet, we have established that mutation-biased adaptation is real (see The empirical case and Some objections addressed). If theoretical population genetics tells us that mutation-biased adaptation is impossible or unlikely, what is wrong with theoretical population genetics?

Adaptation, before Equilibrium Day

Wednesday, July 31, 2019

Reactionary fringe meets mutation-biased adaptation.
5.3. How history is distorted.

This is the ninth in a series of guest posts by Arlin Stoltzfus on the role of mutation as a dispositional factor in evolution. Click on the links in the box (below) to see the other posts in the series.


Reactionary fringe meets mutation-biased adaptation.
5.3. How history is distorted.

by Arlin Stoltzfus
In his famous Materials for the Study of Variation, Bateson (1894) refers to natural selection as "obviously" a "true cause" (p. 5). Punnett (1905) explains that mutations are heritable while environmental fluctuations are not, concluding that "Evolution takes place through the action of selection on these mutations" (p. 53). De Vries begins his major 1905 English treatise by writing that ...
"Darwin discovered the great principle which rules the evolution of organisms. It is the principle of natural selection. It is the sifting out of all organisms of minor worth through the struggle for life. It is only a sieve, and not a force of nature" (p. 6)
Morgan (1916), in his closing summary, writes:
"Evolution has taken place by the incorporation into the race of those mutations that are beneficial to the life and reproduction of the organism" (p. 194)

Monday, July 22, 2019

Reactionary fringe meets mutation-biased adaptation.
5.2. The Modern Synthesis of 1959

This is the eighth in a series of guest posts by Arlin Stoltzfus on the role of mutation as a dispositional factor in evolution.


Reactionary fringe meets mutation-biased adaptation. 5.2. The Modern Synthesis of 1959
by Arlin Stoltzfus

As we learned in What makes it new?, the newness of the effect of biases in the introduction process results from a classical assumption that evolution can be understood as a process of shifting the frequencies of existing alleles. How did this position emerge? Was it a technical, mathematical issue?