This is the fourth paper that's critical of the ENCODE hype. The first was Sean Eddy's paper in Current Biology (Eddy, 2012), the second was a paper by Niu and Jiang (2012), and the third was a paper by Graur et al. (2013). In my experience this is unusual, since the critiques are all directed at how the ENCODE Consortium interpreted their data and how they misled the scientific community (and the general public) by exaggerating their results. Those kinds of criticisms are common in journal clubs and, certainly, in the blogosphere, but scientific journals generally don't publish them. It's okay to refute the data (as in the arsenic affair) but ideas usually get a free pass no matter how stupid they are.
In this case, the ENCODE Consortium did such a bad job of describing their data that journals had to pay attention. (It helps that much of the criticism is directed at Nature and Science because the other journals want to take down the leaders!)
Ford Doolittle makes some of the same points made in the other papers. For example, he points out that the ENCODE definition of "function" is not helpful. However, Ford also does a good job of explaining why the arguments in favor of junk DNA are still valid. He says ...
My aim here is to remind readers of the structure of some earlier arguments in defense of the junk concept (10) that remain compelling, despite the obvious success of ENCODE in mapping the subtle and complex human genomic landscape.

The emphasis is on the "C-Value Paradox," by which he means the tons of work on variation in genome size. The conclusion from all that effort—dating back to the 1960s—was that large genomes contain a huge amount of non-functional DNA, or junk DNA. This DNA is still junk despite the fact that it may bind transcription factors and contain regions of modified chromatin. These sites are called "functional elements" (FEs) in the ENCODE papers even though they probably don't have a "function" in any meaningful sense of the word.
Ford proposes a thought experiment based on our understanding of genome sizes. He notes that lungfish have a huge genome (130,000 Mb) while pufferfish (Takifugu) have a genome (400 Mb) that's much smaller than ours. (Our genome is 3,200 Mb.) He also notes that there are many closely related species of amphibians, plants, and protists that differ significantly in genome size.
Here's the thought experiment ...
Suppose that there had been (and probably, some day, there will be) ENCODE projects aimed at enumerating, by transcriptional and chromatin mapping, factor footprinting, and so forth, all of the FEs in the genomes of Takifugu and a lungfish, some small and large genomed amphibians (including several species of Plethodon), plants, and various protists. There are, I think, two possible general outcomes of this thought experiment, neither of which would give us clear license to abandon junk.

The first outcome is that all genomes, regardless of size, will have approximately the same number of FEs as the human genome. Since all these species have about the same number of genes, this means that each gene requires a constant number of FEs and it's just lucky that the human genome is almost entirely "functional." There would still have to be lots of junk in species with larger genomes. This result would make it difficult to explain why pufferfish survive with only one-eighth as much DNA as humans.
The second outcome of the thought experiment would be that FEs correlate with C-value regardless of complexity. In other words, very similar species will have very different numbers of FEs in spite of the fact that they have the same numbers of genes and the same level of complexity. This is the expected result if FEs are mostly spurious binding sites that have no function but it would be difficult to explain if FEs are really doing something important in regulating gene expression.
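To see why the second outcome is what a junk-filled genome predicts, here's a minimal back-of-the-envelope sketch in Python (mine, not Doolittle's). It assumes a hypothetical transcription factor that recognizes one specific 8-bp sequence and genomes with uniform base composition; real binding motifs are degenerate and real genomes have biased composition, so the absolute counts are only illustrative. The point is the linear scaling with genome size.

```python
# Toy model: expected number of chance matches to one specific 8-bp motif
# in genomes of different sizes, assuming uniform base composition (25% each).
# Real motifs are degenerate, which would inflate these counts considerably.

MOTIF_LENGTH = 8
P_MATCH = 0.25 ** MOTIF_LENGTH       # probability that a random 8-mer matches

genome_sizes_mb = {
    "pufferfish (Takifugu)": 400,
    "human": 3_200,
    "lungfish": 130_000,
}

for species, size_mb in genome_sizes_mb.items():
    bases = size_mb * 1_000_000
    # factor of 2 because a binding site can occur on either strand
    expected_hits = 2 * (bases - MOTIF_LENGTH + 1) * P_MATCH
    print(f"{species:>22}: ~{expected_hits:,.0f} chance matches")
```

Chance matches alone grow in direct proportion to C-value (about twelve thousand in pufferfish, roughly a hundred thousand in humans, a few million in lungfish), which is exactly the correlation the second outcome describes.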
The Doolittle thought experiment is similar to Ryan Gregory's Onion Test [see Genome Size, Complexity, and the C-Value Paradox]. In both cases, those who proclaim the death of junk DNA are challenged to explain how their hypothesis is consistent with what we know about variation in genome size. I think it's pretty obvious that the ENCODE leaders haven't thought about the evidence for junk DNA.
There are several other interesting points in Ford Doolittle's paper—most of which I agree with. I want to quote some of them because he says it so much better than I. Here's his critique of the definition of function used by the scientists in the ENCODE Consortium.
A third, and the least reliable, method to infer function is mere existence. The presence of a structure or the occurrence of a process or detectable interaction, especially if complex, is taken as adequate evidence for its being under selection, even when ablation is infeasible and the possibly selectable effect of presence remains unknown. Because our genomes have introns, Alu elements, and endogenous retroviruses, these things must be doing us some good. Because a region is transcribed, its transcript must have some fitness benefit, however remote. Because residue N of protein P is leucine in species A and isoleucine in species B, there must be some selection-based explanation. This approach enshrines "panadaptationism," which was forcefully and effectively debunked by Gould and Lewontin (34) in 1979 but still informs much of molecular and evolutionary genetics, including genomics. As Lynch (39) argues in his essay "The Frailty of Adaptive Hypotheses for the Origins of Adaptive Complexity,"

This narrow view of evolution has become untenable in light of recent observations from genomic sequencing and population genetic theory. Numerous aspects of genomic architecture, gene structure, and developmental pathways are difficult to explain without invoking the nonadaptive forces of genetic drift and mutation. In addition, emergent biological features such as complexity, modularity, and evolvability, all of which are current targets of considerable speculation, may be nothing more than indirect by-products of processes operating at lower levels of organization.

Functional attribution under ENCODE is of this third sort (mere existence) in the main.

Ford is correct. The ENCODE leaders don't seem to be particularly knowledgeable about modern evolutionary theory.
Ford Doolittle makes another point that also came up during my critique of Jonathan Wells' book The Myth of Junk DNA. Wells knows that SOME transposons have secondarily acquired a function so he assumes that ALL transposons must have a function. There are many scientists who fall into the same trap—they find a function for one or two particular features of the genome and then leap to the conclusion that all such features are functional.
Doolittle uses the example of lncRNAs (long non-coding RNAs). A very small number of them have been shown to have a function but that does not mean that all of them do. Ford's point is that, given the way science operates, it is guaranteed that someone will find a function for at least one lncRNA because biology is messy.
Regulation, defined in this loose way, is, for instance, the assumed function of many or most lncRNAs, at least for some authors (6,7,44,45). However, the transcriptional machinery will inevitably make errors: accuracy is expensive, and the selective cost of discriminating against all false promoters will be too great to bear. There will be lncRNAs with promoters that have arisen through drift and exist only as noise (46). Similarly, binding to proteins and other RNAs is something that RNAs do. It is inevitable that some such interactions, initially fortuitous, will come to be genuinely regulatory, either through positive selection or the neutral process described below as constructive neutral evolution (CNE). However, there is no evolutionary force requiring that all or even most do. At another (sociology of science) level, it is inevitable that molecular biologists will search for and discover some of those possibly quite few instances in which function has evolved and argue that the function of lncRNAs as a class of elements has, at last, been discovered. The positivist, verificationist bias of contemporary science and the politics of its funding ensure this outcome.

What he didn't say—probably because he's too kind—is that these same pressures (pressure to publish and pressure to get funding) probably lead to incorrect claims of function.
It should be obvious to all of you that Ford Doolittle expects that the outcome of the thought experiment will be that "functional elements" (i.e. binding sites etc.) will correlate with genome size. This means that FEs aren't really functional at all—they are part of the junk. Does it make sense that our genome is full of nonfunctional bits of DNA? Yes, it does, especially if the spurious binding sites are neutral. It even makes sense if they are slightly deleterious because Doolittle understands Michael Lynch's arguments for the evolution of nonadaptive complexity.
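Lynch's argument can be put into numbers using Kimura's standard diffusion approximation for the fixation probability of a new mutation. The sketch below is mine, not from Doolittle's paper: it assumes a diploid population with census size equal to Ne and an arbitrarily chosen selection coefficient of s = -10^-6 for a bit of junk, and it shows that such a slightly deleterious insertion behaves as effectively neutral when Ne is small but is purged when Ne is large.

```python
import math

def fixation_probability(s, ne):
    """Kimura's diffusion approximation for the fixation probability of a new
    mutation with additive selection coefficient s in a diploid population of
    effective size ne (initial frequency 1/(2*ne); census size taken as ne)."""
    if s == 0:
        return 1.0 / (2 * ne)
    return (1 - math.exp(-2 * s)) / (1 - math.exp(-4 * ne * s))

# A slightly deleterious insertion (s = -1e-6, an arbitrary illustrative value)
# is nearly as likely to fix as a neutral one in a small population,
# but selection removes it efficiently in a large population.
s = -1e-6
for ne in (1e4, 1e5, 1e6, 1e7):
    p_fix = fixation_probability(s, ne)
    neutral = 1.0 / (2 * ne)
    print(f"Ne = {ne:>12,.0f}: P(fixation) = {p_fix:.3e}  (neutral: {neutral:.3e})")
```

When |4Ne·s| is much less than 1 the fixation probability is essentially the neutral value 1/(2Ne), so drift can fix the insertion; as Ne grows, selection takes over and the probability collapses toward zero.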
Assuming these predictions are borne out, what might we make of it? Lynch (39) suggests that much of the genomic- and systems-level complexity of eukaryotes vis à vis prokaryotes is maladaptive, reflecting the inability of selection to block fixation of incrementally but mildly deleterious mutations in the smaller populations of the former.

It's clear that the ENCODE leaders don't think like this. So, what motivates them to propose that our genome is full of regulatory sites and molecules when there seems to be a more obvious explanation of their data? Doolittle has a suggestion ...
[A fourth misconception] may be a seldom-articulated or questioned notion that cellular complexity is adaptive, the product of positive selection at the organismal level. Our disappointment that humans do not have many more genes than fruit flies or nematodes has been assuaged by evidence that regulatory mechanisms that mediate those genes’ phenotypic expressions are more various, subtle, and sophisticated (57), evidence of the sort that ENCODE seems to vastly augment. Yet there are nonselective mechanisms, such as [constructive neutral evolution], that could result in the scaling of FEs as ENCODE defines them to C-value nonadaptively or might be seen as selective at some level higher or lower than the level of individual organisms. Splits within the discipline between panadaptationists/neutralists and those researchers accepting or doubting the importance of multilevel selection fuel this controversy and others in biology.

I agree. Part of the problem is adaptationism and the fact that many biochemists and molecular biologists don't understand modern concepts in evolution. And part of the problem is The Deflated Ego Problem.
It's a mistake to think that this debate is simply about how you define function. That seems to be the excuse that the ENCODE leaders are making in light of these attacks. Here's how Ford Doolittle explains it ...
In the end, of course, there is no experimentally ascertainable truth of these definitional matters other than the truth that many of the most heated arguments in biology are not about facts at all but rather about the words that we use to describe what we think the facts might be. However, that the debate is in the end about the meaning of words does not mean that there are not crucial differences in our understanding of the evolutionary process hidden beneath the rhetoric.

This reminds me of something that Stephen Jay Gould said in Darwinism and the Expansion of Evolutionary Theory (Gould, 1982).
The world is not inhabited exclusively by fools and when a subject arouses intense interest and debate, as this one has, something other than semantics is usually at stake.
Doolittle, W.F. (2013) Is junk DNA bunk? A critique of ENCODE. Proc. Natl. Acad. Sci. (USA) published online March 11, 2013. [PubMed] [doi: 10.1073/pnas.1221376110]
Graur, D., Zheng, Y., Price, N., Azevedo, R. B., Zufall, R. A., and Elhaik, E. (2013) On the immortality of television sets: "function" in the human genome according to the evolution-free gospel of ENCODE. Genome Biology and Evolution published online: February 20, 2013 [doi: 10.1093/gbe/evt028]
Gould, S. J. (1982) "Darwinism and the expansion of evolutionary theory." Science 216: 380-387.
Eddy, S. R. (2012) The C-value paradox, junk DNA and ENCODE. Current Biology, 22(21), R898. [preprint PDF]
Niu, D. K., and Jiang, L. (2012) Can ENCODE tell us how much junk DNA we carry in our genome? Biochemical and Biophysical Research Communications 430:1340-1343. [doi: 10.1016/j.bbrc.2012.12.074]
I always liked Ford Doolittle's work. He's anything but a BSer.
Lynch is widely quoted these days, but I find myself sceptical of the primacy of population size as a significant driver of the C-value differential between eukaryotes and prokaryotes. What does 'population size' even mean in a mixed collection of clonal organisms? The individual bacterium has severe local constraints, and they arise from its means of making a living, not least of which are bounding within a size-limiting energy-generating outer membrane and being surrounded by cousins.
Endosymbionts, relaxation of nutritional limits, cytoskeletons, multiple origins and sex are, IMO, much more significant than Ne with respect to the adaptive landscape traversed by the respective organisms wrt junk.
Lynch's point is that the small population sizes of SOME eukaryotes make it impossible to eliminate slightly deleterious alleles and this is an adequate explanation of junk DNA (and other things).
The explanation of junk DNA has nothing to do with adaptive landscapes. That's the whole point of nonadaptive evolution of complexity.
I agree. Population size is probably not a good explanation for the prokaryote/eukaryote differences in C-value. I think it's mostly energetics; eukaryotes have a substantial energy surplus, thanks to mitochondria, that prokaryotes don't, and that enables the former to "ignore" to a certain extent the energy cost of replicating "junk". Energetic surplus is very probably the reason why eukaryotic cells achieved the level of complexity and multicellular cell specialization they have.
I do think, however, that population size is certainly important in fixation of neutral or quasi-neutral mutations, regardless of the organisms being clonal or not. But that probably has nothing to do with C-values anyway.
I don't find it convincing. All one would need to do to eliminate the junk in a genome is to make the population bigger?
It is true that scaling up a population's number will render elimination of deleterious alleles more likely, but there are a couple of unproven assumptions in the most naive model. Do we know, for instance, what the selective coefficient of N bits of junk 'typically' is? It would have to be in a particular band to ensure that it could not be eliminated by a population of N individuals, but could be by a population a couple of orders of magnitude higher. Do we know that it is? There is also the issue of scaling. If you argue with N as the only variable, you have to do something to ensure that the new N is stirred with the same efficiency as the old, and typically it isn't (and in prokaryotes, the absence of mate search means that a significant vector is completely absent).
The explanation of junk DNA has nothing to do with adaptive landscapes. That's the whole point of nonadaptive evolution of complexity.
There is nothing to say an adaptive landscape cannot be flat! As far as 'surplus' DNA is concerned, prokaryotes are on peaks. For eukaryotes, the landscape is much flatter - then population size may start to exert an effect. But there are many more potent mechanistic biases than that.
@Pedro,
Yes, I'd agree energetics is significant, but also basic 'nutrition' - one has to get the building blocks for all this surplus, and consumption is far more productive than absorption. As far as multicellulars go, I think their main cost is that multicellular body, and unless nutrition is severely limiting in general, the cost of being-multicellular-with-junk imposes a negligible increase.
Allan Miller asks,
All one would need to do to eliminate the junk in a genome is to make the population bigger?
That doesn't necessarily follow but it does explain why most single-celled eukaryotes have genomes that are much smaller than those of multicellular species. Yeast (a fungus), for example, has a genome that's about the same size as those of some bacteria.
(Note to nitpickers: I'm aware of amoeba genomes.)
Allan,
Yes, I agree.
Allan Miller says,
If you argue with N as the only variable, you have to do something to ensure that the new N is stirred with the same efficiency as the old, and typically it isn't (and in prokaryotes, the absence of mate search means that a significant vector is completely absent).
I suspect that you haven't read Lynch's book. He discusses all sorts of other variables that can affect genome size. The main ones are mutation rates (especially the difference between deletion and insertion rates), generation times, recombination, and body size. He discusses whether these are correlated with population size and, if so, how. He also spends a lot of time on effective population sizes and evolution in species with many subpopulations.
In most cases, Lynch presents data to back up his ideas.
Amoebas have one special feature that has to be taken into account when Ne is discussed: asexuality.
Of course, nobody has sequenced those genomes and nobody will until super long-read sequencing becomes available, but I would bet they're full of out-of-control transposons, probably in combination with some serious polyploidy.
And that would be consistent with the theory. Which, BTW, does not say that Ne is the only thing that matters - all sorts of details about the biology of the species, the mutation rates and patterns, etc. do matter and it is not an absolute relationship as a result, only a general pattern: small Ne => big genomes, full of transposons, large Ne => small genomes, few transposons, fewer introns, etc.
Larry/Georgi,
Correct, I should perhaps read more before pontificating. Nonetheless, the difficulty I see is in establishing that it is Ne itself that is the factor at work, rather than something of which change in Ne is an inevitable corollary.
Multicellularity, for example. The principal cost of extra DNA is in building the soma. Such somas inevitably reduce the number of germ line cells that a given niche can support. But such somas generally pay their way in germ cell survival, with a bit to spare. Unless nutrition is severely and consistently limiting, junk-generating mutations (up to a point) may simply not be deleterious, for any Ne. For a population to become large, nutritional limitation - a significant determinant of 'detriment' - is less likely to be operational.
I realise that nutrition is not the sole 'cost' of junk, and that other factors, such as meiotic misalignment, may come into play. Which may help explain the higher rate of transposons in asexual eukaryotes.
Allan Miller,
I am not sure if I understand you correctly, but if I do, you are in effect arguing for the "Ne determines the strength of selection" position while claiming to oppose it.
Yes, in terms of nutrition, the mere existence of junk DNA has a very slight negative effect. That means the absolute value of the selective coefficient is very low. Accordingly, 1/(4Ne) >> |s| and it becomes an effectively neutral mutation and selection cannot get rid of it. That's the point.
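For what it's worth, here is a tiny sketch of the threshold that criterion implies (the 1/(4Ne) cutoff is the conventional rule of thumb for diploids; the exact constant depends on the model, so the figures are order-of-magnitude only):

```python
# Rule of thumb: a deleterious allele with coefficient s is effectively neutral
# (drift dominates selection) when |s| << 1/(4*Ne), i.e. when Ne << 1/(4*|s|).
for s in (1e-4, 1e-6, 1e-8):
    ne_threshold = 1 / (4 * s)
    print(f"|s| = {s:.0e}: drift dominates when Ne is well below ~{ne_threshold:,.0f}")
```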
Georgi,
Yes, I think you may be misunderstanding me. I am recognising the argument, but questioning its force.
For any imaginary small value of s, there is a threshold value of Ne below which selection is ineffective. The question would be: how robust is the assumption that s for junk increments is generally in the range that enables this relationship to have causal power? It may indeed be a factor (allowing certain assumptions about the efficiency with which real populations are stirred and 'sampled') but a dominant factor, I'm less convinced.
The null hypothesis would be that s=0, and Ne doesn't matter. Adaptationists are criticised for assuming s is large, but here we have a similar assumption - s is nonzero and within a particular range.
Obviously, the situation here is rather complex, because the detriment of a given increment of junk is highly contingent. s depends on how much more or less efficient than the rest of the population it makes its bearers, and this will vary depending on how big it is and what other increments are around in the population at the time.
I accept the approximate correlation, and the possibility that Ne may be one factor, but there are significant mechanistic factors relating to the different types of organism that themselves affect both the dynamics of junk, and Ne. It may be these, rather than probabilistic effects, that provide the causation.
So if I now understand you correctly, are you arguing that the selection coefficient of new TE insertions is always 0 so Ne does not play a role? That's not true. It is clearly not 0 because if it were, there would not be such elaborate defense mechanisms trying to prevent new insertions and TEs would not be so rare in organisms with very large population sizes.
So if I now understand you correctly, are you arguing that the selection coefficient of new TE insertions is always 0 so Ne does not play a role?
Nope - that was my 'null hypothesis' :) I'm certainly not saying "s always = 0". I was arguing particularly on the nutritional cost assumption, and was looking for some justification of the assumption that that cost is sufficient for natural Ne differentials, and their effect on selective response, to provide 'the' explanation for patterns of junk.
I granted that other mechanisms act against junk. In particular, a virulent transposon clearly does damage to genes, causes meiotic misalignments, and ups the genetic load at a rate potentially far in excess of that which can be absorbed by the below-threshold differentials of less active insertions. Then, of course, real and potentially large selection coefficients leap into action. No argument here.
I'm interested in "TEs would not be so rare in organisms with very large population sizes.", however. What organisms are we talking about here? I am generally arguing for comparing like with like, so I'd hope this was not across a major divide such as pro/eukaryote or uni/multicellular.
Uni/multicellular is not a large divide - there is nothing fundamentally different about unicellular eukaryotes, it's just our historic multicellular bias that creates the division. Multicellularity has developed multiple times in multiple lineages, independently.
Most unicellular eukaryotes have large Ne, small genomes, with few TEs, and few introns. Similarly, smaller multicellular eukaryotes have more compressed genomes than mammals (but bigger than those of protists) and fewer TEs - flies are a perfect example and in their case TEs have the added bonus of being younger and more active, i.e. the old ones have been purged already.
As I said, it's not a perfect relationship, it's not expected to be, but it exists.
My point about the unicellular/multicellular divide relates to a couple of mechanistic factors that may have a bearing - one is that the 'nutrition' cost is principally borne by the extra DNA in somatic cells, not by a combined somatic/germ cell, another is that generation times tend to be extended by the existence of that phase, increasing the time available for replication.
But another is that the germ line DNA is encapsulated. There is less constraint to gather the materials for life in close contact with an unforgiving medium - the unit cost of germline DNA goes down; the soma pays for itself and then some.
Prokaryotes are most severely constrained, as they are tiny, energetically limited, in direct molecular competition with relatives, must separate sister chromosomes by cell wall growth, etc.
Single-celled eukaryotes are about 10,000 times bigger, with cytokinetic manipulation, multiple origins of replication, food engulfment, storage and/or a large energetic surface. This reduces the restraint on genome expansion.
Multicellular eukaryotes can sit in their somatic cloak, indulging a life of leisure, with cellular specialisation the payoff for elaboration, and further freeloading DNA the cost. If you are indulging the cost of a soma, you can better afford a bit of surplus DNA. Unless you fly, of course.
In tandem with this series goes an inevitable reduction in Ne, so naturally the correlation holds.
I dunno - just being pedantic, perhaps - but I likes a mechanistic explanation meself!
Nature is backpedaling. Well, kind of:
Many biologists have called the 80% figure more a publicity stunt than a statement of scientific fact. Nevertheless, ENCODE leaders say, the data resources that they have provided have been immensely popular. So far, papers that use the data have outnumbered those that take aim at the definition of function.
Currently, it is 400 authors and 30+ papers vs. four authors and four papers, and $200,000,000.00 vs. $0.00.
The debate sounds like a matter of definitional differences. But to dismiss it as semantics minimizes the importance of words and definitions, and of how they are used to engage in research and to communicate findings. ENCODE continues to collect data and to characterize what the 3.2 billion base pairs might be doing in our genome and whether that activity is important. If a better word than ‘function’ is needed to describe those activities, so be it. Suggestions on a postcard please.
There is exactly one English word for most of these activities. It is real and you can measure it: noise.
Not that it would make much of a difference: it is >440 authors and 30+ papers vs. 10 authors and four papers, and $200,000,000.00 vs. $0.00.
Has anyone on either side of this debate removed the useless 80% of the human genome and recorded what the results were?
ReplyDelete"Ford is correct. The ENCODE leaders don't seem to be particularly knowledgeable about modern evolutionary theory."
Why would they need to have a preconceived idea of evolutionary theory to examine the function of something? Are they trying to confirm the paradigm, or find out what the genome does? This seems to me to be the problem with interpreting what is being looked at. Kind of like saying "I wouldn't have seen it if I didn't believe it."