Here's a link to the junk DNA debate between Dan Stern Cardinale and Casey Luskin. The debate took place on May 2, 2024.
I mentioned in a previous post that Luskin should have been called out on his repeated attempts to equate junk DNA with non-coding DNA. This allowed him to portray all non-coding functions as evidence against junk DNA. [Casey Luskin posts misleading quotes about junk DNA].
There are several other things that I would have done differently. I would have made it clear that about 10% of the genome is functional and that we don't yet know the function of some of that fraction. Thus, all newly discovered functional regions could still fit within the 10%, and 90% of the genome is still junk. Every time Casey mentioned a new function he should have been challenged to specify exactly what percentage of the genome he was referring to. (Dan tried to do this but he was too nice and let Casey off the hook.)
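To make that bookkeeping concrete, here's a minimal sketch in Python. The category percentages are rough illustrative assumptions, not authoritative values; the point is only that the known functional categories sum to about 10%, so any newly discovered element has to be assigned somewhere inside that budget.

```python
# Illustrative bookkeeping: approximate fractions of the human genome
# assigned to known functional categories. The percentages are rough
# assumptions for the sake of the example, not authoritative values.
functional_fraction = {
    "protein-coding exons":          1.0,  # percent of genome
    "regulatory sequences":          0.6,
    "functional RNA genes":          1.0,
    "centromeres and telomeres":     1.0,
    "origins of replication, SARs":  0.4,
    "essential intron/UTR sequence": 3.0,
    "conserved, function unknown":   3.0,
}

total_functional = sum(functional_fraction.values())
print(f"total functional: ~{total_functional:.0f}%")        # ~10%
print(f"remaining (junk): ~{100 - total_functional:.0f}%")  # ~90%

# A newly discovered functional element belongs in one of these
# categories (or the 'function unknown' bucket); unless it adds tens
# of percentage points, the ~90% junk estimate is unaffected.
```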
The idea here is to make it clear to viewers that recent discoveries of functional regions do not affect the idea that most of our genome is junk.
I would also have tried to get Casey to admit that there's a scientific controversy over junk DNA, and that there are consequently many papers defending junk DNA and criticizing the arguments of junk DNA opponents. For every quotation from a scientist who opposes junk, there's an equally significant quotation from one who supports junk. Why does Casey only quote scientists who agree with him? Is this cherry-picking? Is selectively rattling off quotations and references from people who agree with you a reasonable way to conduct a serious scientific debate?
I think the arguments over transcripts should begin with a presentation of all the scientific evidence that spurious transcripts exist - for example, random DNA sequences inserted into a cell nucleus are transcribed, and spurious transcription is easily documented in well-studied organisms such as bacteria and yeast. The characteristics of spurious transcription are that the transcripts are present in very small amounts, that they are rapidly degraded, that they come from regions of the genome that are not under purifying selection, and that they are cell/tissue specific. So what is the most reasonable explanation when you look at such transcripts?
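Those four characteristics translate directly into a simple screen. Here's a hedged sketch (the thresholds and field names are hypothetical; a real analysis would calibrate them against known functional transcripts) of how one might flag a transcript as likely spurious:

```python
from dataclasses import dataclass

@dataclass
class Transcript:
    copies_per_cell: float                   # steady-state abundance
    half_life_min: float                     # turnover time in minutes
    under_purifying_selection: bool          # from comparative/population genomics
    fraction_of_tissues_detected_in: float   # 0..1, breadth of expression

def looks_spurious(t: Transcript) -> bool:
    """Apply the four hallmarks of spurious transcription described above.
    The cutoffs are illustrative assumptions, not published values."""
    return (
        t.copies_per_cell < 1.0                      # present in very small amounts
        and t.half_life_min < 30                     # rapidly degraded
        and not t.under_purifying_selection          # no conservation signal
        and t.fraction_of_tissues_detected_in < 0.1  # narrowly cell/tissue specific
    )

# A low-abundance, short-lived, unconserved, tissue-restricted transcript:
print(looks_spurious(Transcript(0.1, 10, False, 0.02)))  # True
```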
Casey Luskin's attempt to avoid the best explanation (spurious transcription) is a classic example of ad hoc rescue, and it might have been useful to point this out to viewers.
Regulation is not new. There was serious discussion and debate over the amount of the genome devoted to regulation back in the late 1960s when the concept of junk DNA was first proposed. Casey should have been challenged to state what percentage of the genome is devoted to regulation, and if he comes up with an unreasonable number he should have to give examples of many well-studied genes that have been shown to have that level of regulation. (Hint: There aren't any.) All of the detailed work on the regulation of dozens of specific human genes has shown that you don't need more than a few transcription factor binding sites to control expression. Is there any reason to suppose that the other genes require ten or a hundred times more regulatory sequences to control expression?
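A back-of-the-envelope calculation shows why even generous assumptions about regulation account for only a tiny fraction of the genome. All of the numbers below are deliberately inflated assumptions for illustration:

```python
# Generous assumptions: every human gene, far more binding sites per
# gene than well-studied genes actually need, and typical site lengths.
genes = 25_000
sites_per_gene = 20          # well-studied genes need only a handful
bp_per_site = 10             # transcription factor sites are ~6-10 bp
genome_size = 3_200_000_000  # ~3.2 Gb haploid human genome

regulatory_bp = genes * sites_per_gene * bp_per_site
print(f"{regulatory_bp:,} bp = {100 * regulatory_bp / genome_size:.2f}% of the genome")
# 5,000,000 bp = 0.16% of the genome
```

Even at twenty sites per gene, regulatory sequence comes to a fraction of one percent; you would have to multiply that by several hundred before regulation could rescue most of the genome from the junk category.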
What is the trend line? Ever since the ENCODE publicity disaster of 2012 there has been a flood of papers defending junk DNA, and the data supporting junk DNA is now stronger than it has ever been because we now know from hundreds of thousands of human genome sequences that only about 10% is under purifying selection. There have also been a lot of papers fleshing out the 10% of the genome that's functional. There have only been a handful of papers published in the past ten years that seriously attempt to present evidence that most of our genome is functional. I would have challenged Casey to come up with a single scientific publication in the past ten years claiming, with supporting data, that most of the genome is functional.
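The logic behind that purifying-selection estimate can be illustrated with a toy simulation. This is not the actual method of those studies; the rates, thresholds, and window sizes below are all assumptions for illustration. Regions under purifying selection harbor fewer variants than the neutral expectation, so windows with depleted variation stand out:

```python
import random

random.seed(1)
neutral_rate = 0.001   # assumed variants per bp under neutrality
window = 10_000        # bp per window

def observed_variants(constrained: bool) -> int:
    # Assume selection removes ~70% of variants in constrained windows.
    rate = neutral_rate * (0.3 if constrained else 1.0)
    return sum(random.random() < rate for _ in range(window))

# Simulate 100 windows, 10% of them constrained (mirroring the ~10% figure).
windows = [observed_variants(constrained=(i < 10)) for i in range(100)]
expected = neutral_rate * window   # 10 variants per window under neutrality
flagged = sum(v < 0.5 * expected for v in windows)
print(f"windows flagged as constrained: {flagged}/100")  # roughly 10
```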
Larry, I was wondering if there are any studies that have reversed the functional to non-function. I enjoyed the debate and of course Dan could have done a lot since there is a lot to cover but he did fairly well and of course I think Luskin's points are clearly obvious for anyone mildly familiar (me) with these issues. Thanks for your book - loved it!
Meant to say clearly obviously problematic - all four of his introductory points were WRONG!
@Jathro: I'm not sure what you mean by "reversed the functional to non-function." Could you clarify?
I just meant cases where scientists attributed function to something and then, upon later tests/review, reversed that to the no-function category.
Well, there's the walking back by ENCODE of the claim that 80-100% of the genome is functional.
@Jathro: There are plenty of examples of scientists attributing function where the attributions are not believed by other scientists. There aren't many examples of scientists withdrawing their claims of function in the face of further evidence or logical arguments. The best example is the one John Harshman mentions above.
In the debate, Casey claims they didn't walk back their claim. I read the Kellis paper and they seem to be less certain about the 80% claim, but they double down on the assertion that there's lots of function.
Define "lots".
Isn't the issue that there can be hundreds if not thousands of non-coding functional sequences, some of them quite long? So maybe, in total, millions of base pairs are involved, which most people would say is a lot. But that makes up only a percent or so of the genome, i.e. not a lot!
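To put numbers on that (made-up but plausible figures, just for scale):

```python
# Illustrative: thousands of long functional non-coding elements still
# amount to well under 1% of a ~3.2 Gb genome. Numbers are assumptions.
elements = 5_000
avg_length_bp = 2_000
genome_size = 3_200_000_000

total_bp = elements * avg_length_bp
print(f"{total_bp:,} bp = {100 * total_bp / genome_size:.2f}% of the genome")
# 10,000,000 bp = 0.31% of the genome
```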
ReplyDelete"lots" : they enumerate all the ways nonfunctional sequences could acquire function
ReplyDeleteThere's a big difference between "lots of ways nonfunctional sequences could acquire function" and "lots of function".
ReplyDelete@Rod Wilson: Kellis et al. admit that biochemical activity is not a good way to define function. They do not "double down" on their previous claim that most of the genome is functional. In fact, they avoid making any claim at all about the amount of junk DNA except to note that there is evidence for junk DNA that they completely ignored in 2012.
ReplyDeleteIn spite of what they claimed in 2012, the ENCODE researchers say in 2014 that the main purpose of the ENCODE experiments is not to come up with an interim estimate of the fraction of the human genome that is functional but to provide maps of DNA segments with biochemical signatures that can be used to test for function.
Seen in a comment on Slashdot.org here: a long quote from an article by Philip Ball in Scientific American, including
ReplyDelete"Some biologists greeted this announcement with skepticism bordering on outrage. The ENCODE team was accused of hyping its findings; some critics argued that most of this RNA was made accidentally because the RNA-making enzyme that travels along the genome is rather indiscriminate about which bits of DNA it reads
....
Now it looks like ENCODE was basically right ...."
... and ending with a quote from a 2014 article by Kevin Morris and John Mattick.
@Joe Felsenstein
That thing you quote is one remarkably poor description of the phenomenon of spurious transcription. Transcription initiators, despite their remarkable specificity, will still occasionally bind some locus by accident and initiate transcription at a very low level. That low level of expression is one of the lines of evidence that the transcription is spurious and therefore probably nonfunctional. This effect is reproducible with random DNA sequences explicitly designed not to be functional.
But of course the article is basically taking the word of Mattick and others in that camp for it.
Maybe of interest:
https://www.nytimes.com/2024/05/31/science/largest-genome-fern-plant.html?unlocked_article_code=1.wk0.Si5u.tGb4xzNUJvHH&smid=url-share
Philip Ball has posted a long thread on Twitter. Interested to hear people's thoughts:
https://x.com/philipcball/status/1797173070391644661