Tuesday, September 18, 2012

Athena Andreadis Writes for Scientific American: Junk DNA, Junky PR

Quite a few science journalists have clued in to the fact that they were massively conned by the ENCODE publicity machine. Turns out that the death of junk DNA was greatly exaggerated.

Here's what Athena Andreadis has to say on the Scientific American website: Junk DNA, Junky PR. Athena is a professor in the Department of Cell and Developmental Biology at the University of Massachusetts Medical School.
A week ago, a huge, painstakingly orchestrated PR campaign was timed to coincide with multiple publications of a long-term study by the ENCODE consortium in top-ranking journals. The ENCODE project (EP) is essentially the next stage after the Human Genome Project (HGP). The HGP sequenced all our DNA (actually a mixture of individual genomes); the EP is an attempt to define what all our DNA does by several circumstantial-evidence gathering and analysis techniques.

The EP results purportedly revolutionize our understanding of the genome by “proving” that DNA hitherto labeled junk is in fact functional and this knowledge will enable us to “maintain individual wellbeing” but also miraculously cure intractable diseases like cancer and diabetes.

Unlike the “arsenic bacteria” fiasco, the EP experiments were done carefully and thoroughly. The information unearthed and collated with this research is very useful, if only a foundation; as with the HGP, this cataloguing quest also contributed to development of techniques. What is way off are the claims, both proximal and distal.

A similar kind of “theory of everything” hype surrounded the HGP but in the case of the EP the hype has been ratcheted several fold, partly due to the increased capacity for rapid, saturating online dissemination. And science journalists who should know better (in Science, BBC, NY Times, The Guardian, Discover Magazine) made things worse by conflating junk, non-protein-coding and regulatory DNA.
That's exactly right, Athena! The PR campaign was deliberate and misleading and it's true that most science journalists were ill-prepared to write about junk DNA.
Let’s tackle “junk” DNA first, a term I find as ugly and misleading as the word “slush” for responses to open submission calls. Semantic baggage aside, the label “junk” was traditionally given to DNA segments with no apparent function. Back in the depths of time (well, circa 1970), all DNA that did not code for proteins or proximal regulatory elements (promoters and terminators) was tossed on the “junk” pile.
Oops! That's not quite true. Back then we knew about tRNA genes, ribosomal RNA genes, origins of replication, and centromeres. It wasn't long before we learned about introns, small functional and regulatory RNAs, and telomeres.
However, in the eighties the definition of functional DNA started shifting rapidly, though I suspect it will never reach the 80% used by the EP PR juggernaut. To show you how the definition has drifted, expanded, and had its meaning muddied as a term of art that is useful for everyone besides the workaday splicers et al who are abreast of trendy interpretations that may elude the laity, let’s meander down the genome buffet table.
Hmmm ... this could get interesting. It still looks to me like 90% of the genome is junk [What's in Your Genome?]. I didn't see any rapid shifting in the 1980s. Let's see where this is headed.
Protein-coding segments in the genome (called exons, which are interrupted by non-protein-coding segments called introns) account for about 2% of the total. That percentage increases a bit if non-protein-coding but clearly functional RNAs are factored in (structural RNAs: the U family, r- and tRNAs; regulatory miRNAs and their cousins).

About 25 percent of our DNA is regulatory and includes signals for: un/packing DNA into in/active configurations; replication, recombination and meiosis, including telomeres and centromeres; transcription (production of heteronuclear RNAs, which contain both exons and introns); splicing (excision of the introns to turn hnRNAs into mature RNAs, mRNA among them); polyadenylation (adding a homopolymeric tail that can dictate RNA location), export of mature RNA into the cytoplasm; and translation (turning mRNA into protein).
Okay, we may have known about some of those things in the 1970s but I'll acknowledge that we learned lots more in the 1980s. What I won't acknowledge is that we can assign function to 25% of our genome. Where does that number come from?
All these processes are regulated in cis (by regulatory motifs in the DNA) and in trans (by RNAs and proteins), which gives you a sense of how complex and layered our peri-genomic functions are. DNA is like a single book that can be read in Russian, Mandarin, Quechua, Maori and Swahili. Some biologists (fortunately, fewer and fewer) still place introns and regions beyond a few thousand nucleotides up/downstream of a gene in the “junk” category, but a good portion is anything but: such regions contain key elements (enhancers and silencers for transcription and splicing) that allow the cell to regulate when and where to express each protein and RNA; they’re also important for local folding that’s crucial for bringing relevant distant elements in correct proximity as well as for timing, since DNA-linked processes are locally processive.
Well, I'm one of those biologists who think that most intron sequences are junk [Junk in Your Genome: Intron Size and Distribution]. And I'm one of those biologists who think that regulatory sequences are usually near the genes they regulate and don't take up a lot of DNA sequence [Junk in Your Genome: Protein-Encoding Genes].

The expression of a typical gene may be controlled by the binding of a dozen or so activators and repressors, but the minimal amount of DNA required for their proper function isn't much more than 1,000 bp or so. The fact that they may be spread out over several thousand base pairs interspersed with dead transposons and other flotsam doesn't mean that all the DNA is functional.
But what of the 70% of the genome that’s left? Well, that’s a bit like an attic that hasn’t been cleaned out since the mansion was built. It contains things that once were useful – and may be useful again in old or new ways – plus gewgaws, broken and rusted items that can still influence the household’s finances and health… as well as mice, squirrels, bats and raccoons. In bio-jargon, the genome is rife with duplicated genes that have mutated into temporary inactivity, pseudogenes, and the related tribe of transposons, repeat elements and integrated viruses. Most are transcribed and then rapidly degraded, processes that do commandeer cellular resources. Some are or may be doing something specific; others act as non-specific factor sinks and probably also buffer the genome against mutational hits. In humans, such elements collectively make up about half of the genome.

So even bona fide junk DNA is not neutral and is still subject to evolutionary scrutiny – but neither does every single element map to a specific function. We know this partly because genome size varies very widely across species whereas the coding capacity is much less variable (the “C-value paradox”), partly because removal of some of these regions does not affect viability in several animal models, including mice. It’s this point that EP almost deliberately obfuscated by trumpeting (or letting be trumpeted) that “junk DNA has been debunked”, ushering in “a view at odds with what biologists have thought for the past three decades.”
I agree with the gist of this description but I would never say that junk DNA is "subject to evolutionary scrutiny." By my definition, junk DNA has no function and it should be evolving at the rate expected of neutral DNA (i.e. the mutation rate). In fact, that's what the data shows.

All in all, this is a good article because it focuses on the fact that the ENCODE results were misinterpreted by the consortium and by the press.
The EP results are important and will be very useful – but they’re not paradigm shifters or miracle tablets and should not pretend to be.


54 comments:

Moo Moo said...

For pity's sake, introns and intergenic regions are replete with regulatory elements - some of them are even transposon insertions. Mutations in introns are likely what explains phenotypic variation in the human species as studies have shown.

Moreover, just because much of the genome is not under stringent sequence conservation, does not mean that it isn't functionally conserved.

Unknown said...

Following the "junk DNA" discussion, I see a fascinating parallel in programming. Maybe the term "junk" is part of the problem. Writing a program in Python, for example - white space (indents and indent level) is critical to the compiler - and once a functional block of code is created the white space is "meaningless" except of course to delineate code blocks. Also, I often re-write code blocks, commenting out the old code for the new. So it can serve a purpose on one level that is difficult if not impossible to ascertain. Evolution certainly must have created a lot of deprecated code blocks - but the "compiler" seems to handle that well in most cases! On the other hand, if the deprecated code blocks mutate so as to create a compiler error... well then the question of "function" is a bit compromised.

Joe Felsenstein said...

You are right: Andreadis starts out fine but then advocates the view that a large fraction, as much as 50%, of the DNA is "functional" and changes in it are not selectively neutral. I agree with you that this is probably wrong.

She does not like the term "junk DNA". I like it and think it is useful. When anyone says that, the people who are unhappy with the term claim that we are then saying that all noncoding DNA is junk. Of course not, no one is saying that. A good fraction of noncoding DNA is junk, but that does not mean that one can look at some noncoding DNA and immediately conclude that it must be junk DNA.

Diogenes said...

Where'd the 25% come from?

I'm still trying to figure out how John Stamatoyannopoulos got to 40%, from (either) 45 million DNase footprints, or from 8.5 million distinct binding sites, or both.

Diogenes said...

just because much of the genome is not under stringent sequence conservation, does not mean that it isn't functionally conserved.

What does that mean?

Anonymous said...

It means he probably doesn't understand that sequence-neutral DNA, functional or not, practically by definition isn't going to carry any information apart from its actual length, if even that.

Let's be generous and say 33 bits, enough to specify the length of any stretch of DNA shorter than the genome itself. 33 bits is less than the amount of information carried by a sequence of 17 base pairs. Whee.
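The arithmetic here can be checked directly (a quick sketch; the ~3.2 billion bp human genome size is an assumption, and each base pair carries log2(4) = 2 bits of sequence information):

```python
import math

GENOME_BP = 3.2e9  # assumed human genome size in base pairs

# Bits needed to specify a length anywhere from 1 bp up to the whole
# genome: log2 of the genome size, ~31.6 bits, under the 33-bit figure.
length_bits = math.log2(GENOME_BP)

# Each base pair carries log2(4) = 2 bits, so a 17 bp sequence carries
# 34 bits -- already more than the length-only information budget.
seq_bits_17bp = 17 * 2

print(round(length_bits, 1), seq_bits_17bp)
```

In other words, DNA whose only "function" is its bulk length carries a few dozen bits at most, which is why the comment calls it trivial compared to sequence-specified information.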

Anonymous said...

But suppose you had so much white space in your source code that it accounted for 80% of the length of the source file, and running the code through a pretty-printer reduced the size of the source file by a factor of four without affecting the object code generated by the compiler. Would you say that all, or even a majority, of the whitespace was "functional" by your definition?

(I'm ignoring the presence of commented-out lines in your analogy as whatever "functionality" they possess presumes the existence of computer programmers to inspect them.)

Devin said...

That's a dumb analogy, Unknown.

Claudiu Bandea said...

Larry Moran: I would never say that junk DNA is "subject to evolutionary scrutiny." By my definition, junk DNA has no function and it should be evolving at the rate expected of neutral DNA (i.e. the mutation rate)

By my definition too, junk DNA (jDNA) does not have function.

The problem is that very few people realize that functional DNA (fDNA) can have functions that are based on its sequence (‘informational DNA’), or functions that are not based on specific sequence but on its bare or bulk presence (‘structural DNA’).

Also very few people realize that both kinds of functional DNA (‘informational DNA’ and ‘structural DNA’) can be under selective forces, such as natural selection.

In my model, the so-called ‘jDNA’ functions as a sink for the integration of proviruses, transposons and other inserting elements, thereby protecting informational DNA from inactivation or alteration of its expression.

Does it make sense?

SPARC said...

DH Kaye published a second post on ENCODE and law enforcement worth reading:
http://for-sci-law-now.blogspot.de/

Larry Moran said...

I recognize that there are "bulk hypotheses" to explain junk DNA and that such schemes posit that sequence isn't important but that the presence of a certain amount of DNA is selected.

None of those hypotheses make sense, including yours.

gert korthof said...

Hi Larry,
if 80% of 3.2 billion bases (2.56 billion bases) are functional, how could that ever be maintained if the mutation frequency is 1 in 100 million per base per generation? Wouldn't that be an independent argument against 80% of the human genome being functional?
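The arithmetic behind this question can be made explicit (a rough sketch using only the figures quoted in the question: a 3.2 billion bp genome, 80% functional, and a mutation rate of 1 in 100 million per base per generation):

```python
GENOME_BP = 3.2e9          # bases in the human genome (figure quoted above)
FUNCTIONAL_FRACTION = 0.80 # ENCODE's claimed functional fraction
MUTATION_RATE = 1e-8       # mutations per base per generation

functional_bp = GENOME_BP * FUNCTIONAL_FRACTION   # 2.56 billion bases
new_mutations = functional_bp * MUTATION_RATE     # expected hits per generation

# Roughly 25-26 new mutations striking "functional" DNA in every
# individual, every generation -- the mutational load at issue.
print(int(functional_bp), round(new_mutations, 1))
```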

Joe Felsenstein said...

I agree. If we imagine a mutant at some actual locus which reduces the amount of junk DNA that is able to replicate, that would affect the "bulk". One would need to have some selection coefficient for that mutant. I believe it would be quite small.

But then one would have to face another problem. The region having the smaller amount of junk DNA would be elsewhere in the genome. So after just a few generations it would be segregating independently of the mutant. Natural selection on the mutant would thus be weak and brief. Claudiu needs to consider the issue quantitatively. Only in clonal organisms would the mutant stay associated with its effect on the genome. Or in cases where the effect on the replication of junk DNA was quite local in the genome.

I raised this in another thread here, but there was no response or consideration of this issue.

Joe Felsenstein said...

That issue, the mutational load, was raised back in the 1960s, I think, and kept many population geneticists from believing that all that extra DNA could be functional. Yes, it is an independent argument from the "onion test" and from the visible presence of genomically parasitic transposon families.

I mentioned this argument in the thread here entitled "THIS is what Michael Eisen is thinking". The argument is quite old though I am not sure who first raised it.

Joe Felsenstein said...

A little Googling shows me that Susumu Ohno raised the mutational load issue in his 1972 paper that first used the term "junk DNA". But I believe it is even older than that, as the issue was raised as a response to the "C-value paradox": there seemed to be too much DNA, and genome sizes varied too much.

gert korthof said...

Thanks Joe. I found the idea in: Manfred Eigen (1996) 'Steps Towards Life', in which he refers to his 1971 publication 'Self-organization of matter and the evolution of biological macromolecules', Naturwissenschaften 58, 465.

Joe Felsenstein said...

Yes, Eigen's 1971 paper raised the issue of an "error threshold" (see the Wikipedia page on that). If a limited amount of natural selection is spread out among sites in a large genome, it can become ineffective in the face of mutation. I do not recall whether Eigen talked about genetic load, but it is a related subject.

Diogenes said...

I confess, Joe, I'm not clear on the meaning.

The region having the smaller amount of junk DNA would be elsewhere in the genome. So after just a few generations it would be segregating independently of the mutant. Natural selection on the mutant would thus be weak and brief.

Me too dim to understand.

Claudiu Bandea said...

Larry Moran: None of those hypotheses make sense, including yours

In response to a previous similar comment stating “That doesn't make any sense”, without any other explanations, I said:

I think it makes so much sense that (similar to other common sense issues that are highly inconvenient, such as the Onion Test) the only way to deal with it is to pretend that it doesn’t exist, or to say: “That doesn't make any sense,” or that “it is silly” (see Birney thinks the Onion Test is silly).

Allan Miller said...

I think there are two possible models to consider: one where a 'dispensable' segment of DNA has just been lost, and one where an active gene product excises bits of chromosome.

In the first case, the deletion spreads around the population, by drift, by positive selection on saved metabolic cost, or possibly by reduction of misalignments caused by excessive repeats (all probably weak at best).

In the second case, each deletion spreads like the first, but there is another bit of DNA which is also capable of spread. It can cause multiple deletions, so may multiply up any benefit accruing to single deletions. But due to sexual recombination, it will find itself detached from the deletions it has caused over the longer term, so failing to reap any long-term benefit. It may also suffer a more immediate cost due to a tendency to delete sections of useful genome.

Larry Moran said...

Claudiu, if I understand your hypothesis correctly, you envisage a time when genomes were quite small and most transposon insertions were lethal or detrimental. This is the case today in bacteria.

You imagine that there was selection for inserting junk DNA in the genome in order to provide safe landing spots for transposons. This greatly increased the number of active transposons in the genome and hence the mutation rate of transposon insertion. Eventually 50% of the genome became littered with dead transposons, bits and pieces of transposons, and active transposons.

All this confers a selective advantage on each individual and that's why large genomes full of junk are adaptations.

If that's an accurate description of your hypothesis then I maintain that it makes no sense.

I note that you published this hypothesis in 1990 and now after 22 years nobody has stepped forward to claim that it solves the problem. Why do you think that is?

Joe Felsenstein said...

Diogenes had battery trouble with his lamp and expressed his concern

Me: The region having the smaller amount of junk DNA would be elsewhere in the genome. So after just a few generations it would be segregating independently of the mutant. Natural selection on the mutant would thus be weak and brief.


Diogenes: Me too dim to understand.

Suppose a gene had a mutant occur that made the rest of the genome a bit more likely to dispense with some junk DNA each generation. And suppose that we calculate that the smaller amount of junk DNA gives those individuals a 0.001 fitness advantage (one-tenth of one percent). You might think that the mutant allele would then slowly increase in the population because it has a fitness higher by 0.001.

But you'd be wrong. After just a few generations, the region in the genome that lost the junk DNA would be occurring in both those genomes that had the new mutant, and those that didn't. That is because the region that has lower junk DNA is not located near the gene that mutated to the new allele. Crossing-over and chromosome segregation will then rapidly randomize which genomes have and do not have the junk DNA, randomizing them with respect to whether those genomes have or have not got the new allele. At which point the fitnesses of the two alleles at the gene are equal on average.

Thus the selection favoring that new allele is likely to not only be weak (not calculated here) but also of short duration.
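This decay can be sketched numerically (a toy calculation, not from the comment itself; the 0.001 advantage is the illustrative figure used above, and the halving per generation assumes unlinked loci with free recombination, r = 0.5):

```python
RECOMBINATION_FRACTION = 0.5   # unlinked loci segregate freely
INITIAL_ADVANTAGE = 0.001      # illustrative 0.1% fitness edge from above

# With free recombination, the statistical association (linkage
# disequilibrium) between the modifier allele and the junk-free region
# decays by a factor of (1 - r) = 0.5 each generation, and the effective
# advantage the allele "sees" decays with it.
advantage = INITIAL_ADVANTAGE
for generation in range(10):
    advantage *= (1 - RECOMBINATION_FRACTION)

# After ten generations the residual advantage is under one in a million:
# selection on the modifier is both weak and brief.
print(advantage)
```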

Has this shed any light on the matter?

Claudiu Bandea said...

@Larry,

Indeed, in organisms with small genomes, such as Bacteria, in the absence of protective mechanisms, most insertions by transposons or proviruses would be detrimental or lethal.

In Bacteria, one of the well-studied protective mechanisms is the evolution of specific sites for integration (note: to my knowledge, however, this phenomenon is not usually presented from this evolutionary perspective in other papers or textbooks). The wide distribution of this protective mechanism testifies to the high selective pressure imposed by inserting elements in Bacteria, which have strong evolutionary constraints against increasing their genome size.

I think that this part of the hypothesis makes sense to most people. The other part is more subtle, and more difficult to explain, so it might take a few postings.

I’ll start with your statement:

“You imagine that there was selection for inserting junk DNA in the genome in order to provide safe landing spots for transposons. This greatly increased the number of active transposons in the genome and hence the mutation rate of transposon insertion. Eventually 50% of the genome became littered with dead transposons, bits and pieces of transposons, and active transposons”

Your comment prompts two critical issues for understanding the evolution of genome size, as well as the rationale behind my hypothesis: the evolutionary origin of so-called ‘junk DNA’ (jDNA), and the selective forces for deleting it or maintaining some of it in the genome.

As I said in the introduction of my paper:

It is not known if secondary DNA (i.e. junk DNA) has accumulated simply because its rate of deletion has been lower than that of origin, or because individuals possessing secondary DNA have a selective advantage (parenthesis added).

We know that in humans, for example, about 50% of the genome is composed of recognizable transposable elements (TEs), including thousands of human endogenous retroviruses, which constitute about 8% of the genome, as well as thousands of other retroviral-like elements. And, as recently reaffirmed by many researchers in their discussions of ENCODE’s new paradigm, much of the rest of jDNA consists of sequences that have lost their TE sequence signatures. Therefore, we know that most of the jDNA originated from amplification of these viral sequences, a highly active process, and that without mechanisms for deleting some of the jDNA (i.e. keeping it in check), it would overrun the genome.

Therefore, from the host’s perspective, there was no selection for increasing the size of jDNA. To specifically address your statement, there was no “selection for inserting junk DNA in the genome”. On the contrary, there was an overall selection to slow down its accumulation in the genome, for multiple reasons, such as metabolic burden, but especially insertion mutagenesis.

Unlike Bacteria, which had strong constraints for maintaining small genomes, other organisms, which for one reason or another could relax this constraint, used a different strategy for preventing insertion mutagenesis: fighting fire with fire!

This means that these organisms have allowed enough jDNA in the genome to protect it from insertion mutagenesis, as proposed in my hypothesis.

And, to support my notion that complex organisms, even vertebrates, have mechanisms to reduce the size of their jDNA if needed, and that the amount of jDNA can be under evolutionary constraints, I formally introduce here the Hummingbird Case: hummingbirds, which have the smallest genome among birds and indeed among all tetrapods, are a clear example of these paradigms at work (likely, in hummingbirds, the selective force for a small genome has been the high metabolic demands associated with powered flight).

Jud said...

The problem is that very few people realize that functional DNA (fDNA) can have functions that are based on its sequence (‘informational DNA’), or functions that are not based on specific sequence but on its bare or bulk presence (‘structural DNA’).

The only example I'm aware of is the role of junk DNA as antifreeze in certain species of fish.

Diogenes said...

@Joe,

Yeah I get it now. Not sure I believe it, but I get it.

Anonymous said...

In the ENCODE discussions, very few pay attention to the numerous physical attributes of the genome and its many important interactions with its environment in the cell. That is understandable for many reasons. In high school or university, genome physics will hardly ever be taught. Another reason is science's presently limited knowledge of the genome's physics. Something can be found, e.g., by searching for "DNA water" or "DNA physics".

Claudiu Bandea said...

Larry: I note that you published this hypothesis in 1990 and now after 22 years nobody has stepped forward to claim that it solves the problem. Why do you think that is?

This is a relevant, but not an easy question to answer. Maybe, I should wait until I convince you about the merit of this hypothesis, but let me jump ahead and say: it was primarily my fault.

First of all, I thought that publishing this model would be enough; however, if you don't have name recognition that doesn't work, and even then, you have to pursue the issue if it doesn't catch fire immediately. Obviously, publishing it in a small journal did not help; also, I wasn't in a position to pursue it experimentally, so I put it on the back burner.

Unfortunately, I did the same with other models and theories, including the one on the etiology of scrapie, kuru and other Transmissible Spongiform Encephalopathies (TSEs). In a paper that I published in 1986, right after the gene coding for the ‘prion protein’ was cloned and sequenced, I suggested that the prion hypothesis was flawed and proposed an endogenous virus model for the etiology of TSEs (http://www.ncbi.nlm.nih.gov/pubmed/3090406). According to this model, the prion protein was the product of an endogenous virus.

A few years ago, I wrote a new series of papers presenting additional evidence and arguments supporting the hypothesis that PrP is indeed encoded by a symbiotic endogenous gene that provides protection against viruses and other pathogens (see http://precedings.nature.com/documents/3887/version/1 and all the associated comments, and also a more recent paper, http://www.alzforum.org/res/adh/cur/bandea/default.asp).

Based on this evolutionary model and on very large amounts of experimental data and observations, I developed a unifying model for the etiology of all neurodegenerative disorders that have been classified for decades as protein misfolding diseases, such as Alzheimer’s disease, Parkinson’s disease, Huntington’s disease, Amyotrophic Lateral Sclerosis, Frontotemporal Lobar Degeneration, and Creutzfeldt-Jakob disease.

According to this model, the primary proteins implicated in these disorders, including Aβ, tau, α-synuclein, huntingtin, TDP-43 and PrP are members of the innate immune system, and their activity and assembly into oligomers and amyloids are not protein misfolding events or prion activities, as proposed by the protein misfolding concept and the prion hypothesis, but part of their repertoire of immune functions.

I stop here, but I would like to believe that as a skeptical biochemist and champion of truth you will be willing to tackle these issues, not only because they are of high academic and scientific interest, but also because they are of very high medical and public health significance (see my Open Letter at: http://sandwalk.blogspot.com/2012/09/how-do-intelligent-design-creationists.html).

Claudiu Bandea said...

Jud: The only example I'm aware of is the role of junk DNA as antifreeze in certain species of fish

To my knowledge, there are ‘antifreeze proteins,’ not ‘antifreeze DNA’.

When discussing potential functions for ‘junk DNA,’ it is important that the function applies to most of it, not just a few percent, and most importantly, it has to pass the Onion Test.


Anonymous said...

Shapiro weighs in on ENCODE/Junk DNA. Diogenes has made the news. http://www.huffingtonpost.com/james-a-shapiro/further-thoughts-on-the-e_b_1893984.html

Anonymous said...

It's not dumb, misguided perhaps, but not dumb! How can you expect an intelligent conversation using language like that?

In fact, the analogy could be used to give some insight into part of the reason we have so much DNA. Namely, that evolution, like computer programming, is a step-wise process and as such is not always particularly efficient in its use of "code". The analogy obviously falls down on the mechanism by which evolution occurs. Computer programming requires a programmer (we're getting into god territory here...); evolution simply requires the existence of a self-replicating molecule and selective pressure.

It would be interesting to know if computer programmers are using evolutionary processes to develop new code, i.e., setting up parameters that allow the code to "mutate" and then subjecting it to some sort of automated selection process. It wouldn't necessarily be efficient, but could it lead to novel solutions?

gert korthof said...

Hi Joe,
I could not find in your references how much (human) DNA could be maintained based on a mutation frequency of 1 in 100 million per base per generation. I would suggest 100 million bases, or about 3.1% of human DNA. Right?

Diogenes said...

Anon: "Diogenes has made the news."

Oh great! Just yesterday at Uncommon Descent the UDites were telling me I'm nobody in the world of evolution.

OK, I will go over and kick Shapiro's tail. Things are going to get messy today.

Diogenes said...

I am going to do battle with Shapiro today-- IF the Huffpost filters let my comments through.

Today will be brutal. Any help anyone can offer would be appreciated. - D

Joe Felsenstein said...

Gert: Basically, yes. That was Manfred Eigen's conclusion. That is the number of bases that could be maintained in a desired base sequence. (Or you could have twice as much half-maintained). And of course it need not be consecutive bases.
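Gert's 3.1% figure can be reproduced from the same back-of-the-envelope numbers (an order-of-magnitude sketch, not Eigen's full derivation; the 3.2 billion bp genome size is an assumption):

```python
MUTATION_RATE = 1e-8    # per base per generation (1 in 100 million)
GENOME_BP = 3.2e9       # assumed human genome size

# Eigen-style error threshold, to order of magnitude: the number of
# bases whose sequence can be maintained against mutation is roughly
# the reciprocal of the per-base mutation rate.
maintainable_bp = 1 / MUTATION_RATE       # ~100 million bases
fraction = maintainable_bp / GENOME_BP    # ~3.1% of the genome

print(round(maintainable_bp), round(fraction * 100, 1))
```

As Joe notes, the maintained bases need not be consecutive, and one could instead have twice as many bases half-maintained.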

Diogenes said...

I'm the target of Shapiro's article in the HuffPost, and yet I find that HuffPost will not permit me to post any comments at all-- all my comments are blocked.

But Anthony McCarthy, aka our own Thought Criminal, IS permitted to comment on Shapiro's post, and of course has joined in the clusterfuck attack on me.

Yeah, you better not let me comment on there-- frightened of all that scary evidence.

Diogenes said...

Alas, I find I cannot post at Shapiro's blog-- HuffPost is blocking all my comments.

Anonymous said...

"HuffPost is blocking *all* my comments"

Then why were you able to post there earlier this morning? It appears Joe's analogy is correct, you are FoS.

Diogenes said...

Anonymous: "Then why were you able to post there earlier this morning?"

I can post at Michael White's blog, but I'm not allowed to post at Shapiro's blog, which is the one attacking me.

Are you going to address the evidence I posted at Michael White's blog, the quotes showing that the ENCODE scientists admit they didn't disprove Junk DNA?

Do you care about the evidence at all? No? Then what do you care about?

Anonymous said...

Dio:

Where is your evidence you have been banned only from Shapiro's Blog?

Devin said...

It's dumb and misguided ^

Diogenes said...

I posted many comments this morning at Shapiro's and Michael White's blogs.

Some of my comments at Shapiro's blog were long lists of quotes of ENCODE scientists admitting that they know Junk DNA is probably real. Similar to what I posted at White's blog.

My comments were published at White's blog, but on Shapiro's they just vanished, and are not even listed as pending. At first I thought this was due to spam filters, but everything I write there vanishes. 7 hours is a long time for a comment to be pending, but it's not listed as pending.

The Discovery Institute also banned me from Biologic's Facebook page-- too much science-y evidence; that frightens them.

The Shapiro hold might be reversed in the future, so I might be wrong. We'll see. If it's not reversed, I get to blog about how Prof. Shapiro is scared to death of me.

Anonymous said...

"I get to blog about how Prof. Shapiro is scared to death of me"

Noun 1. delusions of grandeur - a delusion (common in paranoia) that you are much greater and more powerful and influential than you really are

http://www.thefreedictionary.com/delusions+of+grandeur

Anonymous said...

Censorship of dissenting views at major (even minor) blogs has become pretty commonplace; the idea that one "proves" one's rightness by forbidding objections is becoming alarmingly respectable. (HuffPo itself got bad enough in that respect in the last days of Huffington's proprietorship, and since the sale it's become practically a Stepford site.)

Diogenes said...

You gonna bark all day little doggie, or you gonna bite?

Claudiu Bandea said...

Larry: ”If that's an accurate description of your hypothesis then I maintain that it makes no sense”

In my comment above, I made the point that your description of my model of the evolution of genome size and the protective function of junk DNA is not accurate, and I wonder if you understand it now.

Larry Moran said...

@Claudiu

Your model still makes no sense to me. Maybe that's because I understand it and it's silly or maybe I don't understand it because you're not explaining it very well.

Claudiu Bandea said...

Larry Moran says:
Your model still makes no sense to me.
Maybe that's because I understand it and it's silly

In response to a similar comment stating “That doesn't make any sense”, without explaining why, I said:

I think it makes so much sense that (similar to other common sense issues that are highly inconvenient, such as the Onion Test) the only way to deal with it is to pretend that it doesn’t exist, or to say: “That doesn't make any sense,” or that “it is silly” (see Birney thinks the Onion Test is silly).

That’s something that scientists like Ewan Birney say in order to avoid addressing inconvenient questions.

Larry: Maybe that's because I understand it…

As I already said, your statement, "You imagine that there was selection for inserting junk DNA in the genome," is incorrect; so I guess that you don't understand my model of the evolution of genome size and of the protective function of 'junk DNA' (jDNA).

Is there anything specific that you don’t understand, or doesn’t make sense, or is silly?

Also, is it possible that you might have some reservations about accepting my model of functional 'junk DNA' because you think it might give ammunition to those who think (from a creationist perspective) that 'junk DNA' must have a function? I don't think it would do that, would it?

Claudiu Bandea said...

@Joe Felsenstein, Allan Miller, Diogenes

If you question the idea that natural selection can act on 'junk DNA' (jDNA), from the perspective that I proposed, then how do you explain the Hummingbird Case, which I brought up in one of my comments above?

Jud said...

Jud: The only example I'm aware of is the role of junk DNA as antifreeze in certain species of fish

Claudiu: To my knowledge, there are ‘antifreeze proteins,’ not ‘antifreeze DNA’.


Right, careless error on my part. But I thought it was interesting that what ordinarily would make the DNA junk (duplication errors) is what gives the resulting proteins their physical properties.

Claudiu Bandea said...

Jud: Right, careless error on my part

It is easy to make this type of error, but sometimes accidents do bring forward good points and, sometimes, even big discoveries; remember Fleming's accidental discovery of penicillin in 1928!

I don’t know how the viscosity generated by high quantities of DNA would affect the freezing point, but it likely affects other physical properties of the cell that might have evolutionary significance. I’m thinking of some species of Amoeba, which have extraordinary quantities of DNA (e.g. 300 times more than humans).

Anonymous said...

Claudiu,

I don't think that depositing ideas in a "prepublication" web site counts as a paper. I also think that we might not have enough information for discussing the merits of all your ideas. As for the idea that junk DNA is protective: how much protection would it confer if most mutations happened by polymerase mistakes? How much if they happened by demethylation? How much if they happened by x-ray or ultraviolet exposure? For all of the above: no protection at all.

For insertion, well, we could argue that yes, if you have more junk than useful DNA, the probability that new insertions will not harm something important is higher. Yet it looks like this is actually more a way for the junk to create its own niche than for it to protect the host. Right? The "protection" would be a self-perpetuating mechanism, even if it arose accidentally.

Did you notice the change in perspective?
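[The dilution argument in the comment above can be sketched with a toy simulation. This is purely illustrative and not from the thread: the uniform-insertion assumption and all the numbers are made up.]

```python
import random

def fraction_harmful(functional_bp, junk_bp, insertions=100_000, seed=1):
    """Simulate insertions landing uniformly at random in a genome and
    return the fraction that hit functional (non-junk) sequence."""
    genome = functional_bp + junk_bp
    rng = random.Random(seed)
    hits = sum(1 for _ in range(insertions)
               if rng.randrange(genome) < functional_bp)  # hit = functional DNA
    return hits / insertions

# Hypothetical genomes: with 90% junk, only ~10% of insertions hit
# functional DNA; with 50% junk, ~50% do.
print(fraction_harmful(100, 900))
print(fraction_harmful(100, 100))
```

Under these assumptions the expected harmful fraction is just functional/(functional + junk), which is the dilution effect being debated; the simulation says nothing about whether that effect was selected for or merely arose as a by-product.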

Claudiu Bandea said...

Negative Entropy says: I don't think that depositing ideas in a "prepublication" web site counts as a paper

I think that depositing ideas anywhere is a good idea! And, by definition, anything published is a 'publication’, including your comments, or mine, right here at Sandwalk.

You might not be aware, but with the introduction of hundreds, if not thousands, of scientific journals every year, you can basically publish any of your preferred ideas in many journals, even in some that are defined as 'peer-reviewed journals'; well, for some of them you have to make a payment of a couple of hundred bucks but, in addition to the work being a reasonable idea or study, that's all it takes!

And, to further make my point, I like to publish my ideas on the Blogosphere, such as Larry Moran’s Sandwalk, because of a very good (from my perspective) reason: here, I can’t hide my ideas or arguments from an immediate and direct evaluation and critique, as I could in some of the conventional journals. Viva the Blogosphere!

Negative Entropy says: As for the thing about junk DNA being protective. How much protection would it confer if most mutations happened by polymerase mistakes? How much if they happened by demethylation? How much if they happened by x-ray or ultraviolet exposure? For all of the above: no protection at all

Basically, I agree with you on this. However, I vaguely remember that the words 'protection' and 'junk DNA' (jDNA) were used in the same sentence sometime in the past, in order to suggest that jDNA might reduce the rate of certain types of mutations, such as nucleotide substitutions, for example by 'titrating out' potential mutagens. Is that true, or am I making it up?

Negative Entropy says: For insertion, well, we could argue that yes, if you have more junk than useful DNA, the probability that new insertions will not harm something important are higher. Yet, it looks that this is actually a way for creating their own niche, than for them to be protective to the host. Right? The "protection" would be a self-perpetuating mechanism. Even if accidentally arising. Did you notice the change in perspective?

Again, I hear you! Very well!

Just as I explained to Larry in my comment above, in my model there was no "selection for inserting junk DNA in the genome"; that came from the inserting elements themselves. On the contrary, organisms evolve mechanisms to keep the amount of jDNA in check, to optimize its protective function against insertional mutagenesis.

Indeed, the hosts use only some of the 'fire' to fight the 'big fire'; the rest they discard. As demonstrated by the Hummingbird Case, natural selection acting for the benefit of the host organism can ultimately control how much jDNA it can afford.

One last point: much of the foundation of my model of the evolution of genome size and of the function of jDNA as a protective mechanism against insertional mutagenesis in multicellular species, such as humans, has to do with protection against neoplastic transformation, or cancer. What's great about this paradigm is that it can be easily addressed experimentally: e.g. transgenic mice carrying DNA sequences homologous to infectious retroviruses, such as murine leukemia viruses (MuLV), might be more resistant to cancer induced by experimental MuLV infections compared to controls.

Anonymous said...

You might not be aware, but with the introduction of hundreds, if not thousands of scientific journals every year, you can basically publish any of your preferred ideas in many journals, even in some that are defined as 'peer-reviewed journals'

Yeah, sometimes entropy is not such a good thing.