Friday, July 24, 2015

John Parrington discusses genome sequence conservation

John Parrington has written a book called The Deeper Genome: Why there is more to the human genome than meets the eye. He claims that most of our genome is functional, not junk. I'm looking at how his arguments compare with Five Things You Should Know if You Want to Participate in the Junk DNA Debate.

There's one post for each of the five issues that informed scientists need to address if they are going to write about the amount of junk in your genome. This is the last one.

1. Genetic load
John Parrington and the genetic load argument
2. C-Value paradox
John Parrington and the c-value paradox
3. Modern evolutionary theory
John Parrington and modern evolutionary theory
4. Pseudogenes and broken genes are junk
John Parrington discusses pseudogenes and broken genes
5. Most of the genome is not conserved (this post)
John Parrington discusses genome sequence conservation

5. Most of the genome is not conserved

There are several places in the book where Parrington addresses the issue of sequence conservation. The most detailed discussion is on pages 92-95 where he discusses the criticisms leveled by Dan Graur against ENCODE workers. Parrington notes that about 9% of the human genome is conserved and recognizes that sequence conservation is a strong argument for function. That implies that >90% of our genome is junk.

Here's how Parrington dismisses this argument ...
John Mattick and Marcel Dinger ... wrote an article for the HUGO Journal, official journal of the Human Genome Organisation, entitled "The extent of functionality in the human genome." ... In response to the accusation that the apparent lack of sequence conservation of 90 per cent of the genome means that it has no function, Mattick and Dinger argued that regulatory elements and noncoding RNAs are much more relaxed in their link between structure and function, and therefore much harder to detect by standard measures of function. This could mean that 'conservation is relative', depending on the type of genomic structure being analyzed.
In other words, a large part of our genome (~70%?) could be producing functional regulatory RNAs whose sequence is irrelevant to their biological function. Parrington then writes a full page on Mattick's idea that the genome is full of genes for regulatory RNAs.

The idea that 90% of our genome is not conserved deserves far more serious treatment. In the next chapter (Chapter 7), Parrington discusses the role of RNA in forming a "scaffold" to organize DNA in three dimensions. He notes that ...
That such RNAs, by virtue of their sequence but also their 3D shape, can bind DNA, RNA, and proteins, makes them ideal candidates for such a role.
But if the genes for these RNAs make up a significant part of the genome then that means that some of their sequences are important for function. That has genetic load implications and also implications about conservation.

If it's not a "significant" fraction of the genome then Parrington should make that clear to his readers. He knows that 90% of our genome is not conserved, even between individuals (page 142), and he should know that this is consistent with genetic load arguments. However, almost all of his main arguments against junk DNA require that the extra DNA have a sequence-specific function. Those facts are not compatible. Here's how he justifies his position ...
Those proposing a higher figure [for functional DNA] believe that conservation is an imperfect measure of function for a number of reasons. One is that since many non-coding RNAs act as 3D structures, and because regulatory DNA elements are quite flexible in their sequence constraints, their easy detection by sequence conservation methods will be much more difficult than for protein-coding regions. Using such criteria, John Mattick and colleagues have come up with much higher figures for the amount of functionality in the genome. In addition, many epigenetic mechanisms that may be central for genome function will not be detectable through a DNA sequence comparison since they are mediated by chemical modifications of the DNA and its associated proteins that do not involve changes in DNA sequence. Finally, if genomes operate as 3D entities, then this may not be easily detectable in terms of sequence conservation.
This book would have been much better if Parrington had put some numbers behind his speculations. How much of the genome is responsible for making functional non-coding RNAs and how much of that should be conserved in one way or another? How much of the genome is devoted to regulatory sequences and what kind of sequence conservation is required for functionality? How much of the genome is required for "epigenetic mechanisms" and how do they work if the DNA sequence is irrelevant?

You can't argue this way. More than 90% of our genome is not conserved—not even between individuals. If a good bit of that DNA is, nevertheless, functional, then those functions must not have anything to do with the sequence of the genome at those specific sites. Thus, regions that specify non-coding RNAs, for example, must perform their function even though all the base pairs can be mutated. The same goes for regulatory sequences—their actual sequences aren't conserved, according to John Parrington. This requires a bit more explanation since it flies in the face of what we know about function and regulation.

Finally, if you are going to use bulk DNA arguments to get around the conflict then tell us how much of the genome you are attributing to formation of "3D entities." Is it 90%? 70%? 50%?


John Parrington discusses pseudogenes and broken genes

We are discussing Five Things You Should Know if You Want to Participate in the Junk DNA Debate and how they are described in John Parrington's book The Deeper Genome: Why there is more to the human genome than meets the eye. This is the fourth of five posts.

1. Genetic load
John Parrington and the genetic load argument
2. C-Value paradox
John Parrington and the c-value paradox
3. Modern evolutionary theory
John Parrington and modern evolutionary theory
4. Pseudogenes and broken genes are junk (this post)
John Parrington discusses pseudogenes and broken genes
5. Most of the genome is not conserved
John Parrington discusses genome sequence conservation

4. Pseudogenes and broken genes are junk

Parrington discusses pseudogenes at several places in the book. For example, he mentions on page 72 that both Richard Dawkins and Ken Miller have used the existence of pseudogenes as an argument against intelligent design. But, as usual, he immediately alerts his readers to other possible explanations ...
However, using the uselessness of so much of the genome for such a purpose is also risky, for what if the so-called junk DNA turns out to have an important function, but one that hasn't yet been identified.
This is a really silly argument. We know what genes look like and we know what broken genes look like. There are about 20,000 former protein-coding pseudogenes in the human genome. Some of them arose recently following a gene duplication or insertion of a cDNA copy. Some of them are ancient and similar pseudogenes are found at the same locations in other species. They accumulate mutations at a rate consistent with neutral theory and random genetic drift. (This is a demonstrated fact.)
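
To see why the neutral rate is the relevant benchmark here, consider a minimal sketch (my illustration, not anything from Parrington's book): under neutrality the long-term substitution rate equals the mutation rate, because the number of new neutral mutations arising each generation (2Nμ) is exactly offset by the probability that any one of them eventually drifts to fixation (1/2N). The numbers below are arbitrary placeholders.

    # Illustrative only: the neutral substitution rate equals the mutation rate,
    # independent of population size (all values below are assumptions).
    N = 10_000        # diploid population size (assumed)
    mu = 1.2e-8       # mutations per site per generation (assumed)

    new_mutations_per_gen = 2 * N * mu     # neutral mutations entering the population
    fixation_probability = 1 / (2 * N)     # chance any one of them drifts to fixation
    substitution_rate = new_mutations_per_gen * fixation_probability

    print(f"substitution rate = {substitution_rate:.2e} per site per generation")
    print(f"mutation rate     = {mu:.2e} per site per generation")

Population size cancels out, which is why pseudogenes evolving at the mutation rate is exactly what drift predicts for sequences with no function.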

It's ridiculous to suggest that a significant proportion of those pseudogenes might have an unknown important function. That doesn't rule out a few exceptions but, as a general rule, if it looks like a broken gene and acts like a broken gene, then chances are pretty high that it's a broken gene.

As usual, Parrington doesn't address the big picture. Instead he resorts to the standard ploy of junk DNA proponents by emphasizing the exceptions. He devotes more than two full pages (pages 143-144) to evidence that some pseudogenes have acquired a secondary function.
The potential pitfalls of writing off elements in the genome as useless or parasitical has been demonstrated by a recent consideration of the role of pseudogenes. ... recent studies are forcing a reappraisal of the functional role of these 'duds'.
Do you think his readers understand that even if every single broken gene acquired a new function that would still only account for less than 2% of the genome?
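
For scale, here's the back-of-the-envelope arithmetic behind that "less than 2%" point. The average pseudogene length is my own assumption (roughly mRNA-sized, as for processed pseudogenes), so treat the result as an order-of-magnitude estimate.

    # Rough arithmetic only; the average pseudogene length is an assumption.
    pseudogene_count = 20_000
    average_length_bp = 2_000          # assumed, roughly mRNA-sized
    genome_size_bp = 3.2e9             # haploid human genome

    fraction = pseudogene_count * average_length_bp / genome_size_bp
    print(f"former protein-coding pseudogenes: ~{fraction:.1%} of the genome")   # ~1.3%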

There's a whole chapter dedicated to "The Jumping Genes" (Chapter 8). Parrington notes that 45% of our genome is composed of transposons (page 119). What are they doing in our genome? They could just be parasites (selfish DNA), which he equates with junk. However, Parrington prefers the idea that they serve as sources of new regulatory elements and they are important in controlling responses to environmental pressures. They are also important in evolution.

As usual, there's no discussion about what fraction of the genome is functional in this way but the reader is left with the impression that most of that 45% may not be junk or parasites.

Most Sandwalk readers know that almost all of the transposon-related sequences are bits and pieces of transposons that haven't been active for millions of years. They are pseudogenes. They look like broken transposon genes, they act like broken genes, and they evolve like broken transposons. It's safe to assume that's what they are. This is junk DNA and it makes up almost half of our genome.

John Parrington never mentions this nasty little fact. He leaves his readers with the impression that 45% of our genome consists of active transposons jumping around in our genome. I assume that this is what he believes to be true. He has not read the literature.

Chapter 9 is about epigenetics. (You knew it was coming, didn't you?) Apparently, epigenetic changes can make the genome more amenable to transposition. This opens up possible functional roles for transposons.
As we've seen, stress may enhance transposition and, intriguingly, this seems to be linked to changes in the chromatin state of the genome, which permits repressed transposons to become active. It would be very interesting if such a mechanism constituted a way for the environment to make a lasting, genetic mark. This would be in line with recent suggestions that an important mechanism of evolution is 'genome resetting'—the periodic reorganization of the genome by newly mobile DNA elements, which establishes new genetic programs in embryo development. New evidence suggests that such a mechanism may be a key route whereby new species arise, and may have played an important role in the evolution of humans from apes. This is very different from the traditional view of evolution being driven by the gradual accumulation of mutations.
It was at this point, on page 139, that I realized I was dealing with a scientist who was in way over his head.

Parrington returns to this argument several times in his book. For example, in Chapter 10 ("Code, Non-code, Garbage, and Junk") he says ....
These sequences [transposons] are assumed to be useless, and therefore their rate of mutation is taken to represent a 'neutral' reference; however, as John Mattick and his colleague Marcel Dinger of the Garvan Institute have pointed out, a flaw in such reasoning is 'the questionable proposition that transposable elements, which provide the major source of evolutionary plasticity and novelty, are largely non-functional'. In fact, as we saw in Chapter 8, there is increasing evidence that while transposons may start off as molecular parasites, they can also play a role in the creation of new regulatory elements, non-coding RNAs, and other such important functional components of the genome. It is this that has led John Stamatoyannopoulos to conclude that 'far from being an evolutionary dustbin, transposable elements appear to be active and lively members of the genomic regulatory community, deserving of the same level of scrutiny applied to other genic or regulatory features'. In fact, the emerging role for transposition in creating new regulatory mechanisms in the genome challenges the very idea that we can divide the genome into 'useful' and 'junk' components.
Keep in mind that active transposons represent only a tiny percentage of the human genome. About 50% of the genome consists of transposon flotsam and jetsam—bits and pieces of broken transposons. It looks like junk to me.

Why do all opponents of junk DNA argue this way without putting their cards on the table? Why don't they give us numbers? How much of the genome consists of transposon sequences that have a biological function? Is it 50%, 20%, 5%?


John Parrington and modern evolutionary theory

We are continuing our discussion of John Parrington's book The Deeper Genome: Why there is more to the human genome than meets the eye. This is the third of five posts on: Five Things You Should Know if You Want to Participate in the Junk DNA Debate

1. Genetic load
John Parrington and the genetic load argument
2. C-Value paradox
John Parrington and the c-value paradox
3. Modern evolutionary theory (this post)
John Parrington and modern evolutionary theory
4. Pseudogenes and broken genes are junk
John Parrington discusses pseudogenes and broken genes
5. Most of the genome is not conserved
John Parrington discusses genome sequence conservation

3. Modern evolutionary theory

You can't understand the junk DNA debate unless you've read Michael Lynch's book The Origins of Genome Architecture. That means you have to understand modern population genetics and the role of random genetic drift in the evolution of genomes. There's no evidence in Parrington's book that he has read The Origins of Genome Architecture and no evidence that he understands modern evolutionary theory. The only evolution he talks about is natural selection (Chapter 1).

Here's an example where he demonstrates adaptationist thinking and the fact that he hasn't read Lynch's book ...
At first glance, the existence of junk DNA seems to pose another problem for Crick's central dogma. If information flows in a one-way direction from DNA to RNA to protein, then there would appear to be no function for such noncoding DNA. But if 'junk DNA' really is useless, then isn't it incredibly wasteful to carry it around in our genome? After all, the reproduction of the genome that takes place during each cell division uses valuable cellular energy. And there is also the issue of packaging the approximately 3 billion base pairs of the human genome into the tiny cell nucleus. So surely natural selection would favor a situation where both genomic energy requirements and packaging needs are reduced fiftyfold?1
Nobody who understands modern evolutionary theory would ask such a question. They would have read all the published work on the issue and they would know about the limits of natural selection and why species can't necessarily get rid of junk DNA even if it seems harmful.
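
The population-genetics point being referenced can be sketched with Kimura's standard fixation-probability formula. This is my numerical illustration, not anything taken from Parrington or Lynch, and the selection coefficient and population sizes are arbitrary placeholders: when the cost of carrying a stretch of extra DNA is tiny, it behaves as effectively neutral in a small population, so selection cannot purge it.

    import math

    def fixation_probability(s, N):
        # Kimura's diffusion approximation for a new mutation with selection
        # coefficient s in a diploid population of size N.
        if s == 0:
            return 1 / (2 * N)
        return (1 - math.exp(-2 * s)) / (1 - math.exp(-4 * N * s))

    s = -1e-6                          # assumed, tiny cost of carrying extra DNA
    for N in (10_000, 10_000_000):     # small (human-like) vs. very large population
        print(f"N = {N:>10,}: P(fix) = {fixation_probability(s, N):.2e}, "
              f"neutral expectation = {1 / (2 * N):.2e}")

In the small population the slightly deleterious insertion fixes almost as often as a neutral one, while in the large population selection removes it. That is the gist of why "wasteful" DNA can persist in species like ours.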

People like that would also understand the central dogma of molecular biology.


1. He goes on to propose a solution to this adaptationist paradox. Apparently, most of our genome consists of parasites (transposons), an idea he mistakenly attributes to Richard Dawkins' concept of The Selfish Gene. Parrington seems to have forgotten that most of the sequence of active transposons consists of protein-coding genes so it doesn't work very well as an explanation for excess noncoding DNA.

John Parrington and the C-value paradox

We are discussing John Parrington's book The Deeper Genome: Why there is more to the human genome than meets the eye. This is the second of five posts on: Five Things You Should Know if You Want to Participate in the Junk DNA Debate

1. Genetic load
John Parrington and the genetic load argument
2. C-Value paradox (this post)
John Parrington and the c-value paradox
3. Modern evolutionary theory
John Parrington and modern evolutionary theory
4. Pseudogenes and broken genes are junk
John Parrington discusses pseudogenes and broken genes
5. Most of the genome is not conserved
John Parrington discusses genome sequence conservation


2. C-Value paradox

Parrington addresses this issue on page 63 by describing experiments from the late 1960s showing that there was a great deal of noncoding DNA in our genome and that only a few percent of the genome was devoted to encoding proteins. He also notes that the differences in genome sizes of similar species gave rise to the possibility that most of our genome was junk. Five pages later (page 69) he reports that scientists were surprised to find only 30,000 protein-coding genes when the sequence of the human genome was published—"... the other big surprise was how little of our genomes are devoted to protein-coding sequence."

Contradictory stuff like that makes it very hard to follow his argument. On the one hand, he recognizes that scientists have known for 50 years that only 2% of our genome encodes proteins but, on the other hand, they were "surprised" to find this confirmed when the human genome sequence was published.

He spends a great deal of Chapter 4 explaining the existence of introns and claims that "over 90 per cent of our genes are alternatively spliced" (page 66). This seems to be offered as an explanation for all the excess noncoding DNA but he isn't explicit.

In spite of the fact that genome comparisons are a very important part of this debate, Parrington doesn't return to this point until Chapter 10 ("Code, Non-code, Garbage, and Junk").

We know that the C-Value Paradox isn't really a paradox because most of the excess DNA in various genomes is junk. There isn't any other explanation that makes sense of the data. I don't think Parrington appreciates the significance of this explanation.

The examples quoted in Chapter 10 are the lungfish, with a huge genome, and the pufferfish (Fugu), with a genome much smaller than ours. This requires an explanation if you are going to argue that most of the human genome is functional. Here's Parrington's explanation ...
Yet, despite having a genome only one eighth the size of ours, Fugu possesses a similar number of genes. This disparity raises questions about the wisdom of assigning functionality to the vast majority of the human genome, since, by the same token, this could imply that lungfish are far more complex than us from a genomic perspective, while the smaller amount of non-protein-coding DNA in the Fugu genome suggests the loss of such DNA is perfectly compatible with life in a multicellular organism.

Not everyone is convinced about the value of these examples though. John Mattick, for instance, believes that organisms with a much greater amount of DNA than humans can be dismissed as exceptions because they are 'polyploid', that is, their cells have far more than the normal two copies of each gene, or their genomes contain an unusually high proportion of inactive transposons.
In other words, organisms with larger genomes seem to be perfectly happy carrying around a lot of junk DNA! What kind of an argument is that?
Mattick is also not convinced that Fugu provides a good example of a complex organism with no non-coding DNA. Instead, he points out that 89% of this pufferfish's DNA is still non-protein-coding, so the often-made claim that this is an example of a multicellular organism without such DNA is misleading.
[Mattick has been] a true visionary in his field; he has demonstrated an extraordinary degree of perseverance and ingenuity in gradually proving his hypothesis over the course of 18 years.

HUGO Award Committee
Seriously? That's the best argument he has? He and Mattick misrepresent what scientists say about the pufferfish genome—nobody claims that the entire genome encodes proteins—then they ignore the main point; namely, why do humans need so much more DNA? Is it because we are polyploid?

It's safe to say that John Parrington doesn't understand the C-value argument. We already know that Mattick doesn't understand it and neither does Jonathan Wells, who also wrote a book on junk DNA [John Mattick vs. Jonathan Wells]. I suppose John Parrington prefers to quote Mattick instead of Jonathan Wells—even though they use the same arguments—because Mattick has received an award from the Human Genome Organization (HUGO) for his ideas and Wells hasn't [John Mattick Wins Chen Award for Distinguished Academic Achievement in Human Genetic and Genomic Research].

For further proof that Parrington has not done his homework, I note that the Onion Test [The Case for Junk DNA: The onion test] isn't mentioned anywhere in his book. When people dismiss or ignore the Onion Test, it usually means they don't understand it. (For a spectacular example of such misunderstanding, see: Why the "Onion Test" Fails as an Argument for "Junk DNA").


Five things John Parrington should discuss if he wants to participate in the junk DNA debate

It's frustrating to see active scientists who think that most of our genome could have a biological function but who seem to be completely unaware of the evidence for junk. Most of the positive evidence for junk is decades old so there's no excuse for such ignorance.

I wrote a post in 2013 to help these scientists understand the issues: Five Things You Should Know if You Want to Participate in the Junk DNA Debate. It was based on a talk I gave at the Evolutionary Biology meeting in Chicago that year.1 Let's look at John Parrington's new book to see if he got the message [Hint: he didn't].

There's one post for each of the five issues that informed scientists need to address if they are going to write about the amount of junk in your genome.

1. Genetic load
John Parrington and the genetic load argument
2. C-Value paradox
John Parrington and the c-value paradox
3. Modern evolutionary theory
John Parrington and modern evolutionary theory
4. Pseudogenes and broken genes are junk
John Parrington discusses pseudogenes and broken genes
5. Most of the genome is not conserved
John Parrington discusses genome sequence conservation


1. It hasn't seemed to help very much.

John Parrington and the genetic load argument

We are discussing John Parrington's book The Deeper Genome: Why there is more to the human genome than meets the eye. This is the first of five posts on: Five Things You Should Know if You Want to Participate in the Junk DNA Debate

1. Genetic load (this post)
John Parrington and the genetic load argument
2. C-Value paradox
John Parrington and the c-value paradox
3. Modern evolutionary theory
John Parrington and modern evolutionary theory
4. Pseudogenes and broken genes are junk
John Parrington discusses pseudogenes and broken genes
5. Most of the genome is not conserved
John Parrington discusses genome sequence conservation


1. Genetic load

The genetic load argument has been around for 50 years. It's why experts did not expect a huge number of genes when the genome sequence was published. It's why the sequence of most of our genome must be irrelevant from an evolutionary perspective.
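
The arithmetic behind the argument is easy to sketch. The numbers below are my own illustrative assumptions (roughly 100 new mutations per offspring, with about half of the hits to sequence-dependent functional DNA being deleterious), not figures from the book.

    # Illustrative genetic load arithmetic; all numbers are assumptions.
    new_mutations_per_offspring = 100      # commonly cited ballpark for humans
    deleterious_if_functional = 0.5        # assumed chance a hit to functional DNA is harmful

    for functional_fraction in (0.02, 0.10, 0.90):
        harmful = new_mutations_per_offspring * functional_fraction * deleterious_if_functional
        print(f"functional fraction {functional_fraction:.0%}: "
              f"~{harmful:.0f} new deleterious mutations per offspring")

If most of the genome were sequence-dependent, every child would carry dozens of new deleterious mutations—a load no species could tolerate. If only a few percent of the genome is sequence-dependent, the number drops to something selection can cope with.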

This argument does not rule out bulk DNA hypotheses but it does rule out all those functions that require specific sequences in order to confer biological function. This includes the speculation that most transcripts have a function and it includes the speculation that there's a vast amount of regulatory sequence in our genome. Chapter 5 of The Deeper Genome is all about the importance of regulatory RNAs.
So, starting from a failed attempt to turn a petunia purple, the discovery of RNA interference has revealed a whole new network of gene regulation mediated by RNAs and operating in parallel to the more established one of protein regulatory factors. ... Studies have revealed that a surprising 60 per cent of miRNAs turn out to be recycled introns, with the remainder being generated from the regions between genes. Yet these were parts of the genome formerly viewed as junk. Does this mean we need a reconsideration of this question? This is an issue we will discuss in Chapter 6, in particular with regard to the ENCODE project ...
The implication here is that a substantial part of the genome is devoted to the production of regulatory RNAs. Presumably, the sequences of those RNAs are important. But this conflicts with the genetic load argument unless we're only talking about an insignificant fraction of the genome.

But that's only one part of Parrington's argument against junk DNA. Here's the summary from the last Chapter ("Conclusion") ...
As we've discussed in this book, a major part of the debate about the ENCODE findings has focused on the question of what proportion of the genome is functional. Given that the two sides of this debate use quite different criteria to assess functionality it is likely that it will be some time before we have a clearer idea about who is the most correct in this debate. Yet, in framing the debate in this quantitative way, there is a danger that we might lose sight of an exciting qualitative shift that has been taking place in biology over the past decade or so. So a previous emphasis on a linear flow of information, from DNA to RNA to protein through a genetic code, is now giving way to a much more complex picture in which multiple codes are superimposed on one another. Such a viewpoint sees the gene as more than just a protein-coding unit; instead it can equally be seen as an accumulation of chemical modifications in the DNA or its associated histones, a site for non-coding RNA synthesis, or a nexus in a 3D network. Moreover, since we now know that multiple sites in the genome outside the protein-coding regions can produce RNAs, and that even many pseudo-genes are turning out to be functional, the very question of what constitutes a gene is now being challenged. Or, as Ed Weiss at the University of Pennsylvania recently put it, 'the concept of a gene is shredding.' Such is the nature of the shift that now we face the challenge of not just recognizing the true scale of this complexity, but explaining how it all comes together to make a living, functioning, human being.
I've already addressed some of the fuzzy thinking in this paragraph [The fuzzy thinking of John Parrington: The Central Dogma and The fuzzy thinking of John Parrington: pervasive transcription]. The point I want to make here is that Parrington's arguments for function in the genome require a great deal of sequence information. They all conflict with the genetic load argument.

Parrington doesn't cover the genetic load argument at all in his book. I don't know why since it seems very relevant. We could not survive as a species if the sequence of most of our genome was important for biological function.


Thursday, July 23, 2015

The essence of modern science education

The July 16th (2015) issue of Nature has a few articles devoted to science education [An Education]. The introduction to these articles in the editorial section is worth quoting. It emphasizes two important points that I've been advocating.
  1. Evidence shows us that active learning (student-centered learning) is superior to the old memorize-and-regurgitate system with professors giving PowerPoint presentations to passive students.
  2. You must deal with student misconceptions or your efforts won't pay off.
So many people have been preaching this new way of teaching that it's truly astonishing that it's not being adopted. It's time to change. It's time to stop rewarding and praising professors who teach the old way and time to start encouraging professors to move to the new system. Nobody says it's going to be easy.

We have professors whose main job is teaching. They should be leading the way.
One of the subjects that people love to argue about, following closely behind the ‘correct’ way to raise children, is the best way to teach them. For many, personal experience and centuries of tradition make the answer self-evident: teachers and textbooks should lay out the content to be learned, students should study and drill until they have mastered that content, and tests should be given at strategic intervals to discover how well the students have done.

And yet, decades of research into the science of learning has shown that none of these techniques is particularly effective. In university-level science courses, for example, students can indeed get good marks by passively listening to their professor’s lectures and then cramming for the exams. But the resulting knowledge tends to fade very quickly, and may do nothing to displace misconceptions that students brought with them.

Consider the common (and wrong) idea that Earth is cold in the winter because it is further from the Sun. The standard, lecture-based approach amounts to hoping that this idea can be displaced simply by getting students to memorize the correct answer, which is that seasons result from the tilt of Earth’s axis of rotation. Yet hundreds of empirical studies have shown that students will understand and retain such facts much better when they actively grapple with challenges to their ideas — say, by asking them to explain why the northern and southern hemispheres experience opposing seasons at the same time. Even if they initially come up with a wrong answer, to get there they will have had to think through what factors are important. So when they finally do hear the correct explanation, they have already built a mental scaffold that will give the answer meaning.

In this issue, prepared in collaboration with Scientific American, Nature is taking a close look at the many ways in which educators around the world are trying to implement such ‘active learning’ methods (see nature.com/stem). The potential pay-off is large — whether it is measured by the increased number of promising students who finish their degrees in science, technology, engineering and mathematics (STEM) disciplines instead of being driven out by the sheer boredom of rote memorization, or by the non-STEM students who get first-hand experience in enquiry, experimentation and reasoning on the basis of evidence.

Implementing such changes will not be easy — and many academics may question whether they are even necessary. Lecture-based education has been successful for hundreds of years, after all, and — almost by definition — today’s university instructors are the people who thrived on it.

But change is essential. The standard system also threw away far too many students who did not thrive. In an era when more of us now work with our heads, rather than our hands, the world can no longer afford to support poor learning systems that allow too few people to achieve their goals.
The old system is also wasteful because it graduates students who can't think critically and don't understand basic concepts.


Wednesday, July 22, 2015

University of Toronto Professor, teaching stream

After years of negotiation between the administration and the Faculty Association, the university has finally allowed full-time lecturers to call themselves "professors" [U of T introduces new teaching stream professorial ranks]. This brings my university into line with some other progressive universities that recognize the value of teaching.

Unfortunately, the news isn't all good. These new professors will have a qualifier attached to their titles. The new positions are: assistant professor (conditional), teaching stream; assistant professor, teaching stream; associate professor, teaching stream; and professor, teaching stream. Research and scholarly activity is an important component of these positions. The fact that the activity is in the field of pedagogy or the discipline in which they teach should not make a difference.

Meanwhile, current professors will not have qualifiers such as "professor: research," or "professor: administration," or "professor: physician," or "professor: mostly teaching."

The next step is to increase the status of these new professors by making searches more rigorous and more competitive, by keeping the salaries competitive with other professors in the university, and by insisting on high quality research and scholarly activity in the field of pedagogy. The new professors will have to establish a national and international reputation in their field just like other professors. They will have to publish in the pedagogical literature. They are not just lecturers. Almost all of them can do this if they are given the chance.

Some departments have to change the way they treat the new professors. The University of Toronto Faculty Association (UTFA) has published a guideline: Teaching Stream Workload. Here's the part on research and scholarly activity ....
  • In section 7.2, the WLPP offers the following definition of scholarship: “Scholarship refers to any combination of discipline-based scholarship in relation to or relevant to the field in which the faculty member teaches, the scholarship of teaching and learning, and creative/professional activities. Teaching stream faculty are entitled to reasonable time for pedagogical/professional development in determining workload.”
  • It is imperative that teaching stream faculty have enough time in their schedules, that is, enough “space” in their appointments, to allow for the “continued pedagogical/professional development” that the appointments policy (PPAA) calls for. Faculty teaching excessive numbers of courses or with excessive administrative loads will not have the time to engage in scholarly activity. Remember that UTFA fought an Association grievance to win the right for teaching stream faculty to “count” their discipline-based scholarship. That scholarship “counts” in both PTR review and review for promotion to senior lecturer.
And here's a rule that many departments disobey ...
Under 4.1, the WLPP reminds us of a Memorandum of Agreement workload protection: “faculty will not be required to teach in all three terms, nor shall they be pressured to volunteer to do so.” Any faculty member who must teach in all three terms should come to see UTFA.


Tuesday, July 21, 2015

The two mistakes of Kirk Durston

Kirk Durston thinks he's discovered a couple of mistakes made by people who debate evolution vs creationism [Microevolution versus Macroevolution: Two Mistakes].
I often observe that in discussions of evolution, both evolution skeptics and those who embrace neo-Darwinian evolution are prone to make one of two significant mistakes. Both stem from a failure to distinguish between microevolution and macroevolution.
Let's see how Durston defines these terms.

Debating Darwin's Doubt

Ninety years ago today, John Scopes was found guilty in Dayton, Tennessee (USA). The Intelligent Design Creationists have marked the anniversary with publication of a new book called Debating Darwin's Doubt [A Scientific Controversy That Can No Longer Be Denied: Here Is Debating Darwin's Doubt].

The book was necessary because there has been so much criticism of Stephen Meyer's original book, Darwin's Doubt. David Klinghoffer has an interesting way of turning this defeat into a victory because he declares,
... the new book is important because it puts to rest a Darwinian myth, an icon of the evolution debate, namely...that there is no debate, about evolution or intelligent design!

The creationism continuum

Intelligent Design Creationists often get upset when I refer to them as creationists. They think that the word "creationist" has only one meaning; namely, a person who believes in the literal truth of Genesis in the Judeo-Christian Bible. The fact that this definition applies to many (most?) intelligent design advocates is irrelevant to them since they like to point out that many ID proponents are not biblical literalists.

There's another definition of "creationist" that's quite different and just as common throughout the world. We've been describing this other definition to ID proponents for over two decades but they refuse to listen. We've been explaining why it's quite legitimate to refer to them as Intelligent Design Creationists but there's hardly any evidence that they are paying attention. This isn't really a surprise.

Sunday, July 19, 2015

God Only Knows

God Only Knows is one of my favorite pop songs.1 It's from the Pet Sounds album by the Beach Boys (1966).

Experts have admired Brian Wilson and the Beach Boys for decades but most people have forgotten (or never knew) about their best songs. (Good Vibrations was released as a single at the same time as Pet Sounds.)

I haven't yet seen the movie about Brian Wilson (Love & Mercy).

The first video is a BBC production from 2014 paying tribute to (and featuring) Brian Wilson. The second video is from 1966.





1. I will delete any snarky comments about God and atheism.

The fuzzy thinking of John Parrington: pervasive transcription

Opponents of junk DNA usually emphasize the point that they were surprised when the draft human genome sequence was published in 2001. They expected about 100,000 genes but the initial results suggested fewer than 30,000 (the final number is about 25,000).1 The reason they were surprised was because they had not kept up with the literature on the subject and they had not been paying attention when the sequence of chromosome 22 was published in 1999 [see Facts and Myths Concerning the Historical Estimates of the Number of Genes in the Human Genome].

The experts were expecting about 30,000 genes and that's what the genome sequence showed. Normally this wouldn't be such a big deal. Those who were expecting a large number of genes would just admit that they were wrong and they hadn't kept up with the literature over the past 30 years. They should have realized that discoveries in other species and advances in developmental biology had reinforced the idea that mammals only needed about the same number of genes as other multicellular organisms. Most of the differences are due to regulation. There was no good reason to expect that humans would need a huge number of extra genes.

That's not what happened. Instead, opponents of junk DNA insist that the complexity of the human genome cannot be explained by such a low number of genes. There must be some other explanation to account for the missing genes. This sets the stage for at least seven different hypotheses that might resolve The Deflated Ego Problem. One of them is the idea that the human genome contains thousands and thousands of nonconserved genes for various regulatory RNAs. These are the missing genes and they account for a lot of the "dark matter" of the genome—sequences that were thought to be junk.

Here's how John Parrington describes it on page 91 of his book.
The study [ENCODE] also found that 80 per cent of the genome was generating RNA transcripts and, as a sign of their importance, many were found only in specific cellular compartments, indicating that they have fixed addresses where they operate. Surely there could hardly be a greater divergence from Crick's central dogma than this demonstration that RNAs were produced in far greater numbers across the genome than could be expected if they were simply intermediates between DNA and protein. Indeed, some ENCODE researchers argued that the basic unit of transcription should now be considered as the transcript. So Stamatoyannopoulos claimed that 'the project has played an important role in changing our concept of the gene.'
This passage illustrates my difficulty in coming to grips with Parrington's logic in The Deeper Genome. Just about every page contains statements that are either wrong or misleading and when he strings them together they lead to a fundamentally flawed conclusion. In order to critique the main point, you have to correct each of the so-called "facts" that he gets wrong. This is very tedious.

I've already explained why Parrington is wrong about the Central Dogma of Molecular Biology [John Avise doesn't understand the Central Dogma of Molecular Biology]. His readers don't know that he's wrong so they think that the discovery of noncoding RNAs is a revolution in our understanding of biochemistry—a revolution led by the likes of John A. Stamatoyannopoulos in 2012.

The reference in the book to the statement by Stamatoyannopoulos is from the infamous Elizabeth Pennisi article, "ENCODE Project Writes Eulogy for Junk DNA" (Pennisi, 2012). Here's what she said in that article ...
As a result of ENCODE, Gingeras and others argue that the fundamental unit of the genome and the basic unit of heredity should be the transcript—the piece of RNA decoded from DNA—and not the gene. “The project has played an important role in changing our concept of the gene,” Stamatoyannopoulos says.
I'm not sure what concept of a gene these people had before 2012. It appears that John Parrington is under the impression that genes are units that encode proteins and maybe that's what Pennisi and Stamatoyannopoulos thought as well.

If so, then perhaps the publicity surrounding ENCODE really did change their concept of a gene but all that proves is that they were remarkably uninformed before 2012. Intelligent biochemists have known for decades that the best definition of a gene is "a DNA sequence that is transcribed to produce a functional product."2 In other words, we have been defining a gene in terms of transcripts for 45 years [What Is a Gene?].

This is just another example of wrong and misleading statements that will confuse readers. If I were writing a book I would say, "The human genome sequence confirmed the predictions of the experts that there would be no more than 30,000 genes. There's nothing in the genome sequence or the ENCODE results that has any bearing on the correct understanding of the Central Dogma and there's nothing that changes the correct definition of a gene."

You can see where John Parrington's thinking is headed. Apparently, Parrington is one of those scientists who were completely unaware of the fact that genes could specify functional RNAs and completely unaware of the fact that Crick knew this back in 1970 when he tried to correct people like Parrington. Thus, Parrington and his colleagues were shocked to learn that the human genome had only about 25,000 genes and many of them didn't encode proteins. Instead of realizing that his view was wrong, he thinks that the ENCODE results overthrew those old definitions and changed the way we think about genes. He tries to convince his readers that there was a revolution in 2012.

Parrington seems to be vaguely aware of the idea that most pervasive transcription is due to noise or junk RNA. However, he gives his readers no explanation of the reasoning behind such a claim. Spurious transcription is predicted because we understand the basic concept of transcription initiation. We know that promoter sequences and transcription factor binding sites are short sequences and we know that they HAVE to occur at a high frequency in large genomes just by chance. This is not just speculation. [see The "duon" delusion and why transcription factors MUST bind non-functionally to exon sequences and How RNA Polymerase Binds to DNA]

If our understanding of transcription initiation is correct then all you need is an activator transcription factor binding site near something that's compatible with a promoter sequence. Any given cell type will contain a number of such factors and they must bind to a large number of nonfunctional sites in a large genome. Many of these will cause occasional transcription, giving rise to low-abundance junk RNA. (Most of the ENCODE transcripts are present at less than one copy per cell.)
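
A quick calculation shows why this isn't just speculation. Treating the genome (naively) as random sequence with equal base frequencies, the expected number of exact matches to a binding-site-sized motif is enormous. This is my illustration with simplifying assumptions; real binding sites are degenerate, so the true counts are even higher.

    # Expected chance occurrences of an n-bp motif in a 3.2 Gb genome (both strands).
    # Simplifying assumptions: random sequence, equal base frequencies, exact matches only.
    genome_size_bp = 3.2e9

    for motif_length in (6, 8, 10):
        expected = 2 * genome_size_bp * 0.25 ** motif_length
        print(f"{motif_length} bp motif: ~{expected:,.0f} expected matches by chance")

Even a 10 bp site is expected to occur thousands of times by chance, so nonfunctional binding and occasional spurious transcription are unavoidable in a genome this large.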

Different tissues will have different transcription factors. Thus, the low-abundance junk RNAs must exhibit tissue specificity if our prediction is correct. Parrington and the ENCODE workers seem to think that the cell specificity of these low-abundance transcripts is evidence of function. It isn't—it's exactly what you expect of spurious transcription. Parrington and the ENCODE leaders don't understand the scientific literature on transcription initiation and transcription factor binding sites.

It takes me an entire blog post to explain the flaws in just one paragraph of Parrington's book. The whole book is like this. The only thing it has going for it is that it's better than Nessa Carey's book [Nessa Carey doesn't understand junk DNA].


1. There are about 20,000 protein-encoding genes and an unknown number of genes specifying functional RNAs. I'm estimating that there are about 5,000 but some people think there are many more.

2. No definition is perfect. My point is that defining a gene as a DNA sequence that encodes a protein is something that should have been purged from textbooks decades ago. Any biochemist who ever thought seriously enough about the definition to bring it up in a scientific paper should be embarrassed to admit that they ever believed such a ridiculous definition.

Pennisi, E. (2012) "ENCODE Project Writes Eulogy for Junk DNA." Science 337: 1159-1161. [doi:10.1126/science.337.6099.1159]

Monday, July 13, 2015

Casey Luskin doubles down on junk DNA

Intelligent Design Creationists are committed to the idea that most of our genome is functional. They oppose the idea that a significant proportion is junk. It's easy to see why a junky genome is incompatible with intelligent design but there's something more to their opposition than that.

I think they've painted themselves into a corner. They are so opposed to evolution and modern science that they will take any opportunity to discredit it. They saw a chance to do so about twenty years ago when they became aware of the controversy surrounding junk DNA. This was their chance to (pretend to) rely on real science to back their position. By taking a stance against junk DNA they could be seen to be supporting the latest evidence ... or so they thought.

Intelligent Design Creationists claim that they "predicted" that most of our genome would be functional. They claim that "Darwinists" predicted junk DNA. The second part isn't true since evolutionary theory is silent on whether some genomes could become bloated with junk DNA or not. However, the ID proponents are sticking to their guns in spite of the growing consensus that most of our genome is junk.

Sunday, July 12, 2015

The moral argument for the existence of gods

It's Sunday. I assume that most of you are at church but here's something for the rest of you. It's a sermon by William Lane Craig on The Moral Argument for the Existence of Gods. I found the link on Evolution News & Views (sic), a blog that's devoted to the "science" of Intelligent Design Creationism [Watch: Three from William Lane Craig]. I'm not sure what this has to do with evolution except that the word is mentioned a few times in the videos.

I'm also not sure what this has to do with intelligent design since, as we all know, the "science" of intelligent design has nothing to do with gods.

Saturday, July 11, 2015

Science and skepticism

The National Academies of Sciences (USA) formed a committee to look into scientific integrity. A summary of the report was published in the June 26th issue of Science (Alberts et al., 2015).

I'd like to highlight two paragraphs of that report.
Like all human endeavors, science is imperfect. However, as Robert Merton noted more than half a century ago "the activities of scientists are subject to rigorous policing, to a degree perhaps unparalleled in any other field of activity." As a result, as Popper argued, "science is one of the very few human activities—perhaps the only one—in which errors are systematically criticized and fairly often, in time, corrected." Instances in which scientists detect and address flaws in work constitute evidence of success, not failure, because they demonstrate the underlying protective mechanisms of science at work.
All scientists know this, but some of us still get upset when other scientists correct our mistakes. We have learned to deal with such criticism—and dish it out ourselves—because we know that's how knowledge advances. Our standards are high.

Friday, July 10, 2015

John Avise doesn't understand the Central Dogma of Molecular Biology

I've just read Conceptual Breakthroughs in Evolutionary Genetics by John Avise. Avise is a Distinguished Professor of Ecology & Evolutionary Biology in the School of Biological Sciences at the University of California at Irvine (Irvine, California, USA). He has written a number of excellent books, including Inside the Human Genome: A Case for Non-Intelligent Design.

His latest book consists of 70 idiosyncratic "breakthroughs" that have changed the way we think about biology. Each one is introduced with a short paragraph outlining "The Standard Paradigm" followed by another paragraph on "The Conceptual Revolution." There are 70 chapters, one for each "breakthrough," and all of them are two pages in length.

Chapter 42 is entitled: "1970 The Flow of Information."

Here's the "standard paradigm" according to John Avise.
In biochemical genetics, the molecular direction of information flow is invariably from DNA → RNA → protein. In other words, DNA is first transcribed into RNA, which then may be translated into polypeptides that make up proteins. This view was so ensconced in the field that it had become known as the "central dogma" (Crick, 1970) of molecular biology.
It's true that the Watson version of the Central Dogma was "ensconced" by 1970 and it's true that the incorrect Watson version is still "ensconced" in the textbooks.

It is NOT TRUE that this is the version that Crick described in 1970 or in his 1958 paper [see Basic Concepts: The Central Dogma of Molecular Biology]. Here's how Crick actually described the Central Dogma.
... once (sequential) information has passed into protein it cannot get out again (F.H.C. Crick, 1958)

The central dogma of molecular biology deals with the detailed residue-by-residue transfer of sequential information. It states that such information cannot be transferred from protein to either protein or nucleic acid. (F.H.C. Crick, 1970)
The version that John Avise refers to is the incorrect version promoted by Jim Watson.

I understand that many biologists have been taught an incorrect version of the Central Dogma but if you are going to write about it you are wise to read the original papers. In this case, Avise quotes the correct paper but he clearly has not read it.

Now let's look at the "conceptual revolution" according to John Avise.
Researchers showed that biochemical information could also flow from RNA → DNA. The key discovery came when Howard Temin and David Baltimore, working independently and on different viral systems, identified an enzyme (reverse transcriptase) that catalyzes the conversion of RNA into DNA, thus enabling the passage of genetic information in a direction contrary to the central dogma.
How do I know that John Avise has not read Crick's 1970 paper? Because here's what Crick says in that paper ...
"The central dogma, enunciated by Crick in 1958 and the keystone of molecular biology ever since, is likely to prove a considerable over-simplification."
This quotation is taken from the beginning of an unsigned article headed "Central dogma reversed", recounting the very important work of Dr Howard Temin and others showing that an RNA tumor virus can use viral RNA as a template for DNA synthesis. This is not the first time that the idea of the central dogma has been misunderstood, in one way or another. In this article I explain why the term was originally introduced, its true meaning, and state why I think that, properly understood, it is still an idea of fundamental importance.
Crick tells us that the discovery of reverse transcriptase did NOT conflict with the central dogma. Thus, John Avise's conceptual revolution never happened. What happened instead, at least for some biologists, is that the discovery of reverse transcriptase taught them that their view of the central dogma was wrong. Most biologists still haven't experienced that particular conceptual revolution.


Crick, F.H.C. (1958) On protein synthesis. Symp. Soc. Exp. Biol. XII:138-163.

Crick, F. (1970) Central Dogma of Molecular Biology. Nature 227, 561-563. [PDF file]

Kirk Durston appears on Evolution News & Views to announce that "Darwinian Theory" has been falsified

Kirk Durston is a Canadian biophysicist with a Ph.D. from the University of Guelph (Guelph, Ontario, Canada). He's been attacking evolution for more than a decade using all the old tricks and sophistry that we've come to expect from creationists.

I thought you might be interested in his latest attempt to discredit evolution. His post is at: An Essential Prediction of Darwinian Theory Is Falsified by Information Degradation.

He begins by claiming that "Darwinian Theory" (whatever that is) makes an essential prediction. It predicts that information must increase over time.
In the neo-Darwinian scenario for the origin and diversity of life, the digital functional information for life would have had to begin at zero, increase over time to eventually encode the first simple life form, and continue to increase via natural processes to encode the digital information for the full diversity of life.

An essential, falsifiable prediction of Darwinian theory, therefore, is that functional information must, on average, increase over time.

Friday, July 03, 2015

The fuzzy thinking of John Parrington: The Central Dogma

My copy of The Deeper Genome: Why there's more to the human genome than meets the eye has arrived and I've finished reading it. It's a huge disappointment. Parrington makes no attempt to describe what's in your genome in more than general hand-waving terms. His main theme is that the genome is really complicated and so are we. Gosh, golly, gee whiz! Re-write the textbooks!

You will look in vain for any hard numbers such as the total number of genes or the amount of the genome devoted to centromeres, regulatory sequences etc. etc. [see What's in your genome?]. Instead, you will find a wishy-washy defense of ENCODE results and tributes to the views of John Mattick.

John Parrington is an Associate Professor of Cellular & Molecular Pharmacology at the University of Oxford (Oxford, UK). He works on the physiology of calcium signalling in mammals. This should make him well-qualified to write a book about biochemistry, molecular biology, and genomes. Unfortunately, his writing leaves a great deal to be desired. He seems to be part of a younger generation of scientists who were poorly trained as graduate students (he got his Ph.D. in 1992). He exhibits the same kind of fuzzy thinking as many of the ENCODE leaders.

Let me give you just one example.

Friday, June 26, 2015

Junk DNA is so last century!

My copy of John Parrington's new book, The Deeper Genome: Why there is more to the human genome than meets the eye, is due to arrive in about three weeks. However, we already have a number of clues about what's in the book [see How the genome lost its junk according to John Parrington]. The excerpt on Amazon [How the genome lost its junk] tells us that Parrington is aware of the controversy surrounding the ENCODE project but comes down on the side of ENCODE.

That view is shared by science writer Claire Ainsworth who wrote a review in New Scientist: It's so last century.1 Ainsworth is a freelance science writer with a Ph.D. in developmental genetics from Oxford (Oxford, UK). She is co-founder of SciConnect, a company that teaches science communication skills to scientists.

Here's what she says in her review ....
John Parrington is an associate professor in molecular and cellular pharmacology at the University of Oxford. In The Deeper Genome, he provides an elegant, accessible account of the profound and unexpected complexities of the human genome, and shows how many ideas developed in the 20th century are being overturned.

Take DNA. It's no simple linear code, but an intricately wound, 3D structure that coils and uncoils as its genes are read and spliced in myriad ways. Forget genes as discrete, protein-coding "beads on a string": only a tiny fraction of the genome codes for proteins, and anyway, no one knows exactly what a gene is any more.

A key driver of this new view is ENCODE, the Encyclopedia of DNA Elements, which is an ambitious international project to identify the functional parts of the human genome. In 2012, it revealed not only that the protein-coding elements of DNA can overlap, but that the 98 per cent of the genome that used to be labelled inactive "junk" is nothing of the sort. Some of it regulates gene activity, some churns out an array of different kinds of RNA molecules (RNAs for short), some tiny, some large, many of whose functions are hotly debated. Parrington quotes ENCODE scientist Ewan Birney as saying at the time, "It's a jungle in there. It's full of things doing stuff." And that is one of the most apt genome metaphors I've ever read.
People, including science writers, can have different opinions about the validity of the ENCODE results and whether most of our genome is junk. They can also have different opinions about whether many of the ideas developed in the 20th century are still valid. However, I think it's only fair to at least acknowledge that others may have different opinions.

Ainsworth must be aware of the controversy over ENCODE's claim that most of our genome has a function. She could have pointed out that Parrington supports the function side but many prominent scientists support the junk DNA side. She could have noted that there have been several scientific papers published since 2012 that defend the concept of junk DNA—and defend it very well.

A good science journalist can express an opinion on a scientific controversy, but they are obliged to point out to their readers that it is just an opinion and that many expert scientists disagree.

The readers of this New Scientist book review will think that ENCODE was the last word on the debate, and that's not good science reporting.


1. The title of the online version is "DNA is life's blueprint? No, there's far more to it than that."

Thursday, June 25, 2015

UK bans teaching of creationism

The British Humanist Association is gloating over a recent decision by the government of the United Kingdom to ban the teaching of creationism in "all Academies and Free Schools, both those that already exist and those that will open in the future" [Government bans all existing and future Academies and Free Schools from teaching creationism as science].

This is ridiculous. I'm opposed to American politicians who meddle in science teaching, and I'm opposed to British politicians who do the same, even though I think creationism is bunk. Politicians should not be deciding what kind of science should, and should not, be taught in schools.

It's a matter of principle. It's as wrong as when American state governments banned the teaching of evolution.1

In addition, there are other reasons why this is a bad idea.
  1. Where do you stop? Do there also need to be laws banning the teaching of astrology, climate change denial, homeopathy, and Thatcherism? Do they need laws defining the correct history of how the traitors in the Thirteen Colonies formed an alliance with the French in order to overthrow well-meaning British governments?
  2. Why give creationists the ammunition to claim that they are being persecuted—especially when it's true?
  3. What's wrong with showing that creationism is bad science and refuting it in the classroom? Is that forbidden? Evolution is true; it doesn't need legal protection.
  4. Are the Brits so afraid of creationism that such a law is necessary in order to prevent creationist teachers from sneaking it into the classroom? If so, fix that problem by educating teachers.
  5. Was this a serious enough problem to warrant giving creationism a huge publicity boost?
  6. The government funding agreement notes that creationism "... should not be presented to pupils at the Academy as a scientific theory ..." Why not? I think that some parts of Intelligent Design Creationism really do count as valid scientific hypotheses, albeit bad ones. Why is the government taking a stand on the demarcation problem—especially an incorrect one?


Image Credit: Atheism and Me.

1. I'm not exactly sure who made the decision in the UK. It could be the case that "government" is just shorthand for decisions made by a body of science teachers and science experts. Those decisions are just implemented by the "government."

WHY IS IN OUR DNA

I posted this photo on Facebook yesterday. It shows a large sign on the side of Princess Margaret Hospital (Toronto, Ontario, Canada) saying "WHY IS IN OUR DNA." (Click to embiggen.)

Similar signs are appearing all over Toronto, especially at bus stops and in the subway.

What the heck does it mean? The best response on Facebook was that they meant to say "Y is in our DNA" but I don't think that's what they meant. Maybe they are talking about genetic diseases? Maybe they're asking if 90% of our genome is junk? Are they questioning whether "it" is in our DNA (Why is it in our DNA?)? Are they talking about cancer? (The research wing of Princess Margaret Hospital studies cancer.)

As it turns out, none of the above. Here's the real story: Why is in Our DNA. It's straight from The Princess Margaret Cancer Foundation press release.
As The Princess Margaret Cancer Foundation continues to raise funds for one of the top 5 cancer research centres in the world, we’re asked often: Why is The Princess Margaret a world leader? There are many reasons the centre has achieved a world-class reputation, but this spring, in our communications with donors and the public, we are focusing on one reason—our scientists and researchers never stop asking why…

Why is the body’s immune system not able to fight off all cancers?
Why does cancer return in some patients?
Why did a particular cancer drug work for one patient but not any of the others?
Why did one patient’s tumour shrink dramatically with radiation, but another’s barely at all?

Starting in May, you’ll see on banners outside the cancer centre, the Princess Margaret Cancer Research Tower and in various media: WHY is in our DNA. This ‘Why gene’ that our team possesses in abundance has already helped to build The Princess Margaret’s rich history of discovery and innovation, ...
Now I get it. It's because scientists at PMH have a "why" gene.

I wonder what PR firm they hired and whether they thought this through? At the very least, this is a case where punctuation would help: "Why?" is in our DNA. Maybe they should have also put a disclaimer at the bottom: "But not in your DNA."



Wednesday, June 24, 2015

The ugly face of Intelligent Design Creationism

Intelligent Design Creationists like to pretend that their particular version of creationism is scientific. They claim they have evidence of an intelligent designer (gods), but 99% of their literature is an attack on evolution and not a defense of a new scientific theory. Those attacks take many forms but the worst ones are the attempts to smear Darwin and associate evolutionary biology with racism and the holocaust.

David Klinghoffer continues this despicable tradition in his latest post on Evolution News & Views (sic): In Explaining Dylann Roof's Inspiration, the Media Ignore Ties to Evolutionary Racism. Klinghoffer discusses the views of Dylann Roof, the man who shot and killed nine people in a church in Charleston, South Carolina (USA). Dylann Roof is reported to be a devout Christian [Dylann Roof was a devout Christian] but that's not relevant.
Of course, no one I've referred to endorses Dylann Roof's murderous rampage. I don't doubt that they are all sincerely mortified by the association, however unintended, with such unapologetic, undisguised evil.

I mention this at all not to blame them for Roof's crime, in any way, but simply to note -- because the mainstream media covers it up -- how certain ideas tend to hang together.

The racial elements in Charles Darwin's writing, the eugenicist implications, are often brushed aside as ugly but incidental, a mere byproduct of his time and place. Yet the myth of European superiority over inferior dark peoples continues to percolate in some evolutionary thinking, a century and more after the close of the Victorian era. It seems to have found an eager student in a disturbed young man named Dylann Roof.


Monday, June 22, 2015

Jerry Coyne on Lewontin and methodological naturalism

I'm working my way through Jerry Coyne's new book. There's lots of good stuff in there but I was particularly interested in his comment about his former Ph.D. supervisor, Richard Lewontin. The issue is whether science is confined to methodological naturalism, leaving religion as the only way to investigate supernatural claims.

We've been over this many times in the past few decades but it's still worth reminding people of the only rational response to such a claim. This is from pages 91 and 92 of Faith vs. Fact: Why Science and Religion Are Incompatible.
... some scientists persist in claiming, wrongly, that naturalism is a set-in-stone rule of science. One of these is my Ph.D. advisor, Richard Lewontin. In a review of Carl Sagan's wonderful book The Demon Haunted World, Lewontin tried to explain the methods of science:
It is not that the methods and institutions of science somehow compel us to accept a material explanation of the phenomenal world, but, on the contrary, that we are forced by our a priori adherence to material causes to create an apparatus of investigation and a set of concepts that produce material explanations, no matter how counter-intuitive, no matter how mystifying to the uninitiated. Moreover, that materialism is absolute, for we cannot allow a Divine Foot in the door.
That quotation has been promulgated with delight by both creationists and theologians, for it seems to show the narrow-mindedness of scientists who refuse to even admit the possibility of the supernatural and immaterial. But Lewontin was mistaken. We can in principle allow a Divine Foot in the door; it's just that we've never seen the Foot. If, for example, supernatural phenomena like healing through prayer, accurate religious prophecies, and recollection of past lives surfaced with regularity and credibility, we might be forced to abandon our adherence to purely natural explanations. And in fact we've sometimes put naturalism aside by taking some of these claims seriously and trying to study them. Examples include ESP and other "paranormal phenomena" that lack any naturalistic explanation.

Sadly, arguments similar to Lewontin's—that naturalism is an unbreakable rule of science—are echoed by scientific organizations that want to avoid alienating religious people. Liberal believers can be useful allies fighting creationism, but accommodationists fear that those believers will be driven away by any claim that science can tackle the supernatural. Better to keep comity and pretend that science by definition can say nothing about the divine. This coddling of religious sentiments was demonstrated by Eugenie Scott, the former director of an otherwise admirable anti-creationist organization, the National Center for Science Education:
First, science is a limited way of knowing, in which practitioners attempt to explain the natural world using natural explanations. By definition, science cannot consider supernatural explanations: if there is an omnipotent deity, there is no way that a scientist can exclude or include it in a research design. This is especially clear in experimental research: an omnipotent deity cannot be "controlled" (as one wag commented, "you can't put God in a test tube, or keep them out of one"). So by definition, if an individual is attempting to explain some aspect of the natural world using science, he or she must act as if there were no supernatural forces operating on it. I think this methodological naturalism is well understood by evolutionists.
Note that Scott claims naturalism as part of the definition of science. But that's incorrect, for nothing in science prohibits us from considering supernatural explanations. Of course, if you define "supernatural" as "that which cannot be investigated by science," then Scott's claims become tautologically true. Otherwise, it's both glib and misleading to say that God is off-limits because he can't be "controlled" or "put in a test tube." Every study of spiritual healing or the efficacy of prayer (which, if done properly, includes controls) puts God into a test tube. It's the same for tests of non-divine supernatural phenomena like ESP, ghosts, and out-of-body experiences. If something is supposed to exist in a way that has tangible effects on the universe, it falls within the ambit of science. And supernatural beings and phenomena can have real-world effects.