More Recent Comments

Thursday, June 03, 2010

Smart Crocodile Eaters?

National Geographic Daily News has just published an embarrassing article about early human evolution [Eating Crocodile Helped Boost Early Human Brains?].

An ancient kitchen dating back 1.95 million years was discovered in Kenya. Among the bones recovered at this were those of fish, turtles, and crocodiles. The paper has just been published in PNAS (Braun et al., 2010).

Here's what the National Geographic science writer (Christine Dell'Amore) reports ...
According to the study authors, the addition of water-based prey into early-human diets may have been what boosted brain size in certain hominins—humans plus human ancestral species and their close evolutionary relatives.

That's because reptiles and fish are particularly rich in long-chain polyunsaturated fatty acids. Some experts think this so-called good fat was "part of the package" of human brain evolution, said study leader David Braun, an archaeologist at the University of Cape Town in South Africa.

Discovering evidence for "brain food" in the late Pliocene (about 3 to 1.8 million years ago) may explain how bigger brains—for instance in our likely direct ancestor Homo erectus—arose in humans and their relatives about 1.8 million years ago, Braun said.
This is one of those cases where the press report accurately describes what's in the paper. Here's the conclusion of the PNAS paper,
The evidence from FwJj 20 indicates that hominins were very effective at securing access to a wider variety of high-quality animal tissues than has been previously documented. Some of these resources would have provided necessary dietary resources without the added predation risks associated with interactions with large mammalian carnivores that are sometimes involved with the acquisition of elements of large mammal carcasses (28, 33). In addition, although animal tissues provide nutrient-rich fuel for a growing brain, aquatic resources (e.g., fish, crocodiles, turtles) are especially rich sources of the long-chain polyunsaturated fatty acids and docosahexaenoic acid that are so critical to human brain growth (2). Therefore, the incorporation of diverse animals, especially those in the lacustrine food chain, provided critical nutritional components to the diets of hominins before the appearance of H. ergaster/erectus that could have fueled the evolution of larger brains in late Pliocene hominins.
There are so many problems here, I hardly know where to begin.

First, there's the implicit assumption that eating food rich in long chain polyunsaturated fatty acids contributes to brain growth. As far as I know, the total scientific evidence does not strongly support this assumption—although there are plenty of studies that make the claim. (Their reference 2 is to another anthropological study.) This is not an assumption that one should build a theory on, but, if you do, you'd better back it up with references to the primary biochemistry and physiology literature.

Second, it's easy to be confused about the importance of dietary lipids. For the record, you need to eat foods containing linolenate and linoleate because these are essential fatty acids. You can't synthesize them but you need them in order to make other important fatty acids. There's plenty of these essential fatty acids in plants, which is why our fellow primates (chimps and gorillas) survive quite nicely without eating crocodiles. The "magical brain food" kinds of fatty acids are the other omega-3 fatty acids that we can synthesize as long as we have an adequate supply of the essential fatty acids.

Finally, let's think about the hypothesis being put forward. The idea is that our ancient ancestors weren't very smart. They had small brains. Some of them started to eat fish and crocodiles and that made their brain get bigger. Presumably this mostly affected the children since there's no evidence that diet can make an adult brain grow bigger.

As the years passed, the entire population acquired bigger brains as a result of their diet. Maybe this group out-competed their neighbors who didn't like fish so that eventually the entire hominid population of the region had big brains and ate fish.

What has this got to do with evolution? How do you get from a cultural preference for eating fish to changes in the genes controlling brain development? Are the authors implying some kind of Lamarckian inheritance? How, exactly, does eating fish translate into genetic changes (i.e. evolution)?

Am I missing something?


[Photo Credit: ProGolferDigest]

Braun, D.R., Harris, J.W.K., Levin, N.R., McCoy, J.T., Herries, A.I.R., Bamford, M.K., Bishop, L.C., Richmond, B.G. and Kibunjia, M. (2010) Early hominin diet included diverse terrestrial and aquatic animals 1.95 Ma in East Turkana, Kenya. Proc. Natl. Acad. Sci. (USA) 107: 10002-10007. [doi: 10.1073/pnas.1002181107]

Sunday, May 30, 2010

Calibrating the Molecular Clock

John Hawks is discussing the evolution of hominids on his blog and, in particular, whether Ardipithecus (Ardi) is a hominid [Ardipithecus challenge explication: the molecular clock].

This is a complex issue. One of the problems is that Ardi is supposed to have lived 5.5 million years ago, according to John Hawks, but all estimates of the human-chimp divergence say it occurred between 3 and 5 million years ago. If that's true then Ardi is not in either the chimp or human lineages.

The human-chimp divergence is based on calibrating the molecule clock and that's what John addresses in his post. He seems to think that this calibration is accurate [Reviewing the clock, and phylogenomics] but I'm not so sure. Many of these studies (but not all) require calibrating the rate of change by using fixed time points inferred from the fossil record. For example, if you assume that primates and rodents last shared a common ancestor 100 million years ago then you can get a rate of change by adding up the number of changes in each lineage and dividing by 100 (substitutions per million years). Then you look at the number of substitutions in the human and chimp lineages and calculate the years since they diverged.

This is an over-simplification, as John explains on his blog, because the calibrations are also based on known mutation rates and population genetics. The theoretical models agree on a human-chimp divergence time of 3-5 million years.

I've been skeptical of the fossil record calibrations for many years because they give some very unreasonable divergence times and because the so-called "fixed" standards also seem unreasonable. The molecular clock ticks at an approximately constant rate but we just don't know what that rate is. I would have no problem accepting that humans and chimps diverged 6-7 million years ago.


[Reconstructions: Copyright 2009, J.H. Matternes.]

A Young Student of Physics

 
I know you all hate it when bloggers inundate you with photos of their kids and grandkids but here's one I can't resist. It's my granddaughter Zoë (5 months old) learning vector calculus. It's never too soon to start.



She'll probably have to wait until she gets older to move on to more difficult subjects like biology. I think her mom is doing the right thing by starting out with the easiest science.


Friday, May 28, 2010

Junk DNA and Genetics Textbooks

 
The latest issue of The GSA Reporter, published by the Genetics Society of America, is just out and they have an article on The Ins and Outs of Textbook Authorship. Here's something I agree with.
Anthony Griffiths (University of British Columbia, Vancouver) places more emphasis on core principles than on specific applications: “The goal is to show how genetic inference is made ... hence overall the emphasis is more on process than the discoveries.”
That's right. A textbook should emphasize concepts and principles and not facts. The goal is to show students how all of the knowledge we have fits into a coherent picture of the subject. Genetics, like all sciences, is based on models that represent the consensus view of the scientists in the field. It's important for students to see that these models are internally consistent and compatible with biology and biochemistry and all other sciences. It's important that students realize that there may be some controversy in the discipline but that they need to learn how to sort it out.

The next paragraph says, ...
[Scott] Hawley also places a heavy emphasis on core topics, because he “factor[s] in heavily the concept that so-called facts can be pretty ephemeral in science.” As Hawley explains, “Many of the ‘facts’ I was taught in college are either irrelevant now or wrong. For example, I heard many lectures as an undergrad asserting that a huge part of the genome was useless ‘junk’. We no longer look at things that way.”
I'm looking forward to seeing the next edition of The Human Genome by Julia Richards and Scott Hawley. It will be interesting to see what principles and concepts they advance to explain the absence of junk in our genome.

One of the things textbook authors have to careful of is discarding solid, well-established, models (like junk DNA) based on the results of a few modern experiments. Yes, it's true that new discoveries often overthrow old concepts, but it also true that when new "facts" disagree with established models it's usually the new facts that turn out to be wrong. The idea that theories are frequently overthrown by "nasty little facts" is a myth.

Rejecting the concept of junk DNA has consequences that will be difficult to handle in the next edition. It means re-writing the sections on the C-value Paradox, transposons (especially defective transposons), selfish DNA, pseudogenes, and genetic load. Also, the explanation for why this DNA is functional is going to have serious ramifications for other topics. I can't imagine how they'll put together a coherent picture of modern genetics if they reject junk DNA.

If you're looking for a good genetics textbook then here's my advice. Buy the one that supports the idea of copious amounts of junk in our genome and explains why it has to be junk. Ignore any textbook that rejects the notion of junk DNA—it will probably have other things wrong as well.


Thanks to a friend who alerted me to the article in The GSA Reporter.

Sunday, May 23, 2010

Junk DNA on BIOpinionated

 
Nils Reinton and I are discussing junk DNA on his blog [More crap from the junkies]. It might surprise you to learn that this "junkie" still isn't convinced that junk DNA is dead. Nils isn't convinced that junk DNA exists.

This is what a real scientific controversy looks like.


Saturday, May 22, 2010

Bill Dembski, Isaac Asimov, and The Second Law of Thermodynamics

 
According to Bill Dembski, "The 2nd Law of Thermodynamics has never been a friend of materialistic evolution." [Granville Sewell on the 2nd Law]. His authority for such a ridiculous statement is none other than Granville Sewell, a Professor of Mathematics at the University of Texas, El Paso. Biochemists have never had a problem with the 2nd law or evolution.

This is a good time to remind people of a famous quotation by Isaac Asimov—a biochemist— from his 1981 essay, The “Threat” of Creationism.
Creationists have learned enough scientific terminology to use it in their attempts to disprove evolution. They do this in numerous ways, but the most common example, at least in the mail I receive is the repeated assertion that the second law of thermodynamics demonstrates the evolutionary process to be impossible.

In kindergarten terms, the second law of thermodynamics says that all spontaneous change is in the direction of increasing disorder—that is, in a "downhill" direction. There can be no spontaneous buildup of the complex from the simple, therefore, because that would be moving "uphill." According to the creationists argument, since, by the evolutionary process, complex forms of life evolve from simple forms, that process defies the second law, so creationism must be true.

Such an argument implies that this clearly visible fallacy is somehow invisible to scientists, who must therefore be flying in the face of the second law through sheer perversity. Scientists, however, do know about the second law and they are not blind. It's just that an argument based on kindergarten terms is suitable only for kindergartens. [my emphasis - LAM]


Friday, May 21, 2010

"American" History

 
PZ Myers posted this video of Cynthia Dunbar reciting a prayer to open a meeting of the Texas IDiots state board of education [Another reason to ban official prayer at public meetings]. He makes an important point: why the hell is anyone saying prayers to open a meeting of publicly elected government officials? We do this in Canada as well. It makes no sense in the 21st century.



But that's not the only thing weird about this prayer. PZ draws your attention to the following statements in the "prayer."
I believe no one can read the history of our country without realizing that the Good Book and the spirit of the savior have from the beginning been our guiding geniuses.

Whether we look to the first charter of Virginia, or the charter of New England...the same objective is present — a Christian land governed by Christian principles.

I like to believe we are living today in the spirit of the Christian religion. I like also to believe that as long as we do so, no great harm can come to our country.
Keep in mind that this is the same board of education that is rewriting American history. They don't have a lot of credibility. Having said that, there's one thing I'd like to point out. Cynthia Dunbar makes reference to the First Charter of Virginia as evidence that the United States of America is a Christian nation.

Here's a bit from the beginning of that charter from: The First Charter of Virginia.
JAMES, by the Grace of God, King of England, Scotland, France and Ireland, Defender of the Faith, etc. WHEREAS our loving and well-disposed Subjects, Sir Thomas Gates, and Sir George Somers, Knights, Richard Hackluit, Clerk, Prebendary of Westminster, and Edward-Maria Wingfield, Thomas Hanham, and Raleigh Gilbert, Esquires William Parker, and George Popham, Gentlemen, and divers others of our loving Subjects, have been humble Suitors unto us, that We would vouchsafe unto them our License, to make Habitation, Plantation, and to deduce a colony of sundry of our people into that part of America commonly called VIRGINIA, and other parts and Territories in America, either appertaining unto us, or which are not now actually possessed by any Christian Prince or People, situate, lies, and being all along the Sea Coasts, between four and thirty Degrees of Northerly Latitude from the Equinoctial Line, and five and forty Degrees of the same Latitude, and in the main Land between the same four and thirty and five and forty Degrees, and the Islands thereunto adjacent, or within one hundred Miles of the Coast thereof;

....

We, greatly commending, and graciously accepting of, their Desires for the Furtherance of so noble a Work, which may, by the Providence of Almighty God, hereafter tend to the Glory of his Divine Majesty, in propagating of Christian Religion to such People, as yet live in Darkness and miserable Ignorance of the true Knowledge and Worship of God, and may in time bring the Infidels and Savages, living in those parts, to human Civility, and to a settled and quiet Government: DO, by these our Letters Patents, graciously accept of, and agree to, their humble and well-intended Desires;
This doesn't sound much like the United States of America, does it? The United States didn't come into existence until almost 180 years after this charter was written. Furthermore, when the revolution began the goal was to separate from Great Britain and its monarch and start a new country that did not have a state religion.

At least I thought that was the goal. Does Ms. Dunbar want to turn back the clock and revert to being a colony of Great Britain? Does she want Queen Elizabeth II to become the American head of state and the Church of England to become the state religion as in 1606? I'm not sure that Britain would agree to such a change. But I bet if you ask them nicely they'd consider giving you Prince Charles as an American king.


Dear Royal Ontario Museum ...

 
Indicate in the comments whether you'd like to sign this letter as a supporter of the Committee for the Advancement of Scientific Skepticismat the Center for Inquiry (Canada). Include your name, title, and affiliation. Email me if you'd rather not post a comment. (My name is "l.moran" and my domain is "utoronto.ca")

See Shame on the Royal Ontario Museum for more information about the event.
William Thorsell
Director, the Royal Ontario Museum
100 Queen's Park
Toronto, ON
M5S 2C6

Mr. Thorsell,

We at the Committee for the Advancement of Scientific Skepticism (CASS) at the Centre for Inquiry (Canada) and its supporters were dismayed  to learn that the Royal Ontario Museum will be sponsoring a talk by Deepak Chopra at the University of Toronto in connection with the Director's Signature Series: The Warrior Emperor and China's Terracotta Army.

While we fully support the concept of academic freedom, we are baffled by this invitation and wonder how it fits into the mandate of the museum to "serve as an advocate for science in the study of nature," as stated in your message on the ROM website.  Mr. Chopra's new age psycho-babble may be attractive to the general public, but by inviting him to speak at the ROM, you lend undeserved scientific credibility to his pseudo-scientific claims about quantum physics, psychology, chemistry and medicine. These claims are rightly rejected as absurd by the scientific community and by promoting them you tarnish the otherwise excellent scientific reputation of the Royal Ontario Museum.

CASS will be publishing the standard rebuttals of Deepak Chopra's fanciful quackery in order to help the public understand where he goes off the rails. Our hope is to turn this otherwise embarrassing event into a learning opportunity.  We are also contacting the sponsors of the event and the ROM's other private donors in order to voice our concern about Mr. Chopra's presentation. We would like the ROM to clarify how Mr. Chopra's visit fits into this lecture series, as it seems this is just another opportunity for him to promote his new book.  

We look forward to hearing from you.

 
Sincerely,

The Committee for the Advancement of Scientific Skepticism (CASS) at the Centre for Inquiry Canada


Thursday, May 20, 2010

The Mutationism Myth: III Foundations of Evolutionary Genetics

 
This is the fifth in a series of postings by guest blogger, Arlin Stoltzfus. You can read the introduction to the series at: Introduction to "The Curious Disconnect". The first part is at: The "Mutationism" Myth I. The Monk's Lost Code and the Great Confusion. The second installment is: Theory vs Theory. The third part is: The Mutationism Myth, II. Revolution



The Curious Disconnect


Today in the Curious Disconnect we continue with our series on the Mutationism Myth. In this oft-told story (see part 1), the discovery of genetics in 1900 leads to rejection of Darwin's theory and the rise of "mutationism", a laughable1 theory that imagines evolution by mutation alone, without selection. "Mutationism" prevails for a generation, until Fisher, Haldane and Wright show that genetics is the missing key to Darwinism. In the conclusion to the story, the world is set right again when the "Modern Synthesis", combining selection with Mendelian genetics, shoulders aside the mutationist heresy, which ends up in the dustbin of history with the other "doomed rivals" of Darwin's great theory.2

Thats the story, at least. In reality- as we found out in part 2-, the Mendelians rejected Darwin's errant principles of heredity, not his principle of selection. What kind of view did the Mendelians develop? Addressing this question is our next challenge. Today, in part 3, we'll consider aspects of the Mendelian view that became the foundations of mainstream 20th-century thinking. In part 4, we'll delve into some "non-Darwinian" or "anti-Darwinian" aspects that were rejected, or merely ignored.

The Mutationism Myth. 3. Foundations of evolutionary genetics


Darwin's "Natural Selection" theory posited a smooth and automatic process of adaptation to altered conditions, dependent on infinitesimal hereditary fluctuations ("indefinite variability", in Darwin's terminology) induced by the effect of "altered conditions of life" on the "sexual organs". As we discovered in part 2, geneticists rejected fluctuation because it is incompatible with the assumption of exclusively Mendelian inheritance, an assumption embraced eagerly by geneticists, and held in suspicion by others for many years. As Bateson wrote:
"To Darwin the question, What is a variation? presented no difficulties. Any difference between parent and offspring was a variation. Now we have to be more precise. First we must, as de Vries has shown, distinguish real, genetic, variation from fluctuational variations, due to environmental and other accidents, which cannot be transmitted." (p. 95)
and as Morgan wrote:
"As has been explained, the kind of variability on which Darwin based his theory of natural selection can no longer be used in support of that theory, because, in the first place, in so far as fluctuating variations are due to environmental effect, these differences are now known not to be inherited, and because, in the second place, selection of the differences between individuals, due to the then existing genetic variants, while changing the number of individuals of a given kind, will not introduce anything new. The essential [feature] of the evolutionary process is the occurrence of new characteristics." p. 148-149 of Morgan (1932) 3

Because heredity and variation did not behave in the manner assumed by Darwin and his followers, it was up to a new generation of evolutionists to develop a new understanding of evolution. Thus, at a time when naturalists were dismissing genetics and clinging to 19th-century views of heredity, including Darwinism and Lamarckism, a group of Young Turks4 was laying the foundations of the genetics-based understanding of evolution that dominated the 20th century.

The concept of population genetics


To understand these foundations, I need to say a few words about the theoretical side of evolutionary genetics, often referred to as "population genetics". Please recall from Theory vs. Theory that when we talk about population genetics theory or music theory, thats a different sense of "theory" from Lamarck's theory or the prion theory of disease. Previously, we called them theory2 (body of abstract principles) and theory1 (grand conjecture).

Population genetics theory2 (roughly speaking) works out the implications of transmission genetics in populations of reproducing organisms, focusing on implications of such Mendelian phenomena as biparental inheritance, chromosome assortment, mutation, recombination, sex-linked inheritance, and so on.

As it exists today, population genetics theory2 covers a wide range of possible worlds, and thus a wide range of possible theories1. For instance, it provides classic equations to treat allele frequencies continuously and deterministically (e.g., Hardy-Weinberg), and at the same time, it provides another framework for addressing probabilistic changes with random drift. Is evolution deterministic or probabilistic? Population genetics theory2 doesn't say- it allows us to consider both possibilities. Is evolutionary change smooth or does it come in chunks? Population genetics theory2 doesn't say: it provides a quantitative genetics framework for continuous changes in quantitative characters, and a completely different framework for molecular evolutionists examining discrete characters. There are limiting cases where these different frameworks converge in some respects, but there is not any single realizable world in which all of population genetics theory2 applies, thus theoretical population genetics can't be understood as a theory1.

Crudely speaking, three frameworks of population genetics theory have been important in the 20th century: the stability analysis5 of systems of continuous allele frequencies, initially deterministic a la Hardy-Weinberg (or Lewontin-Kojima and so on) and later stochastic; the "quantitative genetics" theory of generational change in continuous-valued phenotypic characters (with implicit genetics) subject to selection; and the dynamics of the steady-state origin-fixation process, which was not an important paradigm until Kimura proposed the neutral theory.

The Bateson-Saunders equilibrium


In a landmark 1902 report to the Evolution committee of the Royal Society, Bateson and Saunders report some of their own findings and, more generally, try to explain the new science of Mendelian genetics, and the implications of Mendel's rules for evolution. In one of many fascinating comments, Bateson and Saunders suggeest that:
"It will be of great interest to study the statistics of such a population [with recognizable Mendelian characters] in nature. If the degree of dominance can be experimentally determined, or the heterozygote recognised, and we can suppose that all forms mate together with equal freedom and fertility, and that there is no natural selection in respect of the allelomorphs, it should be possible to predict the proportions of the several components of the population with some accuracy. Conversely, departures from the calculated result would then throw no little light on the influence of disturbing factors, selection, and the like.
Those of you who know your population genetics will recognize, in this passage, a paradigm that continues to play a key role in contemporary research as a "zero-force" model, describing the case of an unperturbed system, i.e., a system at rest. Deviations from this resting state indicate the perturbing effect of some factor or force.

In 1908, Hardy and Weinberg independently derived solutions for the frequencies of genotypes and alleles in the zero-force model of Bateson and Saunders. The mathematical solution to the Hardy-Weinberg equilibrium, as it came to be called, is sufficiently trivial that publishing it was nearly beneath the dignity of G.H. Hardy, the archetypal pure mathematician. In his paper, Hardy seems to sneer at biologists, saying "I should have expected the very simple point which I wish to make to have been familiar to biologists". Legend has it that Hardy learned of this problem while playing cricket with Punnett, the Mendelian, providing an early example of how interdisciplinary work is done.

The research program that eventually developed around this model was exactly as Bateson and Saunders imagined: compute the Hardy-Weinberg equilibrium, compare this to the observed frequencies, then interpret any deviations in terms of "the influence of disturbing factors". Researchers continue to use it, as one may find by searching PubMed with "hardy-weinberg AND 2009 [date]", which yields 532 publications for 2009. Contemporary philosophers discussing causation in evolutionary theory make frequent reference to Hardy-Weinberg as a zero-force law (see Stephens, 2001).

Given the crystal-clear statement of the problem by Bateson and Saunders, including the assumptions and the interpretive framework, should we not call it the Bateson-Saunders-Hardy-Weinberg equilibrium (or the Bateson-Saunders-Weinberg equilibrium, saving Hardy the embarrassment of receiving credit for something practical 6)?

Morgan's origin-fixation process


An entirely different, but similarly prescient, model is found in T.H. Morgan's 1916 book:
"If through a mutation a character appears that is neither advantageous nor disadvantageous, but indifferent, the chance that it may become established in the race is extremely small, although by good luck such a thing may occur rarely. It makes no difference whether the character in question is a dominant or a recessive one, the chance of its becoming established is exactly the same. If through a mutation a character appears that has an injurious effect, however slight this may be, it has practically no chance of becoming established.
If through a mutation a character appears that has a beneficial influence on the individual, the chance that the individual will survive is increased, not only for itself, but for all of its descendants that come to inherit this character. It is this increase in the number of individuals possessing a particular character, that might have an influence on the course of evolution." (187-189)
This is an abbreviated framework for understanding evolution under the "new mutations" or "mutation-limited" view that is now commonplace in molecular evolution. A new mutation arises and may "become established- we would say "become fixed" or "reach fixation" in population-genetics jargon- with a probability (not a certainty) that depends on its effects. If its effects are injurious, is has practically no chance of being established, and so on.

Morgan's verbal description is remarkably accurate. Later, in the 1920s, Haldane, Wright, and Fisher began to work out some approximations for the probability of fixation of a new mutant allele. For newly introduced neutral alleles,  (substitute 2N for diploids), where N is the population size, and this value is not affected by recessivity or dominance, just as Morgan says; for a newly introduced beneficial allele, , where s is the selective advantage; for a significantly deleterious allele, the probability of fixation is vanishingly small. Later, diffusion theory was used to derive a more general expression for the probability of fixation (e.g., Gillespie, 1998, p. 82)

where the starting frequency p would be 1/N for a new mutation in the haploid case (and 1/2N for the diploid case).

To the extent that there was a distinctive "mutationist" perspective on evolutionary genetics that was rejected for its non-Darwinian implications, this was it. While Haldane, Fisher and Wright worked out the theory2 for the probability of fixation of a new mutation, they didn't use this knowledge for anything important, because evolution by new mutations was not part of their theory1 of evolution. Instead, Morgan's view of evolution as a series of mutation-fixation events was rejected by the Modern Synthesis as the "lucky mutant" view, and was ignored for nearly 50 years; Kimura popularized a neutral version of this view, which remained associated with neutral evolution for another 30 years; and in the past 10 years, Morgan's perspective is emerging as a more general view that may serve as the basis for models of adaptation (e.g., Orr, 2002).

The Mendelian interpretation of continuous variation


The advocates of Darwin's view of blending inheritance and fluctuation fought hard against Mendelism early in the 20th century, leading to the infamous biometrician-Mendelian debate. Thus, a century ago, it was necessary to defend Mendelian principles from attack by those who- disparaging Mendelism's simplistic rules and suspecting its experimental foundations in "artificial" breeding- held out hopes for a fuzzier, more organic conception of heredity and variation that would fit better with Darwin's view. In Bateson's 1902 "defense" of Mendelism, he provides a Mendelian interpretation of continuous variation:
"In the case of a population presenting continuous variation in regard to say, stature, it is easy to see how purity of the gametes in respect of any intensities of that character might not in ordinary circumstances be capable of detection. There are doubtless more than two pure gametic forms of this character, but there may quite conceivably be six or eight. When it is remembered that each heterozygous combination of any two may have its own appropriate stature, and that such a character is distinctly dependent on external conditions, the mere fact that the observed curves of stature give 'chance distributions' is not surprising and may still be compatible with purity of gametes in respect of certain pure types." (p. 31)
By "chance distribution", Bateson is invoking what we now call a "normal distribution". Such a distribution "may still be compatible with the purity of the gametes", i.e., compatible with Mendelian inheritance, because it can result from the combined effects of a multiplicity of Mendelian loci (6 or 8, he imagines), each with 2 homozygotes and 1 heterozygote, overlaid with environmental variation due to "external conditions".
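
Bateson's multiple-factor reading is easy to verify with a toy simulation (all parameters are invented for illustration): a handful of Mendelian loci yield only a small number of discrete genotypic classes, yet adding environmental noise produces a smooth, bell-shaped phenotype distribution.

```python
import random
random.seed(1)

N_LOCI = 6      # Bateson imagined "six or eight" Mendelian factors
N_IND = 10_000  # individuals in the simulated population

def genotype_value(p=0.5):
    """Count of '+' alleles across 6 diploid loci (each locus contributes
    0, 1, or 2), giving at most 13 discrete genotypic values."""
    return sum(random.random() < p for _ in range(2 * N_LOCI))

genotypes = [genotype_value() for _ in range(N_IND)]

# Environmental fluctuation ("external conditions") smears the discrete
# classes into a continuous, bell-shaped phenotype distribution.
phenotypes = [g + random.gauss(0, 1.0) for g in genotypes]

distinct_genotypic_classes = len(set(genotypes))
print("discrete genotypic classes:", distinct_genotypic_classes)
print("distinct phenotypes:", len(set(phenotypes)))  # essentially all unique
```

Even with only a dozen or so genotypic classes, the phenotypes form the "chance distribution" Bateson describes.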

Thus, Bateson interpreted quantitative characters precisely as we do today, as the result of overlaying environmental fluctuation on a discrete distribution of genetic types. This interpretation is not due to little Ronny Fisher, the 12-year-old boy who would grow up to be a founder of mathematical population genetics and would declare that genetics was the key to Darwin's theory7, but to Bateson and other geneticists, including the Danish botanist Wilhelm Johannsen and the Swedish geneticist Herman Nilsson-Ehle.

The Mendelian interpretation was bolstered by a series of precise quantitative experiments conducted by Johannsen with the Princess bean. Johannsen isolated 19 stable self-fertilizing lines, each of which produced seeds with a different average weight. Planting any single variety would produce a smooth distribution of seed weights. Johannsen selected larger beans to plant a new generation, but this had no significant effect on the distribution of seed weights, proving that this newly arising variation was not heritable Darwinian fluctuation, but non-heritable somatic variation. Johannsen coined the terms "genotype" and "phenotype" to help explain this distinction.
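
Johannsen's pure-line result can be sketched with a toy simulation (all numbers are hypothetical): selecting the heaviest seeds within a single pure line does not shift the next generation, because the within-line variation is entirely environmental and therefore not heritable.

```python
import random
random.seed(7)

LINE_MEAN = 60.0  # hypothetical mean seed weight of one pure line
ENV_SD = 5.0      # all within-line spread is environmental, not heritable

def grow_generation(n=5000):
    """Every seed of a pure line has the same genotype; weights differ
    only through non-heritable environmental fluctuation."""
    return [random.gauss(LINE_MEAN, ENV_SD) for _ in range(n)]

parents = grow_generation()
# Select the heaviest 10% of seeds as parents ...
selected = sorted(parents, reverse=True)[: len(parents) // 10]
# ... but the offspring distribution regresses straight back to the line
# mean, because the selected deviations were purely environmental.
offspring = grow_generation()

mean = lambda xs: sum(xs) / len(xs)
print(round(mean(selected), 1), round(mean(offspring), 1))
```

The selected parents are far above the line mean, yet the offspring mean is unchanged: Johannsen's genotype/phenotype distinction in miniature.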

By 1909, both Johannsen and Nilsson-Ehle had contrived to generate populations that, at the level of "genotype", were known mixtures of discrete Mendelian types, but which- at the level of "phenotype"- produced a nice smooth bell-shaped distribution. Johannsen's distributions of beans are reproduced in the figure below (right) from Morgan (1916; online source). 

The evolution of quantitative characters


Finally, the Mendelians developed a causal theory for the gradual change in a quantitative character due to selection that negotiated the phenotype-genotype distinction and was appropriately probabilistic.

In the Darwinian view based on fluctuation and blending of hereditary substances, the superficial appearance that the whole population has shifted continuously and homogeneously reflects the underlying reality that hereditary substances have shifted continuously and homogeneously.

The new Mendelian view differed in two respects. First, given the genotype-phenotype distinction, selection of a particular phenotypic range implicates hereditary factors indirectly and probabilistically. For instance, Punnett (1911) constructs a simplified example in which there are just 3 genetically defined types, A, B and C, with mean weights of 10, 12 and 14 grains (a "grain" is a unit of weight equal to 0.065 gram). "A seed that weighs 12 grains may belong to any of these three strains. It may be an average seed of B, or a rather large seed of A, or a rather small seed of C" (p. 162; online source):
"On this view we can understand why selection of the largest seed[s] raises the average weight in the next generation. We are picking out more of C and less of A and B, and as this process is repeated the proportion of C gradually increases and we get the appearance of selection acting on a continuously varying homogenous material and producing a permanent effect."
Second, as the Mendelians stressed repeatedly, the end result of this process is not a new complement of hereditary factors, but a mixture of old components in new and different proportions. The hereditary factors are not changed by this process (as Darwin and his followers wrongly believed): only their proportions in the population are changed. Without new mutations, the new population would never transcend the genetic limits inherent in the original mix.
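
Punnett's argument can be sketched numerically. The toy simulation below uses the strain means from his example (the within-strain spread and the selection intensity are invented): selecting the heaviest quarter of a mixed population raises the proportion of strain C, while the strains themselves remain unchanged.

```python
import random
random.seed(42)

MEANS = {"A": 10.0, "B": 12.0, "C": 14.0}  # grains (1 grain = 0.065 g)
SD = 1.5  # within-strain spread from "external conditions" (assumed)

def make_population(n_per_strain=5000):
    """Equal numbers of each genetically defined strain, with
    environmental variation overlaid on each strain's mean."""
    return [(s, random.gauss(m, SD)) for s, m in MEANS.items()
            for _ in range(n_per_strain)]

pop = make_population()
pop.sort(key=lambda sw: sw[1], reverse=True)
selected = pop[: len(pop) // 4]  # keep the heaviest quarter of seeds

frac_C_before = sum(s == "C" for s, _ in pop) / len(pop)
frac_C_after = sum(s == "C" for s, _ in selected) / len(selected)
print(round(frac_C_before, 2), round(frac_C_after, 2))
```

Selection on phenotype implicates the hereditary factors only probabilistically, yet the proportion of C rises sharply; nothing about A, B or C themselves has changed, only their frequencies.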

Homework


The popular view of history reflected in the Mutationism Myth is that our contemporary understanding of evolution began with Fisher, Haldane and Wright, not with Bateson, Morgan, Johannsen, Punnett and others. I see this as a whitewashed version of history, in which the contributions of the Mendelians have been erased.

But let's consider for a moment that, just as Darwin's followers did not give up on blending inheritance without a nasty fight that created lasting suspicions about geneticists, they are not likely to give up Synthesis Historiography8 without a nasty fight that will leave a stain on critics such as myself. So, how does one convincingly establish a point about influence or credit? How do we know whose views were influential and whose views were purged? Here are some examples of types of information that might be useful:
  • A popular evolution education web site has a timeline listing important contributors to evolutionary thinking. The timeline has a gap of a whole generation between the late-19th-century neo-Darwinians (e.g., Weismann) and the early "Synthesis" architects. Kimura is not listed.
  • Morgan published several books on evolution that went through multiple printings; the Boston Public Library includes his 1916 book in its list of 100 most influential books of the 20th century.
  • The Oxford Encyclopedia of Evolution, which includes biographic entries, does not have an entry for any Mendelian except Morgan, whose evolutionary views are not discussed.
In what other ways might we establish objectively that certain scientists, and not others, receive credit for their work and get included in histories? How could you generate a large amount of data quickly? How could one show an unwarranted or extra-scientific bias for or against certain authors, i.e., excluding the alternative possibility that "the judgment of history" favoring one person over another reflects true scientific merit?

Conclusion


The Mutationism Myth suggests that our contemporary understanding of evolution did not emerge until Fisher, Haldane and Wright combined Darwin's principle of selection with Mendelian genetics; and that a generation was wasted while Mendelians developed a "doomed rival" to Darwin's great theory, in the form of a "mutationist" view that denied selection.

In fact, the Mendelians did not develop such a view. Instead, their interpretations paved the way for the Modern Synthesis and laid the foundations for our contemporary genetics-based understanding of evolution: they developed the Hardy-Weinberg model, interpreted quantitative trait evolution correctly, and even thought ahead to the "new mutations" perspective currently making inroads into evolutionary genetics.

Among the Mendelians, I also would count Nikolai Vavilov, the extraordinary Russian geneticist who started the first global seed bank (which persists today at the Vavilov Institute), leading expeditions that collected some 200,000 seeds. In 1922 he made a fascinating contribution to "mutationist" thinking, proposing parallel variations as a key component of parallel evolution. Vavilov was sent by the Soviets to a prison camp, where he died in 1943.

The Soviets purged Vavilov because of his opposition to Lysenkoism, the non-Mendelian theory of genetics with a Lamarckian theme of improvement-through-effort that fit nicely with Soviet ideology. Why were the contributions of Mendelians purged from our history, leaving the false impression of a generation-long gap in our intellectual history? Why don't we count Bateson, Morgan, Punnett, Johannsen, and others among the "founders" of modern evolutionary thinking? Possible answers to this question will emerge in part 4 of The Mutationism Myth, where we explore the non-Darwinian aspects of Mendelian thinking, and in part 5, where we consider the "Modern Synthesis" as a restoration of Darwinian orthodoxy.


References

Bateson, W., and E. R. Saunders. 1902. Experimental Studies in the Physiology of Heredity. Reports to the Evolution Committee. Royal Society. (online source)

Bateson, W. 1902. Mendel's Principles of Heredity: A Defense. Cambridge University Press, Cambridge. (online source)

Bateson, W. 1909. Heredity and Variation in Modern Lights. Pp. 85-101 in A. C. Seward, ed. Darwin and Modern Science: Essays in Commemoration of the Centenary of the Birth of Charles Darwin and of the Fiftieth Anniversary of the Publication of the Origin of Species. Cambridge University Press, Cambridge.

Gillespie, J. H. 1998. Population Genetics: A Concise Guide. Johns Hopkins University Press, Baltimore, MD.

Morgan, T. H. 1932. The Scientific Basis of Evolution. W.W. Norton & Co., New York.

Orr, H. A. 2002. The population genetics of adaptation: the adaptation of DNA sequences. Evolution 56:1317-1330.

Punnett, R. C. 1911. Mendelism. MacMillan. http://www.archive.org/stream/mendelism00punn#page/172

Stephens, C. 2004. Selection, Drift, and the "Forces" of Evolution. Philosophy of Science 71:550-570.

Sturtevant, A. H. 1965. The Early Mendelians. Proceedings of the American Philosophical Society 109:199-208.

Vavilov, N. I. 1922. The Law of Homologous Series in Variation. J. Heredity 12:47-89.


Notes

1 As quoted in part 1, mutationism is a source of "mirth" for Dawkins.

2 The words "doomed rivals" are also from Dawkins. Back when I was a lad in school, my evolution professor- Dr. Kenneth Christiansen, who has been at Grinnell College for at least 45 years and is still there today- had a slightly gentler name for alternative theories: the "also-rans".

3 Note that Morgan's choice of words leaves some wiggle room that some other kind of variability could be offered "in support of" Darwin's theory. A century ago, admiration for Darwin was nearly universal, as it is today. The meaning of Darwin's theory, and ownership of the "Darwin" brand, were contested by scientists. By 1950, the "Modern Synthesis" school had captured the Darwin brand and began to use it more aggressively than they had dared to do before. However, things might have turned out differently. De Vries labeled himself as a "Darwinian". Bateson and others sometimes cozied up to "Darwinism".

4 Sturtevant (1965) lists 22 Mendelians who published from 1900 to 1905, and notes that all but 5 were under 40 (the older exceptions are de Vries, Garrod, Johannsen, Wilson and Lang).

5 "stability analysis" means finding the "attractors" or points of stability in a dynamic system.

6 Hardy reveled in the purity of mathematics and stated that he had no desire to do anything useful. He said that his most important discovery was Srinivasa Ramanujan, a largely self-taught Indian genius who had written to Hardy and others seeking a mentor- only Hardy recognized his genius.

7 Synthesis Historiography attributes the resolution of Darwinism and Mendelism to Fisher (1918). In reality, the problem that Fisher (1918) solved was how to derive Galton's law as a formal consequence of Mendelian principles. Either this is a red herring, or it suggests that as late as 1918, "Darwinism" still implied a rejection of Mendelism in favor of blending. Note that Galton himself lacked the ideological purism of his followers: he believed in discontinuous evolutionary changes and felt that this was a missing element in Darwin's theory.

8 "Synthesis Historiography" is Ron Amundson's term for the industry of writing versions of history in which the Modern Synthesis is presented as the manifest destiny of science, and Mayr, et al are the heroes, while their intellectual opponents are fools and knaves.

*The Curious Disconnect is the blog of evolutionary biologist Arlin Stoltzfus, available at www.molevol.org/cdblog. An updated version of the post below will be maintained at www.molevol.org/cdblog/mutationism_myth3 (Arlin Stoltzfus, ©2010)

Science Education and Teaching Controversy

I'm beginning to realize that there are (at least) two fundamentally different approaches to teaching science. One strategy, which I'll call the "fact-based" approach, concentrates on communicating facts about the natural world. The other approach, which I'll call the "methodological" approach, concentrates on teaching students how to acquire knowledge.

In the fact-based approach to science education, the emphasis is on making sure that students have a sound knowledge of the basic principles of physics, chemistry, geology, and biology. Let's take the teaching of evolution as an example. If you follow this strategy then you will want your students to know about the main mechanisms of evolution and the known facts about the history of life. You will only teach things that are supported by scientific evidence. In order to pass the course, students must demonstrate that they have acquired, and understand, the facts.

The goal here is to send students out into the real world armed with an understanding of what science has learned. Hopefully they'll be able to use that knowledge of evolution to choose the "right" side in any controversy.

The methodological approach concentrates on teaching students how to acquire knowledge using the scientific method. This "method" is not the kindergarten version so often seen in schools but the more fundamental version that emphasizes evidence, skepticism, and rational thinking. The idea here is not only to teach facts—although that's important—but to teach why those facts should be accepted as true. Another major goal of this method is teaching critical thinking and the desired outcome is a group of graduates who will be able to apply the methodology to any problem they encounter in the future. This includes problems that don't fall into the traditional science fields of physics, chemistry, geology, and biology.

The fact-based approach tends to avoid any distractions that might confuse students about what is known and what isn't. Thus, Intelligent Design Creationism cannot be discussed in this type of curriculum because there's nothing factual about creationism. It's not part of science.

That restriction doesn't apply if you are trying to teach critical thinking because the most important part of your objective is teaching students how to argue and how to reason. In that approach, you actually want to encourage controversy and debate in the classroom because that's how you learn to distinguish between wheat and chaff, or science and pseudoscience.

I was prompted to think about these two different approaches by a recent issue of Science containing a number of articles about science education.1 One of them is "Arguing to Learn in Science: The Role of Collaborative, Critical Discourse" by Jonathan Osborne [April 23, 2010: doi: 10.1126/science.1183944]. Here's the abstract ...
Argument and debate are common in science, yet they are virtually absent from science education. Recent research shows, however, that opportunities for students to engage in collaborative discourse and argumentation offer a means of enhancing student conceptual understanding and students’ skills and capabilities with scientific reasoning. As one of the hallmarks of the scientist is critical, rational skepticism, the lack of opportunities to develop the ability to reason and argue scientifically would appear to be a significant weakness in contemporary educational practice. In short, knowing what is wrong matters as much as knowing what is right. This paper presents a summary of the main features of this body of research and discusses its implications for the teaching and learning of science.
Clearly, this approach is consistent with bringing creationist ideas into the classroom in order to teach students why they are wrong. You will also want to bring up astrology and the ancient theory of demon possession if that helps make the point. You can't discuss every single controversy, but, at the very least, you should include the "active" ones—the ones students will encounter as soon as they step outside the classroom and watch FOX News or listen to their preacher on Sunday morning.

"Teaching the controversy" is good science if you adopt the methodological approach to science education but it's anathema if you adopt the fact-based approach.

Here's A.C. Grayling, a philosopher at Birkbeck College, University of London, and also a Fellow of St Anne's College, Oxford, giving his opinion on science education. Can you guess which approach he favors? Why isn't he aware of the "controversy" in science education? I wonder if he avoids all controversial topics in his philosophy classes?



1. Thanks to Bruce Alberts who, as editor-in-chief, is trying to promote more emphasis on science education.

P.S. I don't want to discuss whether the methodological approach is possible in American schools. If you think that science teachers are too stupid to adopt this approach, or if you think that many of them are secret creationists, then that's an entirely different problem. It's a defeatist attitude to conclude that the quality of science teachers is so bad that science education can't be fixed. If you have bad science teachers then the first step is to replace them with good ones. The sooner the better.

Junk RNA or Imaginary RNA?

RNA is very popular these days. It seems as though new varieties of RNA are being discovered just about every month. There have been breathless reports claiming that almost all of our genome is transcribed and that most of this RNA has to be functional even though we don't yet know what the function is. The fervor with which some people advocate a paradigm shift in thinking about RNA approaches that of a cult follower [see Greg Laden Gets Suckered by John Mattick].

We've known for decades that there are many types of RNA besides messenger RNA (mRNA, which encodes proteins). Besides the standard ribosomal RNAs and transfer RNAs (tRNAs), there are a variety of small RNAs required for splicing and many other functions. There's no doubt that some of the new discoveries are important as well. This is especially true of small regulatory RNAs.

However, the idea that a huge proportion of our genome could be devoted to synthesizing functional RNAs does not fit with the data showing that most of our genome is junk [see Shoddy But Not "Junk"?]. That hasn't stopped RNA cultists from promoting experiments leading to the conclusion that almost all of our genome is transcribed.

That may change. A paper just published in PLoS Biology shows that the earlier work was prone to artifacts. Some of those RNAs may not even be there, and others are present in tiny amounts.

Late to the Party

Several people have already written about this paper, including Carl Zimmer and PZ Myers. There are also summaries in Nature News and PLoS Biology.

The work was done by Harm van Bakel in Tim Hughes' lab, right here in Toronto. It's only a few floors, and a bridge, from where I'm sitting right now. The title of their paper tries to put a positive spin on the results: "Most 'Dark Matter' Transcripts Are Associated With Known Genes" [van Bakel et al. (2010)]. Nobody's buying that spin. They all recognize that the important result is not that non-coding RNAs are mostly associated with genes but the fact that they are not found in the rest of the genome. In other words, most of our genome is not transcribed in spite of what was said in earlier papers.

Van Bakel compared two different types of analysis. The first, called "tiling arrays," is a technique where bulk RNA (cDNA, actually) is hybridized to a series of probes on a microchip. The probes are short pieces of DNA corresponding to genomic sequences spaced every few thousand base pairs along each chromosome. When some RNA fragment hybridizes to one of these probes you score that as a "hit." The earlier experiments used this technique and the results indicated that almost every probe could hybridize an RNA fragment. Thus, as you scanned the chip you saw that almost every spot recorded a "hit." The conclusion is that almost all of the genome is transcribed even though only 2% corresponds to known genes.

The second type of analysis is called RNA-Seq and it relies on direct sequencing of RNA fragments. Basically, you copy the RNA into DNA, selecting for small 200 bp fragments. Using new sequencing technology, you then determine the sequence of one (single end) or both ends (paired end) of this cDNA. You may only get 30 bp of good sequence information but that's sufficient to place the transcript on the known genome sequence. By collecting millions of sequence reads, you can determine what parts of the genome are transcribed and you can also determine the frequency of transcription. The technique is much more quantitative than tiling experiments.

Van Bakel et al. show that using RNA-Seq they detect very little transcription from the regions between genes. On the other hand, using tiling arrays they detect much more transcription from these regions. They conclude that the tiling arrays are producing spurious results—possibly due to cross-hybridization or possibly due to detection of very low abundance transcripts. In other words, the conclusion that most of our genome is transcribed may be an artifact of the method.
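
A toy simulation (all numbers invented for illustration; this is not the authors' analysis) shows why the two methods can disagree so badly: an analog, thresholded hybridization signal with cross-hybridization background scores most regions as "hits," while digital read counts detect little outside the small, highly expressed genic fraction.

```python
import math
import random
random.seed(0)

N_REGIONS = 10_000
GENE_FRAC = 0.02    # roughly 2% of regions correspond to known genes
GENE_EXPR = 100.0   # relative abundance of genic transcripts (assumed)
NOISE_EXPR = 0.05   # rare spurious transcripts from intergenic regions

expr = [GENE_EXPR if random.random() < GENE_FRAC else NOISE_EXPR
        for _ in range(N_REGIONS)]

def tiling_hit(e):
    """Analog signal: true abundance plus cross-hybridization background;
    anything over an arbitrary threshold is scored as a 'hit'."""
    return e + random.gauss(2.0, 1.0) > 1.0

def rnaseq_hit(e):
    """Digital detection: a region counts only if at least one read maps
    to it; with Poisson sampling, P(>=1 read) = 1 - exp(-abundance)."""
    return random.random() < 1.0 - math.exp(-e)

tiling_frac = sum(map(tiling_hit, expr)) / N_REGIONS
seq_frac = sum(map(rnaseq_hit, expr)) / N_REGIONS
print(round(tiling_frac, 2), round(seq_frac, 2))
```

With these assumed parameters the tiling assay calls the large majority of regions transcribed while sequencing detects only a small fraction, mirroring the paper's high false-positive rate for transcripts expressed at low levels.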

The parts of the genome that are presumed to be transcribed but for which there is no known function are called "dark matter." Here's the important finding in the authors' own words.
To investigate the extent and nature of transcriptional dark matter, we have analyzed a diverse set of human and mouse tissues and cell lines using tiling microarrays and RNA-Seq. A meta-analysis of single- and paired-end read RNA-Seq data reveals that the proportion of transcripts originating from intergenic and intronic regions is much lower than identified by whole-genome tiling arrays, which appear to suffer from high false-positive rates for transcripts expressed at low levels.
Many of us dismissed the earlier results as transcriptional noise or "junk RNA." We thought that much of the genome could be transcribed at a very low level but this was mostly due to accidental transcription from spurious promoters. This low level of "accidental" transcription is perfectly consistent with what we know about RNA polymerase and DNA binding proteins [What is a gene, post-ENCODE?, How RNA Polymerase Binds to DNA]. Although we might have suspected that some of the "transcription" was a true artifact, it was difficult to see how the papers could have failed to consider such a possibility. They had been through peer review and the reviewers seemed to be satisfied with the data and the interpretation.

That's gonna change. I suspect that from now on everybody is going to ignore the tiling array experiments and pretend they don't exist. Not only that, but in light of recent results, I suspect more and more scientists will announce that they never believed the earlier results in the first place. Too bad they never said that in print.


van Bakel, H., Nislow, C., Blencowe, B. and Hughes, T. (2010) Most "Dark Matter" Transcripts Are Associated With Known Genes. PLoS Biology 8: e1000371 [doi:10.1371/journal.pbio.1000371]

Wednesday, May 19, 2010

Is God Dead?

 
I stumbled upon this while looking for something else. It's the cover from April 8, 1966. I remember it well. It didn't seem like such a big deal at the time. We all assumed the answer was "yes." Not a big deal in the '60s.

If I recall correctly, the inside article was about some dude named Friedrich Nietzsche. Weird name. Nobody cared. The cover said it all.



The Essence of Christianity

Right now there's a conference going on in Oxford, United Kingdom—that hotbed of Christian apologetics (and Richard Dawkins). John Wilkins is there. One of the topics is defining religion [Ruminations in Oxford].

John's "ruminations" remind me of the ongoing debate over the conflict between science and religion. Everyone knows that the conflict exists but everyone has their own idea about how far it penetrates into religion. As you all know, various accommodationists are trying hard to wall off a protected area of religion that science cannot enter. That allows science and religion to co-exist peacefully.

In order to do this, the accommodationists have to define the essence of a religion. They agree that belief in a six thousand year old Earth conflicts with science but, according to them, that's not an essential belief in Christianity. The people who believe that sort of nonsense don't represent the serious "sophisticated" Christians (like the ones in theology at Oxford). So, what are the essential beliefs that don't conflict with the scientific way of acquiring knowledge?

Here's how Michael Ruse describes them in his latest book, Science and Spirituality: Making Room for Faith in the Age of Science (p. 182). I wonder how many of the people at the conference will agree with Ruse about the four items that are essential for Christians? I wonder how many of them agree with Ruse that none of these four conflict with the scientific way of thinking?
With an eye to the discussion of the previous chapters, I want to pick out four items or claims that are central to Christian belief—four items that the Christian takes on faith. If you do not believe in these, then you should not call yourself a Christian. First, that there is a God who is creator, "maker of heaven and earth." Second, we humans have duties, moral tasks here on earth, in the execution of which we are going to be judged. Hence, God stands behind morality. Third, Jesus Christ came to earth and suffered because we humans are special, we are worth the effort by God. The usual way of expressing this is to say that we are "made in the image of God." We have "souls." Fourth and finally, there is the promise of "life everlasting." We can go to heaven, what ever that means.

Let me spell out carefully what I see as the task in this and the next chapter. It is not to defend Christianity as a true or compelling belief system. I take it that you can enter these chapters as an agnostic or an atheist and depart in the same frame of mind. I do not want to dissuade people from Christianity, nor do I want to convince them of it. I want to explain in a fair manner what is meant by Christianity in terms of the four points introduced in the last paragraph. I also want to show that you could hold these, if you so wish, in the light of modern science—if you prefer, in the face of modern science. In other words, the Christian's claims are not refuted by modern science—or indeed threatened or made less probable by modern science.
Here's my quick take on the four items.

1. God the creator: It's possible to imagine a Deist God who starts off the known universe then goes off somewhere to watch perpetual reruns of The Lawrence Welk Show. (Where does he go?) This sort of God does not conflict directly with science, even if you define science as a way of knowing that requires evidence, skepticism, and rationality. It's an unnecessary God but a relatively harmless one compared to some others. Nobody I know believes in such a God, including Keith Ward, Ken Miller and Francis Collins.

2. God stands behind morality and He will judge us: There's no scientific evidence to support the notion that morality has anything to do with supernatural beings and plenty of evidence against it. There's no scientific evidence that you will be judged by anyone except other humans. This belief conflicts with science.

3. Jesus Christ is/was God: The idea that a supernatural being appeared on Earth in the form of a real human and lived among a group of primitive farmers in some obscure part of the world is not consistent with anything we know by applying scientific reasoning. It conflicts with science big time. So does the idea that we have something called a "soul" that no other animal possesses.

4. When you die you go to heaven: Totally inconsistent with a scientific way of thinking. In spite of several thousand years of trying, no evidence of heaven has ever been produced. Or hell, for that matter. There is nothing about this silly belief that's even remotely consistent with science.


Monday, May 17, 2010

Visitors

 



Clarity vs Obscurity

 
Richard Dawkins says, "There are people who are so in love with obscurity—a nice warm fuzzy feeling of obscurity and obscurantism—that, if you say something clearly, they feel threatened." See the video below.

For some reason this reminds me of a book I just read by Keith Ward called "The Big Questions in Science and Religion." Perhaps it's because of the blurb on the back cover that says,
Ward effortlessly flows from one fascinating insight to another about the often contentious relationship between diverse religious views and the new scientific knowledge. Writing with both passion and clarity, he masterfully conveys the depth, the difficulty, and the importance of the greatest intellectual and existential questions of the modern age.
"Clarity"? Don't make me laugh. Keith Ward has never met an example of obscurantism that he doesn't embrace.

Ward is a colleague of Dawkins at Oxford. I wonder if Dawkins was thinking of him when he made his statement? Or, he may have been thinking of another colleague, Alister McGrath.



[Hat Tip: Clarity - A very nice statement by Dawkins, at RichardDawkins.net.]