Let's review what we know. The first thing we have to do is define "mutation" [What Is a Mutation?]. A mutation is any alteration of the nucleotide sequence of a genome. This definition includes substitutions, insertions, and deletions.
The mutation rate can be described and defined in many ways. For most purposes, we can assume that it's equivalent to the error rate of DNA replication, since that accounts for the vast majority of substitutions, and substitutions are far more numerous than insertions and deletions. (But see Arlin Stoltzfus on The range of rates for different genetic types of mutations.)
One kind of rate is the error rate of DNA replication. This is close to 1.0 × 10⁻⁸ per bp for replication complexes that are capable of proofreading. Most of the remaining errors are fixed by repair enzymes, a process that is about 99% efficient. Thus, the overall error rate is close to 1.0 × 10⁻¹⁰ per bp.
Given that the human genome is 3.2 × 10⁹ bp, this means that there are, on average, 0.32 new substitutions every time the complete genome is replicated. In humans there are about 30 cell divisions between zygote and mature egg cells and about 400 cell divisions between zygote and mature sperm. Thus, in males, the sperm genome carries about 128 new mutations and the haploid egg genome has about 10 new mutations, for a total of about 138 new mutations in every new zygote. Let's round this down to 130 mutations per generation.¹ I call this the "biochemical method" of calculating mutation rate [Estimating the Human Mutation Rate: Biochemical Method].
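Here's that arithmetic as a minimal Python sketch, using only the numbers quoted above, for anyone who wants to check it:

```python
# Biochemical estimate of the human mutation rate,
# using the values quoted in the text.

replication_error_rate = 1.0e-8   # errors per bp, with proofreading
repair_efficiency = 0.99          # fraction of errors fixed by repair enzymes
genome_size = 3.2e9               # haploid human genome (bp)

# Net error rate after repair: ~1.0e-10 per bp
net_error_rate = replication_error_rate * (1 - repair_efficiency)

# New mutations per complete genome replication: ~0.32
mutations_per_replication = net_error_rate * genome_size

divisions_sperm = 400             # zygote to mature sperm
divisions_egg = 30                # zygote to mature egg

sperm_mutations = mutations_per_replication * divisions_sperm   # ~128
egg_mutations = mutations_per_replication * divisions_egg       # ~10

total = sperm_mutations + egg_mutations
print(f"~{total:.0f} new mutations per zygote")  # ~138, rounded down to 130
```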
This is a pretty reliable number. The error rates of DNA replication and repair are well-studied and they are unlikely to be off by more than 20%. The number of cell divisions per generation is less reliable but still pretty accurate. The number of divisions during spermatogenesis varies with the age of the father, so older men pass on more mutations, but the number used here (400) is a good average. The overall rate of 130 mutations per generation is likely to be close to the actual mutation rate.
The second method is something I call the "phylogenetic method" [Estimating the Human Mutation Rate: Phylogenetic Method]. It's based on the fact that the vast majority of mutations in primate genomes occur in junk DNA, which means they are neutral. If we compare the genomes of different primate species, we can calculate a mutation rate using population genetics, which tells us that the rate of fixation of neutral alleles is equal to the mutation rate.
This gives mutation rates of 112-160 mutations per generation. These values depend on knowing the time of divergence of the different lineages (e.g. humans and chimpanzees) and the generation times. Both of these values are subject to uncertainty, but they can't be off by very much, and certainly not by a factor of two or more.
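For readers who want to see the logic, here's a sketch of the phylogenetic calculation. The inputs (1.5% sequence divergence, a split 5.5 million years ago, a 30-year generation time) are illustrative assumptions of mine, not the exact values behind the 112-160 range:

```python
# Phylogenetic estimate: in junk DNA the fixation rate of neutral
# alleles equals the mutation rate, so divergence between species
# is a direct readout of mutation.

divergence = 0.015        # assumed fraction of sites that differ (human vs. chimp)
split_time = 5.5e6        # assumed years since the human-chimpanzee split
generation_time = 30      # assumed years per generation
genome_size = 3.2e9       # haploid genome (bp)

# Differences accumulate along BOTH lineages, hence the factor of 2.
rate_per_bp_per_year = divergence / (2 * split_time)
rate_per_generation = rate_per_bp_per_year * generation_time * genome_size
print(f"~{rate_per_generation:.0f} mutations per generation")  # ~131
```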
The third method is the "direct method" [Estimating the Human Mutation Rate: Direct Method]. This is where you sequence the genomes of parents and their offspring or you sequence the genomes of two individuals who descend from a common ancestor. All you need to do is count the number of new mutations and you get the mutation rate directly.
These experiments yield a range of values, from a low of 56 mutations per generation to a high of 103 mutations per generation. Most of the values cluster around 75 mutations per generation, and that's what gives rise to the controversy: the direct method gives mutation rates that are about half the rates of the other methods.
The direct method is not very reliable, since the quality of the genome sequences is low and only a fraction of each genome is actually sequenced. Typically, only about 60-80% of the genome sequence is reliable. The number of potential sequencing errors overwhelms the number of real mutations, so a lot of "adjusting" is necessary in order to weed out false positives and false negatives. Nevertheless, it's satisfying that the results are in the right ballpark.
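A toy version of the direct calculation shows why the callable fraction matters so much. The mutation count and callable fraction here are hypothetical numbers chosen for illustration:

```python
# Direct estimate: count de novo mutations in a parent-offspring trio,
# then scale up for the part of the genome that couldn't be scored.

observed_mutations = 52    # hypothetical de novo calls passing quality filters
callable_fraction = 0.70   # hypothetical fraction of the genome reliably sequenced
diploid_genome = 6.4e9     # bp (a new mutation can arise on either copy)

# Per-bp, per-generation rate over the callable portion of the genome
rate_per_bp = observed_mutations / (callable_fraction * diploid_genome)

# Extrapolated whole-genome count per generation
whole_genome = rate_per_bp * diploid_genome   # same as observed / callable_fraction
print(f"rate = {rate_per_bp:.2e} per bp, ~{whole_genome:.0f} mutations per generation")
# rate = 1.16e-08 per bp, ~74 mutations per generation
```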
The conflict between these estimates was the subject of a recent meeting in Leipzig [The Human Mutation Rate Meeting]. Here's how the meeting report in Nature describes the problem:
Geneticists ... are having trouble deciding between one measure of how fast human DNA mutates and another that is half that rate.

This is a bit of an exaggeration. The main problem comes from interpreting the direct measurements of the mutation rate, and those numbers aren't really "so much lower" than previous estimates.
The rate is key to calibrating the ‘molecular clock’ that puts DNA-based dates on events in evolutionary history. So at an intimate meeting in Leipzig, Germany, on 25–27 February, a dozen speakers puzzled over why calculations of the rate at which sequence changes pop up in human DNA have been so much lower in recent years than previously. They also pondered why the rate seems to fluctuate over time. The meeting drew not only evolutionary geneticists, but also researchers with an interest in cancer and reproductive biology — fields in which mutations have a central role.
But even a twofold difference is cause for worry if the mutation rates are used to determine the history of human evolution.
Later estimates of the mutation rate counted the differences between stretches of DNA and protein amino-acid sequences in humans and those in chimpanzees or other apes, and then divided the number of differences by the time that has elapsed since the species’ most recent common ancestor appeared in the fossil record. These estimates were clouded by the patchiness of the fossil record, but researchers eventually settled on a consensus: each DNA letter, on average, mutates once every billion years. That is a “suspiciously round number”, molecular anthropologist Linda Vigilant of the Max Planck Institute for Evolutionary Anthropology in Leipzig told Nature in 2012 (see Nature 489, 343–344; 2012).

A rate of one mutation per bp per billion years is 1.0 × 10⁻⁹ mutations per bp per year. The human diploid genome consists of 6.4 × 10⁹ base pairs (6.4 billion), so that works out to 6.4 mutations per year. The average generation time for humans is 30 years, which means 30 × 6.4 = 192 mutations per generation. That's very much on the high end of estimates that I've seen.
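Spelling that arithmetic out (these are just the numbers from the paragraph above):

```python
# The "suspiciously round number": one mutation per bp per billion years.
rate_per_bp_per_year = 1.0e-9   # once per billion years
diploid_genome = 6.4e9          # bp
generation_time = 30            # years

per_year = rate_per_bp_per_year * diploid_genome   # 6.4 mutations per year
per_generation = per_year * generation_time        # 192 mutations per generation
print(per_year, per_generation)                    # 6.4 192.0
```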
In the past six years, more-direct measurements using ‘next-generation’ DNA sequencing have come up with quite different estimates. A number of studies have compared entire genomes of parents and their children — and calculated a mutation rate that consistently comes to about half that of the last-common-ancestor method.

This is correct. The direct measurements generally turn out to be lower than the estimates made by other methods.
A slower molecular clock worked well to harmonize genetic and archaeological estimates for dates of key events in human evolution, such as migrations out of Africa and around the rest of the world. But calculations using the slow clock gave nonsensical results when extended further back in time — positing, for example, that the most recent common ancestor of apes and monkeys could have encountered dinosaurs. Reluctant to abandon the older numbers completely, many researchers have started hedging their bets in papers, presenting multiple dates for evolutionary events depending on whether mutation is assumed to be fast, slow or somewhere in between.

The problems aren't as severe as this implies. If the average mutation rate is about 130 mutations per generation, then this is consistent with a human-chimpanzee split about 5-6 million years ago. If the actual rate is only 75 new mutations per generation, then this pushes the last common ancestor back to about 10 million years ago, assuming that the generation times are accurate (see the sketch below).
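Here's the sketch: the molecular clock run in reverse. As in the earlier example, the 1.5% divergence figure is an assumption of mine chosen for illustration:

```python
# Converting a per-generation mutation rate into a human-chimpanzee split date.

divergence = 0.015        # assumed fraction of sites that differ
genome_size = 3.2e9       # haploid genome (bp)
generation_time = 30      # assumed years per generation

def split_time(mutations_per_generation):
    """Years back to the common ancestor (two lineages diverging)."""
    rate_per_bp_per_year = mutations_per_generation / genome_size / generation_time
    return divergence / (2 * rate_per_bp_per_year)

print(f"{split_time(130) / 1e6:.1f} million years")  # ~5.5 with 130 mutations/generation
print(f"{split_time(75) / 1e6:.1f} million years")   # ~9.6 with 75 mutations/generation
```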
This older date is probably not right, so the actual mutation rate is likely to be more than 75 mutations per generation.
This seems like a problem, but there are all kinds of potential errors in these calculations. For one thing, we don't know how accurate the Denisovan sequence is (Meyer et al., 2012) or what the real number of differences is. There are also issues with population sizes and actual times of divergence, not to mention generation times.
There's no point in getting your knickers in a knot at this time.
Last year, population geneticist David Reich of Harvard Medical School in Boston, Massachusetts, and his colleagues compared the genome of a 45,000-year-old human from Siberia with genomes of modern humans and came up with the lower mutation rate. Yet just before the Leipzig meeting, which Reich co-organized with Kay Prüfer of the Max Planck Institute for Evolutionary Anthropology, his team published a preprint article that calculated an intermediate mutation rate by looking at differences between paired stretches of chromosomes in modern individuals (which, like two separate individuals’ DNA, must ultimately trace back to a common ancestor). Reich is at a loss to explain the discrepancy. "The fact that the clock is so uncertain is very problematic for us," he says. "It means that the dates we get out of genetics are really quite embarrassingly bad and uncertain."

Of course the direct measurements are going to be uncertain because you are sequencing individuals. These are not "finished" sequences that have been edited and corrected.
My view is that the direct ("genetic"?) mutation rates are probably systematically low and that the real rates will likely be more than 100 mutations per generation.
1. I'm rounding down to be consistent with other posts on this subject.
Meyer, M., et al. (2012) A high-coverage genome sequence from an archaic Denisovan individual. Science 338: 222-226.