[Photo: Motoo Kimura (1968)]

We know a great deal about the error rate of DNA replication. The replisome makes a mistake about once in every 100 million bases incorporated. This is an error rate of 10⁻⁸. The repair machinery fixes 99% of these errors, for an overall mutation rate of 10⁻¹⁰ per base pair per replication. Given the size of the human genome and the number of replications between zygote and germ cells, this translates to approximately 130 mutations per individual per generation [Mutation Rates].
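The arithmetic behind that ~130 figure can be sketched in a few lines. The genome size and the number of germ-line divisions below are assumed round numbers for illustration, not figures from the linked post:

```python
# Back-of-the-envelope check of the ~130 mutations per generation figure.
# Assumed numbers: diploid genome ~6.4e9 bp and roughly 200 germ-line cell
# divisions between zygote and gamete (an average dominated by the many
# divisions in the male germ line).

mutation_rate = 1e-10        # per bp per replication, after repair
genome_size = 6.4e9          # bp, diploid
germline_divisions = 200     # assumed average

per_replication = mutation_rate * genome_size        # ≈ 0.64 mutations
per_generation = per_replication * germline_divisions
print(round(per_generation))                         # ≈ 128, close to ~130
```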
Recently there have been two attempts to verify this calculation. In one, the Y chromosomes of two men in a paternal lineage, separated by 13 generations from a common male ancestor, were sequenced. The differences correspond to a mutation rate of 0.75 × 10⁻¹⁰ per generation, almost the same as theory predicts [Human Y Chromosome Mutation Rates]. This relies on the fact that if most mutations are nearly neutral (they are), then the rate of fixation by random genetic drift should equal the mutation rate [Random Genetic Drift and Population Size].
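That identity is one of the nicest results of the neutral theory, and it's easy to verify: the population size cancels out. Here's a minimal sketch, with illustrative population sizes of my own choosing:

```python
# Neutral-theory identity: substitution rate = mutation rate.
# Each generation, 2N*mu new mutant copies arise in a diploid population
# of size N; each neutral copy fixes with probability 1/(2N).

def neutral_substitution_rate(N, mu):
    new_mutations = 2 * N * mu   # new mutant copies per generation
    p_fix = 1.0 / (2 * N)        # fixation probability of one neutral copy
    return new_mutations * p_fix # N cancels: the rate is just mu

# The result is independent of population size:
for N in (1_000, 10_000, 1_000_000):
    assert abs(neutral_substitution_rate(N, 1e-10) - 1e-10) < 1e-25
```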
The other study, by Roach et al. (2010), compared the genome sequences of two offspring and their parents. Adding up all the differences in the offspring gave an estimate of about 70 new mutations per offspring instead of the expected 130 [Direct Measurement of Human Mutation Rate]. This is half the expected value, but the study is fraught with potential artifacts and it's best not to make a big deal of the discrepancy.
John Hawks was worried about this last March [A low human mutation rate may throw everything out of whack] and he's still worried about it today [What is the human mutation rate?].
What John is really interested in isn't the mutation rate per se, since we have a pretty good handle on that number. What interests him is Calibrating the Molecular Clock, and that's not the same thing. What it boils down to is the number of years per generation, or the number of fixed mutations per million years.
John thinks that if the actual mutation rate is only half the value we thought it was, then the dating of many evolutionary events will need to be recalculated. For example, the human-chimp divergence would have to be reset to eight or nine million years ago. But that's not strictly correct. We don't calibrate the molecular clock by taking the known mutation rate, multiplying by the number of generations, and then throwing in a known value for the number of years per generation.¹
None of those values are known for even the most recent events in the primate lineages. What we usually do is work from a fixed point in the fossil record, count the number of differences between species, and estimate a mutation rate per million years. That value is then used to calibrate other divergences.
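The fossil-calibration procedure is simple enough to show with numbers. Everything below is a made-up illustration of the method, not data from any real lineage:

```python
# Sketch of fossil calibration (hypothetical numbers, for illustration only).
# Suppose a fossil-dated split at 30 million years corresponds to 1.2%
# sequence divergence. That anchors a rate of change per million years,
# which is then used to date an uncalibrated divergence.

fossil_split_mya = 30.0       # assumed fossil calibration point
divergence_at_split = 0.012   # assumed fraction of differing sites

rate_per_myr = divergence_at_split / fossil_split_mya   # 0.0004 per Myr

# Apply the calibrated rate to another pair of species:
observed_divergence = 0.004   # assumed
estimated_split_mya = observed_divergence / rate_per_myr
print(estimated_split_mya)    # ≈ 10 million years
```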
Sometimes these rates of change can be related to the mutation rate by estimating generation times, and they often seem reasonable when we come up with generation times of, say, 25 years. Even if the true mutation rate were half the current consensus value, the most reasonable adjustment would not be to recalibrate the time of divergence but to reconsider our assumption about generation time. Maybe there were twice as many generations per million years.
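The point is that the fossil-calibrated quantity, change per million years, is the product of the per-generation rate and the number of generations per million years, so halving one while doubling the other leaves the calibrated dates untouched. A sketch with assumed illustrative values:

```python
# Change per million years = per-generation rate x generations per Myr.
# Halving the mutation rate while halving the generation time (doubling
# generations per Myr) leaves the calibrated rate unchanged.
# All numbers are illustrative assumptions.

def substitutions_per_myr(mu_per_generation, years_per_generation):
    generations_per_myr = 1e6 / years_per_generation
    return mu_per_generation * generations_per_myr

consensus = substitutions_per_myr(1.3e-8, 25)     # assumed consensus values
adjusted  = substitutions_per_myr(0.65e-8, 12.5)  # half the rate, twice the generations
assert abs(consensus - adjusted) < 1e-12
```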
But this is actually a non-problem right now since the Roach et al. (2010) estimate is not very reliable. I don't think John Hawks should be worried.
1. Plus estimates of the effective population size, Ne.
Roach, J.C., Glusman, G., Smit, A.F.A., Huff, C.D., Hubley, R., Shannon, P.T., Rowen, L., Pant, K.P., Goodman, N., Bamshad, M., Shendure, J., Drmanac, R., Jorde, L.B., Hood, L., and Galas, D.J. (2010) Analysis of Genetic Inheritance in a Family Quartet by Whole-Genome Sequencing. Science (Published Online March 10, 2010) [doi: 10.1126/science.1186802]