Friday, June 15, 2007

Penicillin Resistance in Bacteria: After 1960

 
The widespread appearance of penicillin-resistant bacteria by 1960 prompted the introduction of new drugs that could not be degraded by newly evolved β-lactamases [see Penicillin Resistance in Bacteria: Before 1960].

The most important of these new drugs are the cephalosporins, modified β-lactams with bulky side chains at two different positions. These drugs still inhibit the transpeptidases and prevent cell wall formation but because of the bulky side chains they cannot be hydrolyzed by β-lactamases. Thus, they are effective against most of the penicillin-resistant strains that arose before 1960.

Other drugs, such as methicillin, were modified penicillins whose altered side chains also prevented degradation by the β-lactamases.

It wasn't long before cephalosporin- and methicillin-resistant strains began to appear in hospitals. As a general rule, these strains were not completely resistant to high doses of the new class of drugs at first, but as time went on they became more and more resistant.

The new form of drug resistance also involves the transpeptidase targets, but instead of giving rise to β-lactamases they evolve into enzymes that no longer bind the cephalosporins. Resistance usually develops in several stages.

There are many different transpeptidases in most species of bacteria. They are usually referred to as penicillin-binding proteins, or PBPs. Often the first sign of non-lactamase drug resistance is a mutant version of one PBP (e.g., PBP1a), and the subsequent development of greater resistance requires the evolution of other PBPs that don't bind the drug. In the most resistant strains there will be one particular PBP (e.g., PBP2a) that is still active at high drug concentrations while the other transpeptidases are inhibited.

Resistant enzymes have multiple mutations, which explains the slow, stepwise acquisition of drug resistance. An example is shown in the figure. This is PBP1a from Streptococcus pneumoniae (Contreras-Martel et al. 2006) and the mutant amino acids are displayed as gold spheres. Most of the mutations do not affect binding of the drug, but those surrounding the entry to the active site are crucial. The relevant amino acid substitutions are numbered in the figure. You can see that they line the groove where the cephalosporin drug (purple) is bound. The effect of the mutations is to prevent the bulky β-lactam from inhibiting the enzyme. This is a very different form of drug resistance from the evolution of degrading enzymes that characterized the first stage of penicillin-resistant bacteria.
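As an aside for readers who like to poke at structures themselves, the kind of analysis behind a figure like this—finding which residues line the drug-binding groove—can be roughed out with Biopython's Bio.PDB module. This is only a sketch: the file name and the 5 Å cutoff are arbitrary choices for illustration, and it assumes you already have a PDB file of the enzyme with a bound β-lactam.

```python
# Minimal sketch: list protein residues near a bound ligand in a PDB file.
# Assumes a local file "pbp1a.pdb" (hypothetical name) containing the enzyme
# with the bound beta-lactam present as a heteroatom (HETATM) group.
from Bio.PDB import PDBParser, NeighborSearch

parser = PDBParser(QUIET=True)
model = parser.get_structure("pbp1a", "pbp1a.pdb")[0]

protein_atoms, ligand_atoms = [], []
for chain in model:
    for residue in chain:
        hetfield = residue.id[0]
        if hetfield == " ":                 # standard amino acid residue
            protein_atoms.extend(residue)
        elif hetfield.startswith("H_"):     # heteroatom group (the bound drug)
            ligand_atoms.extend(residue)

# Protein residues with any atom within 5 Angstroms of any ligand atom.
ns = NeighborSearch(protein_atoms)
contacts = set()
for atom in ligand_atoms:
    for near in ns.search(atom.coord, 5.0):
        res = near.get_parent()
        contacts.add((res.get_parent().id, res.id[1], res.get_resname()))

for chain_id, resnum, resname in sorted(contacts):
    print(chain_id, resnum, resname)
```

The printed list is, in effect, the groove-lining residues, which is where you would expect resistance mutations of this second kind to cluster.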


Chambers, H.F. (2003) Solving staphylococcal resistance to beta-lactams. Trends Microbiol. 11:145-148.

Contreras-Martel, C., Job, V., Di Guilmi, A.M., Vernet, T., Dideberg, O. and Dessen, A. (2006) Crystal structure of penicillin-binding protein 1a (PBP1a) reveals a mutational hotspot implicated in beta-lactam resistance in Streptococcus pneumoniae. J. Mol. Biol. 355:684-696.

Livermore, D.M. (2000) Antibiotic resistance in staphylococci. Int. J. Antimicrob. Agents 16:s3-s10.

Penicillin Resistance in Bacteria: Before 1960

 
The Nobel Prize for the discovery and analysis of penicillin was awarded in 1945 [Nobel Laureates: Sir Alexander Fleming, Ernst Boris Chain, Sir Howard Walter Florey]. It was about this time that penicillin became widely available in Europe and North America.

By 1946, 6% of Staphylococcus aureus strains were resistant to penicillin. Resistance in other species of bacteria was also detected in the 1940s. By 1960, up to 60% of Staphylococcus aureus strains were resistant, with similar levels of resistance reported in other clinically relevant strains causing a wide variety of diseases (Livermore, 2000).

Penicillins are a class of antibiotics with a core structure called a β-lactam. The different types of penicillin have different R groups on one end of the core structure. A typical example of a penicillin is penicillin G [Monday's Molecule #30]. Other common derivatives are ampicillin and amoxicillin.

The original resistance to this entire class of drugs was caused mostly by the evolution of bacterial enzymes that could degrade them before they could block cell wall synthesis. (Recall that bacteria have cell walls and penicillin blocks cell wall synthesis [How Penicillin Works to Kill Bacteria].)
It seems strange that the evolution of penicillin resistance would require a totally new enzyme for degrading the drug. Where did this enzyme come from? And how did it arise so quickly in so many different species?

The degrading enzyme is called penicillinase, β-lactamase, or oxacillinase. These names all refer to the same class of enzyme, which binds penicillins and then cleaves the β-lactam ring, releasing fragments that are inactive. The enzymes are related to the cell wall transpeptidase that is the target of the drug. The inhibition of the transpeptidase is effective because penicillin resembles the natural substrate of the reaction: the dipeptide D-alanine-D-alanine.

In the normal reaction, D-Ala-D-Ala binds to the enzyme and the peptide bond is cleaved, causing release of one of the D-Ala residues. The other one, which is part of the cell wall peptidoglycan, remains bound to the enzyme. In the second part of the reaction, the peptidoglycan product is transferred from the enzyme to a cell wall crosslinking molecule. This frees the enzyme for further reactions (see How Penicillin Works to Kill Bacteria for more information).

Penicillin also binds to the transpeptidase, and the β-lactam bond is cleaved, resulting in covalent attachment of the drug to the enzyme. However, unlike the normal substrate, the drug moiety cannot be released from the transpeptidase, so the enzyme is permanently inactivated. This leads to disruption of cell wall synthesis and death.
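For those who like numbers, the practical consequence of this kind of irreversible (covalent) inhibition is that the fraction of active enzyme decays roughly exponentially with time, at a rate that saturates with drug concentration. Here's a minimal Python sketch of that standard kinetic scheme; the rate constants are made-up illustrative values, not measured numbers for any real penicillin or transpeptidase.

```python
import math

# Pseudo-first-order inactivation by a covalent inhibitor:
#   active fraction(t) = exp(-k_obs * t),  k_obs = k_inact * [I] / (K_I + [I])
K_INACT = 0.5   # per minute, maximal inactivation rate (assumed value)
K_I = 10.0      # micromolar, apparent affinity for the drug (assumed value)

def active_fraction(inhibitor_uM: float, minutes: float) -> float:
    k_obs = K_INACT * inhibitor_uM / (K_I + inhibitor_uM)
    return math.exp(-k_obs * minutes)

for conc in (1.0, 10.0, 100.0):
    print(f"[I] = {conc:5.1f} uM -> {active_fraction(conc, 10.0):.3f} "
          "of the transpeptidase still active after 10 min")
```

The point of the saturation term is that, above a certain drug concentration, the enzyme is killed about as fast as it can be; what matters for resistance is changing the enzyme, or destroying the drug, not simply surviving a slightly lower dose.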

Resistant strains have acquired mutations in the transpeptidase gene that allow the release of the cleaved drug. Thus, the mutant enzyme acts like a β-lactamase by binding penicillins, cleaving them, and releasing the products. Although the β-lactamases evolved from the transpeptidase target enzymes, the sequence similarity between them is often quite low in any given species. This is one of the cases where structural similarity reveals the common ancestry [see the SCOP Family beta-Lactamase/D-ala carboxypeptidase]. It's clear that several different β-lactamases have evolved independently but, in many cases, a particular species of bacteria seems to have picked up a β-lactamase gene by horizontal transfer from another species. The transfer can be mediated by bacteriophage or plasmids.


Livermore, D.M. (2000) Antibiotic resistance in staphylococci. Int. J. Antimicrob. Agents 16:s3-s10.

Thursday, June 14, 2007

Catherine Shaffer Responds to My Comments About Her WIRED Article

 
Over on the WIRED website there's a discussion about the article on junk DNA [One Scientist's Junk Is a Creationist's Treasure]. In the comments section, the author Catherine Shaffer responds to my recent posting about her qualifications [see WIRED on Junk DNA]. She says,
You might be interested to learn that I contacted Larry Moran while working on this article and after reading the archives of his blog. I wanted to ask him to expand upon his assertion that junk DNA disproves intelligent design. His response was fairly brief, did not provide any references, and did not invite further discussion. It's interesting that he's now willing to write a thousand words or so about how wrong I am publicly, but was not able to engage this subject privately with me.
Catherine Shaffer sent me a brief email message where she mentioned that she had read my article on Junk DNA Disproves Intelligent Design Creationism. She wanted to know more about this argument and she wanted references to those scientists who were making this argument. Ms. Shaffer mentioned that she was working on an article about intelligent design creationism and junk DNA.

I responded by saying that the presence of junk DNA was expected according to evolution and that it was not consistent with intelligent design. I also said that, "The presence of large amounts of junk DNA in our genome is a well established fact in spite of anything you might have heard in the popular press, which includes press releases." She did not follow up on my response.
His blog post is inaccurate in a couple of ways. First, I did not make the claim, and was very careful to avoid doing so, that “most” DNA is not junk. No one knows how much is functional and how much is not, and none of my sources would even venture to speculate upon this, not even to the extent of “some” or “most.”
Her article says, "Since the early '70s, many scientists have believed that a large amount of many organisms' DNA is useless junk. But recently, genome researchers are finding that these "noncoding" genome regions are responsible for important biological functions." Technically she did not say that most DNA is not junk. She just strongly implied it.

I find it difficult to believe that Ryan Gregory would not venture to speculate on the amount of junk DNA but I'll let him address the validity of Ms. Shaffer's statement.
Moran also mistakenly attributed a statement to Steven Meyer that Meyer did not make.
I can see why someone might have "misunderstood" my reference to what Meyer said, so I've edited my posting to make it clear.
Judmarc and RickRadditz—Here is a link to the full text of the genome biology article on the opossum genome: Regulatory conservation of protein coding and microRNA genes in vertebrates: lessons from the opossum genome. We didn't have space to cover this in detail, but in essence what the researchers found was that upstream intergenic regions were more highly conserved in the possum compared to coding regions, but also represented a greater area of difference between possums and humans.
This appears to be a reference to the paper she was discussing in her article. It wasn't at all clear to me that this was the article she was thinking about in the first few paragraphs of her WIRED article.

Interested readers might want to read the comment by "Andrea" over on the WIRED site. He doesn't pull any punches in demonstrating that Catherine Shaffer failed to understand what the scientific paper was saying. Why am I not surprised? (Recall that this is a science writer who prides herself on being accurate.)
So, yes, this does run counter to the received wisdom, which makes it fascinating. You are right that the discussion of junk vs. nonjunk and conserved vs. nonconserved is much more nuanced, and we really couldn't do it justice in this space. Here is another reference you might enjoy that begins to deconstruct even our idea of what conservation means: “Conservation of RET regulatory function from human to zebrafish without sequence similarity.” Science. 2006 Apr 14;312(5771):276-9. Epub 2006 Mar 23. Revjim—If you have found typographical errors in the copy, please do point them out to us. The advantage of online publication is that we do get a chance to correct these after publication.
Sounds to me like Catherine Shaffer is grasping at straws (or strawmen).
For Katharos and others—I interviewed five scientists for this article. Dr. Francis Collins, Dr. Michael Behe, Dr. Steve Meyers, Dr. T. Ryan Gregory, and Dr. Gill Bejerano. Each one is a gentleman and a credentialed expert either in biology or genetics. I am grateful to all of them for their time and kindness.
I think we all know just how "credentialed" Stephen Meyer is. He has a Ph.D. in the history and philosophy of science. Most of us are familiar with the main areas of expertise of Michael Behe and none of them appear to be science.

Wednesday, June 13, 2007

WIRED on Junk DNA

Junk DNA is the DNA in your genome that has no function. Much of it accumulates mutations in a pattern that's consistent with random genetic drift implying strongly that the sequences in junk DNA are unimportant. In fact, the high frequency of sequence change (mutation plus fixation) is one of the most powerful bits of evidence for lack of function.

Catherine Shaffer is a science writer who describes herself like this on her website,
I am a writer specializing in biotechnology, genetics, genomics, and other molecular, biological sciences. I have experience with news and features. My strengths include a meticulous attention to detail, an absolutely fanatical devotion to scientific accuracy, and enthusiasm. Readers appreciate my clean, uncluttered prose; my crisp, novelistic style; and (sometimes) my zany sense of humor. I am a writer who always meets deadlines and is organized and dependable.

I studied biochemistry at the graduate level at the University of Michigan, and worked in the pharmaceutical industry for several years. I am especially knowledgeable about genomics, proteomics, biotechnology, drug discovery, and chromatographic separations.
She has written an article for WIRED on junk DNA [One Scientist's Junk Is a Creationist's Treasure]. Here's how the article begins,

Without your "junk DNA" you might be reading this article while hanging upside down by your tail.

That's one of the key findings of the opossum genome-sequencing project, and a surprising group is embracing the results: intelligent-design advocates. Since the early '70s, many scientists have believed that a large amount of many organisms' DNA is useless junk. But recently, genome researchers are finding that these "noncoding" genome regions are responsible for important biological functions.

The opossum data revealed that more than 95 percent of the evolutionary genetic changes in humans since the split with a common human-possum ancestor occurred in the "junk" regions of the genome. Creationists say it's also evidence that God created all life, because God does not create junk. Nothing in creation, they say, was left to chance.

"It is a confirmation of a natural empirical prediction or expectation of the theory of intelligent design, and it disconfirms the neo-Darwinian hypothesis," said Stephen Meyer, director of the Center for Science and Culture at the Discovery Institute in Seattle.

Advocates like Meyer are increasingly latching onto scientific evidence to support the theory of intelligent design, a modern arm of creationism that claims life is not the result of natural selection but of an intelligent creator. Most scientists believe that intelligent design is not science. But Meyer says the opossum data supports intelligent design's prediction that junk DNA sequences aren't random, but important genetic material. It's an argument Meyer makes in his yet-to-be-published manuscript, The DNA Enigma.
Hmmmm ... This is so confused that it's difficult to know where to begin. First, the connection between my junk DNA and whether I am an opossum completely escapes me. I don't know of any credible scientist who claims that it's changes in junk DNA that make us so different from the common ancestor of humans and opossums. (And none who claim that we are descended from opossums.)

Second, the implication that most junk DNA is turning out to have a function is completely false and the confusion about the difference between junk DNA and noncoding DNA is inexcusable from someone who claims to be an expert on genomics [see Noncoding DNA and Junk DNA, The Deflated Ego Problem].

Third, the idea that large amounts of evolution in junk DNA supports Intelligent Design Creationism is crazy. But, in fairness, I don't think Shaffer is making the connection between the sequence variation and Intelligent Design Creationism; instead, she's making the (factually incorrect) connection between the discovery of some functions in noncoding, nonjunk, DNA and Intelligent Design Creationism (IDC). I think Steve Meyer is suggesting that IDC predicts that junk DNA will have a function and that's why he's being quoted here in the article (see above).
Scientists have made several discoveries about what some call the "dark matter of the genome" in recent years, but they say the research holds up the theory of natural selection rather than creationism.
When sequences in noncoding DNA are conserved, this is taken as evidence of negative selection. In that sense, it supports the theory of natural selection. However, most of the sequence comparisons show that junk DNA is not conserved. This does not support the theory of natural selection. It supports Neutral Theory and the mechanism of evolution by random genetic drift.
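To see why, it helps to remember how crude the underlying measurement is: line up homologous regions from two genomes and count the fraction of sites that differ. A constrained (functional) region shows far fewer differences than the neutral expectation; most junk DNA does not. Here's a toy Python sketch with invented sequences—not real human or opossum data—just to make the bookkeeping explicit.

```python
def divergence(seq_a: str, seq_b: str) -> float:
    """Fraction of aligned, ungapped positions at which the two sequences differ."""
    pairs = [(a, b) for a, b in zip(seq_a, seq_b) if a != "-" and b != "-"]
    return sum(a != b for a, b in pairs) / len(pairs)

# Invented aligned fragments for illustration only (not real genome data).
conserved_a = "ATGGCTAGCTTACGGATTCA"
conserved_b = "ATGGCTAGCTTACGGACTCA"   # 1 difference in 20 sites: constrained
junk_a      = "TTACGGGATCCATGAACTGA"
junk_b      = "TAACGAGTTCCTTGCACAGA"   # 6 differences in 20 sites: no sign of constraint

print("conserved region divergence:", divergence(conserved_a, conserved_b))
print("junk region divergence:     ", divergence(junk_a, junk_b))
```

Regions that diverge at something close to the neutral rate are exactly the ones where the drift explanation fits and the "it must be functional" explanation doesn't.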

The article then describes one recent study suggesting that some noncoding DNA is not junk (Lowe et al. 2007). It appears to be the justification for writing the article since it compares short stretches of sequences in the human and opossum genomes. This is not news so I won't bother commenting.
With scientists increasingly believing that so-called junk DNA regulates other genes, among other functions, creationists like Michael Behe, a biochemistry professor at Lehigh University in Pennsylvania and author of the controversial new book on intelligent design, The Edge of Evolution, are more than happy to point out their errors.

"From the very beginning Darwinism thought whatever it didn't understand must be simple, must be nonfunctional," Behe said. "It's only in retrospect that Darwinists try to fit that into their theory."
The concept of junk DNA is not based on ignorance in spite of what the IDiots say. It's based on good scientific evidence and deduction. Of course most IDiots wouldn't recognize scientific evidence even if it bit them on the ...

Is this just a way of getting in another quote from a prominent advocate of Intelligent Design Creationism? Why is Shaffer so interested in the IDiots? This seems to be more than just seeking out controversy since the proper way to do that would be to interview real scientists who can put the work into perspective and comment on its significance (see below).
Part of the difficulty in studying junk DNA is that it's impossible to prove a negative, i.e., that any particular DNA does not have a function.

That's why T. Ryan Gregory, an assistant professor in biology at the University of Guelph, believes that nonfunctional should be the default assumption. "Function at the organism level is something that requires evidence," he said.
That's how a real scientist speaks [see A word about "junk DNA" and Comments on "Noncoding DNA and Junk DNA"].

This is getting to be a familiar pattern among science writers. Many of them seem to be incapable of sorting out the actual science from the rhetoric. In this case the problem is exacerbated by introducing IDiots as though their opinion had a bearing on the subject. Not only that, the poor science writing stands in sharp contrast to the claim that, "My strengths include a meticulous attention to detail, an absolutely fanatical devotion to scientific accuracy, and enthusiasm."

Lowe, C.B., Bejerano, G. and Haussler, D. (2007) Thousands of human mobile element fragments undergo strong purifying selection near developmental genes. Proc. Natl. Acad. Sci. (USA) 104:8005-8010. [PubMed]

University College London Restores Professor Colquhoun's Website

 
David Colquhoun has a website at University College London where he regularly debunks the claims of "medical" quacks. Recently a herbal medicine practitioner took offense at this debunking and threatened legal action against the university. The university responded by removing the website.

Today the website has been restored [DC's Improbable Science] and University College London has published a press release explaining why [Joint statement by Professor Colquhoun and UCL].

While it's encouraging that the university decided to restore the website, the fact that it buckled to pressure in the first place is disturbing. What's the point of academic freedom if you abandon it whenever you're threatened with a lawsuit?
UCL has a long and outstanding liberal tradition and is committed to encouraging free and frank academic debate. The evidence (or lack thereof) for the claims made for health supplements is a matter of great public interest, and UCL supports all contributions to that debate. The only restriction it places on the use of its facilities is that its staff should use their academic freedom responsibly within the law.

To this end, the Provost and Professor Colquhoun have taken advice from a senior defamation Queen’s Counsel, and we are pleased to announce that Professor Colquhoun’s website – with some modifications effected by him on counsel’s advice - will shortly be restored to UCL’s servers. UCL will not allow staff to use its website for the making of personal attacks on individuals, but continues strongly to support and uphold Professor Colquhoun’s expression of uncompromising opinions as to the claims made for the effectiveness of treatments by the health supplements industry or other similar bodies.
I'm curious about the "minor modifications" and I'm troubled by the prohibition against "the making of personal attacks on individuals." It seems to me that such a prohibition could be used in a way that inhibits academic freedom. For example, would it prohibit a university Professor from criticizing Tony Blair for the war in Iraq? Would it block any negative comments about Prince Charles (pictured at left)? Does it mean that the UCL website is completely devoid of any negative comments about Richard Dawkins?

Perhaps more importantly, does this mean that university Professors cannot point out on their websites the stupidity of administration officials such as UCL President and Provost Malcolm Grant?

Nobel Laureates: Sir Alexander Fleming, Ernst Boris Chain, Sir Howard Walter Florey

 
The Nobel Prize in Physiology or Medicine 1945.

"for the discovery of penicillin and its curative effect in various infectious diseases"


Sir Alexander Fleming (1881-1955), Ernst Boris Chain (1906-1979) and Sir Howard Walter Florey (1898-1968) received the Nobel Prize in Physiology or Medicine for their work on penicillin [see Monday's Molecule #30 and How Penicillin Works to Kill Bacteria]. The Presentation Speech was delivered by Professor G. Liljestrand of the Royal Caroline Institute (Karolinska Institutet) on December 10, 1945.
Attempts have been made to reach the goal of medical art - the prevention and cure of disease - by many different paths. New and reliable ones have become practicable as our knowledge of the nature of the different diseases has widened. Thus the successful combating of certain disturbances in the activities of the organs of internal secretion, as also of the deficiency diseases, or avitaminoses, has been a direct result of the increase in our knowledge of the nature of these afflictions. When, thanks to the research work of Louis Pasteur and Robert Koch, the nature of the infectious diseases was laid bare, and the connection between them and the invasion of the body by bacteria and other micro-organisms was elucidated, fully a generation ago, this was an enormous advance, both for the prevention and the treatment of this important group of diseases. This was so much the more important as the group included a number of the worst scourges of humanity, which had slain whole peoples, and at times had laid waste wide areas. But now possibilities were revealed which have not yet been by any means fully utilized. In rapid succession, different forms of vaccination were evolved, and subsequently also serum treatment, for the introduction of which the first Nobel Prize for Physiology or Medicine was given 44 years ago today. In these cases advantage was taken of the capacity of the human and animal bodies themselves to produce protective substances in the fight against the invaders, and to do so in great abundance. But it is by no means the higher organisms only that are able to produce such substances. In cooperation with Joubert (1877), Pasteur himself observed that anthrax bacilli cultivated outside the body were destroyed if bacteria from the air were admitted, and with prophetic acumen he realized that it was justifiable to attach great hopes to this observation in the treatment of infectious diseases. Nevertheless more than two decades passed before an attempt was made to profit by the struggle for existence which goes on between different species of micro-organisms. Experiments carried out by Emmerich and Loew (1899) did not give such favourable results, however, that any great interest was aroused, nor did success attend the later efforts of Gratia and Dath and others. It was reserved to this year's Nobel Prize winners to realize Pasteur's idea.

The observation made by Professor Alexander Fleming which led to the discovery of penicillin, is now almost classical. In 1928, in the course of experiments with pyogenic bacteria of the staphylococcus group, he noticed that, around a spot of mould which had chanced to contaminate one of his cultures, the colonies of bacteria had been killed and had dissolved away. Fleming had earlier made a study of different substances which prevent the growth of bacteria and, inter alia, had come upon one in lacrimal fluid and saliva, the so-called lysozyme. As he points out himself, he was therefore always on the look-out for fresh substances which checked bacteria, and he became sufficiently interested in his latest find to make a closer investigation of the phenomenon. The mould was therefore cultivated and subsequently transferred to broth, where it grew on the surface in the form of a felted green mass. When the latter was filtered off a week later, it was found that the broth had such a strongly checking effect on bacteria that even when diluted 500-800 times it completely prevented the growth of staphylococci; consequently an extremely active substance had passed to the broth from the mould. This proved to belong to the Penicillium group or brush moulds, and therefore first the broth, and later the substance itself, was called «penicillin». It was soon realized that most of the species of Penicillium did not form it at all, and a closer scrutiny showed that the species which polluted Fleming's culture was Penicillium notatum. It had been described for the first time by Richard Westling, in the thesis which he defended in the autumn of 1911 at the University of Stockholm for the degree of Doctor of Philosophy - an illustration of the international nature of science, but also of the suddenly increased importance which sometimes accrues to sound work as a result of further developments. Fleming also showed that penicillin was extremely effective against cultures of many different kinds of bacteria, above all against those belonging to the coccus group, among them those that usually give rise to suppuration, pneumonia and cerebral meningitis, but also against certain other types, such as diphtheria, anthrax, and gas gangrene bacteria. But as numerous other species, among them the influenza, coli, typhoid and tuberculosis bacilli, grew even if they were exposed to moderate quantities of penicillin, Fleming was able to work out a method for isolating out from a mixture of bacteria those which were insensitive to penicillin. He found, further, that the white blood corpuscles, which are usually so sensitive, were not affected by penicillin. When injected into mice, too, it was fairly harmless. In this respect penicillin differs decisively from other substances which had been produced earlier from micro-organisms, and which were certainly found to be noxious to bacteria, but at the same time at least equally noxious to the cells of the higher animals. The possibility that penicillin might be used as a remedy was therefore within reach, and Fleming tested its effect on infected wounds, in some cases with moderate success.

Three years after Fleming's discovery, the English biochemists Clutterbuck, Lovell, and Raistrick, endeavoured to obtain penicillin in the pure form, but without success. They established, inter alia, that it was a sensitive substance which easily lost its antibacterial effect during the purifying process, and this was soon confirmed in other quarters.

Penicillin would undoubtedly still have remained a fairly unknown substance, interesting to the bacteriologist but of no great practical importance, if it had not been taken up at the Pathological Institute at the venerable University of Oxford. This time a start was again made from what is usually called basic research. Professor Howard Florey, who devoted his attention to the body's own natural protective powers against infectious diseases, together with his co-workers, had studied the lysozyme referred to above, the nature of which they succeeded in elucidating. Dr. Ernst Boris Chain, a chemist, took part in the final stage of these investigations, and during 1938 the two researchers jointly decided to investigate other antibacterial substances which are formed by micro-organisms, and in that connection they fortunately thought first of penicillin. It was certainly obvious that the preparation of the substance in a pure form must involve great difficulties, but on the other hand its powerful effect against many bacteria gave some promise of success. The work was planned by Chain and Florey, who, however, owing to the vastness of the task, associated with themselves a number of enthusiastic co-workers, among whom mention should be made especially of Abraham, Fletcher, Gardner, Heatley, Jennings, Orr-Ewing, Sanders and Lady Florey. Heatley worked out a convenient method of determining the relative strength of a fluid with a penicillin content, by means of a comparison under standard conditions of its antibacterial effect with that of a penicillin solution prepared at the laboratory. The amount of penicillin found in one cc. of the latter was called an Oxford unit.

In the purifying experiments then made, the mould was cultivated in a special nutritive fluid in vessels, to which air could only gain access after it had been filtered through cotton wool. After about a week the penicillin content reached its highest value, and extraction followed. In this connection advantage was taken of the observation that the free penicillin is an acid which is more easily dissolved in certain organic solvents than in water, while its salts with alkali are more readily dissolved in water. The culture fluid was therefore shaken with acidified ether or amyl acetate. As, however, the penicillin was easily broken up in water solution, the operation was performed at a low temperature. Thus the penicillin could be returned to the water solution after the degree of acidity had been reduced to almost neutral reaction. In this way numerous impurities could be removed, and after the solution had been evaporated at a low temperature it was possible to obtain a stable dry preparation. The strength of this was up to 40-50 units per mg and it prevented the growth of staphylococci in a dilution of at least 1 per 1 million - thus the active substance had been successfully concentrated very considerably. It was therefore quite reasonable that it was thought that almost pure penicillin had been obtained - in a similar manner, in their work with strongly biologically active substances, many earlier researchers had thought that they were near to producing the pure substance. The further experiments, which were made subsequently with the help of the magnificent resources of modern biochemistry proved, however, that such was not the case. In reality the preparation just mentioned contained only a small percentage of penicillin. Now when it has become possible to produce pure penicillin in a crystalline form, it has been found that one mg contains about 1,650 Oxford units. It is also known that penicillin is met with in some different forms, which possibly have somewhat different effects. The chemical composition of penicillin has also been elucidated in recent years, and in this work Chain and Abraham have successfully taken part.

The Oxford school was able to confirm Fleming's observation that penicillin was only slightly toxic, and they found that its effect was not weakened to any extent worth mentioning in the presence of blood or pus. It is readily destroyed in the digestive apparatus, but after injection under the skin or into the muscles, it is quickly absorbed into the body, to be rapidly excreted again by way of the kidneys. If it is to have an effect on sick persons or animals, it should therefore be supplied uninterruptedly or by means of closely repeated injections - some more recent experiments indicate that gradually perhaps it will be possible to overcome the difficulties in connection with taking the preparation by mouth. Experiments on mice infected with large doses of pyogenic or gas gangrene bacteria, which are sensitive to penicillin, proved convincingly that it had a favourable effect. While over 90% of the animals treated with penicillin recovered, all the untreated control animals died.

Experiments on animals play an immense role for modern medicine; indeed it would certainly be catastrophic if we ventured to test remedies on healthy or sick persons, without having first convinced ourselves by experiments on animals that the toxic effect is not too great, and that at the same time there is reason to anticipate a beneficial result. Tests on human beings may, however, involve many disappointments, even if the results of experiments on animals appear to be clear. At first this seemed to be the case with penicillin, in that the preparation gave rise to fever. Fortunately this was only due to an impurity, and with better preparations it has subsequently been possible to avoid this unpleasant effect.

The first experiments in which penicillin was given to sick persons were published in August 1941 and appeared promising, but owing to the insufficient supplies of the drug, the treatment in some cases had to be discontinued prematurely. However, Florey succeeded in arousing the interest of the authorities in the United States in the new substance, and with the cooperation of numerous research workers it was soon possible, by means of intensive work, to obtain materially improved results there and to carry on the preparation in pure form to the crystallization stage just mentioned. Large quantities of penicillin could be made available, and numerous tests were made above all in the field, but to a certain extent also in the treatment of civilians. Many cases were reported of patients who had been considered doomed or had suffered from illness for a long period without improvement, although all the resources of modern medicine had been tried, but in which the penicillin treatment had led to recoveries which not infrequently seemed miraculous. Naturally such testimony from experienced doctors must not be underestimated, but on the other hand we must bear in mind the great difficulties in judging the course of a disease. «Experience is deceptive, judgment difficult», is one of Hippocrates' famous aphorisms. Therefore it is important that a remedy should be tested on a large material and in such a way that comparison can be made with cases which have not been given the remedy but had otherwise received exactly the same treatment. There are now many reports of such investigations. The extraordinarily good effects of penicillin have been established in a number of important infectious illnesses, such as general blood poisoning, cerebral meningitis, gas gangrene, pneumonia, syphilis, gonorrhea and many others. It is of special importance that even sick persons who are not favourably affected by the modern sulfa drugs are not infrequently cured with penicillin. The effect naturally depends on the remedy being given in a suitable manner and in sufficient doses. On the other hand, experience has confirmed what might have been surmised, namely that penicillin is not effective in cases of, e.g. tuberculosis, typhoid fever, poliomyelitis, and a number of other infectious diseases. Consequently penicillin is not a universal remedy, but it is of the highest value for certain diseases. And it appears not improbable that, with the guidance of experience with penicillin, it will be possible to produce new remedies which can compete with or perhaps surpass it in certain respects.

Four years is a short time in which to arrive at definite conclusions as to the value of a remedy. But during these last few years experiences of penicillin have been assembled which, under ordinary conditions, would have required decades. And therefore there is no doubt at the present time that the discovery of penicillin and its curative properties in the case of various infection diseases for which this year's Nobel Prize is awarded, is of the greatest importance for medical science.

Sir Alexander Fleming, Doctor Chain, and Sir Howard Florey. The story of penicillin is well-known throughout the world. It affords a splendid example of different scientific methods cooperating for a great common purpose. Once again it has shown us the fundamental importance of basic research. The starting-point was a purely academic investigation, which led to a so-called accidental observation. This gave the nucleus, around which one of the most efficient remedies ever known could be crystallized. This difficult process was made possible with the aid of modern biochemistry, bacteriology, and clinical research. To overcome the numerous obstacles, all this work demanded not only assistance from many different quarters, but also an unusual amount of scientific enthusiasm, and a firm belief in an idea. In a time when annihilation and destruction through the inventions of man have been greater than ever before in history, the introduction of penicillin is a brilliant demonstration that human genius is just as well able to save life and combat disease.

In the name of the Caroline Institute I extend to you hearty congratulations on one of the most valuable contributions to modern medicine. And now I have the honour of calling on you to accept the Nobel Prize for Physiology or Medicine for the year 1945 from the hands of His Majesty the King.

Tuesday, June 12, 2007

How Penicillin Works to Kill Bacteria

 
Bacterial cell walls are made of peptidoglycan [Bacteria Have Cell Walls]. In order to form a rigid structure, the polysaccharide chains (glycans) are linked together by peptide crosslinks. The first step in the formation of the crosslinks involves attachment of a short five residue peptide to the MurNAc sugar in the polysaccharide. This peptide ends in two D-Alanine (D-Ala) residues.

This peptide is further modified by attaching an additional peptide to the middle of the first one creating a branched structure. Finally the peptide of one of the polysaccharide molecules is attached to another to form the crosslink. The reaction is a transpeptidase reaction.

The mechanism is shown in the figure (Lee et al. 2003). The top structure (1) is a polysaccharide (MurNAc-GlcNAc) with the first peptide already bound. Note that it ends in two D-Ala residues. The first step in the transpeptidase reaction involves binding of the enzyme (Enzyme-OH) to the D-Ala-D-Ala end of the chain. A reaction takes place in which one of the D-alanine residues is released and the enzyme becomes attached to the end of the peptide (2).

In the second step, an adjacent peptidoglycan (purple) (3) with a branched structure is covalently linked to the first peptidoglycan forming a crosslink between the two polysaccharides.
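Stripped of the chemistry, the bookkeeping of the two steps is simple: the donor peptide loses its terminal D-Ala and ends up joined to a neighbouring peptide. Here's a toy Python sketch of just that bookkeeping; it skips the enzyme-bound intermediate and borrows the Gram-positive stem-peptide/pentaglycine arrangement described in the Bacteria Have Cell Walls post, purely for illustration.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class StemPeptide:
    # Example Gram-positive stem peptide ending in D-Ala-D-Ala (illustrative).
    residues: List[str] = field(default_factory=lambda: [
        "L-Ala", "D-isoGlu", "L-Lys", "D-Ala", "D-Ala"])
    bridge: List[str] = field(default_factory=lambda: ["Gly"] * 5)
    crosslinked_to: Optional["StemPeptide"] = None

def transpeptidate(donor: "StemPeptide", acceptor: "StemPeptide") -> str:
    """Step 1: cleave the donor's terminal D-Ala (released as free D-Ala).
    Step 2: join the remaining donor peptide to the acceptor's bridge."""
    assert donor.residues[-2:] == ["D-Ala", "D-Ala"], "donor must end in D-Ala-D-Ala"
    released = donor.residues.pop()      # the free D-Ala leaving group
    donor.crosslinked_to = acceptor      # the new crosslink
    return released

donor, acceptor = StemPeptide(), StemPeptide()
print(transpeptidate(donor, acceptor))   # D-Ala
print(donor.residues)                    # ['L-Ala', 'D-isoGlu', 'L-Lys', 'D-Ala']
print(donor.crosslinked_to is acceptor)  # True
```

Penicillin, in these terms, is a fake "D-Ala-D-Ala" that gets as far as step 1 and then refuses to leave the enzyme.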

Almost all bacteria have cell walls and they have transpeptidase enzymes that catalyze this reaction. The activity of these enzymes is inhibited by penicillins, or β-lactams. Monday's Molecule #30 was penicillin G, one of many different types of β-lactam that block cell wall formation and kill bacteria.

The mechanism of inhibition is well known. The β-lactam region of the drug resembles the D-Ala-D-Ala end of the peptide to which the transpeptidase enzyme binds. The structures are shown below.


A typical penicillin is shown at the top of the figure. The business part of the molecule is the β-lactam moiety and the "R" represents various groups that can be bound to create different penicillin drugs. The structure of D-Ala-D-Ala is shown below.

The structures of several different transpeptidases have been solved. The enzymes are usually called penicillin-binding proteins, or PBPs. Most bacteria have several related versions of PBP genes but all of the enzymes are inhibited by β-lactams.

The figure shows the structure of penicillin-binding protein 1a (PBP1a) from Streptococcus pneumoniae with the bound drug shown in gray in the groove in the lower right corner of the enzyme (Contreras-Martel et al. 2006). This form of the enzyme is inactive because the drug binds very tightly to the active site and blocks the reaction. That's how penicillin works.


Contreras-Martel, C., Job, V., Di Guilmi, A.M., Vernet, T., Dideberg, O. and Dessen, A. (2006) Crystal structure of penicillin-binding protein 1a (PBP1a) reveals a mutational hotspot implicated in beta-lactam resistance in Streptococcus pneumoniae. J. Mol. Biol. 355:684-696.


Lee, M., Hesek, D., Suvorov, M., Lee, W., Vakulenko, S. and Mobashery, S. (2003) A mechanism-based inhibitor targeting the DD-transpeptidase activity of bacterial penicillin-binding proteins. J. Am. Chem. Soc. 125:16322-16326.

Bacteria Have Cell Walls

 
Most species of bacteria have a cell wall. The rigid cell wall prevents the bacterial cell from expanding in solutions where the salt concentrations are lower than the salt concentration inside the cell. If it wasn't for the cell wall, bacteria wouldn't be able to live in fresh water or sea water.

Gram positive bacteria have a thick cell wall on the exterior that picks up the purple Gram stain (named after Hans Christian Gram). Gram negative bacteria, such as the E. coli cell shown in the figure, do not stain with the dye because the thinner cell wall lies between the inner and outer membranes.

The cell wall is made up of peptidoglycan, which, as the name implies, is a combination of polysaccharide (glycan) and peptides. The polysaccharide consists of alternating N-acetylglucosamine (GlcNAc) and N-acetylmuramic acid (MurNAc) residues [see Glycoproteins].


During cell wall synthesis a short peptide of five amino acid residues is attached to the polysaccharide. The sequence of this peptide varies slightly from species to species. In some Gram-positive bacteria, such as Staphylococcus aureus, the sequence is L-alanine-D-isoglutamate-L-lysine-D-alanine-D-alanine. These short chains are linked to each other by another peptide consisting of five glycine residues. When the crosslinks are formed, the terminal D-alanine residue is cleaved off, so the final structure has only a single D-alanine at the end. (The significance of this cleavage will become apparent in subsequent postings.)


The completed peptidoglycan cell wall is extremely rigid because of the peptide crosslinks between the polysaccharide chains. In the figure above, the original peptide chain is colored blue and the second pentaglycine peptide is colored red. The right end of the red pentaglycine is covalently attached to the blue D-alanine residue at the bottom of an adjacent polysaccharide chain as shown in the cartoon in the upper right corner of the figure.

Monday, June 11, 2007

Does Politics Influence When Scientific Papers Are Published?

 
I'm told that the American House of Representatives is considering a bill that will allow embryonic stem cell research. Matt Nisbet thinks that the recent publication of three papers on stem cell research [Reprogramming Somatic Cells] may have been timed to correspond with this debate in the US Congress [Understanding the political timing of stem cell studies]. Nisbet quotes from an article by Rick Weiss in the Washington Post [Darn Cells. Dividing Yet Again!]. Here's what Weiss says,
Thursday, June 7. After months of intense lobbying by scientists and patient advocacy groups, the House is ready to vote on legislation that would loosen President Bush's restrictions on the use of human embryos in stem cell research. But that very morning, the lead story in every major newspaper is about research just published in a British journal that shows stem cells can be made from ordinary skin cells.

The work was in mice, but the take-home message that suffuses Capitol Hill is that there is no need to experiment on embryos after all.

If that doesn't sound suspicious, consider this:

Monday, Jan. 8. After months of intense lobbying by scientists and patient advocacy groups, Congress is ready to vote on legislation that would loosen Bush's restrictions on stem cell research. But that very morning, newspapers are touting new research just published in a British journal suggesting that stem cells can be made from easily obtained placenta cells. No need for embryos after all!

Is there a plot afoot?

Lots of lobbyists, members of Congress and even a few scientists are starting to think so.

"It is ironic that every time we vote on this legislation, all of a sudden there is a major scientific discovery that basically says, 'You don't have to do stem cell research,' " Democratic Caucus Chairman Rahm Emanuel (Ill.) sputtered on the House floor on Thursday. "I find it very interesting that every time we bring this bill up there is a new scientific breakthrough," echoed Rep. Diana DeGette (D-Colo.), lead sponsor of the embryo access bill. Her emphasis on the word "interesting" clearly implies something more than mere interest.

"Convenient timing for those who oppose embryonic stem cell research, isn't it?" added University of Pennsylvania bioethicist Arthur Caplan in an online column. (The bill passed easily, but not with a margin large enough to override Bush's promised veto.)
Hmmm ... let's see if we can figure out what's going on here. Apparently there's some vast conspiracy afoot to keep the American ban on embryonic stem cell research in place. The idea is that scientists and the editors of Nature (for example) want to publish key papers about alternatives to embryonic stem cell research just when American politicians are about to vote on a bill to lift the ban.

The conspiracy makes several key assumptions. It assumes that the editors of the British journal Nature knew about the American bill back on May 22nd when they accepted the two papers. That's when the decision to publish online on June 6th was taken (the two-week delay between acceptance and online publication is typical). It assumes, therefore, that the acceptance date was juggled to meet the target date of June 6th—assuming that the editors even knew, or cared, about what was going on in Washington D.C. (USA). Presumably the acceptance date was delayed somewhat in order to fix the timing. One assumes that the group in Japan who published one of the papers had no problem with this delay, and neither did the scientists in Boston. The papers were extremely important in a competitive field but, hey, anything can wait for American politics, right?

Harvard risk expert David Ropeik and Temple mathematician John Allen Paulos are skeptical about the conspiracy theory, with good reason. The whole idea is ludicrous but that doesn't stop Matt Nisbet from suggesting that it's true. Here's what Nisbet says,
Still, something more than just coincidence is likely to be going on here. Roepik and Paulos' arguments innocently assume that publication timing at science journals is random, without systematic bias. But journal editors, just like news organization editors and journalists, are subject to various biases, many of them stemming from the fact that they work within a profit-driven organization that has to keep up a subscriber base and play to their audience.

Peer-review is just one of the many filtering devices that scientific research goes through. Certainly many papers make it through peer-review based on technical grounds, but then editors at the elite journals, faced with limited space and the need to create drama and interest among subscribers and news organizations, apply more subjective criteria based on what they believe to be the "scientific newsworthiness" of the research. In other words, how much interest among the scientific community will these papers generate AND how much news attention?
Still, Nisbet isn't quite as paranoid and confused about the process as Rick Weiss. In the Washington Post article he says,
Then there is the question of motive. The Brits are competing against Americans in the stem cell field and are legally allowed to conduct studies on embryos. Might they be aiming to dominate the field by helping the conservative and religious forces that have so far restricted U.S. scientists' access to embryos?

Or might the journals be trying, as one stem cell expert opined on the condition of anonymity, to leverage their visibility by publishing stem cell articles just as Congress is voting on the topic?
Damn Brits. :-)

In fairness, Weiss includes a disclaimer from the editors of Nature,
"Nature has no hidden agenda in publishing these papers," said the journal's senior press officer, Ruth Francis, in an e-mail. The real goal was to get the papers out before a big stem cell conference in Australia next week, she said.
More significantly, Weiss includes a comment from someone who seems to have hit the nail on the head,
To Ropeik, the Harvard risk expert, the fact that people are imputing anything more than sheer coincidence is "just more proof that inside the Beltway the thinking is so myopic. They see the whole world through their own lens, and are blinded" to common sense.
That sounds about right to me. If you live in Washington you start to think that the whole world revolves around the White House and Congress. It's easy to believe that everything has to be framed in order to influence American politicians—even the timing of publication of scientific papers by a prominent British journal.

Monday's Molecule #30

 
Today's molecule looks complicated but it has a very simple, and well-known, name. We need the correct common name and the long systematic (IUPAC) name.

As usual, there's a connection between Monday's molecule and this Wednesday's Nobel Laureate(s). This one is an obvious direct connection. Once you have identified the molecule the Nobel Laureate(s) are obvious.

The reward (free lunch) goes to the person who correctly identifies the exact molecule with the correct formal name and the Nobel Laureate(s). Previous free lunch winners are ineligible for one month from the time they first collected the prize. There are no ineligible candidates for this Wednesday's reward since recent winners have declined the prize on the grounds that they live in another country and can't make it for lunch on Thursday (a feeble excuse, in my opinion).

Comments will be blocked for 24 hours. Comments are now open.

Sunday, June 10, 2007

Was Bertrand Russell an Atheist or an Agnostic?

John Wilkins has posted an article on whether Bertrand Russell was an agnostic or an atheist [What is an Agnostic? by Bertrand Russell]. John links to an essay by Russell where he defines agnostic as,
An agnostic thinks it impossible to know the truth in matters such as God and the future life with which Christianity and other religions are concerned. Or, if not impossible, at least impossible at the present time.
This is a definition we can all agree on. I am an agnostic, as are John Wilkins and Richard Dawkins.

Bertrand Russell goes on to define atheist as,
An atheist, like a Christian, holds that we can know whether or not there is a God. The Christian holds that we can know there is a God; the atheist, that we can know there is not.
This is not correct. There are many people who have decided not to believe in Gods and they live their lives as if there were no Gods. However, they do not maintain that the nonexistence of Gods is known for certain. They believe that it's impossible to prove a negative. These people call themselves atheists and they think that this is true to the original root meaning of the word ("not a theist"). Many of us are atheists and agnostics.

Russell knows that there is a difference between the philosophical concept of being unable to prove a negative and the practical, day-to-day, behavior of believers and non-believers. In another essay (Am I An Atheist Or An Agnostic?) he says,
Here there comes a practical question which has often troubled me. Whenever I go into a foreign country or a prison or any similar place they always ask me what is my religion.

I never know whether I should say "Agnostic" or whether I should say "Atheist". It is a very difficult question and I daresay that some of you have been troubled by it. As a philosopher, if I were speaking to a purely philosophic audience I should say that I ought to describe myself as an Agnostic, because I do not think that there is a conclusive argument by which one prove that there is not a God.

On the other hand, if I am to convey the right impression to the ordinary man in the street I think I ought to say that I am an Atheist, because when I say that I cannot prove that there is not a God, I ought to add equally that I cannot prove that there are not the Homeric gods.

None of us would seriously consider the possibility that all the gods of homer really exist, and yet if you were to set to work to give a logical demonstration that Zeus, Hera, Poseidon, and the rest of them did not exist you would find it an awful job. You could not get such proof.

Therefore, in regard to the Olympic gods, speaking to a purely philosophical audience, I would say that I am an Agnostic. But speaking popularly, I think that all of us would say in regard to those gods that we were Atheists. In regard to the Christian God, I should, I think, take exactly the same line.

There is exactly the same degree of possibility and likelihood of the existence of the Christian God as there is of the existence of the Homeric God. I cannot prove that either the Christian God or the Homeric gods do not exist, but I do not think that their existence is an alternative that is sufficiently probable to be worth serious consideration. Therefore, I suppose that that on these documents that they submit to me on these occasions I ought to say "Atheist", although it has been a very difficult problem, and sometimes I have said one and sometimes the other without any clear principle by which to go.

When one admits that nothing is certain one must, I think, also admit that some things are much more nearly certain than others. It is much more nearly certain that we are assembled here tonight than it is that this or that political party is in the right. Certainly there are degrees of certainty, and one should be very careful to emphasize that fact, because otherwise one is landed in an utter skepticism, and complete skepticism would, of course, be totally barren and completely useless.
Bertrand Russell was not a Christian, as the essays in his book Why I Am Not a Christian demonstrated. Some of these essays led to a famous court case in 1940 where Russell was declared unfit to teach at City College because his moral views were too permissive.

Above all, Russell was a rationalist who opposed superstition. His views on religion, written back in 1927, still sound familiar today.
Religion is based, I think, primarily and mainly upon fear. It is partly the terror of the unknown and partly, as I have said, the wish to feel that you have a kind of elder brother who will stand by you in all your troubles and disputes. Fear is the basis of the whole thing—fear of the mysterious, fear of defeat, fear of death. Fear is the parent of cruelty, and therefore it is no wonder if cruelty and religion have gone hand in hand. It is because fear is at the basis of these two things. In this world we can now begin a little to understand things, and a little to master them by the help of science, which has forced its way step by step against the Christian religion, against the churches, and against the opposition of all the old precepts.

Science can help us get over this craven fear in which mankind has lived for so many generations. Science can teach us, and I think our own hearts can teach us, no longer to look around for imaginary supports, no longer to invent allies in the sky, but rather to look to our own efforts here below to make this world a fit place to live in, instead of the sort of place that the churches in all these centuries have made it.

Saturday, June 09, 2007

The Ethics of Stem Cell Research

 
Arthur Caplan is the director of the Center for Bioethics at the University of Pennsylvania. He has written an article on the MSNBC website that addresses the new work on reprogramming cells to produce totipotent stem cells [Does stem cell advance provide an ethical out?].

He claims that the new work will not replace conventional creation of embryonic stem cell lines using embryos because the new procedure has only been shown to work in mice and because it involves retrovirus vectors. This may be true, although Rudi Jaenisch seems to be more optimistic [see Reprogramming Somatic Cells].

I'm interested in another part of Caplan's essay where he says,
As much as critics of this field of research would like to have you believe that human embryos in dishes are people, that moral argument is not compelling.

Human embryos in dishes are not people or even potential people. They are, at best, possible potential people.

Frozen embryos in infertility clinics face a fate of certain destruction anyway. The moral case against using them, or cloned embryos, which have almost zero chance of becoming people, is no less compelling because progress has been made in another area of research.

The existence of a new way to perhaps make embryonic-like stem cells is not enough to make frozen embryos and cloned embryos off-limits for American scientists or for research relying on federal funds.

Those in favor of human embryonic stem cell research, and that is the majority of Americans according to most polls, including one done by CNN just last month, do not have to change their minds about the morality of such research even when another avenue for creating embryonic-like cells is found in mice.
What Caplan fears is that those who are opposed to destroying embryos will use this new work to reinforce their opinion that a ban must be enforced. This is almost certainly correct but there's little that he can do to change their minds.

What puzzles me is that Caplan argues the case that these embryos are not humans—in fact they are not even "possible potential" humans. This seems to be typical of the sort of debate that passes for ethics these days. I don't get it.

For those of us who are pro-abortion the argument makes no sense at all. We are already committed to the concept that real potential humans can be destroyed so the destruction of earlier stages poses no problem whatsoever. We simply don't care to debate whether embryos at the pre-blastula stage are human or not since the decision has no bearing on whether they should be destroyed.

For those who oppose abortion, and the destruction of embryos, the declaration that they are not even "possible potential" humans rings hollow. For them, the issue is not going to be settled by scientists. If it were, it would not be an "ethical" issue at all but a simple scientific problem. For many of them the facts are quite obvious. God created man and woman to make babies and any direct interference in that process is an attempt to disrupt the process that God intended.

This is a group that distrusts scientists from the get-go. The idea that they are going to allow men in white lab coats to poke at human embryos just in order to advance their careers is a non-starter. Remember, most of this group doesn't even believe in evolution or a 4.5-billion-year-old Earth. Why should they believe what scientists have to say about embryos?

So what's the point of making the argument about the "humanness" of embryos, especially from someone who is director of bioethics at a major university? Who is he trying to convince, George Bush? Where are Mooney and Nisbet when we need them? Caplan needs a lesson on spin framing.

I have recently discovered that there are many moral realists in philosophy departments. This group believes that there is an underlying "truth" behind every ethical problem and it is the goal of ethical reasoning to discover this "truth." I wonder what the "truth" is about destroying embryos in order to create stem cells? I would have thought that the definition of moral truth depends on your cultural/religious background but I'm told that this "ethical relativism" is very much a minority position among philosophers.

[Hat Tip: Shalini at Scientia Natura: Stem cell breakthrough]

Friday, June 08, 2007

Still Banned From Uncommon Descent

 
Poor old DaveScott. Like GilDodgen, he's very confused about the evolution of topoisomerases [A Dynamic Fitness Landscape]. He posted the same video they've been confused about for several months and asked if someone could explain it to him. I tried to post a comment to help him out since he really needs it. I linked to my earlier posting [A State of Extreme Cognitive Dissonance].

My comment never got published on the blog. I guess like most IDiots he doesn't want my help.

Reprogramming Somatic Cells

 
There were three papers published this week that showed how to reprogram somatic cells so they could act like embryonic stem cells (Maherali et al., 2007; Okita et al., 2007; Wernig et al., 2007). The trick is to introduce genes for four different transcription factors into the somatic cells (e.g., skin cells). When the four transcription factors (Oct4, Sox2, c-Myc, and Klf4) are made they turn on genes in the somatic cell that cause it to reprogram and become competent to differentiate into any other type of cell, including germ cells.

The experiments were performed in mice and they are based on the work of Takahashi and Yamanaka (2006). The interesting thing about this experiment is that it achieves a goal that most scientists expected; namely, the ability to reprogram cells by changing the pattern of gene expression. This goal was achieved less than two years after Science magazine asked two so-called "fundamental" questions in its July 2005 issue [SCIENCE Questions: How Does a Single Somatic Cell Become a Whole Plant? and SCIENCE Questions: How Can a Skin Cell Become a Nerve Cell?]. What it reveals is that the editors of Science were out of touch with the scientific community. These were not significant questions. The solution was known; it was just a matter of finding out which transcription factors were required.

Here's a short video where Rudi Jaenisch explains the significance of his work. I think it represents good science communication from a scientist. What do you think?



Maherali, N., Sridharan, R., Xie, W., Utikal, J., Eminli, S., Arnold, K., Stadtfeld, M., Yachechko, R., Tchieu, J., Jaenisch, R., Plath, K. and Hochedlinger, K. (2007) Directly Reprogrammed Fibroblasts Show Global Epigenetic Remodeling and Widespread Tissue Contribution. Cell Stem Cell 1:55-70.

Okita, K., Ichisaka, T. and Yamanaka, S. (2007) Generation of germline-competent induced pluripotent stem cells. Nature advance online publication 6 June 2007 | doi:10.1038/nature05934

Takahashi, K. and Yamanaka, S. (2006) Induction of pluripotent stem cells from mouse embryonic and adult fibroblast cultures by defined factors. Cell 126:663-676.

Wernig, M., Meissner, A., Foreman, R., Brambrink, T., Ku, M., Hochedlinger, K., Bernstein, B.E. and Jaenisch, R. (2007) In vitro reprogramming of fibroblasts into a pluripotent ES-cell-like state. Nature advance online publication 6 June 2007 | doi:10.1038/nature05944
[Hat Tip: Alex Palazzo for the video [Rudy Jaenisch on Stem Cells] and for alerting us in advance to watch for exciting news]

USA TODAY/Gallup Poll on Evolution

 
Check out the USA Today website for information on the latest poll [USA TODAY/Gallup Poll results].

Let's look at the result of two questions on Creationism and Evolution.
23. Next, we'd like to ask about your views on two different explanations for the origin and development of life on earth. Do you think -- [ITEMS ROTATED] -- is -- [ROTATED: definitely true, probably true, probably false, (or) definitely false]?

A. Evolution, that is, the idea that human beings developed over millions of years from less advanced forms of life
B. Creationism, that is, the idea that God created human beings pretty much in their present form at one time within the last 10,000 years
The interesting thing about these results is that only 18% think that evolution is definitely true and only 15% think that young-Earth creationism is definitely false. That's pretty frightening if you think about it.

But here's something even more puzzling.
24. How familiar would you say you are with each of the following explanations about the origin and development of life on earth -- very familiar, somewhat familiar, not too familiar, or not at all familiar? How about -- [ITEMS ROTATED]?

A. Evolution
Almost everyone (82%) thinks they understand evolution well enough to have an opinion. They're almost certainly wrong.

It shows us where we need to start. We need to get out the message that evolution is a scientific fact and that evolutionary theory is complicated enough to require some study in order to get the basic concepts down pat. Before we can teach, we have to convince people that they need to learn.

I wonder how many of the people who think that evolution is true actually understand it?