
Monday, April 27, 2026

Ask Gemini: "What is the difference between junk DNA and non-coding DNA?"

This is weird. I was a bit bored so I asked Gemini the following question: "What is the difference between junk DNA and non-coding DNA?" I thought the first answer was so wrong that I decided to ask it again to see if I got the same answer.

The second answer was quite different because Gemini noticed that I had bookmarked Sandwalk, a blog written by Laurence Moran, a champion of the 'junk DNA' concept. Is it trying to give me the answer it thinks I want or the best possible scientific answer?

Note: Here is the correct answer.

Non-coding DNA refers to the part of the genome that doesn't code for proteins. It's one way to partition the genome; another would be to divide it into regulatory sequences and non-regulatory sequences.

By the late 1960s, scientists knew of lots of functional non-coding DNA, including regulatory sequences and non-coding genes such as those for ribosomal RNA and tRNA. (There are other non-coding functional elements.) It became apparent that most of the human genome consisted of non-functional DNA, or junk DNA. The original model back then was that 10% is functional and 90% is junk. The functional 10% consisted of 1-2% coding DNA and about 8% functional non-coding DNA.
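To make those percentages concrete, here's a minimal sketch of the arithmetic, assuming a haploid human genome of roughly 3.2 billion base pairs (a figure not given above):

    # Toy arithmetic for the 1960s model of the human genome.
    # Assumption: a haploid genome of ~3.2 billion base pairs (not stated above).
    GENOME_BP = 3_200_000_000

    fractions = {
        "coding DNA": 0.02,                 # upper end of the 1-2% estimate
        "functional non-coding DNA": 0.08,  # regulatory sequences, non-coding genes, etc.
        "junk DNA": 0.90,
    }

    for label, frac in fractions.items():
        print(f"{label}: {frac:4.0%} = about {frac * GENOME_BP / 1e6:,.0f} million base pairs")

Even the small functional fraction works out to hundreds of millions of base pairs, which is why "only 10% functional" never meant that nothing besides protein-coding genes mattered.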

No knowledgeable scientist ever said that all non-coding DNA was junk; that's a lie that continues to be perpetuated in scientific publications and the popular media even though it has been repeatedly debunked.

Most of the data that has accumulated over the past 50+ years has supported the idea that 90% of the human genome is junk and only 10% is functional.

The Gemini answers relate to the debate concerning whether AI is really intelligent and, more importantly, whether the popular (free) algorithms are spreading misinformation.

Tuesday, April 14, 2026

How many pseudogenes in the human genome?

There are somewhat fewer than 25,000 genes in the human genome and there are probably about the same number of pseudogenes.

Pseudogenes are sequences that resemble real functional genes but they contain mutations that render them non-functional. They are very real examples of junk DNA.

There are four kinds of pseudogenes. Duplicated pseudogenes arise from a gene duplication event in which one of the copies acquires an inactivating mutation. Duplicated pseudogenes retain all of the features of the original gene, including introns and adjacent regulatory sequences. The inactivating mutation may occur in the gene itself (for example, in the coding region of a protein-coding gene), in which case the pseudogene may still be transcribed. Duplicated pseudogenes are usually found adjacent to their parent gene.

Processed pseudogenes arise when the normal transcript is copied by reverse transcriptase and the DNA copy is integrated back into the genome. Processed pseudogenes don't have introns or regulatory sequences and they are not near their parent gene. Most processed pseudogenes come from genes that are expressed in the germ line.
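Since the two kinds described above differ in a few concrete features, here's a minimal sketch expressing that distinction as a decision rule. The feature flags and the classify function are hypothetical simplifications for illustration; real pseudogene annotation relies on sequence alignment, not boolean flags.

    # Toy decision rule for the two pseudogene classes described above.
    # The feature flags are hypothetical simplifications.
    from dataclasses import dataclass

    @dataclass
    class PseudogeneFeatures:
        has_introns: bool          # duplicated pseudogenes retain introns
        has_regulatory_seqs: bool  # duplicated pseudogenes retain adjacent regulatory sequences
        near_parent_gene: bool     # duplicated pseudogenes usually sit next to the parent gene

    def classify(f: PseudogeneFeatures) -> str:
        if f.has_introns and f.has_regulatory_seqs:
            return "duplicated pseudogene"
        if not (f.has_introns or f.has_regulatory_seqs or f.near_parent_gene):
            return "processed pseudogene"  # reverse-transcribed copy of a mature mRNA
        return "ambiguous: needs closer inspection"

    print(classify(PseudogeneFeatures(True, True, True)))     # duplicated pseudogene
    print(classify(PseudogeneFeatures(False, False, False)))  # processed pseudogene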

Saturday, April 11, 2026

Suggestions for philosophers who want to contribute to philosophical biology

Can modern biologists get along with modern philosophers of biology? James DiFrisco and Steven Orzack think they can and they have suggestions for both biologists and philosophers (DiFrisco and Orzack, 2026).

Here are their suggestions for philosophers who want to contribute to philosophy of biology.

  1. Justify engagement with philosophical biology by its capacity to improve biology. Do not justify engagement with a topic by pointing to its interest to philosophers, or by a generic appeal to interdisciplinarity, or by apparent thematic overlap.
  2. Understand that conceptual analysis needs to make a difference to scientific reasoning and practice. The development and clarification of biological concepts is best when based upon actual biology as opposed to imaginary counterfactual scenarios and thought experiments.
  3. Attain at least the level of comprehension of biology possessed by a senior undergraduate major in biology.
  4. Publish normative claims about biology in biology journals, not just in philosophy journals.
  5. Attend and present work at biology conferences. Collaborate with biologists.
  6. Ensure that articles or books about a philosophical issue in biology are reviewed by a biologist with relevant expertise.
  7. Do not claim what author X means (without documentation), as in “what Smith really means here is ….” Accept potential ambiguity as a part of human communication.
  8. Anchor a descriptive claim about biology in the actual practice of biology. Engage with current biology and not just biological authorities from the past (e.g., Darwin).
  9. Understand that claims by biologists need to be understood in their social and historical context in addition to their epistemic context.

DiFrisco, J. and Orzack, S.H. (2026) Biology Needs Philosophy, But What Philosophy? BioScience:biag016. [doi: 10.1093/biosci/biag016]

Suggestions for biologists who want to contribute to philosophical biology

Can modern biologists get along with modern philosophers of biology? James DiFrisco and Steven Orzack think they can and they have suggestions for both biologists and philosophers (DiFrisco and Orzack, 2026).

Here are their suggestions for biologists who want to contribute to philosophy of biology.

  1. Understand that debate over definitions is often not quibbling over “mere” semantics. After all, semantics concerns meaning, and meaning connects concepts to inferential roles in reasoning, including prediction.
  2. Understand that concepts having an uncertain connection with facts may still be useful. For example, the notion of a species as an ensemble of potentially interbreeding individuals has underwritten many important empirical insights into evolution, even though it can be hard to measure the potential for interbreeding over time and space. Similarly, the notion of an organ as a well-defined ensemble of cells has underwritten many important empirical insights in anatomy, pathology, and physiology even though the criteria that define organs remain in dispute.
  3. Understand that there can be useful theory in biology even if it is not expressible in compact mathematical form.
  4. Understand that theory can be important apart from its immediate empirical usefulness. However, theory that is informed by data and that informs data is most useful.
  5. Understand that explanations of phenomena do not have to be molecular in order to be causal and mechanistic. The limits of explanations based upon molecular mechanisms do not necessitate switching to a different mode of explanation (e.g., one based on agency; see below).
  6. Take guidance from philosophy when making philosophical claims. Debates by philosophers over issues such as falsifiability as a defining criterion of science; the uses of abduction, deduction, and induction; essentialism in classification; and the nature of scientific laws can improve scientific practice.

DiFrisco, J. and Orzack, S.H. (2026) Biology Needs Philosophy, But What Philosophy? BioScience:biag016. [doi: 10.1093/biosci/biag016]

Chris Christie says baby boomers are the most selfish generation in American history

The leading edge of the baby boomer generation is turning 80 this year (that's me!). Chris Christie is a trailing edge baby boomer.

Christie is upset about the number of old people in American politics and he singles out Donald Trump (a baby boomer) and Joe Biden (a member of the previous generation). I assume he's also annoyed at old members of Congress, many of whom are older than 80 and therefore not boomers.

Here's what he said ...

I don't think they are this period of our time. I call this the last gasp of the most selfish generation in American history. The baby boomers. The most selfish generation in American history. The most self-centered generation in American history. The least sacrificing generation in American history. [see Chris Christie on baby boomers]

The fact that old politicians can consistently get re-elected in America is not the fault of American baby boomers. It's a systemic problem with American politics and I'm pretty sure that 20 years from now Chris Christie will be complaining about old gen X and millennial politicians.

But there's another problem with Chris Christie's remarks. He's mad at certain members of his generation but instead of focusing his criticism on those individuals he slanders an entire generation. Most baby boomers know that's not right. It wasn't right when talking about Blacks, Jews, women, Muslims, or people from New Jersey and it's not right when talking about an entire generation.


Friday, April 10, 2026

How can we combat the spread of misinformation?

This is a serious question. We (Sandwalk readers) know that there's a lot of science misinformation being spread in the popular science literature.1 So far, scientists have been spectacularly unsuccessful in stopping it.

The misinformation covers all aspects of science but my particular bugaboos are evolution, genomes, and junk DNA.

I'm going to quote the first few paragraphs from an article on the Knowable Magazine website. It seems to be associated with Annual Reviews and it certainly looks like it should be a credible source of science information.

The article is The silent majority: RNAs that don’t make proteins. The author is Christina Szalinski and here's how she describes herself on her website.

I know science.

I became a science writer in 2013 after finishing my PhD in cell biology at the University of Pittsburgh. So when it comes to writing, I can shake out the molecular tangles, unravel the cellular threads, and wade through the formidable details of scientific studies.

Is it wrong to specifically identify science writers who are spreading misinformation? Is it cruel or mean to imply that they don't understand their subject?

Do other science writers and their organizations have any obligation to police their own discipline to ensure scientific accuracy?

Does anybody have any good ideas on how to clean up this mess?

Here's an excerpt from the article. I don't think I need to explain what's wrong.

When scientists first cracked the genetic code, they expected a simple story: DNA makes RNA, and that RNA, known as messenger RNA, makes proteins. Proteins would do all the important work — building tissues, fighting infections, digesting food.

But when the DNA of our genome was finally sequenced, researchers encountered a head-scratcher: The 20,000-plus genes that carry instructions for making our proteins account for less than 2 percent of our DNA. What was the rest of it good for?

For years, the remaining 98 percent was dismissed as “junk DNA” — evolutionary debris, filler. But as sequencing technology improved, a startling picture emerged. Our cells were busy making RNA copies of all those expanses, not just making messenger RNA — or mRNA — from the protein-coding genes. They were churning out vast quantities of RNA molecules with no known purpose.

The question became: Why would cells waste so much energy on copying that junk?

Today, however, the importance of this non-coding RNA — the catchall term for RNA molecules that don’t carry instructions for proteins — is undeniable. Non-coding RNAs turn out to regulate everything from embryonic development to immune responses to brain function. They help determine which genes get turned on and off, and when. They can promote cancer or suppress it.

I contacted the author last week to warn her that I was about to publish this post. I asked if she wished to comment or to provide the source of her information on the history of the field. I did not get a reply.

The problem with this kind of description is that it misrepresents the way science is done. Most scientific models are built by slow, steady, incremental advances on previous studies. That kind of science is (usually) self-correcting: when new information becomes available, the old models are revised.

The picture that is being presented to the general public is that old scientists were pretty stupid because they thought there was only one kind of gene (protein coding) and that everything else in the genome (98%) had to be junk. According to that false history, the old fuddy-duddies were shown to be totally wrong when the human genome was sequenced and thousands of non-coding genes were discovered for the first time. That disproves junk DNA according to the false history.

Is there a way of writing the true history in a way that's accessible to the general public? I don't know, but I thought I would give it a try in order to show modern science writers how it could be done.

It's not easy. Read my attempt below and let me know if it works.

Scientists were actively working out the functions of DNA back in the 1950s and 1960s. By the mid-1960s they had discovered two kinds of genes. The majority encoded proteins but there were also non-coding genes that specified important RNAs such as ribosomal RNA (rRNA) and transfer RNA (tRNA) that were used in protein synthesis.

Scientists also established that DNA contained regulatory elements that controlled the expression of those two types of genes. Other functional DNA elements were also identified at this time.

Most of this work was done in bacteria and their viruses, where genes took up a very large percentage of the DNA in their chromosomes. However, it soon became apparent that this was not the case in humans, where the coding regions of the protein-coding genes seemed to account for only 2% of the genome. (The genome is the total amount of DNA in all chromosomes.) Other functional elements, such as non-coding genes and regulatory sequences, only accounted for a bit more of the genome.

This gave rise to a model developed by the leading experts of the time, including several Nobel Laureates. They proposed that only 10% of the human genome is functional and 90% is junk DNA. Based on a lot of experimental data, they estimated that there were about 30,000 genes in the human genome.

Additional non-coding genes specifying regulatory RNAs were identified at this time (early 1970s) but the biggest advance in this area occurred in the 1980s with the discovery of a host of genes specifying various new RNAs. Some of these new non-coding genes specified RNAs that acted like protein enzymes to catalyze biochemical reactions. Others were involved in regulating gene expression and still others were structural components of large cellular complexes.

These results, and others from the 1990s, raised the number of non-coding genes in humans to as many as several thousand but they still only accounted for a fraction of the total number of protein-coding genes.

The first draft of the human genome was published 25 years ago and it confirmed the model developed more than 50 years ago. There were about 30,000 genes, just as the experts had predicted, and most of the human genome was junk.

Subsequent work on identifying features of the human genome has, by and large, confirmed this model, but there are scientists who are skeptical.

Most of the human genome is transcribed into RNAs—a fact that was known 50 years ago—but many of the leading experts concluded that most of those RNAs were probably junk RNA of various sorts. The idea here is that the human genome is very messy and it gives rise to lots of spurious, accidental RNAs that are not biologically relevant. Most of those RNAs are present in small amounts and they are rapidly degraded. They are not conserved in our closest relatives. (Sequence conservation is a good indication of function and lack of sequence conservation is a good indication of junk.)
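As a rough illustration of what "sequence conservation" means in practice, here's a minimal sketch that computes percent identity between two pre-aligned sequences. The sequences are invented for illustration; real conservation analyses compare many species using phylogeny-aware methods rather than a simple pairwise score.

    # Toy conservation check: percent identity between two pre-aligned sequences.
    # The sequences are invented; real analyses use phylogeny-aware methods.
    def percent_identity(seq_a: str, seq_b: str) -> float:
        assert len(seq_a) == len(seq_b), "sequences must be pre-aligned"
        matches = sum(a == b for a, b in zip(seq_a, seq_b))
        return 100.0 * matches / len(seq_a)

    human_element = "ATGGCCATTGTAATGGGCCGC"
    chimp_element = "ATGGCCATTGTAATGGGCCGC"   # identical: a hint of function
    chimp_flanking = "ATGACCTTTGTCATAGGGCGC"  # diverged: consistent with junk

    print(f"element:  {percent_identity(human_element, chimp_element):.0f}% identity")
    print(f"flanking: {percent_identity(human_element, chimp_flanking):.0f}% identity")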

The skeptics, on the other hand, argue that most of those RNAs have a function and there are far more non-coding genes than protein-coding genes. The debate continues to this day.


1. And, unfortunately, in the legitimate scientific literature.

Monday, April 06, 2026

How can philosophy contribute to science?

I've written quite a bit about the perceived conflict between science and philosophy and defended my view that science is best described in broad terms as a way of knowing that requires evidence, skepticism, and rational thinking. As far as I know, there is no other way of knowing that has produced true knowledge.

In this sense, the proper practice of philosophy has to involve science (and by that I mean evidence) if the results are going to produce knowledge. There's lots to debate on this topic, including discussions about the meaning of "knowledge" [Is science the only way of knowing?].

But that's not what I want to talk about today. Today's topic is about the contribution that philosophers can make to science. I'll focus on philosophers of biology and on scientific topics that I'm knowledgeable about and I'll assume that most philosophers agree with Elisabeth Lloyd when she says, "As a philosopher of science, I have always been oriented towards addressing problems that scientists have, not so much problems that philosophers have. That is how to do good philosophy of science."1

Now, let me be clear about the issue. It is blindingly obvious that philosophers could use their deep understanding of logic and argumentation to make significant contributions to biology, especially in cases where scientists are misusing logic. The question is not whether philosophy is capable of contributing to biology but whether it is actually fulfilling that potential.

Sunday, April 05, 2026

Is the American President the leader of the free world?

I was intrigued by a recent article by Carlos Lozada in The New York Times: America Has Become a Dangerous Nation. The opening paragraph sets the stage.

We had a good run — some eight decades or so — but it is clear by now that the United States has ceased to be the leader of the free world. A successor for that post has not been named, and it appears unlikely that the European Union, or NATO, or whatever constitutes “the West” these days will promote from within. The job might even be eliminated, one more reduction in force courtesy of President Trump.

Here's the problem. I'm a Canadian. I think Canada is part of the "free world" but it's not a term that Canadians use very often. I also don't think it's popular in European countries but I'd like to hear from Europeans. Do the people of France think of themselves as being part of the "free world" (using the French translation)?

I remember the 1960s when the United States was bogged down in Vietnam and I don't recall thinking of either Johnson or Nixon as any kind of a leader of Canada or of similar countries such as Sweden, Australia, or Switzerland, let alone Japan or South Korea.

Ronald Reagan may have been a good President for Americans but I never thought of him as the leader of the free world in spite of the fact that Americans give him credit for the downfall of the Soviet Union. He would not have been a very good leader in Canada.

I think Canadians have enjoyed freedom for a very long time and so have many countries in Europe and elsewhere. We don't owe that freedom to the United States and we don't look to the United States as the standard of freedom. If that's what it means to be the leader of the "free" world then it's a term that doesn't resonate outside of the USA.

The Lozada article implies that the President of the United States has been the de facto leader of the free world for eight decades and that it's only in the past year or so that he has lost that title. That's an interesting claim. It suggests that Canadians, Australians, Swedes, etc. looked upon George W. Bush as some kind of leader when America invaded Iraq and Afghanistan.

To my way of thinking, leaders are those who improve the lives of their citizens by promoting universal health care, income equality, the rights of women and minorities, and safety and security (i.e., crime and gun control). I admire world leaders who promote those values. I don't look to American Presidents as world leaders from that perspective.

I think the view of Americans is that military might is the important criterion. Since the United States is the toughest kid in school, it is the de facto leader because nobody wants to be on its bad side. Americans assume that their country always uses that military might for the good of the free world and that's why they think that, for the past eight decades, the people of New Zealand, Mexico, and India might have looked to the President as the leader of the free world.

I'd like to suggest that viewing the President of the United States as the leader of the free world is mostly an American myth that's not shared by people in other countries to any great extent. America is the most powerful country in NATO and so it dominates military considerations within that alliance. I think that Americans view NATO as the "free world" when they use that expression.

I view the United States as a powerful partner in NATO but I think of NATO as an alliance where every country is important. I don't see America as the "leader" in NATO any more than I see Canada as the leader. What do you think?

I'd especially like to hear from non-Americans about whether they have always viewed the President of the United States as the leader of the free world.