Allen MacNeill has started a new blog called Evolutionary Psychology. Allen teaches introductory biology and evolution at Cornell University (New York, USA). This could be a good addition to the blogosphere since he intends to defend the field. (Good luck!)
The first posting on the new blog is The Capacity for Religious Experience is an Evolutionary Adaptation for Warfare. It starts out well with ...
Recent work on the evolutionary dynamics of religion have converged on a "standard model" in which religions and the supernatural entities which populate them are treated as epiphenomena of human cognitive processes dealing with the detection of and reaction to agents under conditions of stress, anxiety, and perceived threat.

I agree with this. There has been selection in the primate lineages for intelligence, and one of the reasons for the fitness advantage may well be the ability to cope with external threats. I think that religion, and many other human behaviors, are epiphenomena that are indirect consequences of intelligent brains.
Unfortunately, this does not seem to be what MacNeill really means. He goes on to describe behaviors such as religion and warfare as though they were individually selected. He seems to be implying that there are genes for religious behavior and for engaging in warfare. This isn't the same as the "standard" model—unless my interpretation is completely wrong.
Here's how religion evolved, according to MacNeill ...
Wilson (2002) has proposed that the capacity for religion has evolved among humans as the result of selection at the level of groups, rather than individuals. Specifically, he argues that benefits that accrue to groups as the result of individual sacrifices can result in increased group fitness, and this can explain what is otherwise difficult to explain: religiously motivated behaviors (such as celibacy and self-sacrifice) that apparently lower individual fitness as they benefit the group.

I take issue with this description. When an evolutionary psychologist says that "all that is necessary" for the evolution of a behavior is that it confers some advantage, your adaptationist bullshit detector should hit the red line. That's not all that is necessary. Another, very important, requirement is that there be a genetic component to the behavior. In other words, we need to show that suicide bombers (for example) have different alleles in their genomes than atheists.
At first glance, Wilson's argument seems compelling. Consider the most horrific manifestation of religious warfare: the suicide bomber. A person who blows him or herself up in order to kill his or her opponents has lowered his or her individual fitness. Doesn't this mean that such behavior must be explainable only at the level of group selection? Not at all: the solution to this conundrum is implicit in the basic principles of population genetics. Recall that one of Darwin's requirements for evolution by natural selection was the existence of variation between the individuals in a population. (Darwin, 1859, pp. 7-59) Variation within populations is a universal characteristic of life, an inevitable outcome of the imperfect mechanism of genetic replication. Therefore, it follows that if the capacity for religious experience is an evolutionary adaptation, then there will be variation between individuals in the degree to which they express such a capacity.
Furthermore, it is not necessarily true that when an individual sacrifices his or her life in the context of a struggle, the underlying genotype that induced that sacrifice will be eliminated by that act. Hamilton's principle of kin selection (Hamilton, 1964) has already been mentioned as one mechanism, acting at the level of individuals (or, more precisely, at the level of genotypes), by which individual self-sacrifice can result in the increase in frequency of the genotype that facilitated such sacrifice. Trivers (1971) has proposed a mechanism by which apparently altruistic acts on the part of genetically unrelated individuals may evolve by means of reciprocal altruism.
Given these two mechanisms, all that is necessary for the capacity for religious behavior, including extreme forms of self-sacrifice, to evolve is that as the result of such behaviors, the tendency (and ability) to perform them would be propagated throughout a population. The removal of some individuals as the result of suicide would merely lower the frequency of such tendencies and abilities in the population, not eliminate them altogether. By making the ultimate sacrifice, an individual who shares his or her genotype with those who benefit from that sacrifice will, at the level of his or her genes, become more common over time. (Wilson, 1975, p. 4)
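For readers who want the kin-selection logic spelled out: Hamilton's rule (standard population genetics, not stated explicitly in the passage above) gives the condition under which an allele promoting self-sacrifice can nevertheless increase in frequency:

```latex
r \, b > c
```

where \(r\) is the coefficient of relatedness between the actor and the beneficiaries, \(b\) is the fitness benefit to the beneficiaries, and \(c\) is the fitness cost to the actor. In principle, a sacrifice that sufficiently benefits close kin (high \(r\), large \(b\)) can spread the underlying allele even though the actor dies; the empirical question remains whether any such allele for religious self-sacrifice actually exists.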
I don't believe there's any evidence that such alleles exist.
The idea that there's a specific genetic propensity for religion is difficult to reconcile with our history. If religion alleles have been selected for thousands of years, then how come European countries have become increasingly secular over the past century? Are Europeans just learning to override the dictates of their genes?
Or, is religion just an epiphenomenon—one of many cultural ways that a society can encourage cohesiveness? As a way of creating unity—and identifying strangers—religion is no different than patriotism or irrational devotion to a charismatic political leader (Obama as a religion?). As a matter of fact, it's probably no different than the cliques formed by adolescent girls or the street gangs of teenage boys. They all serve the same purpose, albeit on different scales. They are all learned behaviors. In my opinion, religion is a product of nurture, not nature.
I don't think there is a "gene for religion". Instead, what I think may be the case is that there is an evolved capacity in the human mind that predisposes us to having religious experiences. This is essentially the argument made by Scott Atran, Pascal Boyer, Daniel Dennett and others. This capacity is similar to (and may even use the same kinds of neural circuitry as) the capacity for human language, which has been known for forty years to be an evolved capacity in humans.
To be more specific, what seems to be the case is that we have a "mental module" wired into our nervous systems that makes the acquisition/formulation and transmission of religious concepts extraordinarily easy, in the same way that we can acquire/formulate and transmit linguistic concepts. Most of the specialists working in the field of the evolution of religion now agree that this is the case (see Atran's In Gods We Trust, Boyer's Religion Explained, and Dennett's Breaking the Spell for more), and it is these ideas that I referred to as the current "standard model" for the evolution of the capacity for religion.
What is currently missing in the "standard model" is a solid explanation of the evolutionary context within which the capacity for religious experience is most likely to have evolved.
That's where my "warfare" hypothesis comes in. As I suggest in my paper on the subject ("The Capacity for Religious Experience is an Evolutionary Adaptation for Warfare", MacNeill, A. (2004) Evolution & Cognition, 10:1, pp. 43-60), it seems to me that the most likely setting to result in the kind of differential reproductive success that would produce such an adaptation would be chronic (albeit intermittent) warfare. Furthermore, every increase in the tendency to acquire/formulate and transmit religious concepts in the context of warfare would facilitate warfare, which would lead to a coevolutionary "arms race" between the capacity for religious experience and the capacity for engaging in warfare. As I have noted elsewhere, this results in what I have called "MacNeill's Law":
Religion facilitates warfare, which facilitates religion.
Please note the stress on facilitates. There is no gene for either religion or warfare (although Dean Hamer has argued for the former). Instead, there is an inherited tendency for religious concepts to reinforce the ability to engage in warfare, and vice versa.
How much of this tendency is genetic and how much is learned is perhaps interesting, but not particularly relevant. One of the cornerstones of ethology is that virtually all behavioral adaptations are a blend of innate and learned components.
The only relevance of the degree to which such behaviors are genetic versus learned may have to do with how long and how easily such behaviors can be modified. Based on precisely the kinds of evidence that Larry points out in his blog post, it seems to me very likely that the degree to which such tendencies are inherited via changes in the nervous system is significantly greater than the degree to which they are directly coupled to changes in the genome.
How would you go about testing your model, Allen? The evidence for warfare in the palaeolithic record is next to nothing. This may reflect preservation bias or it may be real. Either way, it seems to rule out one possible concrete test of your hypothesis.
I guess we could look for indirect evidence of reasonable degrees of cognitive sophistication. So, for example, you have good evidence for Lower-Middle Palaeolithic hunting at sites like Schoningen. This would suggest the ability to plan and co-operate. Could this possibly be a scenario in which some sort of "warfare" could occur? But there is no particularly strong evidence for religious behaviour.
Ooops, that wasn't meant to be anonymous.
Actually, there is a fair amount of evidence for intermittent warfare throughout the paleolithic. For more on this, see
Keeley, L. (1996) War Before Civilization, Oxford University Press, ISBN 0195091124, 245 pp.
Kelly, R. (2000) Warless Societies and the Origin of War, University of Michigan Press, ISBN 0472067389, 192 pp.
LeBlanc, S. (2003) Constant Battles: The Myth of the Peaceful, Noble Savage, St. Martin's Press, ISBN 0312310897, 271 pp.
All three authors provide evidence for intermittent warfare going back tens of thousands of years.
Furthermore, Wrangham & Peterson (1996) provide copious evidence of warlike behavior among chimpanzees, and Hans Kruuk documented similar behavior among spotted hyenas. Warfare is almost certainly as old as our species, if not older.
Oh, that's interesting, thanks. I just noticed this paper:
Thorpe, I.J.N. (2003) Anthropology, archaeology, and the origin of warfare. World Archaeology, 35: 145-165.
Obviously it's an area of palaeolithic archaeology I need to read up on. And there I was thinking I knew everything!
Whilst I'm checking that out, do you have any particular expectations/predictions about the relationship between archaeological evidence for warfare and religion?
I might be getting your point Larry: We don’t have a ‘proprietary’ religious gene any more than we have a gene for riding a bike, playing football, or inventing quantum theory. The heuristics and morphology of humanity have a certain combinatorial portability/generality about them which allows them to move into highfalutin and speculative areas like primary ontology, ultimate necessity (Aseity), meaning, cycling, and football. But I think you’ll find that conceptual morphology, like structural morphology, has an underlying ratchet, which tends to lock in outcomes and makes reversibility difficult. So questions about whether questions of meaning have meaning are likely to stick around. Tough luck Larry!
This is the first time I’ve seen you seriously engaging the religious question in a theoretical way, or did I miss the first time? Much better than your Father Xmas theory.
Larry and Allen:
Hi. I've studied the cog sci of language acquisition, religion and cognitive development and modularity more generally. I've read Boyer and I recommend it. The language acquisition researcher that I most highly esteem is Michael Tomasello, author of Constructing a Language.
Anyhow, just a few things on the thinking regarding the evolution of the cognitive capacities for language and religion.
As you each seem to agree, neither is viewed to have been the product of individual selection. Each has deep roots in more basic cognitive functions and the interactions among these cognitive abilities. For language, social cognition (the ability to acknowledge, understand and engage in reasoning with respect to the minds of others) is huge. Add to this our pre-existing excellent pattern-finding abilities which enable us to recognize recurring relationships at various scales (e.g., phonemes, syntax), the nature of our conceptual systems (e.g., our ability to acquire and process information regarding the physical and social world - e.g., physical and social relations), the nature and capacity of our working memory, and you've got some great foundations for language. In considering the evolution of language, you also surely cannot leave out meme theory. Just as Boyer discussed how certain religious ideas are more easily grasped, retained and transferred than others, the same is true of language. Speech sounds, words, constructions of various sizes, generalized rules (e.g., morphological and syntactic), and so on that jibe better with the nature of our cognitive systems and with the exigencies of our life situations (e.g., quicker is often better) will tend to propagate more effectively.
When it comes to religion, a number of non-religion-specific cognitive traits are relevant. There is our tendency to detect and remember coincidences over and above non-coincidences (e.g., the phone call from Bill right after we were thinking about Bill is far more salient than the phone call that didn't come or the thought that wasn't had prior to a call from Bill). There is our tendency to draw false correlations and to infer that certain things have happened because of something to do with oneself (e.g., the roof of the house fell in when the family was home because they were evil; it couldn't have simply been a case of bad timing), the root of much superstitious behaviour. There is what Pascal Boyer discussed as an over-blown tendency to attribute mindedness to external entities, and there is the confirmation bias. Then you factor in the distress that can come from the mere possibility of having to scrap foundational aspects of your worldview. Giving up religious beliefs could mean tearing down one's framework for interpreting right and wrong, how to live, why to live, what matters, and so on. Then throw in the often very reasonable fear of social ostracism - the last thing one wants when their understanding of the world has just been torn down.
Another thing that has to be considered always is that the religious myths that develop are rooted in the social issues that are present at the time of and in the history of the writers. As such, they are important for the early communities and they tend to be focal units of communities and social divisions between communities (e.g., demonizing outsiders). In-group/Out-groupism is clearly a powerful force.
Anyhow, that's all I can really say now. Sorry for a loss of organization in the religion section. I got in a rush.
But yeah, considering the nature of extant cognitive abilities (what cognitive abilities there are and how they function) in conjunction with situational exigencies is obviously the way to go, as you both well know. I imagine I've contributed little to nothing that each of you did not know very well to begin with...
If religion alleles have been selected for thousands of years, then how come European countries have become increasingly secular over the past century?
Doesn't that depend on how you define "religion"? Given that Professor MacNeill is talking about a predisposition to having religious experiences, can you limit the evidence on how secular Europeans are to the numbers who adhere to formal religions? Don't you have to include the followers of various forms of woo and even the soccer fans who are willing to engage in ritual (and some not-so-ritual) combat over their differing "faiths"?
Allen MacNeill says,
I don't think there is a "gene for religion". Instead, what I think may be the case is that there is an evolved capacity in the human mind that predisposes us to having religious experiences.
Let's try and clarify what you mean by "religious experience."
Is this "evolved capacity" any different than the one that makes it easy for humans to believe in UFO's, homeopathy, psychics, and weapons of mass destruction in Iraq?
Why do you specifically refer to a special "religious experience" as an adaptation? Could you use the word "gullibility" as a synonym, as in: humans are predisposed to be gullible? How about "irrational"? Can that work as a synonym for "religious experience", as in: "irrationality is an adaptation"?
The range of religious experiences in human cultures covers such a disparate group of beliefs that it seems strange to lump them all together as a single type of evolutionary adaptation. I don't see that the Christianity that provoked the inquisition has much in common with the animism of native North Americans or the philosophy of Buddhism.
Furthermore, your assumption of the adaptive value of "religious experience" doesn't jibe with the existence of non-believers throughout history, does it?
Let's do a simple thought experiment. Imagine a time 100 years from now when belief in supernatural beings has disappeared from Western societies. How would you explain that? How could people abandon the "religious experience" if it is "an evolved capacity in the human mind that predisposes us to having religious experiences"?
I can see how we could reject a meme or a cultural tradition but aren't you saying that religion is something more than that?
Just idling my curiosity ...
irrationality is an adaptation
Why not? Isn't the "fight or flight" reaction irrational by definition? And doesn't that engender a lot of "end product" behaviors not particularly closely tied to the original adaptation?
Furthermore, your assumption of the adaptive value of "religious experience" doesn't jibe with the existence of non-believers throughout history, does it?
Huh? Do even strong adaptationists expect selection to result in the spread of a trait throughout 100% of the species? Especially with such a young species as H. sapiens? What about frequency-dependent selection?
Larry asks some good questions. I'll try to respond to them here:
Is this "evolved capacity" any different than the one that makes it easy for humans to believe in UFOs, homeopathy, psychics, and weapons of mass destruction in Iraq?
You lump together a group of concepts, some of which are related and some not. For example, I think a strong argument can be made for an evolved human capacity to believe in UFOs and psychics (and perhaps homeopathy), on the basis of the kinds of cognitive predispositions that Boyer discusses in Religion Explained. To be specific, many believers in all three of these phenomena cannot be diverted from their belief by disconfirming evidence. On the contrary, every piece of evidence against such phenomena is usually transformed into evidence in their favor.
The same is, of course, true for religion, but not for "belief" in the existence of WMDs in Iraq. The latter is either the result of paranoia or having been misled using pseudo-empirical evidence by someone in a position of authority. Furthermore, now that we know that the claims about WMDs in Iraq were false, few people continue to believe in them, unlike the cases for UFOs, psychics, and homeopathy.
Why do you specifically refer to a special "religious experience" as an adaptation?
Because, as William James and others have pointed out, "religious experiences" have several characteristics that set them apart from other emotional and cognitively generated experiences. Boyer and Atran have carefully described these in their books, and have suggested evolutionary reasons for why such experiences show such similarities across cultures and among different religions.
Could you use the word "gullibility" as a synonym as in; humans are predisposed to be gullible?
In my experience, gullibility is "curable" via the application of rigorous logical thinking, and especially attention to arguments based on evidence. However, as I have already pointed out, religious concepts (based on religious experiences) are almost immune to rational argument, and therefore do not qualify as a form of "gullibility".
How about "irrational"? Can that work as a synonym for "religious experience" as in; "irrationality is an adaptation"?
The term "irrational" is so broad and ill-defined as to be almost meaningless. It's a term defined by exclusion, like the term "invertebrate"; that is, anything that isn't a vertebrate. Yes, it's clear that religious experiences and religious thinking are a form of irrationality, but so is gambling addiction. If you want to get logically rigorous about religious experiences, then one needs to treat them like any other empirically observable phenomenon: look for patterns and try to correlate such patterns with causes. Simply labeling religion as "gullibility" or "irrational" is to avoid thinking logically and rigorously about the phenomenon.
The range of religious experiences in human cultures covers such a disparate group of beliefs that it seems strange to lump them all together as a single type of evolutionary adaptation.
Maybe it seems so to you, but to comparative anthropologists like Atran and Boyer, and to human ethologists/evolutionary psychologists like myself, "lumping them all together" doesn't come close to describing what we do.
Consider the scientific study of animal mating systems: there are lots of different species of animals, and a tremendous diversity of mating systems. Does this mean that we cannot find commonalities between them, patterns that point to underlying evolutionary dynamics? Of course not; that's what science is all about – looking for patterns amid the "blooming, buzzing confusion" of nature.
I don't see that the Christianity that provoked the inquisition has much in common with the animism of native North Americans or the philosophy of Buddhism.
Perhaps you don't, but could that be due to the fact that you have never systematically studied religion and religious experiences in the way that trained comparative anthropologists have?
Furthermore, your assumption of the adaptive value of "religious experience" doesn't jibe with the existence of non-believers throughout history, does it?
I dealt with this objection in the first section of my paper in Evolution & Cognition. It's an extremely common mistake for people (including some not-very-well-trained evolutionary biologists) to assume that for a trait to be an adaptation it must be both pan-specific and invariant.
Nothing could be further from the actual case. Consider beak size in Galapagos finches: rather than all of the beaks being exactly the same size, the whole point is that there is considerable variation in beak sizes. Such variation is one of the three prerequisites that Darwin proposed for natural selection.
Therefore, rather than invariant pan-specificity being evidence for adaptation, what one should look for is what R. A. Fisher called "continuous variation". That is, variations in the observable parameters of the trait under consideration should approximate a normal distribution (i.e. a "bell-shaped" curve). When one observes such a pattern of variation, one can be pretty confident that one is dealing with an evolutionary adaptation.
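As an illustrative sketch of why "continuous variation" arises (my formalism, in the spirit of Fisher's 1918 infinitesimal model, not something stated in MacNeill's comment): if a trait \(z\) is the sum of many independent allelic effects of small additive size plus an environmental term,

```latex
z = \mu + \sum_{i=1}^{n} a_i x_i + e
```

then, by the central limit theorem, for large \(n\) the distribution of \(z\) across a population is approximately normal, regardless of what any single locus does. This is why a bell-shaped distribution is consistent with a polygenic trait, though it does not by itself demonstrate that the trait is an adaptation.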
So, does the capacity for religious experience show such "continuous variation"? Of course it does; there are people who have never had such experiences, and people who cannot not have them. However, the majority of most human populations have at least some capacity for such experiences, as evidenced by the fact that there is no known human culture in which such experiences are entirely absent.
Let's do a simple thought experiment. Imagine a time 100 years from now when belief in supernatural beings has disappeared from Western societies. How would you explain that?
This gedankenexperiment assumes that belief in supernatural beings will disappear in as little as 100 years. Based on an historical and cross-cultural analysis of the distribution and prevalence of religious belief among human cultures, this assumption is patently absurd.
Once again, there is no known human culture in which some form of religion does not exist. At the same time, within every known human culture there is considerable variation between the degree to which individuals have such experiences. Both of these observations point to the conclusion that such a capacity, like the capacity for language, is an evolutionary adaptation.
Consider the fact that the understanding and use of the scientific method has been a dominant form of rational thinking in the majority of western cultures for at least four centuries, peaking in the 20th century. Yet, religion and religious experiences show no signs of disappearing; certainly not in the United States, where the application of science and technology has reached perhaps its highest development to date.
How could people abandon the "religious experience" if it is "an evolved capacity in the human mind that predisposes us to having religious experiences"?
As my previous comment should indicate, people haven't abandoned such experiences at all, despite several centuries of increasingly rigorous use of scientific thinking. Ergo, assuming that they will, and then basing some kind of conclusion on such an assumption is equivalent to assuming that we will all learn to fly without artificial assistance. Yes, individuals can and do (sometimes) abandon religious thinking, but (based on the evidence) human societies have not, despite ample opportunity to do so.
To state what I have come to believe is the consensus view among people whose vocation it is to study human behavior and human cultures: religious experience and religious thinking are apparently nearly universal in our species, and almost immune to rational thinking. This argues very strongly for the assumption that we are strongly predisposed toward religious experience and religious thinking by our evolutionary heritage.
I can see how we could reject a meme or a cultural tradition but aren't you saying that religion is something more than that?
This is precisely my point: if religions or religious experiences were simply "mind viruses" as Richard Dawkins refers to them, then they should be susceptible to eradication in the same way that other memes are eradicated: by countering them with evidence, logic, and rationality. The idea that the Earth is the center of the universe was eventually eradicated in precisely this way.
However, the idea that there are entities in the universe that have properties that violate our understanding of how nature generally operates in compelling and memorable ways (i.e. "supernatural" entities, like angels, demons, devils, gods, sorcerers, witches, and disembodied spirits, to name just a few) is apparently so easy for us to acquire and so compelling once acquired that even the most logically rigorous cultures cannot completely eradicate such beliefs.
As just one historical example, religion in any form was ruthlessly persecuted as official government policy in the Soviet Union, which also used the force of government to promulgate a social ethic of culturally universal appreciation of science and technology. As we approach the centennial of the founding of the Soviet Union, two things are obvious: it no longer exists, but the religions that it attempted to suppress for almost a century are flourishing.
Don't get me wrong: I am not in any way a "religious believer". However, I have a very healthy respect for the power of religious experiences and religious thinking to alter people's lives, for good and for evil, in the same way that sex alters people's lives. In the same way that churches have reinhabited the former Soviet Union, religious monasteries in which celibacy was an absolute rule have all but vanished in most countries of the West. Both of these phenomena point to the conclusion that the capacity for religious experience, like the capacity for sexual experience, is an evolutionary adaptation that is deeply, perhaps ineradicably, embedded in the human psyche.
Allen MacNeill says,
Let's do a simple thought experiment. Imagine a time 100 years from now when belief in supernatural beings has disappeared from Western societies. How would you explain that?
This gedankenexperiment assumes that belief in supernatural beings will disappear in as little as 100 years. Based on an historical and cross-cultural analysis of the distribution and prevalence of religious belief among human cultures, this assumption is patently absurd.
Once again, there is no known human culture in which some form of religion does not exist. At the same time, within every known human culture there is considerable variation between the degree to which individuals have such experiences. Both of these observations point to the conclusion that such a capacity, like the capacity for language, is an evolutionary adaptation.
Most of Western Europe has gone from 100% Christian to almost 50% atheist in less than 100 years. The number of non-believers is increasing, so it's not unreasonable to think that there will be very few believers (perhaps 10%) by the end of the 21st century. How do you explain that? Is it just the ones with the fewest god genes that are surviving?
ReplyDelete"How much of this tendency is genetic and how much is learned is perhaps interesting, but not particularly relevant"
It's very interesting MacNeill would say that, since by doing so he has immunized his "evolutionary" explicative framework from any serious evolutionary discussion.
One has to wonder: if religion were found to be genetically determined, would MacNeill continue to wave it away? Wouldn't it fit right in with his explanation, making it all the more biological and evolutionary? Hot stuff!!!
But this is never going to happen. Any genes common to religion and war are also likely to be common to polka-dancing. There is nothing specifically genetic to it. It's just one of many possible outcomes of the very plastic human brain. Genes make this plasticity possible but have no role in specifying the particular outcome of religion.
I think MacNeill's idea that religion and war can get into a positive feedback loop with each other is interesting. His explanation of it in "evolutionary" terms is unnecessary, and by pitching it at a pre-cultural level it ends up insufficient and unsatisfactory. The more conventional, area-specific terms of the social sciences are required for a satisfactory understanding, and they may well not include biological explanations that are too basic and unspecific to shed much light on this particular topic.