Why are creationists creationist?

Where two principles really do meet which cannot be reconciled with one another, then each man declares the other a fool and heretic. [Ludwig Wittgenstein, On Certainty, 611]

A question I have wondered about for a long time is this: why do people become creationists? I mean, nobody is born a creationist (or an evolutionist, or a Mayan cosmic-cyclist, etc.). These are views that one acquires as one learns and integrates into society. But we live, notionally, in a society in which science has learned more about the world in 300 years than in the prior million or so. So, why do people become creationists when the bulk of modern thought depends so heavily on science?

We can treat this as a general question about those who reject some or all of science - global warming skeptics, HIV-deniers and so forth. "Creationist" here is a stand-in for this general phenomenon of opposing science in modern society.

I ask this question as someone who once was a creationist, although I was troubled by it at the time. I know why I took it up, and why I abandoned it, but I wonder whether my experience generalises. To work through this, I'm going to make a rather shocking assumption - creationists are being rational in their choice of world view.

I know, I know, we usually attack them as being ignorant and irrational, but suppose that they aren't. Suppose that every step in their doctrinal development is the best available choice for them at the moment. In short, suppose that they are acting as rational agents. What then?

Well, before you get all bent out of shape, or start preening, depending on your point of view, let me do some definitions, and present the epistemological foundation for this. "Rational" means different things to different people, as you might expect. In traditional philosophy it means something like "making the best inference on the basis of the best evidence". So an ideally rational agent is one who has all the time in the world to gather and evaluate evidence logically, on the basis of the best available goals, using a "wide reflective equilibrium" method, which means making sure that all your epistemic commitments are maximally coherent with each other. Let me know if you meet this guy.

Of course, traditional philosophy doesn't require that we are ideally rational, only that we asymptotically approach that ideal, in order to qualify as "rational". We are expected to take into account future expectations as well as present ones, and think our views through in hard detail. Let me know if you meet many of these guys, either. Cognitive psychology indicates that far from being the, or a, norm, almost nobody is rational in this sense.

This is worrisome. What use is a notion of "rationality" as an explanation of epistemic choices if nobody actually does it, or even seriously approaches it in practice? Are we left with a view of reason that only Marvin the Paranoid Android can attain (no wonder he was depressed!)? Others have thought this equally worrisome, including Herbert Simon, and more recently the ABC Group set up by Gerd Gigerenzer and Peter Todd at the Center for Adaptive Behavior and Cognition at the Max Planck Institute for Human Development in Berlin. Todd is now at Bloomington, Indiana.

Gigerenzer and Todd propose what they call, following Simon, "bounded rationality". This is the rationality one has when time is limited and one must make quick, or as they call them, "fast and frugal", heuristic decisions based on limited information. The basic idea is that in cases where you have to make environmental inferences, such as running away from a predator, to use the classic example, there is not sufficient time to consider all the information and come to a wide reflective equilibrium. So you use cues, shortcuts and general heuristics. That is, your "rationality" is bounded by resource limitations - of time, of evidence, and of cognitive limitations of memory and processing, all of which add up to a cost. As Quine once said, "Creatures inveterately wrong in their inductions have a pathetic, but praiseworthy, tendency to die before reproducing their kind". So what to do? Use heuristics that work.

Which ones, though? Gigerenzer and Todd identify a number of strategies that people appear to follow, when trying to predict, for example, which things are more common in the environment. They use a "take the best" strategy - when you run out of time or resources, simply take the best available solution you have so far encountered. This has a number of steps to maximise one's "return" on cognitive investment; a rough sketch in code follows the list.

1. The Recognition Principle - if only one of two objects is recognised, take the one you know. If neither is, randomly pick one. If both are, go to Step 2.

2. Search for Cue Values - recall the cues that are associated with these two choices.

3. Discrimination Rule - decide whether the cue discriminates between them.

4. Cue-substitution Rule - if it does, stop searching for cues. If not, go back to Step 2 and pick a new cue.

5. Maximising Rule for Choice - choose the object with the positive cue value.
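
To make the heuristic concrete, here is a minimal sketch of the five steps in Python. It is only an illustration, not Gigerenzer and Todd's own formulation: the function name, the binary cue values and the way recognition is represented are simplifying assumptions of mine.

    import random

    def take_the_best(a, b, recognised, cues, cue_order):
        """Choose between two objects using recognition plus a 'take the best' cue search.

        a, b       -- the two objects to choose between
        recognised -- the set of objects the agent has heard of
        cues       -- dict mapping each object to a dict of binary cue values (1 positive, 0 negative)
        cue_order  -- cue names ranked from most to least trusted
        """
        # Step 1: the recognition principle.
        if (a in recognised) != (b in recognised):
            return a if a in recognised else b      # take the one you know
        if a not in recognised and b not in recognised:
            return random.choice([a, b])            # know neither: guess

        # Steps 2-4: search cues in order until one discriminates.
        for cue in cue_order:
            va = cues.get(a, {}).get(cue, 0)
            vb = cues.get(b, {}).get(cue, 0)
            if va != vb:                            # the discrimination rule
                # Step 5: maximising rule - take the object with the positive cue value.
                return a if va > vb else b

        # No cue discriminates: fall back to a guess.
        return random.choice([a, b])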

The original version dealt only with inferences that were ecologically valid, that is, in which the cue correlates with the "right" outcome, or correctly predicts the target. This is sometimes called ecological rationality. But there is another kind, which I will call social bounded rationality ("social rationality" has a somewhat different meaning in economics and cognitive psychology generally). The recognition principle indicates how this applies. If you know of two city names, but one is more familiar to you, you may infer that it is the bigger, a case study done by Gigerenzer and Todd. And you will be right, most of the time. But why is that? The answer is that you hear more from your social contacts, including the media, about the bigger city, so there is a correlation between the cue and the information.
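
Run against the sketch above, the city case looks something like this; the cue values, and the assumption that the agent has heard of only one of the two cities, are invented purely for the example.

    # Hypothetical data: an agent who has heard of Munich but never of Bielefeld.
    recognised = {"Munich"}
    cues = {
        "Munich":    {"top_league_football_team": 1, "is_state_capital": 1},
        "Bielefeld": {"top_league_football_team": 0, "is_state_capital": 0},
    }
    cue_order = ["top_league_football_team", "is_state_capital"]

    # Which city is bigger? Recognition alone settles it: take the city you know.
    print(take_the_best("Munich", "Bielefeld", recognised, cues, cue_order))  # -> Munich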

Using the social cues helps you to deal with a large number of previously unencountered decisions based on small sample sets. And it is a remarkably accurate way to do this. But it is less reliable about environmental facts that are outside the usual experience of your society. Sometimes, this inference leads you to choose false alternatives.

When a child enters into the conceptual world of its society, it relies heavily on these social cues for the basic epistemic commitments on which it bases its further conceptual development. In particular, it relies on a series of social authorities, ranging from parents and peers to wider social sources such as teachers, and authority figures like political leaders, public intellectuals, and religious figures.

If these authorities are authoritative because they have glommed onto the facts of the environment, then they are a reliable guide, but often they are authoritative for reasons other than factuality. Even science students at university level are taught concepts that are authoritative for distal reasons. Textbooks will repeat errors of fact or out-of-date science because the costs of keeping them fully accurate are such that a little error, or a fashionable view of the authors, can be tolerated.

Nobody is born knowing much. So a fast and frugal inference allows us to quickly gather the conceptual equipment to engage in our society, but this means that we will accept many things that are "true" solely because the environment being cued here is the social environment, not the environment of the nonsocial world. A case in point is the "common sense" view that we are in control of our actions, when neurology shows us that the "ego", if it exists, tends to construct rationalisations for actions after the action is precipitated.

I mentioned conceptual development above. This was deliberate. The learning process is just as much a developmental process as the development of legs or puberty. One major difference, though, is that it may seem that we can backtrack in our conceptual development, sometimes called "unlearning", in a way that we cannot do in biological development. In biology, development involves the triggering of differentiation of cells, so that pluripotent stem cells give rise to lineages that become progressively less able to turn into other specialised cell types. Some evidence exists, though, that our cells can in special cases be triggered to produce more potent, less differentiated cells, but overall, once a cell has become a neuronal cell, it won't turn into a glial cell or a haemopoietic cell.

But it seems to me that our conceptual development is not so different from this. Overall, once we have embedded an epistemic commitment in our conceptual set, the more deeply embedded it is, the less likely it is to be revised as we develop. The last in will be the first out; but the deeper you go, the less likely it is that we will abandon that commitment.

So let us consider conceptual development in terms of a trajectory of changes that occur from early childhood to maturation. In the next post, I will set up the conceptual "spaces" in which proscience views and "folk tradition" views develop, and argue that, if we are to teach people science, we need to change the way in which we teach it, and why.

For whatever reason, some people (at any point in life) identify with a group--perhaps because of one large idea held by the group. But along with that one large idea, they take on the baggage of many smaller ideas that are also part of the group's identity. I have a particular case in mind, where someone may have joined the group because it provided a major explanation (i.e., scapegoat): "My life sucks because of those atheistic liberals." The additional baggage includes anti-evolutionism, which is odd because this person used to be quite interested in ancient life and how it changed over time.
But any explanation is likely only one of many.

I have a corresponding theory. Everyone has a fish-eye view of the world, with different radii corresponding to "raw intelligence and rationality" and centered on domains that are important to them. These domains are directed by both personal and social psychology.

So we all apply our limited cognitive skills differently. You see "science" as an overarching value somehow, but that's not how it's perceived generally. (Perhaps science's concrete achievements are, but that's a different matter.) I guarantee that whole areas of life you find uninteresting are, to someone else, full of wonder, knowledge, and importance; they marvel at your limited vision and shallow unconcern.

A conclusion based on this theory has me profoundly depressed: our technologically-driven wealth has allowed us to drift away from social definitions of domain importance, into more purely solipsistic, contingent, and idiosyncratic domain rankings.

It seems to me that if you are born and raised into a particular worldview (e.g. creationism) you are going to go through a number of years where you do not have the education or experience to really think about why you believe it. You believe because that's what you were taught to believe. As you get older and begin learning more about your worldview, and the rest of the world, you are either going to maintain your beliefs or start questioning them. How people decide between the two, I don't know. I think if you move towards a scientific or historical field, perhaps you'll be exposed to more contrary beliefs, thus opening you up to questioning the worldview you were raised in.

So it basically comes down to indoctrination of the young? That would seem to square with my observations, though I would add that there are often conflicting principles, and as people become more and more aware of the evidence for evolution, the cue that "science is right" increasingly wins out over the "the Bible is literally true" cue, which is how many former creationists come to be evolutionists.

This is why it's important to keep the message available to the public -- no controversy is ever entirely over. We have often tended to retire to our research, thinking that the public has been overwhelmingly convinced of the correctness of evolution over creationism. We forget that philosophical contention in the social realm never ends, requiring never-ending attention from advocates.

It's also important to realize that a good deal of our everyday "decision-making" is subverbal or completely subrational. Emotions in general, infantile "imprints", pheromonal cues, habits, even some instincts... a lot goes on beneath the surface of our minds. In contrast, the scientific idea of "good evidence" is learned as an abstract (though hopefully habituated as well).

By David Harmon on 10 Sep 2006

Ah, your philosophical training makes you write everything better than I could ever dream to. But you may remember I tried to sift through a variety of possible causes for being/becoming a creationist here.

I've posted a comment to this effect previously, I think, but...
One of the main things I seem to see correlated with science-denial (and especially - but not only - religion-based science-denial) is an inability to tolerate uncertainty.
I can't say whether strong religious beliefs cause one to lose the ability to cope with uncertainty, or whether an inability to cope with uncertainty tends to drive one towards religiousness, but either way, that seems to be the main difference between scientific thinking and religious or other anti-science thinking.
A religious belief is absolutely certain, even if it is wrong (and at the very least, one has to recognize that absolutely certain religious beliefs from conflicting religious traditions can't all be correct, even if one of them actually were The Truth™. The point is, all of these religious beliefs are equally Absolutely Certain nonetheless.). By contrast, scientific facts always, by their very nature, have a degree of uncertainty left in them even when they are well-tested and supported by evidence. (e.g. the "theory" of Evolution, the "theory" of Gravity, the "theory" of the Horrendous Space Kablooie ("Big Bang" sounds so plebeian), etc.)
Those of us who are comfortable dealing with the fact that the natural world is uncertain are not bothered by having to amend our beliefs as new evidence comes in. Those who are not feel a need to cling to any Certainty (religious or otherwise) they can attain, no matter what. ("No, no, I'm certain autism is caused by vaccinations!" "No, no, I'm certain that what I was taught in "Jesus Camp" about the book of Genesis is True™!", etc.)
P.S. on an unrelated note, what the heck is this blog system's problem??? It does not appear to accept <abbr>, <strike>, <div>, or <span> tags, presumably among many others...(Is there a listing somewhere of the pathetically-limited HTML tags permitted in the system somewhere?)

From Richard Dawkins' TV documentary "The Root of All Evil" (which is about religion) I learned that children are genetically programmed to believe everything that their parents say. "Children do not question what their parents say, for survival reasons; i.e. if their parents say that it is dangerous to go near the cliff, they believe it; those children that did not have this genetic trait died off."

I told this to a girl, and her comment was that her parents had told her that she would get green spots in her hair if she washed it with soap; she believed this for a long time into adulthood.

So if tradition says this and that to children, then that tradition lives on.

Now that read was worth my day's subscription to broadband: I'm going to have to print it off to read again. Once a creationist is formed (presumably pretty young) there must also be peer pressure to keep thinking that way: maybe an important question 2 is what keeps someone a creationist? As the only atheist in a very religious family (one nun, two priests) the family and peer pressure to conform and agree is immense - even to pretend to conform and agree. Most of my family have serious doubts about their Catholicism, but won't voice them or act out of fear of upsetting or letting down siblings and friends they think are firm in their faith and who are a large part of their social networks. The Pope's recent flirtation with ID aside, it was possible to be a Catholic reconciled with evolution as long as you accepted that a human embryo has a moment of 'ensoulment'. As a fundamentalist who believes that the Bible is the literal revealed word of God, you HAVE to believe in Genesis, because if you reject that cornerstone, the whole bloody lot comes crashing down. Looking forward to part 2.

Coturnix, that is a hell of an essay, but much broader in scope than I am trying to deal with in this series. You have given me some food for thought, though. Don't know how I missed it the first time, but I was travelling at the time.

PS: Did you meet a guy called Stefan Linquist at Duke? He's my colleague here now.

It seems that some people are just DUTIFUL. They feel strong bonds of loyalty to institutions and authorities and scrounge for arguments to defend them. I think we're all like that a little bit, but Creationists and the Religious Right seem particularly driven by a sense of loyalty/duty. I didn't come up with this notion on my own. It comes from reading Bob Altemeyer-derived stuff on Authoritarian Personalities and reading Enlightenment critiques of religion. I don't know how much weight either carries.

I've never been a Creationist (and hope John will post on what that was like), but I did grow up in the Deep South surrounded by fundamentalists and evangelicals, including half of my extended family, so I know them pretty well from the outside. Or did.

What strikes me about Creationists is how fiercely they're trying to defend whatever it is they're defending (usually by attacking perceived enemies). They say they're defending the Bible, and I used to think that's what it was. It seemed sort of rational and democratic. All the answers in one book that anyone can read themselves. It made sense that people would defend truth in the hand against truth of the experts off at some University. But the fact is, they don't really read the bible that much--except what's assigned in bible study classes--and they don't usually know the bible that well. So I don't get what they're defending.

I don't buy the notion that they're just crazy people infected by the cancer of irrationality. I feel like they have at root some actual grievance--wrapped in nonsense, yes--but some real grievance. I think they've gone so far into Dominionist-type stuff that their root grievances may be beyond redress. But they're defending SOMETHING.

John, you were a Creationist. Does this sound right at all, or am I way off?

Thank you. I am looking forward to your future essays on this topic as this is something that really interests me a lot.

I know Stefan very well. Say Hi to him when you see him next time - we both attended the Duke Philosophy of Biology group for many years (I've been remiss at going there for the last two years, yet I dutifully download and read the papers every two weeks anyway!)

Not entirely sure where you're going with this, John (that's a good thing, I suppose, or there wouldn't be much point in your going on). One thing you might like to think about is the literature on the epistemology of testimony. As many philosophers have pointed out, if testimony isn't a way of acquiring knowledge, then we know precious little. Even scientists are heavily reliant upon testimony: have you checked the algorithms you use? Can you, even in principle? What about the machines you use, and the algorithms they implement? What about the chemistry you rely on, and so on.

Of course, your topic is rationality and not knowledge (BTW, I disagree that cognitive psychology shows that we're not rational. I think it shows that there are systematic deviations from rationality - some of them important - but I doubt that it shows we're not rational overall). It's clear that the testimony of creationists is not a good means of gaining JTB, because it fails the T test. But it can be rational to believe something that's not true. Suppose you think that rationality is a normative enterprise, in the following, strong, sense: it is rational to believe something when you don't infringe epistemic duties in believing it (I model this on the deontological conception of knowledge). In that case, you might take blameworthiness as a guide to epistemic duty, and from that you could conclude that (since the young person has no reliable means of distinguishing good epistemic authorities from bad ones that could be used to distinguish between the creationist and the evolutionist), it can be rational (because not blameworthy) to accept their testimony. From that point on, Bayesian considerations work to maintain the belief.
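
To put a toy number on that last Bayesian point (the figures below are invented purely for illustration): once testimony has installed a near-certain prior, ordinary conditionalisation erodes it only very slowly under mildly contrary evidence.

    def update(prior, likelihood_if_true, likelihood_if_false):
        """One Bayesian update: P(belief | evidence) from P(belief) and the two likelihoods."""
        numerator = prior * likelihood_if_true
        return numerator / (numerator + (1 - prior) * likelihood_if_false)

    # Hypothetical: trusted testimony installs a 0.99 prior in the belief.
    p = 0.99
    for i in range(10):
        # Each observation is only slightly more probable if the belief is false (0.5) than if it is true (0.45).
        p = update(p, likelihood_if_true=0.45, likelihood_if_false=0.5)
        print(f"after {i + 1} mildly contrary observations: P(belief) = {p:.3f}")
    # Even after ten such observations the belief is still held at roughly 0.97.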

Now, I think that science is in fact a case study in the rationality of knowledge gained by testimony (and the social production of knowledge generally). But on the assumption that epistemic duties are a good guide to rationality, it might regularly be the case that a conscientious agent is not in a position to see that this is the case. For me, this is just further evidence for responsibility scepticism, which is my default view.

Well Neil, we'll see how well it goes. The next post is up.

I'm trying to address one, and only one, question here - are creationists (boundedly) rational in their epistemic commitments? My answer is yes, and that this will inform us how to respond, educationally and politically, to their challenge to science. I do not think rationality is (necessarily) normative. In fact, what I'm doing here is simply descriptive (although you may think I have lost the plot in that respect).

lockean asked what it was like to be a creationist. Well, it was a time of serious internal conflict for me. I held a number of (it turned out) contradictory views, and the tribulation of trying, and failing, to reconcile them led me eventually to rely on the authoritative opinions of those in the Christian tradition who told me there was no conflict, and so I tried to reconcile it that way (in the end I finished with a Tillichian/neo-Thomist view of creation, which satisfied nobody but me. Shortly afterwards, and not because of that, I lost my faith and the issue became otiose).

Marieke Saher has written a doctoral thesis about beliefs on food and health; from the abstract:
Paranormal believers as well as believers in alternative medicine were much more liberal than skeptics in violating categorical boundaries and attributed, for example, intentionality (mental) to body growth (biological) and life (biological) to energy (physical).
http://ethesis.helsinki.fi/julkaisut/kay/psyko/vk/saher/abstract.html

Believers were not necessarily ignorant about food, so pouring more science data on them doesn't help to "cure" those beliefs... this could apply to many creationists too. We've seen in forums many times how telling facts doesn't "cure" at all...

By MrKAT, Finland on 14 Sep 2006

Greg:
"It sounds like a cliche, but it really is not possible to understand what it's like to be infected by religion without ever having been thus infected yourself. It is painfully clear in some passages that Dawkins simply has no idea what it's like to be a person of faith, how different priorities and standards of "evidence" apply."

How do you explain that Dawkins had two religious periods then? ( http://en.wikipedia.org/wiki/Richard_Dawkins ) Isn't it more likely that people have different (religious) experiences, and that memories are at least as much a construct as consciousness is, which is what neuroscience seems to say?

Dawkins may or may not have had different standards in a bounded rationality, and he may or may not remember this correctly. But he may also be discussing ideal rationality, i.e. making a coherent view.

For an interesting idea of how cognitive dissonance may extinguish one of several conflicting epistemic commitments in a person's bounded rationality, see Wilkins. ( http://scienceblogs.com/evolvingthoughts/2006/09/why_are_creationists_c… ) It may be difficult to reach back and analyse one's consciousness before a conversion to make comparisons.

Kristine:
Your review so far is apt, beautiful, concise. You complement PZ on Dawkins' perspective with Dawkins' polemics, so now I want to read this book too. (TL adds another book to the growing wish list.)

By Torbjörn Larsson on 27 Sep 2006

Oops! Commented in the wrong tab - but at least Wilkins sees that someone references his posts. :-)

By Torbjörn Larsson on 27 Sep 2006