Tribalism, Cultural Cognition, Ideology, we're all talking about the same thing here

From Revkin I see yet another attempt to misunderstand the problem of communicating science vs anti-science.

The author, Dan Kahan, summarizes his explanation for the science communication problem, as well as 4 other "not so good" explanations in this slide:
[Kahan slide]

He then describes "Identity-protective cognition" thus:

Identity-protective cognition (a species of motivated reasoning) reflects the tendency of individuals to form perceptions of fact that promote their connection to, and standing in, important groups.

There are lots of instances of this. Consider sports fans who genuinely see contentious officiating calls as correct or incorrect depending on whether those calls go for or against their favorite team.

The cultural cognition thesis posits that many contested issues of risk—from climate change to nuclear power, from gun control to the HPV vaccine—involve this same dynamic. The “teams,” in this setting, are the groups that subscribe to one or another of the cultural worldviews associated with “hierarchy-egalitarianism” and “individualism-communitarianism.”

CCP has performed many studies to test this hypothesis. In one, we examined perceptions of scientific consensus. Like fans who see the disputed calls of a referee as correct depending on whether they favor their team or its opponent, the subjects in our study perceived scientists as credible experts depending on whether the scientists’ conclusions supported the position favored by members of the subjects’ cultural group or the one favored by the members of a rival one on climate change, nuclear power, and gun control.

Does anyone else think that maybe they're unnecessarily complicating this? First, denialism is not an explanation for the science communication problem. It is a description of tactics used by those promoting bogus theories. Denialism is the symptom, ideology is the cause, and what we consider ideology seems more or less synonymous with this "identity-protective cognition", while being less of a mouthful.

Call it what you will, when you have ideology, or religion, or politics, or other deeply held beliefs which define your existence and your concept of "truth", conflicts with this central belief are not just upsetting, they create an existential crisis. When science conflicts with your ideology, it conflicts with who you are as a person, how you believe you should live your life, what you've been raised to believe. And, almost no matter what ideology you subscribe to, eventually science will come into conflict with it, because no ideology, religion, or political philosophy is perfect. Eventually, they will all jar with reality. And what do most people do when science creates such a conflict? Do they change who they are, fundamentally, as a person? Of course not. They just deny the science.

Denialism is the symptom of these conflicts, and this is where the problem with the term "anti-science" comes in. Most denialists and pseudoscientists aren't against science as the term suggests. I think of "anti-science" as being in conflict with established, verifiable science, without good cause. But most people read it as being against science as some kind of belief system or philosophy, which it usually isn't. And while some people do promote the "other ways of knowing" nonsense, for the most part, even among denialists, there is acceptance that the scientific method (which is all science is) is superior at determining what is real versus what is not real. That is why they are pseudoscientists. They try to make their arguments sound as if they are scientifically valid by cherry-picking papers from the literature, by using science jargon (even if they don't understand it), or by pointing to fake experts that they think confer additional scientific strength to their arguments.

They crave the validity that science confers on facts, and everyone craves scientific validation of (or at least consistency with) their ideology or religious beliefs. It sucks when science conflicts with whatever nonsense you believe in because science is just so damn good at figuring stuff out, not to mention providing you with neat things like longer life expectancy, sterile surgery, computers, cell phones, satellites, and effective and fun pharmaceuticals. This is why (most) pseudoscientists and denialists insist that the science is really on their side, not that science isn't real, or that it doesn't work. We know it works; the evidence is all around us. You are using a computer, after all, to read this. Anti-science as a term is too frequently misunderstood, or inaccurate.

Pseudoscientists and denialists don't hate science; hatred of science isn't what makes them anti-science. They crave the validity that science confers, and want it to apply to their nonsense as well. Sadly, for about 99.9% of us, at some point, science will likely conflict with something we really, really want to be true. What I hope to accomplish with this blog is to communicate what it looks like when people are so tested, and fail. And I suspect the majority of people fail, because in my experience almost everyone has at least one cranky belief, or bizarre political theory. Hopefully, when people learn to recognize denialist arguments as fundamentally inferior, they will be less likely to accept them, and when it's their turn to be tested, they will do better.


Can you explain more about why you think the researchers on this topic are complicating things? I can't understand how your points relate to the actual research done by Kahan and his collaborators. Can you elaborate on which of his research studies you've read, and what you feel is missing or too complicated?

I hate to sound judgmental here, but he's the one doing actual research in the area. What weaknesses have you seen in his published research papers that lead you to these conclusions?

I am not disagreeing with the research, I'm disagreeing with the jargon! Why say in three words what you can say in one? Just call it ideology. I like his research, I think it's neat, I mostly agree with his conclusions. I think it validates what we've been saying is the likely cause of denialist argumentation.

The misunderstanding of the science/anti-science debate I was critical of was from his presentation, which suggested denialism as an explanation for the "science communication problem". I don't think anyone suggests denialism as an explanation; it's a symptom. No pseudoscientist sets out to be a denialist, but those are the kinds of arguments one is left with when one lacks facts. I think he mischaracterized the denialism thesis, which I think fits 100% with his work.

My complaint is that every time I see the word "cognition" I want to scream. Add "cultural" in front of it and the effect is amplified. Why use labels like "hierarchical individualists" and "egalitarian communitarians"? I'm not particularly stupid or uneducated, but these terms are just meaningless to me. Why do academics feel the need to jargonize simple concepts so that they're more turgid? It's like they're being purposefully obtuse to make the work sound more important. I'd say it far more simply. Ideological extremity causes people to behave in an irrational fashion with regard to risk and adversely affects their judgment of the validity of scientific information, as they will choose "their side" over the argument backed up by facts and data. I think this is actually pretty obvious, and as I'm a big fan of Jonathan Haidt (who does a great job dropping the jargon), I think it follows from how humans actually appear to form opinions. The ideology comes first, and reason is used to dig in. It's all backwards. Training in science and scientific thinking in general is very counter-intuitive and unnatural for humans; this is likely why it's both uncommon and frequently discounted by the majority who are just cruising on their ideological short-bus.

We've talked about Kahan's research here before (see here), and I admit I'm a fan even though I do have some criticisms about how far you can extend his conclusions based on the questions asked in his study. I still am not entirely convinced that liberals are more inclined than conservatives to accept scientific information to inform their risk assessments, because the question was regarding nuclear power. A better question would have been to ask about GMOs... oh wait, he does! Surprisingly, GMOs seem to set off both sides of the ideological divide, hrmmph. I'm sure there is some legitimate bogus risk assessment you can get the liberals to flip over that's better than nuclear power. I'll have to think about it some more.

You have pegged my number one complaint about scientific papers. There seems to be something in the training of such people, or in their dealings with others in the same field, which renders them incapable of stating things in terms that someone with normal, or even above-average, intelligence can comprehend, rather than only people in their own field.

You want my take on the "problem" with the communication of science? The fact that, had I taken different courses, I could have become one of them, but, because I chose computer science, and don't have all the math, and the complicated dictionary in my head, and so on, I need a damn translator to comprehend some of the stuff they write. Someone who barely passed high school... doesn't have any damn chance to understand even a fraction of what I do.

The number of people capable of, and willing to, use common language and understandable explanations... is a bit like finding English lyrics for Japanese pop music. Someone has probably done it right. But... you wouldn't believe the "professional" attempts made to produce such versions by the actual companies that sell the originals. It's not that they use the wrong words, so much as too few, or too many, and it resembles someone looking through a dictionary, trying to find the most improbable combination of words possible, while still, more or less, if you squint hard, actually saying the same thing.

Kudos. I'm all for less needless academic jargon. It really doesn't contribute to anything except egos.

Just on something you said above. It's merely anecdote, but all the denialists I know personally - the anti-vaxers, the 911 truthers, the microwaving-water-makes-it-poisonous crowd, etc. - are either post-modernists who are openly anti-science in the sense that they deny the value of the scientific method, or they are anti-science in the sense that they have mystic, supernatural beliefs and they see science as a threat unless it is drastically reinterpreted to incorporate woo. In this sense, I'm quite wary of conspiracy theorists and denialists who claim to be big fans of science. I've found what they call "science" is invariably remote from the scientific method; usually it boils down to making appeals to authority, because they believe that's what science is. Most people in general aren't anti-science, but the people I know who are into pseudoscience definitely, literally are "anti-science".

By Matty Smith (not verified) on 02 Nov 2012

I study "cognition" (specifically, cognitive development), so I admit that I'm a little taken back by the fact that you hate the term so. I'm guessing you hate when it's used for evil (i.e., making concepts seem overly complicated) than for good (i.e., in limited amounts that don't mean anything more or less than the word implies).

I admit being a bit bogged down by the terms Kahan uses as well. I think he's trying to clarify the different kinds of backgrounds that people can bring to the table when they are evaluating claims from others, but the terms could be more transparent.

Part of my job as a teacher and mentor is to encourage my students to clearly and concisely state their ideas. We spend some time in my classes reviewing how science is communicated in the public, and it's both fascinating and frightening how the process goes wrong. And part of the problem is certainly on our end: we make claims that are unclear and leave room for doubt.

I disagree to the extent that "ideology", like "denialism", is a loaded term. The use of a phrase like “Identity-protective cognition” takes away the potentially pejorative sense. A religious person would likely take exception to their beliefs being dismissed as ideology.

There are ideological fads in science too. Think of eugenics and other forms of social darwinism, as well as almost the entire field of neo-classical economics, Austrian economics, Marxism, Freudian psychology, Gaia Theory, etc. All of them elaborate castles in the air that don't seem to be too bothered by any connection to reality. Karl Popper, the Austrian philosopher of science, came up with the idea of basing science on refutability. Many of these pseudo-sciences are not refutable; that is, no matter what happens, it never disproves this type of theory. Therefore, Popper reasoned, these theories are not telling us anything.

On the other hand Gaia Theory led directly to Earth Systems Science, based on the idea that the physical earth and living systems are constantly interacting and changing each other. This has revolutionized the science of geology.

By Charles Justice (not verified) on 04 Nov 2012

Jargon always starts out with good intentions, as a way to clarify by specifying. Invariably it leads to more confusion, as further specifying leads to jargon "splitters", while the confused reactionaries become jargon "lumpers". I'm on the side of the lumpers, myself - even relatively simple language becomes intolerably confusing since we all frequently use words for which each of us have slightly different definitions. "Inigo's Observation" should be in daily use for all of us. Better for all of us to reflexively specify what we mean when we talk - even among members of the same group, terms can unintentionally mislead.
Also, too, I believe you're on the right track with this "it's all the same thing!" approach; it's like "racism", which is merely one of the many forms of "groupism" we show. Anything we use to distinguish "Us" from "Them", whether it's language, skin color, ideological (=religious) beliefs, or any other detectable characteristic, operates under the same set of rules. Basically, "I'm normal and right; you're weird and wrong (and probably dangerous)". It's a hind-brain thing, and easy to overlook.

By saying "ideology" is the problem, you are dangerously close to saying that having principals is a problem. Likewise, you are broad-brushing all ideologies, and they are not created equal. Liberalism and democracy (in the broad global, not narrow US sense) are ideologies which promote human freedom. And freedom of thought is needed for the scientific method. Compare to fascist or stalinist states, where science was always bent to the needs of ideologues in power. We cannot say "having an ideology is dangerous to scientific progress" just so. That statement, at the very least, needs some huge qualifications.

There are ideological fads in science too. Think of eugenics and other forms of social darwinism, as well as almost the entire field of neo-classical economics, Austrian economics, Marxism, Freudian psychology, Gaia Theory, etc. All of them elaborate castles in the air that don't seem to be too bothered by any connection to reality.

But as biologists like Stephen Jay Gould have pointed out, the eugenics scientists really weren't scientists but pseudoscientists, who cherry-picked results, skewed data, and generally fudged to create the impression of differences between races. They weren't scientists who came across data demonstrating a difference between the races, but racists who wanted to solidify their superiority by fudging the science. The ideology came first, and it systematically biased their research. Read The Mismeasure of Man, it's wonderful.

By saying "ideology" is the problem, you are dangerously close to saying that having principles is a problem. Likewise, you are broad-brushing all ideologies, and they are not created equal.

This is the eternal argument when I criticize ideology on this blog. Which ideology is the least full of crap? That was the target of our original discussion of Kahan's paper.

It would be an enormous mistake to suggest there is parity between ideologies and their unscientific tendencies. Nothing is further from the truth. Right-wing Christian fundamentalism, for instance, will have far more conflicts with established science than a left-wing hippie. However, the hippie is still going to think some weird stuff about GMOs and animal research, and have some interesting anti-corporate conspiracy theories that are fed by ideology.

Certain ideologies have science built into their philosophy, which allows for a certain degree of self-correction. And right now it's true, reality has a liberal bias. But it hasn't always been the case, and I don't know that it always will be the case. After all, the anti-science postmodernist movement was the invention of left-wing academia. The war on science in the 60s and 70s was from the left. Domestic terrorism in this country and others has come from both the left and the right, from the Weather Underground to Timothy McVeigh. More recently it seems more right-wing, with the likes of Eric Rudolph and McVeigh, but that wasn't the case when it was the radical left angry about the state of the nation.

We cannot say “having an ideology is dangerous to scientific progress” just so. That statement, at the very least, needs some huge qualifications.

I don't know, I think you can live a moral, ideology-free existence, and it would be better for all of us if we engaged in more self-examination of the overvalued ideas we have banging around in our brains. I don't think of ideology as being principled or necessarily consistent with principles. If anything, I think it's the opposite. It's lazy. It's an intellectual and moral shortcut. It makes people do immoral things for the sake of some higher truth. How, after all, does a man get on a school bus and shoot a little girl in the face because she wants to educate herself? Not because he's principled, but because he's a believer of a radical, misogynistic ideology that believes in forced submission of others to fundamentalist religion. Now, most ideology is not so severe, but I think it's all as illogical. Sometimes socialistic stuff works, sometimes free markets work, sometimes some utilitarian solution is good, sometimes a utilitarian solution is awful or unfair, sometimes having some stupid tradition works. I can't say the British have done poorly with their bizarre constitutional monarchy. There isn't a political philosophy, and certainly no religion, that is always right for all people, at all times, everywhere. Not even close.

Principles are things like believing in fairness, equality, justice, the common good. Most ideologues would say their ideology is the most perfect expression of one or more of these principles, and are wrong 90% of the time.

Ideology is poison to reason. It shuts off your brain. And it's the opposite of being principled. Instead of trying to find the "right" ideology, why don't we explore things like "what works" or "what does the evidence show" or at least "what doesn't cause harm". Then again, maybe I'm just expressing a belief in an ideology of radical pragmatism.

What I take away from this is that many of the mistakes that lead people to reject well-supported science are the result of strongly identifying with a group. If there's a unifying explanation for a lot of the resistance, that's good news rhetorically, because we can build approaches that counter it.

By Michael Caton (not verified) on 05 Nov 2012

You can't live an ideology-free existence; you have to have SOME set of basic beliefs. Explaining all opinions that you don't like with "They have an ideology" is almost content-free. It has no predictive power and provides no insights into how others' opinions might be changed, if that is your goal. To take the case of climate change denialism, it isn't helpful just to say that an average (non-1%) denialist has been infected with some sort of inferior ideology. (Where do you go from there? You instruct him to change his no-good ideology; he instructs you to change yours. Stalemate.) Suppose you instead recognize that the real desire underlying his opinion, let's say if he is from a Western state, may be to preserve both his livelihood and his self-image as part of a group that is rural and self-sufficient, living in wide-open spaces and traveling freely. Whether that's really true or not, it's important to him that he can think of himself that way. Then you can ask, how can he be persuaded to care about specific environmental problems while remaining within and honoring that fundamental framework?
Usually, this bites both ways. Just as everybody has ideologies, everybody has ingroups. Kahan mentioned gun control. He may have meant to imply that Americans who want to keep their guns do so to protect ingroup identity, e.g. as rural people who hunt or should do so; certainly, many of his liberal readers will take it that way. But it's equally true that supporters of gun control may reject evidence that gun control doesn't really reduce violent crime because they think of themselves as belonging to a "tribe" of peaceful, civilized urban people who would never want to kill an animal for food or to hurt another human even in self-defense. To own a weapon would imply rejection of a belief that's important to that ingroup, threatening one's status. The 1990s-era gun-control debate was often actually a debate about cultures, with each side not just demanding legal concessions of the other but insulting their whole cultural identity and implying that they should give it up altogether. No wonder that it was so acrimonious and ultimately led to few gains for either side.

You can’t live an ideology-free existence; you have to have SOME set of basic beliefs. Explaining all opinions that you don’t like with “They have an ideology” is almost content-free. It has no predictive power and provides no insights into how others’ opinions might be changed, if that is your goal.

You misunderstand me, or are attacking a straw man. I realize that we are products of our culture, our upbringing, and group identity. The problem is when the assumptions and axioms one develops as a member of such a group come into conflict with measurable, verifiable facts; then it's clear ideology is the source of the problem. All our thinking and opinion is necessarily shaped by environment and, yes, ideology, but that's no reason we can't do better. It's no reason we should just throw our hands up in the air and say we can't improve. And further, when you come across denialism and anti-science, in most people you can almost always trace the issue back to a specific ideological conflict with reality (I think this is what Kahan's research shows systematically), although I think there is also a subset of folks with ingrained and otherwise inexplicable conflicts with reality (cranks) that may have roots in mental illness or personality disorders. And it's not opinions I don't like that are the problem, it's denialism, a specific, flawed set of tactics people use to entrench around a scientifically indefensible idea.

Via Wikipedia, on ideology:

An ideology is a set of ideas that constitute one's goals, expectations, and actions. An ideology is a comprehensive vision, a way of looking at things (compare worldview) as in several philosophical tendencies (see political ideologies), or a set of ideas proposed by the dominant class of a society to all members of this society (a "received consciousness" or product of socialization).

Ideologies are systems of abstract thought applied to public matters and thus make this concept central to politics. Implicitly every political or economic tendency entails an ideology whether or not it is propounded as an explicit system of thought.
...
Political Ideology is a certain ethical set of ideals, principles, doctrines, myths, or symbols of a social movement, institution, class, or large group that explains how society should work, and offers some political and cultural blueprint for a certain social order. A political ideology largely concerns itself with how to allocate power and to what ends it should be used. Some parties follow a certain ideology very closely, while others may take broad inspiration from a group of related ideologies without specifically embracing any one of them.

So basically, everyone's got one. Like opinions, and assholes, they're universal. The question is, given how most ideologies are different, and all are flawed, perhaps there's a better way to try to view and run the world? Is it possible to make decisions and political choices while purposefully using science, insight, and rationality to avoid falling into the traps your ideology will invariably set for you?

Thinking ideologically is what comes naturally to us, but, depending on how flawed the ideology, it periodically makes us reach the wrong conclusions, then dig in when challenged. This is exactly what Kahan's research shows. People side with the group over reality. I say the problem is ideology, group identity, tribalism, whatever the hell you want to call it; it's all bad. It's all sloppy thinking. And even when a given ideology is wrong less than some other flawed ideology, it's still going to be wrong more than the measure it's being applied to, which is verifiable scientific facts. Think about how to test this hypothesis. One could ask: if this is true, then we should predictably be able to identify, based on one's stated ideological agenda, areas in which an individual holds beliefs that are in conflict with observed reality. Hence Kahan's research; he shows everybody is susceptible to this flaw. It's also consistent with what others like Haidt and Mooney are showing us: that ideology causes people to have scientific blind spots, because human reasoning is not scientific but completely backwards, and worse, when people are stuck in ideological echo chambers, the irrational pattern of thought can be further aggravated.

I, like Kahan in his post, am proposing we take the ambitious step of a post-ideological political and societal agenda. We acknowledge we bring bias, opinion, and rationalizations to a given debate, but that's no excuse to give up, or to disagree on facts. Facts need to be separated from the debate over what to do with those facts. We acknowledge that ideology is universal, but inferior to truly rational (not rationalized) and pragmatic politics and culture. We should say, no matter what I believe, I'm willing to change when my beliefs and expectations are negated by reality, or when an idea I might not like on the face of it is shown to work. We need to all acknowledge that science should be separated from politics/ideology etc., and that it's the best system for providing an agreed-upon set of facts.

It's like political skepticism, or ideological atheism. I realize ideology exists, but these ideal societies and structures they espouse are mythical, and flawed. I suspect the ideal social structures will likely never be found, but we can improve a given one by observation, experimentation, and study of the efficacy of specific interventions. You can call that an ideology, and that's fine; members of religions often try to impute religious behavior to atheists, but it really is not. I want to endeavor to evaluate new facts and information separate from my personal biases, emotions, and overvalued ideas (and I know I have them, ask me what I think of vegans sometime). I may not always succeed, but I will endeavor to try. I won't believe in some ideal organizing principle out of faith that if everyone just thought like I did, the world would be great. Guess what, it'll never happen, and it probably wouldn't work anyway.

This goal in action is probably best demonstrated by my views on the healthcare debate. I would personally prefer a public option, but not necessarily single-payer, and I'm willing to be happy about the ACA's mixed private-insurance model. Why? I think the data from the experience of other countries around the world show single-payer systems have access flaws, that public-option choices in places similar to the US in culture and beliefs, like Australia, have been quite successful, but ultimately anything (the ACA) is better than a non-universal system. A system like the ACA already exists - what we're making looks pretty similar to the Netherlands' system - and it appears to work quite well. Now, while I'm personally convinced a different plan might be better, it's not really a scientific belief, it's an opinion. We don't know what kind of system will work best in the US, but we do know every other system in the world appears to perform better than ours on cost and access, and likely on quality in several cases. We could end up stunned and find that going universal makes care worse, more expensive per capita, etc., and then we should change again, either back to our flawed system, or to experiment with another model. But I'm willing to try any, and be proven wrong about any. I'm not motivated by a belief in a right to healthcare. I just think what we're doing now is so freaking stupid, costly, and backwards, we should be embarrassed to be seen in public.

Anyway, a wordy reply, but my thoughts on this are still being worked out.

If I'm misunderstanding, it's perhaps because you don't seem to be clear about what you believe, and so contradict yourself (can the superior sort of people "live [an] ideology-free existence" or are ideologies "universal"?). Certainly, everyone has one or more; it's been wisely said that "humans think with stories." Almost anyone might agree in the abstract that ideologies that conflict more with observable facts are less useful. However, then the question of which people are the arbiters of what is factual becomes an ideological one. I have seen many specific topics for which different people who would claim science as their primary ideology strongly disagree on which observations and publications qualify as "verifiable scientific facts" and which do not. There's no way to avoid bias; even if you were to have a computer make the call, it would have to be programmed according to somebody's values.
For example, neuroscience is not only confirming that nonhuman animals engage in emotion-motivated actions, but reporting that human beings make better decisions when they are using emotion rather than pure reason. In other words, we have and use emotion because it benefits us. Therefore, from one perspective, condemning "human" reasoning (as opposed to "scientific" reasoning!) as "completely backwards," "irrational" and "inferior" is basically saying that all those humans around you are inferior to you for being normal members of the very successful species they were born to. Little wonder if they don't then hop to change their treasured beliefs at your say-so. Again, to return to the climate-change case, you might get farther if you recognize that opponents have feelings and beliefs that matter, and that some of those feelings and beliefs could actually be used as levers to encourage environmental concern. You will not get very far in training a cat if you pretend that she doesn't or shouldn't have feelings and preferences; surely humans deserve the same respect.

Almost anyone might agree in the abstract that ideologies that conflict more with observable facts are less useful. However, then the question of which people are the arbiters of what is factual becomes an ideological one. I have seen many specific topics for which different people who would claim science as their primary ideology strongly disagree on which observations and publications qualify as “verifiable scientific facts” and which do not.

Science is not an ideology, it's a method. As far as disagreements about observations and publications go, what are we talking about? Of course there is debate in science, but not over things like "does evolution happen", "is quantum mechanics real", etc. We debate over the interpretation, or methodological issues, but not over the reality of the data. And certain theories become strong enough that the dissenters are just boring cranks. Like evolution denialists.

There’s no way to avoid bias; even if you were to have a computer make the call, it would have to be programmed according to somebody’s values.

This is starting to smell an awful lot like post-modernism. While it is true there are debates in science, and you can say that there is disagreement among scientists on certain topics, certain theories reach a point where further debate is not held by anybody but cranks. There is also the issue that even when significant changes occur in theory, like the transition from classical to quantum and relativistic mechanics, that has not made the findings of classical mechanics incorrect, merely incomplete in their description of the motion of very small or very fast/heavy objects, respectively. The data stay true. The method is reliable. We can argue about interpretation, but at a certain point things become settled, even if they are incomplete.

For example, neuroscience is not only confirming that nonhuman animals engage in emotion-motivated actions, but reporting that human beings make better decisions when they are using emotion rather than pure reason. In other words, we have and use emotion because it benefits us.

This is one interpretation. Emotional and pattern-recognition heuristics are more rapid, and for certain types of decision-making this may be valuable. However, I would say, when making decisions like "should I buy a Ferrari today" they may be inferior.

Therefore, from one perspective, condemning “human” reasoning (as opposed to “scientific” reasoning!) as “completely backwards,” “irrational” and “inferior” is basically saying that all those humans around you are inferior to you for being normal members of the very successful species they were born to.

Yes, but what comes naturally to us isn't always what's best. This is an appeal to nature, and total bullshit. To the extent that we can observe and measure the tendency of individuals to incorrectly interpret science based on ideology/group identity, whatever, that's a sign that human reasoning is often defective. Our heuristics are imperfect. No one should be surprised. The Gaga hypothesis of "born that way" doesn't make it alright. No, we can do better. The scientific method should be the way out, and part of teaching people to be truly rational and scientific is to train ourselves to understand how frequently we will be victims of this phenomenon and become irrational when ideology/group identity is threatened. Read the findings of Jonathan Haidt if you want the background on what I'm talking about. Just because it's what we do naturally doesn't make it better; it's clearly flawed, and the evidence is the obvious ideological underpinnings of the various forms of scientific denialism.

Little wonder if they don’t then hop to change their treasured beliefs at your say-so.

I am not trying to create such an appeal to authority. I'm just saying people should recognize that one's ideology puts one at risk of becoming anti-science, to the extent any given ideology puts one at odds with verifiable science (and yes verifiable science exists). I then describe the pattern of argumentation that denialists engage in to defend their untenable views.

Again, to return to the climate-change case, you might get farther if you recognize that opponents have feelings and beliefs that matter, and that some of those feelings and beliefs could actually be used as levers to encourage environmental concern. You will not get very far in training a cat if you pretend that she doesn’t or shouldn’t have feelings and preferences; surely humans deserve the same respect.

Well, hopefully humans are easier to train than cats. I have two and it's totally hopeless.

This is the old "framing" argument of Nisbet. Rather than saying people are wrong, or defending the validity of the scientific method, find the emotional lever that works on them. It's true, this probably works better, but it strikes me as being, well, creepy, manipulative and wrong. Not to mention Nisbet has subsequently been shown to be a hack.

I'm not trying to train people to have no preferences, or ignore their preferences or beliefs. I'm trying to say that everyone should acknowledge that, at one time or another, their deeply-held and beloved beliefs are likely to come into conflict with some finding of science. The appropriate response to such a conflict isn't just rote denial, but investigation of the result and self-examination. That should not be such a revolutionary idea. To be proactive, along this line of reasoning, one should continuously examine one's own beliefs and try to separate what is emotional, ideological, etc., from what is actually demonstrable, testable, and therefore far more likely to be true.

It's not practical. Firstly, some of the folks on ScienceBlogs who claim to have superior abilities to judge issues using pure reason and science have been seen reflexively rejecting published scientific data that support hypotheses they don't like. If these folks, some of whom have advanced degrees in science, can't overcome their emotional attachment to their beliefs, how do you expect less-educated people to do so? And since their tribal identity includes "we are more open-minded and rational than others", they at least have to make a pretense of considering ideas they don't like. People whose ideology includes pride in NOT questioning their beliefs are not going to do that. Sure they should, but most of them won't. So saying they can't be dealt with until their ideology has changed is a recipe for stalemate.
You may call it creepy and wrong, let us say, to try to persuade a creationist to care about mass extinction by pointing out how much he should value species that were purposefully created by his deity. He would say it was creepy and wrong to try to badger him or his kids into giving up their religion before they could be considered smart enough to reason with on environmental issues. You yourself note above "how frequently we will ... become irrational when ideology/group identity is threatened." That's kind of conceding the importance of the point that Dan Kahan made in using the terminology that you initially rejected. If your goal is to attack a belief that is really relatively peripheral, but you do it in such a way that you appear to be defining a person's basic identity as inferior and unacceptable, you will get much more pushback. Likewise, if people feel that "experts" are being portrayed as the sole appropriate arbiters of not only factual questions but values questions, they will rightly resent this. Nobody wants to be assigned the status of a passive lump, even if in fact he's let TV turn him into one.

Some examples of ScienceBlogs writers rejecting published data just based on dislike of hypotheses? We often will reject interpretations, question methodology, or terminology, as I did with Kahan, but we rarely reject the data itself as fabricated or fundamentally false. And again, I reject the idea that I'm appealing to experts. Which experts?

I'm not saying people have to give up ideology, or judge ideologies based on their scientific merit. I'm just saying all people must acknowledge ideology will pull them into situations where they will become unable to judge science rationally. I'm agreeing with Kahan here. If we recognize that fact it will allow us to recognize when we see that behavior in others, and ideally, allow us to recognize it when we do it ourselves.

Interestingly, reading over at Big Think, they uncovered a Bertrand Russell interview, and I think he makes my point better than I do:

At the end of his 1959 TV interview “Face to Face” on BBC, Bertrand Russell was asked the following question.

“One last question. Suppose, Lord Russell, this film were to be looked at by our descendants, like a dead sea scroll in a thousand years’ time. What would you think it’s worth telling that generation about the life you’ve lived and the lessons you’ve learned from it?”

This was the first part of Russell’s answer.

“I should like to say two things, one intellectual and one moral. The intellectual thing I should want to say to them is this. When you are studying any matter or considering any philosophy, ask yourself only “What are the facts? And what is the truth that the facts bear out?” Never let yourself be diverted, either by what you wish to believe or by what you think could have beneficent social effects if it were believed. But look only and solely at what are the facts. That is the intellectual thing that I should wish to say.”

I'll agree that it is philosophically superior for people to ponder how their pre-existing beliefs might bias their judgement on any issue whatsoever. (Whether "science" should be the ultimate authority in our lives is one of those specific issues that there's no point in arguing about.) However, despite your feeling that "all people must" do this, most people usually don't. In fact, in the post-TV era, many Americans never seem to spend any time contemplating philosophical issues. And, again speaking pragmatically, we live in a system where no matter how dull-minded you are, you still have the right to a voice in your government.

So to take climate denialism, rural red-staters see it implied not only that they personally are too dumb or deluded to have opinions worth expressing, but that there is nobody in their local community (i.e., their "tribe") who is knowledgeable and right-thinking enough to determine what should be done. Instead, top-down decisions need to be made by distant technocrats (i.e., members of another "tribe", one that may not have their needs at heart). Contrary to popular wisdom, most commons in functioning societies are not "tragedies," because most people are willing to accept restrictions on behavior that are imposed by their own group for the sake of that group. But when a whole group feels that they are being defined as the problem, or that restrictions might be imposed on them by outsiders to benefit outsiders, there will be fear and anger. (This is particularly true when they observe hypocritical behavior by people from the supposedly superior group.) The more they feel disrespected, the more they will vote for demagogic politicians who claim to be defending their ingroup. If you can't engage with these people as they are, you only guarantee stalemate, because you don't have the numbers to outvote them consistently. Is it better to make modest but real political progress by identifying common ground, or to sit around accomplishing nothing until your political opponents acknowledge your intellectual superiority and offer unconditional surrender?

However, despite your feeling that “all people must” do this, most people usually don’t. In fact, in the post-TV era, many Americans never seem to spend any time contemplating philosophical issues.

I wouldn't blog if I didn't think I could have some influence on people's behavior or how they think about these problems.

Is it better to make modest but real political progress by identifying common ground, or to sit around accomplishing nothing until your political opponents acknowledge your intellectual superiority and offer unconditional surrender?

Again it seems as though you think I'm making an appeal to my own opinions or intellect and I'm not. I acknowledge I can be a victim of this type of thinking too, and I sometimes have to concentrate specifically on dividing my feelings from topics that irritate me.

Every time I've been successful in changing the mind of someone who has been duped by the denialists, it's been with someone to whom I've been able to calmly explain what the science shows (in my very limited expertise) and why the arguments used by the opponents of the science are based on a ludicrous conspiracy theory, combined with sloppy arguments, cherry-picking, etc. I often explain why the alternative explanation is appealing to those who want less government, and how that's a reasonable political position, but it has no bearing on the facts of the science, nor does it forgive the irresponsible tactics of the denialists.

The common ground is science, and people's respect for it. Pseudoscientists wouldn't insist the science is on their side if they didn't believe that science is superior at determining fact versus fiction. The denialists have been enormously successful against global warming by accusing the researchers of politically contaminating the science, as part of their conspiracy theory that climate change research is a liberal environmentalist hoax. Everyone wants to be on the side of science. Luckily, their talking points are relatively easily debunked and dismissed, and I've found conversion relatively easy for all but the true believers.

I don't think the common ground has to be ideological or political; people can believe things that are contrary to their ideal political outlook if you take the time to explain the science. Most people, even ideologues, recognize that when science and ideology conflict, science should win.

Fair enough, although you're not going to get enough people to fundamentally change their philosophies of living to make a noticeable difference in a polity of 300 million people. For the remaining majority who don't want to think about most issues carefully, I fear that invoking "science" to tell them how they should vote will have increasingly little effect. Americans traditionally valued science because technology (a different but related animal) was making their lives ever better, and because they were a well-educated people who prided themselves on learning about and discussing serious issues. Neither of those things is generally true any more. IMHO, science might have a more lasting reputation if average people were taught to use science-as-process just as they are (or should be) taught to use logic - which is also sometimes treated as a tool for use by an elite.