Posts Tagged ‘rationality’

This is kinda interesting – how much are you swayed, on matters of scientific fact, by your biases about what should be true according to your political ideology?

In my case, reassuringly little. In fact, slightly more of my mistakes came from deliberately over-correcting away from politically motivated thinking than from my prejudices themselves.

Still, 39% seems like a worryingly low score for total correct answers, and I don’t know whether to be reassured or concerned that 60% of people did even worse.


Read Full Post »

Will Storr wrote a book really worth reading called The Heretics. It’s about people with beliefs on the fringes of mainstream or accepted scientific thought, and it’s about the skeptical movement that challenges and calls them out. In particular, it’s about how the author has failed to find a comfortable place for himself within the latter, despite sharing so many of their ideals and principles.

I read this book last year and scribbled lots of notes about it, and am only now getting around to putting those notes together into a coherent article. Knowing me, “coherent” will probably be aiming too high and this will likely end up rather scattershot and disordered. [Update from the future: Yep.]

At times the book feels a little uncharitable in its depictions of the characters involved, and a little unfair in its conclusions. But although it felt that way for me to read it, I know a lot of that feeling comes from defensiveness about a perceived attack on my own tribe, who I’m reluctant to allow to be criticised on any point that feels like it touches something personal. That doesn’t explain all that I wasn’t comfortable with – I think there are times when he does miss the mark in his final judgments – but nailing down which of my objections are reasonable and which are more emotionally driven is really difficult.

This difficulty is, in fact, a large part of his point in writing the book.

A lot of what he’s talking about is what he sees as a kind of skeptical tribalism, especially at certain gatherings like QED or Skeptics in the Pub. Many of the folk at these events have a very firm idea of what specific club they’ve joined, and exactly who the out-group are. They know very well what sort of person someone must be if they’re found in the pigeonhole labelled “homeopaths”. Not that it should be a surprise, but many self-identifying skeptics’ own beliefs and positions rely to a large extent on tribal in-group coherence, rather than the purely rational objective evaluation of data which they at least have the good sense to value and espouse.

The refrain that “There’s no evidence for homeopathy”, for instance, is a common one, even though for any reasonable interpretation of “evidence” it’s clearly untrue. Scientific research and evidence is what we fall back on as justifying our position, but several skeptics Will talked to couldn’t name or usefully cite a single study or meta-analysis that supported their position on homeopathy, and bristled when the question was asked.

Off the top of my head, I can’t accurately cite in detail the research which supports my ideas on homeopathy either. Clearly that doesn’t stop me from thinking that there are good reasons to think the things I think, all the same. But if my justifications for my beliefs aren’t truly what I think they are, that’s something worth identifying.

There are ways that general expert opinion can be judged by the layman, tools one can acquire to assess the preponderance of evidence usefully (if not impeccably) which don’t require us to each pick through hundreds of complicated technical papers before reaching a conclusion. This kind of direct observation isn’t the only way to learn things, and there can be sound reasons to believe things that appear to be based more on hearsay and second-hand reporting. For instance, if the average punter were tasked with writing a medium-length blog post on why they believe that the world is round – and that anyone who believes it’s flat is drastically, bewilderingly wrong – they could probably come up with something reasonable, despite not having been to space to admire the curvature of the earth directly, or personally circumnavigated it just to check.

But we don’t always think naturally in these terms, and so we often don’t summarise our positions on skeptical issues this way either. A more natural inclination, if you’re a fairly representative skeptical blogger, might be to say “homeopathy doesn’t work, there’s no evidence for it”, and to get twitchy with anyone who starts asking you to cite papers from memory, because you’ve met people who ask questions like that before, and you think you know where this is going. Your tribal integrity is under threat from someone suspected of being from the out-group.

It’s an entirely natural human tendency, when faced with such opposition, to assume the worst, close ranks, and awkwardly throw up defences around one’s cherished beliefs to protect the ego from the perceived threat. The question worth asking, for me, is: are skeptics actually any better than anyone else at recognising this tendency in ourselves and working around it?

It’s not that it’s wrong to bristle at the question. It’s that it’s really important, for skeptics especially, to recognise both why it’s not a wholly rational response to bristle, and also why it’s utterly human, and completely understandable – and something we have in common with just about every “true believer” we’ve ever had a heated/feisty/impassioned conversation with. Because if we’re not better than average at recognising that kind of faulty thinking and deploying techniques to avoid it, then being right about the things we’re right about is only going to be of partial help.

I imagine it’s deeply unoriginal and quite tiresome for all involved to draw comparisons between The Heretics and any of Jon Ronson’s books, but that’s not going to stop me. One thing I remember about Jon’s approach to visiting the depths of close-knit tribal alien gatherings and reporting on them as an outsider, is that I don’t recall ever simply disliking anyone he wrote about. Which sounds bizarre, given the amount of time he’s spent with neo-Nazis and profoundly hateful religious fanatics. But either there was something affable in their quirkiness – perhaps Jon’s own affection seeping through – or he’d found something humanising about them, something that hinted at an underlying explanation for what was otherwise unappealing, and that caught the interest just enough that we never left with the idea that they were simply the antagonists of the piece, people we were supposed to take against.

It could be that my hazy memory is giving Jon a little too much credit. I may be unfairly searching for an unfavourable comparison by which to downplay Will’s attacks on my tribe. But it feels like he doesn’t always acknowledge that same level of individual humanisation, while recounting certain remarks by certain skeptics in a way that insinuates a disapproving tone over the whole enterprise.

Is that reasonable? Am I being unjustifiably tribalistic, to expect him to tilt the balance even further toward acquiescence to my team? Or is it fair to suggest that his own personal biases might have led his own narrative into the kind of judgmentally prejudiced thinking he’s identifying in so many others?

Either way, it’d be petty to reject or condemn the whole book based on differences like this, however strongly I might feel about them. I’ve read and enjoyed numerous well-argued atheistic and skeptical tomes and essays which would no doubt be at least as grating to anyone not already on my side of the aisle who was trying to engage with it. (Most of the history of this blog is probably included in that as well.)

Actually, that parenthetical deserves more of a digression than that, as I felt it particularly strongly during the chapter on James Randi. Various defences and objections to Will’s assessment formed in my head as I read, most of which he recapped and considered fairly a few paragraphs later. And a lot of my protests about his overly harsh insinuations would apply equally well to many other out-group people I’ve been critical of in the past, and of whom I’ve read far more damning accounts. If I want critics to go easy on someone I admire, I do not have a great track record of extending the same courtesy.

But it’s hard, because the things that feel like they’re of basic fundamental importance to us, like that homeopathy is bunk, are things that skeptics are generally right about. It’s important not to let that get lost in the fair and even-handed discussion of how both sides have things to learn and both sides are often swayed by irrational tribal urges and both sides have tendencies to make assumptions that unfairly privilege their own team and both sides etc etc. Often there’s also a crucial matter on which one side is simply wrong. Will’s not denying that last point, and he’s got a lot to say about the earlier ones which isn’t easily dismissed with phrases like “tone policing”.

He looks into issues such as false memories, auditory hallucinations, and Morgellons syndrome, and determines that the people involved with these issues generally aren’t “crazy”, and deserve to be granted a sympathetic ear – but this isn’t the direct counterpoint to the skeptical position that he seems to think. Most of what I know about the fragility of human memory, the fallibility of perception, and the need for compassion and understanding toward anyone who’s fallen prey to some of the myriad cognitive errors that afflict every one of us, I learned from the skeptical movement.

The section on David Irving was particularly good. It really got into the man’s head, explored and humanised him and all his irrationality, found a deep understanding and compassion for this person, without ever risking letting you think that he might be onto something with any of his utterly false notions.

In the end, even if there are potential complaints with the representation of cherished movements, and if the ratio of interesting questions raised to insightful answers proposed is sometimes higher than I’d like, there’s a lot in The Heretics that’s enjoyable to read, and which provides some level of intellectual challenge to anyone with any kind of investment on either side of any sort of discussion about “belief”.

Read Full Post »

I’ve started following some people I disagree with on Twitter.

Listening to people you disagree with is really quite important. I mean, I talk to hypothetical people I disagree with on this blog all the time, and I act like I’m expecting them to listen. So it’s only polite.

And probably an unhealthy amount of what I learn about what “the other side” think reaches me only after it’s been filtered through someone on “my side” reporting on it, with all the expected disdain and righteous indignation that I find it hard not to share.

So I’ve added a few contrarians to my feed. I’m planning to find more blogs to achieve the same thing, too. Feel free to make suggestions. (I’m a libertarian socialist atheist humanist, in case you need a recap on who I’m likely to find utterly antithetical to every value I hold dear.)

Anyway, there’s a particular thought process some of these oppositional commentators seem to spark in me. It goes something like this.

1. This opinion contradicts my understanding of the way things are!

2. I am more rational than to simply dismiss it out of hand, however. I shall follow the attached link and look a bit more closely into what the assertion actually is, and how well it stands up.

3. Well, it isn’t immediately obvious to me what’s wrong here.

4. But something must be, this person’s a tit and clearly on the wrong side of everything.

5. Okay, that’s definitely not a rational conclusion. Can I actually find any holes in this piece of analysis?

6. …No.

7. But it doesn’t mean this person’s right; really, I just don’t understand the subject well enough to have an opinion either way. It’s quite intricately political in an area I’m not well versed in.

8. Is that a cop-out to avoid admitting that I was wrong about something, because this person made a good point?

9. No, I think I really honestly don’t have a clue one way or the other. This seems like a good point, but so did the other stuff I’d already read from the other side. Apparently I can’t reliably tell which of these two opposing viewpoints is making the best points. I really should conclude that I just don’t know what’s going on.

10. Y’know, I probably should’ve started with that before even deciding I had an opinion worth defending.

I’ve also provided myself with a few examples of how a little intelligence and rationality can be a dangerous thing, if they’re deployed strategically so as to keep reinforcing one’s own biases.

In particular, this comes up in my reactions when somebody not part of my “in-group” makes a claim about a contentious subject, as opposed to when somebody who is identified as being on “my side” makes a similar claim, when I don’t have time to fully examine either of them right now.

The contrast between “Hmm, I should study this more carefully later, and also find an informed rebuttal from someone who disagrees, to make sure I’ve got both sides of the story and can fully and rationally assess the truth of the situation” and “Yep, makes sense!” is quite stark.

So I’ve learned some things about my own rationality, and the way my brain works when confronted with ideas and individuals I tend to find unreasonable and infuriating.

On the other hand, I’ve also been reminded that, sometimes, people whose political opinions happen to differ significantly from mine are also horrible. Just unbearably, viciously, despicably horrible.

So there’s that.

Read Full Post »

Let’s talk about not believing in God.

Atheists often frame their position as a simple lack of a belief; they don’t take the active, affirmative, assertive position that theists do, don’t make any direct claim, and simply don’t hold the positive position that “God exists”.

I’ve written before about why the extent to which some atheists take this feels like an unnecessary cop-out.

Atheists should totally be making positive claims. Part of the reason why many are reluctant to do so, is because of an implicit idea that “belief” is a binary thing, something you either have or you don’t.

Christians believe the claim that “God exists”, and atheists don’t. Some atheists might conversely believe the claim “God does not exist”, but many deny holding any such position, and define their own godlessness as a kind of belieflessness. It’s not that they don’t believe in anything – we often have to remind people of our strongly held convictions in favour of love, truth, beauty, cheesecake, and the basic goodness of humanity – but when it comes to God, we simply don’t buy it, and are otherwise keeping out of the argument.

I don’t think this holds up. I think that the usual ways we describe belief are necessarily short-hand for a more complex set of ideas, and that we can afford to start being clearer in our positive declarations.

As an analogue, let’s say I’ve flipped a coin but not yet revealed the result. Do you “believe” that it’s landed on heads?

Assuming you have no improbable insider knowledge about the coin or my tossing abilities (steady), you haven’t a clue which way it’s landed. So, I guess you “lack the belief” that it’s landed heads. And you lack the equivalent belief that it’s fallen on tails. It’s not that you disbelieve either option – they’re both possible, and wouldn’t be especially surprising.

Now let’s say I’ve rolled a fair six-sided die, and am also temporarily hiding the results. What beliefs do you have about the number that’s showing? Presumably you lack each individual belief in its landing on any given number – but it seems like this is true in a different way from the coin-toss. In that first case, if you’d arbitrarily picked one option to guess at, it would’ve been no big deal whether you’d been right or wrong. With the die, if you randomly picked the right one, you’d be a little more impressed. On seeing what number it landed on, you’ve now adopted one particular belief you formerly lacked, just like with the coin – and yet this feels like a bigger deal.
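One way to make that “bigger deal” intuition precise (my gloss, not anything from the original argument) is information-theoretic surprisal: the number of bits of information you gain on learning an outcome of probability p. Revealing a die carries more information than revealing a coin:

```python
import math

def surprisal_bits(p: float) -> float:
    """Bits of information gained on learning an outcome of probability p."""
    return -math.log2(p)

print(surprisal_bits(1 / 2))  # fair coin reveal: 1.0 bit
print(surprisal_bits(1 / 6))  # fair die reveal: ~2.58 bits
```

So seeing the die land feels like more of an update than seeing the coin land, because it is one: roughly two and a half times as much, measured in bits.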

Let’s step it up again. I’ve got a lottery ticket here for last night’s £7,000,000 jackpot. It’s either a winner or a loser, but I’m not telling you any of the numbers on it. Clearly you’d expect some evidence if I wanted to convince you it’s a winning ticket. But do you simply “lack the belief” that I’ve won the lottery, just like you “lacked the belief” that my coin had landed on heads (or tails)? Or are you actually pretty sure I haven’t won?

I’d argue that you’re easily justified in believing I’ve not become a millionaire overnight. The evidence in favour of the idea is so slight, and the odds against it so great, that it seems like a hypothesis worth ignoring. (Even before you consider the odds that I’m lying about having a ticket in the first place. Which I am.)

Now, you might change your mind later, when I invite you round for tea and diamonds in my new gold house, but for now, you’re safe assuming that I haven’t won the lottery. It’s not dogmatic to work with that assumption; it doesn’t imply you’re unwilling to be persuaded by evidence. But come on, clearly I haven’t won the lottery. Frankly, you should be quite content telling me “James, you have not won the lottery”. We’d all understand what you meant. If you can’t make that positive assertion now, then I don’t know when declaring anything to be true is ever going to be possible.

It may seem as if it’s incompatible with acknowledging the possibility that you might be wrong – this possibility can be calculated precisely, after all. But the fact is, we don’t append the phrase “to a reasonably high degree of probability, barring the arrival of any further evidence” to the end of every other sentence we utter. When we have conversations with each other, there’s generally a subtext of “I am not absolutely and immutably 100% certain that this is the case, it is simply the most appropriate conclusion I am able to draw and it seems strongly likely, but I will be willing to reconsider if there’s a good reason why I should do so” to most of what we’re saying.

I don’t “believe” that any given flipped coin has landed on heads or tails. But I can put a probability of 50% on either outcome, which says something more useful than just “I lack belief in any direction”.

With a six-sided die, the probability is 1/6 each way. Is it fair to say “I believe it hasn’t landed on 6”, since I believe the odds are 5/6 against that outcome? Probably not, but I don’t think it matters. If you understand the numbers I’ve placed on each possible outcome, you understand what I believe.

I don’t believe an asteroid is going to crash into the Earth tomorrow and wipe out humanity. Further, I believe an asteroid will not crash into the Earth tomorrow and wipe out humanity. I believe this more strongly than any of the other examples so far. How strongly? It’s hard to put an exact number on it, but that doesn’t mean it doesn’t belong somewhere on the scale of increasingly improbable things. In this case, just saying “it’s not going to happen” is a useful short-hand way to get my point across, without going into a lengthy spiel about percentages and Bayesian priors. It gets the gist of my position across in a manner I think most of my audience will understand.

There is no God.

Does that mean I think God’s existence is less probable than, say, flipping a coin and getting ten heads in a row? Would I be more surprised to meet Jesus after I die than to roll a string of double-sixes throughout an entire game of Monopoly? Whether or not I have an exact numerical value for my god-belief, these are the terms we should be thinking in. Not that there’s simply a thing called belief which some people possess and I lack and that’s the end of it.
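For what it’s worth, those benchmark improbabilities are easy enough to pin down, which is rather the point of thinking in these terms. A quick sketch (the lottery figure assumes a 6-from-49 draw, and the Monopoly figure assumes a hypothetical 30 dice rolls per game):

```python
from math import comb

p_ten_heads = 0.5 ** 10        # ten heads in a row: 1 in 1,024
p_jackpot = 1 / comb(49, 6)    # 6-from-49 lottery jackpot: 1 in 13,983,816
p_monopoly = (1 / 36) ** 30    # double-six on every one of 30 assumed rolls

print(f"ten heads: 1 in {1 / p_ten_heads:,.0f}")
print(f"jackpot:   1 in {1 / p_jackpot:,.0f}")
```

Whatever number you’d put on god-belief, it can at least be located somewhere on a scale alongside figures like these.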

So can we agree that a flat denial of God’s existence is not dogmatic and unfounded now, please? Can we accept all the implied background understanding that goes along with other conversations about the things we believe? Can we maintain useful phrases like “does not exist” without burying them under a mound of tentative qualifications each and every time, when we all know damn well that Carl Sagan’s garage is a dragon-free zone?

And could we stop acting as if being sure you’re right about certain things makes you an inflexible ideological bad guy, regardless of how reasonable it is to be strongly convinced of your position?

Read Full Post »

I’ve been thinking about Katie Hopkins more than is healthy lately.

If you don’t know who she is, I’m not going to explain. You wouldn’t thank me for tarnishing your innocence, and I wouldn’t thank me for giving me another ulcer.

She’s a media presence to whom I tend to react in a strongly negative fashion, is all you really need to know. Experiencing Katie Hopkins is something I find unpleasant. It makes me angry, it makes me cringe, it gives me a visceral wrenching in my gut of revulsion and disgust.

But note that these are all statements of fact about me, not her. I’m the one doing the reacting; these feelings of negativity are instigated in my brain, regardless of her status as the causative factor. It’s not clear, a priori, that the blame for my feelings lies primarily, or at all, with her.

Because my feelings are so personal, it’s no surprise that many people don’t react to a given stimulus – such as Katie Hopkins’s ghastly opinions – in a similarly emotive fashion. I can count on a good 80% of my Twitter feed to be even more vocal in their disdain, but there are also a lot of people out there with entirely different feelings toward her.

And there are many reasons why somebody’s physiological response might be far more placid than mine. Perhaps they agree with her on some political points. Perhaps they find it easier to laugh off extremist nonsense like hers as an inconsequential source of amusement, à la Boris Johnson. Perhaps they actually know her personally, and so have some positive associations with her. Perhaps they’ve simply attained a level of dharmic serenity in their life that is currently beyond my reach.

Whatever their reason, the fact that some people manage to avoid that surge of bile at the mere thought of how much genetic material they share with Katie Hopkins is a commendable and positive thing. I can (and do, and should) passionately disagree with her views on the world and explain at length why she’s wrong about everything – but that innate reaction of nausea contributes nothing beneficial.

All of which made me realise something important: When someone else doesn’t react to Katie Hopkins with the same aversion as I do, or even seems to condone or approve of her existence, it’s all too easy and natural to mentally categorise that person as being all the same things that are wrong with the world as Hopkins herself. But as I’ve outlined, there are numerous reasons that people might not want to join in with my spluttering fury other than that they’re of one mind with her and are just as worth getting angry at as she is.

Being disgusted at Katie Hopkins should not become a litmus test in my mind for holding an acceptable set of political opinions. Someone could “fail” that test and yet still be perfectly decent folk, not at all irrevocably in her camp, and absolutely not worth relegating to the furthest, most depraved reaches of my mental taxonomy of Internet Nutters.

I actually had a practical experience of this the other day, when I had a perfectly cordial Twitter chat about public sector strikes with someone who began matters by asserting that Katie Hopkins “rarely says anything that isn’t true”. Not so long ago, I’m not sure I could’ve let that conversation go as well as it did.

Read Full Post »

A brief rationalist take on yesterday’s glurge:

A lot of it comes down to fundamental attribution error, of course. When I make a mistake, or do something that might seem rude or insensitive or otherwise negative, I’m aware of all the extenuating circumstances. I let myself off because I was tired or stressed from dealing with so much other shit, or because the blame can be pinned on something else in the world… any excuse as to why it doesn’t really count.

But when someone else cocks up, obviously they’re an incompetent asshole.

We don’t live in other people’s heads, so we aren’t naturally inclined to make all the same excuses for them as we do for ourselves. And we don’t feel their emotions to anything like the same extent they do, either.

When somebody else is suffering, or delighted, or in pain, or giddy with adulation, I might experience a surge of the same emotion on their behalf. My mirror neurons will start flapping away (neurons totally flap, ask any scientonomer) and encouraging me to empathise and bond with my fellow species-member.

But when there’s especially intense emotion, that just can’t come close to matching the experience of actually going through it. Even if you’ve seen other people in profound emotional highs or lows, it doesn’t intuitively feel like what they’re going through is really real. Your friend’s drama only impacted on you a little, nothing like what you’re experiencing now, so yours must be more real, more deep and profound. They were just moping and wailing, they can’t have felt it as strongly as you are now.

Except there’s every reason to suppose that they do. And your intensity of experience is just as inaccessible to them, but no less real for it.

Now that I’ve written all that, I’m not sure it adds very much.

Read Full Post »

See, the thing is, religion isn’t all bad.


It’s not, though. But it’s still a long, long way from the best we can do.

The Skeptics with a K framed some ideas interestingly in a recent episode of the podcast. They were talking about the classically bullshit-ridden debate over whether religion or atheism has directly caused more historical death and suffering, and which is therefore “worse”.

The first thing to remember is that this is entirely disconnected from the question of whether God exists, or whether any religious ideas are reasonable to believe based on the available evidence.

But, even while there have certainly been religiously motivated murders numbering well into the millions, and also genocidal regimes led by atheists, I’m increasingly of the view that there’s nothing useful to be gained by trying to determine any sort of comparative body count.

As I think Mike pointed out on the show, the idea of atheism being responsible for murder seems ridiculous on its face; there’s no way to logically get from “there is no god” to “I should kill a bunch of people”, without adding a load of unrelated shit in the middle.

But then, theism doesn’t directly result in or endorse killing anyone either. There’s no more a logical way to get from “there is a god” to “I should kill a bunch of people”, without also adding a load of unrelated shit in the middle.

Unfortunately, adding a load of unrelated shit in the middle is precisely what religion tends to do. Hence “I believe in God” leads, blunderingly and meanderingly and by way of numerous distortions and corruptions, to the Crusades, the lynching of homosexuals, and all the rest.

And on the flipside, you have religious charities, and the unavoidable fact that belief in God, however mistaken, often engenders a kindness and desire to do good works in people of faith.

Atheists are always quick to point out various things when this is brought up – that historic religious institutions are in a much stronger position to provide infrastructure and funding for charitable organisation, that organised atheism hasn’t had centuries to establish a similar community that can embark on charitable projects, the name of the biggest lending community on Kiva, and so forth – all of which is quite correct. The idea we’re rushing to counter, in these cases, is the common claim that believing in God makes you a more compassionate, more generous, better person, than being an atheist. We’ve been told often enough that we all have no reason to be moral, and so that’s the bullshit we most easily react against.

But there are other things to be taken from the observed association between religion and charity. It’s not a condemnation of atheism to note that some forms of religion, as a system, are pretty good at arranging, organising, and motivating people to do good things, behave kindly and compassionately, and strive to alleviate suffering.

It’s also pretty good at helping people justify and rationalise the most grossly inhumane atrocities of which humanity is capable.

So it’s a mixed bag. Racist genocide and feeding the hungry are two things people are entirely capable of, with or without religion – but which religion often exacerbates and supports.

So, can’t we have one without the other?

It’s not that hard to conceive of a better system, which does more of the good things, and less of the bad. We could identify the parts of religion (or any other system) that are beneficial, separate out the ones that are harmful, and organise ourselves in a way that promotes and encourages charity without also helping people rationalise and justify tyranny and cruelty.

It should be possible. It doesn’t seem likely that, if you want everyone to be better at sheltering the homeless and not passing by on the other side when someone’s in need, you have no choice but to accept the corresponding tendency to lead armies against anyone else who’s basically trying to do the same thing as you but gives it a different name. We can surely have compassion without religiously inspired evil.

Atheism isn’t this system. (Though I suspect, and urge, that many people acting this way would be atheists.) Humanism might be it, or at least might be a few steps down the right path. It doesn’t need to be any more formal than that, nothing with an official hierarchy and rules and whatnot. Just a set of ideas, picked and chosen to help us do the best we can.

Skepticism and critical thinking are also positive things, and any belief systems we have in place should encourage and nurture these things. Religion often tends to be hostile to genuinely honest and open questioning of ideas – not always, but it throws up some serious roadblocks. So let’s see if we can’t do better.

The claim that religion is never any good for anything doesn’t hold up, but atheists shouldn’t feel they’re conceding anything important by abandoning it. Many people cling to their faith as a source of comfort and reassurance, in times of difficulty and pain. It does them some good, in a situation where simply removing it and replacing it with non-belief would not be better for them.

What’s important, though, is that religion is not the best we can do. Not by a long way. The comfort it provides comes only at the expense of a rational approach to the real world. It lets you feel better, but only by believing false things.

Can we improve on that? Can we come up with an approach which helps and supports and comforts people, and allows us to help and support and comfort each other, while remaining grounded in the real world, letting both compassion and rationality drive what we believe?

Christ, I hope so.

It’s unhelpful to focus too fixedly on whether “religion” or “atheism” is responsible for any of history’s great mass slaughters, because nothing’s that simple. But there are things to be learned about different approaches one can take to the world, and what kind of institutionalised behaviour these approaches tend to engender. Authoritarianism and inflexible thinking are strongly connected with cruelty and tyranny, and religion is by no means the best way we have of avoiding authoritarianism and inflexible thinking.

The demonstrable falseness of religious claims is ample reason to reject them; the regularity with which bigotry, hatred, and aggression are backed up by religious motivation should compel us strongly toward a better system of organising ourselves to do good things.

Read Full Post »
