
Posts Tagged ‘logical fallacies’

A brief rationalist take on yesterday’s glurge:

A lot of it comes down to fundamental attribution error, of course. When I make a mistake, or do something that might seem rude or insensitive or otherwise negative, I’m aware of all the extenuating circumstances. I let myself off because I was tired or stressed from dealing with so much other shit, or because the blame can be pinned on something else in the world… any excuse as to why it doesn’t really count.

But when someone else cocks up, obviously they’re an incompetent asshole.

We don’t live in other people’s heads, so we aren’t naturally inclined to make all the same excuses for them as we do for ourselves. And we don’t feel their emotions to anything like the same extent they do, either.

When somebody else is suffering, or delighted, or in pain, or giddy with adulation, I might experience a surge of the same emotion on their behalf. My mirror neurons will start flapping away (neurons totally flap, ask any scientonomer) and encouraging me to empathise and bond with my fellow species-member.

But when there’s especially intense emotion, that just can’t come close to matching the experience of actually going through it. Even if you’ve seen other people in profound emotional highs or lows, it doesn’t intuitively feel like what they’re going through is really real. Your friend’s drama only impacted on you a little, nothing like what you’re experiencing now, so yours must be more real, more deep and profound. They were just moping and wailing, they can’t have felt it as strongly as you are now.

Except there’s every reason to suppose that they do. And your intensity of experience is just as inaccessible to them, but no less real for it.

Now that I’ve written all that, I’m not sure it adds very much.



My head has really not been in a writing place lately. I’m trying to write my way back into one today.

A new site’s been getting a lot of attention in the skeptical community, called Your Logical Fallacy Is. It’s a compilation of common logical fallacies – ways in which an argument can fail to logically support the claim in whose favour it’s cited – which you’re encouraged to link somebody to if they make any of these errors in the course of a discussion you’re having with them.

For instance, if someone demands your evidence that the Christian God doesn’t exist, and accuses you of being a godless fundamentalist with no empirical support for your position, you can point them to yourlogicalfallacyis.com/burden-of-proof, which will point out that it’s up to them to make a case for God if they’re making a claim about his existence.

Not everyone’s keen on the principle behind the site. Is it just another way for skeptics to be smug?

I’ve learned by now that any question about skeptics that includes the word “smug” is bound to make me bristle, regardless of its potential validity, and I need to give myself a quiet talking to before I respond in a way that makes me sound an arse. Part of the problem is that, while smugness is often an annoying quality in others, decrying it is something that it’s very easy to do smugly. Of course, by pointing out how smugly some people are objecting to others’ smugness, I’m unavoidably going to make things even worse and smugger than ever before.

Smug smug smug. The word’s doing that thing now. Is that really how it’s spelt? Smug. Hmm.

Anyway, Tannice’s objection in that link up there isn’t a ridiculous one. It can be satisfying to spot a hole in an adversary’s argument which completely undermines their conclusion, and depending on the attitude you’re bringing to the discussion, it might seem tempting to treat that accomplishment as some sort of conclusion, a victory, a zenith beyond which you need not progress any further. Obviously, this approach is indicative of being more interested in scoring points than learning anything new or getting closer to the truth, which may be an integral part of that detested smugness.

But I think it’s a little unfair to assume that this will be most skeptics’ prime use of a site that handily points out logical fallacies like this. It has the potential to be a useful tool for stimulating more rational debate, not just “an easy way to be a skeptical c*** online”.

Maybe there does need to be more focus among skeptics on what to do with a logical fallacy once you’ve spotted one, and how best to use an understanding of these common pitfalls to make our discussions more productive, and educate those who haven’t encountered them before and might think they’re all fine ways to make your point. But even if that side of things is being neglected, that doesn’t mean the addition of the Your Logical Fallacy Is site is a bad thing. It’s one more weapon in the arsenal, whether or not it’s used well by everyone.

I don’t think there’s a particular problem with skeptics being too smug. People can be smug – among many other, often far more undesirable traits – for all sorts of reasons. It’s not obvious to me that skepticism exacerbates it more than any other mindset.

Perhaps it’s especially grating in our case, because it’s thought that skeptics, of all people, really ought to be better at avoiding traps like smugness, rational self-examiners that we supposedly are. It’s worth noting that one of the fallacies listed on the site is “The fallacy fallacy”: the mistaken idea that, as soon as you’ve pointed out a mistake in someone’s argument, you’ve necessarily proven them wrong.

Hat-tip to Hayley Stevens for making me think about this, and for having sensible things to say.


I used to believe in some wacky stuff.

It didn’t seem all that wacky at the time, of course. When I first started taking an interest in the stuff I was reading online, about people’s religious experiences and psychics and mind readers and dowsing and so on, it sounded fascinating, and wasn’t obviously bullshit at all. I guess I tend to think about things a bit differently now, or maybe there are just more things that I’ve learnt aren’t real in the intervening years.

Anyway, there was a lot of stuff about dowsing that caught my eye, and made it seem like an accessible skill. There were reams of advice and personal experiences people wanted to share, and it sounded like you didn’t need to be whisked away from your cupboard under the stairs to a wizards’ school by a hairy giant in order to be a part of it. It sounded like anyone could join in, and learn to access some spiritual dimension which could provide insight and knowledge from beyond this world.

So I bought a crystal pendulum from a new age shop.

It feels so weird typing that sentence now.

It was cheap, but kinda pretty, and looked a lot like this quartz one. The idea, as described on that page, is to clear your mind and mentally ask a series of yes/no questions, while letting the pendulum hang loosely from your fingers. There are various ways the pendulum might swing – circular motions, clockwise or anticlockwise, back and forth, diagonally – and you can calibrate it with some control questions.

I don’t remember exactly how it went when I tried it, but it would have been something like: “Is my name James?” – and I saw it swing forward and back, so I knew that meant yes. “Is today Wednesday?” – another yes, with the same swinging motion. “Is there a dragon in my room?” – and it swung side to side, meaning no.

This was really exciting.

So I decided to test it out properly, and see if I could find out something that I didn’t know, and prove that I was really tapping into some amazing psychic source of power.

I think this is the point where my strategy departed from that of a lot of new age fans.

I got a deck of playing cards and placed one face down in front of me. I didn’t know what card it was, but I held the pendulum over it, and asked yes/no questions to narrow it down. “Is it black?” – no. “Is it red?” – yes. “Is it a picture card?” – no. And so on.

Eventually I narrowed it down to “Is it the five of diamonds?” and got a yes. It had given me a definite answer to everything I asked. It had never contradicted itself. I’d started with absolutely no knowledge or assumptions or preconceptions about the card in front of me, and my pendulum had homed directly in on its identity as the five of diamonds.

I still remember the fluttering in my chest – half excitement and half genuine fear – in the second or two before I turned over the king of clubs.

Aw, crap.
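
In hindsight, the arithmetic was never on my side. As a minimal sketch (mine, written now, not anything I’d have thought of back then): if the pendulum’s answers carry no information about the card, then even a perfectly self-consistent chain of yes/no questions amounts to naming a card at random, with a one-in-52 chance of a hit.

    import random

    # Minimal sketch: if the pendulum's yes/no answers are driven by chance or
    # unconscious twitches rather than by the hidden card, a consistent
    # narrowing-down procedure is roughly equivalent to picking a random card.
    DECK = [(rank, suit) for rank in range(1, 14)
            for suit in ("clubs", "diamonds", "hearts", "spades")]

    trials = 100_000
    hits = sum(random.choice(DECK) == random.choice(DECK) for _ in range(trials))
    print(f"Hit rate: {hits / trials:.4f}  (1/52 = {1 / 52:.4f})")

About a 2% hit rate, which squares all too well with the king of clubs.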

It turns out that there’s a bunch of reasons why people believe in this kind of thing, and post articles to the internet about their powerfully moving personal experiences with it. And these reasons don’t require magic to actually be real.

When I first started looking into it, it didn’t require any particular daftness on my part to take it seriously – it just seemed to be a part of the world. A somewhat secretive, not generally known, exclusive part, but that just made it all the more fun. At the depth at which I explored it at the time, I didn’t find any good reason to suppose that it was all completely fictitious. People were taking it for granted, writing detailed accounts of their achievements, and beginners’ guides to the basic techniques.

But once you start thinking about it more critically, you realise that magic powers aren’t the only explanation. They’re not the best explanation. In fact, they’re not even a very good explanation.

Some people are very keen to find evidence that supports the idea that their dangling crystal can tell them things – so confirmation bias plays a big part in explaining why it’s so widely believed, as well as a host of other logical fallacies. But the ideomotor effect is one of the most persuasive aspects if you don’t know what it is. And it’s the one I’m supposed to be talking about here.

When I was asking myself those questions, I really was trying to hold the pendulum as still as possible. I know I wasn’t deliberately swinging it around to make myself seem like an amazing wizard (“Look, it knows my name!!”), but it’s worth asking: how good am I at holding my hand perfectly still? When I look closely at my outstretched digits as I try to remain motionless, I seem surprisingly wobbly. If I’m going to hold something on a thin and flexible cord or chain, it seems likely that my natural shakiness is going to have some effect.

And it turns out that the pendulum picks up more than just a general jiggle from my unsteady muscles. Let’s say I know a forward-swing means yes, because of my first test question. If I then ask something else which I know, or expect, has the answer yes, then on some level of consciousness I’m going to be imagining getting a forward-swing answer from the pendulum. My hand will then actually twitch, without my being aware of it, to make the pendulum swing forward.

The mental processes to do this can really happen inside your head, without the part where you’re conscious of it. It “bypasses volition”, to be a bit technical (volition being your capacity to do something by your own will).

You can try it easily yourself with any weight on some sort of dangling cord. I’m trying it now with one of the earphones from my mp3 player on its lead, and it’s still quite odd to see. I concentrate on a clockwise spinning motion, and it starts spinning clockwise, even though I’m still trying to hold it as steady as I can.

If you’re thinking that this might be evidence that I was secretly psychic all along, you’re still leaping to a more complicated explanation than is necessary. If I’m not directly touching the cord, or holding it in such a way that my hand movements won’t affect its swing, then it doesn’t respond in the same way. It only moves like this when I have the capacity to be swinging it around unconsciously. The best explanation is that I’m simply moving my hand.

There’s also a common hypnotic trick, where you’re asked to close your eyes and stick your arms out, then vividly imagine a heavy weight in one hand pulling it down, and a balloon tied to the other pulling it up. You focus on the respective feelings of pressure and lightness for a while, and if you’re anything like me, after a couple of minutes you open your eyes and find that you’ve lifted and lowered your hands accordingly by several inches, without being aware of doing it.

The point is, your mind’s good at doing stuff like this without telling you about it.

Now, this doesn’t mean that nobody can dowse anything, or that we’ve proved that Ouija boards are universally a load of crap (yes, the people are just pushing the glass around even if they don’t realise it). But it reminds us of the importance of asking the question “Is there a simpler, less Harry Potter explanation?” when we see something we think might be magic.

If I was doing actual magic over my playing card that time, then my skills make Neville Longbottom look like Gandalf. I must really suck at magic. I didn’t even get close to getting the card right. Magic just isn’t a good enough explanation for what happened there. But the idea that my hand wasn’t perfectly still, and made the pendulum swing a little by entirely natural means? Yep, that fits.

But what if I had got it right? What if I had no way of knowing what card I was staring at the back of, and wasn’t being provided the information by any means except the pendulum, and I actually got it right? And it kept happening, consistently?

Well, the ideomotor effect wouldn’t cover that. And I’d be a millionaire.

But it does cover, y’know, every case that’s ever been examined of any kind of dowsing ever. Except the ones that are outright fraud, where there’s conscious deception taking place. But there really doesn’t need to be any malice or dishonesty for people to make magical claims that aren’t based in reality. If you don’t know what the ideomotor effect is, and maybe don’t test out your new idea all that rigorously, and kinda let slide the few occasions where it doesn’t work… then I can imagine this being pretty convincing.

People who do things like dowsing aren’t being stupid or evil. But they are claiming that they can do magic, and it’s a big ask that we should take that at face value without daring to question it any further, even if we don’t doubt their sincerity. It’s the kind of massive claim that we should probably, y’know, check.

And, unfortunately for any aspiring Weasleys out there, natural phenomena like the ideomotor effect provide a better explanation for every instance of “magic” that’s yet been observed. They account perfectly for what’s going on, but the magical explanation fails to explain why the effect always vanishes when studied closely. It just doesn’t work. The five of diamonds was not my card.

Sorry, Hermione. Muggles win.

A more academic and less chatty approach to this topic can be found at The Skeptic’s Dictionary, RationalWiki, SkepticWiki, and all over the place really. Barrett Dorko and Ray Hyman, among others, have written rather more scientifically rigorous documents about the ideomotor effect in action, with examples of experiments in which it’s been seen.


It’s logical fallacy hour again, here in Skeptictionary corner. These tend to be a little lighter and less research-intensive, and I don’t want to wait another month before being able to post something else, so I’m scaling down a bit from the recent mammoth on homeopathy. [Spoiler from the future: I ended up rambling on for over a thousand words on this anyway. But at least it only took me a week this time.]

A happened, then B happened. Therefore, A causes B

So, what’s the deal with the Latin up there in the main title? Damn Romans, you’d think they invented being wrong, the way they get to name everything about it.

Post hoc ergo propter hoc translates to “after this, therefore because of this”. It refers to the idea that, because two things were seen to happen in sequence, the thing that happened first must have (or probably) caused the thing that happened next. It’s fallacious reasoning because, as is commonly pointed out, correlation does not imply causation.

Some examples:

I had a hole in my sock when I attended an event or competition featuring my local team of sportsmen or athletes, and they won! It must be my lucky sock!

I let a spiritual homeopathic healer acupuncturate my feet with his natural quantum reflexopractic needles, and just two weeks later my minor cold had been completely cured!

I did a traditional tribal rain-dance for three days straight, and sure enough the heavens finally opened, and the gods gave us water for our crops!

A guy at work bought a car out of the paper. Ten years later, Bam! Herpes.

The Family Guy gag (the last of the list) highlights how ridiculous this kind of reasoning can be, but it’s often more subtle and pervasive than that, and the first three quotes are exactly the kind of ideas that do genuinely persuade people that they’ve discovered some secret magic which gives them power over the universe. Throw in a good dose of confirmation bias, and it’s easy to become convinced that your choice of tattered footwear can affect a soccer game several miles away, and to write off all the times it hasn’t worked as minor, irrelevant aberrations.

We like finding patterns to things, and it’s a big advantage in nature to be able to connect related concepts and predict the future. If I know that tigers tend to make rustling noises in bushes, then when I hear a rustling bush I can run away before I see the tiger, without having to have his presence confirmed by seeing his teeth where my arm used to be. This sort of low-level prognostication comes in handy for a burgeoning species.

But it’s an instinct that can lead us astray, in our enthusiasm to build up a neat, logical picture of how things in the world are ordered, because sometimes things aren’t very neat or logical. When two things appear to be occurring in tandem, the real story might be more complicated than simply “therefore A causes B”.

For instance:

B causes A

I know that post hoc implies a temporal sequence, so the thing that happens later can’t really go back in time and cause the thing that already happened first. So maybe I’m really talking about the more general cum hoc fallacy (“with it”, rather than “after it”), but whatever. Sometimes you might just have the causative effect backwards.

Hospitals must be unbelievably dangerous places to go. Have you seen how many sick and dying people are in there?

You know, it’s pretty suspicious how the police always seem to turn up after a crime’s been committed. Returning to the scene to admire their handiwork, perhaps?

That kind of thing, though there are probably some less silly examples I could have thought of if I’d got more sleep last night.

Some third thing C causes both A and B

In this case too, the correlation is real. You will likely find that instances of these two things you’re looking at tend to go together. But it’s not simply that one causes the other – there’s actually something deeper going on beneath both of them.

I’ve noticed that people with grubby teeth seem to get lung cancer more often than the rest of us. Does the cancer spread to the teeth? Or is the yellow stuff they get on their teeth giving them cancer?

Everyone keeps getting presents when they put a tree inside their living room. I guess their hospitality is being rewarded by a generous wood-nymph.

One way this kind of fallacy can lead you astray is if you start trying to change one thing by manipulating the other, when actually they’re both just side-effects of some deeper principle, and don’t affect each other at all. (Say, putting up a Christmas tree in June and waiting expectantly for the gifts to accumulate underneath it.) A fascinating example of this can be seen in cargo cults, of which more at some indeterminate future date, maybe.
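
To put some numbers on the Christmas tree example, here’s a minimal simulation (with rates I’ve made up entirely for illustration) in which a hidden common cause drives both the tree and the presents, and a healthy correlation appears without either one causing the other:

    import random

    # Made-up numbers: a common cause C ("it's December") drives both
    # A ("tree in the living room") and B ("presents turn up").
    random.seed(0)
    samples = []
    for _ in range(10_000):
        december = random.random() < 1 / 12
        tree = random.random() < (0.9 if december else 0.02)
        presents = random.random() < (0.8 if december else 0.05)
        samples.append((tree, presents))

    p_given_tree = (sum(p for t, p in samples if t)
                    / max(1, sum(t for t, _ in samples)))
    p_given_no_tree = (sum(p for t, p in samples if not t)
                       / max(1, sum(not t for t, _ in samples)))
    print(f"P(presents | tree)    = {p_given_tree:.2f}")
    print(f"P(presents | no tree) = {p_given_no_tree:.2f}")

The gap between those two probabilities is real and repeatable, but putting up a tree in June still won’t summon the wood-nymph; December is doing all the work on both sides.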

Blind luck and dumb animals

There’s a lot of stuff going on in the world at any particular moment. There are many variables which go up and down over time. Sometimes, some of these will line up for a bit, with no underlying significance whatsoever, purely by coincidence. Pastafarianism makes good use of this by attributing global warming to a lack of pirates (or possibly vice versa), but a lot of more common magical thinking falls into this category as well.

Get barely over a thousand people flipping coins, and it’s more likely than not that one of them will get ten heads in a row on their first try. If you look hard enough, there’s probably something unique about that one person, to which you can attribute this “luck”.
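
That figure checks out, assuming fair coins; the arithmetic is quick to sketch:

    # Chance that at least one of n people throws ten heads in a row
    # on their first ten flips of a fair coin.
    p_one = 0.5 ** 10                    # 1/1024 for a single person
    for n in (710, 1_000, 20_000):
        print(f"n = {n:>6}: {1 - (1 - p_one) ** n:.3f}")
    # n = 710 is roughly the break-even point (~0.50); n = 1,000 gives ~0.62,
    # so "barely over a thousand" is indeed more likely than not. And at
    # 20,000 -- see the sports fans below -- it's a near-certainty.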

And if 20,000 people regularly turn up to sporting events and take note of exactly what they’re wearing each time, some of them are going to find pretty convincing patterns between their chosen attire and the performance of their team. (And, again, some of them will see entirely unconvincing patterns but remember the hits and forget the misses.)

I’m sure we’re all mature enough here to be beyond such utter bullshit, but the whole “lucky socks” thing, as I’m going to categorise it, pisses me off enough to be worth focusing on for at least another paragraph. I know you want to feel important, and it’s tempting to leap to what seem like justified conclusions, but can you really not get over that instinct and just grow the fuck up? If you honestly believe that any ritual you compulsively go through actually has any effect on the “luck” of something as unconnected and multivariate as a Cup final, then you are literally too retarded to be allowed to handle crayons without supervision. And by “literally” I of course mean “figuratively”, but you’re almost certainly too moronic to know the difference.

Why would your own magic talisman cancel out every other equally lucky object owned by every single other person watching this sports game, let alone the relative skills and efforts of all the actual players? By what inanely trivial divine law would the outcomes of such events revolve solely around a single banal and irrelevant action by, of all people, you? If circumstances like sports results and the weather are going after you personally, why do they completely ignore everybody else’s schedule but your own?

You are seeing patterns where none exist.

Pigeons believe this kind of thing, or rather they learn to perpetuate arbitrary behaviour patterns in their efforts to achieve an unrelated goal. They’re not animals known for a natural talent for critical analysis, but they’re just about smart enough to realise that, if they were tapping a pattern with their left foot when the food turned up last time, it might be worth tapping it out again to see if it still works. But we’re really supposed to be more intelligent than fucking pigeons, and we really ought to have grown up beyond the point where we think there are mischievous leprechauns pushing footballs around mid-flight based on whether some guy watching the match is wearing the right underwear, or whatever the fuck the logic’s supposed to be.

Stop ranting, this is supposed to be one of your serious and informative bits

Sorry. Where was I? Oh, I think I was about done.

I’ve said before that humans have a crappy natural grasp of things like probability, and maybe that deserves a whole post of its own. But we have things like science, and statistics, which mean we don’t need to rely on our appalling instincts in determining the truth. We can do some tests and look more closely at alleged relationships like this, and if there’s really something there, we can find out more about it. But if we’re not doing any of that, then we’re wandering blindly in a world of wrong ideas, with no way of knowing how misguided the concepts we’re snatching at might be.


It’s logical fallacy hour again, here in Skeptictionary corner.

That rule doesn’t apply to me because I’m a beautiful and unique snowflake.

Special pleading can often look reasonably convincing, and be quite persuasive if you don’t know what to watch out for. It might also appeal if you’re already inclined to believe in the amazing specialness of the snowflake in question. It’s when someone offers a get-out clause to stop you from drawing a particular conclusion. If the evidence seems to lead in a direction that they have a problem with, they might jump in with a special pleading to divert you from it. It can essentially be boiled down to, “That doesn’t count in this case, because…” followed by an irrelevant justification for why some particular example should be considered an exception to the rule.

I’m not selling medicines; all my products are natural herbal remedies, which re-align people with the spiritual energies of their mother Earth. So of course they shouldn’t be regulated for safety and efficacy the same way as conventional pharmaceuticals.

The love potion didn’t work? Ah, it’s because you used it when Mercury was in retrograde. That’ll be what went wrong.

It’s a popular one among psychics, or other supernatural nuts who’ve never managed to give any convincing demonstration of their supposed abilities. The JREF often sees it in applicants to their million-dollar challenge, who were all set to wow the world with their powers, agreed to all the conditions set, then fizzled out hopelessly when it came to the crunch. Patricia Putt was a recent example of this: after failing to do anything impressive, she started coming up with excuses as to why it didn’t work, in an effort to avoid the obvious, simple, parsimonious solution that her magical powers are all in her head. The “negative skeptical energies” which so often throw the powers out of alignment only seem to come up after the fact, when the psychic powers have failed to do what was promised and we’re ready to conclude that they don’t exist. Then a special pleading is necessary to try and get around this; suddenly there’s “a perfectly good reason” why it didn’t work.

When considering this fallacy, it’s important to remember that not all post-hoc reasoning is invalid, and not every exception to a rule means there must be an unfair double standard at work. The Fallacy Files use the example of police officers being permitted to break certain laws which normally apply to all road users, such as speed limits, under certain conditions which we can agree justify this exception being made (like if they’re in hot pursuit of some villainous mastermind). Police needing to drive fast to do their job properly is a relevant and realistic exception to the normal rules. My urgently needing to get home in time to catch the final of The Apprentice, however, is not.

Or, if I claim to be able to run a marathon in under 3 hours, it’s not special pleading to insist that the course avoid scaling any mountains. That’s a reasonable request, perfectly in line with what any reasonable person would infer from the initial statement, and doesn’t much diminish the claim. But if I also say that I have to be allowed to use rollerskates, because that’s just how I’m used to running, then that’s less logically sound.

A crucial part of what makes this argument fallacious, then, is that the excuse has to fail to explain why this example should be treated differently than it would normally be. This might be because it’s irrelevant (like the “natural” status of alternative medicinal treatments) or because there’s no reason to suppose it’s true (what the hell does Mercury have to do with anything?). This latter is the form that tends to crop up when skeptically analysing unlikely claims.

If, to leap haphazardly across to yet another example, your highly technical ghost-measuring device is failing to measure any ghosts where ghosts ought to be, but you then find out that you forgot to put the batteries in, then fair enough. It’s clear that an electronic gadget needs power to run, so it’s not like you’re only bringing this up now as a post-hoc excuse. You still haven’t proved anything, but we can agree that you’re not simply making a special pleading to explain this.

On the other hand, if everything was functioning properly, and you had to resort to complaining about “negative vibes” to explain why the ghosts didn’t seem to be there, then you’re floundering in fallacy. If your magic powers haven’t worked, then it doesn’t make your position any more tenable to blame it on someone else’s bad juju, when there’s no more evidence for that than for the ghosts. It just adds another layer of implausible claims which there’s no good reason to take seriously.

You still haven’t proved that someone doesn’t have magic powers, or that ghosts don’t exist, just by bringing the whininess of their special pleading to light, of course. But the burden of proof is on them if they want to be believed, and conjuring elaborate circumstances to excuse a failed attempt just raises the bar higher for how much evidence they need to bring.


More colourfully named than most, the “No true Scotsman” logical fallacy is attributed to Anthony Flew, and is named for the example he gave of a potentially offensive racial stereotype named Hamish.

It’s a way of sticking to your guns beyond what’s reasonable, and avoiding having to admit to making a mistake, by changing the meanings of the words you’re using, to make it look like you’re still right about something. In Hamish’s case, he begins by claiming that “No Scotsman” could act in some way incongruous with his ideas of what his countrymen are like (Flew’s example is of a sex maniac). However, when he learns that one of his compatriots really has been letting the whole nation down, he redefines his terms, and labels that scoundrel as being “No true Scotsman”.

The fallacy is in moving the boundaries of the category in question, so that what you want to say about this category becomes true by definition, and no evidence can ever prove you wrong. All Scotsmen behave impeccably, because nobody who does anything that Hamish sees as distasteful is allowed to count as a Scotsman. This no longer has any meaningful implications about the virtues of people from Scotland, because that’s no longer what the term “Scotsman” is being used to mean. After all, what’s Hamish actually saying about this one Gaelic ne’er-do-well? Probably not that his birthplace must really have been somewhere other than Glasgow, or that the ethnic backgrounds of his parents must be different than had always been assumed. Hamish is just trying to preserve some unrealistic ideal of what a Scotsman is, which simply doesn’t stand up to the facts.
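
The structure of the trick is easier to see written out baldly. A toy sketch (mine, not Flew’s):

    # Toy sketch of Hamish's manoeuvre: the conclusion is smuggled into the
    # definition, so no counterexample can ever reach it.
    def is_true_scotsman(person):
        return person["from_scotland"] and person["impeccable"]

    def all_true_scotsmen_impeccable(people):
        return all(p["impeccable"] for p in people if is_true_scotsman(p))

    scoundrel = {"from_scotland": True, "impeccable": False}
    print(all_true_scotsmen_impeccable([scoundrel]))    # True -- unfalsifiably so

The claim comes out true for any input whatsoever, which is exactly why it no longer tells you anything about Scotsmen.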

A similar example, claimed by some Christians (including my old boss), is that once you’re “saved” and accept Jesus into your life, you’ll never again be without his presence in your heart (essentially, once you let him in, you’re stuck with him forever). Anyone who ever loses their faith, then, and “stops” being a Christian, was never really a Christian in the first place.

Most people I know simply understand a Christian to be, more or less, someone who believes that Jesus was the son of God, and lived and died for us a couple of millennia ago, all that usual jazz. If you want to redefine the word, then that can be okay, so long as we’re all clear on what it means from the outset, and the implications of this are consistent. In this case, one side effect is that you can never really tell who’s a Christian and who isn’t.

Hamish might be convinced that his old friend Dougal is as true a Scotsman as the next kilt-wearing, bagpipe-playing, haggis-munching highlander. But if he’s also certain that no true Scotsman could have been involved in that scandal with the sheep, then Dougal must really have been some kind of soft Southern fairy all along. Anyone is liable to have their Scottishness retroactively revoked at any moment, if they stray outside these new boundaries.

And Christians, if they choose the argument as described above, can be no more certain. A lot of people have made a pretty convincing show of faith for years, or decades, but turned away from it in the end, and showed that they were never actually Christians at all. You can rule out some people after the fact like this, but that’s about all you can do. It becomes a uselessly slippery concept, significantly redefined for your own convenience, which has very little to do with the realities of Christianity, or anybody’s idea but your own of what a Christian is.

Also, as was helpfully suggested by some of the fine, upstanding skeptical thinkers who were kind enough to comment on my post over at ThinkAtheist, Christians will sometimes distance themselves from those who might share their faith, but not their values (in exactly the way Hamish does). It’s not uncommon for the likes of Fred Phelps and Ted Haggard to be dismissed as not being true Christians, because of how badly they seem to reflect on the group as a whole. And to an extent, a case can probably be made for this some of the time. It might not be unreasonable to assume a general consensus by which any true Christian is expected to make at least a cursory attempt to follow Christ’s teachings about not casting the first stone or turning the other cheek.

But the Bible has a lot of distasteful, unpleasant stuff which most of its adherents gloss over and ignore. Who is or isn’t acting as a “true” Christian isn’t easily settled, when it comes to homosexuality, or murdering children, or the many other instances where contradictory advice is given. It’s easy, but not really justified, to accuse someone of doing Christianity wrong simply because their approach doesn’t line up with your own personal interpretation.

Of course, there are some facts which can genuinely and universally be asserted about all true Scotsmen. None of them have lived in Australia all their lives and been raised by German parents, for one thing. But there’s no fallacy in barring Bruce Schnitzelkopf from our “true Scotsmen” camp; it’s perfectly in keeping with the definition we started with of what constitutes a Scotsman. He’s not from Scotland. No goalposts have had to be moved, we haven’t had to bend the interpretation at all, and we’re not trying to wangle our way out of admitting we were wrong. We’re just following the use of the terms as we set them down in the first place.

But if you have to change the definition of a Scotsman, and bring in matters entirely unrelated to geography and nationality, then you’re assuming a priori the very fact that’s being argued: namely, that all Scotsmen are paragons of goodness who would never dream of treating a sheep in so ungentlemanly a fashion. And that spells logical fallacy.


It’s logical fallacy time again, here in Skeptictionary corner, just because I feel like it, and because these are usually fairly light on the research, which is useful when I’ve got a headache. These fallacies tend to be the sort of thing your squishy primate brain isn’t innately equipped to handle intelligently, but which you can comfortably deal with using just a dash of good sense.

First, I’m going to blow your mind with something hugely improbable. I’m currently shuffling a deck of cards in between keystrokes, and once it’s more or less randomised (or close enough for our purposes, anyway) I’m going to deal out the top few and see what we get. Watch closely now: I am about to defy some astounding odds. Nothing up my sleeve…

Seven of clubs. Eight of hearts. Jack of clubs. Nine of diamonds. Two of diamonds. Queen of hearts. Jack of spades. And, miracle of miracles, the six of hearts.

I could go on, but that’s impressive enough right there, isn’t it? Do you know what the odds were against my drawing that exact sequence from a shuffled deck? It’s up into trillions to one against, literally. (And I mean “literally” literally there, not figuratively, or non-literally, as it’s often used.) But I laughed in the face of improbability, and went ahead and did it anyway, first time. I barely even broke a sweat.

Now, I admit, it may have been a bit more impressive if I’d predicted or aimed for this particular sequence before making the draw, but get off my back, I’m making a rhetorical point here.

Obviously, any sequence of cards in a deck is equally unlikely to appear at random – there are about 80,000 billion billion billion billion billion billion billion different ways of arranging them, after all, so that’s pretty stiff odds against any one particular combination – but somehow arranging them in such a sequence is never a hard thing to do. The point, of course, is that they have to be arranged somehow, and you’re not flying in the face of probability if it’s only afterwards that you point out how unlikely a result it is.
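
Both of those numbers are easy to verify, if you’d like to check my maths:

    from math import factorial, perm

    # Odds against dealing that exact eight-card sequence: 52 * 51 * ... * 45.
    print(f"{perm(52, 8):,}")             # 30,342,338,208,000 -- trillions to one
    # Ways of arranging the whole deck:
    print(f"{float(factorial(52)):.3e}")  # 8.066e+67: 80,000 followed by seven "billion"s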

By the same reasoning, some lucky sod won the Lotto jackpot again last week. And we’re supposed to believe s/he just pulled off a millions-to-one shot by chance? I call shenanigans.

This is all fairly easily understood, and the story that gives the fallacy its name – a gunman shoots a hole through a wall, then carefully draws a target around wherever the bullet happened to go – is obviously comical. But it can be a surprisingly insidious mistake.

It’s bad practice to base a hypothesis on certain information, and then point to that same information as evidence that your hypothesis is right. If you flip a coin a hundred times, the odds of the heads/tails split being exactly 50/50 are actually pretty slim; chances are that you’d get a few more of one than the other. So, some sort of uneven leaning is more likely than not, even with a completely fair coin; obviously you can’t start claiming that it’s biased just because you got a few more heads than tails.

If it only came up tails twice in 100 trials, then that may be significant – even if you were just idly flipping coins to waste a boring afternoon, rather than testing a hypothesis (For Science!), you were still probably doing so under the assumption that heads and tails are both equally likely, so this would definitely look weird. But ideally a result like this should be repeated and verified, to make sure the original data hasn’t been cherry-picked, or that this exact fallacy isn’t making you read too much into a simple coincidence.
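
Both of those claims are easy to put exact numbers on with the binomial distribution; a quick sketch:

    from math import comb

    flips = 100
    total = 2 ** flips

    # An exactly even split is the single most likely outcome, but still rare:
    print(f"P(exactly 50 heads) = {comb(flips, 50) / total:.4f}")   # ~0.0796

    # Whereas two or fewer tails in a hundred flips really would be weird:
    p_weird = sum(comb(flips, k) for k in range(3)) / total
    print(f"P(at most 2 tails)  = {p_weird:.1e}")                   # ~4.0e-27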

A study on whether people with some star signs are more likely to have road accidents than others is not necessarily justified in assuming any kind of correlation; if no hypothesis was proposed before the data was collected, then the differences observed could be nothing more than expected random fluctuations. It’d be weird if accident rates were split entirely evenly twelve ways. And apparently the most dangerous sign, Aries, was responsible for “nearly 9% of all road accidents”. When you consider that one-twelfth is about 8.33% anyway, this doesn’t sound so impressive. If it was a big enough change, observed in a big enough sample size, then maybe it could be significant if there’d been a hypothesis to begin with, but going “Ooh, that number’s bigger than that other number” doesn’t qualify as a statistical analysis.

(The explanations attributed to astrologers, for why these results would be expected, seem entirely post-hoc and deeply selective; Aries being fiery and headstrong makes sense, but “prudent and over-cautious” Capricorns were the 6th most dangerous, “protective” Cancer was 4th, and “adventurous and careless” Sagittarius were the safest of the lot.)
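
Since the sample size isn’t quoted, here’s the sort of sum you’d actually need to do, sketched with hypothetical accident counts and a simple normal approximation:

    from math import sqrt

    def z_score(hits, n, p0=1/12):
        """Normal-approximation z for `hits` out of `n` events at base rate p0."""
        return (hits / n - p0) / sqrt(p0 * (1 - p0) / n)

    # Hypothetical sample sizes -- the study's real counts aren't given here.
    for n in (1_000, 100_000):
        hits = round(0.09 * n)           # "nearly 9%" of accidents were Aries
        print(f"n = {n:>7}: z = {z_score(hits, n):.2f}")
    # At n = 1,000, a 9% share is comfortably within noise (z < 1). It only
    # starts to look significant with enormous samples -- and even then, only
    # if Aries had been singled out before anyone peeked at the data.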

In a more subtle variation of the origin story, our sharpshooter unleashes a volley of rounds against the poor unsuspecting wall (which was only standing there doing its job, after all), and then draws his target around a cluster of several holes, which just happen to be closer together than most of the others, and which look like a fairly convincing series of hits (particularly with the target drawn in after the fact). But although it can sometimes look weird, this kind of clustering is only to be expected when the data is flying all over the place.

People tend to have a clear idea of what randomness ought to look like. It should bounce crazily and unpredictably all over the place, with heads coming up as often as tails (or whatever the variables are), and recognisable patterns definitely aren’t allowed. A string of heads amidst an assorted mass looks “less random”, and might stand out. But there are quite a number of “non-random” patterns that could be found in a series of coin-flips – a run of heads, a run of tails, a run alternating between the two, consecutive short runs of each, and so on – and the odds of some of these turning up in a long enough series might not be that long. It’s the monkeys/typewriter/Hamlet principle on a smaller scale. Spend a few hours tossing a coin, eventually you’ll get ten heads in a row, and probably stumble across a few other interesting patterns on the way.
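
The “spend a few hours tossing a coin” claim holds up under simulation; a sketch, assuming a flip every couple of seconds (so around 5,000 flips in an afternoon):

    import random

    def has_heads_run(flips, length=10):
        """True if `flips` (booleans, True = heads) contains `length` heads in a row."""
        run = 0
        for f in flips:
            run = run + 1 if f else 0
            if run >= length:
                return True
        return False

    random.seed(1)
    n_flips, trials = 5_000, 2_000
    hits = sum(has_heads_run([random.random() < 0.5 for _ in range(n_flips)])
               for _ in range(trials))
    print(f"P(ten heads in a row somewhere) ~ {hits / trials:.2f}")   # ~0.9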

If you watch random noise long enough, you will see patterns appearing. If you only point them out afterwards, though, it’s a lot less likely to be impressive or interesting. The results of the Global Consciousness Project would seem to be a good example of this, but my lengthy digression about them has been cut, and I’ll try and put that in a separate article soon.


