
Archive for June, 2012

Revisiting some of Jim Butcher’s rather fun Dresden Files books has made me realise something that’s always bothered me about urban fantasy fiction:

In most fictional worlds where vampires, werewolves, and the like really do exist, most people’s attitudes towards them are exactly the same as in the world where they don’t exist.

That is, this world.

I should perhaps define some terms before getting into this. By “urban fantasy”, I mean stories which are essentially set in the real world, but with a certain fantasy twist. So, we’re not talking Middle Earth here: The Dresden Files are set in Chicago, Sunnydale is fictional but could be any regular town in California, that sort of thing.

There are important and obvious differences between these two tropes. In a completely fictional realm, living with orcs and elves and dwarves is the norm for your characters; with urban fantasy, the protagonist might be one of very few who can give you a glimpse of the magical aspects of your own world.

This isn’t always how it works – there are plenty of stories which take place in a slight variant on our planet, where most geography and history and culture are the same, but some aspect of fantasy has become mainstream, and witches in London are now as populous as hobbits in the Shire. But, in my experience with the genre at least, this is less common.

Most regular people Harry Dresden meets in Chicago think he’s crazy for advertising his professional services as a wizard. Awareness of Buffy’s vampire-slaying seems limited to a very exclusive clique, outside of which nobody notices or believes in any of it.

Here’s the thing, though: In the real world, the majority react this way for a very good reason. In this world, if you meet someone who claims to be an actual wizard or to slay vampires, you’d have to be reasoning extremely poorly if you took them wholly at their word.

But in the world of urban fantasy fiction, there really are vampires and wizards and magic.

So why are there still skeptics?

Or perhaps I mean: why is the skeptical position so often depicted almost identically, when you’ve completely changed a crucial aspect of the context – namely, the actual evidence of the phenomenon in question?

The role vampires play in the lore of our world is pretty much exactly what you’d expect to see if vampires weren’t real. If they were really out there, they’d have had a much more significant impact. There ought to be numerous verifiable reports. They shouldn’t be such an elusive unknown quantity if they were real. It’d be like trying to pass off wasps as just an urban legend.

Harry Dresden conjures fire from the air at a single word of command. He summons the wind to do his bidding. He deflects machine-gun bullets with a magical shield. He’s a genuine, powerful wizard. Why is he still having trouble convincing anyone?

I don’t remember if Buffy ever really dealt with the skepticism thing. She knows damn well there are vampires. Most other people don’t, but they tend to come around to the idea pretty quickly when presented with evidence, often in the form of a set of fangs plunging toward their neck.

This is something I’m really struggling with in the second draft of the urban fantasy novel I’m trying to write at the moment. At first I just ignored it, and assumed as cavalierly as many authors do that all the magic and undead creatures wandering around have simply been flying under the entire world’s radar for a few centuries. But the basic implausibility of that idea is probably going to be too much for me to comfortably ignore. I’m going to have to find some way to integrate the supernatural into the world at large, or to explain its general absence. The story would probably still work if I didn’t, and it wouldn’t bother too many readers, but it’ll bug the crap out of me.

I don’t really have an end to this post, so I’m just going to stop abruptly in the mi

Read Full Post »

Workfare doesn’t work, say the people organising it.

The Department for Work and Pensions have performed their own assessment of the MWA programme (that’s “Mandatory Work Activity” – bit of a giveaway in the name there), and concluded that there’s no reason to suspect it provides any worthwhile benefit to the people it’s being inflicted upon.

Employment minister Chris Grayling has defended the scheme, protesting that the data used in the study are out of date and so the conclusions no longer apply. He said:

We’ve found that a month’s full-time activity can be a real deterrent for some people who are either not trying or who are gaming the system. But we’re also fighting a battle to stop claimants slipping back into the benefits system by the back door.

First of all, I don’t know how he can say that “we’ve found” anything of the sort, unless there’s been some other study done into the same scheme which isn’t being reported on.

Secondly, let’s be clear that by “a month’s full-time activity”, what he actually means is “a month of working, full-time, without being paid, with the threat of having your benefits cut off looming over your head if you don’t comply”. Now, I daresay people who are gaming the system probably do find that something of a deterrent, but I’d also stick my neck out and hazard that it’s pretty fucking off-putting for people who are trying to support themselves and their families while they look for a fucking job, too.

This is exactly what I’m sick of hearing when politicians talk about this kind of thing. We’re always being warned about the threat of people cheating the system; there’s rarely a thought spared for people being exploited by the system, such as those forced into working full-time for no pay. Nor for the people being thrown haplessly into the system, when they lose their minimum wage jobs because their corporate employers realise they can save money by replacing them with someone from Workfare who doesn’t need a salary.

This focus by politicians and the elite on fomenting contempt for those among us worst off and least able to defend themselves is as blatant a case of actual class warfare as I can think of. Particularly when – as I keep banging on about – the expense to the country imposed by benefit fraud by the poor is dwarfed by that of tax avoidance by the rich.

Anyway, the case against the Workfare scheme is now supported by common sense, basic human decency, and the only systematic evidence available.

Is anyone listening yet?

Read Full Post »

This is how great America is at the whole criminal justice thing.

A number of people have been put in jail because it was believed they were guilty of a federal crime. Now that it’s turned out this was not in fact the case, there’s nobody whose job it is to tell anyone that these people shouldn’t be locked up.

If you want to challenge a conviction that’s landed you in prison, there are laws and protocols you have to follow to do that. A minor detail like not having committed any crime doesn’t get around this fact, and the Justice Department don’t consider it their job to let these prisoners know that it’s been officially acknowledged that they’re innocent.

These cases are largely unknown outside the courthouses here, but they have raised difficult questions about what, if anything, the government owes to innocent people locked in prisons.

“It’s been tough,” said Ripley Rand, the U.S. attorney in Greensboro, N.C. “We’ve spent a lot of time talking about issues of fundamental fairness, and what is justice.” [emphasis added]

They’ve spent a lot of time talking about this. I wonder what pointlessly insubstantial waffle they must have been saying, if we’re still debating whether the government owes anything to people it’s locked up in cages for absolutely no reason.

I once worked in admin at a psychiatric hospital for about a year, and noted with interest how much red tape and form-filling had to be done in order to continue justifying the detention of individuals under the Mental Health Act.

Most of the patients had regular recourse to a hearing by a Tribunal or a panel of independent managers, and had the details of their cases scrutinised by outside experts and legal advisors. The paperwork giving the hospital the right to detain them was usually only valid for a fixed period, and regularly needed to be formally extended, with a number of qualified staff signing off on it. Occasionally, a deadline would approach while something important hadn’t been signed or dated or faxed through, and we’d have to scurry into action to get it sorted; otherwise we’d be illegally detaining someone against their will, and they’d sue. If things were formalised even a day late, it was a big deal.

My point is, in my limited experience with psychiatric treatment in the UK, the authorities were constantly having to work to provide evidence that they were justified in imposing restrictions on other people’s liberty. If they didn’t, their rights to detain anyone would lapse in time, and things would tend generally toward a state of freedom. This seems like quite an important idea.

It doesn’t seem to be one the US prison system places much stock in.

Perhaps they’re worried about the expense of all that extra bureaucracy. In which case, Penn Jillette has a cost-cutting measure they might want to consider.

Read Full Post »

Order, order

(I should probably put this disclaimer up about this one.)

Is following orders a good thing?

Clearly it isn’t always. The phrase “just following orders” has become strongly associated with Nazism, because of how often it’s been used to refer to those numerous soldiers of the Third Reich who carried out inhuman atrocities, but who probably weren’t unusually immoral people outside of the context of Hitler’s dictatorship. As Stanley Milgram later described, regular folks are capable of doing terrible things under the right circumstances, and these Germans found ways to rationalise and justify to themselves the abominable acts they committed.

One of the things that helped many of them sleep at night, and reconcile the slaughter of innocents with their self-image as not-evil, was the idea that they were “just following orders”. Their job was to do what the higher-ups told them to do. It wasn’t their own decision that these things should be done, and if they didn’t do what they were told, they’d be harshly disciplined and someone else would be sent to do it instead.

Any Nazi soldier who defied this authority, and determinedly did the right thing regardless of the risk to themselves if their insubordination was discovered, we would likely be inclined to think of as heroic and noble. And you don’t have to be a Nazi to earn acclaim for thinking independently, acting morally, and not following orders. If you were told to do something morally wrong, and you decided not to do it because it was morally wrong, chances are you’ll have a good chunk of public opinion on your side.

But if doing wrong things is still wrong, even when someone in authority instructs you to do them… doesn’t that rather undermine the concept of “orders” altogether? We seem to have redefined an “order” as “something you’re told to do, which you ought to do except when you oughtn’t”. It seems like we’ve decided we can pick and choose what orders to follow based on whether the thing we’re being ordered to do is morally permissible, which puts “orders” on about the same level as “suggestions” when it comes to carrying moral weight.

And yet the assumption remains ingrained in much of modern life that following orders and instructions is an important, valuable skill. It’s sometimes justified by the idea that we need defined roles of order-takers and order-givers in order to maintain structure – nothing would get done in any organisation if someone weren’t telling people what to do, backed by some sort of authority to make sure it got done.

So, do we have to admit that, sometimes, it’s morally necessary to follow an order to do a bad thing, for the sake of maintaining group cohesion? Is it an unfortunate fact we just have to face that, now and then, we’re morally obliged to drown a puppy for the greater good of bolstering a social structure which allows us to achieve so many other things and avoid descending into chaos?

Or is it still only morally right to follow such orders, if doing the thing you’re ordered to do would be morally right anyway?

Back in the increasingly distant days of my having a job, people ordered me to do things all the time. In practice, they were much more polite than to make it seem that way, but that’s what it amounted to. And, generalising broadly, if I hadn’t followed those orders, I wouldn’t have got paid. I don’t recall ever being ordered to do anything morally problematic, but I wouldn’t have been motivated to do most of those tasks at all if not for the financial incentive, which came in the form of an order from the people paying me.

Even this, though, doesn’t seem to imbue order-following with any kind of moral righteousness. My employers were limited in what they could order me to do, after all, by the agreement I’d made with them about what my job actually was. The only moral thing I was doing was adhering to a pre-arranged contract to perform a certain role in the office. The “orders” I was given day-to-day defined the details of what comprised that role at any given moment, but the parameters were fixed, and it doesn’t seem like there was ever any particular moral goodness behind obeying any order in particular.

The main thing compelling me to keep following orders was that I didn’t want to get fired. Sometimes the authority behind an order amounts to “do this thing or this negative consequence will befall you” – which, rather than relegating orders to mere suggestions, now wholly equates them with threats.

Are they capable of being anything else?

Read Full Post »

In a dishearteningly short time, a couple of stories have appeared to which my new catchphrase could be applied.

A man put a sign up in his home, visible from outside, reading “Religions are fairy stories for adults”.

This, obviously, is something some people will disagree with. I’ve often seen things people have put on display on their own property, expressing their own views, with which I’ve disagreed: sentiments such as “JESUS SAVES” typify the sort of thing I mean. They’re wrong, but that’s fine. It’s no bother to me, and no business of mine.

Police have told this man that he’ll have to take his sign down if anyone complains that they’re offended by it, or else he’ll be arrested.

Now, that’s importantly different from having the authorities simply swoop in and order him to remove the sign straight away or face a prison sentence. Nobody’s complained about it yet. But they could. Anyone walking past his house has the legal ability to lodge a complaint, and make it illegal for that sign to remain up in the man’s own home. That’s still pretty scary and surreal.

The Public Order Act is the law in question, which is supposed to protect us all from “distress” caused by other people’s “threatening, abusive or insulting” actions. It’s hard to imagine how complaining to the police, and having them threaten a pensioner with forced imprisonment, is a more effective or humane way of saving anyone from distress than simply choosing not to look at that particular square foot of some other guy’s house.

Ken at Popehat asks:

What is the character of a person who sees a sign like that in a pensioner’s window, and runs to the police to complain?

He may find at least a partial answer to that question in the form of Stephen Green of Christian Voice, who both complains about members of his own religion being charged with crimes under this law and simultaneously mocks the idea that an atheist being so censored has any legitimate complaint. He throws in a dozen or so misunderstandings of evolution toward the end, too, for no reason except to hurl misguided abuse.

Anyway…

Putting up a piece of paper, with some writing expressing their disagreement with others’ opinions, in the window of their own house?

Or, for that matter, cheering at their daughter’s graduation ceremony at a point when they’ve been asked to keep quiet?

Strange reasons to think you’re entitled to lock someone in a cage.

Read Full Post »

I find things like this very confusing.

Regardless of the one-word religious label that best sums her up, Leah seems like one of the good guys. I haven’t made a habit of reading her blog consistently, but I’ve always got the impression that she’s well on the side of tolerance, intellectual honesty, and all-round being nice to each other. I’ve written before about her idea for the Ideological Turing Test, which was an excellent way of examining how well believers and non- can actually understand and empathise with each other, and accurately describe the opposing view.

She’s been a relatively high-profile atheist blogger for quite some time. A way bigger noise in the godless community than me.

And now she’s a Christian.

Which – with absolutely not a gram of malice intended towards Leah or anyone else – I find seriously weird.

There are plenty of ongoing discussions about what atheism is, but the most common proposition I keep hearing (from atheists) is that it simply describes the lack of belief in a god, and nothing more, regardless of how some religious people might want to stereotype or generalise about us. Perhaps it’s partly because of this oft-repeated truism that I tend to assume that, when it comes to the whole god thing, I’m on roughly the same page as other atheists – at least, those atheists firm enough in their convictions to, say, blog regularly about it. Whatever our other philosophical differences, and whatever the divergent nature of the paths that took us here, we’re of one mind now in rejecting god in all his/her/their forms.

But an announcement like this puts a whole new perspective on it. Apparently Leah and I have been worlds apart this whole time. I can’t begin to understand her reasoning process right now, and apparently that’s been true for longer than I realised.

I just can’t really conceive of myself ever taking a step backward like that. It feels like I’m such an unfathomably long way from where she was when she made this decision. It’s not like I find atheism fundamentally unsatisfying, or miss the comfort religion used to offer. I’m not interested in “exploring my spirituality” in the inane way most people use that phrase. I’m not struggling to find some resolution to the metaphysical epistemology of moral reasoning. There’s just no God.

There are more interesting and complex and difficult questions than that, of course, just like there’s a lot we still don’t know about the history of the development of life on this planet. But the basic idea of common descent and Darwinian selection isn’t in crisis. And nor is my lack of faith.

Also, it’s not just that Leah now believes in God – she hasn’t simply been too troubled by the ultimate question of whether there isn’t something bigger than us out there, and fallen into a flimsy, tentative, unnecessary-but-kinda-understandable deism. She’s Catholic. Not only does atheism not adequately explain the visible world for her any more, but the people who had the right answer all along are the Catholic Church.

Seriously? This Catholic Church? The one right here? From not believing in any god at all, you’ve swung so far round that these despicable fuckers here now seem like the best bet for all the reliable facts about Jesus, and the meaning of life, and morality – the very points of contention that led you to convert?

Apparently I’m more angry about this than I thought. I don’t know Leah personally, and don’t wish her any ill. It’s just quite bewildering.

Read Full Post »

A proposed law in North Carolina would restrict scientists and limit how much science they’re actually allowed to use when doing science.

In case that’s a bit vague for you, here’s a quote from the bill being considered, describing the ways in which they’d be permitted to examine and describe the rates at which the sea level is rising:

These rates shall only be determined using historical data, and these data shall be limited to the time period following the year 1900. Rates of sea-level rise may be extrapolated linearly…

Lawmakers seem to share the concerns of Tom Thompson, a spokesman from a local economic development group, who worries about science being done by “nothing but computers and speculation”.

Science which depends on such arcane and incomprehensible techno-wizardry as “computers” is, of course, well known to be less reliable than simply declaring the world to be how you want it and assuming everything will work out for the best.

And the restrictions due to be placed on scientists make perfect sense. Just like how, sometimes, it’s better for everyone if you insist that the defendant in a criminal trial enter a plea without resorting to use of the word “not”. It’s still perfectly fair on them; it just ensures that reality lines up neatly with your own desired outcome.

Perhaps canny state legislators noticed how Springfield was never threatened with destruction by a comet again after its residents burned down the observatory.

Also, if NASA had had to assume that gravity decreases linearly as you move away from the Earth, instead of making things all complicated, maybe we would have reached the Moon a lot sooner. I guess we’ll never know.

Anyway, in a spirit of true North Carolinian enquiry, I’ve done a bit of my own research into other trends that can be foreseen, using the same conditions as these oceanographers will be working under, and I’ve discovered some fascinating facts about the world of the future.

Here are just a few examples.

In 1920, the men’s 100m sprint world record was 10.6 seconds. As of 2009, it now stands at just under 9.6 seconds. Having improved by a whole second in just under 90 years, it can be linearly extrapolated that by the year 2873, men will be able to run the 100m instantaneously.

At the turn of the next millennium, they’ll be crossing the finish line a second and a half before the starting pistol is fired.

Oddly enough, the first man to run a marathon in no time flat will do so in 2244, even while a much shorter dash still takes several whole seconds. Meanwhile, women will be starting the marathon more than an hour after they’ve already finished it.
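If anyone wants to check my working, here’s the entire state-approved methodology as a quick Python sketch. To be clear, the function and the rounded figures are my own inventions, not anything prescribed in the bill; fitting a straight line through the historical data is just the only thing “extrapolated linearly” can really mean.

```python
def linear_extrapolation(ref_year, ref_value, rate_per_year):
    """The bill's entire sanctioned toolkit: one historical reference
    point, one historical rate of change, and a straight line."""
    return lambda year: ref_value + rate_per_year * (year - ref_year)

# Men's 100m world record: ~9.6s as of 2009, improving at roughly
# one second per 90 years (the rounded figures from above).
record = linear_extrapolation(2009, 9.6, -1.0 / 90)

print(record(2873))  # ~0.0 seconds: the instantaneous sprint
print(record(3000))  # ~-1.4 seconds: crossing the line before the gun
```

Feed the same one-liner the Eiffel Tower and a rate of 5m a year, and it dutifully confirms both the mile-high tower of 2160 and the strictly subterranean architecture of 1840.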

In 1900, the tallest building in the world was the Eiffel Tower, at 300m. This has since been surpassed by the Chrysler Building, the Empire State Building, and numerous others. The current record-holder is a ridiculous half-mile skyscraper in Dubai. Very approximately, then, we seem to be extending an extra 500m skyward each century.

By 2100 the tallest structure will be 1300m high. By around 2160, the toppermost of top floors will be a mile off the ground. That’ll double in a further 320 years. I’m not sure what’s going to motivate us to keep building up and up and up like this, how we’ll keep these things structurally sound against high winds and earthquakes, and whether low temperatures will become problematic as we start nearing the edge of the troposphere – but hey, I’m just extrapolating linearly from the available data.

Alarmingly, if we follow the same trend back in time, then we discover that the only things constructed before the year 1840 were basements and cellars. How this can be squared with the discovery of, say, the Pyramids, I’m not clear – but we’re only using historical data from the past century, so we’re a bit stuck.

But we’re barely scratching the surface of what this new form of science can tell us. For instance: the improvements in infant mortality over the past few decades can only be seen as wonderfully encouraging, but they also produce perhaps the most startling predictions of the future. Since 1950, the UK’s infant mortality rate has gone from 29 deaths (per 1,000 live births) to 5. That means we’re saving about one more child, out of every thousand, every two-and-a-bit years.

This leads us inexorably to the conclusion that, by the year 2030, for every 1,000 children born in this country, 1,002 of them will survive.
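Again, here’s the arithmetic as a quick sketch, taking 2012 as the “now” for that 5-per-1,000 figure (an assumption on my part):

```python
# UK infant mortality: 29 deaths per 1,000 live births in 1950,
# down to 5 per 1,000 by (let's say) 2012.
rate_per_year = (5 - 29) / (2012 - 1950)   # ~ -0.39 deaths/1,000/year

def deaths_per_thousand(year):
    """Linearly extrapolated infant mortality rate."""
    return 5 + rate_per_year * (year - 2012)

projected = deaths_per_thousand(2030)      # ~ -2 deaths per 1,000
print(1000 - projected)                    # ~ 1002 survivors per 1,000
```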

I’m sure I don’t need to explain to you the catastrophic effect this will have on population scientists’ spreadsheets.

Forget whether North Carolina’s going to have any coastline left in a hundred years. Clearly the world has bigger problems on its hands.

Read Full Post »

