Monday, November 02, 2009

Trolley in a box

The ultra-brief summary of the premise of the new movie The Box reads as follows:
What if someone gave you a box containing a button that, if pushed, would bring you a million dollars… but simultaneously take the life of someone you don't know?
Presumably the movie just uses this as a starting point to launch into all kinds of action and twists and suspense and whatnot. I have no idea if it's going to be any good. I'm intrigued.

But as for the question itself, that's the kind of moral dilemma that has never seemed that hard to me. Let's assume the deal being offered is exactly what it appears to be (in real life one would of course be highly suspicious, but let's assume that suspicion has been allayed, as it usually is in things like the trolley problem, which is really what this boils down to). And let's assume the person who'll die is randomly selected, not maliciously chosen to have a disproportionately big impact, or to affect me personally, etc. (as it probably turns out to be in the movie).

My reasoning is this: how many people can I save with a million dollars? If I can find three people who would otherwise certainly have died, and ensure that they live by applying that million, I press the button. There, done.

Why three? Because I want a nice wide margin of error. What if I save a 50-year-old woman with heart disease, and the box kills a young woman with a long life ahead of her? Obviously we're playing the odds no matter what, and it could go either way. If it were one random life for one random life, it wouldn't be worth the risk. If it's two for one, it probably is, but I'd want to be sure, so three.
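If it helps to see that margin-of-error hunch as arithmetic, here's a rough sketch in Python with numbers I'm simply making up for illustration (the life-expectancy figures are guesses, not anything from actuarial tables):

# Hypothetical expected remaining life-years for a person picked at random.
random_victim_years = 40

# Hypothetical expected remaining life-years for each person I deliberately
# save; lower, since people in imminent danger of dying often have less
# time left anyway (like the 50-year-old with heart disease).
saved_person_years = 25

for lives_saved in (1, 2, 3):
    net_years = lives_saved * saved_person_years - random_victim_years
    print(f"{lives_saved} saved: {net_years:+d} expected life-years net")

With those invented numbers, one-for-one comes out negative in expectation, two-for-one is positive but thin, and three-for-one leaves the comfortable margin I'm after.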

Sure, maybe the person the box kills was going to be a nurse who saved a hundred lives, and the three people I save will turn out to be greedy bastards. But it's just as likely to be the other way around. In fact, given that I have some say in which three people I save, the odds are on my side here. (Unless the box is actually malicious rather than truly random.)

And with a million dollars I have no doubt I could save at least three lives, probably a lot more than three, and have some left over. Maybe the money would be better spent on more infrastructure-supporting things: better to buy something that helps a hospital save a hundred lives than to just definitely save five. But the indirectness of something like buying a new MRI machine, where you're never sure it actually made the difference, means I'd rather save three lives unequivocally first and then put what's left toward more indirect help. (Plus a little bit for me, to buy some new toys or something. I'm not wholly unselfish.)

That some random person is going to die because of me is not a problem for me. The fact is, we may remain blissfully ignorant of it, but every day there's a chance something we do will lead to someone's death. If you read that someone got hit by a bus and died this morning, maybe that bus was in that precise spot because of traffic you were part of, and if you'd driven a little slower or faster, it would have been a few yards ahead or behind. Every effect has a million causes at various levels of indirection, and while we imagine that (beyond intent) it's the degree of indirectness that dulls the ethical imperative, what usually dulls it is just our (correlated) degree of ignorance about our own effects. Life's just like that: every choice you make (or refuse to make) might lead to things you can't foresee, so what else can you go on but measuring the effects you do know about and can evaluate?

But while that would be no comfort to the person the box killed, neither would it be any comfort to the three people you didn't save with that million if you'd chosen the other way.

I can perfectly well understand why people would recoil from how cut-and-dried I'm being, treating lives like lines on a ledger. And I wouldn't think less of someone for not making the same choice, say because the guilt over that one death would haunt them forever. But this is still my conclusion: a net gain of two lives or more, plus a little cash for me to buy some toys. That's almost a no-brainer.
