Tuesday, October 31, 2006

Do not over-read this post

Cooking directions are full of these helpful statements. "Do not overbake", for instance. Well, duh. Is there anything you are supposed to overbake? If there were, that wouldn't be overbaking. Cooking shows are even worse; you can hardly get through an episode of any show, even the good ones, without being told that you shouldn't put too much, or not enough, of something into the recipe. That's what "too much" and "not enough" mean already; tell us how much to put in! Why don't they go the next step and say, "When preparing this recipe, be sure not to do anything wrong"?

Now, if you tell me how to tell how much is too much, that's another story. But even then, after you define "too much", you still don't need to add, "and too much is too much". It should go without saying.

So should I.

Sunday, October 29, 2006

Quantum physics and determinism

(If you're reading from the top down, skip down to the first of this three-part series and read them in chronological order instead. It'll work a lot better that way.)

Much has been made of the assertion, widely agreed upon by quantum physicists, that determinism as a scientific principle is dead. How's that?

Physics has been getting farther and farther from what makes intuitive sense for a century now. This is not a criticism -- the real world has no obligation to resemble, on the microscopic or macroscopic scales, the everyday world our brains are hardwired to work in and be comfortable with. Relativity gave us such mind-blowing concepts as time dilation and an absolute speed limit, flying in the face of intuition. Quantum mechanics makes relativity look positively tame.

If even Richard Feynman, one of the most gifted minds in history at relating such esoteric things as quantum mechanics to the lay mind, couldn't make things like the observer effect and the uncertainty principle seem comprehensible, I sure don't stand much of a chance. Any attempt to simplify invariably abbreviates so much that it leads people to jump to spurious conclusions based not on the actual science but on the gaps in the analogies. So I'm reluctant to even try to explain the relationship of quantum mechanics to determinism. But I can't really proceed without doing so; so read this knowing it's a poor summary at best.

The uncertainty principle says, in a sense, that information is itself an object in the physical world, one that can be conserved and that has effects. Nearly everyone's heard of the famous paradox of Schroedinger's Cat, in which information changes the world by collapsing a waveform, and many have heard of the problem of Wigner's Friend, which extends the question by asking what, precisely, constitutes an observer. The uncertainty principle, which could be likened to a conservation principle for information, is less well popularized, and just as infuriating.

It says, basically, that there are pairs of pieces of information which cannot simultaneously be known about a particle. You cannot, for instance, know both a particle's position and its momentum at the same time; the more precisely you know one, the less precisely you know the other.

This tends to make people think it's an engineering problem, the same way people liken the light barrier to the sound barrier and assume someone just has to come up with a better spaceship design or a more powerful engine to break it. But it's not an engineering problem. It's a fundamental limitation woven into the very fabric of the universe, and one which has been confirmed in a large number of mind-numbingly weird experiments. The universe simply does not allow you to know one of these pieces of information more precisely without making your knowledge of the other less precise, and the tradeoff is mathematically exact: the product of the two uncertainties can never fall below a fixed bound set by Planck's constant.
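
For those who want the real statement rather than my hand-waving, the usual textbook form of the relation (with Δx the uncertainty in position, Δp the uncertainty in momentum, and ħ the reduced Planck constant, h/2π) is:

    \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}

No cleverness of apparatus gets the product of those two spreads below that bound.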

What this means is that, at the atomic scale, particles are effectively not in one place. A particle isn't a glob of stuff somewhere. A particle is better understood as an effect on the world around it, and that effect has only a statistically distributed probability of being seen in various places. This distribution is called the "waveform" of a particle; it says, in a sense, that the particle has a 5% chance of being here (or rather, of its effect being felt here), a 15% chance of being there, and so on. Certain observations "collapse the waveform" -- force it to reduce itself to a single point with 100% probability -- but those observations in turn make the particle's momentum become a more broadly defined, and finally undefined, waveform. The narrower one is, the broader the other becomes.
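
If you'd rather see that tradeoff than take my word for it, here's a toy numerical sketch of my own (a plain Fourier-transform demonstration in made-up units, not a real quantum calculation): the narrower a Gaussian "waveform" is in position, the broader its momentum-space counterpart comes out, and the product of the two spreads stays pinned at a constant.

    import numpy as np

    x = np.linspace(-50, 50, 4096)
    dx = x[1] - x[0]

    def spreads(sigma):
        """Return (position spread, momentum spread) of a Gaussian packet."""
        psi = np.exp(-x**2 / (2 * sigma**2))      # packet of width sigma
        prob_x = np.abs(psi)**2 / np.sum(np.abs(psi)**2)

        phi = np.fft.fftshift(np.fft.fft(psi))    # momentum-space amplitude
        k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(x.size, d=dx))
        prob_k = np.abs(phi)**2 / np.sum(np.abs(phi)**2)

        sx = np.sqrt(np.sum(prob_x * x**2))       # packets are centered at zero
        sk = np.sqrt(np.sum(prob_k * k**2))
        return sx, sk

    for sigma in (0.5, 1.0, 2.0, 4.0):
        sx, sk = spreads(sigma)
        print(f"sigma={sigma}: dx={sx:.3f}  dk={sk:.3f}  product={sx*sk:.3f}")

Every row prints a product of about 0.5: squeeze the position spread, and the momentum spread balloons to compensate.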

That means that on an atomic scale, determinism is not strictly observed. You cannot know the positions and momenta of the particles, so you cannot predict their future states. The best you can do is provide a statistical estimate of the likelihood of the particles being in various places doing various things. Einstein famously refused to accept this and essentially wasted the second half of his career trying to disprove it. His efforts were very fruitful in leading other scientists to important discoveries, and he himself was a key part of the development of some important principles of quantum mechanics (much to his ire), but one can't help wondering what he might have accomplished if he hadn't railed so hard against it. (The famous quote attributed to him, "God does not play dice with the universe", is a paraphrase; the actual quote is, "I, at any rate, am convinced that He does not throw dice", to which Niels Bohr replied, "Einstein, don't tell God what to do".)

On an everyday scale (the scale of grains of sand, ball bearings, and boulders) this lack of determinism almost disappears. Technically speaking, there is still a statistical variability in the position of a grain of sand, or even a planet, because of the uncertainty principle. However, as the randomness in each of the billions of subatomic particles involved tends to balance with the others, the effective precision of our knowledge of the position of an entire grain of sand is so high that the uncertainty can be all but ignored. But it is still there. Technically speaking, even if you knew the precise position, momentum, energy state, etc. of every star in the universe, you could not predict their future states with perfect precision, only with a statistically enormous probability of correctness.
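
To put rough numbers on "all but ignored" (my own back-of-the-envelope figures, nothing rigorous), compare the minimum velocity uncertainty the principle forces on an electron versus a grain of sand when each has its position pinned down to a nanometer:

    # From dx * (m * dv) >= hbar / 2, the smallest dv the universe allows.
    HBAR = 1.0545718e-34   # reduced Planck constant, in J*s
    dx = 1e-9              # position known to one nanometer, in meters

    for name, mass_kg in [("electron", 9.109e-31),
                          ("sand grain (~1 mg)", 1e-6)]:
        dv = HBAR / (2 * mass_kg * dx)
        print(f"{name}: velocity uncertainty >= {dv:.2e} m/s")

That comes out to tens of kilometers per second of unavoidable fuzz for the electron, and around 5e-20 m/s for the sand grain: some twenty orders of magnitude below anything measurable.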

So, technically, determinism is dead. Most laypersons who hear about this feel confident that scientists will eventually discover some "hidden variables" that will allow it to be revived. The more one studies quantum mechanics, though, the less likely one is to still consider that possible, so that confidence is probably just one of those spurious conclusions I discussed earlier. But revival does remain possible, as the history of science is full of surprises.

But does the death of determinism undo any of the changes to the human condition that determinism brought about? I say no, and not for the usual reason (that quantum effects are vanishingly small on the macroscopic level), but for a far more fundamental one. It's true that we can't say with precision where a particle will be, but it's also true that, statistically, the positions of particles obey well-defined laws in aggregate. The key point about determinism, the one that made it change everything, is that the world behaves as it does because of its own rules, not the whims of unpredictable spirits and gods, and thus an understanding of those rules allows mankind to foresee and change its future. The position of a particle may not be measurable, knowable, and predictable to arbitrary accuracy, but its behavior in aggregate is still governed by knowable and usable rules. Those rules just include a statistical element, and statistics itself is a science.
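
A trivial sketch of that last point, with coin flips standing in for particles (my own illustration, not anything from the physics): no individual flip can be predicted, yet the aggregate obeys a law you can take to the bank.

    import random

    random.seed(2006)   # any seed will do; the convergence is the point
    for n in (10, 1_000, 100_000, 1_000_000):
        heads = sum(random.random() < 0.5 for _ in range(n))
        print(f"{n:>9} flips: fraction heads = {heads / n:.4f}")

The fractions march steadily toward 0.5 as n grows. Statistics is exactly the science of rules like that.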

One can still know the world. One can still shape the world. There is no going back to the time of spirits. Free will cannot be so readily surrendered.

Thursday, October 26, 2006

I will choose free will

Much has been made of the philosophical implications of determinism. Generally it is seen as opposed to free will, but this, to my mind, is a false dichotomy.

Consider this situation. A man has been arrested after being caught brutally torturing and murdering people. Further analysis reveals that a serious chemical imbalance in his brain has caused him to be delusional and violent. Maybe it's even possible to treat the chemical imbalance, and thus reform him. Another man is arrested for similar crimes, but no physical cause is detected, and thus, no cure is offered other than long social retraining with a low chance of success.

Most people would be inclined to forgive the first man by saying that his actions are "not his fault". The moving about of chunks of blame is very important to people in a situation like this. But the second man, most would say, should not be forgiven, since his actions were his own choices, and people should take responsibility for their own choices.

Ultimately, though, anyone who is moderately well-educated about things like psychology, the body-mind connection, and the history of our understanding of neurochemistry, has to realize that odds are very good that the second person also has something physical going on which we simply don't know how to diagnose and treat yet. A hundred years ago, these two men would have seemed identical as villains; and a hundred years from now they may again seem identical as victims. But today, they seem as different as night and day.

The principle of determinism takes this conundrum much farther. The idea is that a full knowledge and understanding of the state of all matter and energy in the universe would, given sufficient time to compute, allow the prediction of all future events, since those events are determined by physical interactions of matter and energy, not by whimsical transcendent spirits. (The fact that it would take a few thousand universes to store the information describing our one universe is immaterial. The point is that in principle the future could be predicted, and therefore is determined. The only randomness left is that we don't know what will happen; what will happen is still determined by what is.)

So if everything you're going to do depends on physical characteristics, ranging from the state of a wave of light passing through space near you, to the chemistry you got from your parents, to the intricate and unseeable clockwork of electrical forces in your brain, then does that mean you are not responsible for any of your actions? Everything is determined ahead of time. We are only playing out a script written in the first moments of the universe. So who cares if you take that last donut? You were always going to.

But this doesn't make sense to me. If you did something bad just now, it doesn't matter that you did it because of something that happened earlier. Causes are not exclusive. Every cause in turn has a cause, but that does not take away the significance of the proximate cause. (This fallacy, the assumption that one cause obviates another, is one of the bigger ones in spurious logic.)

If you accept that it makes sense to judge people and their actions at all, then it seems clear that you can do so equally well in the case of both murderers. Both murderers did the things they did, because they were, at the time, the kind of people who did those sorts of things. There are reasons why they were those kinds of people, but that doesn't change that they were.

So what's really different? In one case, we can change what kind of person he is. That change does not change the past; it does not change what he did, or the fact that he was the person who could do those things. It doesn't absolve him of responsibility for his past. But it does change who he is now; and therefore, it changes how we feel about him. We can certainly deplore who he used to be and accept who he is now, if we're emotionally strong enough to accept the change (and if the change really is as good as I'm presuming). In the other case, we can't make that change happen.

But in all these cases we can still judge someone based on who they are and what they do at any point in time, and the particular causes for why they are and do those things don't really enter into the judgment. It's no better or worse to be a killer for one reason than another, as long as in both cases, you're a killer.

Furthermore, and perhaps ironically, while nowadays people are inclined to imagine determinism and free will are opposed, a more historical viewpoint reveals that the realization of determinism was actually a liberating concept. Before determinism took hold, people tended to imagine that the world around them, from the small to the large, was controlled in large part by the whims of gods and spirits who could not be understood, and only sometimes even appeased. In a real sense, one's fate was out of one's hands. Determinism tells me that I can come to understand the actual causes of events, measure them, and manipulate them, without them depending on a whim completely outside my sphere of influence or even understanding. Determinism means my actions have a large part in determining my future.

Nowadays people look at determinism and can only look backwards and fret about the implication that determinism causes your actions, but it's just as important to look forward and see how determinism allows your actions to cause your future, too. It's a sword that cuts both ways. If you are a brilliant mathematician, do you say it means nothing because your father had a particular allele, or because a teacher you had in fourth grade motivated you? No, you can still be proud to be the person who is the accumulation of all those causes, who is thereby a brilliant mathematician. Determinism does not take away anything from your pride, your responsibility, your celebrations, your remorse, your accomplishments, your mistakes. They're still yours. In fact, determinism is what makes them yours.

Wednesday, October 25, 2006

Randomness and determinism

I think this is the first part of a three-part set of posts, but we'll see if I feel like writing the other parts.

If I pick up a die and roll it, the result is random. What does that mean? Essentially, it means unpredictable. But what if I allowed you to measure every single physical constant involved in the die throw beforehand? Every attribute of the die, air, and table, down to the positions and movements of atoms, and their energy levels and perturbations. Every detail of the velocity and trajectory of my throw. Every iota of the energy levels in the air, from heat, light, etc. All to an arbitrarily high degree of precision. Couldn't you then predict the die roll?

Most educated people today essentially take for granted that the answer is "yes", but it wasn't always so. That realization, the core of determinism, is the defining concept behind the birth of science, and it changed the world more than almost any other idea. That things that happened, happened because of physical causes that could be known, understood, even manipulated, rather than at the whim of unknowable spirits, gods, and ghosts. That the only limit on how much you could know and affect the world around you was in you (how much time, intelligence, and resources you could bring to bear in a lifetime), not in the universe itself. Before this, science was a tentative poking to see if we could maybe figure out a few things; after it, science could encompass the whole of creation.
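
To see in miniature how "determined" and "predictable in practice" relate, here's a toy of my own devising (a chaotic map standing in for the physics of the die, not a real simulation):

    def throw_die(initial_state, steps=100):
        """A purely illustrative 'die': deterministic, yet wildly
        sensitive to its initial condition (the logistic map at r=4)."""
        x = initial_state
        for _ in range(steps):
            x = 4.0 * x * (1.0 - x)
        return int(x * 6) + 1            # map the final state to a face, 1..6

    print(throw_die(0.123456789))           # same input, same face, every run
    print(throw_die(0.123456789))           # determinism: identical again
    print(throw_die(0.123456789 + 1e-12))   # a speck of error: likely another face

Measure the throw perfectly and the result is fixed and knowable; measure it almost perfectly and you know nothing. The limit really is in the measuring, not in the universe.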

Sunday, October 22, 2006

Power outage

An unseasonable, nasty storm hit us Friday night and made a lot of ruckus. First, we started losing the Internet to rain fade Friday afternoon. About dusk, as the rain and wind turned to snow, a tree fell down in the front yard, which means I need to get out there with the chainsaw and cut it up. It didn't hit the house, though. By then, wet snow clinging to the satellite dish had taken the Internet connection out to stay.

About 3:30am the power went out. After a while the beeping of the UPSes woke me up, but I didn't bother to get up and do anything about it. The power was still out when I got up around 7:30.

With some things I needed to do on the Internet later in the day, the house cold, and the toilets unusable without power to the water pump, we decided to spend the day at my office, which has high-speed Internet. It was enjoyably novel to wear slippers around my office.

Back home, we had power and the Internet again, and all's back to normal. But that tree's still waiting for me to cut it.

Friday, October 20, 2006

Another nail in rhetoric's coffin

The disappointing discussion resulting from my posting about my criteria for recognizing good roleplaying on the RPGnet Open Forum got me thinking again about the sad state of rhetoric. It's not that most people make bad arguments. Most people don't even know what an argument is; they think that if you say something true after someone else's statement, you've refuted it. Sometimes even when what they say isn't true at all.

One "argument" I saw a lot was this: "Your experiences aren't universal." I didn't decide to engage in a point-by-point rebuttal for the same reason you don't answer "Have you stopped beating your wife?" with either "yes" or "no", but I couldn't help wonder how one could go about it. After all, if "having universal experiences" was actually a prerequisite to make observations, generalize from them, or post criteria, it'd be an awfully thin field. No one has universal experiences, but thanks to the fact that we can generalize and perform induction, and thanks to the fact that rules like this don't have to be 100.0000% accurate to be useful, that's really not relevant.

But there have been times when a statement like that was relevant, indirectly, to something, and that's probably what confuses the people who said it. Consider if someone had posted something like this: "Your criteria have a flaw in them: you haven't accounted for ABC or DEF, perhaps because your experiences aren't broad enough to have encountered them before." Now that would be a rebuttal. But saying "your experiences aren't universal" is only a very minor part of that, and more importantly, a secondary part. The bit that says "you have this flaw" is the responsive part; the other stuff is connected, not to the original post, but to that bit, and thus relevant only indirectly. They speak about the flaw, not the criteria.

People who don't have the foggiest notion how to reason and form an argument, just mimic the sounds of other things they saw that were called argumentation, without having any idea what they're actually doing. It's like trying to fix a car by adjusting valves that are the same color as the ones someone else once adjusted to fix a dishwasher.

Monday, October 16, 2006

Superficial profundity

I was thinking about writing a blog post about how so many things considered profound in popular culture are really not profound at all -- they're merely obfuscated. It's as if people who are used to assuming that something they cannot understand must be profound simply write things which cannot be understood, and assume they must be profound. But then I remembered that Nietzsche already said it so much better. Most quotably thus:

"Mystical explanations are considered deep. The truth is that they are not even superficial."

I am writing this from work, or else I'd grab my copy of The Gay Science and quote a few more passages on the subject, at a little more length and with more clarity. But this quote really does say it. They're not even superficial, because they're not even explanations.

A truly profound statement can withstand scrutiny and examination -- in fact, it demands such; it blossoms from it. A superficial, trite nothing-in-fancy-dress, like you are likely to hear uttered by characters in the "very special episodes" of mainstream TV shows, or see on bumper stickers, will not withstand scrutiny; it will turn out to mean nothing, or nothing worth saying. Scrutiny and analysis are not a means to puncture the truth and let the air out; they are a means to separate the truth from the shallow obfuscation that masquerades as insight.

Thursday, October 12, 2006

Brynna status update

Today our cat Brynna is in for dental surgery to extract a rotten tooth. She turns 16 next month, so she's pretty old for this, but it's still not a high-risk operation.

We also discussed a thyroidectomy, but that's higher-risk and very costly, and it brings the likelihood of worsening her kidney dysfunction. Its primary purpose would be to avoid having to give her pills; it's hard to medicate a cat several times a day, because you can't catch them regularly enough, particularly one like Brynna, who doesn't eat reliably even if you sneak something into her food.

Instead, however, we're going to get her a cream that we apply twice daily inside her ear (should be a lot easier than a pill!) to deliver the thyroid suppression transdermally. This will provide a steadier flow of the drug for better regulation, should last longer so there are fewer periods of high blood pressure, and avoids the stomach upset that the orally administered pills cause.

Wednesday, October 11, 2006

Good roleplaying looks like this

To some extent it's subjective what constitutes good roleplaying, particularly as you get into the distinction between really good and supremely good. However, I think when it comes to the distinction between good roleplaying and not-so-good (or even bad), that can be defined clearly enough.

  1. Your character's behavior always corresponds to what that kind of person would do in that kind of situation in that kind of world. Always.

  2. You can play a character who is not merely a simple variant on yourself, or a simple wish-fulfillment version of yourself, without violating rule #1.

  3. What you do doesn't just conform to the world, setting, theme, and tone, it also helps build it up and establish it.

  4. Your actions are always done in a considerate way that enhances, or at least does not detract from, the enjoyment of the other participants, always. You manage to do that without conflicting with rule #1.

I think that pretty much covers it. If you do all of these things, all the time, you are probably a good roleplayer.

Amusingly, in my 25 years of roleplaying, every time I've met someone who believed themselves to be the best roleplayer they knew, one who would "show the rest of you how it's done", they have invariably been a bad roleplayer who regularly violated at least one of these rules. The best roleplayers have always been people who considered themselves about equal with most of their peers at roleplaying.

Tuesday, October 10, 2006

What's so bad about small dogs?

Dogs of all sizes have their charms, don't get me wrong. But it seems there's some strange cultural stigma about small ones which annoys me.

Large dogs are stereotyped as being drooly and dumb but always good-natured and friendly. Small dogs are stereotyped as being yappy and uptight and annoying, and are called "little rat dogs". Neither stereotype is particularly accurate.

Any purebred is likely to be uptight, regardless of size, compared to mutts, which tend to be laid back and friendly. There are of course exceptions, but the rule of thumb holds. Size, however, doesn't enter into it. One of the most uptight breeds of dog is the greyhound, because of how it's been bred for racing, and greyhounds are by no means small. Other large breeds known for being at least as uptight as any poodle include pit bulls and collies. Of course individual dogs of any breed, even purebred, might be laid back... but that applies just as well to poodles or cocker spaniels as to larger breeds. Still, small dogs get all the bad rap without being any more likely to have the problem.

And small mutts are just as drooly and just as friendly and good-natured as big dogs, except they can do it in your lap (without crushing you and your sofa). Plus, they're a lot less likely to knock someone over, and hurt them, in their enthusiasm to be friendly. But these traits are overlooked. Instead, little dogs are summarily dismissed as "little rat dogs", and so unfairly.

People who appreciate smaller dogs are rarely prejudiced against big dogs, but the reverse isn't true. It's not fair. More power to cuddly, friendly little pooches everywhere!

Thursday, October 05, 2006

The Four Foods Theory

Consider the veracity of the following assertion:

Any good food goes with at least one of the following: chocolate, cheese, garlic, onions.

Note that the converse is not asserted; nothing here claims that anything which goes with one of these four is good food, so beware of affirming the consequent.

Logically, one can conclude that if this is true for a set of ingredients S, it will also be true for any set S' where S is a subset of S'. For instance, if it is true for "chocolate, cheese, garlic, onions" then it's also true for "chocolate, cheese, garlic, onions, spaghetti sauce". So the challenge is not to find a set which makes a true assertion; it's to find the smallest possible set that does so.
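
For the terminally literal-minded, the superset point can even be checked mechanically. A tongue-in-cheek sketch (the pairing data here is invented purely for illustration):

    PAIRS_WITH = {                    # hypothetical "goes with" data
        "strawberries": {"chocolate"},
        "crackers":     {"cheese"},
        "bread":        {"cheese", "garlic"},
        "steak":        {"onions", "garlic"},
    }

    def theory_holds(good_foods, base_set):
        """True if every good food pairs with at least one base ingredient."""
        return all(PAIRS_WITH[food] & base_set for food in good_foods)

    S = {"chocolate", "cheese", "garlic", "onions"}
    print(theory_holds(PAIRS_WITH, S))                        # True
    print(theory_holds(PAIRS_WITH, S | {"spaghetti sauce"}))  # still True

Adding spaghetti sauce can never turn a True into a False, which is why the interesting challenge is minimality, not truth.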

Clearly we can't eliminate chocolate or cheese, since for each it's trivial to find a good food that goes with it and with none of the others. However, garlic and onions have a lot of overlap. I think at some point I had come up with one good food that goes with onions but not garlic, and one that goes with garlic but not onions, which requires the set to contain both; however, I can't think of either right now.

The trickier question is whether this set really is broad enough to work. Some people might claim that lobster proves butter must be added; however, lobster is not good food. (And even if it were, garlic goes with it.) Corn on the cob might be a better argument for adding butter to the list, though.

Please don't make me say I'm just kidding, please...

Wednesday, October 04, 2006

Boogers!

I haven't had a cold serious enough to stay home for in about two years, if I remember right. This one came on very abruptly. Friday afternoon I had a slight cough as we drove up to Burlington for our anniversary dinner, and my back was achier than I expected. My throat stayed dry and scratchy all evening, and by the time I got home I was blowing my nose and coughing. By Saturday morning it was a full-on cold, and it stayed bad enough to keep me home Monday and Tuesday.

So far it's all been that sniffly, occasionally sneezy, too-wet stage, which is not really that bad; I'm just kind of tired, with the feeling that my head is bigger on the inside than on the outside. But I'm just starting to come into the dry-cough stage, which I really, really hate. I'm not as tired, so I can go to work and get things done, and that's good -- I'm not good at sitting around doing nothing. But that cough gets so tiring and painful so quickly, and then it starts robbing me of my sleep, and then it just stretches on and on and on. I would take two more weeks of sniffling to avoid it, no question.