This person is always smiling and saying superficially pleasant things, but there's a profound undercurrent of malice and spite just millimeters below the surface. Every single thing she says can be taken by a bystander as innocent, or at most, just a joke. She always retains plausible deniability against any complaint about her attitude or behavior, because each individual element of her constant, comprehensive, exhaustive campaign of malice is, taken by itself, so trivial and petty as to make the complainer look like a fool making mountains out of molehills. Yet collectively, if someone sticks around long enough to see it and makes even the slightest effort to pay attention, they accumulate into a vicious stream of self-centered belittlement and ridicule that speaks of a profound darkness of the soul.
Every word is sweet and soft, a perfumed silken caress, with a slender pointed dagger behind it. She cuts so cleanly you almost can't feel it at first. By the time you realize you're bleeding, she's flouncing away, and no one else would believe it, either.
She has a bevy of other ways to deflect criticism. One of her favorites is to put on airs of indignation at being judged. This one is especially popular when the objection is not to her, or to her actions, but merely to their appropriateness in a particular situation: you're really just trying to oppress her. Any criticism of anything she does or says is never actually what it claims to be; it's simply a symptom of your personal, and irrational, dislike of her. She never actually answers any comment, criticism, or accusation. Never. She may respond, she will certainly speak afterwards, but she will never actually answer.
All of this should be of little import, because it seems like such an obvious ploy, such an obvious strategy. And such an easily recognizable pattern to anyone who looks for it for the tiniest amount of time. It oozes, after all, from everything she says. She thinks she's very very clever to have invented it; she's positively smug, smirking all the time. You want to laugh at her for being so proud of inventing something so obvious and so recognizable.
But... somehow, almost no one recognizes it. No one realizes it. They fall prey to it, over and over. So much so that anyone who does recognize it and tries to say something gets not only ignored but marginalized: in the eyes of everyone who falls for it, the complainer becomes the troublemaker, making a fuss over petty matters.
I wish there were a nice simple name you could use for this kind of person, something along the lines of "passive-aggressive" or "anal-retentive", something that sums up the whole pattern of behavior and thus makes it more readily recognizable. If there is one, I don't know what it is. Most of all, though, I wish more people would learn to recognize this pattern. It's depressing that such a simple and obvious tactic works so darned well.
Friday, September 29, 2006
Thursday, September 28, 2006
A public service announcement
Apparently, there's some confusion on this point, so, as a public service, I'm here to settle the matter.
The toilet paper goes on so it goes over the top.
You may now go back about your lives knowing that this question need never confuse anyone again. I'm glad we got that settled.
Wednesday, September 27, 2006
OOC reference or just a word?
I was asked about the appropriateness of using the word "hell" in a literary work within a world like Lusternia which doesn't have a place called Hell. It's an interesting question and hard to answer briefly.
It's clear that a reference to, say, Microsoft, or Beowulf, or biochemistry, does not belong in a work written and set in Lusternia (the same could be said for most fantasy worlds). In the same vein, a reference to Hell, the uppercase one, is inappropriate. (Of course, you could have a fantasy world that has a Hell in its mythology or as an actual place people visit, but Lusternia doesn't happen to have one; the nearest thing is the plane of Nil.)
However, the word "hell" has become genericized by use, and now has senses which do not allude to that particular bit of our culture. The only connection between the word hell in the phrase "I had a hell of a time" and the Christian concept of Hell is a very indirect connection, one rooted in etymology. (Most idioms started as metaphors or similes, but eventually through repeated use came to mean directly what they had formerly meant only by indirection through that metaphor or simile. Thus, the original metaphor is nothing more than an etymology.)
But in the case of "hell" the etymology is fairly evident. Perhaps it's evident enough to jolt the reader out of any immersion in the fantasy world that she had been able to achieve, and if so, that's enough of a reason to avoid it right there. The question is whether it has been genericized enough that one can read it and get the generic meaning in mind without even a flicker of the etymological original meaning.
Personally, I'd avoid it because while most people would not get the "wrong" meaning and be jolted out, a few people might. However, if it were really the best word for the spot and nothing else would do, I might go ahead and use it anyway.
One might say, as a matter of principle, never to use a word like that. But ultimately, almost every word turns out to be a "word like that" once you dig into its etymology far enough. The difference is that the original meaning is obscure enough that you either don't know it, or only think of it if you make a conscious effort to do so.
And when you get right down to it, every word of English has unshakeable origins in one or another language that never existed in Lusternia; we assume that Lusternians are actually speaking their own language, and it's being translated for us. (Because it's an impossibly ridiculous coincidence to expect that they have a language that actually happens to have the same words in it as English.) So ultimately, even "Hell" might be assumed to be a translation of something that is appropriate in Lusternia.
Monday, September 25, 2006
Aggressive driving
I first encountered the blue Oldsmobile Alero about a quarter of the way along my drive in to work, in Barre. It was right in front of me going across Main Street onto Route 62 heading up the hill to the Interstate. The driver, a blonde woman I couldn't make out, seemed to be in a big hurry, soon weaving in and out of lanes and racing up to traffic lights.
Traffic was heavy in both lanes, which were moving at about the same pace, but since there were more cars on the left, she darted into the right lane and raced ahead (and I don't mean a mild racing ahead, I mean a Smokey and the Bandit racing ahead) the half-dozen car-lengths until she was stopped again, then put on her left turn signal. No one let her in, and she stayed that way for five minutes, until she ended up two cars behind me before she could finally get over.
Farther up the hill she passed me again by weaving between lanes, got caught by the light, and this kept repeating the whole trip.
When I finally turned onto the small road on which my office is located, she was still there... right back to where she started, right in front of me. Turns out she's one of my coworkers. I am so tempted to go ask her if all that reckless driving was worth it, to end up gaining precisely zero time.
Thursday, September 21, 2006
Making a gaming keyboard on the cheap
I don't spend a lot of time on computer games -- hardly any, in fact, compared to most people. Well, let me be more clear: I spend tons of time on games that involve computers, such as pencil-and-paper RPGs that happen to involve computers as game aids, or more prevalently, MUDs. And tons more time using computers in fun ways that aren't games, like chat. But actual computer games, I don't do a lot. And if you take away the idle playing of solitaire games like Sudoku on my Palm, even less.
Some of the games I enjoy on those rare occasions when I play are things like FreeSpace2 and flight simulators, which, unfortunately, require a fair amount of learning to play well. You can't readily just jump in and start playing; you have to learn how the controls work, how to operate your craft, and a lot of technical stuff. Perhaps the most annoying part is learning the scores and scores of keyboard combinations necessary to pilot a ship effectively in FreeSpace2, which you really have to know from memory. Having a joystick with a zillion buttons on it helps a little, but with too many buttons there, it gets just as bad.
Hardcore gamers are all drooling over the Optimus keyboard, which, if it ever actually becomes available, will be a fantastic help both for the casual gamer like me and for the hardcore gamer, though it may be a rather expensive solution for the casual gamer. In any case, since it's been "coming soon" for at least a year, it's just a pipe dream.
But thinking on this made me come up with an idea. A simple, basic USB keyboard can be had for a few bucks, if you wait for a sale at somewhere like Woot. Most techies probably have a few lying around somewhere not being used. So why not print labels, stick them on the keys, and go ahead with that? If you only play a few games, like I do, you only need a few keyboards. A FS2 keyboard will let me jump into that game when I have some time to kill without feeling like I have to start from scratch. Cool beans.
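If you want to get fancy about producing the labels themselves, here's a minimal sketch in Python of one way you might do it. The key bindings shown are hypothetical stand-ins, not FreeSpace2's actual key map, and you'd tune the widths to whatever label stock and font you're using.

# Sketch: turn a game's key bindings into a plain-text "label sheet"
# you can print in a monospace font, cut apart, and stick on a spare keyboard.
# The bindings below are made-up examples, not a real game's key map.

bindings = {
    "Q": "Target next",
    "E": "Target nearest hostile",
    "R": "Match speed",
    "F9": "Call wingman",
}

LABEL_WIDTH = 18  # characters per label; adjust to your key caps and font size

def label_sheet(keys):
    """Render one fixed-width label per key, one per line."""
    lines = []
    for key, action in sorted(keys.items()):
        lines.append(f"[{key:>3}] {action[:LABEL_WIDTH]:<{LABEL_WIDTH}}")
    return "\n".join(lines)

if __name__ == "__main__":
    print(label_sheet(bindings))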
Wednesday, September 20, 2006
Specious reasoning behind some truth
My brain likes finding connections, and I've noticed another connection between a lot of my blog posts, and in particular the least successful ones. It goes like this:
Specious reasoning can, and often is, used to support something that has some truth in it. This does not significantly diminish the danger inherent in the specious reasoning, nor the imperative of pointing it out.
Often I find myself pointing out how some particular argument or line of reasoning is faulty, in spite of the fact that the conclusion that argument reaches may well have some truth in it. It's tricky. People want to assume I'm arguing against the conclusion, not the argument itself, and jump to defend it.
Using bogus reasoning to support a truth is almost as bad as using bogus reasoning to support a lie. The ability to recognize bogus reasoning is key. It's more important than the truth of any one particular thing. Give a man a truth and he'll be right for a day; teach a man to think and he'll be right for a lifetime.
Besides which, the conclusions under consideration are not usually "right" or "wrong". They have some truth in them, but are also overstated, oversimplified, understated, or deceptive.
Here's an example. I recently read a forum post in which someone tells about a man he saw in a supermarket talking on a cell phone. The man said, "I need to concentrate on shopping, I'll call you back from the car." The person posting, naturally, was rolling his eyes at this.
Fair enough: people abuse cell phones, both in cars and out of them. I'm not disagreeing with that. However, this situation is not that unreasonable. Consider the same situation only instead of a cell phone you have the man's son there, asking him a string of questions about why the sky is blue. He says to his son, "I need to concentrate on shopping, I'll answer you when we're in the car." Would you roll your eyes at him for unsafe driving practices? Would you groan at the idea that he can spare more concentration to talk to his son while driving than he could while shopping? Probably not. Talking on a cell phone isn't identical to talking to a person who's there, sure, but are the differences relevant?
The fact is, cell phone use in cars has been demonized. Yes, using a cell phone in the car reduces safety for you and the people around you. It also leads to more rude driving. But someone using a hands-on phone during rush hour in a busy city habitually, and someone using a hands-free phone to make an occasional call while driving on simple, uncrowded, familiar roads, are completely different cases. And changing the cassette in your car is also distracting, but no one has proposed (so far as I know) banning cassette players. A knee-jerk reaction to something typically produces a conclusion that's just as wrong as the one you left, but simply in a more publicly acceptable, and perhaps safer, direction.
But I can't say that without someone saying, as if it were relevant, "Hey! People abusing cell phones are rude and a menace, so shut up!"
Tuesday, September 19, 2006
Compound words
And now it's time for another in our ever-unpopular series of Grammar Rants.
If a word is formed by joining two other words, that does not mean that every time those two words appear in sequence, they should be replaced by the one word. This is obvious in a few cases: "I was so frustrated at my attempt to bake cookies, I started throwing things. You should have seen that butterfly into the window." But when the difference in meaning is subtler, though still real, the two too often get mixed up.
Don't walk into that room. Above all, don't give in to those demands.
My computer's last backup was out of date, so I decided to back up all the data on it.
There are some times that don't work for meetings here, because the boss sometimes has to go to headquarters on short notice on Mondays.
Here's the handout for this week's class. Please hand out copies to everyone in your group.
Your login ID and default password have been placed into the envelope; please log in to the system and change your password as soon as possible.
Drat, I had a bunch of really good ones I saw in news articles recently, but I didn't note where they were and I can't find them now. I'll edit them into this post as I find them again.
Incidentally, am I just getting grumpier, or is proofreading in the newspapers really plummeting? I've seen some overt grammar and spelling errors in headlines lately that are so awful that even Microsoft Word's spell checker and grammar checker could have caught them. And that's in the headlines. Today I saw one use the word "cache" when it meant "cachet"; they're not even pronounced the same. And it's not just in my local papers, it's in Reuters News feeds just as often. What's going on?
Saturday, September 16, 2006
Which privacy is technology taking away?
Today I read the following quote in a Reuters news article called "Amid privacy backlash, Web publishers turn inward":
"If I am a corporate lawyer by day and a Level 10 Elf by night, I am not sure I want everyone to know my different identities," says David Holtzman, author of a forthcoming book "Privacy Lost: How Technology is Endangering your Privacy."I've seen this argument made a lot of times, that technology is taking away your privacy. And there are ways that it's true, to be sure. However, almost every time I see the argument, most of the focus is on things (like misuse of "cookies", or the linking of profile information between Yahoo and eBay, etc.) which are a completely wrongheaded approach to the problem.
So by day you're a corporate lawyer, and at night you go online and enjoy complete anonymity and privacy as you are a level 10 elf. Or, more interestingly, a participant in online chat rooms about kinky sex, or a trader in illegal copies of MP3s. Or something else you want to keep separate from your daytime identity. And now some new feature of some system, like the Facebook feature to track your friend's activities, threatens to undermine that anonymity.
Ummm... who gave you that anonymity in the first place?
I'm not saying we shouldn't be sensitive to its erosion, but casting technology as the villain, blaming the Internet, is so foolish and wrong-headed that it amazes me that this argument is made so regularly. Before the Internet gave you the opportunity, precisely how were you able to indulge your interests in orc-killing, kinky sex, or copyright infringement without the risk of being discovered, anyway?
All in all, technology, and the Internet, have made available to their users the biggest increases in privacy in all of history, hands down. It's true that the pendulum is swinging back slightly. And when I think of all the bad things people have done with that excess of anonymity, from the mild irritant of flamewars to the crimes of cybervandalism to the practice of pedophilia on the Internet, I don't know if it's a bad thing to have the pendulum swing back just a little bit. But don't castigate the Internet as the great anonymity-thief when it's only taking back a tiny bit of the immense, unprecedented bounty of privacy it gave you in the first place.
Wednesday, September 13, 2006
Stable guys in chick flicks
Many chick flicks have a guy in them who is, at the start of the film, the current romantic interest (often the fiancé) of the heroine. He's stable and boring and reliable. He is rarely romantic; perhaps his idea of a grand romantic gesture is cleaning the fridge so she doesn't have to. He pays the bills, mows the lawn, keeps the cupboards stocked, and makes sure she gets to work on time. He wears his glasses to bed. He'd never cheat on her, mostly because it just wouldn't occur to him to do anything so outré.
Then there's the flamboyant hero. He wafts in, called by whatever zephyr grabbed him most recently. He's entirely unreliable, governed by whims and passions of the moment. He probably has no idea how to mow the lawn, but even if he does, odds are he'd get distracted and run off to buy a bouquet of flowers and then drop it from a hang-glider instead.
Invariably, the film's end happens right in the narrow gap in between when she dumps the poor stable loser, and when Mr. Flamboyant's lack of focus starts to become annoying, and well before another zephyr spirits him off to some other movie.
What's amusing to me about this, though, is the fact that conventional wisdom says that it's guys who are prone to instability. Avoiding commitment, clinging to the "wild life", out with the boys. Buying unnecessary sports cars. And that it's women who don't like this, and want a guy who'll always be there, who's responsible and mature and settled down, ready to commit. Who's stable. Just like the poor shlub that gets the boot for no good reason in almost every chick flick.
Though I'll give them credit for one thing: you are, at least, allowed to feel sorry for him. He's rarely demonized; he's always "a really nice guy".
Tuesday, September 12, 2006
Marco Polo and spaghetti
The back-and-forth of yesterday's blog post reminded me of another thing I wanted to write about: noodles.
Everyone "knows" that noodles were invented in Italy, since that's where pasta comes from.
But then, everyone hears that, no, they weren't. Like so many things, they were actually invented in the Far East first. Marco Polo, on his historic trip to the Orient, brought them back from China. An amazing fact, and that's why everyone hears it.
Too bad it's not true. Oh, Marco might have brought some noodles back with him. China sure had noodles by the 14th century. However, Marco also brought noodles with him (dried, in baskets) when he left. The fact is, they'd been making noodles in Italy for more than a thousand years by that point. Depending on your definition, perhaps the first Mediterranean noodles were the flat bread used in a layered dish with meat and cheese, called "lagane" (and surprisingly similar to today's lasagne, sans tomatoes of course), made by Etruscans in the first century AD.
The fact is, depending on how broadly you define "noodle", they were invented in many places independently. The Chinese get full credit for inventing them, but so do the Italians.
(Incidentally, the account of Marco Polo's voyages is a fascinating subject in itself. It includes a few things that one would have to go to China to know about, that were not known in the west. However, large parts of it were also lifted verbatim from a fictional account written long before Marco's travels. And there are other reasons to doubt his story. I guess that's another reason to invent time travel or at least FTL.)
Everyone "knows" that noodles were invented in Italy, since that's where pasta comes from.
But then, everyone hears that, no, they weren't. Like so many things, they were actually invented in the Far East first. Marco Polo, on his historic trip to the Orient, brought them back from China. An amazing fact, and that's why everyone hears it.
Too bad it's not true. Oh, Marco might have brought some noodles back with him. China sure had noodles by the 14th century. However, Marco also brought noodles with him (dried, in baskets) when he left. The fact is, they'd been making noodles in Italy for more than a thousand years by that point. Depending on your definition, perhaps the first Mediterranean noodles were the flat bread used in a layered dish with meat and cheese, called "lagane" (and surprisingly similar to today's lasagne, sans tomatoes of course), made by Etruscans in the first century AD.
The fact is, depending on how broadly you define "noodle", they were invented in many places independently. The Chinese get full credit for inventing them, but so do the Italians.
(Incidentally, the account of Marco Polo's voyages is a fascinating subject in itself. It includes a few things that one would have to go to China to know about, that were not known in the west. However, large parts of it were also lifted verbatim from a fictional account written long before Marco's travels. And there are other reasons to doubt his story. I guess that's another reason to invent time travel or at least FTL.)
Monday, September 11, 2006
The long tailpipe
The first reaction people have to the idea of an electric car (setting aside questions of cost, technology, etc., and assuming a basic respect for and awareness of the environment and associated issues) is exultation: "Zero emissions? Not just low, but zero? Sign me up!"
However, this proves to be a superficial analysis. How can a car produce zero emissions? Simple: the energy conversion processes which produced the energy that drives the car, and consequent pollution, were done somewhere else. If you consider only the car, you have a 100% improvement in pollution rates, but if you consider the entire system, it's not so clear. Now a power plant somewhere is producing more electricity to make up for the gas you're not pumping into your car. How are they doing?
Most power plants will be able to convert energy far, far more efficiently than you can in a car, because they can build much more sophisticated devices, thanks to economies of scale, both to convert energy more efficiently, and to better filter and reprocess the pollutants that result. Furthermore, some of the energy is being produced using methods that generate little or no pollutants into the air, such as hydroelectric, solar, and nuclear (yes, nuclear). So it would seem like they are bound to win the comparison.
However, the transmission of energy from their plants to your car's movement involves quite a few conversion steps, and each one of these involves a loss of energy. There's a considerable loss moving the energy down high-tension lines to your house, then more as it's moved into batteries in your car, and still more as it's moved back out of those batteries. If you calculate how much energy has to be generated to move your car, and how much pollution is produced in doing so, and compare that to how much your car would produce if it were gas-powered, the result is close enough that an electric car turns out to be a breakeven proposition in some parts of the country, a gain in others, and a loss in still others.
However, even this comparison is entirely unfair. If you are going to treat the electric car as part of a system and consider the pollutant output of the whole system, you have to do the same for the gas car. That means you have to add into your comparison not just the pollution your car produces, but also all the pollution produced by the drilling process, the tankers that move the crude oil, the refineries, and the tanker trucks that move the gas to your local gas station, and so on. And that makes the calculation tip heavily in favor of the electric car.
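For anyone who wants to see the shape of that arithmetic, here's a back-of-the-envelope sketch in Python. Every number in it is a purely illustrative assumption (grid emissions, line and battery losses, tailpipe output, upstream markup), not a measurement; the point is only to show where the conversion losses and the upstream pollution enter the comparison.

# Back-of-the-envelope "long tailpipe" comparison.
# Every number here is an illustrative assumption, not a measurement.

# Electric pathway: plant -> lines -> charger -> battery -> wheels
plant_emissions_g_per_kwh = 600.0   # assumed grid average, grams CO2 per kWh generated
transmission_efficiency = 0.93      # assumed fraction surviving the high-tension lines
charging_efficiency = 0.85          # assumed fraction surviving the trip into the battery
discharge_efficiency = 0.90         # assumed fraction surviving the trip back out
ev_kwh_per_mile_at_wheels = 0.30    # assumed energy the car actually needs per mile

kwh_generated_per_mile = ev_kwh_per_mile_at_wheels / (
    transmission_efficiency * charging_efficiency * discharge_efficiency)
ev_g_per_mile = kwh_generated_per_mile * plant_emissions_g_per_kwh

# Gasoline pathway: tailpipe plus drilling, tankers, refining, and trucking
tailpipe_g_per_mile = 370.0   # assumed tailpipe CO2 for a mid-sized gas car
upstream_factor = 1.25        # assumed markup for everything upstream of the pump
gas_g_per_mile = tailpipe_g_per_mile * upstream_factor

print(f"Electric, whole system: {ev_g_per_mile:.0f} g/mile")
print(f"Gasoline, whole system: {gas_g_per_mile:.0f} g/mile")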
So after a lot of flip-flopping, we end up finding out that the first assumption turns out to be right. But that's only by luck. We still have to consider the question systemically; considering only one part of the system in isolation can easily lead to incorrect answers.
Tuesday, September 05, 2006
Menacing reubens caught on film!
Took some pictures with the cell-phone camera of the menacing reuben at Friendly's. There was a great one I hadn't noticed before, with a monstrous cheeseburger lurking in the shadows of a covered bridge, but the picture didn't come out. (Even these aren't so hot; the lighting was bad.)
So without further ado, here are a pair that demonstrate what I was talking about:
Doesn't that make you want to just quick run down to Friendly's and have a big meal... before it's too late?
Monday, September 04, 2006
Popular science
Science isn't complex and difficult because freakish scientists like to make it that way. It's complex and difficult because the universe is complex and difficult. More accurately, because our minds, tailor-made for dealing with the comparatively super-simple everyday world and everyday life to which we are heir, must strain to function on the vastly different level of complexity that is required to understand even the simplest parts of the universe's workings.
Popularizing science means finding a way to bridge that gap. There are many good examples of this; Carl Sagan's works are the outstanding gem amongst them, though many others have done excellent work in the field. But what rankles me is a particular kind of bad way to do it that is perennially popular, one which I think is best typified by the "Tao of Physics".
Now, the book of that name may indeed have an insight or three in it worthy of the printed page (and if you go to your local bookstore and look at what's worthy of the printed page, that might not seem like a very tough criterion). However, while it pretends to offer an insight into physics, it not only does not, it gets in the way of such insights. And it is just one of a whole class of attempts to make science accessible by taking a metaphor far too far. (If I seem to be picking too much on that particular book, it's not because it's the worst example, not by far; just the most recognizable "banner" for this approach.)
Because that's what the Tao of Physics is: a metaphor that has been mistaken by its author, or at least by most of its readers, for something more than a metaphor. (Well, probably, it's more like a simile, but you get the point.) Along the way, the actual science got lost. Instead, the reader is left drawing spurious and entirely inaccurate conclusions about science and scientific concepts, conclusions predicated entirely upon mistaking the metaphor for an actual equivalence.
To me this is a disservice to science, to the reader, and probably to the concept of tao (or whatever is on the other side of the metaphor). The genuine insights in the book could just as easily have been expressed without having to be mired in the morass of a metaphor gone awry. And any attempt to clue in the reader to what's actually amazing in science, already a daunting task, is now far harder, as these mistaken metaphors provide so many paths by which the reader's mind can easily slip away from the path of understanding.
Sunday, September 03, 2006
The selfish parasite on the steering wheel
A new development adds a mind-boggling twist to the question of the biological side of the unity of the organism. Toxoplasma infections in the brain can cause rats to lose their fear of cats, which helps the parasite continue its life cycle by infecting cats. It might influence, or in part determine, the personality of humans carrying it. And rates of infection in different nations and cultures, and their change over time, may help explain cultural personality traits such as the prevalence of machismo, rates of neurosis or confidence, and interest in novelty.
How much of both our collective cultural personality, and our individual personality, is the sum of the influences of organisms dwelling in our bodies? The selfish gene is only the beginning of the reductionism of free will. Sometimes I think the brain's biological purpose isn't to think, but merely to rationalize. Not only are we not single organisms but committees of billions; even our thoughts, personalities, and identities might, in a very quantifiable sense, be formed in large part by a committee of chemical causes that are determined as much by individual happenstance as by species evolution.
Where, then, is identity? We already knew, if we were willing to, that much of what we consider our free will consists of unpredictable chemical shifts whose causes can be internal and external, and rarely have any relation to the reasons we come up with. All this discovery really does, or could do once it's expanded upon, is expand yet further the gap between what we think our reasons are, and what they may actually be; and add to that gap a new purposefulness, the purpose of an organism that may be shifting our personality merely to allow us to live or die in a place convenient for it to spread. The answer lies not in our stars but in our selves; but is your self your body, or the things that live in your body?
Saturday, September 02, 2006
But I have to stay in character!
In the eternal principle of "meden agan (nothing too much)", we turn now to the question of whether a roleplayer must remain true to character, world, genre, and feel at all turns or not at all, or somewhere in between. Can you guess which answer I'll end up picking?
On the extreme right we have the metagamers. They come first because they came first in roleplaying, which evolved out of wargaming, and which routinely recruits people from the worlds of card, board, and other tactical games, computer games, and other media which focus primarily on tactics and in which winning (or at least keeping score) is paramount. Roleplayers constantly bemoan the abuses of metagamers. Those who admit they're only here to play to win are looked down on, but it's the ones who claim to be roleplaying, yet always bend to fit the tactical advantage, whose characters always somehow end up chasing the greatest gear or magic items or piles of gold or (most commonly) the highest experience level, that "true roleplayers" really despise. Twinks, minmaxers, power-gamers: they're a stain on the hobby and a perpetual obstacle to those sincerely seeking character exploration and development.
So we've got an easy villain to rail against, and what does that mean? Sure enough, it means people overreacting and going too far the other way. And when they do, criticizing their excesses is an uphill battle, because they're defending their banner of True Roleplaying from the elitist stand of someone opposing an obviously villainous opponent, the Twink. But being opposed to a villain doesn't make you a hero.
Those who elevate character motivations, genre conventions, game feel, etc. to an Ideal, who make them an end rather than a means, are making almost as serious a mistake as the Twinks. Metagaming has its place; what matters is how it's used, not whether. The fact is, while you're pursuing your character's internal motivations wherever they lead, it still falls upon your shoulders to make sure that somehow, they lead in a direction which:
- is fun for you
- is fun for the other people in the game
- is fun for the GM
- doesn't screw anything up for anyone else
- doesn't make anyone else waste a lot of effort and time unwillingly
- doesn't make your fun come at the expense of others involved in the game
All in all, you have to remember that Not Metagaming is not some sacred trust. It's simply a means to an end. What you're really after is Not Sacrificing Roleplaying In Favor of Metagaming. When metagaming goes against roleplay and fun, eschew it. When it facilitates roleplay and fun, embrace it.
Friday, September 01, 2006
A hole in history
Pretty much every year from first grade to my senior year, I had a Social Studies class that went through history chronologically. Each year would focus more on one thing or another. The history of the Americas, or systems of government, or wars, or kings and queens, or local history, or the role of religion in history. But each time we'd start at some point and progress forward through the years.
And every year, as summer loomed, the teacher would find that we were behind schedule, and as we hit the 20th Century, things would get covered more and more scantily. As a result, I find I learned a fair amount about the American Revolutionary War, the Age of Exploration, the Holy Roman Empire, the Industrial Revolution, the British monarchy, the Civil War, and a host of other subjects, but I know almost nothing about the latter half of the twentieth century.
World War I was always covered fairly well, and I got a moderate exposure to the temperance movement, the Roaring Twenties, and Prohibition. The Great Depression and the New Deal always got covered quite thoroughly, as did the League of Nations. After that we got very hazy. World War II, and the events leading up to it, were always covered only in broad strokes. I knew who was on which side, the leaders, the Nazi atrocities, and Pearl Harbor, but beyond that, very little. I'd heard of D-Day, but I don't think I ever had a history teacher or text tell me what it was; I only knew what little I'd picked up by osmosis from popular culture.
(In fact, what prompted this post was reading about Peter Jackson's plan to remake the movie The Dam Busters. The article mentioned a little about the military action on which the movies are based, but didn't say how it turned out. I had never heard of this operation, not at all, and I soon realized I'd never heard of any military action in WW2 except the ones I'd heard about in passing from wargamer or historian friends and associates, or picked up from movies, books, and TV shows, with the one glaring exception of Pearl Harbor. Based on my schooling, I know more about Drake's actions against the Spanish Armada than about all of WW2.)
As for after World War II, it gets even worse. Not once in any of my history classes did we talk about anything after WW2 except a brief discussion of the formation of the United Nations. Never did we cover the Korean War, the Vietnam War, the civil rights movement, the atrocities in Africa and Oceania, tensions in the Middle East, the Kennedy assassination, the Cold War and McCarthyism, the Watergate scandal, the Space Race, Cuba and other spreads of communism, or anything else going on between the end of World War II, and the "current events" things we were studying as they happened. The first major historical event after World War II that I learned in school was the Energy Crisis.
Naturally, I've filled in this gap somewhat outside of, and after, school. I wouldn't do very well on a trivia category covering the third quarter of the twentieth century, but I wouldn't be hopelessly lost. Even back in elementary school I probably knew more about the Space Race than my teacher did anyhow. But there are still unfortunate gaps. It's not that I really need to know those things much better than I do; it's just odd that I know them less well than I know other eras that matter to me even less.