Monday, May 31, 2010

Puzzleball

Yesterday I finished the 540-piece globe puzzleball I got for Christmas (and hadn't even opened until a couple of days ago). Here are a few pictures of it in the process of being built:




And here it is, completed:

There was one piece missing in the south Pacific ocean when I took this picture, but I found it later.

As a puzzle it's interesting and challenging, but the oceans are inordinately hard. They cover an awful lot of Earth's surface, and when you do the puzzle in globe form, it's clear that's even more than you're expecting, even if you know the statistics. It would have taken me weeks if I hadn't used the numbers on the back of the pieces for some parts of the ocean. I don't think there's any way to correct for this.

Near the end it gets crazy hard to press the pieces into place because you can't get your hand inside to press against the back. I tried to make up for it a bit with a long-handled wooden spoon, but it's too clumsy to really do the job. There's an easy solution that I'm surprised the manufacturers haven't hit on. Split the puzzle into two puzzles, one per hemisphere, and give the equatorial "edge" pieces tabs that snap into one another. Then you have all the freedom you need to work on the two halves with full access, to make sure each piece fits securely. At the end, you slip the two halves together to make a full globe. This would work with all their spherical puzzles, not just the earth globe.

Though it's kind of fragile, I can now treat the completed puzzle as a desk globe. I just need to decide where to put it.

Sunday, May 30, 2010

Modern mind-body dualism

The concepts of soulism and Cartesian mind-body dualism are so ubiquitous that all our thoughts are shaped by them. Those with even a basic grounding in philosophy will have these ideas so entrenched that, even when opposing them, they are taking on the stance of going against the establishment, rallying against cultural assumptions. And most people who have no idea what I'm talking about have the ideas even deeper in their thoughts; they're just less aware of them.

But these concepts were formed at a time (almost all of human existence) when the idea of something immaterial was so vague that it was unavoidably entangled with mysticism. The soul is explicitly described as a non-physical thing, and yet, it is treated as if it were a physical thing that just happens to be made of a different sort of physical substance. It is, above all, a thing. Less so the mind of Cartesian dualism; in this concept we get halfway to the idea of what we now consider the mind to be, but only halfway. The mind may not be a physical thing, but it's still a thing.

If, somehow, we could have gotten to this point in technological history without finalizing the ideas of soulism or mind-body dualism, and we were only inventing the idea now -- or, to put it another way, if some child could somehow grow up today familiar with modern technology yet wholly innocent of culturally established memes about dualism, then invent his own philosophy -- the resulting dualism would be a thousand times more apt, because for the first time in history, there's a convenient, handy analogy that leads us in what I think is more the right direction. Just as the camera helps us understand the eye (and, if the analogy is taken too far, can lead to misimpressions), the computer can help us understand the brain. And for the first time, everyone is familiar with software, and how it exists as an ephemeral, emergent property, partially independent of yet entirely dependent upon the underlying hardware.

The intangible thing that coexists with the physical thing no longer needs to be some mystical spark, or possess inexplicable properties because of its noncorporeality. The mind is simply software: it is information, it is a particular configuration of the hardware. It is tangible in the sense that the shape of a house you built out of Legos is tangible, and intangible in the sense that those same Legos can be taken apart and built into an airplane, and where is the house now? The mystery is not nearly so mysterious now that we all have in our pockets a six-ounce, $100 bit of metal and plastic that performs the same transcendent mystery as the mind does in the brain.

What kind of philosophy would we create if we had this far clearer, much less misleading analogy to start from? To be sure, it would lead us to spurious conclusions of its own, but I suspect there would be far fewer of them, and less grievous ones.

Saturday, May 29, 2010

Surrogates

The reviews of Surrogates were largely negative, and I can see why, but they're mostly a matter of expectations calibration. (Minor spoilers ahead, but they really won't impact your appreciation of the movie; they're all things you'll see coming, particularly if you've seen the trailer, which gives away the final scene.)

The premise of the movie is that everyone lives through robotic surrogates that look just like people, and which they remotely control from a "stim-chair" back in the safety of their home. This is a fascinating premise in a lot of ways. The many ways this would be used, the benefits and disadvantages, the impact on society, the whys and wherefores of how it could come to be this way (and why the inefficiency, compared to a simulated world, makes perfect sense), are ripe for exploration.

That's where the movie falls down, though. Apart from a few throwaway lines and plot points, none of that is explored. We do have one person trying on a body of the opposite gender, and a few people using "enhanced" bodies, but by and large, everyone just looks like themselves, only better-looking and in better shape, and everyone can do the same things they would normally do, only with better endurance and less worry about injury.

Several of the threats implicit in this possibility are glossed over. There's a casually tossed-off line about how you can't just jump into someone else's surrogate because they have to be carefully matched to your nervous system -- and then this gets ignored pretty much from then on. The communications between surrogate and stim-chair aren't even encrypted, and all route through a single central switchboard that can tap into or override them at will -- we take more precautions than that with email. Even the central threat -- a weapon that scrambles a surrogate so thoroughly it also scrambles the human behind it -- is glossed over.

Ultimately what we have is a fairly serviceable action/suspense movie that takes its "hook" only as far as it needs to get you to notice it, and then ignores it thenceforth. There are good production values, lots of scenery for Bruce Willis to chew up, a smattering of good action, and enough plot to keep you working it out in your head -- not because you won't see the twists coming like a freight train at night in a cornfield, but because you will need to at least keep a scorecard.

The one place that they do dig up the premise is to make begging-the-question "arguments" about how this technology sucks the humanity out of humanity. They do this by simply presenting it as a fait accompli: a few people have used it to escape from things they should be facing, or find it's "not for them", and while lip service is paid to the good side (the virtual elimination of disease, death due to injury, and most prejudice based on bodies; vast reduction in crime; restoring full functionality to the disabled; and more), ultimately this is all deemed unimportant compared to a few people feeling less alive instead of more alive this way, and some blurry first-year-liberal-arts-major sentimentality.

Compare this movie to Strange Days and it will make you ache, because Strange Days does better as an action movie, and as a mystery, and as a character piece, and still manages to find time to give its similarly-scoped central premise a really thorough workout and exploration, finding both the good and bad uses. (It's hard to avoid the comparison when both movies feature a dreadlocked, black cult-of-personality character who is the de facto leader of an insurrection against the current order of society, too).

So in the end Surrogates is barely a tenth of the movie it could have been, if they'd really given their premise any thought, or a fairer, less dismissive analysis. But the movie it is, is okay for filling up a few idle hours when you're not looking for much. Expect that, instead of what the ads want you to think the movie is, and you'll be fine.

Friday, May 28, 2010

How would an implant computer change my life?

I'm still very ready for an implant computer. And for just about every step between what I have now and that. But it's hard to make really complete and accurate predictions about how such a thing would really change our lives. Most predictions of the impact of technology on the future make the mistake of taking what we would use it for in today's environment and projecting that onto tomorrow, rather than considering how that act also changes the environment itself. Ask someone in 1980 what Google would do to the world and you'd be likely to get answers focused on how to use Google to solve 1980's problems, but nothing about the many ways ubiquitous access to robust search has changed the kind of things we do, not just the way we can do them. And implant computers would be like that ten times as much.

My first reaction is to think of doing the things I do now, but more ubiquitously, and more efficiently. So instead of wondering something, I could look it up, no matter what I was doing or where. Not just because I was curious ("who was that in that movie?" still comes up while driving down the highway...) but whenever I need information to do whatever I'm doing (imagine how much easier travel would be with ubiquitous access to maps, GPS, flight time updates, restaurant reviews, and my calendar, right in my brain). Instead of trying to remember to do something, I could make a note right then, as easy as thinking; or just do it, depending on the task and my circumstances. I could fill in all those vacant times, like riding in the car or waiting for people, with writing, coding, exercising, or doing any number of other things that don't use up my mind.

See how I'm just doing what I do with technology today, but in more times and places, and with some inconveniences (like having to use a keyboard) removed? The next step then is to think of new things an implant computer would let me do. One of my favorites is HUD-style "augmented reality". Walk down the street and overlaid on my view would be information about the people and places I'm seeing. This shop has 78% positive reviews and is currently having a sale on such-and-such an item I have flagged on my wish list. That person walking towards me is identified by facial recognition as someone I met at a meeting last year at work, so I can respond appropriately if she greets me. As I drive, "turn left in 100 feet" is replaced by a highlight on the actual route I need to be taking, along with emphasis on important things (like stop signs and pedestrians) and de-emphasis on unimportant things (advertising signs filtered out automatically... or replaced by better advertising for those willing to pay enough for it). Anything I'm looking at, from a menu at a restaurant to a national monument, becomes a hyperlink I can drill down to find out the calorie content or historical significance (respectively, one would hope).

But I wouldn't be the only one with an implant computer. That's where things get really hard to predict. Once implant computers are in common usage, that woman is not going to be impressed that I recognized her a year later; she'll take it for granted, since she also has an implant computer and is currently looking at my Facebook page (by then, maybe Facebook will finally have figured out privacy). Would my boss expect me to be working 18 hours a day since I am "plugged in" that long? Would it be impossible to play in trivia games (or would they invent a new form of trivia to suit it)? What would advertisers be doing to try to take advantage of implant computers to make sales? How could con artists use their implant computers for cons -- since simply adapting them to the old cons would be helpful only for a little while, until we all got used to using ours to avoid those cons?

We've barely scratched the surface of ways that implant computers would change society, and we already see that the changes would be far larger than anything you'd think of by just taking what you do with a smartphone, laptop, or iPad now, and extending it by removing the keyboard and adding ubiquitous connectivity.

Thursday, May 27, 2010

Transitory to-do lists

I've got that kind of personality that is called "pathologically organized" by people who can't find their keys and forgot to pay the electric bill. I make a lot of lists. My smartphone is always, always on my person, so I can jot down additions to my various to-do lists any time something occurs to me, and so I can review what's the next thing to do.

But that's still not enough. I regularly find myself, on getting up to do something, making a list of items to do on this trip, summarizing each one with a word, and then repeating the string of words to myself. It's not worth writing them down when it's just one little trip around the house, but I will lose items otherwise. Maybe it's a sign of my advancing age, though it seems like I've been doing this for many years.

I wonder if someone would be puzzled or amused to hear me walking around saying "bug-spray helmet jacket harness gloves" over and over -- or maybe even more so for a list that fits together less well, like "charge soda load headset slippers." But if I heard someone else doing something like that, I'd just nod knowingly at the kindred spirit.

The thing is, I never do. Maybe it's just me.

Wednesday, May 26, 2010

Attitudes about curing diabetes

A recent visit to the eye doctor bumped me up against the contradiction at the heart of the medical community's attitude towards diabetes.

It has been an uphill battle for doctors to get people to understand that diabetes is a chronic, lifelong condition. Most people have a vague notion that you just have to avoid sugar and take your medicine every few hours, and you're fine. Diabetes usually appears in movies as an excuse for a character to need their medicine now Now NOW NOW or they'll die! (when for the vast majority of diabetes sufferers, the only immediately life-threatening need they'll face is a need for food if they took their insulin a little while ago and then missed the planned meal).

At least, that's how it was ten years ago. Nowadays, while Hollywood still insists on using diabetes as a ticking time bomb plot device, huge increases in the rate of diagnosis mean most people know someone who is diabetic and probably have picked up a little bit more about it, and lots of public education has also made a dent. People have gotten the message that diabetes requires a lifetime of care. There's more awareness of being "prediabetic" (which is often a euphemism, sometimes for "you're diabetic, but we think you'll be better motivated to take care of it if we call it this instead", and sometimes for "you need to eat better and exercise more, and this is a way to strike some fear into you"). And this public education also includes educating the medical community; while gastroenterologists might be up on these things, your GP might lag a while, and your dentist probably learns about it the same way you do.

So just about when they're getting us to take diabetes seriously and understand it, a complication arises. It turns out that there has been evidence for a few decades now that gastric bypasses can resolve -- that is, "cure" -- diabetes. This has been known for a long time but not publicized even within the medical community, for the simple reason that no one really understands precisely how it works. It's also very easy to misunderstand. "Fat people get diabetes. Gastric bypass makes you lose a lot of weight. Ergo, the loss of weight is how it works." Well, no. It turns out that in the majority of cases, diabetes measures like blood sugar drop precipitously in the first few weeks after the surgery, long before there's been enough weight loss to matter (losing that same amount of weight by other means would not be expected to, and does not, lead to significant changes in blood sugar levels). Something fundamental and metabolic changes -- probably something to do with the extensive nervous system around the stomach (almost as complex in scale as the brain, yet barely understood) -- and that leads to an almost immediate resolution.

We get stuck in a bit of semantics at this point. After gastric bypass, am I "cured", or is my diabetes merely in "long-term remission"? Yes, it's possible in ten years I could become diabetic again, particularly if I find ways to regain a lot of the weight. But if I had never been previously diagnosed and someone checked me out now, my fantastically normal blood sugar levels would soundly put me into the class of "not diabetic, not even at high risk to be diabetic". I don't really have a higher risk of becoming diabetic than someone else of my current weight who hadn't been through being diabetic and having a gastric bypass -- in fact, probably less, since regaining the weight, though possible, is less likely.

The insurance companies still treat diabetes as a life-long condition, and we don't argue the point too strenuously, because that makes them more likely to authorize some things that the doctors ask for that are probably no longer needed. That's where my eye doctor's office comes in. They still want us to come in for an annual checkup, because that's standard procedure for diabetics, even though our blood sugar levels are probably better than 90% of their never-been-diabetic patients'. They don't quite come out and say "no you aren't" when we refer to ourselves as "cured", but it's clear that that's how they think of it. They haven't gotten the news, or aren't convinced of it, and they treat us like we're oversimplifying or in denial. They're very polite about it, in fact, exceedingly friendly, cheerful, and upbeat, praising us for our sugar levels, our weight loss, our exercise regimens, and the condition of our eyes (particularly mine, which remain perfect at the age of 42), but it's still there.

I guess they had to fight a long time to get people to take diabetes adequately seriously, and understanding it as a permanent condition that is, at most, "in control," was a big part of that, so they're loath to let it go. No one wants to be the first to say "cured" officially (except the people who do the gastric bypasses, and even they are cagey about it), because of the fallout (both legal and in terms of public perception) if they end up wrong.

Yet the evidence keeps mounting. Even the ADA, which is notoriously conservative (still recommending the high-carb diet that has been proven time and again to be bad for diabetics), has come close to calling it a 'cure' by now. In the end, my blood sugar is probably lower than that of the majority of people around me, including skinny, athletic, unabashedly "healthy" people; and my risk of diabetic complications is as low as theirs, or lower. What's the real point of calling me "a diabetic in remission"? To "trick" me into exercising? Maybe for some people that works, though it's a sad commentary if it does.

Tuesday, May 25, 2010

Whatever happened, happened.

I'm fairly sure this is spoiler-free, even if you haven't watched since season one.

Since the final episode of Lost started at 9pm on a Sunday night, I couldn't watch it live, though I wanted to. Wondering how it would end kept me from getting a good night's sleep; I kept dreaming scripts, usually involving it crossing over with other shows or other storylines, because these are dreams and they have to involve something surrealistic and absurd.

We watched it right after work yesterday. And I can say that I am entirely satisfied. No, they didn't answer every fiddly tiny detail question, but they answered all the questions that they needed to answer. Even for some of the fiddly details that we despaired of ever getting answers to, I think the answers are there and we're still finding them. (I read a very interesting observation about the Hanso Foundation today that explained how a lot of seasons two through four fit into the larger story, for instance.)

Some things that are not explained are intentionally not explained because they are part of the premise: the story just happens to happen in a world where things like this are true. No one asks for an explanation for why Melinda can see ghosts, or why there even are ghosts, in Ghost Whisperer: it just happens to happen in a world where there are ghosts, and some people can see them. In the same way, some properties of the Island, and some other things like what Miles can do, are just how things work in this world, and while we do get explanations for what they are and how they work, we don't get explanations for why they are this way. Compare how much better Star Wars was when the Force was the thing which binds the universe together to when it was some nonsense about symbiont midichlorians. (And even that didn't explain anything, just pushed the question one level deeper.)

On top of having a satisfying resolution to the mysteries, the finale also had a satisfying resolution to the plot. The conflicts were drawn into sharp focus and then resolved. The challenges were built up to a climax and the tension ratcheted up, then the payoff made it all seem like it meant something. Even more, the characters were satisfyingly resolved, with each of the people we'd cared about given a chance to find completion, with some of the most tear-jerking, and entirely sincere, moments on TV in recent memory.

During the first couple of seasons, I considered Lost a really good show, well worth watching, though if it were cancelled I wouldn't've wept. Later, as they misstepped with the season about the Others, I came close to dropping it, and there were episodes I didn't pay strict attention to. Later seasons cranked things right back up, including the mystery and the complexity, but even more so, the characters. The show became more important to me again, but even coming into the final season, I didn't think of it as a favorite show. A few episodes this season were fantastic, but at no point did I consider that this could be on my favorite-shows-of-all-time list.

Maybe it's just the emotional impact of the final episode being so fresh, but the way it all wrapped up tied it so tightly in a bow that it feels like it retroactively made so much of what passed on the way here so much better, and now I'm thinking it's probably on my top ten favorite shows of all time list. I haven't actually made such a list: I only know with certainty what numbers one and two are. If I took the time to make such a list it would probably be full of moments with me musing, oh, wait, that show belongs on it too, and maybe by the time I was done Lost would have fallen off the top ten contenders list entirely. But even if that's so, it's remarkable that I'm even considering it now, when a week ago, it didn't even occur to me to ask the question.

Incidentally, the fact that I was at least half right, and more right than any other person I know of, in my theory about what was going on this season, probably isn't hurting either.

Monday, May 24, 2010

Revised MAME cabinet plans

Though I started with the plans for a MAME cabinet from the Ultimate MAME site, I decided to make some significant adaptations to suit my monitor and control panel plans.

The original plans were for a cathode ray tube monitor of considerable size, actually much, much larger than original arcade game monitors. The monitor I got is a 22" HDTV with a very slender profile. It wouldn't fit the space provided in any dimension; it's a bit too narrow, a lot too short, and very very much shallower, plus it's a 16:9 instead of 4:3 aspect ratio. Hence, I used a bit of trigonometry to reduce the size while keeping the same angle.
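
For anyone repeating this adaptation, here's a minimal sketch of the trigonometry involved, assuming the monitor mounts on a panel tilted back from vertical and the goal is to shrink the panel to the new screen's height without changing the tilt. The 15-degree angle is a made-up placeholder, not a number from the Ultimate MAME plans; substitute your own.

    import math

    def sloped_panel_footprint(panel_length_in, tilt_deg):
        # For a panel of the given length, tilted back tilt_deg from vertical,
        # return the horizontal depth and vertical height it occupies.
        tilt = math.radians(tilt_deg)
        depth = panel_length_in * math.sin(tilt)    # horizontal run, front to back
        height = panel_length_in * math.cos(tilt)   # vertical rise
        return depth, height

    # A 22" 16:9 screen is about 10.8" tall (diagonal * 9 / sqrt(16^2 + 9^2)),
    # versus the much taller 4:3 CRT the original plans assume.
    screen_height = 22 * 9 / math.hypot(16, 9)
    print(sloped_panel_footprint(screen_height, 15))  # assumed 15-degree tilt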

The original plans call for building a control panel, but I intend to buy one fully assembled from MAMEroom. It's a fair bit smaller than the one in the original plans, and since it's a separate unit rather than built in, I decided to adjust the plans, making the space where the control panel goes a little deeper to better fit it. This might need adjustment after I buy the control panel next month, but for now, the plans will fit the depth and height of the panel. (The width will stick out a bit to either side, but that's typical for MAME cabinets.)

I also reduced the width of the whole cabinet by six inches in order to better fit the width of the monitor. That way, it'll fit the monitor so that there's no space showing on either side. I might put in a bit of a bezel to frame it, but I might just leave it showing as is, since it's a pretty nice front.

Finally, since I don't need clearance for the depth of a cathode ray tube, I was able to make a significant reduction in the cabinet's depth, and simplify the shape: there's no need for the slope at the top rear anymore. The resulting cabinet is about 26" deep instead of 40" deep, but still has gobs of room for the stuff that'll go into it. It's also about 5" shorter, but that's not going to make much difference.

It should fit my components better, be a little simpler to build, take up less space, and still allow plenty of room to access the equipment. I won't start building until after I buy the control panel next month. Though I might dig out one of my old retired computers and try setting up MAME on it to see if it's up to the task, or if I need to buy a cheap computer next time one comes up on Woot.

Sunday, May 23, 2010

Bridal shower versus bachelor party

Admittedly, the bachelor party I'm planning and hosting is not your ordinary bachelor party. No scantily-clad girls will be jumping out of any cakes or dancing around any poles, and the most potent potable there will be diet Pepsi. Instead of the usual fare, we're going to play lazer tag, miniature golf, and arcade games at Pizza Putt through one of their group event plans. But all in all this is probably about the same amount of work and complexity to plan as the more traditional version.

By contrast, the bridal shower which is going on at my house later today (as of this writing, Saturday, May 22), seems like planning and carrying out an invasion. The whole preceding week has been a flurry of housecleaning and moving furniture and shopping. Between decorations, party favors, cleaning supplies, and food, we've had to do a ton of shopping for this event, and it's not even finished. There's cooking to be done, too. There were games and events to plan and prepare for. Later today, we have to move most of the living room furniture out into other rooms, and rearrange most of what's left. I'll have to take the dog out for four or five hours and find somewhere to occupy her that won't involve her going nuts or digging up anyone's yard, which means I'll have to stay on the move, most likely. And when it's all done we have to move all the furniture back.

All I have to do is set a time, make a reservation, and do a little coordinating of shared rides. Maybe when we get there I'll need to offer a bit of leadership in setting up teams and times for lazer tag, or when to eat the included pizza. I guess I got off easy.

Saturday, May 22, 2010

The last people to heat spaghetti in a pot

Microwave ovens actually date from the 1950s, but they started to become ubiquitous all at once in the 1970s, when I was quite young. Within a few years we went from where most people hadn't even heard of them, to where anyone of moderate means was trying to figure out where to put one in the kitchen, about as quickly as DVD players made their way into our houses in the 1990s.

Microwaves have been standard equipment in virtually every kitchen for about thirty years now. There are people who have owned several houses and have children who never lived in a kitchen without one.

Sometimes I wonder if they look at a can of Spaghetti-Os and wonder: why do they still sell these in metal cans, when virtually everyone who eats Spaghetti-Os is going to heat them in a microwave oven, and yet you can't put the can in the microwave? Wouldn't it make more sense to use plastic or something? And of course they do, but only in canisters half the size which usually cost more, so ultimately, the metal cans (of a typical can size) still make up the largest part of their sales (I assume).

And thinking that made me realize, my generation is the last one to remember when the normal way, in fact almost the only way, to heat up Spaghetti-Os, or canned corn, or virtually anything else in a can that needed heating, was in a pot on the stove. The idea must seem almost as alien as horse-drawn buggies to people just 5-10 years younger than me. In fact, the same is true of heating up most leftovers -- which means that there are tastes that are effectively extinct now, like that unique quality that leftover spaghetti (with sauce) gets when you reheat it in a pot, and it gets a little tough, but not in a bad way -- it's kind of hard to describe; you have to try it.

I suppose that if you really look, everyone can probably find something that their generation was the last to taste or try or do, and the advent of the microwave doesn't make that more true -- just more obvious. I wonder what other transitions like that aren't so evident.

Friday, May 21, 2010

How many sudoku puzzles are there?

I found myself wondering how many possible Sudoku puzzles there are, because, hey, that's the kind of thing I think about when there's nothing else for my brain to be doing. I went through a few approaches trying to figure out how to calculate it, and I'm not totally sure if the one I settled on is valid logic.

Consider a blank Sudoku puzzle:


Right now, with nothing determined, the upper left cell in the corner can be anything, so it has nine possible values. Let's write a nine there -- representing not the value in that cell, but the number of possible values that could be there.


Now let's assume we've figured out what the upper left cell is. That means there's eight possible values for the cell just right of it, and seven for the cell right of that, and six for the next one, and so on.


Following the same reasoning, we can fill in more cells with how many possible values there are left for each one, based on the assumption that some valid value is in all the cells we've already filled in:

The order we fill in cells will decide what values end up in each cell, but I am fairly sure (though I probably couldn't prove it with mathematical rigor) that whatever order I choose, all the same values will end up in the grid, just in different places.

So far, I am nearly certain that I've proven something: if you multiply all the numbers written down so far, that's the total number of combinations of values that are possible in the cells that are filled in, in the set of all possible valid Sudoku solutions. (Right now, that's 407,586,816,000 possible grids.)

However, at this point it starts getting tricky. Consider the cell in row two, column four. We now know that three other cells in its row have been determined, and three other cells in its square. But we cannot conclude that that only leaves three possible values, since some of those might be the same. In fact, the same three numbers might appear in both groups, leaving six possible values here. But we can't just write in a six and proceed, because there are only six possibilities here in some of the 407,586,816,000 grids. So we would need to figure out how many of those 407,586,816,000 have six, and how many have five, and how many have four, and how many have three; and then divide up the calculation. If we call those values N6, N5, N4, and N3, it would follow that N6 + N5 + N4 + N3 = 407,586,816,000. The total number of possible Sudoku puzzles with the cells filled in we have so far, plus that next cell, would be equal to (6 * N6) + (5 * N5) + (4 * N4) + (3 * N3).

And once we try to do yet another cell, we will further have to bifurcate our calculation, including both those bifurcations, and the further bifurcations possible in that cell. Just a few more cells and this would go far beyond the bounds of calculability. Doing the entire puzzle would be impossible.

I've tried to come up with a different order to fill in the cells that would avoid this, and it's simply not possible. You can go back to the last version pictured above and try finding as many cells as possible you can fill in without a bifurcation, to minimize it. But more than half the puzzle will need bifurcations like this under the best of circumstances, and even three or four bifurcations make the calculation impossible. So I must conclude my strategy is fundamentally unsound. It devolves into a brute force method too quickly.
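
For the curious, here's what that brute force looks like: a minimal backtracking counter, sketched in Python. Run on the 4x4 "Shidoku" analogue (2x2 boxes) it finishes instantly and counts 288 valid grids; the identical code with n=9, box=3 is correct but would have to walk an astronomically larger tree of partial grids, which is exactly the wall described above.

    def count_grids(n=4, box=2):
        grid = [[0] * n for _ in range(n)]

        def ok(r, c, v):
            # v must not already appear in the cell's row, column, or box
            if any(grid[r][j] == v for j in range(n)):
                return False
            if any(grid[i][c] == v for i in range(n)):
                return False
            br, bc = r - r % box, c - c % box
            return all(grid[br + i][bc + j] != v
                       for i in range(box) for j in range(box))

        def fill(cell):
            if cell == n * n:
                return 1  # every cell placed: one complete valid grid
            r, c = divmod(cell, n)
            total = 0
            for v in range(1, n + 1):
                if ok(r, c, v):
                    grid[r][c] = v
                    total += fill(cell + 1)
                    grid[r][c] = 0
            return total

        return fill(0)

    print(count_grids())  # 288 for the 4x4 case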

The question I can't answer, though, is whether there is a more elegant solution. I suppose I should do a Google search to see who else has attacked the problem, but I'd rather give myself a few days to see if another idea occurs to me -- or if anyone posts some ideas on my comments.

Thursday, May 20, 2010

The next generation of GPS

Nowadays we pretty much take for granted that a GPS will include up-to-the-minute road maps, listings of restaurants and gas stations, turn-by-turn directions, on-the-fly rerouting, voice commands, and various trip planning features. Good ones also have voice recognition, the ability to say even street names, can recommend where to get a good price on gas, link to current data about road repairs and weather conditions, and can tie into Web sites about businesses. But it wasn't that long ago that a typical GPS didn't even have maps; you just set waypoints or recorded routes, or at best, prepared routes ahead of time on a PC and dumped them as a series of waypoints into the GPS.

And yet, even today's gee-whiz GPSes don't quite give us the guidance we'd need to be able to find an unfamiliar destination without a navigator, without a bit of luck, in most cities. The last hundred feet are fraught with difficulties, as are the hundred feet before each turn, particularly in cities where the roads come close together, so it's not clear which of the upcoming turns you're being warned about. And there's the issue of whether you need to be in the left or right lane.

Some of this can be addressed with improvements in the database of roads, which is a perpetual challenge -- gathering the data for all the roads was a monumental task, and enriching it with that extra level of detail would mean duplicating much of that effort. But most of it needs something else, something that, if it didn't already exist, I'd think it nearly impossible to hope would become available... but fortunately, it is being gathered right now: Google Streetview.

What a quantum leap it's going to be the first time someone makes a GPS that uses Streetview to not just tell you the turn is coming up, but show it to you. The actual road you're on, with the next turn (or the destination) highlighted. I've seen "maybe in ten years" stuff about augmented reality HUDs in cars to do this, but there's no reason TomTom couldn't have a model out next month that did this. Putting it on a HUD overlaying reality is super, but really, a 6"-wide color screen on your dashboard with a Streetview image of where you are right now (thus echoing what you see out the windshield) with the next point highlighted would be 90% of the benefit of HUD, but available with today's technology.

Surely Google has thought of this. Maybe they're planning to make Google Maps (or at least the mobile version) do this, once mobile bandwidth is adequate for streaming the images, or something. I can't wait.

Wednesday, May 19, 2010

All hair colors are hot

What hair color is sexiest on a woman? I have a paradoxical reaction to this question which I can't make any sense of, because they all seem to fit. It's not like I don't have a favorite, and it's not like which one is my favorite changes from day to day. Simultaneously, paradoxically, each of them is my favorite. Which makes no sense and which I can't resolve or even explain. It's like if I think of any one hair color on a woman, I get the same reaction of appreciation you'd expect if it was my favorite hair color, the same one I might get from thinking of some other trait that was particularly of interest to me.

I can't really compare when they all get that reaction. If I try, I can think of a reason for each one. For instance, redheads are exotic for being scarcer, especially more natural shades of red, and that seems like an edge... until I think of one of the other colors, which brushes aside any sense that redheads really have an edge after all. It's like my favorite really is "whichever one I'm thinking of right now" as if I had no more memory than a cat.

About the only thing that doesn't get that reaction is the more obviously artificial colorations. Even then, an obviously artificial red sheen on brown hair, or a bottle blonde, isn't too bad, just not as "favorite" as a more natural-looking color. Though go too far into unnatural and it starts to lose appeal. Bubblegum pink is tolerable; green, however, and you've lost me.

I guess I just like hair. Still, it makes no sense to me. I can certainly understand not being able to pick a favorite, but what does it even mean, beyond a meaningless banality, to say they're all favorites? I have a feeling the distinction isn't coming through in this text.

Tuesday, May 18, 2010

Apollo 11

Apollo 11 launched a week after my second birthday, and landed on the moon four days later. My mother told me that it was the first thing I ever showed interest in on the TV; before this, I'd never shown much sign of noticing the TV, though children under the age of two often love staring at it (a friend's baby has been watching Jeopardy since she was under one year old; we hope to recruit her for our trivia team when she grows up). I'm pretty sure my mother said I watched the landing; I don't know if I watched the launch, too.

Most likely, it's just a coincidence that, at the age of two years plus a week, I found the moon landing the first good thing on TV, given that I later became fascinated with space and have remained so all my life. But it's certainly tempting to imagine it's not just coincidence. While people don't generally remember anything from before the age of about four to five (though many people think they do; it often turns out to be memories of hearing about events later), we really don't know much about what kind of impressions these events make on their minds. Maybe it was being fascinated by that broadcast that shaped my mind to be fascinated by the subject later, and all my life.

Of course, it could be that there was something about me already formed by then which drew me to certain topics, and so this wasn't what caused it, but just an early sign of it. But even if you feel it's possible to have a predilection for space or futurism at the age of two, it's harder to believe that any two-year-old, even a bright one like me, could have really understood what was going on in that TV broadcast enough for that kind of predilection to kick in. At least enough to make a complete change from "totally uninterested in TV" to "staring raptly at the screen through the entire broadcast" (as my mother characterized it -- though she might have been exaggerating).

In any case, I kind of like knowing this about myself. It probably means nothing, and so probably shouldn't be a point of pride, even a minor one, but it certainly can't hurt.

Monday, May 17, 2010

The virus alert virus

Nowadays you (or at least I) rarely get emails forwarded from family members and people you barely know, with ten screenfuls of other people's email addresses at the top showing a long chain of forwardings, and a breathlessly panicked subject line about the latest terrifying computer virus. But it used to be common (at the same time as you often got other mass-forwarded emails with other things, like banal "inspirational" poems). I'm not sure why this went out of fashion. My rose-colored glasses aren't thick enough for me to imagine that people got wiser about either how viruses work, or how mass-forward emails worked; more likely, they moved this to other media (people are probably mass-forwarding twitters and Facebook updates now, though I don't see the virus scares on Facebook, so maybe not), or they just took me out of their mass forwarding.

Back when it was common, I would dutifully look up the relevant facts on Snopes, McAfee, or other sites, and send back the inevitable "this is a hoax" message, but there was no joy in it. The person who sent it out always came back feeling stung and defensive, insisting that they were just playing "better safe than sorry," and at best, unapologetically blind to the fact that indiscriminate panic actually undermined that safety. I might also try to explain how, the moment they sent that email, they were sitting in front of the largest, easiest-to-use, and most up-to-date library of information ever compiled by mankind, so isn't it sad they couldn't spare a few Googleseconds to check on something before forwarding it? This rarely won me any converts.

Once in a while, though, I would try to explain something to them just because the idea tickled me so much, even though I could never get it across. The idea is this: that warning is itself a virus. It's not the computer virus that lives in a computer, and tricks the computer into helping it make copies of itself in other files or on other computers. Instead, it's the kind of virus that lives in a person's mind, and tricks the person into helping it make copies of itself through email to other people's minds. A meme-virus, in other words. So in the effort to help stop viruses they were actually becoming the mechanism of spreading what was effectively, at the time, the most widespread, most successful "computer" virus in existence.

I suppose it's just the self-referentiality of it that amuses me. But the fact that I don't think I ever once successfully explained this to any of the perpetrators, to the point where they really got it, kept it fresh. I always wanted to find the three-sentence explanation that was clear enough that it would lead to that a-ha! moment, but I never found it. And now, those virus alerts have gone all but extinct, so it's a challenge I never will meet.

Of course, whenever I was trying to explain it to them, wasn't I just creating a virus alert "virus alert meme" meme? Unfortunately, this meme never took root, so it couldn't propagate. Good thing, because then this very paragraph might have spawned the virus alert "virus alert 'virus alert meme' meme" meme...

Sunday, May 16, 2010

Moved to my new laptop

My new laptop came in almost two weeks ago, and I've finally moved over to it to the point where I'm doing most everything on it, including typing this post. It just got put onto my desk and hooked up to my main mouse and keyboard and second monitor this morning.

There's still a lot of stuff I haven't installed on it, but it's mostly work stuff that I can do on either computer without needing to be on the same machine that has all my main programs on it, like web site editing. The one big thing not yet moved over is everything related to synchronizing with my phone, which includes Microsoft Money.

The document I keep with brief (one or two sentences each) notes about how to move each program or item over to a new computer is now up to eight pages. Yes, I have that many things on my computer, and that many of them require some notes to coordinate the order of moving things, to remember the directions for getting things to work, or to transfer settings. That's why it takes two weeks. (That, and I could barely get five minutes at a time to work on it during most of the last two weeks, with everything else going on. I got more done working from home Friday than the whole previous two weeks put together.)

While everything else is working fine on the new computer, and some things are working better (I can use the fingerprint reader! I had to disable the one on my old computer, but this one's great so far), SecondLife is not working that well. Actually it works wonderfully, thanks to the high-end Quadro video card and HDTV-quality display (1920 x 1080), but there's a video driver issue that only kicks in when I close SecondLife, which renders the whole screen unusable. I've found no way to restore it other than a reboot, too. Very disappointing, since I'd gone out of my way to pick the best video card specifically in hopes of getting a good SecondLife experience; a friend of mine likes to overload her avatar and land with so much bling that it bogs down my old laptop's video card to nearly unusable levels. Still, if the worst thing is that I can't run SecondLife, I could be doing a lot worse.

I also tried using my HDTV as a second monitor through the DisplayPort (with a DisplayPort-to-HDMI adapter) and brought up Google Earth, but the display didn't update in Google Earth on the TV, which is a shame. If it worked, Google Earth would probably be pretty breathtaking on the big screen at the kind of speed I can get out of this screaming-fast video card and my T1. Still, running Google Earth on my HDTV is nothing but chrome; it has no real practical purpose.

Some nice features in this include a built-in webcam (so I can Skype on it), a numpad on the built-in keyboard, and a generally nicer keyboard all around. It also comes with an interesting "QuickLook/QuickWeb" package which lets you view your Outlook calendar, email, etc. almost instantly without booting up your computer, and even get to a simple web browser in a few seconds. If my computer were shut down more often these might be pretty useful, but as it is, they're still impressive.

But the main benefit of it is that, unlike my old one, it's not already flirting with failure. Every time I try to restart my old one now, I have to let it sit five minutes or so before I boot it up, or it'll get partway through the login process and then shut down. It's no big deal to wait a few minutes, but it raises screaming alarms that it's going to get worse, so I feel better knowing that most of my stuff is moved off.

Saturday, May 15, 2010

No, really, I like it medium

Siobhan watches a lot of cooking and travel shows in which chefs and cooks pontificate about the best way to cook things. These guys are professionals, experts. They've spent more time studying cooking than I've spent cooking. And amongst the thousands of people who've done that much studying, they're the ones who rose above the pack enough to become famous. (Admittedly, a lot of that is personality and luck, but for at least some of them, some level of expertise and talent has got to be part of the equation. After all, there's plenty of other cooks with personality.) So I assume these guys know what they're talking about.

But almost to a man (or woman), they all have the same idea about cooking meat. Not that "as rare as possible" is their preference, but that there is no question of preference, that there is a right way and a stupid way to do meat, period. If someone pointed out the fact that patrons in restaurants are allowed to say whether they want their meat rare, medium, or well, they would groan about it the same way they would if, for instance, patrons were allowed to go into their high-class restaurants and order spray cheese on Triscuits, or ask to eat whatever came out of the grease trap.

Actual real human beings I know have a variety of preferences. I know some who like their meat cooked well, some who like it still-red rare, some who want a bit of pink, and some who, like me, want it just barely brown through, but not a bit more.

But amongst chefs, it seems that the mere existence of the question is a sign that the world is flawed. And I really tire of that attitude. I've heard their explanations, about how it brings out the most flavor, about what "overcooking" does to the consistency of the meat, and all that, and I recognize those are good reasons for trying it. They make similarly good arguments for why you should use this ingredient instead of that one, or when heat should be applied when preparing some dish or other, and their reasons are often sound.

But they rarely go so far as to suggest that you have to like some dish because they do. Yet that's just what it seems to me they're doing. Rare steak is practically a different dish from well-done steak. Anyway, all their reasons only tell me I should try it their way, which I have, but if I conclude I like it another way better, I've got every right to do so.

So, chefs, bite me. (I'm done rare.)

Friday, May 14, 2010

TV show remakes

Heard about how NBC plans to remake The Rockford Files? I never watched it more than catching a glimpse here and there, so I have no firm opinion about whether this is a bad idea. What little I know about the show suggests that it won't jar too much with modern sensibilities, and there's room for more stories in the same vein as the original, so maybe it could work. People who know the original have mostly liked the casting. Who knows? It probably wouldn't be the kind of show I'd watch either way.

When I mentioned this recently, Suri speculated about whether they might one day remake Babylon 5. The more I think about this idea, the more compelling it seems, though I'm still not sure how much positive and how much negative.

The nightmare version is that whoever owns the rights might do it without the blessing or participation of JMS (J. Michael Straczynski, the show's creator), and it ends up missing what was great about the original show in its attempts to fix what wasn't so great. I can't say I know where the rights are, but I'm fairly sure that wherever they are, JMS has a controlling stake and can prevent it being done without his approval. While JMS's ideas of what would work have not always been ideal, I feel sure that we can count on him for one thing: no remake could possibly miss the sweeping, epic story arc, and its coherence. While people still wonder, five episodes from the end, whether the makers of Lost really had answers in mind all along, no one ever seriously doubted that Babylon 5 had a story from day one.

The dream version would be if they got a commitment to the whole run of the show with a good budget. Then they could fix the flaws that make it hard to sell people on the show now: the sometimes weak production values (spectacular for the time and the limitations of being on a "secondary" network, but even then they looked corny at times, and now, they look terrible), the dodgy acting (lots of scenery-chewing), the clunky dialogue (JMS excels at the epic scope but can be weak at wit and everyday stuff), the primitive CGI (B5 pretty much invented CGI for sci-fi on TV, but the state of the art has come a long way since), and the uneven pacing (caused by the show not having commitments to its run length or its actors). Imagine it being made with the budget and production options of Lost and you could see what was great about it preserved, divested of all those things that, when you try to lure someone into seeing the show, you have to apologize for and hope they see past long enough to glimpse what was incredible about it.

Would I watch it? Absofragginlutely. But I would also hope it wasn't being made for me. After all, I still watch the original series, but it's just not the same since I know what's going to happen. I'd love to see those ideas explored with modern CGI, the script polished by someone who's good at humor too (the idea of Joss Whedon collaborating with JMS on something like this is a geek's wet dream!), and a more consistent run of actors (the original show had some fantastic actors, but also some clunky ones).

But the real purpose of a remake would be to get all those people who couldn't get past the weak production values to see what we were all talking about. (Which means those of us who watched it the first time would have to resist being insufferably pompous I-told-you-so braggarts.)

Even so, would the show do well now, even assuming everything was done well? The weak production values are only half of why the show doesn't make the impression today that it did on the original fans. The other half is trickier: B5 changed TV sci-fi. Some things that are de rigueur now were unheard of in SF TV when B5 did them, which made them tremendously impactful then, but which would seem par for the course now.

Thursday, May 13, 2010

Traffic cops: Long Island versus Vermont

Here in Vermont, when people are speeding and they see a cop up ahead they slow down. Clearly, by the time they can do that, the cop has seen them; in fact, slowing down, if anything, draws attention to the fact that they were speeding. But I guess the assumption is, the cop doesn't care as much that you were speeding, as that you stop.

Back where I grew up on Long Island, no one slowed down. If you're speeding and you're going the same speed as the cars around you, the assumption is, he can't catch all of us. Maybe if you're the car at the very back of a group, you might slow down, but otherwise, the odds of you being the one he pulls over are too slight. In fact, slowing down might be what makes him pick you out from the crowd.

In Vermont, when people who are speeding see a cop has pulled someone over ahead, they slow down. And not just because there's a car on the side of the road so they need to be slower for safety's sake. Even if the road's two lanes and wide open, they slow down just because there's a cop there.

Back where I grew up on Long Island, people kept the same speed, or even sped up, when they saw a cop who had pulled someone over. After all, if he's standing there talking to some other poor sucker, he can't be chasing you and pulling you over, so this is the best time to be speeding: you know at least one cop (and probably the only cop doing traffic duty in the area) is busy.

I make no judgments about either approach, or which one is better.

Wednesday, May 12, 2010

2012

Finally watched this movie, and on balance, I'm very, very glad we missed it in the theater. My big TV is quite big enough for it, and being at home gave us the leisure to laugh as loud as we wanted, and even pause it to talk about the things we were laughing about.

What do you mean, it wasn't a comedy? Did anyone tell Roland Emmerich that? I hope not. This is the only possible explanation I can find for John Cusack being in it, other than him needing money: he's done great work with dry comedy before, and that's what this was. Very dry (and I'm not going to make a pun on that). Very funny. He's essentially reprising his role from Con Air (though I feel sure that Simon West was aware he was making a comedy).

It must be liberating to be Roland Emmerich. There is never a time when an idea comes to you and you have to say "no" to it, or anyone else says "no" to you. No matter how over-the-top, how corny, how laughable, he's just carried along by his enthusiasm, which is apparently limitless. It's a constant stream of "can we do this?" always answered with a resounding "why not?"

One possible exception: at no point during the movie did an asteroid crash into the earth. I can't for the life of me figure out why not. It seems like the kind of thing that would have occurred to Roland Emmerich, and it would have been far less ridiculous than a lot of the things that happened. And it's about the only disaster movie premise that we didn't get somewhere in this hodgepodge. (Roland himself said this was his last disaster movie so he made a point of getting everything into it, but somehow he missed the meteor.)

The Day After Tomorrow flirted with the line a lot, and crossed over it firmly during one lengthy sequence in the middle (involving a Russian ship in Manhattan), but it stayed within my tolerance. I sniggered at the science here and there, and there were any number of bits where the implausibility tugged at me, but rarely so badly that it pulled me out. 2012 was like a parody of The Day After Tomorrow, cranking everything right over that line and then continuing to crank.

(Spoilers follow, nothing too big, and really, you don't need to worry about spoilers: you already know what's going to happen. But just to be safe, I warn you.)

The scenery-chewing expostulation about the proper way of choosing who would live through the end of the world was particularly awful. They had only so much room, so much food and energy and medical supplies, and so on. The methods they'd employed to decide who would be saved were never discussed much (apart from the obligatory rant that some people got saved because they could afford to pay -- which, as Oliver Platt's character graciously pointed out, is the only reason they could afford to save anyone). There were some vague notions that they'd chosen to preserve the people who could best repopulate the human species, but how they chose isn't clear -- we only know that the chosen included some scientists, some artists, some military, and some politicians. Presumably the methods were imperfect. And yet one of the scientists argued that they needed to open the gates and let in every random person who by chance happened to be there at the key moment, despite the fact that they didn't have room or supplies for them, because "that's what makes us human"; and we were clearly intended to agree. But the arguments against were far more sensible, and were casually brushed off with vague emotional claptrap. It was abominable, so you had to take it -- like almost everything else -- as comedy. But it does raise interesting questions about how it should be done.

The end of The Day After Tomorrow left the fate of the world ambiguous: having trashed half the planet, Roland was content to let us see a few survivors crossing the receding icefields, with the assumption that they'd rebuild somehow. In 2012 the ending is even vaguer and more absurd. The people who were in the arks are now left to rebuild a world that has been totally devastated by tectonic activity. Somehow the entire world has flooded (better not to even try to make sense of how tsunamis left the whole planet underwater simultaneously) except for one small part of southern Africa, and that's where the world will rebuild from. (I think Roland intends us to find it amusing that this makes for a second "out of Africa" beginning for humanity, but I doubt most people will even notice.)

Whatever you do, don't expect to care about anyone in this movie. Roland loves to play the Noble Sacrifice card, but between making the characters such cardboard cutouts (not only are they the same people as in The Day After Tomorrow, they make those versions seem fleshed out), and undermining every moment of potential pathos with clunky timing, he can't even get a rise out of us with that one.

Chiwetel Ejiofor is really trying to show off his considerable acting chops despite the awful dialogue and the self-contradictory role he's given, whereas John Cusack seems to have realized that all he needs is a slight sprinkling of his characteristic deadpan delivery to achieve just about as much. Which is sad. They're both excellent actors, and one of them is really trying and the other really isn't, but the material drags them both down to almost the same level. Hurray for John Cusack for phoning it in and cashing his check. And sorry, Chiwetel, for all that wasted effort. At least we could tell you were trying.

I will say one good thing for the movie. The references to the bunk about the Mayan calendar were a tiny detail and largely glossed over; when we did hear it, it was usually cited by people who were admittedly crazy (though in that "and they happen to be right!" way). There was one allusion to the (fictional) "planetary alignment" tossed in, but ultimately, the cause of the disaster was an unprecedented solar event (combined with the neutrinos "mutating" on the way to Earth -- I kid you not, though I hope Roland Emmerich does), and at best, the implication is that the Mayans "just happened" to get the year right.

Do I recommend the movie? Yes, with this caveat, in the words of Abraham Lincoln: "People who like this sort of thing will find this the sort of thing they like."

Tuesday, May 11, 2010

The Imaginarium of Terry Gilliam

I haven't actually seen the movie The Imaginarium of Doctor Parnassus, just the trailer, and I know virtually nothing about the movie. But just the trailer gives me one thought. While we talk about how advances in computer technology enable people like James Cameron to realize their visions that were never practical to realize before, we need to think about the other side of that equation. It also means there will soon be nothing to stop Terry Gilliam from realizing his visions.

If you've watched his progression through movies like Jabberwocky, Time Bandits, Brazil, The Adventures Of Baron Munchausen, The Fisher King, and Twelve Monkeys, think about the idea that, soon, anything he can imagine, he can render in photorealistic 3D imagery. You should be filled with an odd combination of feelings, a mixture of excitement and trepidation. It will probably be indescribably awesome. It may be alarmingly self-indulgent. It's also entirely possible it will render many of us insane, and perhaps open a gateway to a parallel universe of surrealistic paradoxes whose designs on us will be too incomprehensible to evaluate. It seems very likely that if there is a way a realized imagining can begin the End Of All Things, or at least its transformation into forms beyond our conception, Terry Gilliam is the man for the job.

Should we be alarmed, or should we embrace the madness and start getting fitted for our flamboyant new costumes?

Monday, May 10, 2010

Frugalista tip: bar soap

If you're the kind of person who is naturally frugal, who tries to squeeze every last penny out of the tube of toothpaste and to identify which off-brand products are identical to the name brand so you can cut costs without sacrificing quality, you've probably already tried to figure out how to handle the end of a bar of soap. This is one case where it's easy to mislead yourself.

The most obvious thing is to stick the old bar to the new bar, but this often proves impractical. The skimpy remains of the old bar break into fragments that are hard to make stick to the new bar; they splinter and dry up, or fall off.

So some frugalistas come up with other approaches. The most popular: take an empty container from liquid soap, cut or break up the bits at the end of the bar, throw them in, and add a little water. Keep adding new soap and a bit more water as you finish each bar, and you'll always have some liquid soap. The result is uneven and, unless you put a lot of time into breaking the bits up very fine and dissolving them, clumpy; but it gets all the soap used.

Others cut up the leftover bits and press them into a mold, making one new bar out of the leavings of every ten (or however many) bars. I've even seen products sold to help you do this.

But it turns out the first approach is actually the best one, and it's your frugal nature that's making it not work. The trick is that the more frugal-minded you are, and thus the more likely to be using this approach, the more likely you are to try to use every last bit of the bar the "normal" way before turning to another use for the last bits. That's why sticking it to the next bar is hard: by that time, the previous bar is just little fragments that dry out too fast.

If, instead, you add the new bar to the old bar somewhat earlier, when it's just starting to get soft enough to curve and bend in your hand without cracking, then it'll stick to the new bar perfectly. On the very first day that the old bar is flexible, use both it and the new bar; and when they're both wet, stick them together firmly, and set them aside. Don't touch them until the next day. By that point, they'll be fused nearly as perfectly as a single bar. No flaking bits, no fragments to lose, no pieces sliding off. So you get to use every particle of soap, without spending more than a few extra seconds on the process.

Sunday, May 09, 2010

Time heals all wounds

I've been thinking about my efforts to get back on the horse (which are going pretty well; on most rides I don't even think about my spill), and it occurs to me that there are some marked similarities between recovering from fear, as when getting "back on the horse," and recovering from pain, as when you grieve for a loss. In both cases, there's really no cure but time. People talk about whether there's any way to hurry the process along, and generally conclude that it takes as long as it takes. I think this is an oversimplification, designed not so much to reflect the truth precisely as to make people feel okay with it taking however long it takes, rather than feeling pressured to recover faster (which is counterproductive).

So I found myself wondering, why exactly does time heal all wounds? And I came to what seemed like a minor but profound revelation: when it comes to emotional wounds, time doesn't actually heal anything at all. The wound never really heals or changes in any way. It just becomes smaller in perspective as it recedes in time, as other things get in between you and it.

Don't mistake this for an analogy that has forgotten it's an analogy: the idea of the wound receding can be taken literally. Our minds are programmed to prioritize the present over the past, and even the recent past over the more remote past. An injury I suffered yesterday, and the fear of how much worse it could have been, looms large in the mind's landscape of importance; the same exact injury and fear, a month later, is no different in itself, it's simply less important. Each day it's farther from the forefront of my thoughts, so my mind spends less time on it. Eventually it will have shrunk so far into the distance behind me that it won't register at all. It's the same with grief and loss: each day it's a little farther from the moment. The pain of missing someone never actually changes the way a physical wound changes as it heals; it just looms smaller, because there are things on your mind that are more important to it -- by virtue of being more recent.
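
If you want to see the "receding" idea in toy-model form, here's a little sketch in Python -- just an illustration with made-up weights, nothing scientific -- where the wound's weight never changes, but new concerns keep piling up in front of it:

    # Toy model: the wound keeps a fixed weight forever, but its share
    # of attention shrinks as newer concerns accumulate ahead of it.
    wound = 10.0        # made-up weight; the wound itself never changes
    new_per_day = 2.0   # made-up weight of new concerns added each day

    for day in (0, 10, 30, 100):
        everything = wound + new_per_day * day
        print(f"day {day:3}: the wound is {wound / everything:.0%} of what's on my mind")

Run it and the wound drops from 100% of your attention to a few percent, even though its own weight stays exactly the same the whole time -- which is the whole point.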

It therefore follows that the one real way to hasten how quickly time heals a wound is to fill that time with other things. The more things that fill your life and your thoughts, the more quickly they'll push the wound, the fear, the pain aside. That said, I don't think it works to shove things into your life for no reason other than to hasten the process. Something you had no real reason to do, or think about, won't actually displace your hurt; it could in fact just become a reminder of it, since thinking about it will only remind you of why you chose it. But something that genuinely grips your thoughts on its own, whether good or bad, will tend to displace the hurt. That's why a hobby or relaxation can help, as can a new crisis or concern, something new you need to do, or taking on a new project.

But even years later, if you find some reason to dredge up the memories of the pain, or something reminds you of it, it can all come back. If you remember it thoroughly enough, enough to drive away the distance that has accumulated, the pain or fear can return in pretty much the same quantity as when it was fresh. This rarely happens only because it's rare to really eliminate all the distance -- both because it's hard to do, and because we naturally avoid it as much as possible. But it can happen, which corroborates the idea that the wound never really heals, never really changes in itself.

In the end, this theory doesn't really add much to my understanding of grief or recovery in terms of things one can actually do; it mostly just explains what we already knew and makes it make sense. Maybe I'm just doing that Psych 101 student thing of spinning oversimplified theories, but it seems to hold up to consideration. At worst, it could be distilled into the kind of pithy quotation that people share with each other in times of need.

Saturday, May 08, 2010

Naked Pictures Of Famous People

Since I enjoyed America: A Citizen's Guide to Democracy Inaction so much, I had high hopes for Jon Stewart's prose collection, Naked Pictures Of Famous People. The cover blurbs compared it to Woody Allen's collection Without Feathers, which is high praise indeed, and while I can see a similarity of style, the comparison sadly doesn't hold up.

I certainly like Jon Stewart's humor, and there were bits in this book that were laugh-out-loud funny, but they were few and far between. Most of it was stylistically reminiscent of the least funny bits of the Daily Show, but separated out from everything else and exaggerated. The first story typifies this: it's a fictionalized diary of a visitor to the Kennedy household, which just builds increasingly absurdist depictions of the family's screwed-up state, trying to avoid being in poor taste by being so over the top.

In a way, it feels like when the lead singer of a band you like does a solo album and it doesn't work. Is it because he wasn't carrying his share of the talent? I don't think so, in this case. You see Jon holding his own in interviews, where there are no writers backing him up, and he's sharp as a tack. It could be that the input of the rest of the team keeps him from going off in more self-indulgent directions, though. That's something you see a lot in solo albums, and it suits what you see here. Or maybe he's just rehashing material that didn't make the cut on the show or in other venues, so it was second-rate material from the start.

A few of the bits of prose are fairly funny, at least in spots. There are some laughs in "Revenge Is A Dish Best Served Cold," "The Last Supper, or The Dead Waiter," and "The New Judaism," but none are worth the price of the book. A few others are really awfully bad, such as "Vincent And Theo On AOL," "Martha Stewart's Vagina," "A Very Hanson Christmas, 1996-1999," and "Pen Pals." The rest fall in a sort of grayish "blah" zone in between.

Save your time. Read something else. Sorry, Jon. I wanted to like it.

Friday, May 07, 2010

America: A Citizen's Guide to Democracy Inaction

I recently finished reading the first book from the cast and writers of the Daily Show, America: A Citizen's Guide to Democracy Inaction, and my biggest complaint about it is that it was hard to read without fear that I was annoying Siobhan by laughing too often and too hard.

The book is in the form of a high school social studies textbook, and in some ways it actually gets the job done; you're likely to come away knowing a few more things than you started with, particularly if you skipped that class. But the format also proves an ideal framework for the same caustic, yet light-hearted and irreverent, humor the show casts on the news, without being just a repetition of the show. They've done an exceptional job of carrying over what's great about the show's humor while adapting it to the format of a book.

This book dates back to when Stephen Colbert was just another correspondent, and like several of the others, he has brief "inset" articles that reflect his particular persona. Samantha Bee's insets play up her Canadian heritage by telling you how Canada does it, in a deeply self-effacing, apologetic tone. Ed Helms plays up his mercenary, couldn't-care-less persona while telling you why he'd be the ideal candidate for whatever position of power is under discussion, while Colbert focuses on being scathing, profane, and sanctimonious as he shreds one topic after another.

The chapters cover a complete overview of American government -- its history and structure, and how the pieces fit together -- as well as the context of foreign and domestic issues. Some of the most cuttingly incisive material concerns the role of the media and lobbyists in the political process, the convenience of the two-party system, and a selection of summaries of how it's not so bad here because look how much worse the rest of the world is.

I recommend the book with no reservations whatsoever to anyone who enjoys the Daily Show.

Thursday, May 06, 2010

Plot holes that don't matter

I read threads about some TV shows on the TiVo Community forums, specifically in Now Playing, where some shows (most notoriously Lost) get analyzed to death. When people talk about plot holes (places where something the story depends on doesn't make sense, or doesn't fit with what we learned before), there's a concept I sometimes have trouble expressing: some plot holes matter, and some don't.

How can a plot hole not matter? The key question is this: could you plug the hole by inserting a few lines of dialogue that resolve the issue, change nothing else in any significant way, and contradict nothing already established? Better yet, could you do so in such a way that the dialogue adds no entertainment value, and just bogs the show down in a bit more exposition, serving no purpose but to plug the plot hole?

If you can, then the plot hole doesn't matter. Just imagine that a scene with the needed dialogue got filmed, then got cut for time. If there are multiple possible explanations, just pick one; we've already established that any of them would change nothing else in any significant way, so it doesn't matter which you choose.

For example, perhaps a bit of a story depends on the fact that a particular character hasn't had a date in years, but there's no particular reason she couldn't get a date; she's cute enough, she's wealthy, whatever. That she hasn't had a date in years is not a big turning point or an important fact, save only that it is the impetus for her to do a particular thing that turns out to be interesting or important. That she hasn't had a date is mentioned and then the story moves on from there. Some people object: how could she not have gotten a date for years?

Let's say that, based on what we know about her, it's entirely plausible that three years ago she had a bad breakup and stayed out of the dating scene for a while, maybe had trouble trusting people. Maybe she's just shy. Maybe she has particular tastes for a particular kind of partner. Maybe she's hesitant to accept any overture that she thinks is just because of her looks.

Since nothing in the story depends on why she hasn't had a date, and we can invent any number of plausible reasons that contradict nothing past or present in the story, you could insert a throwaway line or three that answered the question, and everything would work. Removing those lines doesn't change anything; it just leaves an unimportant question unanswered. But it may make the pacing of the story a lot better. We might feel like we'd rather have everything answered, but if we did, we might complain the show is too boring and talky. Sometimes we never find out something not just because it doesn't matter, but because finding out would detract or distract from the story (as in the famous case of the Maltese Falcon).

By contrast, if every way of explaining away a contradiction or implausible fact requires you to posit something that would itself affect the show -- something too important to be brushed aside, that would significantly change the characters or what they do -- then the plot hole is a real one that matters. Likewise if there's no way to fix it in three lines of dialogue. Either the writers will address it later (by revealing that some previously assumed truth was wrong, or adding some new element to the story), in which case finding it might mean finding a clue; or it'll just sit there as something you have to not look at too closely.

So many people get so burned by plot holes, and so used to looking for them, that they can't see how some of them are (or might as well be) nothing but editing choices. That makes it harder to give the real plot holes the importance they deserve, when they're drowned out by ones that don't matter.

Wednesday, May 05, 2010

Public Service Recognition Week

Since this is something that got publicly announced, I think it's safe to talk about it here despite it being a work thing.

On Monday I attended the Public Service Recognition Week awards luncheon, along with several of my coworkers and our guests. Of the dozens of people who worked on the Warehouse Management System (WMS) project over the last few years (particularly toward the climax last spring and summer), six of us were picked as representatives to be nominated for, and to win, recognition as one of the fifteen Outstanding State Teams.

For this, we got a moderately good lunch, at the cost of having to dress up. (It was my first time wearing my new tie, and in fact my first time wearing a tie in 11 years -- the last time was also for this same recognition -- so I needed to look up directions on how to tie one.) Then we listened to Governor Douglas give a short presentation on each of the winners (we got to go last, joy!), got handed a certificate, and were photographed with the Governor.

The almost-ironic thing is that this project actually went quite badly -- the worst of any in my career. So why the recognition? Perhaps because the failures weren't really the fault of the team; or in recognition of the enormous effort we put in to make up for the problems; or because, despite all the catastrophes, we ultimately got a usable system out of the deal, one that has made a big improvement (and hopefully will make even more in the future).

In 1999 my team and I won the same award (it's still up on the wall here), and back then Governor Dean's appearance at the ceremony was much more restrained and much briefer. He spoke only briefly, appeared in a few snapshots, and ran off to whatever he had to do next. By contrast, Douglas is a grandstander. He got into every photograph, announced every award, chatted and made jokes, and stayed almost the whole time. He never missed a publicity opportunity.

Lest you think I think better of Douglas for it, quite the contrary. Dean gave the impression of someone too busy running the state's executive branch who still managed to eke out a little time to thank us. Douglas gave the impression of someone who liked the idea of a lunch and some good press more than anything else he had to do. More than a few times he used his glib, witty remarks to poke (very mildly) at political issues. Not that there's anything wrong with that -- I am a fan of the Daily Show! -- but it does erode any sense that his attention was on us rather than on himself.

Still, I got in a picture with Douglas and shook his hand, but the best I managed with Dean was being in the same room. So I suppose it's a silver lining of sorts.

Tuesday, May 04, 2010

Think like a man

Here are a few lyrics from a song I recently heard:
Stick my foot in my mouth and just run away,
Turn off my cell, I ain't got nothing to say
Disappear and not give a damn
I should think like a man

Ignore my emotions, emotions are dumb
The channels, still my senses are numb
Shorten my attention span,
I should think like a man
Okay, so there are no prizes to be won for clever lyricism here, but that's not the point. Is there an "other side of the coin" song you could write and release without expecting to get lambasted for it?

Stand-up comedians have a special license to make fun of gender-stereotyped behavior, but that license seems to apply consistently only to them. If some man wrote a song just as bitter-sounding and not-trying-to-be-funny as this one -- making fun of women needing to go shopping all the time, being ditzy and unable to focus on anything important, or being emotionally unstable -- he'd get roasted. Even if he took a far more "I'm just joking" tone than this song does, he'd still get criticized.

All these stereotypes are neither wholly true nor wholly false. I understand that the world is coming out of thousands of years of oppression and there's a need for a certain amount of backlash; we have to swing the pendulum back too far before we can center it. Malekind has earned a certain amount of slamming, even if I personally didn't. But if it's true that we have to accept some of this as part of the process, then me complaining about it and pointing out the double standard is also something we have to accept as part of the process.

Monday, May 03, 2010

Furniture changes

For the longest time we've had a dining table, with a few inches cut off the legs, in our living room, surrounded by our sectional sofa. A conventional coffee table is way too low to put a computer on, and I spend most of my at-home time sitting on the sofa with my computer in front of me. The sofa is more comfortable than an office chair, and it lets me have my computer in the same place where I can watch TV or listen to music, which suits me: I want my computer to be ubiquitous (which is why I'm so ready for an implanted computer). But it does lead to an unconventional sofa-and-table arrangement. (Though I recently saw some interesting coffee tables designed to account for this: the surface pivots up to a higher level when needed, for writing or using a computer. Still, we'd have it up virtually all the time!)

It recently occurred to me that having one big table for both of us causes a lot of inflexibility. Wherever I set my stuff up is always going to be a little off from where it would be comfortable, because Siobhan needs to move the table closer or farther away depending on what she's doing. It also positions my computer screen so that it tends to block her view, and the contortions to avoid that are harder with a single table. So we went to Big Lots to look at smaller tables we could each have, and thus move, independently. We ended up getting a pair of the pictured glass desks, which are working out very well. I had a brainstorm while setting mine up, trying to figure out where to put my remote control holder: I drilled a hole in it, then put one of the desk's screws through it, so the holder is slung at an angle on the side of the desk -- out of the way and taking up no desk space, yet completely at hand, and impossible to topple over.

Yesterday was our first roleplaying game gathering since the desks were assembled, and it led to another change. I need my computer on hand when roleplaying, and with the old table, that meant it sat right in front of me, between me and at least some of the players. I never really liked the sense of being cut off from them (I never even used to use GM screens, and this is far more obscuring). Now that my whole computer setup is on a single desk, it was easy to move it to the end of the sofa and put a much smaller, lower table (which I barely even used!) in front of me for my books, giving me a clear view of the other players, and them of me. We even pulled the table out entirely and replaced it with tray tables, which left a nice clear space that made it easier for people to get to the sofa and move around (and gave Socks a center-of-attention spot to stand in). I was very pleased with how it worked out without the table.

Tonight on the way home we're stopping at a furniture store to order a new sectional sofa. We've been intending to do this for a while; last year we had some repairs done on our current sofa, but those were only meant to buy time, since we couldn't afford a replacement then. The new one will be a Flexsteel, by all accounts far sturdier and likely to last longer, even under the weight and strain of how I use a sofa. It'll be another sectional like the pictured arrangement, with a curved center "nest" seat and two seats on either side, though not in this color and fabric. It'll fit in the living room about the same as what we have now, and we'll keep it at an angle that gives both of us a clear view of the TV and works with the new desks.

It'll cost more than I originally anticipated, since we're going with a high-end line in hopes it'll last a good long time. But the store has a zero-interest financing plan that will let us space out the payments, and I've already worked out how to make sure it's paid off without a penny of interest, so there's no disadvantage. That should leave a little money in the budget to repair (or, if we can't, replace) the lounge chairs that for some reason end up doubling as our guest chairs. (We've got a lot of sofa space, but most people always seem to prefer the chairs.)
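
(If you're curious, the "working out" is nothing fancier than this bit of arithmetic. Here's a sketch in Python, with made-up numbers, since I'm not going to post the actual price:)

    import math

    # Made-up figures; the real price and promotional terms will differ.
    price = 2400.00     # hypothetical total cost of the sofa, in dollars
    promo_months = 12   # hypothetical length of the zero-interest window

    # Paying at least this much each month clears the balance before the
    # promotional window closes, so no interest ever accrues.
    monthly = math.ceil(price / promo_months)
    print(f"pay at least ${monthly} per month")   # -> pay at least $200 per month

The only trick is remembering that these plans usually charge all the deferred interest retroactively if even a dollar is left when the window closes, so the rounding up matters.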