Thursday, May 31, 2007

Not the Soup. The Award. Not that Award; the Other Award.

Okay, I got this via Scalzi's blog, which linked in turn to this official-looking site, so I guess it's on the level even though I've received no official notification. But it looks like Blindsight made the finals for the John W. Campbell Award.

It's in there with the usual worthy suspects from the Hugos and the Nebs, and lots more besides because — hey, thirteen finalists? Is that normal? Well, whether it is or not, I'm especially gratified to see Karl Schroeder's Sun of Suns in there, because that really is an incredibly fun book with some glorious Technicolor worldbuilding. I wish I'd written it.


Wednesday, May 30, 2007

Unsung Heroes

Sorry I haven't been around here much lately. It's getting down to the wire for this damn York address and I've gotta shave half an hour off of it. (It's tough — I mean, you try arguing that pedophiles and suicide bombers will be the ones setting the ground rules for any post-singularity society. In less than 30 minutes. You gotta take it in small steps.) Also, I'm trying to find an agent, and I gotta do it quick; I've only got street cred as long as I'm a "finalist" for these awards. The moment they go to someone else, I'm just an also-ran. (And in the course of that search, might I just point out that Deanna Hoak is absurdly gracious, classy, and helpful to people she barely even knows. A few more like her would make a complete hash of the rules of Darwinian natural selection.)

Oh, and while I'm in hat-tipping mode: I usually e-mail thankyous to those who've contributed to the Niblet Memorial Kibble Fund, but sometimes I can't get through. So if John Packer from Australia, and/or Terry Doyle from the UK happen by, thank you both for your generosity. And your e-mail addresses are broken. For months.

Tuesday, May 29, 2007

Anyone with half a brain could tell it.

Via Futurismic, an accessible piece from Scientific American on radical hemispherectomies, an operation which readers of Blindsight will recognise as the defining moment in the depersonalisation of the young Siri Keeton.


Saturday, May 26, 2007

London Falling

Saw 28 Weeks Later last night. Few explicit spoilers follow, but much can be inferred from what I write below. If you haven't seen the movie yet, and you intend to, you may want to skip this entry.

Released at the start of the summer blockbuster season: going up against Spiderman, Pirates, Shrek, Die Hard, Silver Surfer, and any number of other big-bang, uplifting, heartwarming, adrenalising affirmations that Good Triumphs Over Evil Because It's Nicer. And what does 28 Weeks Later put on the table against such adversaries?

Characters invested with such humanity that you know, according to the rules of Hollywood, that at the very least they'll make it to the final reel — only to see them felled like red-shirted extras before the halfway mark. Families tearing each other apart (no, that is not metaphor). Rooftop snipers stamped from neither Stormtrooper nor Top Gun molds, decent shits with real honest-to-God consciences, indiscriminately shooting civilians in the street to spare them the napalm that takes their unluckier mates ten minutes further on.

And worst of all, this movie delivers a viewing audience that knows, down in the bone, that there's nothing else to be done, that the massacre of innocents is the only strategy that holds any hope at all — and that even that, ultimately, may not be enough.

In the summer popcorn season? This movie has balls the size of fucking grapefruits.

I'm even willing to forgive the supernaturally-absurd rate at which the rage virus propagates. (Sure, you can go from swapping spit to full-blown eyeball-haemorrhaging psychosis in 35 seconds. All you need is a bug that reproduces two hundred times per second). I can forgive it because I am so sick of tepid Hollywood movies (Outbreak comes to mind) that pretend to grapple with the no-win scenario only to pull some third alternative out of their asses in the third act, movies whose sacred trust is to reassure skittish moviegoers that See, there is a way to avoid killing all those innocent infectees. All we have to do is stop the Evil General Who Won't Listen To Reason!

Back in the day, Disney kiddie movies had more hard-eyed maturity than crap like Outbreak. Bambi's mother died, irrevocably, because that's what parents do in real life. You just gotta deal with it. But not today. Today, Spielberg rewrites history and arms ET's containment forces with walkie-fucking-talkies to protect the delicate sensibilities of the world's children (yet another example of the cognitive impairment that seems to accompany parenthood, btw.)

But not 28 Weeks Later. Here's a movie that doesn't even offer us the cold solace of a no-hope scenario. There is hope, you see. There is hope, and there are decent people trying to do the best they can in dire circumstances, and there are no villains — only victims and vectors.

But most of all there is hope. There is at least the glimmer of a solution. And decent, well-intentioned Human stupidity squanders it anyway.

You can have your Spiderman and your Silver Surfer. Watch the pretty explosions. Eat your popcorn. Suck your thumbs.

As for me? Whenever I find my will to live becoming too strong, I watch 28 Weeks Later.


Tuesday, May 22, 2007

Motherhood Issues

How many times have you heard new parents, their eyes bright with happy delirium (or perhaps just lack of sleep), insisting that you don't know what love is until you first lay eyes on your baby? How many of you have reunited with old university buddies who have grown up and spawned, only to find that mouths which once argued about hyperspace and acid rain can't seem to open now without veering into the realm of child-rearing? How many commercials have you seen that sell steel-belted radials by plunking a baby onto one? How many times has rational discourse been utterly short-circuited the moment someone cries "Please, someone think of the children!"? (I've noticed the aquarium industry is particularly fond of this latter strategy, whenever anyone suggests shutting down their captive whale displays.)

You know all this, of course. You know the wiring and the rationale behind it: the genes build us to protect the datastream. The only reason we exist is to replicate that information and keep it moving into the future. It's a drive as old as life itself. But here's the thing: rutting and reproduction are not the traits we choose to exalt ourselves for. It's not sprogs, but spirit, that casts us in God's image. What separates us from the beasts of the field is our minds, our intellects. This, we insist, is what makes us truly human.

Which logically means that parents are less human than the rest of us.

Stick with me here. All of us are driven by brainstem imperatives. We are all compromised: none of us is a paragon of intellect or rationality. Still, some are more equal than others. There is a whole set of behavioral subroutines that never run until we've actually pupped, a whole series of sleeper programs that kick in on that fateful moment when we stare into our child's eyes for the first time, hear the weird Middle Eastern Dylan riffs whining in our ears, and realise that holy shit, we're Cylons.

That is the moment when everything changes. Our children become the most important thing in the world, the center of existence. We would save our own and let ten others die, if it came to that. The rational truth of the matter— that we have squeezed out one more large mammal in a population of 6.5 billion, which will in all likelihood accomplish nothing more than play video games, watch Inuit Idol, and live beyond its means until the ceiling crashes in— is something that simply doesn't compute. We look into those bright and greedy eyes and see a world-class athlete, or a Nobel Prize-winner, or the next figurehead of global faux-democracy delivered unto us by Diebold and Halliburton.

We do not see the reality, because seeing reality would compromise genetic imperatives. We become lesser intellects. The parental subroutines kick in and we lose large chunks of the very spark that, by our own lights, makes us human.

So why not recognise that with a new political movement? Call it the "Free Agent Party", and build its guiding principles along the sliding scale of intellectual impairment. Those shackled by addictions that skew the mind — whether pharmaceutically, religiously, or parentally induced — are treated the same way we treat those who have yet to reach the age of majority, and for pretty much the same reasons. Why do we deny driver's licences and voting privileges to the young? Why do we ban drunks from the driver's seat? Because they are not ready. They are not competent to make reasonable decisions. Nobody questions this in today's society. So tell me, how are offspring addicts any different?

I'm thinking of adding such a political movement to the noisy (and slightly satirical) background of an upcoming novel, but the more I think of it, the more it strikes me as an idea whose time has come. It's a no-lose electoral platform as far as I can see.

Now go find me a campaign manager.


Sunday, May 20, 2007

How to Build a Zombie Detector

A fair number of topics jostling for attention lately: slime moulds outfitted with skittish cyborg exoskeletons, Jim Munroe's microbudget megasavvy take on nanotech, even this recent research on free will in fruit flies (which I'm wary of, but am holding off commenting upon until I've thoroughly read the original paper). And I'm in bitten-off-more-than-I-can-chew mode at the moment, so I don't have time to put all that stuff on the crawl right now. But there is one thing that struck me like a bolt from the blue (except it was actually a bolt from an e-mail server) late last night, as I was trying to clear away the e-mail backlog:

Zombie detectors.

There's this guy, allegedly named Nick Alcock, who seems to know way more than he admits to. He first ruined my morning back in March by pointing out that if vampires really needed to eat people because they couldn't synthesise gamma-Protocadherin-Y on their own, and if they needed that protein because it was so damned critical for CNS development, then women shouldn't have working brains because the gene that codes for it is located on the Y chromosome. It was a shot across the bow I could not resist; we're still going at it two months later.

One of the things we've veered into lately is the classic philosopher-wank question: if you've got a nonconscious zombie that natural selection has nonetheless shaped to blend in — to behave as though it were conscious (we're talking the classic philosopher zombie agent here, not the fast killer-zombies under discussion a few days ago) — how could you detect it? More fundamentally, why would you bother? After all, if it behaves exactly like the rest of us, then the fact that it's nonconscious makes no real difference; and if it does behave differently, then consciousness must have some impact on the decision-making process, findings about after-the-fact volition notwithstanding. (The cast of Blindsight mumble about this dilemma near the end of the book; it's basically a variant on the whole "I know I'm conscious but how do I know anyone else is" riff.)

So this Alcock dude points out that if I'm right in my (parroted) claims that consciousness is actually expensive, metabolically, then zombie brains will be firing fewer synapses and burning through less glucose than would a comparable baseline human performing the same mental tasks. And that reminded me of a paper I read a few years back which showed that fast thinkers, hi-IQ types, actually use less of their brains than the unwashed masses; their neural circuitry is more efficient, unnecessary synapses pared away.

Zombie brains run cooler than ours. Even if they mimic our behavior exactly, the computational expense behind that behavior will be lower. You can use an MRI to detect zombies!
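If the premise holds, the detection logic itself is almost trivial. Here's a minimal sketch in Python; every number in it is invented for illustration, since a real baseline and a real cutoff are exactly the hard part:

```python
# Hypothetical zombie screen: consciousness costs glucose, so a zombie
# performing the same mental task should burn measurably less of it.
# Both constants below are made up for the sake of the example.

BASELINE_UMOL_PER_MIN = 30.0   # assumed human glucose burn for the task
ZOMBIE_CUTOFF = 0.8            # assumed: zombies run below 80% of baseline

def looks_like_zombie(measured_umol_per_min: float) -> bool:
    """Flag a subject whose task-related metabolism is suspiciously low."""
    return measured_umol_per_min < ZOMBIE_CUTOFF * BASELINE_UMOL_PER_MIN

print(looks_like_zombie(29.5))  # → False (ordinary baseline human)
print(looks_like_zombie(21.0))  # → True (runs cool; possible zombie)
```

The fixed cutoff is, of course, the weak point: any such screen lives or dies on how tightly you can pin down the baseline in the first place.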

Of course, now Nick has turned around and pointed out all the reasons that would never work, because it is his sacred mission in life to never be satisfied. He's pointing out the huge variation in background processing, the minuscule signal one would have to read against that, the impossibility of finding a zombie and a sentry (trademark!) so metabolically identical that you could actually weed out the confounds. I say, fuck that. There are places where the conscious and subconscious minds interface: I say, look at the anterior cingulate gyrus (for example), and don't bother with crude glucose-metabolism/gas-mileage measures. There's gotta be some telltale pattern in there, some trademark spark of lightning that flickers when the pointy-haired boss sends a memo. That's what you look for. The signature of the ball and chain.

Of course, it won't be enough for this Alcock guy. He's bound to find some flaw in that response. He always does.

Maybe I just won't tell him.


Thursday, May 17, 2007

Off-Key Speaker

So there's this annual thing up at York University: the Academic Conference on Canadian Science Fiction and Fantasy (although for some reason their web site seems to stop at 2005). A few months back I gave a guest lecture up at York, which was evidently a big hit on account of most academic speakers tend to not use terms like "hand job" and "shit-for-brains" during the course of their presentations. Anyway, one of the people running the course — Allan Weiss by name — asked me if I wanted to give a talk at this ACCSFF thing. Er, I said. We'll pay you, he told me. I'm in, I said.

Now I find I'm listed as Keynote Speaker. Ohhhhkay.

So far I've come up with a title. "Anachronism, Inattentional Blindness, and the Turd in the Punchbowl: or, How I learned to Stop Worrying and Love the Singularity." I have until June 9 to either figure out what that means, or failing that, to stick so many additional words onto the title that it'll take half an hour to read aloud.

Wish me luck. Better yet, wish it for my audience.


Tuesday, May 15, 2007

Torontonians: Infest Wisely

You all know the scoop on self-publishers, don't you? Those losers who, unable to interest any legitimate publisher in their verbiage, haunt Kinkos with pockets full of quarters, printing out their magnum opus on the backs of old cable bills in the hope that some streetcorner pedestrian might take pity on them. A hapless breed, their numbers kept in check by their natural predator, the Vanity Press. Oh yes. You know all about self-publishers.

Well, here's a new subspecies for you: Jim Munroe, whose debut novel was published by HarperCollins, and did very well for the man. And then Munroe turned his back on Rupert Murdoch (not that ol' Rupe noticed, of course) — and walked away.

You know me. I piss and moan endlessly about the Big Bad Publishing Industry. There is no end to my fucking whining. But Jim Munroe did something I never had the guts to do: he left his Big Name Publisher because he didn't like the way it behaved, and he started publishing his books himself.

And damned if he hasn't made a go of it.

Now he's branching out into other media, writing and codirecting "Infest Wisely", an episodic "lo-fi sci-fi" feature about chewable nanotech. And if you happen to be in downtown Toronto this Friday the 18th with five bucks to spare, you can catch the premiere.

Seriously, go. It's not like you'll be missing anything here...


Monday, May 14, 2007

Starfish tp ETA?

I know this is a long shot, but I don't suppose anyone out there knows when the trade paper edition of Starfish is due for release? A search on Tor's website turns up nothing.

Yes, I've asked them directly. Repeatedly. I actually brought it up twice in my last e-mail, which netted the response Yes, we're rereleasing Starfish. Not that I wasn't happy to hear this — ever since they pulled SF from its original 2006 slot I've wondered whether it had really been "rescheduled" or simply abandoned — but it doesn't actually answer the question of when. And I know it must seem weird to resort to message-in-a-bottle tactics over such a basic query. But, you know. Tor.

So, anyone?


Thursday, May 10, 2007

The Uplift Protein

Neuropsin, that is. A prefrontal-cortex protein involved in learning and memory. There's this one variant that's peculiar to us Humans, 45 amino acids longer than the standard model handed out to other primates, and a team of Chinese researchers have just nailed the gene that codes for it. And the really cool part? Utterly ignoring all those some-things-man-was-not-meant-to-know types, they spliced the causal mutation into chimpanzee DNA, which then started to synthesise the type-II variant. No word yet on how far they let that stretch of code iterate. No word on how many months away we are from building chimps with human-scale intelligence.

The actual paper isn't out yet. But I'm really hoping my U of T library access is still active when Human Mutation prints the details.


Tuesday, May 8, 2007


I've just finished reading a draft of R. Scott Bakker's soon-to-be-released Neuropath. Holy shit.

The neurology of consciousness. The advantages of nonsentience. People neurologically stripped of their behavioral constraints so that they can make the necessary Big Decisions of life and death without getting caught up in touchy-feely shackles like conscience and morality. All the major themes of Blindsight and a bunch of those from the rifters trilogy thrown in for good measure...

And does he stick them in a hard-sf spaceships-and-aliens chassis that only hardcore skiffy geeks will read? Does he locate his story in a future so close to the Singularity's event horizon that society itself has grown strange and forbidding to the average reader? Does he present his arguments through characters so twisted and specialised that most readers have no choice but to regard them as more alien than the aliens they encounter?

No. He sets it a mere decade into the future, in the context of a serial killer police procedural. Instead of aliens and freaks he uses sexy FBI agents and divorced psychologists. This guy is basically writing about Blindsight-type issues, but is aiming them squarely at a da Vinci Code audience. He is dealing with the same existential questions, but has rendered them accessible for beach readers. He has done exactly what I would have done, if only I'd been smart enough.

At least Blindsight came out first. I can cling to that. Because trust me: when Neuropath hits the shelves, it's gonna be "Peter who?"


Saturday, May 5, 2007


You may have seen this already. It's been out for a few days now. And at first glance it's nothing special: technology controlled by brainwaves through an EEG electrode interface, which is so far behind the cutting edge that you'll be finding it in games before the end of the year. But check out this quote as to why, exactly, the military would even want to develop brain-activated binoculars:

The idea is that EEG can spot "neural signatures" for target detection before the conscious mind becomes aware of a potential threat or target ... In other words, like Spiderman's "spider sense," a soldier could be alerted to danger that his or her brain had sensed, but not yet had time to process.

So. Another end run around the prefrontal cortex in the name of speed and efficiency. I'm telling you, nobody likes the pointy-haired boss these days...
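The trick in that quote, stripped of the wetware, is just a latency race between two detectors watching the same signal. A toy sketch with a synthetic trace (all numbers invented): a raw threshold detector fires the moment the response starts, while a 50-sample moving average, standing in here for slow deliberative processing, has to integrate before it catches up.

```python
# Synthetic "EEG": flat baseline, then a sustained response at sample 100.
trace = [0.0] * 100 + [1.0] * 100

def first_crossing(samples, threshold):
    """Index of the first sample exceeding the threshold."""
    return next(i for i, v in enumerate(samples) if v > threshold)

def moving_average(samples, window):
    """Crude stand-in for slow, integrative (conscious) processing."""
    return [sum(samples[max(0, i - window + 1):i + 1]) / window
            for i in range(len(samples))]

fast = first_crossing(trace, 0.5)                      # raw EEG detector
slow = first_crossing(moving_average(trace, 50), 0.5)  # deliberative pass
print(fast, slow)  # → 100 125: the raw detector wins by 25 samples
```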


Thursday, May 3, 2007

The anti-Moore's Law

Anyone who's read my fiction has probably figured out my perspective on life-support/environmental issues. I tend not to talk about such stuff here, not because I don't find it relevant or important, but because it's not new or cutting edge; the non-self-aggrandizing parts of this 'crawl serve as a kind of scratch pad for things I find challenging or thought-provoking in some way, and it's been a while since the science on habitat destruction, species loss, and climate change has done anything but reinforce grim conclusions decades old.

Today, though, I make an exception because of two items in juxtaposition: first, it turns out that the most pessimistic climate-change models were in fact way too naively cheerful, and that the Arctic icecap is melting three times faster than even Cassandra foresaw. And secondly, our ability to monitor such changes is declining thanks to decreasing investment in orbital earth-monitoring programs— to the point where satellites are actually becoming "less capable" over time. The technology is devolving.

And this is a little bit on the new side. Like all the other Children of Brunner, I always knew the place was turning to shit— but I'd at least hoped that technology would let us watch it happen in hi-def.

I keep saying it, but no one believes me: I'm an optimist...


Wednesday, May 2, 2007

Consciousness, Learning, and Neurochips

I'm starting this new post both to take the weight off the old one (which is growing quite the tail — maybe I should look into setting up a discussion forum or something), and also to introduce a new piece of relevant research. Razorsmile said

Conscious trains the subconscious until it is no longer needed.

And then Brett elaborated with

that could be how conscious thought is adaptive. It doesn't do anything even remotely well, but it can do anything. It is the bridge between something you've never done before and something that you do on skill.

To which I'll say: sure, that's certainly how it seems subjectively. But I have three flies to stick in that ointment:

1. Given the existence of consciousness to start with, what else could it feel like? Supposing it wasn't actually learning anything at all, but merely observing another part of the brain doing the heavy lifting, or just reading an executive summary of said heavy lifting? It's exactly analogous to the "illusion of conscious will" that Wegner keeps talking about in his book: we think "I'm moving my arm", and we see the arm move, and so we conclude that it was our intent that drove the action. Except it wasn't: the action started half a second before we "decided" to move. Learning a new skill is pretty much the same thing as moving your arm in this context; if there's a conscious homunculus watching the process go down, it's gonna take credit for that process — just like razorsmile and Brett just did — even if it's only an observer.

2. Given that there's no easy way to distinguish between true "conscious learning" and mere "conscious pointy-haired-boss taking credit for everyone else's work", you have to ask, why do we assume consciousness is essential for learning? Well, because you can't learn without being con--

Oh, wait. We have neural nets and software apps that learn from experience all the time. Game-playing computers learn from their mistakes. Analytical software studies research problems, designs experiments to address them, carries out its own protocols. We are surrounded by cases of intellects much simpler than ours, capable of learning without (as far as we know) being conscious.
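The point doesn't need exotic machinery to make. Here's a sketch of about the dumbest learner on the books, a single perceptron, picking up the AND function from nothing but error feedback (the learning rate and epoch count are arbitrary choices of mine):

```python
# A learner with no plausible claim to consciousness: one perceptron,
# trained online on the AND function via simple error correction.

def train_perceptron(samples, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out          # 0 when correct, ±1 when wrong
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)

# After training, the learned weights reproduce AND on all four inputs.
for (x1, x2), target in AND:
    out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
    print((x1, x2), out == target)      # → True on every line
```

Nothing in that loop observes itself doing anything; it just gets less wrong.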

3. Finally, I'd like to draw your attention to this paper that came out last fall in Nature. I link to the pdf for completists and techheads, but be warned— it's techy writing at its most opaque. Here are the essential points: they stuck neurochips into the brains of monkeys that would monitor a neuron here and send a tiny charge to this other neuron over there when the first one fired. After a while, that second neuron started firing the way the first one did, without further intervention from the chip. Basically, the chip forces the brain to literally rewire its own connections to spec, resulting in changes to the way the monkeys move their limbs (the wrist, in this case).
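The protocol is simple enough to caricature in a few lines of code. What follows is my own toy model, not the paper's actual methods, and every number in it is invented: a "chip" watches neuron A and stimulates neuron B whenever A fires, and a fire-together-wire-together rule does the rest.

```python
import random

random.seed(1)  # deterministic toy run

weight = 0.05        # initial A→B synaptic strength: too weak to fire B
THRESHOLD = 0.5      # B fires when its total input exceeds this
LEARN_RATE = 0.02    # Hebbian increment when A and B fire together

def run(steps, chip_on):
    """Simulate the circuit; return how often A and B fired together."""
    global weight
    paired = 0
    for _ in range(steps):
        a_fires = random.random() < 0.3      # A spikes spontaneously
        b_input = weight if a_fires else 0.0
        if chip_on and a_fires:
            b_input += 1.0                   # the chip zaps B on A's spike
        b_fires = b_input > THRESHOLD
        if a_fires and b_fires:
            weight += LEARN_RATE             # fire together, wire together
            paired += 1
    return paired

run(200, chip_on=True)              # conditioning phase, chip active
print(weight > THRESHOLD)           # synapse now clears threshold on its own
print(run(100, chip_on=False) > 0)  # B keeps following A, chip removed
```

The interesting part, as in the monkeys, is the last line: once the coincidences have done their work, the chip is redundant.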

They're selling it as a first step in rehabilitating people with spinal injuries, impaired motor control, that kind of thing. But there are implications that go far further. Why stop at using impulses in one part of the brain to reshape wiring in another? Why not bring your own set of input impulses to the party, impose your new patterns from an outside source? And why stop at motor control? A neuron is a neuron, after all. Why not use this trick to tweak the wiring responsible for knowledge, skills, declarative memory? I'm looking a little further down this road, and I'm seeing implantable expertise (like the "microsofts" in William Gibson's early novels). I'm looking a little further, and seeing implantable political opinions.

But for now, I've just got a question. People whose limbs can be made to move using transcranial magnetic stimulation sometimes report a feeling of conscious volition: they chose to move their hand, they insist, even though it's incontrovertible that a machine is making them jump. Other people (victims of alien hand syndrome, for example) watch their own two hands get into girly slap-fights with each other and swear they've been possessed by some outside force — certainly they aren't making their hands act that way. So let's say we've got this monkey, and we're rewiring his associative cortex with new information:

Does he feel as if he's learning in realtime? Can he feel crystalline lattices of information assembling in his head (to slightly misquote Gibson)? Or is the process completely unconscious, the new knowledge just there the next time it's needed?

I bet we'd know a lot more about this whole consciousness thing, if we knew the answer to that.
