Wednesday, January 7, 2009

Iterating Towards Bethlehem

Most of you probably know about Turing machines: hypothetical gizmos built of paper punch-tape, read-write heads, and imagination, which can — step by laborious step — emulate the operation of any computer. And some of you may be old enough to remember the Sinclair ZX-80— a sad little personal computer so primitive that it couldn't even run its video display and its keyboard at the same time (typing would cause the screen to go dark). Peer into the darkness between these artifacts, stir in a little DNA, and what do you get?

This hairy little spider right here. A pinpoint brain with less than a million neurons, somehow capable of mammalian-level problem-solving. And just maybe, a whole new approach to cognition.

This is an old story, and a popsci one, although I've only discovered it now (with thanks to Sheila Miguez) in a 2006 issue of New Scientist. I haven't been able to find any subsequent reports of this work in the primary lit. So take it with a grain of salt; as far as I know, the peer-reviewers haven't got their talons into it yet. But holy shit, if this pans out…

Here's the thumbnail sketch: we have here a spider who eats other spiders, who changes her foraging strategy on the fly, who resorts to trial and error techniques to lure prey into range. She will brave a full frontal assault against prey carrying an egg sac, but sneak up upon an unencumbered target of the same species. Many insects and arachnids are known for fairly complex behaviors (bumblebees are the proletarian's archetype; Sphex wasps are the cool grad-school example), but those behaviors are hardwired and inflexible. Portia here is not so rote: Portia improvises.

But it's not just this flexible behavioral repertoire that's so amazing. It's not the fact that somehow, this dumb little spider with its crude compound optics has visual acuity to rival a cat's (even though a cat's got orders of magnitude more neurons in one retina than our spider has in her whole damn head). It's not even the fact that this little beast can figure out a maze which entails recognizing prey, then figuring out an approach path along which that prey is not visible (i.e., the spider can't just keep her eyes on the ball: she has to develop and remember a search image), then follow her best-laid plans by memory including recognizing when she's made a wrong turn and retracing her steps, all the while out of sight of her target. No, the really amazing thing is how she does all this with a measly 600,000 neurons— how she pulls off cognitive feats that would challenge a mammal with seventy million or more.

She does it like a Turing Machine, one laborious step at a time. She does it like a Sinclair ZX-80: running one part of the system then another, because she doesn't have the circuitry to run both at once. She does it all sequentially, by timesharing.

She'll sit there for two fucking hours, just watching. It takes that long to process the image, you see: whereas a cat or a mouse would assimilate the whole hi-res vista in an instant, Portia's poor underpowered graphics driver can only hold a fraction of the scene at any given time. So she scans, back and forth, back and forth, like some kind of hairy multilimbed Cylon centurion, scanning each little segment of the game board in turn. Then, when she synthesizes the relevant aspects of each (God knows how many variables she's juggling, how many pencil sketches get scribbled onto the scratch pad because the jpeg won't fit), she figures out a plan, and puts it into motion: climbing down the branch, falling out of sight of the target, ignoring other branches that would only seem to provide a more direct route to payoff, homing in on that one critical fork in the road that leads back up to satiation. Portia won't be deterred by the fact that she only has a few percent of a real brain: she emulates the brain she needs, a few percent at a time.
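
If you wanted to fake that strategy in silico, it might look something like the toy below. This is my own sketch, no relation to whatever's actually running in those 600,000 neurons: a buffer that holds one tile at a time, a serial scan that stitches tiles into a remembered mosaic, and a route planned entirely from memory.

```python
# A toy of Portia-style timesharing: my own sketch, not anything from
# the New Scientist piece. The "brain" holds one tile of the scene at a
# time, scans serially, stitches the tiles into a remembered mosaic,
# then plans a detour entirely over that memory.
from collections import deque

SCENE = [            # '#' = empty air, '.' = walkable branch
    "....#...",
    ".##.#.#.",
    ".#..#.#.",
    ".#.##.#.",
    "........",
]

def scan(scene):
    """Serial capture, one tile per pass: slow, but needs almost no RAM."""
    mosaic = {}
    for y, row in enumerate(scene):
        for x, tile in enumerate(row):   # one tile in the buffer at a time
            mosaic[(x, y)] = tile
    return mosaic

def plan(mosaic, start, prey):
    """Breadth-first detour planning over the remembered mosaic; the
    prey drops out of sight the moment the spider starts moving."""
    frontier, seen = deque([[start]]), {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == prey:
            return path
        x, y = path[-1]
        for step in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if mosaic.get(step) == "." and step not in seen:
                seen.add(step)
                frontier.append(path + [step])
    return None

print(plan(scan(SCENE), start=(0, 0), prey=(7, 0)))   # the long way around
```

The point isn't the maze-solving; breadth-first search is trivial. The point is that nothing in there requires the whole scene to be resident at once. Slow is fine, as long as nobody eats you in the meantime.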

I wonder what the limits are to Portia's painstaking intellect. Suppose we protected her from predators¹, and hooked her up to a teensy spider-sized glucose drip so she wouldn't starve. It takes her a couple of hours to capture a snapshot; how long will it take the fuzzy-legged little beauty to compose a sonnet?

Are we looking at a whole new kind of piecemeal, modular intellect here? And why the hell didn't I think of it first?

Update 9/1/09: Tarsitano & Jackson published these results in Animal Behaviour. Thanks to Kniffler for the heads-up.


¹ And isn't that a whole other interesting problem, how this little beast can sit contemplating her pedipalps for hours on end in a world filled with spider-eating predators? Do certain antipredator reflexes stay active no matter what, or does she just count on immobility and local cover to hide her ass while she's preoccupied with long-term planning? I'd love to see the cost-benefit of this tradeoff.

Portia photo: by Akio Tanikawa, scammed from Wikipedia under a CC licence.
Maze illo: scammed from New Scientist, under a nine-tenths-of-the-law licence.


Thursday, October 16, 2008

The Living Dead

Meet Candidatus Desulforudis audaxviator, the bacterium that does it all: fix carbon, fix nitrogen, synthesize all essential amino acids, locomote — an organism that can exist totally independent of other life. It doesn't even need the sun. This fucker basically lives on sulfur, rock, and electrons*.

It's an obligate anaerobe, without even the most rudimentary oxygen resistance. A bug like βehemoth would kick its ass throughout most of the terrestrial biosphere (its natural digs are a couple of kilometers down in the crust, where no O2 has poked its corrosive little head for at least three million years). But that's not likely to be any kind of drawback out in space, and various talking heads are already nattering excitedly about the prospect of something just like this hanging out on Mars, or on the Saturnian moons.

It is cool. It is, quite literally, a complete ecosystem bundled into a single species, a biosphere crammed into two-and-a-half megabytes and a crunchy shell. Astrobiologists the world over have been creaming their genes for a week now. It's such a science-fictional little beast that its very name was lifted from a Jules Verne novel— but what really sticks in my mind about this little Swiss-army knife is a feature that's actually pretty common down there.

If it's anything like other deep-rock dwellers, D. audaxviator reproduces very slowly, taking centuries or even millennia to double in numbers. It's a consequence of nutrient limitation, but might we be looking at a kind of incipient immortality here? The textbooks tell us that one of the defining characteristics of life is reproduction. But if you think of life as the propagation of organized information into the future — the persistence of signal, rather than merely its proliferation — then reproduction is really just a workaround. The chassis that carries the information wears out, and must be replaced.

It doesn't take much, here at the dawn of Synthetic Biology, to imagine an organism with unlimited self-repair capabilities; something that can keep its telomeres nice and long, which sweeps away all those nasty free radicals and picks up the broken bottles in their wake, which replaces an endless succession of disposable Swatches with a solid gold Rolex which can hang in there for a billion years or more. Hell, you could even postulate some kind of Lamarckian autoedit option on the genes, so the organism can adapt to new environments. Or you could just limit your organism to extremely stable environments that don't require ongoing adaptation. Interstellar space, for example. Or deep in a planetary lithosphere. In some ways, this could be a superior strategy to conventional breeding; at least you wouldn't have to worry about population explosions.

I wonder if, somewhere down there, D. audaxviator or something like it has given up on reproduction entirely. Maybe it keeps the machinery around as a kind of legacy app that no one uses any more and just ticks slowly onwards, buried beneath all that insulating and protective rock, unto the very end of the planet.

The textbooks would call it dead. I'd suggest our definitions may need an upgrade.


*Of course, the fact that it can live independently doesn't mean that it evolved independently. A bunch of its genes have been cadged from Archaea via lateral transfer. Its genes also contain anti-viral countermeasures; whether it siphoned those off incidentally from donor species or actually uses them to guard against parasitic code, there's obviously a history of contact with other life in this bug's family tree.


Friday, September 19, 2008

Avast! Here Be a Blindsightinator for Ye!

Aye me hearties, be ye rememberin' that time in Blindsight when Rorschach, she be putting the sun in scurvy Szpindel's eyes?

"Argh, I be seein' naught," Szpindel be sayin', his timbers a'shiver.

"It be the EM fields," James be barking. "That be how they signal. The briney deep, she be fulla words, she be—"

"I be seeing naught," Szpindel be saying. "I be blind as the skipper with his patch on the wrong eye!"

"Yar," Bates be lassooing the capstain. "That be a pretty mess— blast those scurvy rads…"


And then when they be hiding below decks, Szpindel be putting words to it…

"Ya reached for it, ya scurvy dog. You near be catchin' it. That not be blind chance."

"Argh, not blind chance. Blindsight. Amanda? Where be ye, wench?"

"Blindsight?"

"Aye. Nothing be wrong with ye receptors," he be saying. "Eye be working right enough, brain not be seein' the signal. Brain stem, he be mutineer. Arrgh."


Now those buggering cabin-boys from Denmark, they be laying claim to me booty. They be putting out "Action-blindsight in two-legged landlubbers that be having compasses on their skulls, Arggh", and they be staking their claim last winter in the Proceedings of the National Academy of Sciences.

They be asking me to be hanging their guts from the crowsnest, they e'er be blackening my horizon.


Friday, March 14, 2008

In Praise of MPD

This month's New Scientist carries an opinion piece by Rita Carter, author of the imminent Multiplicity: The New Science of Personality. She's not the first to argue that multiple personalities may be adaptive (the whole backbone of the eighties' MPD fad was that they served to protect the primary persona from the stress of extreme abuse), nor is she the first to point out that MPD is just one end of a scale that goes all the way down to jes' plain folks adopting different faces for different social contexts (what Carter calls "normal multiplicity"). She does, however, suggest that "normal multiplicity could prove useful in helping people function in an increasingly complex world"; which raises the possibility that what we now think of as "pathological" multiplicity might prove useful in a hypercomplex world.

Cue the Gang of Four.

This is one of the themes introduced in Blindsight that I'm going to town on with Dumbspeech (okay, okay: State of Grace): that humanity is, in effect, splitting into a whole suite of specialized cognitive subspecies as a means of dealing with information overload. (You can see the rudiments of this in the high proportion of Aspies hanging out in Silicon Valley, perhaps.) But I've never encountered this Carter person before. Judging by her brief essay, I can't tell whether she's actually on to something or whether she's just putting neurogloss lipstick on the trivially obvious fact that it makes sense to behave differently in different situations (rather like making the Atkins Diet sound all high-tech and futuristic by describing it as "hacking the body").

Anyone here read her books? Are they any good?


Sunday, March 9, 2008

Mind Reading Technology...

...has been a staple of every low-budget piece of celluloid skiffy going back at least to that early-sixties Gerry Anderson puppet show Stingray (which no one with any dignity will admit to having watched, although I clearly remember the episode with the mind-reading chair). The Prisoner also featured an episode in which No. 6's dreams could be probed, and the various incarnations of Star Trek must have had a half-dozen such episodes among them although they all seem to run together after a while (the episode I'm thinking of had aliens with bumpy foreheads; does that help at all?).

Now here comes Kendrick Kay and his buddies in Nature with "Identifying natural images from human brain activity", and if they haven't actually vindicated all those cheesy narrative gimmicks, they've made a damn good first pass at it. They used fMRI scans to infer which one of 120 possible novel images a subject was looking at. "Novel" is important: the system trained up front on a set of nearly 2,000 images to localize the receptive fields, but none of those were used in the actual mind-reading test. So we're not talking about simply recognizing the replay of a previously-recorded pattern here. Also, the images were natural— landscapes and still-lifes and snuff porn, none of this simplified star/circle/wavy-lines bullshit.

The system looked into the minds of its subjects, and figured out what they were looking at with accuracies ranging from 32% to 92%. While the lower end of that range may not look especially impressive, remember that random chance would yield an accuracy of 0.8%. These guys are on to something.
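
And the identification step itself is conceptually simple; the real work lives in the encoding model. In toy form (array sizes invented, everything else brutally simplified), it's just pattern-matching the measured voxel response against the model's prediction for each candidate image:

```python
# Toy version of the identification step: heavily simplified, sizes
# invented. The trained encoding model predicts a voxel activity
# pattern for each of the 120 candidates; the guess is whichever
# prediction best correlates with the measured pattern.
import numpy as np

rng = np.random.default_rng(0)
N_IMAGES, N_VOXELS = 120, 500

# Stand-in for the receptive-field model's output (in the real study
# this comes from a model fit on the training images, none of which
# appear in the test set):
predicted = rng.normal(size=(N_IMAGES, N_VOXELS))

def identify(measured, predicted):
    """Index of the candidate whose predicted voxel pattern best
    matches the measured one (Pearson correlation)."""
    z_meas = (measured - measured.mean()) / measured.std()
    z_pred = ((predicted - predicted.mean(axis=1, keepdims=True))
              / predicted.std(axis=1, keepdims=True))
    scores = z_pred @ z_meas / measured.size   # correlation per candidate
    return int(np.argmax(scores))

true_image = 42
measured = predicted[true_image] + rng.normal(scale=0.8, size=N_VOXELS)
print(identify(measured, predicted))   # 42, if the noise isn't too brutal
```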

Of course, they're not there yet. The machine only had 120 pictures to choose from; tagging a card from a known deck is a lot easier than identifying an image you've never seen before. But Kay et al. are already at work on that; they conclude "it may soon be possible to reconstruct a picture of a person’s visual experience from measurements of brain activity alone." And in a recent interview Kay went further, suggesting that a few decades down the road, we'll have machines that can read dreams.

He was good enough to mention that we might want to look into certain privacy issues before that happens...


Thursday, March 6, 2008

Is this theory of yours accepted by any respectable authorities?

The long-awaited new Neuropsychologia's finally on the stands, and it's a theme issue on — wait for it — consciousness! Lots of articles on blindsight, interhemispheric signaling, anosognosia, all that cool stuff. And nestled in the heart of this month's episode is a paper by David Rosenthal entitled "Consciousness and its function".

Guess what. He doesn't think it has any.

From the abstract:
"...a number of suggestions are current about how the consciousness of those states may be useful ... I examine these and related proposals in the light of various empirical findings and theoretical considerations and conclude that the consciousness of cognitive and desiderative states is unlikely to be useful in these or related ways. This undermines a reliance on evolutionary selection pressures in explaining why such states so often occur consciously in humans."
Rosenthal's conclusion? Consciousness is just a side-effect, with no real adaptive value. And no, he didn't cite Blindsight. But we all know I went there first.

Somewhere else I went, back in 1991, has been making a few online waves over the past week or two: this brief Science article by Christner et al., suggesting that microbes play a major and hitherto-unsuspected role in shaping the world's weather. As Jeremy Ruhland pointed out a few days back, this is a wee bit reminiscent of a story I wrote in the early nineties — a post-environmental-apocalypse number in which vast colonies of cloud-dwelling weathermongering microbes had conspired to kick our asses. For a few years now I've been showing this slide whenever I want to make the point that sometimes you can hit the bullseye even when you have no fucking clue what you're talking about...


... because really, "Nimbus" was a spontaneous, unresearched brain fart based entirely on an old girlfriend's observation that "Ooh, look at those clouds... they almost look alive!" But CNN is not exactly the most prestigious source of scientific intel on the planet, and besides, Moffet was just starting to look back in 2002; he hadn't actually found anything. That was then; this is now. You can't get more prestigious than Science (well, unless you're Nature), and now we're looking at actual data.

Of course, this is nowhere near the cozy conjunction of Watts and Rosenthal. Christner et al. didn't even look at clouds per se, only at the precipitation that had dropped out of them. And it's not like they discovered any new and alien microbes; mostly they came up with plant pathogens. (Also, my microbe-infested clouds had a kind of slow intelligence to them — and if we ever get any evidence supporting that conceit I'll eat my cats.) But what they did show was that microbes affect the weather— and at the very least, that leaves the door open for all sorts of evil, powerful, yet-to-be-discovered bugs lurking overhead.

I like that thought.


Tuesday, February 26, 2008

Law & Order: Victims of Reality Unit

So Prozac and its ilk prove to be, for the most part, about as clinically effective as a sugar pill. Which kicks loose an idea for a story that's been rattling around in my head for a few years now:

A man diagnosed with terminal cancer is beating the odds with the help of a new drug recently approved by the FDA. The tumors have stabilised, perhaps even receded a little; he has already lived well past his mean life expectancy. It's a breakthrough, a miracle — until a couple of statisticians from Johns Hopkins publish an analysis proving that the effect is pure placebo. Our patient reads the study. Within a month, he's circling the drain. Within two, he's dead.

The next of kin charge the authors of the paper, and the journal that published them, with negligent homicide.

Placebos work, you see. The brain can do all sorts of things to the body; sometimes it just needs to be tricked into generating the right happy chemicals. Medical professionals know as much: it may not be the cure so much as the belief in the cure that does the trick, and when you shatter that belief, you are knowingly stealing hope and health from every patient who heeds your words. You are, in a very real sense, killing them.

Do we have here a legitimate argument for the perpetuation of ignorance? Medical professionals do not generally discourage the use of prayer in dire circumstance. It does no harm, after all. (Actually there's some evidence that it does do harm; let's set that aside for the moment.) But when you know that placebo effects are real, and you go out of your way to disillusion some deluded flake who shows up on the ward convinced that her crystals and magnets will keep the tumors at bay... well, maybe education of the sick should be a criminal offense.

I'm just saying.


Saturday, January 19, 2008

Scramblers in the Shallows, Light in the Deeps

This is a short, stunning clip that starts with deep-sea glowsticks and segues to shallow-water cephalopods. The first part gives you a taste of Beebe Station; the second (including the Two-Faced Squid!) demonstrates some camo tricks that make scramblers look like amateurs.

No new information here, but beautiful. Try to ignore the creationist idiot in the comments.


Saturday, January 5, 2008

Cancer, For the Greater Good

One of my favorite monster movies of all time has got to be John Carpenter's 1982 remake of “The Thing”. It's not a perfect film by any means – there are some gaffes with the rubber fx, and if eighties-era PCs ever came preloaded with software to test whether your buddies had been taken over by shapeshifting aliens, I never saw it. (A shame. Back then I would've bought something like that in a second. These days, it's pretty much redundant.) Still, “The Thing” is one of the few movies of its type in which the Human cast isn't as dumb as a sack of Huckabees. Nobody wanders off after the ship's cat all by themselves. The moment they figure out what they're up against they burn everything that moves, start thinking about serological detection methods, and keep really close watch on each other. It's an effective study in paranoia, starring an alien that not only squirts slime and sprouts tentacles, but actually proves to be quite a bit more intelligent than the humans it preys upon. (As one might expect from a creature with interstellar technology. Am I the only one bothered by the fact that the monster in the Howard Hawks original never did anything smarter than just kinda flailing around and roaring?) Even at the scorched-earth end of the story, you're never really sure who won.

Then there's the biology.

It's actually not as totally whacked-out as you may think. Granted, anything able to morph between body plans in the time it takes me to snarf an Egg McMuffin would have to have stores of cellular energy verging on the nuclear. (Jeff Goldblum's gradual, donut-powered transformation in “The Fly” was a lot more believable – although why those telepods got all confused at the presence of fly DNA, when they didn't seem to bat a diode at the bacteria seething on every square micron of both fly and human, remains an open question. But I digress.) Still, if you can forgive the ridiculously fast transformation, the idea of an infectious agent retooling infected cells for its own purposes is old news. Viruses have been doing it for billions of years.

Now we are too. Synthetic Life's current rock star, J. Craig Venter, is all over the lit with his artificial chromosomes and Swiss-Army cells: build a cellular chassis that carries the basic instruction set necessary for metabolism, and then top it off with genes to produce whatever you're after this week. Before long, Venter's Vats (and only Venter's Vats, if the patents go through) will be churning out great masses of everything from Nutripon to Biogasoline.

But more interesting, to me, is this recent paper out of PLoS Computational Biology on “Somatic Evolution”— i.e., the classic Darwinian struggle for existence among the cells of a single tissue in a single organism. And why shouldn't the rules of natural selection apply to cells as well as their owners? The cells in your liver exist in a habitat with limited food, just like populations of multicellular creatures. They jostle up against others like themselves who have their eye on the same nutrients. Given a mutation that allowed one such cell to outcompete its siblings — faster reproductive rate, lower mortality — wouldn't its offspring kick the asses of the competition? Wouldn't the whole tissue, the whole organ, evolve into something new, something where every cell was out for itself, something like —

—Well, cancer, obviously.

Don't bother pointing out the obvious. Yes, if our cells did follow the beat of their own drummer, multicellularity would never have evolved in the first place. But that's circular; there's nothing in the rules that says multicellularity had to evolve, and logically Darwin's hand should be felt down in the blood as well as out on the savannah. Something must have suppressed those processes at the cellular level before metazoans could arise; that's what this paper is about.
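
You can watch that problem play out in a ten-minute toy. This one is mine, I stress, not Pepper et al.'s actual model: a bare-bones Moran process, tissue at fixed carrying capacity, one lineage that divides twice as readily as its law-abiding neighbours, nothing policing it.

```python
# Somatic evolution in ten minutes: my sketch of the logic, not Pepper
# et al.'s actual model. A bare-bones Moran process in a tissue at
# fixed carrying capacity, with one fast-dividing renegade lineage.
import random

CAPACITY = 1_000
R = 2.0   # the mutant divides twice as readily as its siblings

def moran(mutants=1, max_steps=1_000_000):
    """One birth-death event per step; stop at extinction or fixation."""
    for step in range(max_steps):
        if mutants in (0, CAPACITY):
            return mutants, step
        # A random cell dies, making room...
        dying_mutant = random.random() < mutants / CAPACITY
        # ...and a division fills the gap, weighted by fitness:
        w_mut = mutants * R
        w_all = w_mut + (CAPACITY - mutants) * 1.0
        born_mutant = random.random() < w_mut / w_all
        mutants += born_mutant - dying_mutant
    return mutants, max_steps

final, steps = moran()
print(final, steps)   # roughly half the runs end with the clone owning
                      # the whole tissue; in the rest, drift snuffs it out
```

Run it a few hundred times and the renegade clone sweeps the tissue about half the time, which is exactly why any metazoan that failed to police its cells wouldn't stay metazoan for long.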

But now I'm thinking on a tangent. I remember our old friends the scramblers, and how it was possible for them to evolve without genes:
"I'd swear half the immune system is actively targetting the other half. It's not just the immune system, either. Parts of the nervous system seem to be trying to, well, hack each other. I think they evolve intraorganismally, as insane as that sounds. The whole organism's at war with itself on the tissue level, it's got some kind of cellular Red Queen thing happening. Like setting up a colony of interacting tumors, and counting on fierce competition to keep any one of them from getting out of hand. Seems to serve the same role as sex and mutation does for us."

And I remember MacReady in Carpenter's film, after Norris split into several different pieces to keep from burning alive, internalising the take-home lesson:

"Every part of him was a whole. Every little piece was an individual animal with a built-in desire to protect its own life. Y'see, when a man bleeds... it's just tissue. But blood from one of you things won't obey when it's attacked. It'll try and survive. Crawl away from a hot needle, say..."

Cancer, for the greater good.

Maybe that's where people and scramblers and MacReady-battling Things went their separate ways. We tamed our inner battles using stem cells and transient cells and differentiated tissues, just like Pepper et al. hypothesise. But maybe other worlds spawned other answers. Maybe whatever alien slime mold gave rise to our Antarctic shapeshifter decided to go with the whole cell-competition thing, decided to make it a solution instead of a problem. Maybe that's how all those cells remain totipotent even in the adult animal; or maybe some tentacle-whipping alien J. Craig Venter just figured out how to go back and retrofit his species for optimal adaptability and maximum profit. Of course they could do it, even if they didn't evolve that way. They built flying saucers, for Chrissakes. They were crossing the interstellar gulf before we'd even made it out of Africa. What better failsafe for lost and stranded travellers than to be able to take your cue from the natives, slip into a new body precustomised for its environment?

I read “Who Goes There?” back in the eighties, decades after John W. Campbell wrote it and about six months before Carpenter's unjustly-maligned remake hit the theatres. I thought it was très cool from the outset. But it never occurred to me to write a sequel until I read this paper...


Monday, December 10, 2007

A Lack of Focus

Been a while since I posted, I know. Not for lack of material. I've been meaning to post a few more I, Robot-type findings — more hardwired aesthetics, this time centering around the "Golden Ratio"; more unsurprising evidence of a developmental basis for pedophilia, along with the (even less surprising) preemptive disclaimers by the researchers that oh no, this shouldn't let pedophiles off the hook, no sirree. (I can't shake a certain sympathy for the kiddy-diddlers on this score. Biology seems to let everyone else off the hook: teenage brains are wired differently than adults', so we have a Young Offenders Act with different standards of culpability; jealous lovers are blinded by fight/fuck circuitry, so "crimes of passion" tend to carry lighter penalties than those that come precalculated. There's no end to the shit we're expected to put up with from victims of dementia, because hey, they "really can't help themselves". But pedophiles? Societal revulsion for those poor bastards is so strong that we don't even wait for the peasants to grab their pitchforks, we trip over ourselves insisting that no, the neurology doesn't matter for these monsters, they just need to exercise more self-control...)

Then there's this godsend from the University of Colorado — batteries, built from kidney cells! — that fits perfectly into a hole I've been trying to plug for the SquidNet novel. A seriously-overhyped item suggesting that a chatroom spam sex-bot has passed the Turing Test (I dunno— didn't Turing specify some minimal intelligence for the person the AI is supposed to be fooling?). I'm also reading this book, The Language of God: A Scientist Presents Evidence For Belief, by one of the leading lights of the Human Genome Project, and you can be damn sure that's gonna get its very own extensive posting over the next little while. (Current opinion, at the ¼ mark: this guy is the Harriet Miers of gel jocks. How the hell can a top-flight geneticist be so abysmally ill-informed about basic biology? How can he be so utterly unfamiliar with basic logic?)

But it's fucking Christmastime, don't you know, and the obligations of this season eat at one's waking hours like a cancer. And I have four or five pitches/outlines, all in various states of (in)completion, that I gotta get done before my new agent writes me off for dead and eaten by cats. So for now, I'll just hand off with another excerpt from the imminent Szeman/Whiteman interview "Wildlife, Natural and Artificial: An Interview with Peter Watts":


IS/MW: Dark, troubled, disturbed, heroic: Lenie Clarke is one of the great characters of contemporary science fiction writing. A sympathetic protagonist despite her outward coldness—and the fact that her rage at the Grid Authority leads her to seed βehemoth across North America. Ken Lubin, too: a character about whom we know almost nothing beyond his capacity to expertly assess situations and to act on the results, but whom readers nevertheless see as on their side against the threats of the world. How did you come to create Lenie? What are the special challenges (if any) of writing about characters like these?


PW: Lenie Clarke was my attempt to imagine what was going on inside a woman I was briefly involved with back in grad school. It was one of those relationships that lasts maybe two months, tops, tosses you around like a pebble in a cement mixer full of broken glass, and then spits you out in the certain knowledge you’ll never see your partner again. You know all this going in, of course. You know the relationship has no future. And you do it anyway, because hey: what does have a future, these days? And at least you know you’re alive in the meantime.

The special challenge, of course, is that I probably got her completely wrong. But I rather suspect she’s been dead for some time, so she’s not likely to contradict me. And other people, who hail from similarly dark places, tell me that Lenie feels real to them. This honours me. I haven’t been fucked over nearly as much as these people have, I’m basically a pampered poser playing let’s-pretend-we’ve-been-sexually-abused. But if my prose can convince people who’ve actually been there, that’s something.

Unless, of course, they were just sucking up to me. That happens too. Not as much as it should, sadly.

The whole interview (which I've previously excerpted here, when I was first muddling through the questions — just scroll down to April 5) weighs in at well over 7,000 words and is slated to appear in the journal Extrapolation 48(3): 603-619. (And I mean really appear, which is not so common as I might have expected. Regular visitors may remember my mention of extensive interviews with the likes of Locus and the online editions of The Wall Street Journal, way back in spring/summer of this year. Don't know what's up with those, but I grow increasingly skeptical of either's appearance.)


Saturday, October 27, 2007

Remedial Gigerology, Part 2


I'm guessing this portrait is already familiar to a lot of you, since I got the link both from a fellow skiffhead and a boardroom mundane, but — speaking as a biologist — this is one of the creepiest, most unsettling creature pics I've ever seen. This thing has teeth where a beak should be — disquietingly human teeth, at that. (In fact, the species profile does refer to a "beak", so this has got to be a superficial resemblance, not a homologous structure. Still.)

I never really thought about it before, but this picture makes me wonder if the secret to generating creepy-verité is not to create something completely alien, but to create something mostly alien and then insert a clearly human feature onto the strangeness. Or maybe it's all about mouths; maybe I wouldn't be so creeped out if this little monster had a humanoid eye where its mouth is. I dunno.

But maybe I was missing the point when I tried so hard to make Blindsight's scramblers so utterly alien in every respect. Maybe, to make them really scary, all I had to do was add a little humanity.

Thanks for the nightmares, Mac.


Thursday, September 13, 2007

The Skiffies...

Being the selection of a recent science item, hitherto unreported on this 'crawl, most near and dear to my heart.

Oddly, most of the items I've noticed recently seem reminiscent of my second book Maelstrom — from this tell-us-something-we-don't-know piece in the NY Times about the increasing fragility of complex technological systems to Naomi Klein's new book "The Shock Doctrine". Squinting at the news I can almost see the Complex Systems Instability-Response Authority gestating in the bowels of Halliburton; reading Klein's take on "disaster capitalism" I'm reminded of Marq Qammen's rant to Lenie Clarke about Adaptive Shatter: "...When damage control started accounting for more of the GGP than the production of new goods." Starfish may have been a more immersive novel; Blindsight may have had chewier ideas. But Maelstrom, I think, is way out front in terms of decent extrapolation.

Or there's this too-good-to-pass-up story out of Nature Neuroscience by way of the LA Times, in which a study combining button-pushing with the letters "M" and "W" showed that liberals are better at parsing novel input than conservatives, who have a greater tendency to fall into inflexible knee-jerk behaviors. (This would tend to explain, for example, how the inability to change one's mind in the face of new input can be regarded as a strength — "strong leadership" — while the ability to accommodate new information is regarded as "flip-flopping".) (Surprisingly, these findings have not been embraced by those who describe themselves as right-wing.)

But today's Skiffy has to go to this story in the Guardian, simply because it reflects so many facets of my own life (such as it is): marine mammals (in particular harbour porpoises, upon which I did my M.Sc.) are being infected by the mind-affecting parasite Toxoplasma gondii (whose genes were a vital part of "Guilt Trip" from the rifters novels, and which has been cited in this very crawl — May 6, 2005), contracted from household cats (whose connection to mine own life you should all be aware of by now).

Marine Mammals. Rifters. Cats.

No other contender comes close.


Friday, September 7, 2007

Remedial Gigerology


Okay, I need to tell no one here how very cool it is that moray eels have a second set of accessory jaws that leap out of their throat to handle difficult prey. You all know the obvious movie reference.

What I don't know is this: there are a couple of hundred species of moray eels out there. We've known about them for centuries. So why the hell are we only discovering such an obvious anatomical feature now? Hasn't anyone dissected one of these things before?


Thursday, September 6, 2007

Do-It-Yourself Zombiehood

New to me, old to the lit: a paper in Trends in Cognitive Sciences, which came out last November (just a month after Blindsight was released): "Attention and consciousness: two distinct brain processes".

Let me cherry-pick a few choice excerpts: "The close relationship between attention and consciousness has led many scholars to conflate these processes." ... "This article ... argu[es] that top-down attention and consciousness are distinct phenomena that need not occur together" ... "events or objects can be attended to without being consciously perceived."

Yes, part of me shouts in vindication, while the rest of me whispers Oh your god, please no.

It's a review article, not original research. As such it cites some of the same studies and examples I drew on while writing Blindsight. But what especially interested me was the suggestion of mechanism behind some of those results. Both Blindsight and Blog cite studies showing that being distracted from a problem actually improves your decision-making skills, or that we are paradoxically better at detecting subtle stimuli in "noisy" environments than in "clean" ones. Koch and Tsuchiya cite a paper that describes this as a form of competition between neuron clusters:
"attention acts as a winner-takes-all, enhancing one coalition of neurons (representing the attended object) at the expense of others (non-attended stimuli). Paradoxically, reducing attention can enhance awareness and certain behaviors."

I like this. It's almost ecological. Predators increase the diversity of their own prey species by keeping the most productive ones in check; remove the starfish from a multispecies intertidal assemblage and the whole neighborhood turns to mussels inside a month. This is the same sort of thing (except it happens within a single brain and therefore tastes more of Lamarck than Darwin). Different functional clusters (the different prey species) duke it out for attention, each containing legitimate data about the environment— but only the winner (i.e., the mussels) gets to tell its tale to the pointy-haired boss. All that other data just gets lost. And the static that paradoxically improves performance in such cases — white noise, or irrelevant anagrams that steal one's focus — plays the role of the predator, reducing the advantage of the front-runner so that subordinate subroutines can get their voices heard.
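
The ecological caricature fits in a dozen lines. Nothing like real cortical dynamics, obviously; just winner-takes-all competition with noise cast as the starfish:

```python
# Winner-takes-all attention as intertidal ecology: a caricature, not
# cortex. Coalitions compete; only the winner reports to the
# pointy-haired boss; noise plays the starfish, knocking back the
# front-runner so that subordinate signals occasionally get heard.
import random

def airtime(signals, noise=0.0, trials=10_000):
    """Fraction of trials each coalition wins the competition."""
    wins = dict.fromkeys(signals, 0)
    for _ in range(trials):
        noisy = {name: s + random.gauss(0, noise)
                 for name, s in signals.items()}
        wins[max(noisy, key=noisy.get)] += 1
    return {name: w / trials for name, w in wins.items()}

coalitions = {"mussels": 1.0, "barnacles": 0.8, "faint_but_true": 0.7}

print(airtime(coalitions, noise=0.0))   # the mussels always win
print(airtime(coalitions, noise=0.5))   # distracted: everyone gets a voice
```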

I wonder. If we trained ourselves to live in a state of constant self-imposed distraction, could we desentientise our own brains...?


Monday, September 3, 2007

Wolbachia cronenbergium


My, the folks over at the Venter Institute have been busy lately. First they changed one microbe species into another by physically replacing its entire genome. They did this in their quest to create a synthetic organism, basically a chassis with the absolute minimum number of genes necessary for life, which could then be loaded up with other customized genes designed to act for the betterment of humanity and the environment... er, the good of Venter stockholders. Now they've discovered that Nature herself has done them one better, by incorporating the complete genome of a parasitic bacterium called Wolbachia into the code of fruit flies: two complete genotypes for the price of one (original article here; much more accessible press release over here).

Some of you may remember βehemoth, from the rifters books: it was basically the mitochondrion's nasty cousin, and like a mitochondrion it brought its own genome into the host cell. This is a big step further: Wolbachia's code isn't just hanging out in the cell, it's been incorporated into the nuclear DNA of the host itself. The host is not infected with Wolbachia; there are no bacteria cruising the cytoplasm. Rather, the complete recipe for building the bug has been spliced into the host's code— and since the odds of such a big chunk of data (over a megabyte) getting thus incorporated without playing any functional role are pretty small, chances are that this embedded genotype is doing something for the host organism. This is assimilation: the dicks of Borg drones everywhere should be shriveling with collective performance anxiety.

Two major implications come immediately to mind. The first is that conventionally-derived genotypes sequenced to date might be all washed up, since bacterial DNA is routinely scrubbed from such results as "contamination"; but if this new phenomenon is widespread (and Wolbachia is one of the world's most abundant parasites of invertebrates), a lot of the bathwater we've been throwing out might actually be the baby. And the second implication, well —

Anyone remember David Cronenberg's remake of "The Fly"...?
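
(Back to that first implication for a second: if you ever wanted a crude first pass at spotting embedded parasite code in a host assembly, rather than scrubbing it as contamination, shared k-mer density would do the trick. What follows is a sketch, not a real annotation pipeline: no alignment, no repeat masking, no statistics.)

```python
# Crude first pass at flagging a buried parasite genome: what fraction
# of a host contig's k-mers also occur in the parasite reference? High
# scores mark candidate inserted segments, i.e. the "contamination" you
# might not want to scrub after all.
def kmers(seq, k=21):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def transfer_score(host_contig, parasite_genome, k=21):
    host = kmers(host_contig, k)
    return len(host & kmers(parasite_genome, k)) / max(1, len(host))

# e.g. transfer_score(fly_contig, wolbachia_reference) -> 0.0 up to 1.0
```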

(Illo credit, as far as I can tell, goes to the University of Rochester.)


Monday, August 27, 2007

WoW! Pandemic!

Today's post comes on the heels of a) me answering backlogged questions from XFire's gaming community, and b) grumbles from the peanut gallery about the recent lack of shiny techy science-speak on the 'crawl. It just so happens that today's subject combines elements of both, and holy shit is it cool: a paper in Lancet describing the epidemiology of an unintended plague that raged through the World of Warcraft back in 2005 (and thanks to Raymond Nielson for the heads-up). The figures presented in this paper — which, I emphasize, appears in one of the world's most prestigious medical journals — include a screen shot of corpses in WoW's urban areas.

The plague itself was a glitch: a disease whose original range was supposed to be limited only to areas where high-level players could venture, and which was — again, to high-level players — merely a nuisance. The problem was, the plague cut down low-level players like kibble in a cat-food dish, and as Crichton once observed, Life Will Find A Way.

The bug hitchhiked out of its original home turf in the blood of high-level characters teleporting back to their hearthstones (analogous, the authors point out, to airline travel in a real-world outbreak). Players' pets got infected, and spread the disease. NPCs, built strong for reasons of game play, acted as infectious reservoirs, not dying themselves but passing the germ on to anyone they came into contact with.

Whole villages were wiped out.

Lofgren and Fefferman point out that this completely unintentional "Corrupt Blood" outbreak was in many ways more realistic than dedicated supercomputer simulations designed to model real epidemics, simply because a real person stood behind each PC in the population. While real-world models have to use statistical functions to caricature human behaviour, WoW's outbreak incorporated actual human behaviour (for example, a number of healers spontaneously acted as "first responders", rushing into infected areas to try and help the sick — and in the process spread the bug to other areas when they moved on). It's true that the ability of WoW characters to resurrect introduces a certain level of unrealism into the picture; but it's also true that players generally get so invested in their characters that they don't throw even those renewable lives away unnecessarily. More to the point, the new paradigm doesn't have to be perfect to be a vast improvement over the current state of the art.
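
Here's the difference in miniature, with every parameter invented; this is the flavor of the idea, not the Lancet analysis. Spread falls out of what individual agents do (hearthstone teleports, first-responder healers rushing in) rather than out of a fitted transmission curve:

```python
# Agent-based outbreak toy: infection emerges from individual behaviour
# (random teleports, healers rushing toward the sick) instead of a
# statistical transmission function. All parameters invented; the
# flavor of the idea, not the Lancet model.
import random

ZONES, AGENTS, BETA, STEPS = 10, 500, 0.05, 200

class Player:
    def __init__(self):
        self.zone = random.randrange(ZONES)
        self.infected = False
        self.healer = random.random() < 0.1    # ~10% spontaneous altruists

agents = [Player() for _ in range(AGENTS)]
agents[0].infected = True                      # patient zero

for _ in range(STEPS):
    hot = sorted({a.zone for a in agents if a.infected})
    for a in agents:
        if a.healer and hot and not a.infected:
            a.zone = random.choice(hot)        # first responders rush in...
        elif random.random() < 0.05:
            a.zone = random.randrange(ZONES)   # ...everyone else hearths around
    sick = [0] * ZONES                         # carriers per zone this step
    for a in agents:
        sick[a.zone] += a.infected
    for a in agents:                           # local, contact-driven spread
        if not a.infected and random.random() < 1 - (1 - BETA) ** sick[a.zone]:
            a.infected = True

print(sum(a.infected for a in agents), "of", AGENTS, "infected")
```

The healers are the point: their altruism is a behaviour the model discovers the consequences of, not a parameter somebody had to guess at up front.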

L&F suggest that what happened once as a mistake could happen again by design — that MMORPGs could be a valuable tool for real epidemiological studies, by incorporating plausible plagues with known parameters as part of the in-game experience. Players are already used to sickness disease, and death; that's what makes the game so much fun. Do this right, and you could do population-level doomsday studies repeatedly, under controlled conditions, incorporating levels of behavioural realism far beyond what any purely statistical model could manage. Even Mengele didn't have this kind of sample size.

I can see a lot of research being done this way, and not just epidemiological. There are martial and economic possibilities, too. I can see Homeland Security getting involved. I can see national policies increasingly based on insights gleaned from fantasy simulations — and I can see such policies being played from the inside, by mages and blood elves who might have their own agendas to pursue...

Damn. The story almost writes itself.
