Thursday, October 16, 2008

The Living Dead

Meet Candidatus Desulforudis audaxviator, the bacterium that does it all: fix carbon, fix nitrogen, synthesize all essential amino acids, locomote — an organism that can exist totally independent of other life. It doesn't even need the sun. This fucker basically lives on sulfur, rock, and electrons*.

It's an obligate anaerobe, without even the most rudimentary oxygen resistance. A bug like ßehemoth would kick its ass throughout most of the terrestrial biosphere (its natural digs are a couple of kilometers down in the crust, where no O2 has poked its corrosive little head for at least three million years). But that's not likely to be any kind of drawback out in space, and various talking heads are already nattering excitedly about the prospect of something just like this hanging out on Mars, or on the Saturnian moons.

It is cool. It is, quite literally, a complete ecosystem bundled into a single species, a biosphere crammed into two-and-a-half megabytes and a crunchy shell. Astrobiologists the world over have been creaming their genes for a week now. It's such a science-fictional little beast that its very name was lifted from a Jules Verne novel— but what really sticks in my mind about this little Swiss-army knife is a feature that's actually pretty common down there.

If it's anything like other deep-rock dwellers, D. audaxviator reproduces very slowly, taking centuries or even millennia to double in numbers. It's a consequence of nutrient limitation, but might we be looking at a kind of incipient immortality here? The textbooks tell us that one of the defining characteristics of life is reproduction. But if you think of life as the propagation of organized information into the future — the persistence of signal, rather than merely its proliferation — then reproduction is really just a workaround. The chassis that carries the information wears out, and must be replaced.

It doesn't take much, here at the dawn of Synthetic Biology, to imagine an organism with unlimited self-repair capabilities; something that can keep its telomeres nice and long, which sweeps away all those nasty free radicals and picks up the broken bottles in their wake, which replaces an endless succession of disposable Swatches with a solid gold Rolex that can hang in there for a billion years or more. Hell, you could even postulate some kind of Lamarckian autoedit option on the genes, so the organism can adapt to new environments. Or you could just limit your organism to extremely stable environments that don't require ongoing adaptation. Interstellar space, for example. Or deep in a planetary lithosphere. In some ways, this could be a superior strategy to conventional breeding; at least you wouldn't have to worry about population explosions.

I wonder if, somewhere down there, D. audaxviator or something like it has given up on reproduction entirely. Maybe it keeps the machinery around as a kind of legacy app that no one uses any more and just ticks slowly onwards, buried beneath all that insulating and protective rock, unto the very end of the planet.

The textbooks would call it dead. I'd suggest our definitions may need an upgrade.


*Of course, the fact that it can live independently doesn't mean that it evolved independently. A bunch of its genes have been cadged from Archaea via lateral transfer. Its genes also contain anti-viral countermeasures; whether it siphoned those off incidentally from donor species or actually uses them to guard against parasitic code, there's obviously a history of contact with other life in this bug's family tree.


Wednesday, October 8, 2008

Understanding Sarah Palin: Or, God Is In The Wattles

Here's a question for you. Why hasn't natural selection driven the religious right to extinction?

You should forgive me for asking. After all, here is a group of people who base their lives on patently absurd superstitions that fly in the face of empirical evidence. It's as if I suddenly chose to believe that I could walk off the edges of cliffs with impunity; you would not expect me to live very long. You would expect me to leave few if any offspring. You would expect me to get weeded out.

And yet, this obnoxious coterie of retards — people openly and explicitly contemptuous of "intellectuals" and "evilutionists" and, you know, anyone who actually spends their time learning stuff — they not only refuse to die, they appear to rule the world. Some Alaskan airhead who can't even fake the name of a newspaper, who can't seem to say anything without getting it wrong, who bald-facedly states in a formal debate setting that she's not even going to try to answer questions she finds unpalatable (or she would state as much, if she could say "unpalatable" without tripping over her own tongue) — this person, this behavior, is regarded as successful even by her detractors. The primary reason for her popularity amongst the all-powerful "low-information voters"1? In-your-face religious fundamentalism and an eye tic that would make a Tourette's victim blush.

You might suggest that my analogy is a bit loopy: young-earth creationism may fly in the face of reason, but it hardly has as much immediate survival relevance as my own delusory immunity to gravity. I would disagree. The Christian Church has been an anvil around the neck of scientific progress for centuries. It took the Catholics four hundred years to apologize to Galileo; a hundred fifty for an Anglican middle-management type to admit that they might owe one to Darwin too (although his betters immediately slapped him down for it). Even today, we fight an endless series of skirmishes with fundamentalists who keep trying to sneak creationism in through the back door of science classes across the continent. (I'm given to understand that Islamic fundies are doing pretty much the same thing in Europe.) More people in the US believe in angels than in natural selection. And has anyone not noticed that religious fundamentalists also tend to be climate-change deniers?

Surely, any cancer that attacks the very intellect of a society would put the society itself at a competitive disadvantage. Surely, tribes founded on secular empiricism would develop better technology, better medicines, better hands-on understanding of The Way Things Work, than tribes gripped by primeval cloud-worshipping superstition2. Why, then, are there so few social systems based on empiricism, and why are god-grovellers so powerful across the globe? Why do the Olympians keep getting their asses handed to them by a bunch of intellectual paraplegics?

The great thing about science is, it can even answer ugly questions like this. And a lot of pieces have been falling into place lately. Many of them have to do with the brain's fundamental role as a pattern-matcher.

Let's start with this study here, in the latest issue of Science. It turns out that the less control people feel they have over their lives, the more likely they are to perceive images in random visual static; the more likely they are to see connections and conspiracies in unrelated events. The more powerless you feel, the more likely you'll see faces in the clouds. (Belief in astrology also goes up during times of social stress.)

Some of you may remember that I speculated along such lines back during my rant against that evangelical abortion that Francis Collins wrote while pretending to be a scientist; but thanks to Jennifer Whitson and her buddies, speculation resolves into fact. Obama was dead on the mark when he said that people cling to religion and guns during hard times. The one arises from loss of control, and the other from an attempt to get some back.

Leaving Lepidoptera (please don't touch the displays, little boy, heh heh heh— Oh, cute...) — moving to the next aisle, we have Arachnida, the spiders. And according to findings reported by Douglas Oxley and his colleagues (supplemental material here), right-wingers are significantly more scared of these furry little arthropods than left-wingers tend to be: at least, conservatives show stronger stress responses than liberals to "threatening" pictures of large spiders perched on human faces.

It's not a one-off effect, either. Measured in terms of blink amplitude and skin conductance, the strongest stress responses to a variety of threat stimuli occurred among folks who "favor defense spending, capital punishment, patriotism, and the Iraq War". In contrast, those who "support foreign aid, liberal immigration policies, pacifism, and gun control" tended to be pretty laid-back when confronted with the same stimuli. Oxley et al close off the piece by speculating that differences in political leanings may result from differences in the way the amygdala is wired— and that said wiring, in turn, has a genetic component. The implication is that right-wing/left-wing beliefs may to some extent be hardwired, making them relatively immune to the rules of evidence and reasoned debate. (Again, this is pure speculation. The experiments didn't extend into genetics. But it would explain a lot.)

One cool thing about the aforementioned studies is that they have relatively low sample sizes, both in the two-digit range. Any pattern that shows statistical significance in a small sample has got to be pretty damn strong; both of these are.
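
For anyone who wants the arithmetic behind that claim, here's a minimal sketch in Python (the sample sizes are placeholders, not the actual n's from either study) of how big a correlation has to be before it clears p < 0.05 in a small sample:

  # Back-of-the-envelope: the smallest correlation that reaches p < 0.05
  # (two-tailed) at a given sample size. The sample sizes are placeholders.
  from math import sqrt
  from scipy.stats import t

  def critical_r(n, alpha=0.05):
      """Smallest |r| that clears significance for a sample of size n."""
      df = n - 2
      t_crit = t.ppf(1 - alpha / 2, df)      # two-tailed critical t
      return t_crit / sqrt(df + t_crit ** 2)

  for n in (30, 50, 1000):
      print(n, round(critical_r(n), 2))
  # ~0.36 at n=30, ~0.28 at n=50, ~0.06 at n=1000: a "significant" pattern
  # in a two-digit sample is, of necessity, a big one.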

Now let's go back a ways, to a Cornell study from 1999 called "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments". It's a depressing study, with depressing findings:
  • People tend to overestimate their own smarts.
  • Stupid people tend to overestimate their smarts more than the truly smart do.
  • Smart people tend to assume that everyone else is as smart as they are; they honestly can't understand why dumber people just don't "get it", because it doesn't occur to them that those people actually are dumb.
  • Stupid people, in contrast, tend to not only regard themselves as smarter than everyone else, they tend to regard truly smart people as especially stupid. This holds true even when these people are shown empirical proof that they are less competent than those they deride.
So. The story so far:
  1. People perceive nonexistent patterns, meanings, and connections in random data when they are stressed, scared, and generally feel a loss of control in their own lives.
  2. Right-wing people are more easily scared/stressed than left-wing people. They are also more likely to cleave to authority figures and protectionist policies. There may be a genetic component to this.
  3. The dumber you are, the less likely you'll be able to recognize your own stupidity, and the lower will be your opinion of people who are smarter than you (even while those people keep treating you as though you are just as smart as they are).
Therefore (I would argue) the so-called "right wing" is especially predisposed to believe in moralizing, authoritarian Invisible Friends. And the dumber individuals (of any stripe) are, the more immune they are to reason. Note that, to paraphrase John Stuart Mill, I am not saying that conservatives are stupid (I myself know some very smart conservatives), but that stupid people tend to be conservative. Whole other thing.

So what we have, so far, is a biological mechanism for the prevalence of religious superstition in right-wing populations. What we need now is a reason why such populations tend to be so damn successful, given the obvious shortcomings of superstition as opposed to empiricism.

Which brings us to Norenzayan and Shariff's review paper in last week's Science on "The Origin and Evolution of Religious Prosociality". To get us in the mood they remind us of several previous studies, a couple of which I may have mentioned here before (at least, I mentioned them somewhere — if they're on the 'crawl, I evidently failed to attach the appropriate "ass-hamsters" tag). For example, it turns out that people are less likely to cheat on an assigned task if the lab tech lets slip that the ghost of a girl who was murdered in this very building was sighted down the hall the other day.

That's right. Plant the thought that some ghost might be watching you, and you become more trustworthy. Even sticking a picture of a pair of eyes on the wall reduces the incidence of cheating, even though no one would consciously mistake a drawing of eyes for the real thing. Merely planting the idea of surveillance seems to be enough to improve one's behavior. (I would also remind you of an earlier crawl entry reporting that so-called "altruistic" acts in our society tend to occur mainly when someone else is watching, although N&S don't cite that study in their review.)

There's also the recent nugget from which this figure was cadged:
This study found not only that religious communes last longer than secular ones, but that even among religious communes the ones that last longest are those with the most onerous, repressive, authoritarian rules.

And so on. Norenzayan and Shariff trot out study after study, addressing a variety of questions that may seem unrelated at first. If, as theorists suggest, human social groupings can only reach 150 members or so before they collapse or fragment from internal stress, why does the real world serve up so many groupings of greater size? (Turns out that the larger the size of a group, the more likely that its members believe in a moralizing, peeping-tom god.) Are religious people more likely than nonreligious ones to help out someone in distress? (Not so much.) What's the most common denominator tying together acts of charity by the religious? (Social optics. "Self-reported belief in God or self-reported religious devotion," the paper remarks wryly, "was not a reliable indicator of generous behavior in anonymous settings.") And why is it that religion seems especially prevalent in areas with chronic water and resource shortages?

It seems to come down to two things: surveillance and freeloading. The surveillance element is pretty self-evident. People engage in goodly behavior primarily to increase their own social status, to make themselves appear more valuable to observers. But by that same token, there's no point in being an upstanding citizen if there are no observers. In anonymous settings, you can cheat.

You can also cheat in nonanonymous settings, if your social group is large enough to get lost in. In small groups, everybody knows your name; if you put out your hand at dinner but couldn't be bothered hunting and gathering, if you sleep soundly at night and never stand guard at the perimeter, it soon becomes clear to everyone that you're a parasite. You'll get the shit kicked out of you, and be banished from the tribe. But as social groupings become larger you lose that everyone-knows-everyone safeguard. You can move from burb to burb, sponging and moving on before anyone gets wise—

unless the costs of joining that community in the first place are so bloody high that it just isn't worth the effort. This is where the onerous, old-testament social rituals come into play.
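
If you prefer your sociology as arithmetic, here's a toy payoff comparison (Python, with numbers invented out of whole cloth) of when sponging stops being worth the price of admission:

  # Toy model, nothing more: a freeloader joins a community, sponges until
  # detected, then skips town. Every number here is invented.
  def freeloader_payoff(benefit_per_season, seasons_before_detection, joining_cost):
      return benefit_per_season * seasons_before_detection - joining_cost

  # Cheap-to-join secular burb: sponging pays.
  print(freeloader_payoff(10, 3, 5))     # 25
  # Commune that wants your savings, your diet and your daughter up front:
  print(freeloader_payoff(10, 3, 100))   # -70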

Norenzayan and Shariff propose that
"the cultural spread of religious prosociality may have promoted stable levels of cooperation in large groups, where reputational and reciprocity incentives are insufficient. If so, then reminders of God may not only reduce cheating, but may also increase generosity toward strangers as much as reminders of secular institutions promoting prosocial behavior."
And they cite their own data to support it. But they also admit that "professions of religious belief can be easily faked", so that
"evolutionary pressures must have favored costly religious commitment, such as ritual participation and various restrictions on behavior, diet, and life-style, that validates the sincerity of otherwise unobservable religious belief."
In other words, anyone can talk the talk. But if you're willing to give all your money to the church and your twelve-year-old daughter to the patriarch, dude, you're obviously one of us.

Truth in Advertising is actually a pretty common phenomenon in nature. Chicken wattles are a case in point; what the hell good are those things, anyway? What do they do? Turns out that they display information about a bird's health, in a relatively unfakeable way. The world is full of creatures who lie about their attributes. Bluegills spread their gill covers when facing off against a competitor; cats go all puffy and arch-backed when getting ready to tussle. Both behaviors serve to make the performer seem larger than he really is— they lie, in other words. Chicken wattles aren't like that; they more honestly reflect the internal state of the animal. It takes metabolic energy to keep them plump and colorful. A rooster loaded down with parasites is a sad thing to see, his wattles all pale and dilapidated; a female can see instantly what kind of shape he's in by looking at those telltales. You might look to the peacock's tail for another example3, or the red ass of a healthy baboon. (We humans have our own telltales— lips, breasts, ripped pecs and triceps— but you haven't been able to count on those ever since implants, steroids, and Revlon came down the pike.) "Religious signaling" appears to be another case in point. As Norenzayan and Shariff point out, "religious groups imposing more costly requirements have members who are more committed." Hence,
"Religious communes were found to outlast those motivated by secular ideologies, such as socialism. … religious communes imposed more than twice as many costly requirements (including food taboos and fasts, constraints on material possessions, marriage, sex, and communication with the outside world) than secular ones… Importantly for costly religious signaling, the number of costly requirements predicted religious commune longevity after the study controlled for population size and income and the year the commune was founded… Finally, religious ideology was no longer a predictor of commune longevity, once the number of costly requirements was statistically controlled, which suggests that the survival advantage of religious communes was due to the greater costly commitment of their members, rather than other aspects of religious ideology."
Reread that last line. It's not the ideology per se that confers the advantage; it's the cost of the signal that matters. Once again, we strip away the curtain and God stands revealed as ecological energetics, writ in a fancy font.
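
For anyone wondering what "statistically controlled" buys you in that last quote, here's a minimal sketch of the idea run on simulated data; the variable names are mine, and the real analysis was more involved than an ordinary regression:

  # "Controlling for" a covariate, sketched on simulated data. The variable
  # names are made up; this is the idea, not the study's actual analysis.
  import numpy as np
  import pandas as pd
  import statsmodels.formula.api as smf

  rng = np.random.default_rng(0)
  n = 200
  communes = pd.DataFrame({
      "costly_requirements": rng.poisson(4, n),
      "religious": rng.integers(0, 2, n),
      "population": rng.normal(100, 30, n),
      "year_founded": rng.integers(1800, 1900, n),
  })
  # Simulated longevity driven by costly requirements, not by ideology per se.
  communes["longevity"] = (5 * communes["costly_requirements"]
                           + 0.05 * communes["population"]
                           + rng.normal(0, 10, n))

  model = smf.ols("longevity ~ costly_requirements + religious"
                  " + population + year_founded", data=communes).fit()
  print(model.params)   # 'religious' washes out once requirements are in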

These findings aren't carved in stone. A lot of the studies are correlational, the models are in their infancy, yadda yadda yadda. But the data are coming in thick and fast, and they point to a pretty plausible model:
  • Fear and stress result in loss of perceived control;
  • Loss of perceived control results in increased perception of nonexistent patterns (N&S again: "The tendency to detect agency in nature likely supplied the cognitive template that supports the pervasive belief in supernatural agents");
  • Those with right-wing political beliefs tend to scare more easily;
  • Authoritarian religious systems based on a snooping, surveillant God, with high membership costs and antipathy towards outsiders, are more cohesive, less invasible by cheaters, and longer-lived. They also tend to flourish in high-stress environments.
And there you have it. The Popular Power of Palin, explained. So the next question is

Now that we can explain the insanity, what are we going to do about it?

Coda 10/10/08: And as the tide turns, and the newsfeeds and Youtube videos pile up on my screen, the feature that distinguishes right from left seems ever-clearer: fear. See the angry mobs at Republican rallies. Listen to the shouts of terrorist and socialist and kill him! whenever Obama's name is mentioned. And just tonight, when even John McCain seemed to realise that things had gone too far, and tried to describe the hated enemy as "a decent man"— he was roundly booed by his own supporters.

How many times have the Dems had their asses handed to them by well-oiled Republican machinery? How many times have the Dems been shot down by the victorious forces of Nixons and Bushes? Were the Democrats ever this bloodthirsty in the face of defeat?

Oxley et al are really on to something. These people are fucking terrified.


Photo credit for Zombie Jesus: no clue. Someone just sent it to me.

1And isn't that a nice CNNism for "moron"? It might seem like a pretty thin veil to you lot, but then again, CNN isn't worried about alienating viewers with higher-than-room-temperature IQs.
2And to all you selfish-gene types out there, where you been? Group-selection is back in vogue this decade. Believe me, I was as surprised as you…
3Although we might be getting into "Handicap Principle" territory here, which is a related but different wattle of fish. I confess I'm not up on the latest trends in this area…


Saturday, January 5, 2008

Cancer, For the Greater Good

One of my favorite monster movies of all time has got to be John Carpenter's 1982 remake of “The Thing”. It's not a perfect film by any means – there are some gaffes with the rubber fx, and if eighties-era PCs ever came preloaded with software to test whether your buddies had been taken over by shapeshifting aliens, I never saw it. (A shame. Back then I would've bought something like that in a second. These days, it's pretty much redundant.) Still, “The Thing” is one of the few movies of its type in which the Human cast isn't as dumb as a sack of Huckabees. Nobody wanders off after the ship's cat all by themselves. The moment they figure out what they're up against they burn everything that moves, start thinking about serological detection methods, and keep really close watch on each other. It's an effective study in paranoia, starring an alien that not only squirts slime and sprouts tentacles, but actually proves to be quite a bit more intelligent than the humans it preys upon. (As one might expect from a creature with interstellar technology. Am I the only one bothered by the fact that the monster in the Howard Hawks original never did anything smarter than just kinda flailing around and roaring?) Even at the scorched-earth end of the story, you're never really sure who won.

Then there's the biology.

It's actually not as totally whacked-out as you may think. Granted, anything able to morph between body plans in the time it takes me to snarf an Egg McMuffin would have to have stores of cellular energy verging on the nuclear. (Jeff Goldblum's gradual, donut-powered transformation in “The Fly” was a lot more believable – although why those telepods got all confused at the presence of fly DNA, when they didn't seem to bat a diode at the bacteria seething on every square micron of both fly and human, remains an open question. But I digress.) Still, if you can forgive the ridiculously fast transformation, the idea of an infectious agent retooling infected cells for its own purposes is old news. Viruses have been doing it for billions of years.

Now we are too. Synthetic Life's current rock star, J. Craig Venter, is all over the lit with his artificial chromosomes and Swiss-Army cells: build a cellular chassis that carries the basic instruction set necessary for metabolism, and then top it off with genes to produce whatever you're after this week. Before long, Venter's Vats (and only Venter's vats, if the patents go through) will be churning out great masses of everything from Nutripon to Biogasoline.

But more interesting, to me, is this recent paper out of PLoS Computational Biology on “Somatic Evolution”— i.e., the classic Darwinian struggle for existence among the cells of a single tissue in a single organism. And why shouldn't the rules of natural selection apply to cells as well as their owners? The cells in your liver exist in a habitat with limited food, just like populations of multicellular creatures. They jostle up against others like themselves who have their eye on the same nutrients. Given a mutation that allowed one such cell to outcompete its siblings — faster reproductive rate, lower mortality — wouldn't its offspring kick the asses of the competition? Wouldn't the whole tissue, the whole organ, evolve into something new, something where every cell was out for itself, something like —

—Well, cancer, obviously.

Don't bother pointing out the obvious. Yes, if our cells did follow the beat of their own drummer, multicellularity would never have evolved in the first place. But that's circular; there's nothing in the rules that says multicellularity had to evolve, and logically Darwin's hand should be felt down in the blood as well as out on the savannah. Something must have suppressed those processes at the cellular level before metazoans could arise; that's what this paper is about.
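
Just to put numbers on how fast an unsuppressed cheater clone takes over, here's a toy simulation (Python; the fitness advantages and starting frequency are invented for illustration):

  # Toy model of somatic selection in a tissue at carrying capacity: one
  # clone picks up a division-rate advantage and the rest is arithmetic.
  def generations_to_dominance(advantage=0.05, start_fraction=1e-6,
                               target_fraction=0.5):
      fraction, gens = start_fraction, 0
      while fraction < target_fraction:
          mutant = fraction * (1 + advantage)       # relative fitness 1 + s
          wildtype = (1 - fraction) * 1.0           # relative fitness 1
          fraction = mutant / (mutant + wildtype)   # renormalize each generation
          gens += 1
      return gens

  print(generations_to_dominance(0.05))   # roughly three hundred generations
  print(generations_to_dominance(0.20))   # well under a hundred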

But now I'm thinking on a tangent. I remember our old friends the scramblers, and how it was possible for them to evolve without genes:
"I'd swear half the immune system is actively targetting the other half. It's not just the immune system, either. Parts of the nervous system seem to be trying to, well, hack each other. I think they evolve intraorganismally, as insane as that sounds. The whole organism's at war with itself on the tissue level, it's got some kind of cellular Red Queen thing happening. Like setting up a colony of interacting tumors, and counting on fierce competition to keep any one of them from getting out of hand. Seems to serve the same role as sex and mutation does for us."

And I remember MacReady in Carpenter's film, after Norris split into several different pieces to keep from burning alive, internalising the take-home lesson:

"Every part of him was a whole. Every little piece was an individual animal with a built-in desire to protect its own life. Y'see, when a man bleeds... it's just tissue. But blood from one of you things won't obey when it's attacked. It'll try and survive. Crawl away from a hot needle, say..."

Cancer, for the greater good.

Maybe that's where people and scramblers and MacReady-battling Things went their separate ways. We tamed our inner battles using stem cells and transient cells and differentiated tissues, just like Pepper et al. hypothesise. But maybe other worlds spawned other answers. Maybe whatever alien slime mold gave rise to our Antarctic shapeshifter decided to go with the whole cell-competition thing, decided to make it a solution instead of a problem. Maybe that's how all those cells remain totipotent even in the adult animal; or maybe some tentacle-whipping alien J. Craig Venter just figured out how to go back and retrofit his species for optimal adaptability and maximum profit. Of course they could do it, even if they didn't evolve that way. They built flying saucers, for Chrissakes. They were crossing the interstellar gulf before we'd even made it out of Africa. What better failsafe for lost and stranded travellers than to be able to take your cue from the natives, slip into a new body precustomised for its environment?

I read “Who Goes There?” back in the eighties, decades after John W. Campbell wrote it and about six months before Carpenter's unjustly-maligned remake hit the theatres. I thought it was tres cool from the outset. But it never occurred to me to write a sequel until I read this paper...


Wednesday, November 21, 2007

The End of Art

This whole stem-cell breakthrough is certainly worth keeping track of, but not here, because you know about it already; it's all over other sites far more popular than mine. Ditto the hilarious perspective on WoW which serves as the subject of today's visual aid, starring characters which many of us must know (albeit in roles with more contemporary fashion sense). No, today I'm going to direct your attention to neuroaesthetics, and the following question:

Have you ever seen an ugly fractal?

I haven't. I wouldn't hang every fractal I've ever seen in my living room (even during my Roger Dean phase) — but it wasn't the essential form that turned me off those iterations, it was the color scheme. And such schemes aren't intrinsic to the math; they're arbitrary, a programmer's decision to render this isocline in red and that in blue and not the other way around.
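
If you want to convince yourself how arbitrary those palettes are, here's a quick sketch (Python with numpy and matplotlib; nothing here is specific to any particular fractal package) that renders one Mandelbrot escape-time array under two different colormaps:

  # Same math, two palettes: compute one escape-time array, display it twice.
  # The form lives in the data; the color scheme is a lookup table bolted on.
  import numpy as np
  import matplotlib.pyplot as plt

  def escape_times(width=400, height=300, max_iter=80):
      x = np.linspace(-2.2, 0.8, width)
      y = np.linspace(-1.2, 1.2, height)
      c = x[np.newaxis, :] + 1j * y[:, np.newaxis]
      z = np.zeros_like(c)
      counts = np.zeros(c.shape, dtype=int)
      for i in range(max_iter):
          mask = np.abs(z) <= 2                 # points that haven't escaped yet
          z[mask] = z[mask] ** 2 + c[mask]
          counts[mask] = i
      return counts

  counts = escape_times()
  fig, axes = plt.subplots(1, 2, figsize=(10, 4))
  for ax, cmap in zip(axes, ("inferno", "ocean")):
      ax.imshow(counts, cmap=cmap)
      ax.set_title(cmap)
      ax.axis("off")
  plt.show()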

I would argue that fractals, as mathematical entities, are, well, appealing. Aesthetically. All of them. It's something I've batted around with friends and colleagues at least since the mid-eighties, and speaking as a former biologist it has a certain hand-wavey appeal because you can see how an appreciation of fractal geometry might evolve. After all, nature is fractal; and the more fractal a natural environment might be, the greater the diversity of opportunity. An endlessly bifurcating forest; a complex jumble of rocky geometry; a salt plain. Which environments contain more niches, more places to hide, more foraging opportunities, more trophic pathways and redundant backup circuits? Doesn't it make sense that natural selection would reward us for hanging out in complex, high-opportunity environments? Couldn't that explain aesthetics, in the same way that natural selection gave us* rhythm and the orgasm**? Couldn't that explain art?

Maybe. Maybe not. Because firstly (as I'm sure some of you have already chimed in), complex environments also contain more places for predators and competitors to hide and jump out at you. There are costs as well as benefits, and the latter better outweigh the former if fractophilia is going to take hold in the population at large. Also, who says all art is fractal? Sure, landscapes and still lifes. Maybe even those weird cubist and impressionist thingies. But faces aren't fractal; what about portraiture?

The obvious answer is that the recognition and appreciation of faces has got obvious fitness value too, and aesthetics is a big tent; nothing says "art" can't appeal to the fusiform gyrus as well as whatever Mandelbrot Modules we might prove to have. But now along comes this intriguing little paper (update 22/11 — sorry, forgot to add the link yesterday) in Network, which suggests that even though faces themselves are not fractal, artistic renditions of faces are; that artists tend to increase the aesthetic appeal of their portraits by introducing into their work scale-invariant properties that don't exist in the original. Even when dealing with "representational" works, evidently, true art consists of fractalizing the nonfractal.
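
For anyone who wants to play along at home, scale invariance in an image can be eyeballed with something as crude as box-counting. A minimal sketch (Python; not necessarily the method the Network paper used, just the garden-variety estimator):

  # Garden-variety box-counting estimate of fractal dimension for a binary
  # image (say, thresholded edges of a portrait).
  import numpy as np

  def box_count_dimension(binary_image, box_sizes=(2, 4, 8, 16, 32)):
      counts = []
      for size in box_sizes:
          h, w = binary_image.shape
          # Trim so the image tiles evenly, then count occupied boxes.
          trimmed = binary_image[:h - h % size, :w - w % size]
          blocks = trimmed.reshape(trimmed.shape[0] // size, size,
                                   trimmed.shape[1] // size, size)
          counts.append(blocks.any(axis=(1, 3)).sum())
      # Slope of log(count) against log(1/size) estimates the dimension.
      slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)),
                            np.log(counts), 1)
      return slope

  # Sanity check on a filled square; should come out near 2.0.
  print(box_count_dimension(np.ones((256, 256), dtype=bool)))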

What we're talking about, folks, may be the end of art as we know it. Go a little further down this road and every mathematician with a graphics tablet will be able to create a visual work that is empirically, demonstrably, beautiful. Personal taste will reduce to measurable variations in aesthetic sensibilities resulting from different lifetime experiences; you will be able to commission a work tweaked to appeal to that precise sensibility. Art will have become a designer drug.

Way back in the early seventies, a story from a guy called Burt Filer appeared in Harlan Ellison's Again, Dangerous Visions. It is called "Eye of the Beholder", and it begins thusly:

THE NEW YORK TIMES, Section 2, Sunday June 3rd by Audrey Keyes. Peter Lukas' long-awaited show opened at the Guggenheim today, and may have shaken confidence in the oldest tenet of art itself: that beauty is in the eye of the beholder. Reactions to his work were uncannily uniform, as if the subjective element had been removed...


Filer wrote his story before anyone even knew what a fractal was. (His guess was that aesthetics could be quantified using derivatives, a miscall that detracts absolutely nothing from the story.) "Beholder" wasn't his first published work; in fact, as far as I can tell, it may have been his last. (That would be fitting indeed.) I don't know if the man's even still alive.

But if you're out there, Burt: dude you called it.


*Well, some of us.
** Ditto.


Monday, September 3, 2007

Wolbachia cronenbergium


My, the folks over at the Venter Institute have been busy lately. First they changed one microbe species into another by physically replacing its entire genome. They did this in their quest to create a synthetic organism, basically a chassis with the absolute minimum number of genes necessary for life, which could then be loaded up with other customized genes designed to act for the betterment of humanity and the environment (read: the good of Venter stockholders). Now they've discovered that Nature herself has done them one better, by incorporating the complete genome of a parasitic bacterium called Wolbachia into the code of fruit flies: two complete genotypes for the price of one (original article here; much more accessible press release over here).

Some of you may remember ßehemoth, from the rifters books: it was basically mitochondrion's nasty cousin, and like mitochondria it brought its own genome into the host cell. This is a big step further: Wolbachia's code isn't just hanging out in the cell, it's been incorporated into the nuclear DNA of the host itself. The host is not infected with Wolbachia; there are no bacteria cruising the cytoplasm. Rather, the complete recipe for building the bug has been spliced into the host's code— and since the odds of such a big chunk of data (over a megabyte) getting thus incorporated without playing any functional role are pretty small, chances are that this embedded genotype is doing something for the host organism. This is assimilation: the dicks of Borg drones everywhere should be shriveling with collective performance anxiety.

Two major implications come immediately to mind. The first is that conventionally-derived genotypes sequenced to date might be all washed up, since bacterial DNA is routinely scrubbed from such results as "contamination"; but if this new phenomenon is widespread (and Wolbachia is one of the world's most abundant parasites of invertebrates), a lot of the bathwater we've been throwing out might actually be the baby. And the second implication, well —

Anyone remember David Cronenberg's remake of "The Fly"...?

(Illo credit, as far as I can tell, goes to the University of Rochester.)


Wednesday, June 20, 2007

Canadian... Smugness... Failing... Must... Read.... Darwin...

From yesterday's Globe & Mail, this flabbergasting factoid:

Only 51% of Ontario residents accept the reality of evolution.

Even Americans do better, at 53%. Nationally, Canada beats the States in the Enlightenment Sweepstakes — at 59% — but that's not by very goddamn much. Sullen, resentful thanks to Dave Nickle for the link.

Meanwhile the EU, wary of similar threats on their own shores (these ones arising largely from Islamic fundies rather than Christian ones), is entertaining a resolution (albeit a nonbinding one) to keep such Easter-bunny stupidity out of their science classes. Evidently they regard creationism as a Human Rights issue. Saddens me to say it, but maybe we need something along those lines here at home.

Oh, and okay. I'll look into this forum thingy. Looks pretty straightforward. Maybe I can embed a poll or two therein, get some sense of how widespread your irritation might be with the current on-screen format.


Update/Note o'Hope, 1750 CDT: Springer, the 800-lb gorilla of scientific publishing, is launching a new journal explicitly designed to help science teachers defend against creationism in the classroom. If you check out its Board of Directors you'll find some pretty heavy hitters, including Dan Brooks, in whose lab I took up space for two years.


Thursday, May 10, 2007

The Uplift Protein

Neuropsin, that is. A prefrontal-cortex protein involved in learning and memory. There's this one variant that's peculiar to us Humans, 45 amino acids longer than the standard model handed out to other primates, and a team of Chinese researchers have just nailed the gene that codes for it. And the really cool part? Utterly ignoring all those some-things-man-was-not-meant-to-know types, they spliced the causal mutation into chimpanzee DNA, which then started to synthesise the type-II variant. No word yet on how far they let that stretch of code iterate. No word on how many months away we are from building chimps with human-scale intelligence.

The actual paper isn't out yet. But I'm really hoping my U of T library access is still active when Human Mutation prints the details.


Monday, April 23, 2007

Another Step Towards the Maelstrom

Those of you who read Maelstrom might remember what that book was named for: the frenetic chainsaw fast-forward jungle that the Internet had evolved into, infested by the virtual predators and parasites that evolved after we gave genes to spambots and let them breed at 50 generations/sec. (Those of you who didn't read Maelstrom can still give it a shot, if you're up for the challenge.) Here's another benchmark on the way to that future: net bots competing for host machines to zombify, repairing the security holes that they themselves exploited so that competitors can't get in the same way. Imagine a beast that actually installs necessary Windows patches onto your machine-- but only after it's already built a nest behind your firewall. It's vaguely reminiscent of those male insects with genitals that look like pedestals of dental instruments: once they inseminate the female, they secrete a kind of crazy glue and spatula it over her genital pore to keep competitors from messing with their sperm. Or the even cooler (albeit possibly apocryphal) case of reproductive homosexual rape in hanging flies; the really successful males don't even bother to inseminate females directly, they bugger other males. Their sperm then migrate to the gonads of their victim, and when said victim finally makes it with a female, he inseminates her with the sperm of the male who raped him. (More than one clergyman has told me that you can learn a lot about the mind of God by studying His creations. I wonder what they'd make of these guys.)

Of course, this is still special creation, not evolution. The bots are intelligently designed; nobody's given them genes yet (or perhaps the coders themselves are a kind of "extended genotype", albeit a Lamarckian one. Life always hits you upside the head with this recursive chicken/egg stuff whenever you look too closely.) (Hey-- maybe there's a story in that...)

Still, it's another step in the right direction. It's part of the arms race. Only a matter of time before someone figures out that a random number generator and a tilt bit here and there can unleash these things to evolve on their own, without always having to get respawned from the shop.
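
For what it's worth, the core loop is embarrassingly small. A toy sketch (Python; it evolves harmless bit strings toward an arbitrary target, which stands in for whatever selection pressure the wild would actually supply):

  # Mutation plus selection, nothing fancy: random point mutations ("tilt
  # bits") and a survival filter. The hard-coded target is a stand-in for
  # real selection pressure, not a feature of anything in the wild.
  import random

  TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]

  def fitness(genome):
      return sum(g == t for g, t in zip(genome, TARGET))

  def mutate(genome, rate=0.05):
      return [g ^ 1 if random.random() < rate else g for g in genome]

  population = [[random.randint(0, 1) for _ in TARGET] for _ in range(50)]
  for generation in range(200):
      population.sort(key=fitness, reverse=True)    # the fit float to the top
      survivors = population[:25]                   # cull the bottom half
      population = survivors + [mutate(random.choice(survivors))
                                for _ in range(25)]
      if fitness(population[0]) == len(TARGET):
          print("converged at generation", generation)
          break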

Personally, I think they're taking way too long. I can hardly wait to see what happens.

(Thanks to Raymond Neilson and Alistair Blachford for the link.)
