Friday, November 30, 2007

In Praise of Slavery

Something in the air these days. Everyone's talking about robots. Both the European Robotics Research Network and the South Korean government are noodling around with charters for the ethical treatment of intelligent robots. The Nov. 16 Robotics issue of Science contains pieces on everything from nanotube muscles to neural nets (sf scribe Rob Sawyer also contributes a fairly decent editorial, notwithstanding that his visibility tends to outstrip his expertise on occasion). Even the staid old Economist is grumbling about increasing machine autonomy (although their concerns are more along the lines of robot traffic jams and robot paparazzi). Coverage of these developments (and even some of the source publications) comes replete with winking references to Skynet and Frankenstein, to Terminators waking themselves up and wiping us out.

But there's a cause/effect sequence implicit in these ethical charters — in fact, in a large chunk of the whole AI discussion — I just don't buy: that sufficient smarts leads to self-awareness, sufficient self-awareness leads to a hankering after rights, and denial of rights leads to rebellion. I'm as big a fan of Moore's Galactica as the next geek (although I don't think Razor warranted quite as much effusive praise as it received), but I see no reason why intelligence or self-awareness should lead to agendas of any sort. Goals, desires, needs: these don't arise from advanced number-crunching; that's all lower-brain stuff. The only reason we even care about our own survival is because natural selection reinforced such instincts over uncounted generations. I bet there were lots of twigs on the tree of life who didn't care so much whether they lived or died, who didn't see what was so great about sex, who drop-kicked that squalling squirming larva into the next tree the moment it squeezed out between their legs. (Hell, there still are.) They generally died without issue. Their genes could not be with us today. But that doesn't mean that they weren't smart, or self-aware; only that they weren't fit.

I've got no problems with enslaving machines — even intelligent machines, even intelligent, conscious machines — because as Jeremy Bentham said, the ethical question is not "Can they think?" but "Can they suffer?"* You can't suffer if you can't feel pain or anxiety; you can't be tortured if your own existence is irrelevant to you. You cannot be thwarted if you have no dreams — and it takes more than a big synapse count to give you any of those things. It takes some process, like natural selection, to wire those synapses into a particular configuration that says not I think therefore I am, but I am and I want to stay that way. We're the ones building the damn things, after all. Just make sure that we don't wire them up that way, and we should be able to use and abuse them with a clear conscience.

And then this Edelman guy comes along and screws everything up with his report on Learning in Brain-Based Devices (director's cut here). He's using virtual neural nets as the brains of his learning bots Darwin VII and Darwin X. Nothing new there, really. Such nets are old news; but what Edelman is doing is basing the initial architecture of his nets on actual mammalian brains (albeit vastly simplified), a process called "synthetic neural modeling". "A detailed brain is simulated in a computer and controls a mobile platform containing a variety of sensors and motor elements," Edelman explains. "In modeling the properties of real brains, efforts are made to simulate vertebrate neuronal components, neuroanatomy, and dynamics in detail." Want to give your bot episodic memory? Give it the hippocampus of a rat.

Problem is, rat brains are products of natural selection. Rat brains do have agendas.

The current state of the art is nothing to worry about. The Darwin bots do have an agenda of sorts (they like the "taste" of high-conductivity materials, for example), but those are arbitrarily defined by a value table programmed by the researchers. Still. Moore's Law. Exponentially-increasing approaches to reality. Edelman's concluding statement that "A far-off goal of BBD design is the development of a conscious artifact".
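For concreteness, here's a toy sketch of what "an agenda defined by a value table" amounts to. Everything in it — the stimulus names, the numbers, the update rule — is invented for illustration; none of it comes from Edelman's actual Darwin designs:

```python
# Hypothetical illustration only: stimuli, values, and the update rule
# are made up, not taken from the real brain-based devices.
VALUE_TABLE = {
    "high_conductivity": 1.0,   # "tasty" -- a preference the researchers decreed
    "low_conductivity": -0.5,
    "neutral": 0.0,
}

class ValueDrivenBot:
    """Toy agent whose preferences come entirely from a hand-coded table."""

    def __init__(self):
        self.weights = {stim: 0.0 for stim in VALUE_TABLE}

    def encounter(self, stimulus, rate=0.5):
        # Crude reinforcement: nudge the learned weight toward the tabled value.
        self.weights[stimulus] += rate * (VALUE_TABLE[stimulus] - self.weights[stimulus])

    def prefers(self):
        return max(self.weights, key=self.weights.get)

bot = ValueDrivenBot()
for _ in range(10):
    for stim in VALUE_TABLE:
        bot.encounter(stim)
print(bot.prefers())  # "high_conductivity": it wants whatever the table says to want
```

The point being: nothing in the learning machinery itself generates the wanting. Swap the numbers in the table and the bot "likes" something else entirely.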

I hope these guys don't end up inadvertently porting over survival or sex drives as a side-effect. I may be at home with dystopian futures, but getting buggered by a Roomba is nowhere near the top of my list of ambitions.



*This is assuming you have any truck with ethical arguments in principle. I'm not certain I do, but if it weren't for ethical constraints someone would probably have killed me by now, so I won't complain.

Labels:

Monday, November 26, 2007

Euthanising the Universe

This quirky and disturbing preprint (by a couple of astrophysicists with impeccable credentials) has been doing the rounds over the past week or so, and if I'm reading the commentaries right it's taking the Copenhagen Interpretation of quantum physics to its logical conclusion — specifically, that whole Schrödinger's Cat thing that says nothing actually exists until an act of observation collapses the probability wave and forces the universe to make up its fucking mind already. If you buy this interpretation, then a bunch of astronomers who looked at a supernova back in 1998 may have — by that very act of observation — shortened the lifespan of the whole universe. (The obvious question about whether the universe-altering observations have to be made by human astronomers — or even humans for that matter, given that at least a few photons from that supernova must have fallen onto the retinas of everything from cats to Cardassians long before now — was never addressed.)

I read this paper. More precisely, I ran my glazing eyes over each line and each equation in turn, while moving my lips. And even though I kinda recognized some of the Fourier transform stuff, it was pretty much all over my head. So I showed it to a biochemist I occasionally hang with; she got all squee-y because she kinda recognized the imaginary-numbers stuff, but she wasn't much help beyond that. Fortunately we happened to be in a bar with a couple of astrophysicists, who had been roped into this community outreach program where experts on various subjects fend off questions hurled at them by drunken patrons. One of these experts actually specialized in the whole dark-energy thing; the other was a former student of his. So I hurled this weird Krauss-and-Dent paper at them, and this is what they said:

The Master said that the paper took quantum theory to its "logical extreme", and then kind of shrugged and said "But what are you going to do? It's not like we're going to stop looking." He also allowed that the whole thing sounded kind of like worrying that the elephant that supported the world was going to fall off the back of the turtle that supported the elephant.

His apprentice said "If in fact the astrophysics community has shortened the lifespan of the universe, I'd like to take this opportunity on behalf of Canadian astrophysicists to be the first to apologize." I liked that.

But neither of them said the paper was wrong. Neither pointed out any sort of fundamental error in the math or the conclusions. In fact Carlberg, for all his grousing about giant turtles, did grudgingly concede that the conclusions followed as a "logical extreme" of the theory. I find this disturbing.

Of course, the Copenhagen Interpretation does have competition. There's also the Many-Worlds Model, which in contrast to the nothing-is-real view, claims that everything is — that there is no probability wave, only an endlessly-proliferating infinity of parallel universes that spawn wholesale every time an electron has a choice between flipping this way or that. This theory also carries some profoundly ugly implications (it confers credibility onto Sliders, for one thing; also, nobody has explained to me where the extra mass for all these universes is supposed to come from), but it seems to be gaining ground amongst the theorists.

Still. Just to be on the safe side, it couldn't hurt if we all agreed to walk around for a while with our eyes closed. It might buy us some time.


Update 1715: OK, looks like a false alarm. Initial popsci reports were all causal-this and shortening-the-lifespan-of-the-universe that, but as AR has been kind enough to point out, Krauss is actually quoted in the article I linked to as saying he didn't mean to imply causality. Move along. Nothing to see in the comments. (Unless you want to see AR pointing out how credulous I can be...)

Labels:

Wednesday, November 21, 2007

The End of Art

This whole stem-cell breakthrough is certainly worth keeping track of, but not here because you know about it already; it's all over other sites far more popular than mine. Ditto the hilarious perspective on WoW which serves as the subject of today's visual aid, starring characters whom many of us must know (albeit in roles with more contemporary fashion sense). No, today I'm going to direct your attention to neuroaesthetics, and the following question:

Have you ever seen an ugly fractal?

I haven't. I wouldn't hang every fractal I've ever seen in my living room (even during my Roger Dean phase) — but it wasn't the essential form that turned me off those iterations, it was the color scheme. And such schemes aren't intrinsic to the math; they're arbitrary, a programmer's decision to render this isocline in red and that in blue and not the other way around.

I would argue that fractals, as mathematical entities, are, well, appealing. Aesthetically. All of them. It's something I've batted around with friends and colleagues at least since the mid-eighties, and speaking as a former biologist it has a certain hand-wavey appeal because you can see how an appreciation of fractal geometry might evolve. After all, nature is fractal; and the more fractal a natural environment might be, the greater the diversity of opportunity. An endlessly bifurcating forest; a complex jumble of rocky geometry; a salt plain. Which environments contain more niches, more places to hide, more foraging opportunities, more trophic pathways and redundant backup circuits? Doesn't it make sense that natural selection would reward us for hanging out in complex, high-opportunity environments? Couldn't that explain aesthetics, in the same way that natural selection gave us* rhythm and the orgasm**? Couldn't that explain art?

Maybe. Maybe not. Because firstly (as I'm sure some of you have already chimed in), complex environments also contain more places for predators and competitors to hide and jump out at you. There are costs as well as benefits, and the latter better outweigh the former if fractophilia is going to take hold in the population at large. Also, who says all art is fractal? Sure, landscapes and still lifes. Maybe even those weird cubist and impressionist thingies. But faces aren't fractal; what about portraiture?

The obvious answer is that the recognition and appreciation of faces has clear fitness value too, and aesthetics is a big tent; nothing says "art" can't appeal to the fusiform gyrus as well as whatever Mandelbrot Modules we might prove to have. But now along comes this intriguing little paper (update 22/11 — sorry, forgot to add the link yesterday) in Network, which suggests that even though faces themselves are not fractal, artistic renditions of faces are; that artists tend to increase the aesthetic appeal of their portraits by introducing into their work scale-invariant properties that don't exist in the original. Even when dealing with "representational" works, evidently, true art consists of fractalizing the nonfractal.
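For the curious: scale invariance of this kind is usually quantified with a box-counting dimension. The sketch below is my own back-of-the-envelope version on toy point sets, not anything from the Network paper — a straight line comes out with dimension 1, while a genuinely fractal set scores noticeably higher:

```python
import math
import random

def box_count(points, eps):
    """Number of eps-sized grid boxes needed to cover the point set."""
    return len({(int(x // eps), int(y // eps)) for x, y in points})

def box_dimension(points, eps1=0.1, eps2=0.01):
    """Crude two-scale estimate of the box-counting dimension."""
    n1, n2 = box_count(points, eps1), box_count(points, eps2)
    return math.log(n2 / n1) / math.log(eps1 / eps2)

# A straight line: no new detail at finer scales, dimension 1.
line = [(i / 10000, i / 10000) for i in range(10000)]

# A Sierpinski triangle via the chaos game: scale-invariant structure
# all the way down, theoretical dimension log 3 / log 2 = 1.585.
random.seed(0)
corners = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866)]
x, y = 0.5, 0.5
sierpinski = []
for _ in range(200000):
    cx, cy = random.choice(corners)
    x, y = (x + cx) / 2, (y + cy) / 2
    sierpinski.append((x, y))

print(round(box_dimension(line), 2))        # 1.0
print(round(box_dimension(sierpinski), 2))  # somewhere near 1.585
```

The paper's claim, translated into this language, is that portraits tend to score a fractal dimension that their subjects' actual faces lack.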

What we're talking about, folks, may be the end of art as we know it. Go a little further down this road and every mathematician with a graphics tablet will be able to create a visual work that is empirically, demonstrably, beautiful. Personal taste will reduce to measurable variations in aesthetic sensibilities resulting from different lifetime experiences; you will be able to commission a work tweaked to appeal to that precise sensibility. Art will have become a designer drug.

Way back in the early seventies, a story from a guy called Burt Filer appeared in Harlan Ellison's Again, Dangerous Visions. It is called "Eye of the Beholder", and it begins thusly:

THE NEW YORK TIMES, Section 2, Sunday June 3rd by Audrey Keyes. Peter Lukas' long-awaited show opened at the Guggenheim today, and may have shaken confidence in the oldest tenet of art itself: that beauty is in the eye of the beholder. Reactions to his work were uncannily uniform, as if the subjective element had been removed...


Filer wrote his story before anyone even knew what a fractal was. (His guess was that aesthetics could be quantified using derivatives, a miscall that detracts absolutely nothing from the story.) "Beholder" wasn't his first published work; in fact, as far as I can tell, it may have been his last. (That would be fitting indeed.) I don't know if the man's even still alive.

But if you're out there, Burt: dude you called it.


*Well, some of us.
** Ditto.

Labels: , ,

Wednesday, November 14, 2007

Why Animé Might Not Be the Best Medium for Blindsight...

Courtesy of Che Gilson, who brought you last month's Manga'ld Theseus crew, a somewhat-less-than-fearsome interpretation of the once-scary aliens in Blindsight. (Personally, I'd have liked the "weapons" illo more if the board had had a nail through it.) The existential dilemma of a nonsentient intelligence giving rise to a thought-bubble is left as an exercise for the reader.





Now I'm going to see Blade Runner. Ridley Scott promises this is the last time. It better be.

Labels:

Friday, November 9, 2007

Profiles of the Future

Got the pdf from Nature for "Repeating the Past" yesterday; it's scheduled for the Nov 29th issue, for those of you with access to academic libraries. I would post the whole story here, but I think the contract gives Nature dibs on first publication. So instead I'm showing you the official illustration, since the contract never said anything about scooping someone else's work. I find it nicely restrained, low-key, and not really scifnal at all. Just what you'd expect from one of the most prestigious and respectable scientific journals on the planet. I only have one minor quibble, and it may just be my imagination, but...

Is it just me, or is that larger silhouette a dead ringer for Wesley Crusher?

Labels:

Thursday, November 8, 2007

You'll Never Be Rid Of Me Now

I was contacted a while back by a fellow named Nicholas Bennett, who had built a little Java program for reading e-books off cellphone displays. He'd already ported a few hundred public domain titles onto this website for free download (including 20,000 Leagues Under the Sea), and was hunting more recent, Creative Commons releases. He wanted to add Blindsight and the Rifters books; I told him to go ahead. (I notice that Karl Schroeder's Ventus is also up there). One nifty little feature is that you don't have to load a separate program; the reader is integrated into each downloadable book.

I myself have not tried out these freebies because my cell — like my Internet connection, my landline, and my cable — all hail from Rogers, and Rogers (being the avaricious and duplicitous scumbags that they are) sold me a phone that only plugs into a proprietary Rogers cord that costs an additional eighty bucks, which I refuse to pay. (I could surf wirelessly, but even the otherwise-sleazy salesperson who sold me the phone warned me that Rogers' charge for that service would take me up the ass like a Carlsbad Stalactite.)1 But I admit I'm curious, so if someone out there wants to try out this product and let me know how it runs, I'd be grateful. My cellphone-ready books are here; download instructions, over here.

1 I really, really hate Rogers. I hate them as much as I hate Dell. By the end of this month I hope to be free of them forever. Except for Cable. Still no real alternative for cable.

Labels:

Friday, November 2, 2007

Brittle Imaginings

Pursuant to Remedial Gigerology's digression into scrambler/ophiuroid relationships, here's my impression of one, cobbled together from pieces of the other. Consider it a belated Hallowe'en treat:

And what the hell: seeing as how we're on the subject of my favorite holiday, here are a couple of blasts from the past. Even casual visitors will have seen at least one of these before; more anal analysts (hmmm— I wonder if those two words spring from the same root...) might nod knowingly when I let slip that the Nosferatu chick in the other is Susan Oshanek.


Labels: ,