The End of War Crimes

We’ve talked about free will on these screens before. We’ve referred to consciousness as the pointy-haired boss who takes credit for decisions made endless milliseconds before it was even aware of them; to tumors that turn people into pedophiles and do violence to the very concept of “culpability”; to military hardware that bypasses conscious thought entirely and takes its cues from the far-faster processes of the visual cortex. We’ve gone down that inevitable what-if road (scroll down to June 30 in the right-hand column): if I’m not responsible for my behavior when my brain was hacked by a tumor, why should I be held responsible for my behavior at other times? Is a “normal” individual any more responsible for the wiring of his brain than a sociopath? In a world of mechanical minds, how can anyone be held accountable for anything?

Such arguments are both unassailable and inadmissible: unassailable because the very concept of “free will” doesn’t make sense in a cause-effect world unless one invokes supernatural elements, and inadmissible because society has to pretend that it does regardless. Our whole legal system is predicated on the culpability of individuals for their actions: we’re not going to throw out our whole worldview just because science shows us that it’s wrong.

As it turns out, though, I spoke far too soon on that score; I gave the legal profession too little credit. Because, of course, not being responsible for one’s actions is a venerable favorite amongst defense lawyers whose mass-murdering clients have been caught red-handed; and “not guilty by reason of predestination” works even better than “not guilty by reason of insanity” in some cases.

High-tech war crimes using neuroelectronic weaponry, for example.

An article over on Technovelgy pointed me to “Brave New World: Neurowarfare and the Limits of International Humanitarian Law”, by Stephen E. White. Thirty-four heavily-footnoted pages of legal opinion on the implications of brain/weapons interfaces on the battlefield. I learned a bunch of neat things poking around in there (did you know that the Geneva Conventions include a protocol that proscribes the use of weapons which could cause “widespread, long-term and severe damage to the natural environment”?), but primarily I was reminded that to be found guilty of a criminal act, you have to have intended to commit that act. And since certain toys already in DARPA’s stockpile act on preconscious neural cues, it’s easy to imagine a scenario in which a soldier massacres unarmed civilians — using weaponry that responds directly to the activity of her own brain — and yet remains innocent of any crime for the simple reason that she had no conscious intent.

The most obvious application of this involves any interface (such as these brainoculars) that improves reaction time by bypassing the conscious thought process. The whole point of the exercise is to act faster than you can think, faster than the conscious self can veto¹. But there is other technology that acts not by bypassing conscious thought but by anticipating it, by acting on predicted intent. As White says:

“…a computer can make a correct prediction of what a subject will do 71% of the time by analyzing the electrical activity generated by the subject’s medial prefrontal cortex when he or she makes a decision. Theoretically, a brain-machine interface weapon could fire a weapon based on such a predictive response, thereby making it uncertain whether or not a volitional act actually took place.

“…a brain-interface guided weapon could circumvent the pilot’s normal volitional processing signals and rely solely on the recognition activity, thereby making it impossible for courts to determine whether a volitional act occurred before weapon targeting. Alternatively, a brain-interface guided weapon could employ a combination of sensory recognition of the pilot’s incipient volitional thought and probabilistic software calculations in such a way that a prosecutor could never definitively prove anything more than the most attenuated guilt for misdirected attacks on protected persons.”
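
To make that legal black hole concrete, here’s a toy simulation of a trigger that fires on predicted rather than actual intent. It’s purely a sketch: the decoder, the threshold, and every name in it are invented, and the only number taken from anywhere real is the 71% accuracy figure White cites. Even against an operator who never decides to fire, some engagements end in a discharge, and no individual shot can be traced back to a volitional act:

```python
import random

# Hypothetical sketch only: not a model of any real DARPA system.
# The 71% figure is the decision-prediction accuracy White quotes;
# everything else here is invented for illustration.

PREDICTION_ACCURACY = 0.71  # chance the decoder guesses intent correctly
FIRE_THRESHOLD = 0.5        # weapon fires once predicted intent crosses this

def predict_intent(true_intent: bool) -> float:
    """Simulate a noisy decoder reading preconscious cortical activity.

    Returns an estimated probability that the operator intends to fire.
    With probability PREDICTION_ACCURACY the estimate leans the right
    way; otherwise it leans the wrong way.
    """
    correct = random.random() < PREDICTION_ACCURACY
    leans_toward_firing = true_intent if correct else not true_intent
    return random.uniform(0.5, 1.0) if leans_toward_firing else random.uniform(0.0, 0.5)

def engagement(true_intent: bool) -> str:
    """One engagement: the weapon acts on the prediction, not the intent."""
    fired = predict_intent(true_intent) >= FIRE_THRESHOLD
    if fired and not true_intent:
        return "fired with no volitional act"  # the scenario White worries about
    return "fired, intent present" if fired else "held fire"

if __name__ == "__main__":
    random.seed(42)
    outcomes = [engagement(true_intent=False) for _ in range(1000)]
    misfires = outcomes.count("fired with no volitional act")
    print(f"{misfires}/1000 engagements fired with no intent whatsoever")
```

Run it and roughly three shots in ten happen with no intent behind them at all; crucially, nobody (including a prosecutor) can say which ones.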

Who’s culpable in these cases? A drunk who runs someone over doesn’t get to invoke his own impairment as an alibi; presumably there was some point at which he made a sober, conscious decision to start drinking, knowing the risks. But what if your CO orders you to get behind the wheel impaired? What if you get court-martialled for refusing?

Blame the CO, then. But does the CO have the technical expertise to assess the weaponry under his command? Probably not; the stuff shows up in crates one day, along with an instruction manual and the usual upbeat spin about “game-changing technology” and being “master of your domain”. So charge the geeks who designed the stuff. Some grunt wipes out a village in the blink of an eye? Put the Tophers and the Bryces up against the wall; surely they must have known the limits of their own technology.

All of this stuff is disturbing enough as it is. But White takes the next obvious step down that road. Just as Big Pharma cranks up its prices two days before the new regs kick in, just as the logging industry finds out which forests are slated for protection and then clear-cuts them before that legislation passes, the military now has an incentive: not to limit the technology, not to improve its ability to discriminate foe from friend, but to deploy these weapons as widely as possible:

“…international humanitarian law would create perverse incentives that would encourage the development of an entire class of weapons that the state could use to evade criminal penalties for even the most serious types of war crimes.”

So whatever you do, son, stay jacked in, keep online, because zombies — zombies can never be found guilty.

Silly me, thinking that our legal system was too hidebound to adapt to the latest neurological findings. I shouldn’t have worried: ultimately, we’re still totally fucked.

In the meantime, though, I bet I could wring a story or two out of this.

——————————

¹ Of course, this all depends on your definition of the “self”. If you define the self as the brain with all its running processes, whether conscious or not, then the “self” remains ultimately culpable even in this scenario. Any “I didn’t know!” bleatings by the pointy-haired boss would be true, but irrelevant; after all, no one has ever stayed an execution just because the tapeworm in the condemned’s gut was innocent of wrongdoing. But let us save that discussion for another time.



This entry was posted on Tuesday, November 17th, 2009 at 5:27 pm and is filed under neuro, sentience/cognition.
12 Comments
James
15 years ago

This’d be an interesting twist on the idea of humans being hunted by emotionless machines.

There’s something similar to these zombies being featured in “Dumbspeech”, isn’t there?

Hljóðlegur
15 years ago

Of course, this all depends on your definition of the “self”. If you define the self as the brain with all its running processes, whether conscious or not, then the “self” remains ultimately culpable even in this scenario. Any “I didn’t know!” bleatings by the pointy-haired boss would be true, but irrelevant

And, Thank You. Starting with the definition is my favorite MO because, technically speaking, all of science is supposed to work that way: definitions first, revisions later as the experimental results dictate. Where you draw the borders of items is vitally important to clear thinking and cogent discussion of them.

Also, I like this definition, so of course I approve. I’ve mentioned before that all this free-will stuff, and the when-do-I-perceive-my-decisions-as-having-been-made hoopla, becomes moot if each of us is a city, a community, a corporation, a tiny world. In a corporation, some decisions are made by the top layers of the organization, but many are made independently under general directives from above, and most are made autonomously by lower-level decision makers. Yet seen from the outside, the business acts as a unit, in one direction, with one intent. Mostly.

Same thing here – if the aggregate response of the whole community/being is to shoot a policeman in the head, since we can’t punish parts of the brain separately, the whole being gets the needle. The self remains ultimately culpable in this scenario. Absolutely.

The only way this becomes a problem is if we assume perfect justice when we weigh culpability, and that’s not part of our scenario, unless we want to wander off into evaluations using Abrahamic religions, and NObody wants to discuss religion, right? *smile*

In the meantime, though, I bet I could wring a story or two out of this.

We’d enjoy it.

I shouldn’t have worried: ultimately, we’re still totally fucked.

How did I know you were going to say that? *strokes chin thoughtfully*

Hljóðlegur
15 years ago

One more thought –

Who’s culpable in these cases? A drunk who runs someone over doesn’t get to invoke his own impairment as an alibi; presumably there was some point at which he made a sober, conscious decision to start drinking, knowing the risks. But what if your CO orders you to get behind the wheel impaired? What if you get court-martialled for refusing?

The individual human being is culpable, morally. The soldier who knows that the mechanical device will allow him to kill others with his subconscious is like the drunk who makes the decision to get into the car. He might not have used direct volition to commit the act, but his decision to set in motion events that would result in the death of another was freely made.

Even the soldier who has been ordered by her CO to use the subconscious weapon in war is morally responsible for the death it causes. Legally, isn’t this a Nürnberg situation? “I was only following orders” didn’t cut it for WW2 crimes; why should it for crimes by neuroweapon?

Richard Mason
15 years ago

There’s an obvious analogy to landmines: you bury landmines without the specific intent to kill a particular person, and may never even know whether you killed someone or not. Sometimes you kill the wrong person.

I think the military trend is generally away from landmines, not towards them. But the widespread use of IEDs (by terrorists rather than by soldiers) is arguably a contrary trend.

Michael Grosberg
15 years ago

From a military standpoint, developing such weapons would be a very unwise move. You do NOT want your weapon to misfire (or to fail to fire) just because your soldier had the wrong thought at the wrong moment.

Anyway, when it comes to war crimes, it’s almost always the higher-ups (generals, ministers, heads of state) who are accused, not the foot soldiers. For a head of state, “I didn’t know what the weapon was capable of when I approved its deployment” is hardly a defense.

Keippernicus
15 years ago

Oooh, snap. The boys in Dark Brigade just got upgrades to suppress their own conscious desires in favor of an even twitchier trigger finger.

Thanks for the ideas, there are so many good ones to steal from around here.

Cheers!

trackback

[…] By wildrote The End of War Crimes is an amazing, mind expanding, read over at SF author Peter Watts blog No Moods, Ads or Cutesy […]

Leona
14 years ago

Armies will develop the technology anyway, and will be prepared to take the hit (legal costs, med insurance, etc.) when their mis-wired pawns screw up. If the overall error rate is such that losses can be written off as costs of war, it would make sense for them to deploy the technology regardless.

It’s boring tech though… unless they’re planning to roll this out with full-body Gandamu suits as well. THEN we’re talking. Might just quit the day-job and enlist!

Just kidding, just kidding. I could never (say never) be a killing machine.

Ilya
14 years ago

if I’m not responsible for my behavior when my brain was hacked by a tumor, why should I be held responsible for my behavior at other times? Is a “normal” individual any more responsible for the wiring of his brain than a sociopath? In a world of mechanical minds, how can anyone be held accountable for anything?

Such arguments are both unassailable and inadmissible: unassailable because the very concept of “free will” doesn’t make sense in a cause-effect world unless one invokes supernatural elements, and inadmissible because society has to pretend that it does regardless. Our whole legal system is predicated on the culpability of individuals for their actions: we’re not going to throw out our whole worldview just because science shows us that it’s wrong.

I think this is more a matter of practicality than anything else. If someone’s sociopathic behavior is caused by a tumor and can be decisively ended by removing said tumor, then that’s what society does. If someone else’s sociopathic behavior is caused by a much more subtle structure in their brain, society puts them in jail or executes them because, realistically, what else can you do with them?

If a procedure is ever discovered which safely, reliably and without major side effects, turns a rapist, a murderer, or an embezzler into a stable productive citizen, I think it will be adopted pretty quickly. There will be a lot of resistance, from both Left and Right (for different reasons), but eventually practicality will win out. Key words here are “reliably” and “productive” — turning criminals into brain-dead zombies will not be popular any time soon.

A destructive behavior will be recognized as a disease if and only if it can be actually cured.

John Gilmore
14 years ago

While you’re borrowing story ideas about how governments can get away with horrific crimes…

If the US government developed a substance or microbe that, when put in the drinking water of a major city, would cause the residents to tend to vote for the incumbent, what laws or rights would be violated? None. The courts have already ruled that not only is the government free to force you NOT to take substances that it unilaterally decides to prohibit — it can also force you TO TAKE substances if it finds that your natural mental processes are not to its liking. Check any drug-war opinion for the first, and check Sell v. US for the second. Or http://cognitiveliberty.org. I spent a couple of years funding efforts to change this (before some idiot invents such a substance) — and failed.

Such a substance is not too far outside the bounds of reality. MDMA, for example, suppresses the fear response, which is what makes it useful in treating post-traumatic stress. All it would take is to increase the longing for security, the trust for authority, the submission to “daddy”.

Of course, if patriots rather than totalitarians looked to invent such substances, maybe they’d come up with one that would encourage novelty-seeking, questioning of authority, having a bit more backbone or courage or assertiveness, etc. If we could reduce the incumbency rate in the US Congress to something lower than that of the Chinese Communist Central Committee, without violating any laws or rights, hey, why not?

trackback

[…] that sounds like something that has escaped from the fantasy worlds of video games. Nobody except Peter Watts took much interest in this scholarly piece, which looks at an area where exotic technologies may […]

Gabor Varkonyi
14 years ago

I think that it’s better to punish you severely even in the event of the pointy-haired boss being out of the building. Because, if your unconscious mind works so wonderfully – and I don’t doubt it does – then there’s at least some chance that it might put the future punishment into the equation. Not 100% (but then, even with fully conscious mass murderers it never works 100%, most of them thinking they’ll never be caught), but at least you give society some chance that these soldiers might not fire, because, well, because the unconscious mind knows that it might not be good for it.

I actually think severe punishment might stop some serious sociopaths from committing murder, so I don’t really buy the argument that we shouldn’t punish ill people. Why not?

Actually I think most people (be they sociopaths or just normal people) who go on a rampage or something like that might think of themselves as crazy, and therefore might think they have a large chance of not being put into jail. That might be more true of healthy people (if you can call healthy someone who kills other people he didn’t even know before), who in the end wind up in prison anyway. So maybe if we didn’t have that defense, we could save a few healthy people (and their victims) from themselves, even if we couldn’t save any victims of the true psychopaths.