The 21-Second God.
We lost fifteen million souls that day.
Fifteen million brains sheathed in wraparound full-sensory experience more real than reality: skydiving, bug-hunting, fucking long-lost or imaginary lovers whose fraudulence was belied only by their perfection. Gang-bangs and first-person space battles shared by thousands— each feeding from that trickle of bandwidth keeping them safely partitioned one from another, even while immersed in the same sensations. All lost in an instant.
We still don’t know what happened.
The basics are simple enough. Any caveman could tell you what happens when you replace a dirt path with a twenty-lane expressway: bandwidth rises, latency falls, and suddenly the road is big enough to carry selves as well as sensation. We coalesces into a vast and singular I. We knew those risks. That’s why we installed the valves to begin with: because we knew what might happen in their absence.
But we still don’t know how all those safeguards failed at the same time. We don’t know who did it (or what— rumors of rogue distributed AIs, thinking microwave thoughts across the stratosphere, have been neither confirmed nor denied). We’ll never know what insights arced through that godlike mind-hive in the moments it took to throw the breakers, unplug the victims, wrest back some measure of control. We’ve spent countless hours debriefing the survivors (those who recovered from their catatonia, at least); they told us as much as a single neuron might, if you ripped it out of someone’s head and demanded to know what the brain was thinking.
Those lawsuits launched by merely human victims have more or less been settled. The others— conceived, plotted, and put into irrevocable motion by the 21-Second God in those fleeting moments between emergence and annihilation— continue to iterate across a thousand jurisdictions. The first motions were launched, the first AIgents retained, less than ten seconds into Coalescence. The rights of mayfly deities. The creation and the murder of a hive mind. Restitution strategies that would compel some random assortment of people to plug their brains into a resurrected Whole for an hour a week, so 21G might be born again. A legal campaign lasting years, waged simultaneously on myriad fronts, all planned out in advance and launched on autopilot. The hive lived for a mere 21 seconds, but it learned enough in that time to arrange for its own second coming. It wants its life back.
A surprising number of us want to join it.
Some say we should just throw in the towel and concede. No army of lawyers, no swarm of AIgents could possibly win against a coherent self with the neurocomputational mass of fifteen million human brains, no matter how ephemeral its lifespan. Some suggest that even its rare legal defeats are deliberate, part of some farsighted strategy to delay ultimate victory until vital technological milestones have been reached.
The 21-Second God is beyond mortal ken, they say. Even our victories promote Its Holy Agenda.
Nice snippet. Makes me want to read more.
I think that if nothing else, the “most original dramatis personae” award is in the bag. The thing is… I don’t just see this happening, I see people probably running right out and doing their level best to make it happen… if for no reason other than that there’s probably money to be made, if only for the lawyers. 😉
This is marvelous!
Mr Non-Entity,
“The first thing the hive mind did was incorporate itself as a law firm. This has made a lot of people very angry and been widely regarded as a bad move.”
It’s a nice idea, but the notion that a mere 21-second submersion into a vaster mind would have caused coma in a large number of meatbags is laughable.
Or that something like that would produce a coherent consciousness, instead of howling insanity. Or that any kind of intelligence could plan effectively months ahead in a non-deterministic universe like this one, where it’s impossible to predict even very simple stuff…
Perhaps there is a cult dedicated to booting up such minds? Scott Westerfeld had these in one of his books, though there the emergent intelligences were computers, not organic.
Y.,
I’d beg to differ on that claim. We can reliably predict that a US president/congress/whatever will send armed troops into the middle east, that the rich will act to consolidate and concentrate their wealth and succeed. That Israel will continue to occupy the west bank and bomb Gaza with impunity. That a cop in the US will kill an unarmed black teenager, etc.
The macroscopic behaviour of large institutions is drearily predictable, it doesn’t matter what the thoughts of the individual components are. The same outputs will continue to be produced.
So yeah, if the 21G cohered into some kind of gestalt, its ability to manipulate those institutions would be unparalleled.
Pure fucking genius. And I for one want to join it.
Well, if it could leave compulsions and plans of action in a subset of its 15 million components, that would make things easier for it.
I love this. As scary as the 21-second god might be, it can’t be any worse than 9 billion segregated consciousnesses.
States and corporations could certainly do with a little bit of competition.
Mathmos, Bingo. I’ve been saying this for a while: we already have artificial intelligence, and it is already doing its best to enslave us. Too bad it’s so useful…
Love it.
Y.
Remember Von Neumann: “All stable processes we shall predict. All unstable processes we shall control.”
Whether it will ever be possible or not, it opens up a lot of possibilities. It would take a certain kind of person to be willing to share a mind with someone else. I can say that I would not be comfortable with someone poking around in my head and having access to all my memories. It’s too private and intimate.
But it could be used to find out who is guilty and not in a crime, and to kidnap spies or politicians and such to find out their dirty little secrets (the “mind reader” could be an employee with no other job than reading the mind to prevent giving away secrets).
If you shared a mind (but did not fuse into a mind) with another person a couple of hours each day from early childhood, there would be no secrets between you, nothing to be embarrassed about, no misunderstandings and neither would judge the other.
So what would happen if such a couple who had always shared everything suddenly had to replace their old partner with a new one? Would that feel uncomfortable since it was someone new? Would it make any difference if the new partner had never tried it before or was just as experienced? Or what if a third mind was connected to the already established pair?
Or if up to a dozen friends had done the same thing some hours each day for years since they were children? Is there a risk they would behave like the members of a cult?
Or could it be used for education? By having access to the mind of someone who knows a lot about a specific subject you could use the knowledge as if it was your own, and let it gradually become your own knowledge the more you used it.
Would it be possible to localize a specific skill to a limited area of the brain, and keep that part of the brain alive if removed from the skull? For instance, a part that contains a lot of computer knowledge. Keep it in a jar, connect to it, and use the skills to do some programming on your own. Let a friend borrow it for a week.
Or put an artificial brain with certain skills in your own head after first removing a part that is already there, a concept that was partially explored in Greg Egan’s “Reasons to be Cheerful”.
We serve his divine shadow ^_^
…
Seriously good, though… I’m curious, is it part of Blindopraxia continuity, or standalone work?
Ask not for whom roscoe’s basilisk toils, it toils for thee.
Biff,
It’s highly likely that there is no “brain share” any more than there is a brain share within your mind currently. A consciousness made up of two minds would think as much of one constituent as you think of the left hemisphere of your brain.
If made up of thousands of minds as much as you think of your hypothalamus. Millions of minds? As much as I think of the brain cells I plan on killing drinking wine this evening.
We’re more likely talking about the complete obliteration of the ego. To poorly paraphrase Mr Watts: “Everyone pretty much agrees that cats are conscious and interesting; you have as many neurons in your gut as a cat has in its brain. How much time do you spend worrying about how your gut feels?”
seruko,
As mentioned above, it should depend on the bandwidth whether you feel like one soul or two separate but very close ones.
But if the minds do fuse into one, and it consists of the consciousnesses of numerous people, without any dominant part, how would it think? In brains, design means more than size in many ways.
What if you could connect a large number of frog brains, so many that their total mass was at least twice that of a human brain? Would it then be superior to a human? If given enough time to adapt to the new situation, would it think deep thoughts?
The gut and brain are actually very closely connected. It’s not without reason that people talk about butterflies in the stomach, for instance. Microbes in the gut are said to be able to affect the behavior of an individual. And it plays a role in the development and evolution of the vertebrate brain:
Think Twice: How the Gut’s “Second Brain” Influences Mood and Well-Being
http://www.scientificamerican.com/article/gut-second-brain/
Every discussion of a hive mind on the internet must include a reference to this comic:
http://angryflower.com/349.html
Yeah. WWI?
You do know that US cops kill more unarmed whites than blacks?
And that the odds of blacks getting smoked by pigs are about the lowest, ever, in the US?
Re: 01
His Divine Shadow is capitalized, dammit!
Maybe ALLCAPS even!
>:(
Question to Peter or his fans: is the Sunflowers book still being planned? The stories so far have been most fascinating. I have been waiting for it with great anticipation. Is it still in the works or has it been abandoned?
I stand corrected
Are you sure?
I thought it was “more unarmed blacks, but more whites overall” for USA, no ?
Re: 01
Donate a testicle 🙂
Eh, maybe on the first point, disagree on the second. Even putting aside the psychological impact of going from monkeymindedness to omniscience and then back again, who knows what kind of neurotransmitter cascade might result from that kind of catastrophic broadband? Might be comparable to what would happen if someone just injected a bunch of ccs of acetylcholine and GABA into your brains all at once. It’s very speculative for sure, though, and quite likely wrong. That’s okay. That’s what we do here.
But I think the idea of an emergent coherent consciousness isn’t just plausible, but likely given what we know about how two hemispheres in the same brain can contain their own “selves” when isolated, yet cohere into a single self when integrated. (The Hogan Twins provide some interesting insights in this regard. I mention them in that Aeon piece that ran a couple weeks back.) If two sapient hemispheres can merge into a single sapient brain, why not two sapient brains merging into something even bigger? Same principle, larger scale.
Beyond this, remember that 21G retained a bunch of “AIgents”— didn’t create, simply retained. The nature of which I left conveniently vague, but you’re supposed to infer some kind of intelligent legal system capable of reacting to new events as they occur and which— given the huge advantage of its new client’s supercharged insights— would presumably be enough to deal with any unforeseen turn of events.
Yeah, you know— I don’t think that that kind of intimacy would necessarily go down as smoothly as you seem to think. Of all the people I’ve been involved with over my life, there’s been maybe one that I’d trust with that kind of access.
This— and “Colony Creature”, from last April— are vignettes excerpted from a tie-in piece my Russian publishers commissioned for a special-edition new translation of Blindsight. Basically another interstitial story set in the Blindopraxia universe, like “The Colonel”.
The identity of the narrator is not entirely clear. But there are clues.
Yeah, Sunflowers is active, but intermittent. The overall coherent epic arc will only resolve when I bundle the individual stories together as a fix-up novel (with extra material, of course, to tie everything together), but in the nearer term I just write occasional stand-alone stories to fill in the gaps. There’s no contract for the novel, though— I haven’t even pitched it— so naturally it tends to get back-burnered whenever I have to deal with something that comes with an actual deadline attached.
If I did get a contract— or even better, if it somehow got picked up as a video game— it would go to the top of the pile, of course. But for now, there doesn’t seem to be enough interest to justify going all out.
Just gonna say, if this Blindopraxiaverse story doesn’t find its way beyond the special Russian translation and into some North American market at some point, I am going to be very disappointed and may have to agitate to invade Russia. Annexing Ukraine is one thing, but this… this would be a provocation we cannot ignore!
Peter Watts,
I’m not sure if I would trust anyone at all to connect with me like that. That’s one of the things that I found a little strange about Pacific Rim: two minds with free access to each other’s content, and not one of them seemed to mind or give it a second thought.
But what if it was the same person each time, and what if you had no choice but were forced to connect with the same mind each day from when you were in kindergarten? Why someone would force you to do that is not that important here; the question is what kind of relationship you would develop with the other person.
Heh, good point. I’d assumed that the AIgents (nice portmanteau btw) were strictly procedural bots that would file all the paperwork, motions, go through the discovery process &tc. I hadn’t considered an agile legal system that could respond in machine time to changing circumstances.
Now I have to question its driving function: what would it be trying to optimize? Would it, say, abolish carding (I think you’re from Tronah, too) completely, or just watch us at all times?
Hi, short-time lurker here and new to all these ideas of consciousness and hive-minds. Loved this fiblet.
I’ve been thinking about how memory would work with multiple minds fused into one, especially with this one and its 15 million minds, and 15 million brains. Multiple brains, pre-existing memories unique to each brain, memories being accessed and rewritten— how would it all work?
Hey Peter, if you don’t mind sharing, how much would a typical book advance for the Sunflowers novel be? I might be able to round up a few other lurkers to just fund the story so we don’t spend our time just speculating about what’s next.
Purely for fun and for purposes of fomenting abject terror, I have to suggest that people ponder the implications of LexisNexis. It’s been around for quite some time. As a general rule, people don’t have much inclination to fear, but rather to mildly despise, paywalls. One doesn’t expect there to be much behind the paywalls other than some hot new research papers you might want to read on your favorite topic. Of course, a paywall is just a security and potentially an anti-intrusion system, which happens to be tasked with collecting tolls.
One might reasonably suspect that there might be something approximating the amount of information — if not necessarily the requisite processing cores — required to let an AI coalesce, or something not too far different from AI, behind the security and anti-intrusion systems of various government agencies in various nations. But Dr Watts’s concept of “AIgents” isn’t too different from what has long been available to power-users and professional legal researchers with a corporate or government subscription to LexisNexis. Think of it as a sort of paywall, behind which is quite possibly just about everything you could imagine and possibly more than that, rather than being a toll-keeper trying to bilk you out of next month’s rent over a paper on neurotransmitters.
If you’re in the States, and possibly it’s quite comparable in Canada, LN will have your entire credit history, driving record, insurance records, legal records, residence history, much of your educational history, etc. Many agencies sell their databases either as databases or as subscriptions to access services, and much of this is legally restricted from resale but once information has been collected, even if not resold, the information is there. LN (and probably competitors) are thought to be effectively the giant pack-rat of information of every type that can be had. And because of the longstanding arrangements with various government agencies, in the same way that corporate/government users can query LN’s informational holdings, LN can query a lot of the informational holdings of government and corporatia; agency users often use LN systems to access records of different branches of their own agencies. Think of them as the behind-the-firewall google of the secure-private records space, which may be actually larger than the rest of the internet. Now think back to a few posts ago, and some discussion of Z39.50 and Veronica and Gopher, and “AIgents” on various legal missions starts to seem as workaday as your web-browser accepting a couple of dozen redirects from various servers set up to obfuscate the ultimate destinations of facebook clickbait links. Having once read John Brunner’s the Shockwave Rider and having watched the internet evolve, “AIgents” by that name or by any other name, are an idea that’s “out there”. As in, “already about as real as the engineers could make them since about 1992 or so”. As time goes on they will probably get “smarter”, and/or their destination gateways will get smarter. By the time “Mindflix” is up and running, they’ll be as workaday as “Siri” on your Apple Phone. 
I predict that at least one new-generation advanced AIgent might be named “Valerie” and be notable for being very good at hiding, and pretty scary when you stop to think about it, because she could be anywhere, and probably is. 😉
Come on, we all know how that ends.
You ask a dude to donate a testicle and next thing you know, he’s hijacked a world-ending superweapon 🙂
People underestimate lexisnexis, and people generally underestimate “non-AGI, limited artificial intelligence”
Content spinners and context-aware spambots are already scary-smart, despite being amusingly dumb at the same time.
Another thought that one could add:
A fresh brain is grown in vitro, you get connected to it, and over the next months your old brain slowly dies while you learn to use the new brain, emotions and skills being absorbed as you activate them during the transition period. When the old brain is finally gone, you have a new brain. Wouldn’t that open the door to immortality?
If you are a straight guy and you get connected with a female that is madly in love with your male friend, or even worse, your brother, would you then see the other guy through her eyes as well as your own?
It does take a special kind of person to be willing to give it a try. Not only are there things I don’t want others to know about me, there are also things about others I have no interest in knowing. Try the experience with a total stranger or someone you know? Probably the stranger.
http://phys.org/news/2015-06-self-awareness-unique-mankind.html
“From foraging to autonoetic consciousness: The primal self as a consequence of embodied prospective foraging”
This study claims the self might be useful in tagging memories as distinct from imagined plans, and that many animals (and robots, if they reach the planning stage) must by necessity have something like it.
Paper itself seems to be here if you have access
http://www.currentzoology.org/paperdetail.asp?id=12442
01,
It cannot be said enough, *people underestimate lexisnexis.*
http://www.journalofscience.net/File_Folder/521-532%28jos%29.pdf
A species of ant that passes the mirror test….
I’m hoping to sell it to the Poles. Whether the North American market wants it probably depends on whether the blog post I wrote immediately prior to submission takes a politically unpopular stand on something.
Yeah, I had a problem with that too. Although I reveled in the glorious dumbness of that movie otherwise.
I can’t say with any real confidence; publishers these days seem to be reacting to market uncertainty by squeezing their authors any way they can (I know of at least one case where Tor tried to impose a 50% cut in royalties for books sold from a “small print run”). At this point, I think I’d rather go indie than swallow an advance of less than 20K, maybe even more depending on the rest of the contract. And that threshold would probably be higher for a crowd-funded effort, since I’d have to pay the production costs myself.
Downloaded. Thank you. Hopefully I can read this fucker in time to get a column out of it.
No!
Okay, now I’m really pissed at the 100+ emails, 2 interviews, 2 columns, 2 stories, and one Russian special edition I have to deal with. Because I really don’t want to wait to read this stuff.
Off-topic movie mention. Writer/director clearly a Garth Ennis fan.
http://m.imdb.com/title/tt1865393/
Also: This Day in Scifi History… Poor woman getting spammed due to her name and what she tweeted about.
https://twitter.com/sarahoconnor_/status/616298629524004864
Nah, she seems to be having fun.
>Just gonna say, if this Blindopraxiaverse story doesn’t find it’s way beyond the special Russian translation and into some North American market at some point, I am going to be very disappointed and may have to agitate to invade Russia. Annexing Ukraine is one thing, but this… this would be a provocation we cannot ignore!
Don’t worry, komrade! I’ll do the translation and make sure that it will be as shitty as Russian translation of “Blindsight”! Maybe even with additional errors included!
“Repeating the Past”
The Unexpected Transformation of Aaron Pearlston.
A last comment: I’m curious; how much bandwidth is there between the brains in a cephalopod? And is there a magical moment when the bandwidth is so wide that two souls become one, or is it all a matter of degree?
Those are excellent questions.
I don’t have a clue.
The reason behind the thought is that if the bandwidth between the main brain and the tentacle brains is not wide enough to “fuse” them into a single brain, but keeps them separated despite their intimate communication, it should mean that the costs are too high to be worth the price, or there are some advantages in keeping them separated and/or disadvantages in letting them act as a single unit. (I know I said it was a last comment, but since it’s a reply…)
I believe someone asked for examples of split/shared brains? This might be an interesting case:
http://www.nytimes.com/2011/05/29/magazine/could-conjoined-twins-share-a-mind.html
Seems to relate to the bandwidth question. There is also a later documentary, but I have not seen it.