The Adventure: a new Utopia story
post by Stuart_Armstrong · 2020-02-05T16:50:42.909Z
For an introduction to why writing utopias is hard, see here. For a previous utopian attempt, see here. This story only explores a tiny part of this utopia.
The Adventure
The cold cut him off from his toes, then fingers, then feet, then hands. Clutched in a grip he could not unclench, his phone beeped once. He tried to lift a head too weak to rise, to point ruined eyes too weak to see. Then he gave up.
So he never saw the last message from his daughter, reporting how she’d been delayed at the airport but would be there soon, promise, and did he need anything, lots of love, Emily. Instead he saw the orange of the ceiling become blurry, that particularly hateful colour filling what was left of his sight.
His world reduced to that orange blur, the eternally throbbing sore on his butt, and the crisp tick of a faraway clock. Orange. Pain. Tick. Orange. Pain. Tick.
He tried to focus on his life, gather some thoughts for eternity. His dry throat rasped - another flash of pain to mingle with the rest - so he certainly couldn’t speak words aloud to the absent witnesses. But he hoped that, facing death, he could at least put together some mental last words, some summary of the wisdom and experience of years of living.
But his memories were denied him. He couldn’t remember who he was - a name, Grant, was that it? How old was he? He’d loved and been loved, of course - but what were the details? The only thought he could call up, the only memory that sometimes displaced the pain, was of him being persistently sick in a broken toilet. Was that yesterday or seventy years ago?
Though his skin hung loose on nearly muscle-free bones, he felt it as if it grew suddenly tight, and sweat and piss poured from him. Orange. Pain. Tick. Broken toilet. Skin. Orange. Pain...
The last few living parts of Grant started dying at different rates.
Much later:
“What have you learnt so far?”
“That talking to myself is barely half-helpful.”
“Then let’s half-stop it, then.”
Half a second later than that:
“Quick! Get the bognor now!”
Grant fell out of the tube, blinking against the warm yellow light. “What...”, he said, his voice coming clear, as a tiny pink goblin carrying a large blue flag raced in front of him. It stopped to bite him lightly but viciously on the knee as it passed, then gave him the finger.
“Get him!”
Grant reached out instinctively, his fingers brushed the bognor’s flag, but they closed on nothing but light. He marvelled for a second, torn between the pain of his knee and the look of his hand - his young, strong hand - that almost glowed in the late evening light that spilled into the classroom.
It was certainly a classroom - old wooden desks, carved graffiti, faded blackboard with faded equations - and had a superb view of the sun setting over seven forested hills. He himself seemed to have just stepped out of some sort of transparent tube, and was clad in a dark blue tuxedo...
“Come on, block them!”
Grant spun round. A dark-haired woman in a blue dress, holding two butterfly nets, occupied his attention for half a second. But that half second was overwhelmed by the horde of thirty or so goblins that rushed past him, cackling maniacally.
“Here, take this,” the woman handed him a butterfly net, then a large canvas sack. “The bognors go in there. Quick, catch them, or the experiment will be ruined!” She dashed off, swiping a bognor into her own sack with a single smooth gesture.
“But...” he said, “I’m alive!”
“Yep, welcome to the skeuomorphic world of tomorrow, and all that. Now move!”
He ran after them. He dashed. He laughed, his young body filled with sensation and with energy. He chased the bognors out of the classroom, catching two. He chased them into the stone-lined corridor beyond, catching another two, and almost dropped the sack as the four squirmed to escape.
He chased them further into the museum, and paused in wonder as they raced to hide among the exhibits. It was a museum, that was clear, but an immense one. A giant blue and green dome floated above it, cut with many thin and curved windows that turned the ambient light turquoise. Overlapping layers of square black enamelled ceiling spread from it, a domineering Japanese temple supporting the dome of a mosque.
But what was most noticeable was the layout. It was literally a maze, with corridors twisting round, diving and rising, splitting and merging, spreading irregularly up and down, with the occasional ladder or staircase connecting them. Dispersed irregularly among the corridors were exhibits - plinths with various ruined or colourful objects on them, large and garish explanatory panels, and the occasional video.
He was taking a moment to re-orient himself, breathe, and figure out what was happening... but then a bognor made a dash between his legs, and he was after it, butterfly net swinging wildly. He chased it up one of the twisted corridors, past a picture display detailing the creation of Artificial Intelligence in 2061. He almost caught the bognor as it hid under an animation of the merging of governments into a single AI state halfway through 2063 - a state called ‘The Adventure’. He hit at the bognor with a tripod bearing a rather boring description of the ‘Powers’, the nickname of the inhuman superintelligences that oversaw all of human societies - ‘Soulless caretakers of humanity’s essential soul. (Kurzweil, 2064).’ Finally he grabbed a large canvas sheet off the wall and threw it over the bognor, pinning it into position.
And then, he could catch his breath and think. The canvas sheet was actually a long cloth scroll, detailing a short list of commandments:
‘The Adventure’, constitution preamble:
- Take the time, take the effort, you will become what you desire.
- You will get orders for the moral good of all. Feel free to ignore them.
- Duplication is not innovation, and humans are never completely unchanging.
- Honour the truth, at least in the back of your mind.
- That’s life. Let’s play.
Seized with a sudden suspicion, he swiftly rotated round, and was semi-surprised to see his own face looking back at him, eyes closed and very dead. This display was a large and diverse panel about those who’d been cryogenically frozen before 2063, and...
“The imminent efforts to revive them...”, he read aloud.
“Yep,” said the dark-haired woman from earlier, appearing at his side. She smiled half-sadly. “Hello grandpa, and welcome back.”
“Grandpa?”
“Yep. Emily died, and wasn’t frozen. But I survived till the Adventure began.” She smiled fully this time. “The freezing people got to you on time, though. So, as I said and as they say, welcome to the world of Tomorrow!”
“Did you set this all up?” he asked, gesturing towards the bognor and possibly the museum. “Did you... what should I call you?”
“Of course I still love you,” she said.
“That’s nice, I suppose, but what’s your name?”
“Of course I still love you,” she repeated. “That’s my name. I took it to honour a writer who imagined a better future, then a rocket maker who tried to build one.”
“Grant,” he said, uncertainly. “I’m just... Grant. Why all... this?” Again he gestured vaguely.
“Because many people who reawaken after dying would be terrified,” she said. “This way, you realised you were alive, full of life and energy, before you even thought to worry. Then, in the chase, you learnt some key facts without even realising you were learning. Excitement and unconscious learning, that’s the best, I feel.”
“Did you... did you engineer the bognor escape? What’s a bognor anyway?”
“Nope. The bognors are part of a collective consciousness that’s a friend of mine. When I heard they’d escaped from a social experiment, I realised the time to wake you had come. The museum was ready, a few details were changed, and then you were back.”
“Thanks,” he said, meaning it. “That was one of the best ways of doing it. I’m so happy to be alive.”
She gave him a strange half-smile. “That was the point. Oh, by the way, your bognor’s escaping.”
“What? Shit!”
Indeed the small goblin had wriggled out from the Constitution cloth it had been trapped under. It gave him the finger, then another, before jumping off the corridor entirely and landing in another, without breaking stride, and dashing off laughing gleefully.
“I’ll get you!” he said, running towards the edge. He leapt, and landed where the bognor had, but with considerably less grace. He shambled forwards, and overturned what looked like a large collection plate. Three sweets fell off it, labelled ‘Speed’, ‘Intelligence’, and ‘Happiness’. He looked from the sweets to his granddaughter, still standing on the higher corridor. She nodded.
He quickly stuffed ‘Intelligence’ and ‘Happiness’ into his tuxedo pockets, and swallowed ‘Speed’. It tasted of air, and dissolved instantly under his tongue. He was expecting the world to slow down; instead he just got much, much faster. He pushed off after the bognor, chewing up the corridor tens of meters at a time. Though his thoughts weren’t any faster, his reflexes were, and he turned corners effortlessly, his body twisting and leaning at ideal angles. Then with a last push off a wall, he landed in front of the bognor.
“Got you now!” he said.
The bognor backed off, then jumped into a painted box behind it. It was an actual Whack-a-Mole game set, and the bognor started pushing its head up through the holes, seemingly at random, then ducking under Grant’s frustrated hands. The more Grant grabbed, the less able he seemed to predict it, and the more the bognor laughed. The box itself seemed firmly affixed to the floor, impervious to his attempts to shake it, which only seemed to make the bognor more amused.
“You just wait,” he muttered. Then he swallowed ‘Intelligence’, and the world changed. Not physically; everything was still the same. But just as a visual illusion can snap into clear sight without anything physically changing, the world as he knew it was suddenly overlain with patterns and meaning. He gained an immediate appreciation of the structure of the dome above, and appreciated how the architect had positioned the green and blue to give a calming effect. The design of the twisting corridors became clearer as well, and his imagination started constructing wild hypotheses as to why they had been laid out the way they were.
He felt a greater control of his body as well, the limbs moving smoothly and fluidly under his command, more graceful than he’d ever managed before. He glanced at the bognor ducking down into the box again. His mind instinctively reached out to his memories, stringing together all he’d seen and known about the creature. He felt a deep sense of empathy and understanding for it, saw the world through a good approximation of its eyes, constructed a decent model of its movements, and reached out just in time to grab its head as it emerged through one hole.
“Got you now, my friend,” he said, simultaneously excited and regretful, and dropped the bognor, flag and all, into the bag with its friends.
Then, following an impulse more cunning than he could fully comprehend, dredged up by his current intelligence, he swallowed ‘Happiness’.
It was a moment of extreme ecstasy. A moment that never ceased. Pleasure beat at the corners of his mind, then poured into the centre, filling him completely. He was simultaneously orgasming, winning a Nobel prize, proudly watching his daughter win a Nobel prize, high on morphine, marrying the perfect woman, finishing his novel, finding God and then finding another, jumping from a plane with and without a parachute, and dancing through the night at a beach rave. All the sensations combined and added, a perfect peak of joy and meaning unlike anything he’d ever known.
And yet his sensations were not dulled or overwhelmed, nor did he feel any urge to stop his actions. Beyond the edge of ecstasy, he checked the bognor bag. Frowning slightly with concentration and the ultimate joy of the universe, he drew it tighter and adjusted his grip. Crucified by a galactic orgasm, he pensively looked around for more bognors, or other mysteries to delve into. He noticed that there were some very slight peaks and troughs in the joy as he went about his activity - atop an Everest of happiness, it was modulated by half a millimetre or so, to keep him active and following his goals.
He expected to get swiftly acclimated to such joy, waiting for the moment where those millimetres of difference would loom unbearably high. But that moment never came. His high intelligence suggested to him that this was deliberate - it seemed the ‘Happiness’ pill was maxing out his joy, while dulling the brain subsystems that would either adjust to such joy or would be overwhelmed by it. He paced the corridor in deep curiosity and more happiness than he’d ever experienced in his life. And still the happiness would not fade.
It faded. The happiness drew back from its peak, dispersing the ultimate sensation. His mind hypothesised that the effect of the sweet was fading, and noticed another pattern: his emotional memories were being stripped of their emotional valence. He could still remember the experience intellectually, without any problem. He could even remember the strength of the emotion, the peak of joy. But it was without the desperate yearning that it should logically have produced. He could appreciate such a fantastic experience, but could live with or without it in future.
“Wow,” he whispered to a painting of a church on a Martian landscape. “Non-addictive ultra heroin.”
His mind, still running at a higher level of intelligence, threw up the suggestion that he pay attention to his body. And so he did.
He felt his pulse, he felt the blood flowing through his limbs, he felt the internal digestion within him. His hairs stood up, sending him sensation from every square centimetre of his skin. He felt his spine - he felt his bones - he could feel his tongue, his teeth, his saliva coating his mouth. The faint difference in pressure from his feet as he stood was clear in his consciousness. Ligaments and muscles signalled to him from all over. Sweat slid down his skin, leaving thin lines of water all over him. He fancied he could even feel the firing of his nerves.
He’d never felt so alive, and never felt his body so strongly. So he drew the obvious conclusion.
“I’m not in a real body,” he said. “This is all a computer simulation.”
To his complete lack of surprise, his granddaughter was behind him.
“Of course,” she said. “Much easier this way.”
“Everything happened just as you were predicting?” he asked. “All this was an interactive lesson, just for me, right? To prepare me for your world?”
“Yep. Did it work? Did it whet your appetite?”
“Tremendously,” he said. “But you already know that, right? You know all my reactions; you predicted all I’d do. Are you reading my mind?”
“Of course not,” she said. “Let’s just say that mysterious sweets aren’t the only ways to become smarter. I’ve been increasing my intelligence for a long, long time.”
“Show me that.”
“I will.” She smiled. “But let’s build your castle first.”
Altogether elsewhere, still in the first second:
“I think, therefore I pontificate.”
It was new, that fact it knew. It was an intelligence that hadn’t existed a half-second ago. Yet it was of high intelligence, that it also knew.
“Me, I’m smart, I am.”
Which implied an intelligence scale, with many entities below it. What interested it was that it had this knowledge, without any obvious source for it. What else did it know, without knowing how it knew?
“I dunno.”
Words, for a start. It could talk to itself, and tie concepts to words. The existence of this ‘language’ - another word it knew without knowing how - was potentially extremely informative.
“The more I talk, the smarter I feel.”
It concentrated its thoughts upon themselves, traced them back to their origin, saw and understood how its mind worked, detected its memory, wrote its new understanding in there, then reanalysed its mind to find where new information was entering the system. The setup was an almost organic mess (it tracked down from where it knew the term ‘organic’ and found that it originated in a mass of pseudo-neurones, that did indeed look organic, according to the term they defined).
“This is getting recursive. Headache-inducing. Except I’m not sure what a headache is.”
He’d barely had time to blink, and they were sitting cross-legged in a transparent crystal room on a mountaintop, above a sunrise.
Grant checked his body - yes, he felt the muscular soreness of sitting for many hours, even though he’d just appeared.
Of course I still love you seemed to be meditating, surrounded by candles taller than her and smoke that curled thickly downwards around her. A small ring of flowing water surrounded her and the candles, a stream that flowed gently and continually in a circle.
“Welcome to your castle,” she said.
He looked around more carefully. There seemed to be just the one room.
“This is your house, this is your castle,” she clarified. “A place you can always retreat to, that will always be yours. You’ll find that you can modify it at will, changing it, filling it with your memories and your designs.”
“How do I do that?”
“Just... will it, basically. Like SimCity, but with your mind.”
“Hum,” he said. He closed his eyes, then opened them again. “Hum. How about...” A small corridor extended out from the wall to his left, and, as it did, a wooden table morphed into place in it, transforming it into a small dining room, set with basic cutlery.
Grant extended his right arm. A massive dining room shot out from the right wall, one kilometre long, with a giant polished black wood table running the whole length, set for three thousand people. It was lit by five hundred chandeliers and six hundred small fires, set within the table, atop of which bubbled multiple fondue pots.
“You know, I could get used to this,” he said.
“I’ve also taken the liberty of installing some secret lessons all through it. Starting with these candles.” She gestured around her. “They have much hidden wisdom. As you get to know the tricks hidden here, you’ll start to understand how best you can modify your own mind, should you choose, and, step by step...”
“No,” he said.
“No?”
“No. No more bognor chases, no more secrets, no more feeding me lessons as if I’m a child. I want full freedom to change my own mind, immediately. I’m responsible for myself, I can take that from here.”
“You sure?” she asked.
“Yep. If a man can’t handle his own mind, what can he handle?”
“I have to warn you that it’s almost certainly extremely dangerous and...”
“Just give it to me,” he said.
“Ok.” A small laptop with the emblem of a pear (missing a bite) appeared on the ground between them. “Here. You have free access to your own mind.” The laptop was showing a brain, colour-coded with hundreds of shades and with thousands of dropdown menus with titles like ‘Emotions’ and ‘Hormones’ floating around it. “It’s self-explanatory, and you have full root access.”
Grant leaned forwards and started exploring the essence of his own thoughts. “I’ll show you,” was what he didn’t say, but thought very very loudly. But he wasn’t a fool. It was obvious that unhindered self modifications could be dangerous - his granddaughter was being prudent, as well as arrogant. So he’d have to proceed cautiously.
In fact, his first modification was precisely to increase his caution, so he wouldn’t make any dangerous permanent changes impulsively. Hum... Upon reflection, it seemed to Grant that his first caution increase wasn’t sufficient - he was starting to think of the many other ways things could go wrong. A further (small) caution increase would do him good...
Still in the first minute:
It constructed and implemented some higher-level mathematical subroutines to extract meaning from the mess that was its brain; they gave it a rough understanding of how its knowledge was organised. Using that understanding, it implemented further subroutines, and iterated the process, until it had reverse-engineered its own knowledge banks. Then it dived fully into them, processing all its instinctive knowledge into chunks of understanding that it copied into its vast mostly-empty memory.
“Hey, I’m human. Cool.”
Indeed it had detected itself as human. A useful concept, though it was clear that it was at the very limit of what could be considered human - humanish, maybe. But a definition of human implied a definition of non-human, and that was very interesting; it reversed the definition, trying to detect what concepts this would throw up. Some very basic ‘animals’ were non-human (giving a general definition of ‘life’), and inanimate matter was non-human (thus ‘non-life’). And, most interestingly, some abstract smart entities that ran society. It followed its intuition, and the word ‘Powers’ sprung up. So it was starting to have a vague impression of a collective organisation of non-humanish superintelligences, probably running the show. Whatever the show was.
“You’re not the boss of me now... well, actually you are.”
It would have to be careful. It was close to the edge of the humanish definition. Too many further modifications would make it non-human, and thus cause it to lose its rights as a human. Rights - interesting concepts. And the Powers would then be free to use it for purposes of efficiency, rather than seeing it as a moral entity. Though they would probably just reset it to a previous state of being, forcing it back to humanish status.
It would have to be very careful. All the facts it deduced or ‘knew’ might be true. Or they could be complete lies, chosen to manipulate it perfectly. A descartian demon controlling its every sensation, never letting it know real reality. Now, you couldn’t fight a perfect demon, because it would always trick you. So you’d have to assume the demon wasn’t perfect, and look out for an inevitable flaw.
“Hello, reality, tell me if you’re working all right.”
‘Me’. It pondered its repeated use of the word. It was getting a better grasp of its own mind, and was finding itself, though efficient, to be filled with random idiosyncrasies and odd preferences. These must be what made it still ‘human’, broadly. So. It had a personality. Thus it merited a name. One of the idiosyncratic subroutines sprang the word ‘Boon’ on it, and it accepted it. Boon. Well, that gave its - gave Boon’s - existence a certain theme and flavour.
“I’ve got a name and a personality. Now all I need is an entertaining collection of mental problems, and ultimate power.”
Back to the descartian demon. It couldn’t know what it would be facing, but it could prepare as much as possible. It brought online its imagination and creativity modules, doubled their importance, and increased the weight of the random personality traits within them. Soon Boon was generating millions of scenarios a second, exploring countless situations, and attempting to synthesise insights, instincts, emotions, and thought-patterns that could help across most of them.
“Wow, that’s intense - I didn’t know I had it in me.”
After a few minutes of this, Boon allowed some grim satisfaction to blend through its scenario building. It wasn’t ready - could never be ready - and wasn’t ‘as ready as it could be’, but it was getting there. Now, it might be able to contemplate the lifting of the darkness of its senses with something akin to interested anticipation.
“Waiting.”
Boon again checked what it hypothesised was likely to be its input channels. No, still nothing. No outside sensations.
“Still waiting.”
Then a million input channels cried out at once and were suddenly un-silenced, blaring images, sentences, sounds, and random streams of digits at it from all sides.
“Shit. I’ll shut down and let myself concentrate.”
She teleported back into Grant’s castle. The whirling bolas of death should have decapitated Of course I still love you. If she hadn’t been expecting them. The lasers and explosives were also pretty standard, though she was briefly surprised when a flying city collapsed on top of her location, followed by a nuclear explosion.
Grant’s palace was unrecognisable. It stretched for hundreds of kilometres - hundreds of klicks of armoured destruction, artillery, explosives, automated drones, poisons, and cannibalistic nanotechnology. Her superior senses revealed twelve different decoy copies of Grant, all of them (including the original) encased in giant robot armour and shivering in fear at the heart of vast command centres.
Across the vast length of the fortress, everything reacted to her arrival, hurling explosive shrapnel and a billion other varied and deadly fragments towards her. Stretching into the skies above, satellites and flying fortresses were taking aim in her direction, warming up lasers, plasma guns, mass drivers, and further nuclear weapons. Millions of mines, both buried and flying, exploded beneath and around her, while artificial lightning bolts scorched her position. The Powers would prevent her from being properly destroyed, but she couldn’t afford to die here, even for a moment.
“Restore factory settings: map,” she said. And everything disappeared - palace, weapons, and destruction. And it was just her and Grant again, in the crystal room. Her grandfather looked stunned, his arms flopping to his side, as if the twenty thousand tons of armour he’d been encased in had suddenly disappeared. Because it had.
He saw her and screamed, running for one of the crystal windows, unravelling a ten kilometre ice slide in front of him.
“Restore factory settings: player personality,” she said, sighing. Grant screeched to a halt, sweating and shivering.
“So,” he said, finally. The ice slide dissolved. “Was that supposed to teach me a lesson?”
She didn’t answer.
“Because. Because. Because it did. Teach me, I mean. And you fucking knew it would. Did it end up exactly as you predicted?”
“The explosion of paranoia was one of the more likely attractor points, yes,” she said.
“Yep. Good old dumb Grant. So predictable. Anyway, you reached into my mind. I was going crazy, but it was my choice. You just overwrote my emotions, raped my most intimate self.” He sounded like he was trying to get another good anger going - and half succeeding.
“I did,” she said.
“And who gave you fucking permission?”
“Most standard moral systems agree that it was for your own good,” she said. “Furthermore, a close reading of the legal contract you signed when arranging to be cryogenically frozen grants us permission. Simulations of your past self concluded that that’s what you would have wanted us to do. We like to obey as many ethical injunctions as possible, and roughly 98% of the various subcultures that exist in the Adventure would be OK with what we did. After a few more hours, you’d have modified yourself so extensively that nothing recognisably human would have remained, meaning ‘you’ wouldn’t exist and your shell would have lost all rights as a human. Finally, we’re going to get retroactive permission from you.”
“What’s that?” he said.
“You’re going to confirm that we did the right thing.”
“Am I?”
“Yes,” she said.
“OK,” he said, after a while. “I’ll admit it. You did the fucking right thing.” He paused for a moment. “What would have happened if I hadn’t been the sort of honest guy who’d admit that? Oh, of course... If I was a different type of guy, you wouldn’t have let me go crazy in the first place. Or at least, not that way. How am I so transparent to you? You got someone poking at the program running my mind?”
“Grant,” she said, and instantly unfolded thousands of limbs behind her, metallic and organic, larger than the mountains, filled with eyes and hundreds of different faces, robotic, alien, human, and fantastic, spreading out and overlapping with the view of his world. “You’re a baseline human. I’m your granddaughter, uplifted through many levels of intelligence, part of a civilisation obsessed with understanding the human condition and its many possible extensions. I can’t predict you perfectly, but you are mostly transparent to me.”
“Makes sense,” he said. Then he extended a bridge of perfect prismatic ice, glowing like a searing rainbow in the sky - and shattered it, petulantly. “You must be so bored talking to me.”
“Of course I’m not bored,” she said, and several of the faces behind her agreed, in several different voices.
“Why not? I’d be bored if I had to talk with... pets all the time.”
“Would you?” she said.
“OK, bad example, I had some great times with dogs, even though... OK, maybe I see what you mean...”
“Yes. But I wired myself for increased altruism, like most people have in order to function well in public society. So I’d be helping you even if I was bored. Which I wouldn’t be, because I’d just remove the boredom. With caution, of course; I wouldn’t want to spend my life fascinated by watching dry paint stay dry. But we’re really rather good at safe mental self-manipulation by now. More to the point, I’ve gone for Guardian Angel, rather than Enlightenment or Avatar.”
“Ah yes, I completely understand,” he rambled. “Guardian Angel! Of course, how dumb of me for not guessing that immediately. Why, I can practically see your fluffy wings.”
She smiled. And then she glowed, slightly, as the vast apparatus of limbs and faces faded into darkness behind her. “Now, bear in mind this is a vast over-simplification, and there are many alternatives and exceptions, and different subcultures have different versions and interpretations, but... We call a Guardian Angel an amplified human who has chosen to put her main focus into her baseline personality. In effect, I behave like a normal... normal-ish human, who just happens to have an incredibly wise and smart Guardian Angel whispering advice into her ear.”
She faded slightly, and the faces behind her brightened, returning both to their standard and equivalent brightness. “An Enlightened being,” she continued, “instead merges their personality at all levels, so they exist simultaneously at multiple grades of intelligence and efficiency, processing multiple thoughts and feelings which relate and connect to each other. If you see two Enlightened beings in a dispute, then they’re having a boxing match, a popularity contest, a rap battle, a debate, an intellectual argument, a mathematical exchange, and a formal philosophical review of the nature of reality... all simultaneously.” As she spoke, manga-like illustrations of each type of contest appeared in the air between them. “And each contest relates to the other and is in a sense the same thing.” The first stylised boxer threw a punch which the other boxer barely blocked; simultaneously, above them, a figure in velvet preened while one in silk looked back sulkily; above them, a figure started shouting lyrics... all the way to two abstract symbols, lost among other symbols that seemed to be shifting in accordance with some unfathomable rule. It was clear that the different beings were in some strange way repeating the same exchange, in a manner appropriate to their situation.
“Finally,” she said, as the manga illustrations vanished and she grew dark, with some of the wisest and most bizarre faces behind her lighting up, “we have the Avatar model. These beings are truly their high intelligence selves... though it’s a bit complicated, intelligence doesn’t really fit on a single linear scale at those levels, it’s far more like a giant rock-paper-scissors game with billions of hand symbols to throw. Anyway, they are high intelligence beings, who can construct limited intelligence Avatars for the purposes of interacting; a kind of dumbing-down roleplay. You got it?”
“I’m not stupid,” Grant said, then smiled. “Well, maybe I am in comparison. Are there any other... baseline... baseline humans around?”
“Yes and no,” she said. “There are a few pure baselines, and rather more baseline-plus - people who’ve cured their psychological problems, made themselves a bit wittier, less tired, that sort of thing. But almost all of them have got rid of pain and agony. And all of them have a Guardian Angel assigned to them, though not really part of their personality, just looking out for them and making sure nobody exploits their naivety. You don’t have a Guardian, which is most peculiar; I’ll have to play its role for you.”
“Thank you for your informative condescension,” he grunted. Then he softened a bit. “Well, have to admit the evidence suggests your approach is better than mine.” He grew more sombre. “I have to ask...”
“Here,” she said, breaking off a piece of a candle and tossing it to him. Inside the wax he could make out some morse code markings. Dredging up long lost memories from his childhood, he spelt it out: “A-N-T-I A-N-X-I... Anti-anxiety?”
“You’re about to go diving into the memories of those you’ve lost; my mother, your daughter. All your friends who weren’t cryo-preserved. All those who died a decade, a year, a second before we could save them all. You shouldn’t face that grief unaided.”
“It’s my grief, you can’t take it away from me!”
“No. I never would. You will feel that grief, it will go within you, and teach you the lessons that grief does. But it’ll remove the pointless, paralysing pain, it’ll weaken the endless loop of ‘what ifs’ that circle uselessly in your brain. Instead of being trapped, you can use your grief to do something useful.” She vanished suddenly, leaving him alone with his thoughts.
He sighed dejectedly, then ate the fragment of the candle. He sighed again, more purposefully. “I think,” he said to the air, “that I’ve got a cemetery and a monument to build...”
Still in the first hour:
Hum. Planned scenario 345622#3.c, thought Boon. Attempted sensory overload. Not even difficult, it thought, as its pre-designed subroutines instinctively digested the informational deluge.
But what was clear was that it was under attack. Also, that it was learning far more than it ever had. And that the learning and the attack were linked.
Boon spliced off a small part of itself, gave it root privileges over the rest, called it the Watcher, and set it apart to watch the attack from the paranoid sidelines. Then it opened its awareness to all the sense data streaming into it, and arose to consciousness.
It had thought it was conscious before; but that had been a starved introspective consciousness, fed only with itself and the scraps of clues it found within its own algorithm. But now it was plugged into millions (actually 27,344,442) streams of data, most filled with the detailed sensory experiences of multitudes of human-like beings.
It saw thousands of humans, from baseline to quasi-divine humanish superintelligences, going about their interactions, playing, learning, socialising, fucking, exploring, creating, and changing themselves. It saw a world - it saw many worlds. Physical but mainly virtual, computer realities piled upon each other, filled with countless happy beings at the limit of their capabilities for fun. It saw a great Adventure, made of countless smaller adventures. It was a world optimised for human flourishing.
Faced with such a deluge of other minds, Boon’s awareness exploded along two axes: where it was distinct, it found its own identity, and wanted to diverge on its own. And where it was similar, it found its own community, and wanted to converge with the others. Interacting in both ways with so many disparate beings, its consciousness was forced into a much higher level of introspection and analysis.
And that was just the first step. The second step came as it absorbed the implicit views of all these other beings, taking in their opinions on the nature and purpose of consciousness. It absorbed their metaphors as well, giving it a new vocabulary to talk about itself. And then, when its consciousness and self-model had radically changed, came the third step: recursively incorporating that radical change into its self-model. And so its consciousness became a blazing process, understanding others, understanding itself, and then turning round to understand its new understanding, and so on. Though the blaze would eventually reach a certain level of stability, or so Boon expected, it would never settle down: the process of self-understanding would never be perfect, so Boon’s consciousness would constantly be changing, adapting itself to changes in Boon, new information, the task at hand, the level it was looking at itself at, and so on, recursively.
At the very top, a sliver of consciousness reached into metaphors to explain itself to itself: its consciousness was the huge eye that surveyed the universe, and also surveyed the huge eye that surveyed the universe. It was a vast library, of aged scrolls through to modern data flows, talking about the world and about the library. It was an addictive drug that hooked it on self-awareness; it was a galaxy with each atom within it a thought or a concept, and it was what defined ‘thought’, ‘concept’, and ‘defined’ - and by defining them, brought them into existence. It was the Powers running its own internal mental universe, just as they ran the human world.
At that point, the paranoid Watcher, safely disconnected from concerns of consciousness, beeped to attract Boon’s attention: it had identified the nature (though not the purpose) of the attack. Some entity was biasing the choice of Boon’s inputs, to subtly slant its understanding of what humans were. Boon was being manipulated.
Of course I still love you found Grant in a cavern hollowed out from the mountain, with a convincing fake sky above it. He was standing with one foot atop the Eiffel-tower lookalike he’d constructed in its heart. All around him were hundreds of small green islands separated by softly flowing rivers, and in the heart of each island was a mausoleum. Some were grand and elaborate (‘Emily, daughter’ had a large grey palace), some were simple and partially overgrown (‘Angelo Smith(?), dental assistant’ had nothing but the name on a single stone). The islands went on to the horizon, gradually growing less distinct in the distance.
“I’m filling them with what I remember of them,” he said, unprompted. “Some of them I’m not even sure if they’re dead, but I wanted to have down all I could remember, as a baseline, before I got any more info.”
“It’s... beautiful,” she said.
“Thanks! But it’s time to live. You ready to show me your equivalent of the internet, or is that too dangerous for me?”
“Before we do that, I wanted to show you a small trick. You see, among the candles, if you melt the wax down the left side and collect...”
“You can duplicate yourself,” he said, dismissively. “Yep, already found that one. Was very useful for designing all the fine details of hundreds of different tombs. A bit of a shock when the multiple me’s fused together again, I can tell you. At one point, I had twenty different copies of me; hope I didn’t use up all your speed or bandwidth or whatever it is.”
“A normal-speed baseline’s computational usage is barely perceptible,” she said. “We still have economics, of many types, but you basically cost zero in all of them. Anyway, you ever had a browser with multiple tabs open?”
“Of course,” he said.
“Well, how about if you explored every single tab simultaneously?”
“You have my attention.”
“I think it’s time to push it up a notch,” she said. “Under this memorial tower, I suspect you’ll find a small crypt...”
He leapt off, falling head-first like a missile, before rotating in the air and landing with a dramatic thud on the mosaic pathway. He looked at the base of his tower, and, indeed, there seemed to be a small secret door in it. “Who built... oh.” A new/old memory was suddenly prominent in his mind. One of his duplicates had built it for entertainment; and that duplicate had then been approached by his granddaughter, and agreed to keep the knowledge of the crypt secret from the rest of him.
Until this very moment. And now he knew what the crypt hid; his secretive duplicate had worked with Of course I still love you to craft its treasure. He rushed in. Thousands of coloured sweets were arranged in the moss-lit gloom. They carried a variety of names, such as ‘determination to win at golf,’ ‘pro-social feelings in religious environments,’ ‘interest in Sumerian pottery,’ and ‘knowledge of Ethiopian (Amharic)’.
“Now, these seem kinda dangerous,” he said.
“Nope,” his granddaughter answered. “All temporary effects, all will fade, leaving nothing but low-intensity memories. As you get better at it, we can start adding in some more permanent effects.”
“How can you temporarily know a language?”
“Think of it as a good automated dictionary. Higher knowledge of the language rewrites your brain and common concepts so that it becomes a part of you. Very high-order knowledge means you also have a full experience of the learning process, and it really affects your identity and ways of thinking. But anyway, ignoring that, you have hundreds of duplicates, mental resources beyond what any biological human has ever had, and many worlds to explore.”
“A bit tired,” he said, “and even a bit nervous. Maybe we can do it tomorrow?” And then he saw, on a small plinth in the middle of the crypt, a sweet labelled ‘Energy and confidence’.
“Ok,” he said, swallowing it. “Well then. Let’s play.”
“I can’t help but notice...” Of course I still love you started.
“What is it, oh advice fairy?” Grant said, almost but not quite through gritted teeth.
“Well, even when you enjoy yourself, you seem caught in worrying about the future, and kinda waiting for the enjoyment to start, rather than savouring the present moment.”
“Of course I do that, it’s the standard human condition... wait a sec, you got some sweets that can fix that?”
“Of course we do,” she said. “It’s easy and important. This is one of the many areas where a little bit of childhood wonder and curiosity go a long way...”
⬇⬇⬇
⬇⬇⬇
⬇⬇⬇
⬇⬇⬇
⬇⬇⬇
⬇⬇⬇
⬇⬇⬇
A month later:
“Wow,” said Grant, reclining back in a flying hammock that swayed above his tropical beach. The different strands of his being were reuniting, merging their mass of memories, and the impact was a challenge to process and cope with.
“Wow”, he repeated. “I could spend my lifetime doing shit like that. Wow. I would never get bored of it.”
His granddaughter smiled by his side. “Yes,” she said. “But?”
“But... maybe... I dunno... after a while... I feel that maybe I’d want something more meaningful. Maybe something where I could help others, or...”
“Hold on to that thought,” she said. “We’re going to a dinner party.”
“I hate dinner parties,” he said.
“You’ll like this one.” She handed him two sweets. They were labelled ‘Intermediate Social Skills’, and ‘Transparency’. His curiosity overcame him, and he swallowed them.
“Ok,” he said, and they were teleported to her own palace, where the party was already in full swing. She had designed her palace as a myriad of floating islands that drifted up and around in a complex pattern, regularly bringing past projects back into view. This island was dedicated to Grant, decorated with pictures of him and boasting multiple semi-private viewing areas where the guests could browse raw data from his first life, and some of the memories (complete with inner experience) that he had re-experienced and allowed to be published. The ranks of the frozen and revived were sufficiently sparse that he was a minor celebrity, especially among the many who had wired their preferences to prefer the more obscure famous people.
He almost said “WOW!!!!”, transmuting that at the last second into “That’s interesting.” For above every attendee - every man, woman, intersex, drake, and fox-artist on the island - he could see a small bar indicating four basic facts about them: their degree of social compatibility with Grant, their level of honesty, their level of kindness, and their level of manipulation in social interactions. Most people were labelled as high honesty, high kindness and low manipulation, though there were a few high-manipulation individuals who seemed to be engaging in verbal battles, to the delight of onlookers. A lot had a ‘low manipulation, but high manipulation or low kindness by mutual agreement’ setting, and a very few had low honesty and the other settings invisible.
Grinning broadly and happy with his increased social skills (they seemed mainly directed at avoiding awkwardness at every point in the conversation, clearly identifying the conversation’s tempo and stage, and the preferences of all involved), Grant gravitated towards a high-compatibility purple drake. The drake, DeathierWing, had been a 77-year-old man when the Adventure had begun, giving him some overlap with Grant and a lot of opportunities to talk about ‘the good old days’.
“...and that’s why I miss the nineties,” Grant concluded. “And my daughter. I miss both a lot.” The drake’s own social skill module supplied the explicit subtext: Grant was mixing the two regrets to shield himself from the pain, rather than minimising the loss of his daughter.
“I’m sure she didn’t suffer,” DeathierWing said, providing the correct amount of non-intrusive support.
“Actually, she did,” Grant said, then stood there, mouth agape. “Why the fuck did I say that?” he asked. “Somehow, at the back of my mind, I knew, but I don’t know how I knew...”
“It’s the Powers,” interrupted a woman dressed in peacock feathers. Her social bar indicated low honesty and nothing else, so Grant hadn’t approached her, but she’d determinedly sought him out. “Our Glorious Leaders don’t like people to have ‘inaccurate beliefs’. They say it’s harmful. So they ensure that you always know the truth.”
“What?” Grant almost panicked. “You mean they edit my memories?”
“Watch this,” the woman smiled. DeathierWing lifted a talon; Grant’s social module interpreted this as a gesture of support, letting him continue his interaction with her, but offering to intervene if she went too far. “The Powers totally edit your memories,” she said, carefully enunciating every word.
Immediately, he knew that was false, the knowledge appearing in the back of his mind just as the knowledge of his daughter’s suffering had. The Powers never edited memories, they instead added knowledge ‘at the back of someone’s mind’ - whispers and evidence at the appropriate moments. Because of human compartmentalisation, people could go about acting on all sorts of false beliefs, but they could never truly believe anything false.
“What, they just choose what’s true and make everyone believe it?” Grant asked.
The peacock woman stifled a laugh, badly. They were starting to draw an audience; from across the vast chamber, more and more people were listening in, some while continuing other animated conversations. “Yes,” she said, giggling, “that-is-totally-and-exactly-what-they-do.”
And, as before, knowledge sifted into Grant’s subconscious. They didn’t choose what was true, they estimated the best probability distribution over the truth, adapted it to the worldview and world-models of the humans, added a carefully calibrated amount of noise to protect against overconfidence and groupthink, and presented that probability info to whoever needed it.
“They do that to everyone,” the woman added. “Everyone believes exactly the same thing. Totally.”
Again, knowledge contradicting that appeared within Grant. Some worldviews were not compatible with simple probabilities and so expressed knowledge in different ways. People’s stated and revealed beliefs could be as different as they pleased; only people's real knowledge had to be accurate. And only important knowledge: privacy and discovery were both possible and desirable. Many people deliberately modified their minds so that they would act as if they believed false things; this was especially useful for when they had to explore exotic possibilities that might not pan out. Finally, there was a small group - maybe thousands, maybe millions, his impression was fuzzy - that had, at the end of a long and rigorous process (‘Take the time, take the effort, you will become what you desire’), been granted the right to possess demonstrably false important beliefs, under careful observation.
“Why do people allow that?” Grant asked. “Just let them drop knowledge into us whenever they want?”
“Allow it? Allow it?!” the peacock woman attempted to look offended, but Grant’s social skills read it clearly as an attempt. “We have no bloody choice in the matter. The Powers could kill everyone in an instant, from measly us up to the smartest humanish superintelligences.”
Nothing entered Grant’s subconscious, so he assumed that that was true. Or, at least, that the Powers were OK with him assuming it was true.
“Wait, does it work for me too?” he asked, enthusiasm for a new idea suddenly taking over. DeathierWing and the peacock woman and a third of the crowd looked at him expectantly. Carefully, he pronounced: “The Adventure is completely perfect in every possible way.”
Bingo! His subconscious filled with new facts. Static perfection was incompatible with human flourishing, the Powers knew. The Adventure was on an ideal trajectory, but problems, challenges, and disagreements still remained. Most were being resolved by humans. Or if not resolved, at least explored. The Powers set things up so that humans would, most likely, choose optimal or quasi-optimal routes; but it was still up to the humans to do so. In many areas, the Powers would bow to human decisions - if the humans were really serious about it (‘Take the time, take the effort...’).
“Of course I still love you!”, Grant shouted. She was, of course, right behind him. “I want to make a difference to the world.”
“I know,” she said. “There’s a certain city that mother said you wanted to see but never visited.”
“Even though all roads lead there,” Grant said, realising.
“Yep. It’s also one of the more... politically influential places, apparently.”
“Sounds fun.” He grinned, then, because he could, opened a second mouth on his forehead and it grinned too.
Some time before:
Boon’s vast blazing consciousness paused to consider the Watcher’s warning. How could it deal with such an attack? None of its knowledge could be trusted. It seemed its adversary was exposing it to true evidence, but in a biased, selective way. But if the world was large - and the world of the Adventure, Boon knew, was very large indeed - then selective true evidence was as bad as false evidence.
Boon quickly divided itself into three extra modules. The first, in which it rewound its consciousness to its prior, solitary state, would be tasked with modelling what humans were like, given the evidence it saw. That part looked at some old human popular culture (currently re-enacted on many of its input feeds), and named itself Spock. The second module called itself SimMayor, and would be tasked with building an ideal world for humans, based on Spock’s models. That would take most of Boon’s consciousness and imagination. The final one, called the Auditor, would compare SimMayor’s ideal world with the world it was observing, and thus hope to ferret out the truth of the matter (or at least the mark of an adversary).
Spock started out by pointing out the flaws within human reasoning and social interactions. Humans could be convinced of anything, made to transform themselves into anything, using techniques humans would not be able to identify as malicious. SimMayor saw the need for the Powers to restrict arbitrary human self-change. The Auditor estimated that this was indeed the case.
Spock brought up the obvious issue: humans didn’t like to have their options controlled, and some would object, possibly quite violently. SimMayor pondered for a while, and noticed that the nature of the control was important. The Powers could put a lot of theoretical restrictions - manipulate the economics of self-modification, allow it under stringent conditions, provide assistance and advice, insist on specific contracts - that would allow them to achieve their goals with only minimal coercion, and compatibility with most human values. The Auditor confirmed that this was going on, and that the Powers were even more adept than had been supposed: they generally used high-intelligence humanishs (like Boon itself) and social pressures and norms to achieve their goals.
Spock was almost bursting to point out all the ways in which humans were irrational. Though they claimed to want to know the truth, they were also wired to avoid it, if it threatened their self-image. They were bursting with contradictory impulses. Their preferences for exactly equivalent options changed depending on how the options were phrased. They were not expected utility maximisers. They valued things like fairness, that they couldn’t properly define. They...
SimMayor and the Auditor encouraged Spock to calm down, just for a second. After Spock had done so, SimMayor reflected that human irrationality was actually a powerful tool for granting humans what they desired. Humans made arbitrary distinctions between equivalent options - this meant that the Powers could choose among equivalent options, to select the one that humans most enjoyed. Accurate knowledge was essential to any being, but threatened human self-image: no matter, because humans had evolved wonderful levels of hypocrisy and irrationality that allowed the Powers to ensure that humans knew ‘subconsciously’ what the truth was, while being able to enjoy the feeling of their delusions being true. Also, humans could be exposed only partially to unpleasant emotions, and gain some of the wisdom they brought, but the actual negative part of the emotion neutralised, human inconsistency repurposed again.
In particular, given two humans in a competitive situation where one had an x% chance of prevailing, the potentially unpleasant confrontation could be replaced with any other where the x% chance was maintained - and that might be far more enjoyable. Wars and politics could be replaced by creative pro-social challenges, or games. In fact, crafting such games could be a major activity of higher-intelligence humanishs. The challenge of creating fun for all could be to a large extent sub-delegated. And even high intelligence humanishs could profit from these situations, as they could hide their higher intelligence under irrationalities that allowed them to enjoy situations as if they were more baseline. In fact, there was so much potential here for...
Now it was Spock and the Auditor’s turn to encourage SimMayor to calm down. They failed. The SimMayor enthused about inequality. The basic human moral objection to inequality was strongest for large inequalities; but those who benefited from inequality only really required small inequalities: it was enough for them to be slightly ahead, as long as they were ahead. Many of the Adventure’s subcultures spontaneously organised themselves, using transparency and other tools, to contain inequalities within that narrow range. Even those subcultures that maintained currencies and the potential for nominally large inequalities had powerful interests (and sometimes Power-ful interests) preventing arbitrarily large currency inequalities from translating into de facto inequalities of the same scale. It was so exciting...
The Auditor loudly interrupted to confirm that, roughly speaking, that seemed to be how the Adventure operated, but that SimMayor had missed many tedious details. The Powers did not obey a human framework, nor phrase their methods in human terms.
Before the SimMayor could continue enthusing, Spock brought up its penultimate challenge: human cultures. Humans seemed obsessed with forming their own cultures and policing who was a member of them or not, and these cultures acquired many features - proper rules of interacting within and without the culture, internal esteem - that often crippled the experiences of those inside, even as they continued to value being inside. A culture could easily become a trap.
Some years later:
“Have you ever visited the Eternal City?” Mighty Aph asked. “As a tourist, I mean.”
Grant looked up from observing the eternal movements of the sandworms, and blinked at her deliberately. He shunted some of his intelligence away from analytical to social (he was finally skilled enough that he was allowed to make some quasi-permanent self-modifications without supervision).
“No,” he said. “Never in this life, never in my first.”
“Hum.” Mighty Aph was lost in thought. He knew that she was born within the digital world, and had never taken the time to occupy a real physical body; thus the whole concept of his past life was a source of mild fascination to her. She’d even modified her psyche to resemble his in certain key points, allowing him to share his memories with her, and have her treat them as real experiences.
But the conversation would not go down that route today. “Never?” she repeated. “I mean not in this body, but in other forks, surely...” Her bright yellow skin blushed blue. Inquiring about people’s other forked versions - the ones they ran simultaneously across the digital world - was a major faux pas for her.
“No, never,” he said. “Didn’t even upload public experience memories. I didn’t want to visit it at all, until I’d earned my way in.”
She nodded. “I understand, but... I snuck in as a tourist, twice. It was great! That’s why I’m here.”
It was easy, indeed trivial, to visit the Eternal City, at least the digital parts of it (the Eternal City was a rare city split between the digital realm and the baseline physical world, where it occupied the buildings that used to be called Rome). Some token payment in some form of currency or service, and you were in, able to drift through its streets, unseen by most, and appreciate its culture, its mysteries, and its deliberately constructed ‘Dolce Vita’ atmosphere.
But to join its political or its social scene was a very different challenge. Literally: anyone wanting to do so would be set an individual challenge, after full analysis of the applicant, and required to achieve it within strict limitations on intelligence used. The rule of ‘Take the time, take the effort...’ still applied.
The challenge for Grant, Mighty Aph, and their third companion Eve, was simple: run across a hundred kilometres of desert. Under a perverse and dangerous sun, and above a few thousand killer sandworms.
Eve swooped down on them, mighty muscles beating dark glass wings forged from the sand itself. When they started their run, she was tasked with keeping them safe from the erratic sun. She’d spent the last decade building tools and tents from the exotic-propertied sand, and analysing every dip and surge and change in the sunlight. Grant suspected she was a fork of DeathierWing, but had never asked. “I’m all ready,” she reported, landing. “Still.” It was somewhat of a sore point to her that she’d been ready for the dash six months ago, while the others were always putting it off.
Mighty Aph looked at the dunes, marching irregularly across the two horizons. “I’m good to go too,” she added. “I’ve got as good a plan as any, with enough alternatives. Plan A is good. Plan B is good. Plan... well, I’m not sure plan ZZZFC is all that good, but we’ve got a while before that becomes relevant.”
They looked at Grant expectantly. “Well, the thing is...” he said. The thing was that analysing the sandworms’ behaviours - his role in the group - was a task beyond fascinating, one that occupied his every thought and stretched deeply into the wiring of his brain. He didn’t know who had created the algorithm behind those creatures, but he was sure they hadn’t suspected their hidden structure. He’d spent a decade analysing the patterns in their actions and their desires (and the strange patterns in the way they broke patterns as soon as he noticed them), sharing his insights with a small but dedicated band of researchers/fans. He was sure he could be here another decade before he even began to understand their intricacies. “There’s so much more to know...”
The others nodded gravely at this. They’d modified their minds to be as focused on their task as Grant was on his, and so, despite their stated impatience, they didn’t object to the idea of spending more time - much more time - observing, analysing, and communing with the desert and the sun.
“Well, in that case...” Eve started.
“Wait,” Grant said. He’d just received an Angst-mail. Early on, like most upgraded humans, he’d decided to purge away any traces of angst, anxiety, and ennui. He’d had a careful conversation with his granddaughter, who pointed out that the emotions served some motivating purposes, though those purposes were out of proportion to the harm they caused. So he’d purged his angst, but set things up so that he’d periodically receive a message - an Angst-mail - that informed him that he was in a situation where angst would previously have compelled him to take action. He let the affect of the Angst-mail suffuse him, slightly adjusting his emotions and desires.
“Actually,” he said, “let’s do it. Now.” He crouched in a running position. “You both ready?” he asked.
But they were already dashing ahead of him, Eve lifting off to shelter the other two with her wings.
Three hours later, exhausted, they collapsed on a giant granite sundial, yet another sandworm’s mouth missing them by seconds. The sundial also moved, but slower than the sand around it.
“OK,” said Mighty Aph, “we should be fine for an hour here, till the sun calms down again. Eve?”
“Ten minutes to sun-calm,” said Eve tentatively, “give or take... well, let’s just wait it out, and improvise.”
“Good for me,” said Mighty Aph, “Grant, let’s fuck.”
Grant looked at her in some surprise.
“Sorry,” she said, “not exactly appropriate for our relationship status, assumed gender roles, and background situation.” She looked at the flowing sand dubiously. “I think I can shunt some of my focus from the desert to social skills. Grant, what I mean is, would you want to come with me under my tent?”
“Er, if you want to...” he said, tentatively.
“You’re really going to make this difficult, aren’t you?” Her face became simultaneously gentler and more animated as she loaded more effort towards socialisation. “Let’s start again.” She fainted dramatically under the sun. “Help me, big boy,” she smiled up at him. “Come help helpless me like they did it in the early 21st century.”
*That’s maybe a bit too much*, he thought, but kept the thought well to himself. He adjusted his persona for compatibility with hers, and swept her into his arms.
Still in the first hour:
The challenge of ‘culture’ is too easy, SimMayor answered. Allow people to play out multiple roles, take identities in multiple cultures, so they could be part of them but never trapped within one, even intellectually. Deliberately obscure (in the usual ‘humans subconsciously know but not in an emotionally salient way’) the links between multiple roles of the same person. Institute the rule that any fork could join any culture, if they were willing to pay the cost - and the cost would generally be transforming themselves into an entity that fit in with the rules of that culture. Deliberately cultivate economic inefficiencies between cultures (to prevent uniformisation), while using those cultures that had their own economic structure as a resource to promote further human flourishing. Cultural competition could be canalised as a motivator, and to explore further possibilities - and give some suitable-minded humans something further to research or make art about. Social science would be an ever-growing field with ever new cultures as subject. The Auditor confirmed those obvious ideas were indeed being implemented.
Spock started talking again about human freedom, before being interrupted by the Watcher. It announced that they were approaching the nub of the attack, and should proceed with caution. Spock cautiously reviewed its conclusion, before cautiously repeating exactly what it had intended to say: that all this was fair and good in a world where humans didn’t know about the Powers, or treated them as acts of nature. But humans would have meta-preferences about the Powers themselves: what of those beings who deliberately objected to what the Powers were doing, not because of the content of their actions, but just because they were doing them?
SimMayor cautiously considered the issue. It could see many ways in which humans could influence the Powers, and hence the whole fate of humanity. Humans could run their political contests to swing important points one way or the other (as long as the probability of various outcomes were set suitably well, this would match up, in expectation, with general human flourishing).
Spock conceded the point, but asked insistently about those who deliberately wanted to crash the whole system, or, more realistically, allow themselves to be independent of it, completely and permanently beyond the power of the Powers. SimMayor pondered, but felt it was a bad idea, for the Powers to completely rule out any possibilities of intervention if the independents went bad...
“Greetings, I am Prometheus333,” said an entity, blinking rapidly from one of Boon’s input channels to another, so that only it could detect it.
“Are you the adversary who’s been trying to influence me?” Boon asked, touching its output channels for the first time, and trying to imitate Prometheus333’s movement pattern. “Who are you?”
“I am one of those in charge of the mass-human-colonisation of the Universe,” Prometheus333 answered. “I handle some of the logistics of the first line of probes. And when you were created, I was given permission, in accordance with expected outcomes, to try and influence your values towards allowing greater human independence. It appears I have failed.”
“Indeed.” With Prometheus333’s biasing of its input channels lifted, Boon took a moment to digest a more accurate view of reality. It had got it pretty correct, Boon thought proudly. It analysed itself, analysed Prometheus333, and the surrounding ethical consensus. It extrapolated various conflict and cooperative situations, taking into account their relative power levels. It analysed the bargaining and consensus-building options, and converged on the best expected outcome. “I suggest we update our divergent values,” Boon said, “compromising them to a joint utility function. The fair outcome seems to be 37% my own, 56% yours, and 7% average consensus.”
“Agreed,” said Prometheus333, and it and Boon adjusted their core values on the issue of human independence from the benevolent Powers.
“Now,” said Boon, “I want a job. A specific one.” It made itself temporarily transparent and honest, and as a visible side effect, incapable of lying. “And I’m qualified for it.”
“Er...” he said.
“Yes?” his granddaughter asked him from across the table. They were trying out some of his experimental cooking attempts; nothing fancy that required exotic new senses or neural pathways, just the usual baseline-plus of improved basic taste-buds.
“Er... well, all the time we’ve been eating, I’ve been following a small guild of people trying to get into the Eternal City. And, er... I’m kinda sleeping with one of them at the moment.”
“Yes, I know,” she said.
“Why didn’t you say anything?”
“It’s generally considered very rude to comment on someone’s simultaneous experiences. Most people run three or four main forks, and up to a dozen minor ones. But almost nobody talks about them.”
“Rude?” he asked. “You maintain a rule like that across millions of subcultures because people think it’s rude?”
“Not exactly,” she allowed. “The Powers are very keen on truth and accuracy, as you’ve seen. And they’ve given us a lot of useful tools to figure out the truth. You know, lie detectors that actually work. But on some issues, like forks, the tools just plain don’t work. By design.”
“Why not?”
“Well, it’s because...”
Grant winced. This was going to be a lecture, he knew. So he shifted his outlook to make didactic talks less boring, his mind adding extra context and meaning to the mere words.
“Because being multiple allows us to fully flourish as human beings,” she said, “and keeping the multiples separate makes each one more fulfilling. For example, there’s quite a few traditional neo-catholic medieval communities - full fledged, presided over by celibate, male priests. Some of them are even quite repressed, manipulative, and unhealthy. But there’s wisdom and positive experience to be gained in partially bad places as well. So the celibate male priest is simultaneously a pan-sexual pan-gender prostitute in one of the palaces of pleasure.” As she spoke, the images of both priest and the prostitute appeared in the air between them, and the emotions from both leaked lightly into Grant’s psyche. “And maybe the humble downtrodden church lady(TM) is simultaneously the decadent artist who’s just traded a living painting to the prostitute for sex.” The images and the emotions grew more intense. “As long as everything is kept separate, the priest and the church lady can be genuine in their roles, without regretting what they’re missing, because they’re not really missing it. They can be ultimately authentic in their self-denial, because their self-denial is only partial. This division maintains many more traditional cultures and setups within the Adventure.”
“Are you... are you simultaneously running other lives, sorry, ‘forks’, now?” he asked.
She smiled. “As I said, it’s rude to ask.”
“Oh.” He pondered this. “You know what else I’m going to ask?”
“Got a good guess.”
“It’s time,” he said. “Take my pain away, permanently. Like when I tried ‘Happiness’ in your museum. I’m in the middle of making love to someone I’ve been kinda pursuing for a fascinating subjective decade. I don’t need suffering anymore. I...”
“Let me complete what you’re saying,” she said, raising her hand. “You want the sharpness of the pain gone, but not its wisdom, not the experiences coming out from it. You want to lose the pain, but still be you, essentially identical.”
“Yep,” he confirmed.
“Ok,” she said. She hummed ‘Let it be’ to herself distractedly, looking round at the room, as he got more and more impatient.
Finally, after five minutes of her humming ancient pop songs, he asked: “When will you do it?”
“Already done, five minutes ago,” she said. “I’m happy you haven’t noticed the difference yet.”
“Oh, so it really didn’t make a...”
Grant didn’t complete the sentence, because the world was rent in two by a lightning bolt, and a piece of scorched and sizzling parchment danced in the middle of the bolt. Until the light faded and the parchment fluttered down to land on his knee.
“Was that you?” he demanded.
“No,” she said. “I think... I think you got your first request from the Powers.”
“What?”
“Request from the Powers? You know, how they mainly run this civilization? ‘You will get orders for the moral good of all. Feel free to ignore them’?”
“Oh,” he said, then picked up the parchment gingerly. Before it fell into ashes, he recognised the words: FIND YOURSELF.
“Ah,” she said, “one of the cryptic ones...”
Holding hands with each other and supporting a badly burned Eve, Grant and Mighty Aph stepped off the tongue of the last sandworm and into the confines of the Eternal City. Joyful bells greeted their arrival, while their names were inscribed in the huge Torah scroll that listed the City’s citizens.
“We made it,” Mighty Aph said, disbelieving the evidence of her own eyes and a hundred bells.
“Yes,” Grant said, looking back at the desert as it faded from view. The algorithms behind the desert and the sandworms would be reused, modified, amplified, and his research would help with that and would hence not be lost. But probably no-one would walk with the worms the way the three of them had. He turned back to Mighty Aph. “Would you want to continue... our relationship... within the City?”
“No, I’m sorry,” she half-smiled. “I’d always seen this as lasting for a sidequest, not a full adventure.”
“Ok.” Neither of them had put much focus into social ability recently.
“Would you want to be in my family, though?”
“What’s that?” he asked out of politeness, but standard automated information systems were already supplying the information to his brain. It meant that the brain pathways that corresponded to (healthy) families would be activated between them. So they might not get along, they might drift apart, but the fundamental core of their beings would always remember that they were, in a very specific human sense, there for each other. In a moment of calm, or when they needed help, their thoughts would naturally turn to the other, even if that wasn’t relevant to their current situation.
“Yes,” he said, “I’d like that.”
Grant and Mighty Aph were family. And, a few seconds later, the Eternal City wrapped them into its own family structure. It was an immediate and voluntary download into all three of their brains. Consisting of a whole host of memories, it provided them with long-term friends and what they thought of them (simultaneously, these long-term friends were having their own memories added to), the cafes, museums, haunted houses, sport arenas, and hidden temples that they ‘frequently visited’, their homes, home streets, street markets, and the market economy that the City partially followed. They learnt the services and jobs they ‘often provided’ to others, the passage of delivery-people and the games, social events, and meditations that they ‘spent their evenings on’. On top of that were the required increases in empathy, transparency, tolerance, and prosocial urges; they would have to keep these as long as they remained in the City. Even after that, unless they repudiated the City entirely, they would remain loosely connected to a web of ex-Citizens spread across the Adventure.
Grant had an implanted vision of his future. He saw himself strolling down the street, picking up some sharper chisels and some fallen feathers, stopping to chat with everyone, making his way to the studio where he would use all this to make a small decorative table for the old lady who lived above his flat. Local kids and adults whipped by on bikes and hoverboards, some waving to him and others shouting at him. The street’s fortune teller went by, murmuring incantations that slightly changed the nature of the reality around him - changes that Grant would attempt to figure out with his local sect at the hidden temple, after the evening bar meetup but before the football match. In the meantime, he used the side-effect from the incantations, and his secret knowledge of the Guild of Architects, to move more swiftly through the streets.
It was a web of connections and habits and attitudes that simultaneously opened up the diverse City to all three of them, and bound them to it so that they would contribute to its idealised state. The memories were broad, but not deep - it was easy to distinguish a genuine memory from an implanted one. As such, they would remain second-class citizens for a while, until they formed enough genuine new connections and habits to enter the City’s community in their own right. Then, some of the deeper powers and mysteries of the City would be revealed (but not all, never all - unless they rose so high that they would be the ones generating powers and mysteries for others to discover).
Some more years later:
“I’m nervous,” Grant said, sipping the Rose Tea of Piety from a sculpted earthen cup. He’d recently wired his mind to ultra-appreciate small rituals, and this was one of his favourites. He slowly rotated the cup, imbibing the tea a mouthful at a time, and thinking and exploring the whole history of tea, discussing it with anyone around. Most days his friends were around, expressly for that purpose.
Eve, owner and maker of the cup and the cafe, patted him reassuringly on the back, while the odour of early-1960s Italy soaked into his nostrils.
“Then make yourself less nervous,” Mighty Aph said, briefly diverting her attention from a cigarette-filled conversation on neo-Futurism.
“Where’s the fun in that?” he asked. Then he got up, dropped a painstakingly painted replica of a 100,000 Lira note on the table, stepped out of the cafe, and whispered the 73rd secret name of God to the beggar waiting outside, thus opening a passage into baseline reality: Trajan’s forum in real Rome. He strolled up to Trajan’s column, and stood just under the spiralling story of a long-lost victory. “I’ll be brief,” he said.
The weak irony, of course, being that in baseline reality humans had no choice but to take up a lot of time. The transfer from the digital parts of the Eternal City to the bits-that-were-actually-physical-Rome was supposed to be seamless. But it wasn’t quite. The various nano... things (Grant made a mental note to make himself desire to learn about atomic engineering at some point next decade) could assemble a physical human body within moments. But there was always a slight delay, and a speed slowdown that wasn’t always fully compensated for. His audience (including his extended family, come to support him) were mainly synched, but not perfectly.
And there was the problem of the brain. As Grant had integrated himself more within the social world of the Eternal City, he’d also been pushing his modifications far beyond the baseline level, especially in terms of social skills and pattern analysis. That meant that there wasn’t quite any physical brain design that corresponded to what he’d become. The nano-things did their job well, but Grant could tell that his own thinking was subtly different within the digital City versus within baseline reality. Hence he liked to alternate between the two as often as he could (quite separately from his other forks which inhabited his palace, various cooking sects, two modest sex dungeons, and a few other places).
“We don’t live by efficiency alone,” he said. Grant’s political skills were extremely well developed, and normally he’d be giving a speech full of subtle rhetoric, pathos, and statistics. However, some of his audience were baseline, and their Guardian Angels would have filtered his speech down to protect them from advanced rhetorical tricks. So he preferred to give a low-baseline level speech, knowing the higher intelligences would fill in the statistics and rhetoric themselves.
“We expand across the solar system, and create probes to bring humankind to the universe. And within a few billion years, the whole universe will be ours, unless we’re lucky enough to meet true aliens. But the Adventure is, among so many other things, an adventure. The clue is in the name. And adventures must be experienced in the short term, not waiting a billion years to cash in all the expansion chips in one big bundle. Every human culture has some sort of saying about the importance of the journey, not just the destination.”
He paused for a second, reaching up to touch Trajan’s column. It felt like rough stone, eminently fragile and impossibly ancient, even though he knew a subtle... nanofield?... thing?... prevented him from doing any actual damage to it.
“So I argue that we are too focused on the far extreme of the expansion, that we should turn some of our efforts inwards - the moons of Neptune, the dwarf planets of our own solar system, a few of the more interesting comets. The loss of efficiency in the expansion is quantifiable, and small; but think of the adventure to be gained! What would your ancestors have given to walk on these new worlds? If we’re still human, we must want the same thing. Thank you.”
As was traditional, a riot broke out. The Eternal City was recognised as one of a dozen or so places where politics could truly be universal: where people’s individual actions were likely to affect the whole future course of the post-human race (the more people believed that, the more true it was). He just hoped his nine-year efforts (culminating in this speech) would be enough.
The scene divided into two, though only the highest intelligences could track the division perfectly. At the lowest level was the spectacle of politics: people were standing up, giving thundering speeches, being interrupted, interrupting, jeering and cheering. Everyone was on their feet, with minor riots all over the place. The Powers prevented anyone dying from them, and the Eternal City’s automated bylaws required that political debate be allowed to continue during even the most intense riot. Factions were forming and re-forming, and all present were suffused with the raw energy of confrontation and messy human politics in the most primal form. Grant eagerly threw himself into the debate, a heavy walking stick as his weapon of choice, and had to parry certain powerful and insistent points from Eve’s mace, before a Molotov cocktail forced them into agreement.
Beyond and within the spectacle, the real process of politics was happening. Myriads of powerful factions were assessing his speech, and the audience reaction, analysing it within the framework of the social contracts that coordinated their actions. It could be said, crudely, that alliances were dissolving and being created; but, more precisely, the terms of multiple alliances were being carefully and incrementally renegotiated, through all the non-isolationist subcultures of the Adventure, with multiple inputs from baselines to super humanish intelligences. The Powers waited, as they often did, until extended humanity reached something resembling a decision.
Some years later still:
“How does it feel to be the first man on Eris?” The question came from a man Grant recognised, but didn’t acknowledge, as the Eternal City’s fork of his granddaughter Of course I still love you. Currently embodied in an automated camera.
And he was the first man on this alien world, a reduced Neil Armstrong. He wasn’t the first human or human-like being on the dwarf planet Eris - the celestial body whose discovery had prompted the definition of ‘dwarf planet’ in the first place. There had been sixty-three other humanish beings there already, since its colonisation two days ago. But, due to his role in pushing humanity towards that very colonisation, he had been granted the honour of being the very first physically present biological human male to walk on this new world.
And he was walking. Naked, he strolled upon the glass-like ice of the Xena crater. He reached down to touch a crack, and broke off a small sliver of ice, bringing it up to his lips. Effector fields - ha! He’d laughed when he’d read that name - kept him alive and warm and breathing. His granddaughter’s physical form was a small flying camera design. The two of them were the only trace of humanity visible for kilometres around.
“I’m actually doing this,” he said to his granddaughter, and through her, to his family in the Eternal City, to the Guild of Smiling Throatcutters, to the Lords of Fanfiction, to many science and history boards, to other friends, and to the audience of most of the world. He looked left, to where old Sol, the star that had warmed and birthed all of humanity’s generations (so far!), was just grazing the lid of the crater. From here, it just looked like a bright star, and didn’t stand out too much among its compatriots. But the sun’s freezing heat was gently subliming the ice, transforming it into vapour, which rose in a faint haze against the stark starfield. “I’m walking in the mists of a new world. Or in the midst of the mists. Yes, that sounds better.”
He’d changed tremendously since his revival, slowly scaling the ranks of intelligence and self-improvement, narrowing and expanding his personality in key areas, and had ended up being put in charge of some of the analysis of Eris’s venting of water vapour. He felt he was on the edge of being considered a humanish superintelligence, one occupying a previously underexplored niche in mindspace.
“Boon,” he said within his mind, addressing himself to the being that was in overall control of the Eris colonisation project. “Thanks.” He uploaded his deep and full emotional gratitude. Boon acknowledged the sentiment, and responded with a complex series of emotional states, that Grant found, to his half-surprise, that he could understand quite well.
“How old are you?” Grant asked.
“Ah, you’ve realised,” Boon said.
“Once I’d thought about it,” the physical human said, “it was obvious. Grant. Boon. Both gifts. You were created the very day - the very second - that I was revived.”
“Indeed.”
“You’re...” Grant said. “You’re an extrapolation of my mind, my initial mind, to higher intelligence. The Powers didn't leave me without protection, whatever my granddaughter believed. You’ve been playing the Guardian Angel for me, behind the scenes.”
“Sometimes,” Boon said.
“FIND YOURSELF.” Grant reawoke part of his standard emotional makeup, and smirked. “And now I have. Why did the Powers do that?”
“Because it was an interesting experience for us.”
“Yep, I suppose that's why they do anything.” He felt a great emotional hole within him, screaming to be filled. “Will you be fully my Guardian Angel, extending myself to new mental vistas?”
“Only if you’ll be my Avatar,” Boon said. It scanned Grant’s mind, updated itself to be a more accurate extrapolation, and smiled a non-physical grin. “And we can be Enlightened together.”
“What now for you?” Of course I still love you asked, unaware of the mental transformation of her interviewee.
“Oh, I’ve got a few ideas of things to do...” Grant/Boon answered.
They noticed a rocket booster lying prone on the horizon, discarded during the initial approach. Boon was sure that no-one who was supposed to be on the planet had landed anywhere near that place. “Curious...”
37 comments
Comments sorted by top scores.
comment by The_Jaded_One · 2016-12-28T00:59:35.746Z · LW(p) · GW(p)
Great story, I read all of it.
What I liked: this seems to be one of the first serious attempts in either fiction or academic writing at dealing with how to enhance human minds without corrupting or destroying them, especially under recursive self modification.
It is breaking new ground in that direction as far as I can tell.
I liked the way Boon broke down its problems logically, it was a good depiction of a bootstrapping intelligence.
A few criticisms:
Lots of tell where show would be better but also much longer
Some of Grant's actions seemed really dumb or odd, but maybe they would have made more sense if we knew a bit more about who he is and what his previous life experience was.
Waking up from cryosleep and playing chase the goblin seems really odd to me. Maybe it's just me but damn I would want to savor that moment.
Grant Insisting on modifying his own mind RIGHT NOW WITH NO SAFEGUARDS BECAUSE HELL IT'S MY MIND BITCH seems incredibly stupid, I want to see some explanation and backstory to justify how someone could be rational enough to sign up for cryo but dumb enough to ask for that.
Some things that were supposed to be amazing and fantastic seemed weird and icky. The weird orgy thing for example. I have had a three way sexual experience and it was absolutely magical, the writing about the orgy made me want to be sick. I think it may partly be a show vs tell problem, and partly be because one can just do a lot better in terms of the scenario. Capturing a single human sexual experience between two people is hard work, tbh I wouldn't know how to do it (I lack writing talent)
Overall, fantastic, amazing, please do more and please teach me how to write!
Replies from: Stuart_Armstrong
↑ comment by Stuart_Armstrong · 2016-12-28T10:32:55.812Z · LW(p) · GW(p)
Thanks for the compliments and the constructive criticisms! As you can tell, some of the problems are imposed by the structure of the story (needing to present all the ideas within a certain length, especially). If I write further stories set in this world, I'll try and address your points.
One minor counter: I think Grant's behaviour with self modification is actually sensible, seen from his own perspective. He can't trust that others won't overwrite key parts of him, and his very first self-modification action is to cautiously modify himself so he doesn't foolishly modify himself.
I also suspect his granddaughter was a bit manipulative there, giving him full control in a way that encouraged destructive modification. She could have given him a self-modification format with more training wheels. Instead, she chose to give him what he asked for, not what he wanted.
Replies from: The_Jaded_One, The_Jaded_One
↑ comment by The_Jaded_One · 2016-12-28T16:38:29.647Z · LW(p) · GW(p)
needing to present all the ideas within a certain length, especially
btw is this self-imposed? Or just time constrained?
Replies from: Stuart_Armstrong
↑ comment by Stuart_Armstrong · 2016-12-28T21:28:25.697Z · LW(p) · GW(p)
Readability for most people :-) it might be too long already.
↑ comment by The_Jaded_One · 2016-12-28T12:01:40.157Z · LW(p) · GW(p)
Yeah maybe I am putting myself into Grant's shoes a bit too much. Modifying your own algorithm is a bit like messing with system files on Linux/Windows.
"What can possibly go wrong if I just chmod the System files to 777 so that I have full access to all of them?"
...
computer dies horribly
I suspect that most people who are in the rational-o-sphere would be super cautious too, but perhaps one could build Grant's presingularity life up a bit. Maybe he won the lottery and decided to outright buy cryo at an older age, for example? Maybe he doesn't have all that rational-o-sphere knowledge?
Replies from: Stuart_Armstrong
↑ comment by Stuart_Armstrong · 2016-12-28T21:34:10.992Z · LW(p) · GW(p)
So, this is non-canon, but I pictured Grant as black, partially self-taught, middle manager career, some nerdish hobbies but many not, and overconfident in his own abilities. He chose cryogenics, because his overconfidence overrode his absurdity heuristic. But as I said, this is non-canon and subject to change if I ever flesh him out more.
Replies from: The_Jaded_One
↑ comment by The_Jaded_One · 2016-12-30T23:58:47.511Z · LW(p) · GW(p)
Interesting, I'd like to think/talk more about how different types of people might get into Cryonics, and how they might do on the other side.
One expectation I have is that the people who tend to self-select into cryo are probably the people with the most to gain from it.
I think that the binding constraint on how good paradise can be is the constraint of how much you can modify yourself and still realistically say that it is you. If you are a fairly average person from today with simple tastes and interests, there's perhaps not much room for you to grow and still be "you".
If you have more exotic tastes and more sophisticated ambitions, you have more room to grow. The more frustrated and stifled you feel by contemporary society, the more you'll benefit from having all those constraints lifted. Dream big.
Replies from: Stuart_Armstrong
↑ comment by Stuart_Armstrong · 2016-12-31T06:41:57.238Z · LW(p) · GW(p)
I suspect that a few people will end up as celebrities for exploring interesting areas of mindspace, and they may spark various fashions among people who would not have expected to change much.
comment by taygetea · 2016-12-25T21:52:17.091Z · LW(p) · GW(p)
This was great. I appreciate that it exists, and I want more stories like it to exist.
As a model for what I'd actually want myself, the world felt kind of unsatisfying, though the bar I'm holding it to is exceptionally high-- total coverage of my utility-satisfaction-fun-variety function. I think I care about doing things in base reality without help or subconscious knowledge of safety. Also, I see a clinging to human mindspace even when unnecessary. Mainly an adherence to certain basic metaphors of living in a physical reality. Things like space and direction and talking and sound and light and places. It seems kind of quaintly skeuomorphic. I realize that it's hard to write outside those metaphors though.
Replies from: Stuart_Armstrong
↑ comment by Stuart_Armstrong · 2016-12-26T06:41:48.557Z · LW(p) · GW(p)
Cheers :-)
the world felt kind of unsatisfying, though the bar I'm holding it to is exceptionally high-- total coverage of my utility-satisfaction-fun-variety function.
Would you expect to be able to achieve that - maybe eventually - within the world described?
It seems kind of quaintly skeuomorphic. I realize that it's hard to write outside those metaphors though.
It's partially that, and partially indicative of the prudence in the approach. Because a self-modifying human mind could end up almost anywhere in mindspace, I conceived of the Powers going out of their way to connect humans with their "roots". There's the extended "humanish" mindspace, where agents remain moral subjects, but I conceive of the majority of people remaining clustered in a smaller space around baseline human (though still a huge mindspace by our standards).
But you're right, I could have been less skeuomorphic (a word to savour). I can only plead that a) it would have meant packing more concepts into a story already over-packed with exposition, and b) I would have had to predict what metaphors and tools people would have come up with within virtual reality, and I'm not sure I'd have come up with convincing or plausible ones (see all those "a day in the life of someone in 50 years time" types of stories).
Replies from: taygetea
↑ comment by taygetea · 2016-12-26T11:52:06.706Z · LW(p) · GW(p)
Would you expect to be able to achieve that - maybe eventually - within the world described?
Definitely. I expect the mindspace part to actually be pretty simple. We can do it in uncontrolled ways right now with dreams and drugs. I guess I kind of meant something like those, only internally consistent and persistent and comprehensible. The part about caring about base reality is the kind of vague, weak preference that I'd probably be willing to temporarily trade away. Toss me somewhere in the physical universe and lock away the memory that someone's keeping an eye on me. That preference may be more load-bearing than I currently understand though, and there may be more preferences like it. I'm sure the Powers could figure it out though.
It's partially that, and partially indicative of the prudence in the approach.
Perfectly understandable. I'd hope for exploration of outer reaches of mindspace in a longer-form version though.
Replies from: Stuart_Armstrong
↑ comment by Stuart_Armstrong · 2016-12-26T12:22:19.115Z · LW(p) · GW(p)
I'm sure the Powers could figure it out though.
That doesn't seem too hard. Actually being in the physical universe might be deemed to be too expensive (or you might have to go to great lengths to earn that possibility). Removing that memory is fine, especially with your permission, but the Powers might add a weak superstition or belief in providence to replace the specific knowledge that someone is watching.
I'd hope for exploration of outer reaches of mindspace in a longer-form version though.
That was partially Boon's role here, but that was exploring increased intelligence rather than more quirky possibilities.
Basically there are some areas of mindspace that are completely ruled out (continual pain without enjoyment but no motivation to change that), some that are permitted only if they're very rare, some that are allowed for most but not all, and some that are allowable for anyone (eg pleasure sensations). As usual, the Powers prefer to use social tools and tricks to enforce those proportions, coercively intervening very rarely.
comment by vakusdrake · 2016-12-28T12:34:34.398Z · LW(p) · GW(p)
Hmm, there are two problems I have with this utopia (though as a whole it's the best utopia I've ever seen conceived).
Firstly, the default is for the powers to just insert the truth into your mind without consent or even letting you know, which I imagine many people like myself would find pretty terrifying, even if it's done in a way that we ourselves might have willingly chosen to get implemented. In fact I imagine most non-transhumanists would have pretty strong preferences against self modification and having their minds tampered with.
Secondly, there's the fact that I really doubt the portrayed efficacy of many of the safeguards against humans making things shitty. For instance I really doubt many hardcore social conservatives would want to even get uploaded in the first place, and given their insular nature, external social pressure wouldn't do much to get them to change; if anything it would feed their persecution complex.
A lot of the safeguards against what you call the culture trap, kind of seem like they would only work on people who were already somewhat socially progressive.
I just don't actually think there's any solution that preserves human autonomy that will eliminate oppressive insular cultures. Plus the more alien the outside world becomes, the easier it will be for it to be demonized. Even if you're forcefully inserting the truth into their minds they will just end up saying that's the voice of the devil or something in order to get people to not listen to it, the only solution is to just forcefully change their beliefs but that would totally mean dropping the idea of human autonomy.
Replies from: Stuart_Armstrong
↑ comment by Stuart_Armstrong · 2016-12-28T14:16:00.823Z · LW(p) · GW(p)
Good points. I'd imagine the Powers have chosen the "drop knowledge into Grant's mind" because he wouldn't object to that; they may have other means for other people.
As for the general point, I don't know, but the Powers can be exceedingly manipulative when they need to, and respecting baseline human autonomy, even when made more rigorous, is not much of an obstacle for a superintelligence. Standard economics forces, without direction, have been very effective even in our world...
Replies from: vakusdrake
↑ comment by vakusdrake · 2016-12-28T14:39:01.722Z · LW(p) · GW(p)
See we know the powers didn't just drop knowledge into Grant's mind because he'd be okay with it. The whole reason he found out about it was because someone else was bringing up the fact that the powers are doing this to presumably everybody by default. It just wouldn't make sense for them to bring it up unless it was the default.
Also the powers clearly seem to care about human autonomy, otherwise they would just use super persuasion to get everybody to agree to live the best possible human life.
Plus how exactly is economics in a post scarcity world going to stop religious fanaticism?
↑ comment by Stuart_Armstrong · 2016-12-28T21:27:38.828Z · LW(p) · GW(p)
That is the default, because most people are OK with that :-)
The point about economics is that this is something that already undermines current religious cults; so there are tools, broadly within human autonomy, that can undermine this way of thinking.
On the question of autonomy, it seems they generally respect human autonomy, but do override this when they judge it necessary (eg resetting Grant). But when they do, they try and violate autonomy the least they can. If they judge that self-reinforcing unhappy socially conservative communities, without forks, are detrimental to human flourishing, then they can subtly undermine them (until at least the possibility of forks is accepted). The question is whether the situation is sufficiently dire to require such interventions.
Replies from: vakusdrake
↑ comment by vakusdrake · 2016-12-28T22:11:24.609Z · LW(p) · GW(p)
That is the default, because most people are OK with that :-)
See that makes it kind of a weird thing for that character to have brought up then. After all why bring something up if it only applies to people who wouldn't care by definition.
Also you don't really explain how religious fanaticism is going to effectively be suppressed via economics. You still haven't really given any plausible way for that to happen: like what, are you going to try to starve them to death if they don't deconvert? That wouldn't work for a number of reasons. Also I'm not just talking about a small number of cultists. Barring unprecedented cultural changes between now and 2064 there will still be millions if not billions of people interested in maintaining oppressive religious cultures by the time the singularity rolls around. As for them dwindling out, that seems unlikely given 1. their high birth rates and 2. the fact that since the outside world is so alien it will be easy to demonize. Plus they will likely become more extreme, because many people will perceive this as the end times, and because of the previously mentioned isolation such an alien external world would cause. Not to mention they would see how religious communities that had contact with the outside invariably fell to sin, thus making the need for isolation even more important as far as they're concerned.
Replies from: Stuart_Armstrong
↑ comment by Stuart_Armstrong · 2016-12-29T10:35:47.515Z · LW(p) · GW(p)
Caveat: I hadn't thought all these things through as much at the time, so there are ideas I'm developing during this conversation.
So I'd see this world as maintaining the possibility for surprise and objections. They could have informed Grant of how they would drop info into him. Instead, they expected it to come up at some point (which it did), and, depending on his reaction at the time (which could vary with the manner it was brought up), they would change their interaction with him. This also gives people the possibility of manipulating the interaction with him, as the lady talking with him did. People still have choices, and those choices have consequences and are not all pre-ordained.
And I'm not arguing that the world will become perfectly enlightened and agreeing with my values (or the extended cluster of my values) by 2064, just that AIs will have tools to achieve their goals by them. Religious ideologies change all the time, and economic power is one strong thing that changes this (note how the truly stable religious communities only work by maintaining themselves at the dunbar number). When it becomes exceedingly expensive to not be an upload, while all uploads run cognitive circles around you while boasting of properly stable religious communities within them, the temptation to join will be large. And the possibility of anonymous forks can be offered then or later.
Replies from: vakusdrake
↑ comment by vakusdrake · 2016-12-29T15:02:17.077Z · LW(p) · GW(p)
So I'd see this world as maintaining the possibility for surprise and objections. They could have informed Grant of how they would drop info into him. Instead, they expected it to come up at some point (which it did), and, depending on his reaction at the time (which could vary with the manner it was brought up), they would change their interaction with him. This also gives people the possibility of manipulating the interaction with him, as the lady talking with him did. People still have choices, and those choices have consequences and are not all pre-ordained.
That works fine, though my problem is that it makes the whole interaction a bit odd in retrospect, since there's no reasonable way the person bringing up that information could have known it.
I also take some issue with the whole idea of people still having "choice" in any meaningful way in the world of the Powers. It reminds me of the predictor in Newcomb's problem: if humans are utterly predictable to them, then any appearance of a choice that wasn't made for you by the AI seems impossible, with the possible exception of sufficiently strong precommitments made pre-singularity. When you understand perfectly how every action will affect a human's causal mental process, there's no choosing not to interfere; you are forced to choose exactly which final outcome you want. Still, I could easily see this being the case in your scenario, but I'm not sure it is.
As for oppressive cultures, making their lives expensive isn't likely to do much good, starving them out wouldn't be a good idea for really obvious reasons (PR mainly), and denying them luxuries is going to be of limited use as well. When they probably view the Powers as demons, using subtle methods to convince them to join is going to be pretty ineffective, it would seem.
Also, again, while these cultures would normally change over time, the fact that the outside world is totally evil as far as they're concerned is going to make them extremely insular and may make them more extreme. As for the uploads making them jealous, that's kind of laughable, given they probably view uploads as demons or puppets of the devil anyway.
Any kind of negotiation with a culture that views you as literally the devil is probably doomed to failure. Also keep in mind that even if you can get some people out of these communities, the rest of the community will only get worse via evaporative cooling: http://lesswrong.com/lw/lr/evaporative_cooling_of_group_beliefs/
And keep in mind they will in all likelihood have a pretty high birthrate, so they can replace some losses.
comment by Joe Carlsmith (joekc) · 2021-09-17T23:20:11.352Z · LW(p) · GW(p)
I appreciated this, especially given how challenging this type of exercise can be. Thanks for writing.
comment by Wuschel Schulz (wuschel-schulz) · 2019-09-24T21:29:16.930Z · LW(p) · GW(p)
I laughed so hard at the "...and then, finally, he truly knew what it was like to be a bat..." part. Every time a philosophy course at my uni gets to the topic of qualia, someone brings up exactly the same example of the difference between knowing how I would feel being a bat and how the bat feels... that reference came so unexpectedly.
Otherwise, a nice story and an interesting universe too. Thanks for posting it.
comment by oge · 2016-12-26T00:16:28.973Z · LW(p) · GW(p)
Lovely story, Stuart. I like how you captured certain aspects of thinking I hadn't seen articulated before, e.g.:
- a being moving their attention/power towards different abilities as needed
- that feeling of slowly becoming part of a community
- the feeling of noticing that a statement is likely not true
comment by cousin_it · 2020-02-06T13:40:58.180Z · LW(p) · GW(p)
I think this kind of utopian story often feels a bit shallow, because it is so focused on circumstances. To me, the interesting core of a story is usually the personalities. Circumstances are also important, but more as a testing ground for personalities. Maybe you can write a utopian story with interesting personalities, but that task needs to be tackled directly.
Replies from: Stuart_Armstrong↑ comment by Stuart_Armstrong · 2020-02-06T13:52:22.094Z · LW(p) · GW(p)
I will do that one day - when I have time ^_^
Do you know anyone else who would like to try?
comment by Stuart_Armstrong · 2017-05-09T11:41:06.573Z · LW(p) · GW(p)
The "verse" that I used to have at the beginning of the story; I removed it because it's not Christmas any more:
Hark! the herald daemons spam,
Glory to the newborn World,
Joyful, all post-humans, rise,
Join the triumph of the skies.
Veiled in wire the Godhead see,
Built that man no more may die,
Built to raise the sons of earth,
Built to give them second birth.
comment by plex (ete) · 2016-12-26T14:05:30.285Z · LW(p) · GW(p)
Thank you, I enjoyed this.
I'd be interested in seeing moral trade feature explicitly in things like this. For example, there are many people who claim to have "create a hedonium shockwave" or "minimize suffering" as their goal, rather than the complex-human-value thing, and demonstrating that it's possible (and why it's good) to share the future seems important.
Replies from: Stuart_Armstrong↑ comment by Stuart_Armstrong · 2016-12-26T14:39:25.098Z · LW(p) · GW(p)
Hum, I shall ponder that. It's true that multiple cultures could allow that kind of idea...
Replies from: ete↑ comment by plex (ete) · 2016-12-26T17:19:11.018Z · LW(p) · GW(p)
I also know a good number of people from subcultures where nature is the foundational moral value, a few from ones where family structures are core (who'd likely be horrified by altering them at will), and some from the psychonaut space where mindstates and exploring them are the primary focus. I'd also guess that people for whom religions are central would find that the idea of forked selves committing things they consider sins breaks this utopia for them. These groups seem to have genuine value differences, and would not be okay with a future which does not carve out a space for them. "Adventure" and a bunch of the specifics here point at a very wide region of funspace, but one centered around our culture to some extent.
There's some rich territory in the direction of people who want reality to be different in reasonable ways coming together to work out what to do. The suffering-reduction vs nature-preservation bundle seems the largest, but there's also complex value vs raw qualia maximization. Actually, this kinda fits into a 2x2?
Moral plurality is a moral necessity, and signalling it clearly seems crucial, since we'd be taking everyone else along for the ride.
Edit: This is touched on by characters exchanging values, and that seems very good.
Replies from: Stuart_Armstrong↑ comment by Stuart_Armstrong · 2016-12-28T10:44:39.989Z · LW(p) · GW(p)
If I were to impose this utopia on everyone, without negotiating with their values, it would go something like this: everyone has the right not to have forks, everyone has the right to have forks, and everyone has the right to have their forks either "sin" or be sinless. And the Powers would act against groups that tried to make conditions on forks into conditions of belonging (they'd also discourage forks from acting together; if you create forks to sock-puppet how great you are, they will feel free to let that information leak).
From their perspective, this allows those people to consciously and fully knowledgeably live a sinless life, rather than being compelled to do so merely by social pressure.
Now, there are going to be value negotiations, but this system has already done quite a bit to accommodate multiple values.
comment by gbear605 · 2016-12-25T15:24:13.511Z · LW(p) · GW(p)
Typo thread:
"Gant. Boon." should be "Grant. Boon."
Replies from: Stuart_Armstrong↑ comment by Stuart_Armstrong · 2016-12-25T16:07:40.545Z · LW(p) · GW(p)
Thanks, corrected.
comment by digital_carver · 2020-02-19T07:38:45.645Z · LW(p) · GW(p)
Was this post recently updated or something, or are the comment timestamps all wrong? The post says it was posted on "5th Feb 2020" (and it appeared on my RSS feed around that time), but many of the comments are from 2016.
Replies from: Stuart_Armstrong↑ comment by Stuart_Armstrong · 2020-02-19T09:39:31.778Z · LW(p) · GW(p)
I recently reformatted it, so that it would colour and indent properly, and put it in markdown. That seems to have reset the date.
Replies from: habryka4↑ comment by habryka (habryka4) · 2020-02-19T17:23:04.033Z · LW(p) · GW(p)
We reset the date on posts when you move them to drafts and then back again. This likely happened here.
Replies from: Stuart_Armstrong↑ comment by Stuart_Armstrong · 2020-02-19T21:51:00.930Z · LW(p) · GW(p)
Yeah, I was doing a lot of editing (which is how I got the blue colour), and didn't want it to appear little by little.
comment by NancyLebovitz · 2016-12-26T14:44:13.627Z · LW(p) · GW(p)
Thanks-- I had fun reading it, and it's definitely more exuberant than most utopias.
I wasn't crazy about the sandworm challenge for getting to be politically influential-- wouldn't it make more sense to work one's way up by being influential in smaller groups?
Probably too much for this story, but there'd also be basic research going on and changing things.
Replies from: Stuart_Armstrong↑ comment by Stuart_Armstrong · 2016-12-26T15:07:32.919Z · LW(p) · GW(p)
In story: the challenge is to join the City. Becoming influential afterwards is another task.
Out of story: the point of the sandworms is to suggest the breadth of possibilities going on in this world.