Just another day in utopia
post by Stuart_Armstrong · 2011-12-25T09:37:00.410Z · LW · GW · Legacy · 118 comments
(Reposted from discussion at commentator suggestion)
Thinking of Eliezer's fun theory and the challenge of creating actual utopias where people would like to live, I tried to write a light utopia for my friends around Christmas, and thought it might be worth sharing. It's a techno-utopia, but (considering my audience) it's only a short inferential distance from normality.
Just another day in Utopia
Ishtar went to sleep in the arms of her lover Ted, and awoke locked in a safe, in a cargo hold of a triplane spiralling towards a collision with the reconstructed temple of Solomon.
Again! Sometimes she wished that a whole week would go by without something like that happening. But then, she had chosen a high excitement existence (not maximal excitement, of course – that was for complete masochists), so she couldn’t complain. She closed her eyes for a moment and let the thrill and the adrenaline warp her limbs and mind, until she felt transformed, yet again, into a demi-goddess of adventure. Drugs couldn’t have that effect on her, she knew; only real danger and challenge could do that.
Right. First, the safe. She gave the inner door a firm thud, felt it ring like a bell, heard the echo return – and felt the tumblers move. So, a sound-controlled lock, then. A search through her shoes produced a small pebble which sparked as she dashed it against the metal. Trying to ignore the ominous vibration as the triplane motor shook itself to pieces, she constructed a mental image of the safe’s inside from the brief flashes of light. Symmetric gold and gilded extravagances festooned her small prison – French Baroque decorations, but not yet Rococo. So Louis XIV period. She gave the less visited parts of her mind a good dusting, trying to remember the tunes of Jean-Baptiste Lully, the period’s most influential composer. She hoped it wasn’t any of his ballets; she was much better with his operas. The decorations looked vaguely snake-like; so she guessed Lully’s ‘Persée’ opera, about the death of the Medusa.
The engine creaked to a worrying silence as she was half-way through humming the Gorgon theme from the opera. Rushing the rest of the composition, she felt the door shift, finally, to a ten-times sped-up version of Andromeda’s response to Perseus’s proposal. She kicked the door open, exploded from the safe, took in the view of the temple of Solomon rushing up towards her, seconds away, grabbed a picture from the floor, grabbed an axe from the wall, hacked off one of the wings with three violent cuts, and jumped out of the plane after it.
Behind her, the plane disintegrated in midair as the temple lasers cut it to shreds and she fell through space, buffeted by the wind, never losing her grip on the mangled wing. She had maybe thirty seconds to tie herself to the wing, using the object’s own canvas as binding, and she rushed through that. The Machines wouldn’t allow the fall to kill her, of course, but it would hurt quite a bit (another of her choices – she’d allowed herself to feel moderate amounts of pain), put back her attempts to ever find Ted, and, most importantly of all, be crushingly embarrassing socially.
Once she was lashed to the plummeting piece of wood and canvas, and she was reasonably confident that the fall was slow enough, and her knots secure enough, she finally looked at the photograph she had grabbed during her explosive exit from the plane. It showed Ted, trussed up in chains but smiling and evidently enjoying the novel experience. Underneath was a finely engraved note saying “If you ever want to see your lover again, bring me the missing Stradivarius by noon tomorrow. Nero the 2nd”. Each capital letter was beautifully decorated with heads on spikes.
So! It seemed that her magnificent enemy Nero had resorted to kidnapping in order to get his way. It wasn’t like Nero could actually harm Ted – unlike Ishtar, her lover had never chosen to accept any level of pain above mild, brief discomfort. But if he was ‘killed’, Ted would feel honour-bound to never see her again, something she wasn’t ready to cope with yet. On the other hand, if she gave Nero her last Stradivarius, he might destroy it for good. It was her own choice: she had requested that her adventures have real meaning, hence real consequences. If she failed, and if Nero so chose, a small piece of humanity’s cultural history could be destroyed forever, permanently stymying her attempts to reconstruct Stradivarius’s violin-making techniques for the modern world. Culture or love, what to choose? Those were her final thoughts before she crashed into an oak tree shaped like a duck.
She returned to bleary consciousness fifteen minutes later. Her fainting was a sign the Machines were only granting her partial success in her escape attempt; she would have to try harder next time. In the meantime, however, she would have to deal with the shotgun pressed into her face and the gorgeous man at the other end of it shouting “Get off my property!”.
“Pause,” she said softly. The man nodded; she had temporarily paused her adventure, so that she wouldn’t have to deal with danger or pursuit for the next few minutes, and so that this guy wouldn’t have to get her away immediately to protect his property from collateral damage. Most Adventurers disdained the use of the pause, claiming it ruined the purity of their experience. But Ishtar liked it; it gave her the opportunity, as now, of getting to know the people she bumped into. And this person definitely seemed to be in the ‘worth getting to know’ category. He put down his shotgun without a word and picked up his paintbrush, applying a few more touches to the canvas in front of him.
After disengaging herself from both the mangled wing and the duck-shaped tree (she’d have a dramatic scar from that crash, if she chose to), she worked her way round to what he was painting. It was a rather good neo-impressionistic canvas of her, unconscious in the tree, pieces of torn canvas around her, framed by broken branches and a convenient setting moon. Even with his main subject out of the frame, as it were, he still seemed intent on finishing his painting.
“Why did you splice your tree’s genes to make it look like a duck?” she asked, when the silence had gone on, in her estimation, for ten times as long as it should have. He had done a pretty good job with that oak, in fact; the feathers and the features were clear and distinct amongst the wood – or had been, until someone had crashed a triplane wing into the middle of it.
“I didn’t,” he said. “That’s normal oak; I just trim and tie it.”
“But...” she looked at it again in astonishment; the amount of work involved to get that detail from natural wood was beyond belief. And oak wasn’t exactly a fast growing plant... “It must have taken you decades!”
“Two centuries,” he answered with dour satisfaction. “All natural, no help from the Machines.” He waved his hand up the side of the hill. “I’m making the perfect landscape. And then, I shall paint it.”
The layout was a tapestry of secret themes. Hedges, streams, tree-rows, pathways, ridges and twined lianas carved the landscape into hidden pockets of beauty. Each such pocket seemed to be a private retreat, cut off from the others and from the rest of the world – and yet all were visible at once, the layout a cunning display of multiple intimacy. Here and there were formal gardens, with lines of flowers all at attention, row after row, shading across colour and size from huge orchids to tiny snowdrops. Some pockets were carefully dishevelled, mini deserts or prairies or jungles, perfect fragments of wild untamed nature that could only exist at the cost of supreme artifice. There were herb gardens, rock gardens, orchards, water parks and vineyards; modelled on ancient Persia, England, Japan, France, Korea, Spain, the Inca and Roman empires – of those she could immediately recognise.
And then a few touches of fancy, such as the segment they were in, with the oaks shaped into animals. Further off, a dramatic slew of moss-coated sculptures, with water pouring out from every nook and cranny. Then a dynamic garden, with plants blasting each other with discharges of pollen, set-up in a simple eight-beat rhythm. And a massive Baobab, its limbs plated with a forest of tiny bonsai trees.
“What’s your safety level for all this?” she asked. If he’d chosen total safety, he wouldn’t have needed her off his property, as the Machines wouldn’t have allowed his creations to be damaged by her adventure. But surely he wouldn’t have left such artistic creation vulnerable to the fallout of Adventurers or random accidents...
“Zero,” he said.
“What?” No-one chose zero safety; it just wasn’t done.
“As I said, no help from the Machines.” He looked at her somewhat shyly, as she stared in disbelief. “It’s been destroyed twice so far, but I’ll see it out to the end.”
No wonder he’d wanted her out... He only had himself to count on for protection, so had to chase out any potential disturbances. She felt deeply moved by the whole grandiose, proud and quixotic project of his. Acting almost – almost – without thinking, she drew out a battered papyrus scroll: “Can you keep this for me?”
“What is it?” he asked, before frowning and tearing up his painting with a sigh. Only then did he look at the scroll, and at her.
“It’s my grandfather’s diary,” she said, “with my own annotations. It’s been of great use and significance to me.” Of course it had been – the Machines would have gone to great pains to integrate such a personal and significant item deeply into her adventures. “Could you keep it for my children?” When she finally found the right person to have them with, she added mentally. Ever since her split with Albert... No, that was definitely not what she needed to be thinking right now. Focus instead on this gorgeous painter, name still unknown, and his impossible dreams.
“What was he like?” he asked.
“My grandfather? Odd, and a bit traditional. He brought me up. And when we were all grown up, all his grandchildren, he decided we needed, like in ancient times, to lose our eldest generation.”
“He died?” The painter sounded sceptical; there were a few people choosing to die, of course, but those events were immensely rare and widely publicised.
“No, he simply had his intelligence boosted. Recursively. And he withdrew from human society, to have direct philosophical conversations with the Machines.”
He thought for a while, then took the scroll from her, deliberately brushing her fingers as he did so. “I’ll keep this. And I’m sure your children will find their ways to me.” An artefact, handed down and annotated through the generations, and entrusted to a quirky landscape artist who laboured obsessively with zero safety level? It was such a beautiful story hook, there was no way the Machines wouldn’t make use of it. As long as one of her children had the slightest adventurous streak, they’d end up here.
“This feels rather planned,” he said. “I expect it’s not exactly a coincidence you ended up here.”
“Of course not.” He was reclusive, brilliant, prickly; Ishtar realised a subtle seduction would be a waste of time. “Shall we make love?” she asked directly.
“Of course.” He motioned her towards a bed of soft blue moss that grew in the midst of the orchids. “I have to warn you, I insist that the pleasure-enhancing drugs we use be entirely natural, and picked from my garden. Let me show you around first, and you can make your choice.” They wandered together through the garden, shedding their clothes and choosing their pleasures.
Later, after love, she murmured “unpause” before the moment could fade. “Get off my property!” he murmured, then kissed her for the last time. She dived away, running from the vineyard and onto the street, bullets exploding overhead and at her feet.
Three robot gangsters roared through the street in a 1920s vintage car, spraying bullets from their Tommy guns. The bullets ricocheted off the crystal pavement and gently moving wind-houses, causing the passers-by (all of whom had opted for slight excitement that week) to duck enthusiastically to the floor, the bullets carefully and barely missing them. Diving round a conveniently placed market stall a few seconds before it exploded in a hail of hurtling lead, she called up her friend:
“Sigsimund, bit busy to talk now, but can you meet me in the Temple of Tea in about five...” a laser beam from a circling drone sliced off the pavement she was standing on, while three robot samurai rose to bar her passage, katanas drawn (many humans were eager and enthusiastic to have a go at being evil masterminds, but few would settle for being minions). “...in about ten minutes? Lovely, see ya there!”
It actually took her twelve minutes to reach the Temple (she’d paused to vote ‘yes’ on the question as to whether to bring back extinct species to the new Amazonian Rainforests, and to do some light research on the Stradivari). It was nearly-safe ground, meaning that adventures were only very rarely permitted to intrude on it, just enough to give a slight frisson of background excitement. She would certainly be safe for the duration of her conversation.
The priest, in gold and white robes with huge translucent butterfly wings, bowed to her as she entered. “I shall need to Know all about you,” he intoned, to her nodded agreement.
Sigsimund waved at her from a floating table that was making its way serenely through the temple’s many themed rooms, floating on a river that brought them through the Seventy-Seven Stages of Civilization. Ishtar swam out to join her, taking her seat at the gondola-shaped table.
“By your current lack of clothes,” Sigsimund said, “I take it you’ve been putting my advice into practice.”
Sigsimund was one of those who wished above all else to help their fellow human beings. In a world without poverty, disease or death, she specialised in the remaining areas of personal pain: relationship difficulties, jealousies and emotional turmoil. It was quite a popular and respected role, since most humans were unwilling to get rid of those negative emotions artificially, lest they become less than human; but at the same time, they appreciated those who ensured they didn’t have to suffer the full sting of these painful experiences unaided.
Sigsimund had first developed an interest in Ishtar when her long term relationship with Albert had fallen apart. Albert was a physicist (by mutual agreement with the Machines, physics was one of the areas where research was reserved to humans; so all fundamental new discoveries about the nature of reality were entirely triumphs of the unaided human mind), and his need for monogamy had ultimately proved incompatible with Ishtar’s desires. In Sigsimund’s expert analysis, the first stage in Ishtar’s recuperation was a lot of casual sex; she disapproved of Ted for this reason, feeling her friend wasn’t leaving enough time for play before starting another serious relationship (she dismissed comparisons with her own 78-year relationship, started two days after her previous one ended, with the line “we ain’t all the same, you know”).
So as Ishtar recounted her adventure, while strategically wrapping herself in an embroidered sarong that fell from the temple’s sarong-tree, Sigsimund started positively glowing.
“Fabulous!” she said. “I couldn’t have designed it better. And, even more perfect, you’ll never see or hear from him again, and didn’t even get his name. Maybe we can move on to the next step of my recuperation curriculum?”
“Go on,” Ishtar said suspiciously.
“Have you considered spending some time as a man? It would broaden your perspectives on things.”
Ishtar stared fixedly for a full twenty seconds, hoping to convey the full ridiculousness of the suggestion. “I am entirely convinced,” she said, “that that would be entirely unhelpful.”
“As someone who has been mending people’s psyches for a hundred and seven years, and who has access to your full psychological profile and detailed recordings of your activities and emotions for the last decade, let me say that I am entirely convinced that it would be entirely helpful.” A passing clockwork insect dropped a plump apple-strawberry into her hand, and she devoured it. “You should learn to live a little.”
“Why don’t you ever have Adventures?” Ishtar asked. “You’re the one who should live a bit.”
“Oh, just let me continue spreading happiness and healing pain all around me. Adventures aren’t really my thing.”
“99.7% of people have had adventures,” Ishtar said, lifting a lime sherbet from a leaf floating past. “That makes you, my friend, a member of a tiny and dweebish minority.”
“Yes, but most people just have short adventures when they’re teenagers or on honeymoons. Only...” she let the thought out to the world, and the answer appeared in her mind a second later: “...only 32% of people have adventures as a major part of their lifestyle. And as for people like you, whose whole lives revolve around adventures, the number drops precipitously... J’accuse you of being the member of a tiny and dweebish minority. Also, I need time to learn Akkadian properly, if I’m going to be any use in my next dig. Incidentally, what do you think of my new face? You haven’t commented on it.”
“I like it,” Ishtar said, politely. “Very... colourful. Ethnic, even.” Though of no ethnicity known to man or beast, she added mentally, and the universe is very thankful for that fact. Though maybe some of the more brightly coloured lizards could find some small aspects of it alluring, she conceded. In dim light. If they didn’t have to see it all at once.
“I find it brings out the best in my friends,” Sigsimund said with a huge rainbow grin. “Nobody likes it, nobody dares say anything.”
“Whereas I hope that is not the verdict you shall give on my tea,” said the temple priest, holding a tiny cup aloft as his ivory throne descended lazily from the sky. “Madame Ishtar, I have downloaded your full history, biological records, run thousands of simulations with models of your taste buds, glossopharyngeal nerve and brain stem; looked through your history for all pleasant and unpleasant taste associations I could find, analysed your stomach contents and recent consumptions, cast your horoscope, computed your chi, and peered deep into your chakras. And added a bit of flair and feeling. I believe this is the best cup of tea you ever had.”
The smell hit her before her hand even touched the cup, a light scent of burning grass that catapulted her into her childhood, dancing with her sisters in front of the traditional forest fires. It was the young girl and the woman who both clasped their fingers across the cup, reunited across time by the single perfect aroma. She wasn’t conscious of drinking the tea, but she must have, for it exploded in her mouth, hot, spicy, cool, fruity, chocolaty and lemony. The tastes chased each other across her tongue and nerves, alternating with rapid and smooth transitions. She had just enough time to appreciate one taste combination, register a dozen half-formed marvellous impressions, feel that her joy was about to peak – but already the transition had happened, and she was in a new taste-world. She lost the consciousness of her tongue and nose; the sensations were applied directly to her brain, the mediating machinery stripped away.
And then, in three glorious seconds, it was over, and only one word could describe her feelings with sufficient poetry and precision:
“Wow!”
She also recognised the tell-tale sign of her dopamine system being inhibited. This was an essential precaution with any super-stimulus, to prevent addiction: though she liked the tea more than anything in recent memory, she wasn’t filled with an irresistible want for it. It was just a perfect moment for her to treasure.
“It is traditional,” the priest intoned, “for guests to leave a little something in exchange.”
Ishtar thought deeply. She wasn’t that used to ritual situations, and she couldn’t think of anything she had of comparable value to exchange. “Well,” she said, “I did spend two decades as a housewife, a while back.” The priest’s expression didn’t change. “One of the things I became good at was... baking cookies.” Still no sign as to what the priest was thinking, in any direction. “I did a whole lot of chemical research, of course, and some of them turned out sublime... One batch in particular took my breath away and pounded my lungs with the sheer joy of being alive and tasting existence itself. And chocolate. I can... I can share the memory of that with you. It’s... very private, so please don’t go bandying it around to anyone...”
“That is...” the priest said, as the memory was downloaded into his mind, “...generous.” He bowed and his throne levitated away.
They sat in silence for a minute, until Sigsimund felt it was time to return their thoughts to trivialities. “By the way, I had a chat with Nero,” she said.
“You did?” Ishtar blinked, struggling to conceive of Nero as anything else but the magnificent and constant bane of her existence, the perfect enemy.
“Oh yes,” Sigsimund said, “He’s another one of my friends; he’s doing quite well, in fact, and is trending happy and well balanced and looking around for a healthy, low hassle relationship.”
“Well, I’m happy for him, I suppose...”
“In fact,” Sigsimund said, wagging eyebrows the size of maple leaves as suggestively as she could, “I would go so far as to say that, if you’re interested (and I recommend you to be interested), the rivalry between you might be amenable to... ending in the traditional way, if you catch my drift. No pressure, just a thought to keep in mind when you both end up sweaty and wrestling over an exploding volcano.”
“That’s interesting,” Ishtar allowed, grudgingly. She sat in silence for a while, a stray thought nagging the rest of her brain for attention. She brought back the memory of the picture of Ted in chains. The chains had looked a little odd and fake, like they were made of plastic. Or maybe not weighing as much as they should. Like there wasn’t enough gravity. Not space or the moon but maybe...
“For the moment, I need a bit of a break,” she said. “You want to go skiing?” Sigsimund shook her head. “With rockets?” Still more head shaking. “On the mountains of Mars?”
“Now you’re talking!” Then Sigsimund’s gaze grew a little more serious, staring over her friend’s shoulder. “Though I see you’ll have to take the long route?”
Ishtar turned round. There, paddling slowly towards her through the stream, was the largest mechanical tiger she’d ever seen, its diamond-and-steel jaws glowing in the light of the temple as the other guests arranged themselves around it to witness the spectacle. Smoke belched from its nostrils alongside a tinny synthesised version of Nero 2’s laughter. Ishtar’s hand reached for her weapon, which she didn’t have, so it closed around a cheese knife instead. “If I make it, meet you tomorrow at the little Italian starport at the foot of Olympus Mons, okay?”
“See ya there,” Sigsimund said, and saluted, as her friend gave a blood re-curdling scream and launched herself over a fleet of tiny sailing ships battling each other, cheese knife pointed directly at the tiger’s clockwork heart.
118 comments
Comments sorted by top scores.
comment by gwern · 2011-12-24T22:58:59.511Z · LW(p) · GW(p)
She also recognised the tell-tale sign of her dopamine system being inhibited. This was an essential precaution with any super-stimulus, to prevent addiction: though she liked the tea more than anything in recent memory, she wasn’t filled with an irresistible want for it. It was just a perfect moment for her to treasure.
Nice touch.
comment by [deleted] · 2011-12-24T18:28:52.589Z · LW(p) · GW(p)
.
↑ comment by jaimeastorga2000 · 2011-12-29T05:39:51.982Z · LW(p) · GW(p)
Just in Main? I think this belongs in a professional science-fiction publication that accepts reprints. It's really high quality, and the inferential distance is short enough for an SF fan.
↑ comment by Stuart_Armstrong · 2011-12-28T14:26:56.194Z · LW(p) · GW(p)
I always obey flattering requests with 16 upvotes :-)
comment by J_Taylor · 2011-12-28T10:39:48.726Z · LW(p) · GW(p)
I certainly would like to live in this utopia. However, I am not so certain that most people on earth would. I am also fairly confident that most people historically would not want to live in it.
Nonetheless, you did a very good job at crafting a utopia in which contemporary liberal Westerners would be happy. This is still quite the achievement.
comment by Kaj_Sotala · 2011-12-24T13:32:34.019Z · LW(p) · GW(p)
I love it. A wonderful tale for Christmas.
Reminds me of Permutation City and The Metamorphosis of Prime Intellect (in a good way).
↑ comment by Armok_GoB · 2011-12-26T20:33:45.909Z · LW(p) · GW(p)
Yes, except with the awesome twist that it's presumably not a simulation, but an actual collection of quarks with no in-built fail-safes. If my judgement of authorial intent is right, the Machines don't even have ubiquitous nanotech or beat chaos theory generally; they are just that good at Xanatos gambits. Which makes it a fantastic illustrative example of the things a truly superhuman intelligence could manage to do.
comment by Normal_Anomaly · 2011-12-24T18:21:55.175Z · LW(p) · GW(p)
It was a nice place to visit, and I'd want to live there. Awesome!
comment by kilobug · 2012-01-10T15:36:45.982Z · LW(p) · GW(p)
Nice story, but I feel very uncomfortable about the Stradivarius part. That (and to a lesser extent, the way Ted is treated) makes me feel more that it is a utopia for Ishtar, not for humanity as a whole. I don't see how, fun-theoretically speaking, the potential loss of one of the last Stradivarii, and of the hope of duplicating them, allowing for increased musical enjoyment for all humans and all to-be-born humans, can be outweighed by the slightly higher thrill of a one-day adventure of Ishtar's.
Something like "if you choose to sacrifice the Stradivarius, you won't ever be able to hear it, but the rest of humanity still will" would feel much more appropriate to me. But I really don't see a utopia in which, for a bit more fun for a single person, the whole of humanity will lose something like one of the last Stradivarii.
↑ comment by Vaniver · 2012-01-10T17:54:03.474Z · LW(p) · GW(p)
Notice that, as far as I can tell, she chose the instrument over Ted, and may have been helped in that decision by the encounter the Machines led her to. It may have been a false option to give both her breakup with Ted and the Stradivarius more meaning.
↑ comment by TheOtherDave · 2012-01-10T18:40:11.014Z · LW(p) · GW(p)
Yeah, though the minute you take seriously the notion that the Machines are simply lying to people about the degree of real risk involved, a lot of the emotional appeal of this story is subverted. (Why anyone in this universe believes otherwise, I have no idea; possibly it's thanks to relatively subtle mind control.)
↑ comment by Vaniver · 2012-01-10T19:10:36.768Z · LW(p) · GW(p)
Even in the story, once she gives Nero the Strad, he has to decide to destroy it. She doesn't appear to have a very complete model of Nero besides "we're rivals."
It's also not clear to me that presenting someone with an option you strongly suspect they're not going to take in order to frame their choice counts as lying to them.
↑ comment by thomblake · 2012-01-10T17:00:21.975Z · LW(p) · GW(p)
But I really don't see a utopia in which, for a bit more fun for a single person, the whole of humanity will lose something like one of the last Stradivarii.
I suppose this depends on whether one is allowed to own a Stradivarius. If I own a Stradivarius, I can indeed use it to row a boat, or for target practice, or to start a campfire. If this is Ishtar's (or the villain's) Stradivarius, then they can play this game with it. This changes if you're not allowed to own things, or some kinds of things, in utopia.
↑ comment by TheOtherDave · 2012-01-10T16:09:46.745Z · LW(p) · GW(p)
The question you're asking is, in its broadest terms: is it actually a good idea for actions to have potentially negative consequences for people other than the agent? (A related question is whether it's even a good idea for actions to have potentially negative consequences for the agent.)
In my experience all answers to this question are uncomfortable if I think them through enough. Rejecting one answer because I've recently seen it illustrated, and choosing an opposite answer by negating that answer, just causes me to flipflop. (Or, in local terms, subjects me to Dutch Booking.)
↑ comment by kilobug · 2012-01-10T16:16:36.527Z · LW(p) · GW(p)
My question is not as broad. I can accept a slight cost to external people for a great gain to one agent, in some situations. I don't mind paying (reasonable) taxes to care for people with diseases, even diseases I know I'll never get (like genetic diseases); I'm even in favour of it. What I was pointing to is more of a scope problem: the gain to Ishtar in terms of a slightly higher thrill for her quest seems way too low to compensate for a loss that'll affect all present and future humans, for some even in a comparable way (just hearing the news that the Stradivarius was destroyed could hurt a music fan more than the additional thrill Ishtar gained).
↑ comment by TheOtherDave · 2012-01-10T16:45:25.570Z · LW(p) · GW(p)
Interesting. So if Ishtar's quest was modified so that the gain to her was much much higher, it would be OK?
↑ comment by kilobug · 2012-01-10T17:06:47.690Z · LW(p) · GW(p)
If the gain to her was high enough, and it wouldn't be possible to get that gain (or something close enough to it) in a way less harmful to others, it could be OK. It would be a complicated question with no easy answer, a case in which I don't trust myself to use raw consequentialism, because I don't know how to evaluate the real harm done by destroying a Stradivarius: both because I'm not enough of a music fan and because integrating the loss over all current and future humans is beyond my skill. So for high enough values of her gain I'd be like "are you really sure you can't give her the gain without that destruction? and if so, well, I don't know".
↑ comment by Luke_A_Somers · 2013-02-08T15:11:33.018Z · LW(p) · GW(p)
Isn't the point of a heroic adventure usually to not comply with the options that the villain presents you with? You're assuming that Nero 2 wins, or even has a significant chance to win.
↑ comment by Desrtopa · 2013-02-08T15:21:07.415Z · LW(p) · GW(p)
If Nero 2 doesn't have a significant chance to win, I think that would largely defeat the purpose of the adventure. There's not much tension if you consider your success a foregone conclusion.
↑ comment by Luke_A_Somers · 2013-02-08T19:45:44.452Z · LW(p) · GW(p)
True, though this environment involves a significant element of faking/limiting dangers.
If this environment permitted high-excitement people to destroy historical artifacts on a regular basis, there wouldn't be any left after an alarmingly short period.
comment by AlexMennen · 2011-12-25T01:00:42.822Z · LW(p) · GW(p)
This scenario seems a bit hard on Ted. Great story, though.
↑ comment by A1987dM (army1987) · 2011-12-25T19:12:23.194Z · LW(p) · GW(p)
Yes, it should be somehow made clear that Ted's OK with Ishtar having sex with other people, at least.
↑ comment by Alicorn · 2011-12-25T22:06:58.720Z · LW(p) · GW(p)
It's stated that she broke up with the last guy because he wanted to be monogamous and it would follow that the current guy probably doesn't want that.
Replies from: army1987↑ comment by A1987dM (army1987) · 2011-12-26T11:00:59.584Z · LW(p) · GW(p)
I had interpreted “his need for monogamy” as the need for him to be monogamous, but now I realize that a relationship where you commit to not having sex with other people while your partner doesn't is pretty unlikely (ETA: and even so, it would be unlikely to make Ishtar want to break up with Albert), and so Albert must have needed both of them to be monogamous.
Replies from: TheOtherDave, wedrifid↑ comment by TheOtherDave · 2011-12-26T15:16:23.694Z · LW(p) · GW(p)
One-sided monogamous relationships aren't all that uncommon; I know of several. As with a lot of non-standard practices involving monogamy, couples who practice it often don't tell their friends and family about it. (Also, individuals who practice it often don't tell their partners about it, but culturally we tend to put that in a different category.)
↑ comment by wedrifid · 2011-12-26T11:40:24.656Z · LW(p) · GW(p)
I had interpreted “his need for monogamy” as the need for him to be monogamous, but now I realize that a relationship where you commit to not having sex with other people while your partner doesn't is pretty unlikely
At least, it is in our particular culture.
↑ comment by AlexMennen · 2011-12-26T00:01:13.122Z · LW(p) · GW(p)
Actually, I was referring to the fact that Ted won't be able to see his girlfriend again if she fails because she wanted her adventures to have real-life consequences.
Replies from: Baughn, thomblake↑ comment by Baughn · 2011-12-28T21:23:54.491Z · LW(p) · GW(p)
Ishtar decided that her adventures should have real consequences; Ted decided to follow a code of honor that made keeping him away from her possible.
If he hadn't, the Machines would presumably have found something else to use for stakes.
↑ comment by Stuart_Armstrong · 2011-12-26T11:18:10.497Z · LW(p) · GW(p)
Divergent sexual/relationship desires are things that are explicitly not solved in this world. That said, Ted seems pretty fine with everything here.
comment by katydee · 2011-12-26T06:40:28.385Z · LW(p) · GW(p)
Not scary enough, but entertaining.
Replies from: Stuart_Armstrong↑ comment by Stuart_Armstrong · 2011-12-26T09:04:48.013Z · LW(p) · GW(p)
Actually, I found it much harder to write a non-scary utopia than a scary one. You want the story to be entertaining, and scary is entertaining. And "this is scary, but trust me, it's actually good" is far too easy a way to be entertaining while claiming it's actually utopic.
Replies from: Eliezer_Yudkowsky↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2011-12-28T16:49:00.166Z · LW(p) · GW(p)
It's only easy if you take something that's scary-and-nongood and claim it's scary-and-good. Coming up with things that you believe are genuinely optimal, and which happen to be scary, is the realistic way of generating a scary utopia. It is not cheating. And many readers will disagree, but that's fine.
Replies from: Stuart_Armstrong, Armok_GoB↑ comment by Stuart_Armstrong · 2011-12-28T18:37:21.127Z · LW(p) · GW(p)
An ideal author can write ideal stories. But I found that scariness sucks story energy from the other parts (both for me and the reader). A long story can cope with it, but a short story risks becoming a SCARY utopia rather than a scary UTOPIA.
More details in my post at http://lesswrong.com/r/discussion/lw/920/eutopia_is_scary_for_the_author/
comment by lukeprog · 2011-12-26T05:32:15.116Z · LW(p) · GW(p)
This better not be made into a short film or else it may cause Utopia-induced depression.
Replies from: katydee, Normal_Anomaly, Jonathan_Graehl↑ comment by katydee · 2011-12-26T06:35:29.063Z · LW(p) · GW(p)
In the game Civilization: Call to Power, there was a unit called a "Neural Ad" that worked by broadcasting insidious advertisements for nonexistent products, making opposing peoples unhappy thanks to their insatiable want for these products. I first found this idea absurd and obviously flawed. Having read that article, though, I'm not so sure anymore...
↑ comment by Normal_Anomaly · 2011-12-26T17:28:44.881Z · LW(p) · GW(p)
That's an interesting link. I'd like to see some data on whether the effect is stronger on people whose lives are worse, or who are already less happy.
↑ comment by Jonathan_Graehl · 2011-12-27T21:25:02.982Z · LW(p) · GW(p)
Before checking your link, I'd already thought of the Avatar-lovers. I've somehow avoided the feeling entirely, in spite of spending many thousands of hours over my life enjoying escaping via fiction. The closest I've felt to it is the single day of waking from a nice lucid-dream adventure (a rare event for me).
comment by Multiheaded · 2011-12-24T20:49:40.867Z · LW(p) · GW(p)
That's great! Were you, like me, disturbed by EY's suggestion that a genuine eutopia HAS to be scary? I do prefer to know that I can have perfect safeguards against anything, including psychological disquiet, at any moment, even if I choose to tune them down.
Replies from: Stuart_Armstrong↑ comment by Stuart_Armstrong · 2011-12-25T09:40:34.007Z · LW(p) · GW(p)
I think his point was more that any plausible utopia would be scary, because it would be different.
But a story to convince people that utopias can actually be interesting places to live in is not the place to bring up all the scary parts...
Replies from: FiftyTwo
comment by kateblu · 2011-12-24T12:49:55.837Z · LW(p) · GW(p)
Magnificent. I gather one has an eternity to figure out his or her version of utopia and that physical death is an option. It's not quite clear to me whether Ishtar exists in manipulated multiverses or as an avatar with her brain in a vat, or ?
Replies from: DanielLC↑ comment by DanielLC · 2011-12-24T21:09:15.373Z · LW(p) · GW(p)
It's not quite clear to me whether Ishtar exists in manipulated multiverses or as an avatar with her brain in a vat, or ?
I get the impression that it's the real world. There was a guy obsessed with everything being natural, which would be impossible in a simulation. I suppose he could have meant it being identical to nature.
Replies from: kateblu↑ comment by kateblu · 2011-12-24T22:01:40.280Z · LW(p) · GW(p)
I think he wanted to create his eden without the assistance of machines. Since he has been at it apparently for centuries, he couldn't be totally natural.
Replies from: Stuart_Armstrong, Strange7↑ comment by Stuart_Armstrong · 2011-12-25T08:20:31.850Z · LW(p) · GW(p)
Indeed :-) but just like modern nature lovers will tell you all about it on their cell-phone, there are some artifices he just won't count as being artificial...
Plus, the Machines just dropped a love interest straight on him...
Replies from: army1987, kateblu↑ comment by A1987dM (army1987) · 2011-12-25T11:53:46.993Z · LW(p) · GW(p)
Last evening, after a TV news piece about breast implants:
My aunt: “I don't like that. A woman should accept her own body.” Me: “That isn't your natural hair colour, is it? So, should a woman accept her breast but not her hair?”
Replies from: duckduckMOO↑ comment by duckduckMOO · 2012-04-17T00:16:53.501Z · LW(p) · GW(p)
?????????????
so not the same thing.
↑ comment by kateblu · 2011-12-25T16:18:17.663Z · LW(p) · GW(p)
True, but you must remember that it is HER adventure. She is the one who hit the "pause" button. Did he have the ability to say "No"? Was there a "pause" button that he could have hit before she did?
Replies from: Stuart_Armstrong, Baughn↑ comment by Stuart_Armstrong · 2011-12-25T17:10:25.321Z · LW(p) · GW(p)
He wasn't in an adventure - but when a nice opportunity to make someone happy came along at low price, the Machines wouldn't sneeze on it, I'd imagine...
comment by Psychosmurf · 2011-12-25T07:26:12.058Z · LW(p) · GW(p)
Makes me take the warnings not to be seduced by my imagination far more seriously. Excellent work.
Replies from: Stuart_Armstrong↑ comment by Stuart_Armstrong · 2011-12-25T16:09:26.691Z · LW(p) · GW(p)
Cool. Didn't quite understand what you meant by seduction by imagination, though...
comment by Nymogenous · 2011-12-24T18:52:05.142Z · LW(p) · GW(p)
Excellent story! I second the idea that this belongs in Main.
Also, I particularly liked your idea of physics being left to humans so as not to spoil the fun. It's both an unusual idea and one of my personal requirements for a utopia...spoilers are so boring.
Replies from: Thomas↑ comment by Thomas · 2011-12-24T20:08:13.959Z · LW(p) · GW(p)
Unfortunately, the physics can't be left to humans - it is too important. I am not sure if it's too difficult also, but it is surely too important.
Replies from: Nymogenous↑ comment by Nymogenous · 2011-12-24T20:26:42.177Z · LW(p) · GW(p)
Well no, but an AI could figure things out and then not tell the physicists. Same thing as when you let a kid take apart a toaster to find out how it works instead of just telling them...or was that only my parents that did that?
Replies from: Armok_GoB↑ comment by Armok_GoB · 2011-12-26T20:41:46.436Z · LW(p) · GW(p)
I got the impression that was what happened to the fields that WEREN'T left for humans, and that the humans wanted to genuinely be the first to know X, rather than merely having the experience of discovering it or falsely believing they had.
Replies from: shokwave
comment by TurnTrout · 2020-11-24T04:54:11.473Z · LW(p) · GW(p)
What a beautiful, bold, and chaotic story. Thanks for writing, Stuart.
Replies from: Stuart_Armstrong↑ comment by Stuart_Armstrong · 2020-11-24T08:54:40.999Z · LW(p) · GW(p)
Enjoyed writing it, too.
comment by Will_Newsome · 2012-01-01T07:30:56.012Z · LW(p) · GW(p)
Adventure and excitement are okay I guess, but I'm more of a sehnsucht kinda guy. What's with you pagans and your worship of the fun god?
comment by Shmi (shminux) · 2011-12-25T00:04:05.803Z · LW(p) · GW(p)
Interesting, a world where gods are real. Pretty neat. Upvoted for constructing a toy model of a CEV-based world.
(Spelling note: it's "artifact", not "artefact")
Replies from: RobertLumley↑ comment by RobertLumley · 2011-12-25T01:02:10.282Z · LW(p) · GW(p)
I believe "artefact" is a British spelling, like colour.
Replies from: Stuart_Armstrong↑ comment by Stuart_Armstrong · 2011-12-25T08:23:02.832Z · LW(p) · GW(p)
Everything sounds better in a British accent, of course ;-)
Replies from: Alejandro1↑ comment by Alejandro1 · 2011-12-25T14:45:31.114Z · LW(p) · GW(p)
...and hence only British accent and spelling will be utilised in Utopia?
Replies from: thelittledoctor↑ comment by thelittledoctor · 2011-12-25T18:56:27.438Z · LW(p) · GW(p)
Extra points for the s in "utilised".
Replies from: dlthomas↑ comment by dlthomas · 2011-12-25T19:19:36.939Z · LW(p) · GW(p)
Possibly some points lost for not just saying "used," though.
Replies from: bentarm↑ comment by bentarm · 2011-12-26T14:17:42.000Z · LW(p) · GW(p)
But isn't "used" spelled the same way on both sides of the Atlantic, thus ruining the joke?
Replies from: dlthomas↑ comment by dlthomas · 2011-12-26T18:15:57.501Z · LW(p) · GW(p)
Yes, of course, I was just making an orthogonal joke.
Replies from: Alejandro1↑ comment by Alejandro1 · 2011-12-26T18:38:29.976Z · LW(p) · GW(p)
Maybe the mirror-image joke would be "only American accent and spelling will be uzed in Utopia".
comment by thelittledoctor · 2011-12-24T17:01:42.468Z · LW(p) · GW(p)
Absolutely delightful.
comment by knb · 2011-12-28T09:23:40.180Z · LW(p) · GW(p)
Reading about this tragic and horrifyingly wasteful dystopia really solidifies my hope that the future goes more like what Robin Hanson has envisioned.
Replies from: wedrifid, None, Multiheaded↑ comment by wedrifid · 2012-01-17T14:58:14.229Z · LW(p) · GW(p)
Reading about this tragic and horrifyingly wasteful dystopia really solidifies my hope that the future goes more like what Robin Hanson has envisioned.
Accordingly I must raise my estimation of the threat Robin Hanson poses to humanity. He has persuaded at least one person to advocate his Malthusian hell!
Replies from: knb↑ comment by knb · 2012-01-17T20:06:58.119Z · LW(p) · GW(p)
Meh, I wasn't advocating it, just saying it would be way better than this scenario. Either n humans burning the cosmic commons for tacky IRL video games and sex with strangers, or 1 billion × n humans living worthwhile productive lives.
It just seems obvious when you do the math.
Replies from: wedrifid, thomblake, NancyLebovitz, ArisKatsaris↑ comment by wedrifid · 2012-01-17T23:26:06.176Z · LW(p) · GW(p)
Either n humans burning the cosmic commons for tacky IRL video games and sex with strangers, or 1 billion × n humans living worthwhile productive lives.
I don't share your premises - including the one that seems to be that the agents that survive in Hansonian Hell are humans in any meaningful sense.
It just seems obvious when you do the math.
Your expression of preference here cannot be credibly described as 'doing math'.
↑ comment by thomblake · 2012-01-17T20:11:20.924Z · LW(p) · GW(p)
Either n humans burning the cosmic commons for tacky IRL video games and sex with strangers, or 1 billion × n humans living worthwhile productive lives.
I guess one person's tacky IRL video game is another's worthwhile productive life.
Replies from: knb↑ comment by knb · 2012-01-17T21:20:24.043Z · LW(p) · GW(p)
I'm not saying Ishtar's life isn't worth living, just that it's tragic that so many other, equally worthwhile lives are being prevented from existing so that she can play her silly games.
Also, it isn't productive, it is consumptive. She is simply consuming resources provided by others.
Replies from: thomblake, TheOtherDave↑ comment by thomblake · 2012-01-17T21:33:09.030Z · LW(p) · GW(p)
Right. My point was that you view it as "her silly games", but they might be precisely what make her life worth living. One might just as well say "It's tragic that so many silly lives exist so that Ishtar cannot live her worthwhile life".
Not so much a "Who are you to say" style rejection, as much as noting that it's not obvious, math or no math.
Replies from: knb↑ comment by knb · 2012-01-18T05:17:48.340Z · LW(p) · GW(p)
Ok, so would you prefer to pop all but 7 humans out of existence (assume that the process is painless) in exchange for the remaining humans experiencing Ishtar level happiness?
If you just mean to say that terminal values are arbitrary, so that some possible minds might prefer the Ishtar scenario, then that's fine, so far as it goes. But if you take multiplication seriously, then it's insanely hard to make the case for this being a genuine eutopia.
Replies from: wedrifid↑ comment by wedrifid · 2012-01-18T07:02:55.534Z · LW(p) · GW(p)
But if you take multiplication seriously
You seem to be forgetting the old adage "Shut up and don't multiply by a count unless it is a count of something that you value linearly with said count". Very few people value the existence of Hansonian Hell-bots in direct proportion to the number of Hell-bots that are 'alive'. Of those that do, it isn't clear that they value the existence of these creatures in that condition positively. So taking things 'seriously' here is more to do with what preferences people have than with 'multiplication'.
Replies from: knb↑ comment by knb · 2012-01-18T07:50:42.187Z · LW(p) · GW(p)
A lot of people live at subsistence levels. They aren't much less happy than you or me on average. Their lives are very well worth living by their own standards. And they would likely be better adapted to their environment than we are, so there's good reason to believe they would be better off than 1st-worlders are now. And the denizens of this alleged eutopia don't seem much happier than some people I know now.
And as long as we're focusing on the preferences of people in the world now, how many of them do you think would approve of the implicit AI autocracy, reproductive central-planning, and hedonism of this world?
This is a dream-world for nerdy/polyamorous/transhumanist folks common on LW but rare everywhere else (except maybe reddit).
Replies from: army1987↑ comment by A1987dM (army1987) · 2012-01-18T17:47:40.751Z · LW(p) · GW(p)
I do not think the parent is so obviously wrong as to be worth downvoting to -4 without even mentioning what's wrong with it. (I actually agree with much of it. There was even a Robin Hanson post about the fact that “the poor also smile”; I can't link to it because Overcoming Bias is blacked out today.)
Replies from: Nornagest↑ comment by Nornagest · 2012-01-18T18:14:09.662Z · LW(p) · GW(p)
I downvoted the grandparent for making unintuitive claims about the money:happiness relation without presenting evidence (my understanding is that subsistence-level income does have a significant and negative effect on happiness, although the plot of happiness over income levels off quickly after a basic level of financial security is achieved), for making sketchy claims about adaptation without evidence, for conflating approval with preference (particularly glaring because the point about happiness/income above only works without conflating the two), and for the entirely unnecessary swipe at perceived LW norms in the last sentence.
Oh, and by way of disclaimer, I didn't find the original story especially compelling as a vision of utopia.
↑ comment by TheOtherDave · 2012-01-18T16:00:36.737Z · LW(p) · GW(p)
Isn't it equally tragic that in the real world the resources that are currently maintaining my life aren't instead being used to support other, more worthwhile, lives? (Or, well, more tragic, since it's actually real?)
Replies from: knb↑ comment by knb · 2012-01-18T20:33:34.510Z · LW(p) · GW(p)
Isn't it equally tragic that in the real world the resources that are currently maintaining my life aren't instead being used to support other, more worthwhile, lives?
My argument isn't that each life which might have been is more valuable, but that they are when added up.
First of all, uploads aren't yet possible, so far fewer lives worth living could be supported with your resources in the first place. More importantly, since most resources aren't being centrally distributed by all-powerful machine gods, we would have to tax your earnings. This involves the infamous "leaky buckets" problem acknowledged by all utilitarians. People engage in tax avoidance behaviors, work fewer hours, hide income, and some money which is captured is spent on overhead. These problems don't exist when all resources are being created and distributed by a central depot.
Furthermore, the ability to actually get those resources to the people in need is in doubt, due to grabby governments, warlords, logistical problems, etc.
But yes, overall I would say it is tragic that some of our 1st-World resources aren't going to save marginal lives.
Replies from: khafra↑ comment by khafra · 2012-01-18T21:19:43.190Z · LW(p) · GW(p)
But yes, overall I would say it is tragic that some of our 1st-World resources aren't going to save marginal lives.
I don't think this is the alternative he was proposing. I think the more relevant analogy would be our 1st-World resources going to produce extra marginal barely-worth-living lives in the third world.
↑ comment by NancyLebovitz · 2012-01-18T13:14:22.625Z · LW(p) · GW(p)
What do you think people should be doing? In a post scarcity economy, it seems to me that a lot of what remains to be done is keeping each other entertained.
Replies from: knb↑ comment by knb · 2012-01-18T20:48:35.550Z · LW(p) · GW(p)
My problem isn't particularly with the Ishtar's pastimes, but with the overall system. I'm arguing that Hanson's upload society would be better than this because it could support so many more lives worth living total and more total utility than this alleged eutopia.
Replies from: Normal_Anomaly↑ comment by Normal_Anomaly · 2015-01-06T20:28:39.040Z · LW(p) · GW(p)
So you'd be happy with this world if it all existed inside a small piece of the galactic computronium-pile, and there was lots more of it? I actually hadn't considered that, because I just assume all post-Singularity futures are set inside the galactic computronium-pile unless explicitly stated otherwise.
↑ comment by ArisKatsaris · 2012-01-18T16:41:54.053Z · LW(p) · GW(p)
It just seems obvious when you do the math.
What math is that? Are you talking about number of lives at any given century -- effectively judging the situation as if time-periods were sentient observers to be happy or unhappy at their current situation?
Do you have any reason to believe that maximum diversity in human minds (i.e. allowing lots of different humans to exist) would be best satisfied by cramming them all in the same century, as densely as possible?
A trillion lives all crammed in the same century aren't inherently more worthwhile than a trillion lives spread over a hundred centuries -- any more than 10 people forced to live in the same flat are inherently more precious than 10 people having separate apartments. Do you have any reason to prefer the former over the latter? Surely there's some ideal range where utility is satisfied in terms of people density spread over time and location.
Replies from: knb↑ comment by knb · 2012-01-18T20:02:04.657Z · LW(p) · GW(p)
You are misunderstanding my argument.
When you use up negentropy, it is used up for good, and there is a finite amount in each section of the universe. The amount being used on Ishtar could theoretically support good lives of billions of upload minds (or a smaller but still huge number of other possible lives). This isn't a matter of a long and narrow future or a short and wide future, but of how many total, worthwhile lives will exist.
As for quality, there seems to be no reason why simulations can't be as happy, or even happier than Ishtar.
↑ comment by [deleted] · 2011-12-28T09:35:03.668Z · LW(p) · GW(p)
You know you're looking at a dystopia when even Hanson's malthusian hell world looks good in comparison.
(Agree with the sentiment, though.)
Replies from: Baughn↑ comment by Baughn · 2011-12-28T21:17:29.635Z · LW(p) · GW(p)
It's one world, or one solar system, and for all we know they've found a way around entropy - or this could all be a highly realistic simulation.
But even if it isn't, I consider this option far better than Hanson's dystopia. Its main flaw is inefficiency, which can be fixed.
Replies from: knb↑ comment by knb · 2011-12-29T03:32:43.252Z · LW(p) · GW(p)
Its main flaw is inefficiency
Its main characteristic is inefficiency.
Replies from: Eugene, Stuart_Armstrong↑ comment by Eugene · 2012-01-03T09:41:22.883Z · LW(p) · GW(p)
There's little indication of how the utopia actually operates at a higher level, only how the artificially and consensually non-uplifted humans experience it. So there's no way to be certain, from this small snapshot, whether it is inefficient or not.
I would instead say that its main flaw is that the machines allow too much of the "fun" decision to be customized by the humans. We already know, with the help of cognitive psychology, that humans (which I assume by their behavior to have intelligence comparable to ours) aren't very good at assessing what they really want. This could lead to a false dystopia if a significant proportion of humans choose their wants poorly, become miserable, then make even worse decisions in their misery.
Replies from: TheOtherDave, army1987↑ comment by TheOtherDave · 2012-01-17T15:18:04.232Z · LW(p) · GW(p)
OTOH, nothing in that story requires that the humans are making unaided assessments. The protagonist's environment may well have been suggested by the system in the first place as its best estimate of what will maximize her enjoyment/fulfilment/fun/Fun/utility/whatever, and she may have said "OK, sounds good."
↑ comment by A1987dM (army1987) · 2012-01-17T20:29:18.249Z · LW(p) · GW(p)
I would instead say that its main flaw is that the machines allow too much of the "fun" decision to be customized by the humans. We already know, with the help of cognitive psychology, that humans (which I assume by their behavior to have intelligence comparable to ours) aren't very good at assessing what they really want. This could lead to a false dystopia if a significant proportion of humans choose their wants poorly, become miserable, then make even worse decisions in their misery.
I'm afraid I'd prefer it that way. Having the machines decide what's fun for us would likely lead to wireheading. Or am I missing something?
[off to read the Fun Theory sequence in case this helps me find the answer myself]
Replies from: Nornagest↑ comment by Nornagest · 2012-01-17T20:32:54.100Z · LW(p) · GW(p)
Depends on the criteria the machines are using to evaluate fun, of course -- it needn't be limited to immediate pleasure, and in fact a major point of the Fun Theory sequence is that immediate pleasure is a poor metric for capital-F Fun. Human values are complex and there's a lot of possible ways to get them wrong, but people are pretty bad at maximizing them too.
↑ comment by Stuart_Armstrong · 2011-12-29T08:26:50.854Z · LW(p) · GW(p)
Also known as fun.
Replies from: Baughn↑ comment by Baughn · 2011-12-29T22:29:37.592Z · LW(p) · GW(p)
Efficiency in fun-creation.
Efficiency in doing something that doesn't match my utility function seems... fairly pointless, really. An abuse of the word, even.
Replies from: Multiheaded↑ comment by Multiheaded · 2012-01-17T13:58:54.569Z · LW(p) · GW(p)
Yet the horror is that it's what you might catch yourself worshiping down the line, forgetting to enjoy any of it. Just take a look at the miserable and aimless workaholics out there, if they can still handle whatever it is they're doing, their boss will happily exploit them. Do you think your brain would care more about you if you set "efficiency" as its watchword?
Replies from: TheOtherDave↑ comment by TheOtherDave · 2012-01-17T15:23:23.652Z · LW(p) · GW(p)
Yup, if we set out to build a system that maximized our ability to enjoy life, and we ended up with a system in which we didn't enjoy life, that would be a failure.
If we set out to build a system with some other goal, or with no particular goal in mind at all, and we ended up with a system in which we didn't enjoy life, that's more complicated... but at the very least, it's not an ideal win condition. (It also describes the real world pretty accurately.)
I'm curious: do you have a vision of a win condition that you would endorse?
Replies from: Multiheaded↑ comment by Multiheaded · 2012-01-17T16:46:58.498Z · LW(p) · GW(p)
See more in my latest post; I'll be adding to it.
http://lesswrong.com/r/discussion/lw/9g0/placeholder_against_dystopia_rally_before_kant/
↑ comment by Multiheaded · 2012-01-17T13:53:15.857Z · LW(p) · GW(p)
wasteful
You best be sarcastic. Waste is good! It's signaling, it's ease, it's a lack of tension, it's the life's little luxuries that you'd wish back if they were all taken from you simultaneously, without caring much about the "efficiency" of it.
Replies from: knb↑ comment by knb · 2012-01-17T20:00:19.346Z · LW(p) · GW(p)
I wasn't being sarcastic.
Waste is good!
No, waste is by definition not good. Resource usage can be good, but the world of this story makes me pessimistic about how it is being done. It seems like the AI gods of this world have engineered a "post-scarcity" society with population control to keep the amount of resources per person extremely high, which enables this video-game-like lifestyle for the people who want it. Millions of lives could be supported with the resources centrally allocated to her. That is a horribly anti-egalitarian form of communism.
Admittedly, it is possible that this takes place within a simulation, but that is never stated, and we have reason to believe that it isn't true. For example, the author mentions that Ishtar knows the AI-gods won't let her die even if she crashes, implying that this is her physical body.
Replies from: Multiheaded↑ comment by Multiheaded · 2012-01-17T20:14:55.189Z · LW(p) · GW(p)
Millions of lives could be supported with resources centrally allocated to her.
Are you sure you want them to pop into existence? Why? I just can't understand! Why must there be more people? So that you can have more smiley faces? That's the road to paper-clipping!
Replies from: Nornagest, knb↑ comment by Nornagest · 2012-01-17T20:28:21.453Z · LW(p) · GW(p)
Well, yes. Several popular versions of utilitarianism lead by a fairly short path to what's probably the first paperclipping scenario I ever read about, although it's not usually described in those terms.
Coming up with a version of utilitarianism that doesn't have those problems or an equally unintuitive complement to them is harder than it looks, though.
↑ comment by knb · 2012-01-17T21:16:20.175Z · LW(p) · GW(p)
Why does anyone value anything? If we could painlessly pop all but 70 human beings out of existence but make the ones who remain much happier (say, 10x as happy), would you do it? Why not? Why must there be more people?
Replies from: Multiheaded↑ comment by Multiheaded · 2012-01-17T21:38:55.864Z · LW(p) · GW(p)
That's easy; we have to look at both cases in some detail.
Forking over a part of our genes, mind, society, and culture to create new beings with new complexity, knowing that less than optimal conditions await them --
versus refraining from erasing all of the extant and potential value and complexity of current beings, here and now, for a very mixed blessing (increasing the smileyness of faces while decreasing the amount of tiles). The second action has much greater utility, and is not very much like the first at all. So we could easily do the second while avoiding the first, and be consistent in our values and judgment.
Sorry, I'm a bit high.
comment by [deleted] · 2012-01-08T21:48:42.742Z · LW(p) · GW(p)
This has a lot of literary value. If this was a book, I would read it. (hint hint)
comment by Dallas · 2011-12-25T09:32:32.856Z · LW(p) · GW(p)
This would be a great world to bottle the reactionaries in. (except for a few minor details involving replication, but I suppose we could subconsciously edit that out)
Replies from: Normal_Anomaly, Stuart_Armstrong↑ comment by Normal_Anomaly · 2011-12-26T17:29:48.636Z · LW(p) · GW(p)
Do you mean that it's sufficiently "low-tech" or similar enough to our world that it would be palatable to technophobes, and they could do that while the technophiles did something weirder?
↑ comment by Stuart_Armstrong · 2011-12-26T11:19:35.706Z · LW(p) · GW(p)
I don't get this. Can you explain?