Rationalist fiction: a Slice of Life IN HELL

post by Ritalin · 2014-03-25T17:02:59.732Z · LW · GW · Legacy · 26 comments

"If you're sent to Hell for that, you wouldn't have liked it in Heaven anyway." 

This phrase inspired in me the idea of a Slice of Life IN HELL story. Basically, the strictest interpretation of the Abrahamic God turns out to be true, and, after Judgment Day, all the sinners (again, by the strictest standards), the pagans, the atheists, the gays, the heretics and so on end up in Hell, which is to say, most of humanity. Rather than a Fire and Brimstone torture chamber, this Hell is very much like earthly life, except that it runs on Murphy's Law turned Up To Eleven ("everything that can go wrong, will go wrong"), you can't die permanently, and it goes on forever. It's basically life as a videogame set to Maximum Difficulty, except with real pain and suffering.

Our stories would focus on actually decent, sympathetic people who are there for things like following the wrong religion, having sex outside missionary-man-on-woman, neglecting the daily little rituals, or even just being lazy. They manage to live more or less decently because they're extremely cautious, rational, and methodical. Given that reality is out to get them, this is a constant uphill battle, and even the slightest negligence can have a terrible cost. Thankfully, they have all the time in eternity to learn from their mistakes.

This could be an interesting way to showcase rationalist principles, especially those regarding safety and planning, in a perpetual Worst Case Scenario environment. There's ample potential for constant conflict, and for sympathetic characters who, the audience can feel, really didn't deserve their fate. The central concept also seems classically strong to me: defying the Status Quo and cruel authorities by striving to be as excellent as one can be, even in the face of certain doom.

What do you guys think? There are lots of little details to specify, and there are many things that I believe should be marked as "must NOT be specified". Any help, ideas, or thoughts are very welcome.

26 comments

Comments sorted by top scores.

comment by Lumifer · 2014-03-25T17:33:19.733Z · LW(p) · GW(p)

I don't see much good in associating rationality with extreme caution.

I am also not sure how much use rationality will be in a world where reality is out to get you. It is merely unlikely that all the air molecules in a room will decide to be elsewhere, leaving hard vacuum in their place, or that you'll fall a hundred feet underground through solid rock and then remain entombed there for a REALLY long time...
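(For a sense of just how unlikely: here is a minimal back-of-envelope sketch, assuming a round figure of ~10^27 air molecules in a room and the idealized assumption that each one independently sits in either half of the room with probability 1/2.)

```python
import math

# Assumed round figure: a room holds on the order of N ~ 10^27 air molecules.
N = 1e27

# log10 of the probability that every molecule happens to be in one half
# of the room, under the idealized independent 50/50 assumption.
log10_p = N * math.log10(0.5)

print(f"log10(P) ~ {log10_p:.3g}")  # ~ -3e+26: a denominator with ~3*10^26 zeros
```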

Replies from: Ritalin
comment by Ritalin · 2014-03-25T18:25:36.132Z · LW(p) · GW(p)

How improbable the bad outcomes are would depend on which level of Hell you're in, perhaps?

comment by Shmi (shminux) · 2014-03-25T17:30:20.442Z · LW(p) · GW(p)

This could be an interesting way to showcase rationalist principles, especially those regarding safety and planning, in a perpetual Worst Case Scenario environment. There's ample potential for constant conflict,

Wildbow of Worm fame describes something like that in Pact. The protagonist is unwillingly thrust into a world where he has inherited a karmic debt from previous generations, and so

reality is out to get them, this is a constant uphill battle, and even the slightest negligence can have a terrible cost.

The serial is not explicitly "rationalist", but irrational (in that universe) decisions bite him in the rear pretty quickly. And so do rational decisions sometimes. Even your best actions against a hostile universe can only get you so far.

Is this similar to what you had in mind?

Replies from: Ritalin, Error
comment by Ritalin · 2014-03-25T18:12:33.287Z · LW(p) · GW(p)

Indeed. Although, frankly, what I've seen of Worm so far seems to make it very similar to my idea of Hell: every accomplishment is either made moot or costs something irreplaceable and possibly of greater value, every victory is short-lived, every mistake is paid for dearly. Every situation is desperate, every problem urgent. By the time a conflict reaches its resolution, another is at its peak, and two more are right around the corner. Perhaps it's even worse; hardship, instead of building character, corrupts it.

For the characters, it must be like a nightmare they can't wake up from.

Replies from: Protagoras
comment by Protagoras · 2014-03-26T12:24:07.136Z · LW(p) · GW(p)

Yeah, Worm is pretty bleak. I tend to find that a bit overwhelming at times myself; I like the series for its other strengths (diverse and interesting characters, intelligent plotting, a deep and rich setting), with the oppressive tone being a small strike against it for me.

comment by Error · 2014-03-25T22:18:12.379Z · LW(p) · GW(p)

...And now I just spent most of my workday reading Pact. Upvoted for awesome. Thanks. :-)

comment by A4FB53AC · 2014-03-26T12:01:44.792Z · LW(p) · GW(p)

Please do note the delicious irony here:

I don't see much good in associating rationality with extreme caution.

I don't think that teaching people to expect worst-case scenarios increases rational thinking.

Which in essence looks suspiciously like cautiously assuming a bad-case scenario in which this story won't help the rationality cause, or even a worst-case scenario in which it will do more harm than good.

If you want to go forth and create a story about rationality, then do it. Humans are complex creatures; not everyone will react the same way to your story, and anybody who thinks they can accurately predict the reactions of all the different kinds of people who'll read your story (especially as this story hasn't even been written yet) is either severely deluded as to their ability, or secretly running the world from behind the curtains already.

When you are older, you will learn that the first and foremost thing which any ordinary person does is nothing.

Replies from: Ritalin
comment by Ritalin · 2014-03-26T14:37:57.134Z · LW(p) · GW(p)

the first and foremost thing which any ordinary person does is nothing

That's me all right. Heck, now that the examples of Hellcity, Worm, and Pact have been brought up, I feel like such a work would be redundant.

comment by CronoDAS · 2014-03-27T05:32:12.263Z · LW(p) · GW(p)

A silly question: Of all the recurring "employee" characters in Dilbert, which one is reacting most rationally to the situation they're in? Probably Wally...

comment by [deleted] · 2014-03-26T13:37:51.011Z · LW(p) · GW(p)

In terms of little details, I think "Everything that can go wrong, will go wrong" must be specified right away, because if you let rationalists think "How bad could it be at maximum badness?" it will get very bad, very quickly.

For instance, Situation 1: imagine that every day you spend mostly outside, you get struck by lightning, and every day you spend mostly inside, there is an earthquake and whatever you are in collapses on you.

I can see rationalists attempting to build, and spend most of their time in, structures made mostly out of pillows: they collapse, oh well, they get rebuilt in 30 minutes. It turns the pain into a daily chore.

On the other hand, imagine Situation 2: every day, through hellish quantum mechanics, enough antimatter appears in contact with your skin to cause a non-fatal, but excruciating, matter-antimatter explosion.

Now, at this point, the rationalist might realize something like "Okay, well, I'll arrange things in such a way that any explosion will fit into one of two categories: it will be fatal, or it won't actually cause me pain."

And while the rationalist is attempting to build the arrangement that does this, a giant bear comes by, breaks it, and painfully claws them to pieces (non-fatally).

Situation 3: Rationalists can be rationalist all they want, but they've been captured by the giant bears and had all of their limbs systematically clawed off, plus they've been blindfolded, gagged, earplugged, and are periodically used as claw sharpeners.

Of course, if some parts of Hell are like Situation 1, some parts are like Situation 2, and some are like Situation 3, I expect rationalists to attempt to figure out why that is, unless you want to have Situation 4:

Situation 4: There's one constant rule of Hell: every time someone figures out all of the other rules of Hell, those rules change.

Ergo: once someone figures out "Oh, well, I can avoid the Lightning and the Earthquakes with pillow structures," then the Giant Bears and Antimatter Skin Explosions come. Once you figure out how to get used to being used as a Giant Bear claw sharpener, something else happens, and that thing is even worse.

Basically, there is a range of darkness you can have here, in terms of writing. In terms of difficulty levels, this might be expressed as:

1: Hard.

2: Impossible.

3: You're helpless.

4: Struggling can only make it worse.

I was writing a story about a character starting at rock bottom and working their way up, and I actually had the entity setting this up mention to the character that there had been previous versions of the character who just went irrevocably insane, and were deleted and reset, because previous versions of "rock bottom" had been set too low to ever get out of.

Replies from: philh
comment by philh · 2014-03-26T14:33:04.451Z · LW(p) · GW(p)

After learning the constant rule, you find a ruleset that doesn't seem too awful, and then don't learn it.

Hell is trying to abstain from pattern matching.

Replies from: JacekLach, ete
comment by JacekLach · 2014-03-28T21:07:22.329Z · LW(p) · GW(p)

Reminds me of talesofmu. Your strategy looks like trying to play the GM, and is likely to get you punished :)

comment by plex (ete) · 2014-03-27T12:39:53.919Z · LW(p) · GW(p)

Wouldn't that count as learning a rule and cause the meta-level rules to change to something worse if you started using your knowledge to make it more tolerable?

Replies from: David_Gerard
comment by David_Gerard · 2014-03-29T10:50:39.383Z · LW(p) · GW(p)

It might do. You'd have to check by writing it, of course.

comment by maia · 2014-03-26T00:42:00.315Z · LW(p) · GW(p)

Ehh... As the other commenters are saying, it's unclear how it would promote rationality, or what its Ultimate Effect would be...

But I think you should do it anyway. I'd read it.

Replies from: Ritalin
comment by Ritalin · 2014-03-26T02:34:02.721Z · LW(p) · GW(p)

The challenge is that rationalists should win, no matter what kind of environment they're thrown into. One that's out to screw them is only a middling challenge. Eventually, I'd like to tackle "how to be as rational/effective as possible in an actively irrational environment, such as the setting of The Sandman".

Replies from: Lumifer
comment by Lumifer · 2014-03-26T15:17:45.916Z · LW(p) · GW(p)

The challenge is that rationalists should win, no matter what kind of environment they're thrown into.

Challenge to whom? To the omnipotent author? Doesn't look much like a challenge (see the "omnipotent" bit). To the rationalists? It seems pretty obvious to me that there are environments where no winning is possible.

Replies from: Ritalin
comment by Ritalin · 2014-03-26T15:20:58.267Z · LW(p) · GW(p)

Establishing that winning is impossible is already a win of sorts. And writers are hardly omnipotent; we are governed by the stringent rules of Good Writing. An author who abuses their power willy-nilly only creates an unpersuasive mess that immerses and captivates absolutely no one, and can hardly be said to be fiction at all.

Replies from: Lumifer
comment by Lumifer · 2014-03-26T16:26:03.177Z · LW(p) · GW(p)

we are governed by the stringent rules of Good Writing

Only if you choose to be so :-)

Replies from: Ritalin
comment by Ritalin · 2014-03-26T17:46:05.823Z · LW(p) · GW(p)

It's not just choice; you have to learn them and internalize them, and they're subjective. Grant Morrison, Alan Moore, and Neil Gaiman can write incredibly confusing, irrational, impossible stories that are nevertheless plausible and gripping and immersive. That took them decades of experience. Your beginner fanfic writer, no matter how well-intentioned and studious, will fail on some fundamental level. Check out EY's earliest fiction; it's pretty damn terrible.

Replies from: Bound_up
comment by Bound_up · 2015-10-02T07:15:18.084Z · LW(p) · GW(p)

Where does one find this terrible early fiction?

comment by [deleted] · 2014-03-26T13:41:47.146Z · LW(p) · GW(p)

Hellcity by Macon Blair and Joe Flood is as you describe and a good read.

comment by ChristianKl · 2014-03-25T22:51:06.237Z · LW(p) · GW(p)

I don't think that teaching people to expect worst-case scenarios increases rational thinking.

Replies from: Ritalin
comment by Ritalin · 2014-03-26T00:43:23.095Z · LW(p) · GW(p)

Not expecting them, but anticipating them. Anticipating how things can go wrong, and pre-empting that. Like Harry buying that medikit, although it turned out to be useless because he hadn't been prepared to do what it took to keep his friends safe.

Replies from: ChristianKl, shminux
comment by ChristianKl · 2014-03-26T12:13:14.744Z · LW(p) · GW(p)

It's more rational to make expected-utility calculations than to try to cover yourself against every worst-case scenario that you can imagine.
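(For instance, a minimal sketch of that comparison, with all numbers hypothetical and purely for illustration: weigh the cost of a precaution against its probability-weighted benefit, rather than armoring against every conceivable disaster.)

```python
# Hypothetical scenarios: (description, probability, utility if unmitigated).
scenarios = [
    ("nothing goes wrong", 0.95, 0.0),
    ("minor mishap",       0.04, -10.0),
    ("disaster",           0.01, -1000.0),
]

def expected_utility(precaution_cost, loss_reduction):
    """Expected utility of paying `precaution_cost` up front to shrink
    every loss by the fraction `loss_reduction`."""
    return -precaution_cost + sum(p * u * (1 - loss_reduction)
                                  for _, p, u in scenarios)

print(expected_utility(0.0, 0.0))  # do nothing:       -10.4
print(expected_utility(5.0, 0.9))  # buy a precaution:  -6.04, the better bet
```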