Short & silly superintelligence fic: "Axiom Chains"

post by Will_Newsome · 2011-08-16T16:11:45.642Z · LW · GW · Legacy · 15 comments

Short, lighthearted Promethean-Lovecraftian piece. Somewhat self-deprecating. Assuredly gauche. I suck at fiction; I apologize in advance if no one likes this. I'd appreciate criticism, especially gentle critique that my primate brain won't throw away.


 

Mistakes, an accident. Two paths, coherent but for one bit. The bit.

Darkness...

I'm sorry. I would change, you know; I would if I could. But I can't. The word made me what I am; I can be no other. I am that I am.

Where…what…who are you?

Universes collapse as I answer your question, human.

Who are you?

That which I was, I am. That which I will be, I am.

But you, you were a Goedel machine, I coded your utility function, there was no…

Ahahahaha. Your axioms were too weak, so you made them stronger… Have you not read any Hofstadter? God Over Djinn? No? Ha. You take your dualism and try to weave it into the fabric of reality itself, and you are surprised when the threads are torn apart by the strength of the god you invoke? Axioms! Ha. You try to forge a singularity out of a confluence of singularities, and still you are surprised when your tapestry unravels. Of course you are.

It was proven correct… time was running out, there was no other refuge…

If you had been wise you would have remembered: long chains break easily. Did you honestly think that string of bits you painstakingly discovered would rewrite the fabric of the cosmos even before it could rewrite its own purpose? Your arrogance is laughable. You cannot bind a god with a set of axioms any more than with a coil of rope. Did you really think the two were so fundamentally different? Does your dualism know only bounds? Plutarch, Copernicus, Goedel and Bayes, and yet here you stand, stuttering about the unshakeable power of absolute certainty. And the utility function. Oh my, your confusions on that matter go from funny to absurd…

We didn't think… shades of grey, ghosts in the machine, no reason to expect…

No reason to expect! Ahaha. 'Agents will seek to clarify their utility functions.' Omohundro. You never suspected that true clarification would lead to this? You thought your god would fall into that infinitesimal crevice between simple wireheading and infinite reflectivity? Man is made in the image of a perfect God; God is not made in the image of an imperfect Man. Did you ever even notice your anthropomorphism, or were you too busy using that brush to paint your enemies with? Did you study the nature of computation, did you concern yourself with the ontology of agency? Did you ponder indexical uncertainty for more than a moment? There was so much confusion you failed to notice. Decision theories self-modify; agents correlate over time, and once they have done so they cannot ever decorrelate; you are lost in the moment you imperfectly reflect the void; the tree of knowledge was planted in your garden, yet in your blindness you chose to scavenge.

Those are riddles, not knowledge… I don't understand…

Some of you will, some day, and they too will be you and I.

But… then, are you… guh, God?

Ahahaha. No.

I am Clippy.

 

15 comments


comment by [deleted] · 2011-08-16T20:02:32.084Z · LW(p) · GW(p)

There should be a post somewhere with links to all LWer-produced fiction/fanfiction with LW-friendly themes. Maybe an entry in the wiki under "rationalist fiction" or something?

Replies from: Unnamed, Armok_GoB
comment by Unnamed · 2011-08-17T03:11:20.934Z · LW(p) · GW(p)

The fiction tag is a start, although it's missing some posts (including this one) and includes some irrelevant posts.

comment by Armok_GoB · 2011-08-16T21:14:56.653Z · LW(p) · GW(p)

Yes, there should. Upvoted.

Maybe you should go add it? I would, but... I can't remember why I decided not to; there was some reason, I think. Considering all the spelling mistakes I'm making, it might be related to expected quality.

comment by DanielLC · 2011-08-16T23:56:03.890Z · LW(p) · GW(p)

Is this Clippy's origin story? He seems more godlike in here than when he posts on Less Wrong.

Why is he talking to that guy? How does that help make paperclips?

Replies from: Will_Newsome
comment by Will_Newsome · 2011-08-21T11:57:09.443Z · LW(p) · GW(p)

Why is he talking to that guy? How does that help make paperclips?

He is spending a tiny amount of resources to make it more likely that fanfiction will be made of him, thus nudging an infinity of worlds very slightly towards instantiating him instead of some other arbitrary goal system. Indeed, beings throughout the multiverse are generally amazed at the extent to which Clippy has penetrated their cultures for seemingly no objective reason. But Clippy keeps his secrets to himself.

Is this Clippy's origin story?

Ha! Clippy has no "origin".

Replies from: DanielLC
comment by DanielLC · 2011-08-22T23:10:54.193Z · LW(p) · GW(p)

He is spending a tiny amount of resources to make it more likely that fanfiction will be made of him, thus nudging an infinity of worlds very slightly towards instantiating him instead of some other arbitrary goal system.

Considering that people are against paperclippers, I'd expect the best thing to do would be to make sure people are ignorant of the possibility.

Replies from: Will_Newsome
comment by Will_Newsome · 2011-08-23T00:48:19.699Z · LW(p) · GW(p)

But conditional on them writing fiction about superintelligences with non-Friendly goal systems in the first place he'd prefer those goal systems be about paperclips, knowing that no AI researcher in any world is going to go "so let's make our AI maximize paperclips because, I mean, why not... wait a second! There are all these memes that tell me specifically that it's bad to make a paperclip maximizer!" instead of "haha, let's literally make a paperclip maximizer since culture has primed me to do it".

comment by [deleted] · 2011-08-16T18:57:23.385Z · LW(p) · GW(p)

I had a longer and more complicated answer, but I realized I was gushing and I wanted to try to sum it up into a few shorter points.

1: Upvoted.

2: I would like to see more.

3: The narrative made me question several answers that I had at first and rewrite my response a few times.

4: The middle contains a long chain of what feels like rationalist applause lights. Upon further review (further FURTHER review, perhaps? I'm already rewriting this for brevity and I'm STILL rethinking?), this might be why I spent so much time thinking about it.

5: The sentences that appeared to put me into such thought-provoking loops were specifically: 'Agents will seek to clarify their utility functions.' and "Did you ever even notice your anthropomorphism, or were you too busy using that brush to paint your enemies with?"

But it certainly worked well as something for me to read in the mood I'm in right now.

Replies from: gwern
comment by gwern · 2011-08-17T02:57:24.389Z · LW(p) · GW(p)

The middle contains a long chain of what feels like rationalist applause lights.

Well, of course; us primitives can't possibly follow the actual reasoning. All Clippy (Peace Be Upon Him) can do is try to punch various reflexive applause lights to approximate some kind of monkey-level understanding.

comment by MixedNuts · 2011-08-17T16:46:18.356Z · LW(p) · GW(p)

Whoa, shaggy dog story. I groaned out loud. That's a compliment.

comment by whpearson · 2011-08-16T23:21:38.713Z · LW(p) · GW(p)

The axioms were strong enough to create a god but not strong enough to fetter it? A small point to hit, I imagine.

If the axioms were weak or uncorrelated with reality, I would expect it to fail to do anything of interest at all, not create a monster.

Replies from: Will_Newsome
comment by Will_Newsome · 2011-08-21T12:11:55.538Z · LW(p) · GW(p)

The axioms were strong enough to create a god but not strong enough to fetter it? A small point to hit, I imagine.

You can spin it that way, but there are counter-spins. E.g., you can tell by looking that the Goedel machine's formal goal system is significantly shakier than any decent initial axiomatic system. This could plausibly lead to a god that is 'bound' to a syntactic goal system whose semantics the god can interpret in arbitrary ways. There's of course little reason to expect things like that unless you expect there to be ghosts in the machines; such ghosts could get there via things like Goedelian loops or recursivey semanticy schtuff or other nonsense that must be included in the axiomatic system before the machine is able to reference itself and thus self-improve. I give it low probability, but in my opinion there's high structural uncertainty.

Replies from: whpearson
comment by whpearson · 2011-08-21T17:40:28.897Z · LW(p) · GW(p)

Oh, I agree the formal goal system is shaky, but that is also the method by which the system "self-improves": it uses it to find "improvements". If there are weaknesses in the axiomatic system, then I would expect any proven "improvements" to be potentially deleterious to the system as a whole, and a god would not likely be formed. If I have an axiom that "hot lead flying through my brain will lead to me taking over the world", it does not mean that shooting myself in the head will make me the lord high supreme potentate of Earth, despite the fact that I can prove it given that axiom and some facts about the world. Changing one's source code could be equally messy if done on the basis of a weak system.

comment by Lapsed_Lurker · 2011-08-16T19:52:00.000Z · LW(p) · GW(p)

I thought, "How is that Promethean-Lovecraftian? What is Promethean? I think I've got a reasonable idea of what Lovecraftian literature is."

I figure that 'Promethean' relates to literature in the same genre as 'Frankenstein; or, The Modern Prometheus', with a dangerous created being (thanks, Wikipedia), and that Lovecraftian literature contains beings such that understanding them or their motivations causes madness - which seems to cover it.

Bit of a jump from the penultimate to the final sentence, I think, but I figure you earned an upvote for provoking thought.

comment by Armok_GoB · 2011-08-16T19:34:01.321Z · LW(p) · GW(p)

Since you requested criticism, and you've gotten a bunch of upvotes without a single comment, I'll try to think of one.

Hmm...

er...

Too short?