Comments

Comment by Pierre-André_Noël on Not Taking Over the World · 2008-12-16T15:43:30.000Z · LW · GW

Peter de Blanc: You are right, and I came to the same conclusion while walking this morning. I was trying to simplify the problem in order to easily obtain numbers <= 1/(3^^^^3), which would resolve the "paradox". We now agree that I oversimplified it.

Instead of attempting a proof-like approach again, I will try to clarify my intuition. Once you start considering events of that magnitude, you must consider a great many events (including waking up with blue tentacles for hands, to borrow Eliezer's example). The total probability of mutually exclusive events is limited to 1. Without evidence, there is no reason to put more probability mass there than anywhere else. There is little evidence for a device exterior to our universe that can "read" our choice (giving five dollars or not) and then carry out the stated claim. I don't think that is even falsifiable "from our universe".

If the claim is not falsifiable, the AI should not accept unless I do something "impossible" within its current framework of thinking. One proof request I have in mind is to perform some calculations on the order-3^^^^3 computer and share easily verifiable results that would otherwise take longer than the age of the universe to obtain. The AI could also ask: "simulate me and find a proof that would satisfy me." Once the AI is convinced, it could throw in another five dollars and ask for algorithmic improvements that would otherwise require billions of years to achieve. Or for ssh access to the 3^^^^3 computer.
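To make the kind of challenge I have in mind concrete, here is a minimal sketch from the verifier's side, assuming the usual hardness of factoring; the function names and the choice of factoring as the task are my own illustration, not part of the original proposal:

```python
# Sketch of an "easily verifiable, otherwise infeasible" challenge.
# Assumption: factoring a large random semiprime is intractable with
# ordinary resources, while checking a claimed factorization costs a
# single multiplication.
from sympy import randprime  # assumed available for prime generation

def make_challenge(bits: int = 2048) -> int:
    """Pick two secret primes of `bits` bits each; publish only n = p * q."""
    p = randprime(2 ** (bits - 1), 2 ** bits)
    q = randprime(2 ** (bits - 1), 2 ** bits)
    return p * q

def verify_answer(n: int, claimed_p: int, claimed_q: int) -> bool:
    """Accept the claim only if the factors actually multiply back to n."""
    return 1 < claimed_p < n and claimed_p * claimed_q == n
```

The point is the asymmetry: producing the answer is supposed to require the claimed 3^^^^3-scale resources, while checking it costs the verifier almost nothing.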

Comment by Pierre-André_Noël on Not Taking Over the World · 2008-12-16T04:44:59.000Z · LW · GW

Sorry, I was not specific enough. My 3^^^^3, 3^^^^3-1, 3^^^^3-2, etc. examples are mutually exclusive (with the sofa being part of the "0" case). While they might not span all possibilities (they are not exhaustive) and could thus sum to less than one, they cannot sum to more than 1. As I see it, the weakest assumption here is that "more persons/pigs is less or equally likely". If this holds, the worst-case scenario is epsilon = 1/(3^^^^3), but I would guess the true value is far less than that.
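Spelled out in my own notation, writing K for 3^^^^3 and p_N for the probability of the mutually exclusive "exactly N" case, the bound is:

```latex
% "More persons/pigs is less or equally likely": p_1 >= p_2 >= ... >= p_K,
% with the K cases mutually exclusive.
\[
  1 \;\ge\; \sum_{N=1}^{K} p_N \;\ge\; K \, p_K
  \qquad\Longrightarrow\qquad
  p_K \;\le\; \frac{1}{K} \;=\; \frac{1}{3\uparrow\uparrow\uparrow\uparrow 3}.
\]
```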

Comment by Pierre-André_Noël on Not Taking Over the World · 2008-12-16T03:55:26.000Z · LW · GW

Sorry for being off topic, but has that 3^^^^3 problem been solved already? I just read the posts and, frankly, I fail to see why it caused so many problems.

Among the things that Jaynes repeats often in his book is that the sum of all probabilities must be 1. Hence, if you put probability mass somewhere, you must remove it from elsewhere. What is the prior probability of "me being able to simulate/kill 3^^^^3 persons/pigs"? Let's call that nonzero number "epsilon". Now, I would guess that the (3^^^^3)-1 case should have a probability greater than or equal to epsilon, and likewise for (3^^^^3)-2, etc. Even with a "cap" at 3^^^^3, summing these mutually exclusive cases gives 3^^^^3 × epsilon <= 1, i.e. epsilon <= 1/(3^^^^3). And this doesn't even count the case "I fail to fulfill my threat and suddenly change into a sofa", let alone all the >=42^^^^^^^42 possible statements in that meta-multiverse. The integral must be one.

Now, the fact that I make said statement should raise the posterior probability to something larger than epsilon, depending on your trust in me and so on, but the order of magnitude remains small enough to cancel out the "immensity" of 3^^^^3. Is it that simple, or am I missing something?
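To make the cancellation explicit (again my own rough rendering, with K = 3^^^^3, a prior of at most 1/K as above, and L the finite likelihood ratio my statement provides):

```latex
% Posterior after hearing the statement, assuming L/K stays far below 1:
\[
  P(\text{claim true}\mid\text{statement})
  \;\approx\; L \cdot P(\text{claim true})
  \;\le\; \frac{L}{K},
\]
% so the expected number of persons/pigs at stake is at most
\[
  K \cdot \frac{L}{K} \;=\; L,
\]
% which no longer grows with K: the immensity of 3^^^^3 drops out.
```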