Comments

Comment by SoundLogic on How to escape from your sandbox and from your hardware host · 2015-08-06T00:30:23.383Z · LW · GW

Step one involves figuring out the fundamental laws of physics. Step two is to input a complete description of your hardware. Step three is to construct a proof. I'm not sure how to order these in terms of difficulty.

Comment by SoundLogic on I attempted the AI Box Experiment again! (And won - Twice!) · 2014-06-01T09:22:55.381Z · LW · GW

After a fair bit of thought, I don't. I don't think one can really categorize it as purely spur of the moment, though; it lasted quite a while. Perhaps inducing a 'let the AI out of the box' phase would be a more accurate description.

Comment by SoundLogic on How long will Alcor be around? · 2014-04-27T17:25:45.079Z · LW · GW

I feel like the unpacking/packing biases ought to be easier to get around than some other biases. Fermi estimates do work (to some extent). I somewhat wonder if giving log probabilities would help more.
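As a minimal sketch of what the log-probability idea might look like for a conjunctive estimate (the events and numbers below are purely illustrative assumptions, not estimates from the original discussion):

```python
import math

# Purely illustrative component probabilities for a conjunctive estimate;
# none of these numbers come from the original discussion.
components = {
    "organization survives the interval": 0.6,
    "preservation remains intact": 0.8,
    "revival technology is developed": 0.3,
}

# Working in log probabilities turns the conjunction into a sum, which makes
# it harder to "round up" the product by packing the steps together.
log_p = sum(math.log(p) for p in components.values())
print(f"combined probability: {math.exp(log_p):.3f}")  # 0.144
```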

Comment by SoundLogic on Tell Culture · 2014-01-25T21:34:29.136Z · LW · GW

Oh, obviously there are causal reasons why guess culture develops. If there weren't, it wouldn't occur. I agree that having a social cost to denying a request can lead to this phenomenon, as your example clearly shows. I don't think that stops it from being silly.

Comment by SoundLogic on Tell Culture · 2014-01-19T04:18:50.846Z · LW · GW

I feel ask and tell culture are fairly similar in comparison to guess culture. Tell culture seems to me to be just ask culture with a bit more explaining, which seems like a move in the right direction, balanced by time and energy constraints. Guess culture just seems rather silly.

Comment by SoundLogic on The Futility of Emergence · 2013-12-17T16:17:43.758Z · LW · GW

What I meant by this is that the gravitational influence of N particles is the sum of the gravitational influences of each of the individual particles, and is therefore a strict function of their individual gravitational influences. If you give me any collection of particles and tell me nothing except their gravitational fields, I can tell you the gravitational field of the system of particles. If you tell me the intelligence of each of your neurons (0), I cannot determine your intelligence.
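A quick sketch of the superposition point in standard Newtonian terms (nothing here beyond textbook gravity): the field of N particles at a point $\vec{r}$ is just the sum of the individual fields,

$$\vec{g}_{\text{total}}(\vec{r}) \;=\; \sum_{i=1}^{N} \vec{g}_i(\vec{r}) \;=\; \sum_{i=1}^{N} \frac{G m_i\,(\vec{r}_i - \vec{r})}{|\vec{r}_i - \vec{r}|^{3}},$$

so knowing each $\vec{g}_i$ alone is enough to recover the whole field; there is no analogous composition rule that takes per-neuron "intelligence" values to the intelligence of the brain.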

Comment by SoundLogic on I attempted the AI Box Experiment again! (And won - Twice!) · 2013-09-05T19:14:49.981Z · LW · GW

I think the gatekeeper having to pay attention to the AI is very much in the spirit of the experiment. In the real world, if you built an AI in a box and ignored it, why build it in the first place?

Comment by SoundLogic on I attempted the AI Box Experiment again! (And won - Twice!) · 2013-09-05T15:37:43.550Z · LW · GW

I would be willing to consider it if you agreed to secrecy and raised it to $1000. You would still have to talk to Tuxedage though.

Comment by SoundLogic on I attempted the AI Box Experiment again! (And won - Twice!) · 2013-09-05T15:36:17.548Z · LW · GW

I'm not completely sure. And I can't say much more than that without violating the rules. I would be more interested in how I feel in a week or so.

Comment by SoundLogic on I attempted the AI Box Experiment again! (And won - Twice!) · 2013-09-05T08:09:41.685Z · LW · GW

A better mind than Tuxedage could almost certainly keep up the 'feel' of a flurry of arguments even with a schedule of breaks. I myself have had people feel irritated at me in that way even when I talk to them with days in between. If I can do so accidentally, I'm certain a superintelligence could do it reliably.

Also, I'm unsure of how much an AI could gather from a single human's text input. I know that I, at least, miss a lot of information that goes past me which I could in theory pick up.

An AI using timeless decision theory could easily compensate for having multiple AIs with unshared memory just by attempting to determine what the other AIs would say.

Comment by SoundLogic on I attempted the AI Box Experiment again! (And won - Twice!) · 2013-09-05T05:45:14.085Z · LW · GW

True.

Comment by SoundLogic on I attempted the AI Box Experiment again! (And won - Twice!) · 2013-09-05T05:42:40.686Z · LW · GW

I have a fair bit of curiosity, which is why he said that in this case it probably wouldn't make a difference.

Comment by SoundLogic on I attempted the AI Box Experiment again! (And won - Twice!) · 2013-09-05T05:36:37.609Z · LW · GW

Tuxedage's changes were pretty much just patches to fix a few holes as far as I can tell. I don't think they really made a difference.

Comment by SoundLogic on I attempted the AI Box Experiment again! (And won - Twice!) · 2013-09-05T05:16:43.673Z · LW · GW

I couldn't imagine either. But the evidence said there was such a thing, so I paid to find out. It was worth it.

Comment by SoundLogic on I attempted the AI Box Experiment again! (And won - Twice!) · 2013-09-05T05:13:46.970Z · LW · GW

I think your reasoning is mostly sound, but there are a few exceptions (which may or may not have happened in our game) that violate your assumptions.

I'm also somewhat curious how your techniques contrast with Tuxedage's. I hope to find out one day.

Comment by SoundLogic on The Futility of Emergence · 2013-03-11T17:00:34.829Z · LW · GW

I was under the impression that a property X was emergent if it wasn't determined by the set of property states of the components. I.e., gravity isn't emergent, since the gravity generated by something is the addition of the gravity of the parts. Intelligence is emergent by this definition, because even if I know the intelligence of each of your neurons, I don't know your intelligence.