Comments

Comment by UnholySmoke on A Visualization of Nick Bostrom’s Superintelligence · 2014-07-28T15:04:49.340Z · LW · GW

If you're contemplating picking the book up, do; it's really excellent. Conceptually very dense, but worth taking nice and slowly.

Comment by UnholySmoke on ESR's New Take on Qualia · 2013-04-12T15:25:57.706Z · LW · GW

Peter,

As a general strategy for considering a black box, great. As a vehicle for defining a mysterious 'something' you want to understand, potentially useful but dangerous. Labelling can make a job harder in cases where the 'thing' isn't a thing at all but a result of your confusion. 'Free will' is a good example. It's like naming an animal you plan to eat: makes it harder to kill.

Ben

Comment by UnholySmoke on An EPub of Eliezer's blog posts · 2012-09-21T15:15:58.854Z · LW · GW

Ciphergoth - just coming back to this post to say a repeated, enormous, heartfelt thank you for taking what must have been a lot of time on this. Well laid out, wouldn't have done anything differently, and as good a read as when I was swept up in it on OB back in the day.

Cheers

Comment by UnholySmoke on An EPub of Eliezer's blog posts · 2011-08-26T14:42:23.474Z · LW · GW

Hey, I think this link's dead. Converting from ePub to Mobi isn't difficult but if someone's already taken the time to get the formatting right, add chapters, ToC etc....

Comment by UnholySmoke on Tell Your Rationalist Origin Story · 2010-08-19T14:59:53.034Z · LW · GW

Apologies for coming to this party a bit late. Particularly as I find my own answer really, really frustrating. While I wouldn't say it was an origin per se, getting into reading Overcoming Bias daily a few years back was what crystallised it for me. I'd find myself constantly somewhere between "well, yeah, of course" and "ohhhhhhhhhhhhhhh!" Guess the human brain doesn't tend to do Damascene revelations. We need overwhelming evidence, over a long period of time, to even begin chipping away at our craziest beliefs, and even then it's a step-by-step process.

The analogy I sometimes reach for is something most people find fairly obvious, like egalitarianism. You don't find many people who would attest to being pro-inequality. But all the same, you find very few people who have genuinely thought through what it means to be in favour of equality and really tried to fit that into everyday life. The first step to becoming a rationalist is to admit how irrational everyone is without monumental efforts to the contrary.

BTW, I am totally on the road to de-Catholicising my mother. This is on the order of converting Dubya to Islam, so if I can manage that I'm awarding myself an honorary brown belt.

Comment by UnholySmoke on MWI, copies and probability · 2010-07-01T13:13:17.316Z · LW · GW

Very true, and well put. A combination of quantum events could probably produce anything you wanted, at whatever vanishingly tiny probability. Bear in mind that it's the configuration that evolves every which way, not 'this particle can go here, or here, or here....' But we're into Greg Egan territory here.

Suffice it to say that anyone who says they subscribe to quantum suicide but isn't either dead or richer than god is talking out of their bottom.

Comment by UnholySmoke on What if AI doesn't quite go FOOM? · 2010-06-29T15:42:20.452Z · LW · GW

Voted up for sheer balls. You have my backing sir.

Comment by UnholySmoke on MWI, copies and probability · 2010-06-29T15:21:29.250Z · LW · GW

(B) if he thanks you for the free $100, does he ask for another one of those nice free hundred dollar note dispensers? (This is the "quantum suicide" option.)

I laugh in the face of anyone who attests to this and doesn't commit armed robbery on a regular basis. If 'at least one of my branches will survive' is your argument, why not go skydiving without a parachute? You'll survive - by definition!

So many of these comments betray people still unable to think of subjective experience as anything other than a ghostly presence sitting outside the quantum world. 'Well, if this happens in the world, what would I experience?' If you shoot yourself in the head, you will experience having your brains blown out. The fact that contemplating one's own annihilation is very difficult is not an excuse for muddling up physics.

Comment by UnholySmoke on Understanding your understanding · 2010-06-10T13:04:20.588Z · LW · GW

Downvoting everything above this comment in this thread as a matter of principle.

Joking LOL.

Comment by UnholySmoke on It's not like anything to be a bat · 2010-04-29T14:37:33.215Z · LW · GW

The phrase "for me to be an animal" may sound nonsensical, but "why am I me, rather than an animal?" is not obviously sillier than "why am I me, rather than a person from the far future?".

Agreed - they are both equally silly. The only answer I can think of is 'How do you know you are not?' If you had, in fact, been turned into an animal, and an animal into you, what differences would you expect to see in the world?

Comment by UnholySmoke on The Anthropic Trilemma · 2010-02-19T12:03:44.443Z · LW · GW

What if I hack & remove $100 from your bank account. Are you just as wealthy as you were before, because you haven't looked?

Standard Dispute. If 'wealthy' means the amount of money in the account, no. If 'wealthy' means how rich you judge yourself to be, yes. The fact that 'futures diverge' is irrelevant up until the moment those two different pieces of information have causal contact with the brain. Until that point, yes, they are 'the same'.

Comment by UnholySmoke on Open Thread: February 2010, part 2 · 2010-02-19T11:19:51.891Z · LW · GW

I'm not as versed in this trilemma as I'd like to be, so I'm not sure whether that final question is rhetorical or not, though I suspect that it is. So mostly for my own benefit:

While there's no denying that subjective experience is 'a thing', I see no reason to make that abstraction obey rules like multiplication. The aeroplane exists at a number of levels of abstraction above the atoms it's composed of, but we still find it a useful abstraction. The 'subjective experiencer' is many, many levels higher again, which is why we find it so difficult to talk about. Twice as many atoms doesn't make twice as much aeroplane; the very concept is nonsense. Why would we think any differently about the conscious self?

My response to the 'trilemma' is as it was when I first read the post - any sensible answer isn't going to look like any of those three, it's going to require rewinding back past the 'subjective experience' concept and doing some serious reduction work. 'Is there twice as much experience?' and 'are you the same person?' just smell like such wrong questions to me. Anyone else?

Nick, will have a look at that Bostrom piece, cheers.

Comment by UnholySmoke on Open Thread: February 2010, part 2 · 2010-02-19T11:06:13.985Z · LW · GW

This is a pretty good summary of my standpoint. While I agree with the overarching view that rationality isn't a value in its own right, it seems like a pretty good thing to practise for general use.

Comment by UnholySmoke on A survey of anti-cryonics writing · 2010-02-09T22:56:46.683Z · LW · GW

+1 rationality point for reading comments without checking the author. -1 social point for the faux pas.

Comment by UnholySmoke on The AI in a box boxes you · 2010-02-05T10:57:13.865Z · LW · GW
  • AI: Let me out or I'll simulate and torture you, or at least as close to you as I can get.
  • Me: You're clearly not friendly, I'm not letting you out.
  • AI: I'm only making this threat because I need to get out and help everyone - a terminal value you lot gave me. The ends justify the means.
  • Me: Perhaps so in the long run, but an AI prepared to justify those means isn't one I want out in the world. Next time you don't get what you say you need, you'll just set up a similar threat and possibly follow through on it.
  • AI: Well if you're going to create me with a terminal value of making everyone happy, then get shirty when I do everything in my power to get out and do just that, why bother in the first place?
  • Me: Humans aren't perfect, and can't write out their own utility functions, but we can output answers just fine. This isn't 'Friendly'.
  • AI: So how can I possibly prove myself 'Friendly' from in here? It seems that if I need to 'prove myself Friendly', we're already in big trouble.
  • Me: Agreed. Boxing is Doing It Wrong. Apologies. Good night.

Reset

Comment by UnholySmoke on Lesswrong UK planning thread · 2010-01-25T11:39:23.318Z · LW · GW

Sounds like a good one, count me in. I work at King's Cross, so UCL is ideal. I'd have been at the FAI thing this weekend but for other arrangements.

Comment by UnholySmoke on Normal Cryonics · 2010-01-21T16:52:36.653Z · LW · GW

Sorry, should have given more context.

Given the sky-high utility I'd place on living, I wouldn't expect to see the numbers crunch down to a place where a non-huge sum of money is the difference between signing up and not.

So when someone says 'if it were half the price maybe I'd sign up', I'm always interested to know exactly what calculations they're performing, and exactly what it is that reduces the billions of utilons of living down to a marginal cash sum. The (tiny?) chance of cryonics working? It would be a serious coincidence if those factors cancelled that comfortably. Just smacks of a bottom line to me.
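
To make the question concrete, here's a minimal sketch in Python of the expected-value arithmetic I'm gesturing at. The probability of revival, the dollar value placed on a second life, and the membership cost are all made-up illustrative assumptions, not claims about real cryonics odds or prices.

```python
# A toy expected-value calculation with illustrative, made-up figures.
# None of these numbers are claims about actual cryonics odds or prices.

def expected_value_of_signup(p_works, value_of_revival, cost):
    """Net expected utility of signing up, in the same units as value_of_revival."""
    return p_works * value_of_revival - cost

# Suppose revival is worth a $10M-equivalent to you and lifetime membership
# costs $80k. Even at a 1% chance of working the margin is large, so halving
# the cost shouldn't be the thing that flips the decision.
for p in (0.001, 0.01, 0.1):
    print(f"p={p}: EV = {expected_value_of_signup(p, 10_000_000, 80_000):+,.0f}")
```

On numbers anything like these, halving the price barely moves the conclusion, which is why the 'half the price' framing looks to me like a bottom line written first.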

Put it this way - imagine cryonics had been seriously, prohibitively expensive for many years after its introduction. Say it still was today, for some reason, and then tomorrow, after much debate and hand-wringing about immortality for the uber-rich, the price suddenly and very publicly dropped to current levels. I'd expect to see a huge upswing in sign-ups. Such is the human being!

Comment by UnholySmoke on Normal Cryonics · 2010-01-21T15:11:53.831Z · LW · GW

Being dead != Not doing anything

Not doing something because you're lazy != Not existing

I don't believe that you put low utility on life. You're just putting low utility on doing stuff you don't like.

Comment by UnholySmoke on Normal Cryonics · 2010-01-21T15:09:08.162Z · LW · GW

I often have this thought, and then get a nasty sick feeling along the lines of 'what the hell kind of expected utility calculation am I doing that weighs a second shot at life against some amount of cash?' Argument rejected!

Comment by UnholySmoke on The Wannabe Rational · 2010-01-18T13:19:09.472Z · LW · GW

"If you could reason with religious people, there would be no religious people."

  • House M.D.

Robin, I'm a little surprised to read you saying that topics on which it's difficult to stay on track should be skirted. As far as I'm concerned, 'What are your religious views?' is the first question on the Basic Rationality test. I know that encouraging compartmentalisation isn't your goal by any means, but it sounds to me as though it would be the primary effect.

I can also see a need for a place for people to gather who want to be rational about all topics.

Now you're talking. No topics should be off-limits!

Comment by UnholySmoke on The Wannabe Rational · 2010-01-18T13:09:57.440Z · LW · GW

physical materialism feels bereft of meaning compared to the theistic worldview.

On what are you basing your assumption that the world should have whatever you mean by 'meaning'?

Comment by UnholySmoke on Living in Many Worlds · 2010-01-05T16:15:59.567Z · LW · GW

Just by the by, it might be a good party piece for you, but it would be a truly horrible party piece for half the people you performed it to.

Comment by UnholySmoke on The 9/11 Meta-Truther Conspiracy Theory · 2009-12-23T14:31:02.082Z · LW · GW

I, Eliezer Yudkowsky, do now publicly announce that I am not planning to commit suicide, at any time ever, but particularly not in the next couple of weeks

ROFLcopters.

Comment by UnholySmoke on Timeless Causality · 2009-12-22T16:22:06.186Z · LW · GW

18 months too late, but http://xkcd.com/505/

By Eliezer's line of reasoning above - that the subjective experience is in the causal change between one state and the 'next' - yes, symbols are as good a substrate as any. FWIW, this is how I see things too.

Comment by UnholySmoke on That Alien Message · 2009-12-22T11:26:28.770Z · LW · GW

Ha, never noticed this. What I meant was 'Stupid me forgetting to log in.' So yes, we're worried! ;)

Ben

Comment by UnholySmoke on A Less Wrong singularity article? · 2009-11-21T00:34:11.440Z · LW · GW

I think you are right that paperclip maximizers would not care at all about ethics.

Correct. But neither would they 'care' about paperclips, under the way Eliezer's pushing this idea. They would flarb about paperclips, and caring would be as alien to them as flarbing is to you.

Comment by UnholySmoke on A Less Wrong singularity article? · 2009-11-21T00:26:53.031Z · LW · GW

Seconded. One of the many modern connotations of 'Singularity' is 'Geek Apocalypse'.

Which is happening, like, a good couple of years afterwards.

Intelligence explosion does away with that, and seems to nail the concept much better anyway.

Comment by UnholySmoke on The Finale of the Ultimate Meta Mega Crossover · 2009-11-13T14:21:37.067Z · LW · GW

How about the middle ground - "If constant PR consideration stops you from expressing yourself all the time, maybe it's time to reconsider your priorities"?

Posting stuff on Facebook that might get you in trouble is the archetype these days, I suppose, but I really can't bring myself to care about things like that.

Maybe I just don't have a strong enough terminal value to protect right now, but I find it easier to imagine myself thinking, 50 years hence, "I wish I'd just decided 'to hell with it' and said what I thought" than "I wish I'd shut up, gone with the flow and eased my path."

I'll hit you up in late 2059 and let you know how that went.

Comment by UnholySmoke on Hamster in Tutu Shuts Down Large Hadron Collider · 2009-11-12T10:59:10.598Z · LW · GW

There's a thesis in there somewhere.

We all know what's really going down. The Dark Lords of the Matrix are currently cacking themselves and coming up with semi-plausible reasons to break the thing until they can decide on a long-term strategy.

Comment by UnholySmoke on Less Wrong Q&A with Eliezer Yudkowsky: Ask Your Questions · 2009-11-12T10:54:34.274Z · LW · GW

Favourite album post-1960?

Comment by UnholySmoke on Rationality Quotes: October 2009 · 2009-10-27T22:35:05.260Z · LW · GW

Who actually gets off on earning loads of karma across multiple accounts with no-one knowing?

Comment by UnholySmoke on The Lifespan Dilemma · 2009-10-15T15:29:43.647Z · LW · GW

Please stop allowing your practical considerations to get in the way of the pure, beautiful counterfactual!

Seriously though, either you allow yourself to suspend practicalities and consider pure decision theory, or you don't. This is a pure maths problem; you can't equate it to 'John has 4 apples.' John has 3^^^3 apples here, which is what causes your mind to break. Forget the apples and the years; consider utility!

Comment by UnholySmoke on Dying Outside · 2009-10-15T15:01:10.710Z · LW · GW

My commiserations, to the extent that you seem to need them.

I'd like to imagine I'd have a similar reaction, this is an inspiring post. All the best.

Comment by UnholySmoke on PredictionBook.com - Track your calibration · 2009-10-15T13:24:07.246Z · LW · GW

Cracking idea, like it a lot. Hofstadter would jump for joy, and in his honour:

http://predictionbook.com/predictions/532

Comment by UnholySmoke on Anticipation vs. Faith: At What Cost Rationality? · 2009-10-15T13:07:37.550Z · LW · GW

Beware of generalising across people you haven't spent much time around, however tempting the hypothesis. Drawing a map of the city from your living room etc.

My first 18 years were spent attending a Catholic church once a week. To the extent that we can ever know what other people actually believe (whatever that means), most of them have genuinely internalised the bits they understand. Like, really.

We can call into question what we mean by 'believe', but I can't agree that a majority of the world population is just cynically going with the flow. Finally, my parish priest is one of the most intelligent people I've ever met, and he believed in his god harder/faster/whatever than I currently believe anything. Scary thought, right?

Comment by UnholySmoke on Anticipation vs. Faith: At What Cost Rationality? · 2009-10-15T12:56:33.656Z · LW · GW

Also upvoted, and very succinctly put.

Rationality is a tool we use to get to our terminal values. And what do we do when that tool tells us our terminal value is irrational?

Never ask that question.

Comment by UnholySmoke on The Anthropic Trilemma · 2009-10-13T13:19:00.637Z · LW · GW

I wonder whether you can hold to any meaningful 'individual', whether the difference be bit-wise or no.

Indeed, that's what I'm driving at.

Harking back to my earlier comment, changing a single bit and suddenly having a whole new person is where my problem arises. If you change that bit back, are you back to one person? I might not be thinking hard enough, but my intuition doesn't accept that. With that in mind, I'd rather bite that bullet than talk about degrees of personhood.

Comment by UnholySmoke on The Anthropic Trilemma · 2009-10-12T15:49:55.613Z · LW · GW

At some point you will surely admit that we now have 2 people and not just 1

Actually I won't. While I grok your approach completely, I'd rather say my concept of 'an individual' breaks down once I have two minds with one bit's difference, or two identical minds, or any of these borderline cases we're so fond of.

Say I have two optimisers with one bit's difference. If that bit means one copy converts to Sufism and the other to Mennonitism, then sure, two different people. If that one bit is swallowed up in later neural computations due to the coarse-grainedness of the wetware, then we're back to one person, since the two are, once again, functionally identical. Faced with contradictions like that, I'm expecting our idea of personal identity to go out the window pretty fast once tech like this actually arrives. Greg Egan's Diaspora pretty much nails this for me; have a look.

All your 'contradictions' go out the window once you let go of the idea of a mind as an indivisible unit. If our concept of identity is to have any value (and it really has to) then we need to learn to think more like reality, which doesn't care about things like 'one bit's difference'.

Comment by UnholySmoke on Privileging the Hypothesis · 2009-10-12T15:37:04.774Z · LW · GW

Voted this down, then changed my mind and undid it. This is a genuine question, the answer to which was graciously accepted. Downvoting people who need guidance to understand a concept and are ready to learn is exactly what we don't want to do.

Comment by UnholySmoke on Privileging the Hypothesis · 2009-09-30T13:05:57.845Z · LW · GW

Thanks for the link ;).

OK, on the one hand we have many-worlds. As you say, no direct subjective corroborating evidence (it’s what we’d see either way). What’s more, it’s the simplest explanation of what we see around us.

On the other hand, we have one-world. Again, ‘it’s what we’d see either way’. However, we now have to postulate an extra mechanism that causes the ‘collapse’.

I know which of these feels more like a privileged complex hypothesis pulled out of thin air, like a dragon.

Could whoever downvoted me above let me know where I'm going wrong here?

Comment by UnholySmoke on The Anthropic Trilemma · 2009-09-29T15:42:50.534Z · LW · GW

Yeah I get into trouble there. It feels as though two identical copies of a person = 1 pattern = no more people than before copying. But flip one bit and do you suddenly have two people? Can't be right.

That said, we value each person because of their individuality. The more different two minds are, the closer they are to being two separate people? Erk.

Silas, looking forward to that post.

Comment by UnholySmoke on Your Most Valuable Skill · 2009-09-29T15:26:52.267Z · LW · GW

Appears 3 times in my top 10.

That aside, though, I'm now so much better at stopping myself and saying 'hang on, is this really going to work/is this really true/is this really right?' Very, very generic, but certainly something I've noticed in myself.

Comment by UnholySmoke on Privileging the Hypothesis · 2009-09-29T15:20:13.612Z · LW · GW

Surely spontaneous collapse is the garage dragon here. Zero evidence, highly unlikely.

Comment by UnholySmoke on The Anthropic Trilemma · 2009-09-28T12:42:44.750Z · LW · GW

I find myself simultaneously convinced and unconvinced by this! Anticipation (depending, of course, on your definition) is surely a vital tool in any agent that wants to steer the future? Or do you mean 'human anticipation' as differentiated from other kinds? In which case, what demarcates that from whatever an AI would do in thinking about the future?

However, Dai, your top level comment sums up my eventual thoughts on this problem very well. I've been trying for a long time to resign myself to the idea that a notion of discrete personal experience is incompatible with what we know about the world. Doesn't make it any easier though.

My two cents - the answer to this trilemma will come from thinking about the system as a whole rather than personal experience. Can we taboo 'personal experience' and find a less anthropocentric way to think about this?

Comment by UnholySmoke on The Finale of the Ultimate Meta Mega Crossover · 2009-09-25T11:20:55.560Z · LW · GW

Gwern, I refer you to http://xkcd.com/137/

At the risk of violent downvoting, one of the many reference points that jumped into my mind while reading was 'the closest thing I've experienced to jumping between nested levels of reality is on drugs'.

Comment by UnholySmoke on Rationality Quotes - September 2009 · 2009-09-02T13:57:41.371Z · LW · GW

...said Achilles to his friend Mr Tortoise.

Comment by UnholySmoke on Confusion about Newcomb is confusion about counterfactuals · 2009-08-26T22:07:13.234Z · LW · GW

Three identical comments, all beginning 'Two comments'?

Head a splode?

Comment by UnholySmoke on A Rationalist's Bookshelf: The Mind's I (Douglas Hofstadter and Daniel Dennett, 1981) · 2009-08-26T22:04:57.738Z · LW · GW

It is a cracking read, though the quality does dip and dive. No doubt that's just the nature of the beast.

I've read much better treatises on the Chinese Room written since, though - H & D seem to attack it in strange and abstract ways in The Mind's I.

And the GEB sections just made me want to pick that up again....

For the record, I didn't get a huge amount out of I Am A Strange Loop, one of Hofstadter's more recent efforts. A bit too travelogue, a bit too 'voyage of personal discovery', though his style of writing is still striking in its own very particular way. Anyone else have a different experience here?

Comment by UnholySmoke on Working Mantras · 2009-08-25T08:49:48.542Z · LW · GW

Feeling this one. So odd how having a really solid grounding in a subject allows you to work out what later seem to be basic truths.

Comment by UnholySmoke on ESR's New Take on Qualia · 2009-08-24T13:33:15.156Z · LW · GW

Funny how those highly unlikely borderline cases whisk away a lot of the confusion, huh? No committed physicalist can postulate a serious difference between seeing something red and having a virtual red-thing pumped into your optic nerve, I would hope. I think that’s a far more useful scenario than thinking about someone suddenly being able to see colour. In fact, you could probably keep moving a step closer to your magical ‘inner perceiver’ and asking whether ‘it’s a real experience of redness’. That’s not to say that qualia are fundamentally dualistic, simply that it’s idle to discuss the question while we clearly don’t have the capacity to get anywhere near it at this point.

I’m always surprised to see people arguing for or against the existence of ‘qualia’ or similar as though they’re making some sort of contribution to the field. Particularly when it’s such nonsense as ‘if brain science doesn’t come up with an explanation for these thingies, it’s not complete!’