Comments

Comment by Insert_Idionym_Here on Entangled Truths, Contagious Lies · 2013-07-03T06:25:36.174Z · LW · GW

I think you enormously overstate the difficulty of lying well, as well as the advantages of honesty.

Comment by Insert_Idionym_Here on Post ridiculous munchkin ideas! · 2013-05-20T03:29:25.091Z · LW · GW

I've already done this to myself -- it lowers your self-esteem enormously.

Comment by Insert_Idionym_Here on Post ridiculous munchkin ideas! · 2013-05-20T03:17:13.296Z · LW · GW

I used to do exactly this, but I created whole backstories and personalities for my "hats" so that they would be more realistic to other people.

Comment by Insert_Idionym_Here on Rationality Quotes May 2013 · 2013-05-10T00:00:12.676Z · LW · GW

It might be more accurate to say that pretty much everything, including what we call biology and physics -- humans are the ones codifying it -- is memetically selected to be learnable by humans. Not that it all develops towards being easier to learn.

Comment by Insert_Idionym_Here on Firewalling the Optimal from the Rational · 2012-10-17T01:10:59.226Z · LW · GW

May I ask how many people any of you have seen walking around entirely barefoot, as opposed to wearing minimalist footwear of any kind?

Comment by Insert_Idionym_Here on Hold Off On Proposing Solutions · 2012-09-11T05:08:44.824Z · LW · GW

To be perfectly honest, at the time I simply planted my face on the table in front of me a few times. I was at a dinner party with friends of my mother's; I would have sounded extremely condescending otherwise.

Comment by Insert_Idionym_Here on Expecting Short Inferential Distances · 2012-09-11T05:04:00.243Z · LW · GW

That is what happened to me.

Comment by Insert_Idionym_Here on Expecting Short Inferential Distances · 2012-09-11T04:58:50.548Z · LW · GW

The lack of this knowledge got me a nice big "most condescending statement of the day award" in lab a year ago.

Comment by Insert_Idionym_Here on Hold Off On Proposing Solutions · 2012-09-10T22:05:41.006Z · LW · GW

I have tried using this in more casual decision-making situations, and the response I get is nearly always something along the lines of "Okay, just let me propose this one solution, we won't get attached to it or anything, just hear me out..."

Comment by Insert_Idionym_Here on The Power of Reinforcement · 2012-08-12T17:45:23.254Z · LW · GW

One could attempt to fight that by reducing the number or frequency of M&Ms eaten over a long period of time, essentially weaning oneself off of extrinsic rewards.

Comment by Insert_Idionym_Here on Mandatory Secret Identities · 2012-08-12T03:42:32.678Z · LW · GW

I agree. I think that failure mode might then be better avoided by restricting the possible "somethings", as opposed to adding another requirement onto one's reasons for wanting to be rational.

Comment by Insert_Idionym_Here on Mandatory Secret Identities · 2012-08-11T23:02:22.486Z · LW · GW

If you have "something to protect", if your desire to be rational is driven by something outside of itself, what is the point of having a secret identity? If each student has that something, each student has a reason to learn to be rational -- outside of having their own rationality dojo someday -- and we manage to dodge that particular failure mode. Is having a secret identity a particular way we could guarantee that each rationality instructor has "something to protect"?

Comment by Insert_Idionym_Here on Terminal Bias · 2012-01-31T03:48:33.810Z · LW · GW

But don't you want to understand the underlying principles?

Comment by Insert_Idionym_Here on Archimedes's Chronophone · 2012-01-30T21:49:09.488Z · LW · GW

It seems that in order to get Archimedes to make a discovery that won't be widely accepted for hundreds of years, you yourself have to make a discovery that won't be widely accepted for hundreds of years; you have to be just as far in the dark as you want Archimedes to be. So talking about plant rights would probably produce something useful on the other end, but only if what you say is honestly new and difficult to think about. If I wanted Archimedes to discover Bayes' theorem, I would need to put someone on the line who is doing mathematics that is hundreds of years ahead of their time, and hope they have a breakthrough.

Comment by Insert_Idionym_Here on Zen and the Art of Rationality · 2012-01-20T08:51:23.596Z · LW · GW

I applaud your fourth paragraph.

Comment by Insert_Idionym_Here on New Improved Lottery · 2012-01-20T08:39:58.340Z · LW · GW

I think that perhaps you may be missing the point.

Comment by Insert_Idionym_Here on The Spotlight · 2012-01-17T07:26:16.405Z · LW · GW

I'm thinking about why I care about why I care about what I'm thinking, and I'm realizing that I have other things that I need to do, and that realization is not helping me get past this moment.

Comment by Insert_Idionym_Here on Well-Kept Gardens Die By Pacifism · 2012-01-15T09:03:38.637Z · LW · GW

One: I support the above post. I've seen quite a few communities die for that very reason.

Two: Gurren Lagann? (pause) Gurren Lagann? Who the h*ll do you think I am?

Comment by Insert_Idionym_Here on Where are we? · 2012-01-15T04:32:31.933Z · LW · GW

I used to live in Ann Arbor, rather recently. I live in Saginaw now.

Comment by Insert_Idionym_Here on Can the Chain Still Hold You? · 2012-01-13T19:01:55.889Z · LW · GW

I believe the point is that we do not know how much more is possible, or what circumstances make that so. As such, we must check, as often as we can, to make absolutely sure that we are still held by our chains.

Comment by Insert_Idionym_Here on Welcome to Less Wrong! · 2011-12-22T22:38:44.972Z · LW · GW

All of the above.

Comment by Insert_Idionym_Here on Welcome to Less Wrong! · 2011-12-20T17:04:56.211Z · LW · GW

Feet are for standing, not hands, but that doesn't keep us from admiring the gymnast.

Comment by Insert_Idionym_Here on Welcome to Less Wrong! · 2011-12-20T07:26:34.133Z · LW · GW

Ah, I see. I just don't think that cryonics significantly improves the chances of actually extending one's life span, which would be similar to saying that democracy is not significantly better than most other political systems.

Comment by Insert_Idionym_Here on Welcome to Less Wrong! · 2011-12-20T06:30:33.975Z · LW · GW

Are you saying that cryonics is not perfect, but it is the best alternative?

Comment by Insert_Idionym_Here on Welcome to Less Wrong! · 2011-12-20T06:25:36.404Z · LW · GW

I'm not sure I understand your point. I'll read your link a few more times, just to see if I'm missing something, but I don't quite get it now.

Comment by Insert_Idionym_Here on Newcomb's Problem and Regret of Rationality · 2011-12-20T06:23:20.714Z · LW · GW

Ah. Wrong referent. It's hilarious for me, and it may, at some point, be hilarious for them. But it's mostly funny for me. That would be why I took time to mention that it was also, in fact, asinine.

Comment by Insert_Idionym_Here on Welcome to Less Wrong! · 2011-12-20T06:06:31.271Z · LW · GW

I think cryonics is a terrible idea, not because I don't want to preserve my brain until the tech required to recreate it digitally or physically is present, but because I don't think cryonics will do the job well. Cremation does the job very, very badly, like trying to preserve data on a hard drive by melting it down with thermite.

Comment by Insert_Idionym_Here on Welcome to Less Wrong! · 2011-12-20T06:00:19.985Z · LW · GW

Oh, hello. I've posted a couple of times, in a couple of places, and those of you who have spoken with me probably know that I am one: a novice, and two: a bit of a jerk.

I'm trying to work on that last one.

I think cryonics, in its current form, is a terrible idea; I am a (future) mathematician; and I am otherwise divergent from the dominant paradigm here. But I think the rest of that is for me to know, and you to find out.

Comment by Insert_Idionym_Here on Welcome to Less Wrong! · 2011-12-20T05:46:41.160Z · LW · GW

Bugmaster, I call down hurricanes every day. It never gets boring. Meteorites are a little harder, but I do those on occasion. They aren't quite as fun.

But the angry frogs?

The angry frogs?

Those don't leave a shattered wasteland behind, so you can just terrorize people over and over again with those. Just wonderful.

Note: All of the above is complete bull-honkey. I want this to be absolutely clear. 100%, fertilizer-grade, bull-honkey.

Comment by Insert_Idionym_Here on Newcomb's Problem and Regret of Rationality · 2011-12-20T05:35:15.358Z · LW · GW

That's alright. My humor, in real life, is based entirely on the fact that only I know I'm joking at the time, and the other person won't realize it until three days later, when they spontaneously start laughing for no reason they can safely explain. Is that asinine? Yes. Is it hilarious? Hell, yes. So I apologize. I'll try not to do that.

Comment by Insert_Idionym_Here on Newcomb's Problem and Regret of Rationality · 2011-12-20T05:16:23.273Z · LW · GW

I am being somewhat ... absurd, and on purpose, at that. But I have enough arrogance lying around in my brain to believe that I can trick the super-intelligence.

Comment by Insert_Idionym_Here on Newcomb's Problem and Regret of Rationality · 2011-12-20T05:00:32.377Z · LW · GW

You aren't doublethinking hard enough, then.

Comment by Insert_Idionym_Here on Newcomb's Problem and Regret of Rationality · 2011-12-20T04:42:56.975Z · LW · GW

Because the million is already there, along with the thousand. Why not get all of it?

Comment by Insert_Idionym_Here on Newcomb's Problem and Regret of Rationality · 2011-12-20T04:28:40.306Z · LW · GW

I think it is important to make a distinction between what our choice is now, while we are here, sitting at a computer screen, unconfronted by Omega, and our choice when actually confronted by Omega. When actually confronted by Omega, your choice has been determined. Take both boxes, take all the money. Right now, sitting in your comfy chair? Take the million-dollar box. In the comfy chair, the counterfactual nature of the experiment basically gives you an Outcome Pump. So take the million-dollar box, because if you take the million-dollar box, it's full of a million dollars. But when it actually happens, the situation is different. You aren't in your comfy chair anymore.
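For what it's worth, here is a minimal sketch of the expected-value arithmetic behind the comfy-chair answer. The dollar amounts are the standard ones from the problem statement; the predictor-accuracy parameter and the function names are my own additions for illustration, not anything from the post.

```python
# Hypothetical sketch: expected payoff under a predictor of given accuracy.
# $1,000 sits in the transparent box; $1,000,000 sits in the opaque box
# only if Omega predicted one-boxing.
def expected_payoff(strategy: str, predictor_accuracy: float) -> float:
    p = predictor_accuracy
    if strategy == "one-box":
        # With probability p, Omega predicted one-boxing and filled the opaque box.
        return p * 1_000_000
    # With probability p, Omega predicted two-boxing and left the opaque box empty.
    return p * 1_000 + (1 - p) * 1_001_000

for accuracy in (0.5, 0.9, 0.99):
    print(accuracy,
          expected_payoff("one-box", accuracy),
          expected_payoff("two-box", accuracy))
```

At 50% accuracy two-boxing comes out slightly ahead, but as the predictor approaches the reliability Omega is stipulated to have, one-boxing dominates -- which is the "take the million-dollar box" policy argued for above.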

Comment by Insert_Idionym_Here on Quantum Non-Realism · 2011-12-17T19:06:05.642Z · LW · GW

How would reality go about being not normal? Or more specifically, what is normal, if not reality?

Comment by Insert_Idionym_Here on Feynman Paths · 2011-12-17T05:44:49.697Z · LW · GW

Thank you very much.

Comment by Insert_Idionym_Here on Feynman Paths · 2011-12-16T23:51:01.230Z · LW · GW

Okay, so where did those arrows come from? I see how the second graph from the top corresponds to the amount of time a particle (were particles to exist, and could they bounce) would take to bounce off a specific point on the mirror. But how does one pull the arrows out of that graph?
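In case a worked toy example helps anyone stuck on the same step: my rough understanding (from Feynman's QED, not from this post) is that each candidate bounce point gets a unit-length arrow whose angle is set by an imaginary stopwatch turning at the light's frequency for as long as that path takes, and the arrows are then added head to tail. The path times and frequency below are made-up illustrative numbers, not values read off the graph in question.

```python
import numpy as np

# Hypothetical travel times (seconds) for light bouncing off different points
# of the mirror; in the post these would come from the time-vs-position graph.
path_times = np.array([1.00e-15, 0.98e-15, 0.97e-15, 0.98e-15, 1.00e-15])
frequency = 5e14  # roughly visible light, in Hz (illustrative)

# Each path's "arrow": a unit complex number whose angle is the stopwatch angle.
arrows = np.exp(2j * np.pi * frequency * path_times)

total = arrows.sum()            # add the arrows head to tail
probability = abs(total) ** 2   # squared length of the final arrow
print(total, probability)
```

Paths near the middle, where the travel time barely changes from point to point, contribute arrows pointing in nearly the same direction, so they dominate the sum; that is why the reflection looks like it comes from the classical bounce point.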

Comment by Insert_Idionym_Here on Configurations and Amplitude · 2011-12-16T23:46:34.631Z · LW · GW

I... Er... What. Where did the whole 'amplitude' thing come from? I mean, it looks a lot like they are vectors in the complex plane, but why are they two dimensional? Why not three? Or one? I just don't get the idea of what amplitude is supposed to describe.
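In case a toy example is useful to anyone with the same confusion, here is my own illustration (standard quantum mechanics, nothing specific to this post): each configuration gets a complex number, the probability of observing it is that number's squared magnitude, and the point of the second dimension is that contributions with different phases can cancel each other.

```python
# Toy illustration: two contributions to the same configuration (made-up values).
a1 = 0.5 + 0.5j        # amplitude contributed by one path
a2 = -0.5 - 0.5j       # amplitude contributed by another path, opposite phase

total = a1 + a2                  # amplitudes add
print(abs(total) ** 2)           # 0.0 -- the contributions cancel (destructive interference)

a3 = 0.5 + 0.5j                  # same phase as a1 instead
print(abs(a1 + a3) ** 2)         # 2.0 -- the contributions reinforce (constructive interference)
```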

Comment by Insert_Idionym_Here on Torture vs. Dust Specks · 2011-12-16T22:54:28.484Z · LW · GW

Thank you.

Comment by Insert_Idionym_Here on Torture vs. Dust Specks · 2011-12-16T22:45:15.497Z · LW · GW

I believe I suggested earlier that I don't know what moral theory I hold, because I am not sure of the terminology. So I may, in fact, be a utilitarian, and not know it, because I have not the vocabulary to say so. I asked "At what point is utilitarianism not completely arbitrary?" because I wanted to know more about utilitarianism. That's all.

Comment by Insert_Idionym_Here on Torture vs. Dust Specks · 2011-12-16T22:31:33.570Z · LW · GW

At what point is utilitarianism not completely arbitrary?

Comment by Insert_Idionym_Here on Torture vs. Dust Specks · 2011-12-16T22:21:55.011Z · LW · GW

No one asked for a general explanation.

The best term I have found, the one that seems to describe the way I evaluate situations the most accurately, is consequentialism. However, that may still be inaccurate. I don't have a fully reliable way to determine what consequentialism entails; all I have is Wikipedia, at the moment.

I tend to just use cost-benefit analysis. I also have a mental, and quite arbitrary, scale of what things I do and don't value, and to what degree, to avoid situations where I am presented with multiple, equally beneficial choices. I also have a few heuristics. One of them essentially says that given a choice between a loss that is spread out amongst many, and an equal loss divided amongst the few, the former is the more moral choice. Does that help?
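To make that last heuristic concrete, here is one possible formalization (my own toy model, not something I literally compute): if the harm to each person grows faster than linearly in that person's share of the loss, then spreading a fixed total loss over more people yields less total harm.

```python
# Toy formalization of the "spread the loss" heuristic (illustrative assumption:
# per-person harm grows quadratically with that person's share of the loss).
def harm(loss_per_person: float) -> float:
    return loss_per_person ** 2

def total_harm(total_loss: float, num_people: int) -> float:
    return num_people * harm(total_loss / num_people)

print(total_harm(100, 2))    # 5000.0 -- the loss concentrated on two people
print(total_harm(100, 100))  # 100.0  -- the same loss spread across a hundred people
```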

Comment by Insert_Idionym_Here on Torture vs. Dust Specks · 2011-12-16T22:10:15.873Z · LW · GW

I don't agree. The existence of 3^^^3 people, or 3^^^3 dust specks, is impossible because there isn't enough matter, as you said. The existence of an event that has only effects that are tailored to fit a particular person's idea of 'bad' does not fit my model of how causality works. That seems like a worse infraction, to me.
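For concreteness, here is the standard Knuth up-arrow arithmetic behind that impossibility claim (ordinary notation, nothing specific to this thread):

```latex
\begin{align*}
3\uparrow 3 &= 3^{3} = 27 \\
3\uparrow\uparrow 3 &= 3^{3^{3}} = 3^{27} = 7{,}625{,}597{,}484{,}987 \\
3\uparrow\uparrow\uparrow 3 &= 3\uparrow\uparrow(3\uparrow\uparrow 3)
  = \underbrace{3^{3^{\cdot^{\cdot^{3}}}}}_{7{,}625{,}597{,}484{,}987\ \text{threes}}
\end{align*}
```

That last number is a power tower of threes about 7.6 trillion levels tall, which dwarfs the roughly $10^{80}$ atoms in the observable universe; hence "there isn't enough matter."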

However, all of that is irrelevant, because I answered the more "interesting question" in the comment you quoted. To be blunt, why are we still talking about this?

Comment by Insert_Idionym_Here on Torture vs. Dust Specks · 2011-12-16T21:56:26.747Z · LW · GW

Yes. I believe that because any suffering caused by the 3^^^3 dust specks is spread across 3^^^3 people, it is a lesser evil than torturing a man for 50 years, assuming there are no side effects to the dust specks.

Comment by Insert_Idionym_Here on Torture vs. Dust Specks · 2011-12-16T21:51:13.239Z · LW · GW

That is in no way what was said. Also, the idea of an event that somehow manages to have no effect aside from being bad is... insanely contrived. More contrived than the dilemma itself.

However, let's say that instead of 3^^^3 people getting dust in their eye, 3^^^3 people experience a single nano-second of despair, which is immediately erased from their memory to prevent any psychological damage. If I had a choice between that and torturing a person for 50 years, then I would probably choose the former.

Comment by Insert_Idionym_Here on Torture vs. Dust Specks · 2011-12-16T21:23:49.733Z · LW · GW

No, I'm pretty sure it makes you notice. It's "enough" -- "barely enough", but still "enough". However, that doesn't seem to be what's really important. If I consider you to be correct in your interpretation of the dilemma, in that there are no other side effects, then yes, the 3^^^3 people getting dust in their eyes is a much better choice.

Comment by Insert_Idionym_Here on Zombies! Zombies? · 2011-12-16T19:42:18.088Z · LW · GW

Better late than never.

Comment by Insert_Idionym_Here on Zombies! Zombies? · 2011-12-16T18:29:59.861Z · LW · GW

You haven't said anything. Make a relevant point.

Comment by Insert_Idionym_Here on Bayesian Flame · 2011-12-03T02:33:44.837Z · LW · GW

... What is it that frequentists do, again? I'm a little out of touch.

Comment by Insert_Idionym_Here on 2011 Less Wrong Census / Survey · 2011-11-07T21:50:12.358Z · LW · GW

I missed Newton by over 150 years. Pray for a curve.