Comments

Comment by hrishimittal on The Power of Reinforcement · 2012-06-22T07:58:22.040Z · LW · GW

What expert timing, Luke! Just two days ago, while reading Kathy Sierra's old blog (http://headrush.typepad.com/creating_passionate_users/2006/03/clicker_trained.html), I came across the fascinating practice of clicker training for horses (http://www.theclickercenter.com).

My only problem is that I need to train my own behaviour rather than someone else's. I'm going to try to use these techniques on myself, although I'm not sure if that's supposed to work.

Comment by hrishimittal on Rationality Quotes - July 2009 · 2009-07-03T15:10:10.975Z · LW · GW

...you have to make a conscious effort to keep your ideas about what you want from being contaminated by what seems possible. This is isomorphic to the principle that you should prevent your beliefs about how things are from being contaminated by how you wish they were. Most people let them mix pretty promiscuously. The continuing popularity of religion is the most visible index of that.

-- Paul Graham

Comment by hrishimittal on Fourth London Rationalist Meeting? · 2009-07-02T18:41:08.090Z · LW · GW

I can come this Sunday.

Comment by hrishimittal on Open Thread: July 2009 · 2009-07-02T15:27:29.171Z · LW · GW

Regarding 2, I think the default setting ("Popular") sorts comments as a function of both karma and time since posting. As comments age, newer comments float to the top even if the older ones have some positive karma. If a comment has very high karma, I guess that outweighs the time factor and it stays at the top.
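For what it's worth, here is a minimal sketch of the kind of karma-vs-age trade-off described above. The formula, the popular_score name and the gravity parameter are all my own assumptions (loosely modelled on Hacker News-style ranking), not the actual Less Wrong algorithm.

```python
import math
import time

def popular_score(karma, posted_at, now=None, gravity=1.8):
    """Hypothetical 'Popular' score: karma discounted by comment age.

    This is not the real Less Wrong formula, just one common way of
    trading karma off against time since posting.
    """
    if now is None:
        now = time.time()
    age_hours = max((now - posted_at) / 3600.0, 0.0)
    # Older comments are divided by a growing penalty, so a newer comment
    # with modest karma can outrank an older one, unless the older
    # comment's karma is very high.
    return karma / math.pow(age_hours + 2.0, gravity)

# Example: a fresh low-karma comment vs two older comments.
now = time.time()
print(popular_score(2, now - 1 * 3600, now))      # new, 2 karma: beats the old 40-karma comment
print(popular_score(40, now - 72 * 3600, now))    # 3 days old, 40 karma: sinks
print(popular_score(1000, now - 72 * 3600, now))  # 3 days old, very high karma: stays on top
```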

Comment by hrishimittal on Fourth London Rationalist Meeting? · 2009-07-02T15:06:39.155Z · LW · GW

OK, I don't mind. Richard, your call?

Comment by hrishimittal on Fourth London Rationalist Meeting? · 2009-07-02T14:46:39.831Z · LW · GW

I will come. By 'usual venue', do you mean the 5th View cafe on top of the Waterstones bookstore near Piccadilly Circus?

Is there something specific we are going to discuss or is it pretty casual?

I would prefer late morning (say after 11).

Comment by hrishimittal on Nonparametric Ethics · 2009-06-22T09:03:23.638Z · LW · GW

This might help - http://sl4.org/wiki/CoherentExtrapolatedVolition

Comment by hrishimittal on Applied Picoeconomics · 2009-06-17T23:04:16.776Z · LW · GW

Thanks, Yvain, you have inspired me to commit to some important things for the next month. I have written them down.

I promise to write about my achievements here on LW on the 18th of July.

Comment by hrishimittal on Rationalists lose when others choose · 2009-06-16T18:38:18.646Z · LW · GW

When giving a security clearance, for example, you would rather give it to someone who loved his country emotionally, than to someone who loved his country rationally;

Can you clarify how you distinguish between loving one's country emotionally as opposed to rationally?

Comment by hrishimittal on Rationality Quotes - June 2009 · 2009-06-16T02:32:51.472Z · LW · GW

It reminds me very much of this quote attributed to Gautam Buddha:

"Believe nothing merely because you have been told it. Do not believe what your teacher tells you merely out of respect for the teacher. But whatsoever, after due examination and analysis, you find to be kind, conducive to the good, the benefit, the welfare of all beings -- that doctrine believe and cling to, and take it as your guide."

Comment by hrishimittal on Intelligence enhancement as existential risk mitigation · 2009-06-16T00:53:29.903Z · LW · GW

Related post and discussion over at OB - http://www.overcomingbias.com/2009/06/lazy-hurt-less-than-stupid.html

Comment by hrishimittal on Intelligence enhancement as existential risk mitigation · 2009-06-15T21:42:34.427Z · LW · GW

It's interesting speculation but it assumes that people use all of their current intelligence. There is still the problem of akrasia - a lot of people are perfectly capable of becoming 'smarter' if only they cared to think about things at all. Sure, they could still go mad infallibly but it would be better than not even trying.

Are you implying that more IQ may help in overcoming akrasia?

Comment by hrishimittal on Readiness Heuristics · 2009-06-15T21:32:52.790Z · LW · GW

Yes that's how I meant it.

Comment by hrishimittal on Readiness Heuristics · 2009-06-15T11:35:18.796Z · LW · GW

The True Trolley Dilemma would be where the child is Eliezer Yudkowsky.

Then what would you do?

EDIT: Sorry if that sounds trollish, but I meant it as a serious question.

Comment by hrishimittal on Why safety is not safe · 2009-06-14T18:22:12.247Z · LW · GW

The bored teenager who finally puts together an AGI in his parents' basement will not have read any of these deep philosophical tracts.

That truly would be a sad day.

Are you seriously suggesting hypothetical AGIs built by bored teenagers in basements are "things which are actually useful in the creation of our successors"?

Is that your plan against intelligence stagnation?

Comment by hrishimittal on Why safety is not safe · 2009-06-14T18:15:28.104Z · LW · GW

You make a lot of big claims in this thread. I'm interested in reading your detailed thoughts on these. Could you please point to some writings?

Comment by hrishimittal on Why safety is not safe · 2009-06-14T17:41:53.856Z · LW · GW

stagnation is as real and immediate a threat as ever there was, vastly dwarfing any hypothetical existential risks from rogue AI.

How is blindly looking for AGI in a vast search space better than stagnation?

How does working on FAI qualify as "stagnation"?

Comment by hrishimittal on Why safety is not safe · 2009-06-14T17:36:54.839Z · LW · GW

I am convinced that resource depletion is likely to lead to social collapse - possibly within our lifetimes.

What convinced you and how convinced are you?

Comment by hrishimittal on London Rationalist Meetups bikeshed painting thread · 2009-06-11T12:59:36.798Z · LW · GW

That looks wicked!

Comment by hrishimittal on The Aumann's agreement theorem game (guess 2/3 of the average) · 2009-06-10T23:01:25.797Z · LW · GW

Or just plain wrong.

Comment by hrishimittal on Expected futility for humans · 2009-06-09T16:15:36.879Z · LW · GW

Surely the point you're making in this long post is not merely that naïve consequentialism is a bad idea?

consider brainstorming for other goals that you might have ignored, and then attach priorities.

And how exactly does one attach priorities?

Comment by hrishimittal on The Aumann's agreement theorem game (guess 2/3 of the average) · 2009-06-09T11:27:05.930Z · LW · GW

Then I don't see the point of the game.

Comment by hrishimittal on The Aumann's agreement theorem game (guess 2/3 of the average) · 2009-06-09T09:41:55.082Z · LW · GW

I don't understand how the average guess will be 0. Can you please explain?

Comment by hrishimittal on London Rationalist Meetups bikeshed painting thread · 2009-06-08T09:57:26.738Z · LW · GW

I've never been actively part of an online community before, so I'm a bit scared to come along. I do find this group interesting though, so I might come to the next meetup.

I don't mind the place as long as it's quiet, but prefer the format to be casual. Except for Tuesday, any day of the week is fine by me.

Comment by hrishimittal on Macroeconomics, The Lucas Critique, Microfoundations, and Modeling in General · 2009-06-07T19:06:50.836Z · LW · GW

and when possible, use irrationality for the short run.

How exactly do you use irrationality?

Comment by hrishimittal on Open Thread: June 2009 · 2009-06-03T18:10:39.742Z · LW · GW

I'm considering donating to World Vision UK. Does anyone know much about them?

More generally, is there an easy way to find out how good a charity is? Are there reviews done by third parties?

Comment by hrishimittal on With whom shall I diavlog? · 2009-06-03T09:20:46.198Z · LW · GW

Eric Drexler.

Comment by hrishimittal on Open Thread: June 2009 · 2009-06-02T17:12:09.227Z · LW · GW

Thanks for the link, Cyan.

Comment by hrishimittal on Open Thread: June 2009 · 2009-06-02T17:06:15.126Z · LW · GW

I'm in a situation which seems sort of the opposite of yours. I'm with a woman who's more rational than any other I personally know. But the sex is just not very good, and I find myself getting physically drawn to other women a bit too much. I've struggled for weeks, trying to decide whether to continue or not. I've tried hard to think about what I really want. And I think that if I were sexually satisfied, I would be very happy with the relationship, because everything else seems perfect. So, I'm trying to work on that now. I'm paying more attention to being a loving and sensuous partner. Let's say I'm experimenting on the weak aspects of my relationship.

If I were in your place, I'd take each point of disagreement on its own merits. For decisions whose results can be seen clearly, I wouldn't argue but just politely point to the results. As far as religious beliefs are concerned, the more I think about it, the more I feel that defining myself as an 'atheist' is only useful in saying that I don't believe in God. Beyond that, it doesn't add anything valuable to my personality. It can't, because it's a negative definition. So, I would try to deal with specific issues rather than try to convince my partner that theism is wrong. If she believes in magic, playful humour might lighten things up a bit.

I also think it would be useful if you learnt more about her way of thinking, just like she has learnt about yours.

Comment by hrishimittal on Taking Occam Seriously · 2009-05-29T18:21:03.550Z · LW · GW

Thanks. That looks like a really interesting body of work. This one on ethics is quite a fun read.

Comment by hrishimittal on Rationality quotes - May 2009 · 2009-05-21T13:18:16.014Z · LW · GW

"Plod forever, but never believe you are going to get there."

-Sir Ranulph Fiennes

EDIT: I found this quote funny and strangely motivational, if you read it within the context. But looks like some people really dislike it.

Comment by hrishimittal on Bad reasons for a rationalist to lose · 2009-05-19T12:35:18.553Z · LW · GW

If the master sat there listening to people's inane theories about how they need to punch differently than everybody else, or their insistence that they really need to understand a complete theory of combat, complete with statistical validation against a control group, before they can even raise a single fist in practice, that master would have failed their students AND their Art.

Even so, as a student, I do want the master to understand a complete theory of combat, complete with statistical validation against a control group.

What is your theory, O Master?

Comment by hrishimittal on Share Your Anti-Akrasia Tricks · 2009-05-19T11:28:11.716Z · LW · GW

Thanks. I'll check it out.

Comment by hrishimittal on "Open-Mindedness" - the video · 2009-05-17T16:38:13.875Z · LW · GW

Can you share the video?

Comment by hrishimittal on Welcome to Less Wrong! · 2009-05-17T13:35:51.177Z · LW · GW

Hi, I'm Hrishi, 26, male. I work in air pollution modelling in London. I'm also doing a part-time PhD.

I am an atheist but come from a very religious family background.

When I was 15, I once cried uncontrollably and asked to see God. If there is indeed such a beautiful supreme being, then why didn't my family want to meet Him? I was told that their faith was weak and only the greatest sages can see God, after a lot of self-inflicted misery. So, I thought: never mind.

I've signed up for cryonics. You should too, or it'll just be 3 of us from LW when we wake up on the other side. I don't mind hogging all the press, but inside me lives a shiny ball of compassion which wants me to share the glory with you.

I wish to live a happy and healthy life.

Comment by hrishimittal on Share Your Anti-Akrasia Tricks · 2009-05-17T13:07:39.708Z · LW · GW

I'm seriously thinking about asking my boss about that one. With a pro-rata decrease in salary, of course.

The extra money just doesn't seem to be worth the constant struggle with myself. Plus I think it would be good to start at a level I'm comfortable with and build on that. By forcing myself to work at a rate I'm clearly incapable of, I'm losing out on all the positive feedback that comes from small successes.

To draw a crude analogy, air pollution modelling is as hard a problem for me as, say, AI is for EY. And if he needed to take every other day off once upon a time,...

EDIT: PS I have been reading OB/LW for a while but have started commenting here only recently. Hello everyone!

Comment by hrishimittal on "What Is Wrong With Our Thoughts" · 2009-05-17T12:36:09.036Z · LW · GW

Genetic engineering aside, given a large aggregation of human beings, and a long time, you cannot reasonably expect rational thought to win. You could as reasonably expect a thousand unbiased dice, all tossed at once, all to come down 'five,' say. There are simply far too many ways, and easy ways, in which human thought can go wrong. Or, put it the other way round: anthropocentrism cannot lose.

That's the same argument against rationalist winning that has been seen many times on LW. However, it is based on hopelessness and fear, rather than on knowledge of even a single failure of an organised attempt at large-scale rational winning. So, while Stove recognises the obviously wrong thoughts of philosophers, he himself goes wrong in thinking the above by making a wrong probability estimate.

So just to be clear, we are saying that the probability of a significant number of people turning to rational thinking is greater than the probability of winning a lottery, right?

Comment by hrishimittal on Share Your Anti-Akrasia Tricks · 2009-05-15T22:00:17.141Z · LW · GW

Thanks, Alicorn. This sounds like a brilliant idea. I have been thinking of something along these lines but hadn't quite thought of day chunks - it makes a lot of sense to me too.

I'll give it a try. And yes, I'll be careful.

Comment by hrishimittal on Religion, Mystery, and Warm, Soft Fuzzies · 2009-05-15T17:16:59.816Z · LW · GW

We value rationality first and foremost because if you take the long view it wins and in the world we populate it wins.

You seem to be making an argument both for and against our cause in the same breath.

The reason irrationality "wins" for the "many people" you mention is that they re-define winning in hindsight when things don't work out.

We are challenging those social systems, which are unaccountable and only provide mysterious explanations when they fail. We aspire to build more robust systems. That's what I think winning is.

I imagine you feel bad for all the religious people being left out, but that's only because of their large numbers. No one feels bad for string theorists. A large following doesn't make religion right. Lots of stupidity is not intelligence.

What I'm basically getting at is that the tendency to emphasize the latter distinction can cause one to undervalue dissimilarity in the human social world.

The point of emphasising this distinction is to put the value of human intelligence on the right order.

And if your main point is recognising that bad or irrational decisions may be a result of variability in intelligence or its use, then religion only functions to hide that truth. We are at least admitting it and saying it's not fair.

Denial is not a path to improvement.

Comment by hrishimittal on Religion, Mystery, and Warm, Soft Fuzzies · 2009-05-15T16:43:32.346Z · LW · GW

I just got off the phone with my mom.

Mom: You're working hard on your PhD, aren't you?

Me: Yes, Ma, there's lots to do. Oh, and I put in a paper for a conference. If it gets accepted, I'll go to America to present it.

Mom: Of course it will get accepted. You're working so hard, won't God listen to you?

Everything comes from God. Forget making amazing awe-inspiring monuments. Writing a paper on air pollution in London comes from Him. Getting to go to a conference comes from Him.

My mom can't truly appreciate what I do. Because fundamentally, at the gut level, she can't get that I can accomplish anything. It's arrogant for me to even think I could do anything without Lord Krishna's supreme flute-inspired magic.

Now that's a problem I want to solve.

Comment by hrishimittal on Outward Change Drives Inward Change · 2009-05-15T16:25:45.549Z · LW · GW

I agree. Go Vladimir!

Comment by hrishimittal on Outward Change Drives Inward Change · 2009-05-15T16:19:15.623Z · LW · GW

I'm confused by that example.

Let's say by 'increased attractiveness' you mean he started talking more attractively; then that is an outward change, but the question is whether it was brought about by an inward change.

If the change happened without him thinking about it, and only because of his surroundings as a seaman - which is the point of your post - then it's surely not an inward change.

But if he changed upon reflecting on his experience at sea and consciously changing his behaviour, then your robot analogy breaks.

Help!

Comment by hrishimittal on Outward Change Drives Inward Change · 2009-05-15T15:57:43.813Z · LW · GW

Wow, that's amazing, Vladimir - well done. The obvious next question is... how did you do it? Please give an example of at least one of your tricks, if possible.

Comment by hrishimittal on A Parable On Obsolete Ideologies · 2009-05-15T15:13:52.959Z · LW · GW

there's no long-term benefit associated with its removal.

The first step in solving a problem is to recognise it. If I discovered I had cancer, I would be immensely demoralised, but I'd prefer that, and a shot at recovering, to dying unknowingly.

Denial is not a path to improvement.

Comment by hrishimittal on Outward Change Drives Inward Change · 2009-05-15T14:56:05.772Z · LW · GW

Perhaps the title should be 'Outward change obviates inward change'?

Comment by hrishimittal on Religion, Mystery, and Warm, Soft Fuzzies · 2009-05-15T14:46:31.891Z · LW · GW

I didn't say it would be difficult for a religious person to come up with that idea. But if a religious person did come up with it, what does that have to do with their religion?

Comment by hrishimittal on Religion, Mystery, and Warm, Soft Fuzzies · 2009-05-15T10:19:05.174Z · LW · GW

there's a warm fuzziness to life that science just doesn't seem to get

Not true. Science helps create new warm fuzzies, whereas religion has been re-using the same old ones for millennia. The problem with religion is not that it lets people have warm fuzzies but that it provides false explanations.

For example, the building in Ireland that is discussed in the first BHTV episode: I imagine the warm fuzzies one gets on visiting that place are to do with the atmosphere that has been created, that rare experience of the sunlight breaking through carefully crafted openings in dark walls. It must be beautiful because it's scarce in both time and space. That's why it works. No one needs to know that to enjoy it. But here's the problem: religion's claim is that it's only by believing in God that such a beautiful thing has been possible. Which is not true. It has been made possible through people's imagination, engineering and hard work.

The point is that with religion, it's easy to forget that more is possible.

For example, imagine this future: one group of people builds a beautiful monument for another group of people as a gift. Most people in the second group would enjoy the sheer beauty of it, while some curious others could get extra warm fuzzies by figuring out how the first group made it.

certain religious stories and artwork may be of artistic value.

Yes, they certainly are. But I imagine a future where religious stories and art will pale in comparison to the ones people create without resorting to harmful lies.

His point seems to be that rationality isn't the only way to experience the world, which is absolutely, 100% right.

But it's the one that wins. And people do want to win.

You don't experience the world through rationality. Appreciating art, or food, or sex, or life is not generally done by applying rationality.

Right. It's done through intelligence - that's why rats don't paint. Remember EY's intelligence scale? The distinction is not between the village idiot and Einstein. It's between amoebas, chimps, humans and higher intelligences.

And this I think is the biggest problem and it has been mentioned before.

Right now, individual rationality is bounded by individual intelligence. When someone needs to make a rational decision which is too much work for their intelligence, or even beyond it, they give up. It hurts their egos to think they can't make the right decision. They start rationalising: "It's not really necessary to always make rational choices." "All this rationality business is for those super clever nerdy types." And then they make bad decisions.

I wonder if over time a chemical structure has evolved in the brain which does this.

Hard problem -> Computational limit -> Rationalising -> Wrong answer.

Comment by hrishimittal on Willpower Hax #487: Execute by Default · 2009-05-13T10:59:28.334Z · LW · GW

I get up most easily when I've slept enough. If I get 8 hours of sleep, I don't even have to try getting up. I feel refreshed and am happy to get up. I'm not sure if the number of hours is 8, but from memory it seems to be around that much.

Does anyone else have the same experience?

Comment by hrishimittal on Rationality in the Media: Don't (New Yorker, May 2009) · 2009-05-12T15:47:56.550Z · LW · GW

What do you mean by 'practice self-distraction'? Can you give an example?

Comment by hrishimittal on No One Knows Stuff · 2009-05-12T11:16:16.346Z · LW · GW

Thanks, conchis.