Posts

Comments

Comment by kurige on A Less Wrong singularity article? · 2009-11-18T06:19:09.900Z · LW · GW

1) You can summarize arguments voiced by EY.
2) You cannot write a book that will be published under EY's name.
3) Writing a book takes a great deal of time and effort.

You're reading into connotation a bit too much.

Comment by kurige on Computer bugs and evolution · 2009-10-27T12:27:39.947Z · LW · GW

10,000-line Perl program.

Ouch.

It's nice to see some programming related content on LW. Thanks.

Comment by kurige on Deciding on our rationality focus · 2009-07-22T08:42:10.693Z · LW · GW

I would prefer a variation of bullet point number 3:

  • Allow i-rationality discussion, but promote only when it is an application of a related, tightly coupled e-rationality concept.

I am here for e-rationality discussion. It's "cool" to know that deodorant is most effective when applied at night, before I go to bed, but that doesn't do anything to fundamentally change the way I think.

Comment by kurige on The Strangest Thing An AI Could Tell You · 2009-07-17T06:01:19.464Z · LW · GW

There is a soul. It resides in the appendix. Anybody who has undergone an appendectomy is effectively a p-zombie.

Comment by kurige on How to come up with verbal probabilities · 2009-04-29T17:28:47.588Z · LW · GW

Thanks for the examples of how to apply OB/LW techniques to everyday life.

Definitely more articles in this vein would be greatly appreciated.

Comment by kurige on Less Meta · 2009-04-26T10:01:55.113Z · LW · GW

Just to avoid confusing Nominull... This post has now been "promoted", so it does now appear on the front-page, and in RSS feeds.

Comment by kurige on Escaping Your Past · 2009-04-23T03:17:11.525Z · LW · GW

Epistemic rationality alone might be well enough for those of us who simply love truth (who love truthseeking, I mean; the truth itself is usually an abomination)

What motivation is there to seek out an abomination? I read the linked comment and I disagree strongly... The curious, persistent rationalist should find the truth-seeking process rewarding, but shouldn't it be rewarding because you're working toward something wonderful? Worded another way: of what value is truth-seeking if you hold the very object you seek in contempt?

If you take the strictly classical, rational view of the world then you lose the ability to say that truth is "beautiful". Not a great loss, considering "beauty" is an ill-defined, subjective term - but if you continue to cut everything out of your life that has no rational value then you very quickly become a pseudo-Vulcan.

Truth, at the highest level, has an irrational, indefinable quality. It's this quality that makes it seductive, worthwhile, valuable, desirable. Truth is something you grok. Heinlein was a loony, but I do thank him for that word.

but some of my friends tell me there should be some sort of payoff for all this work of inference. And indeed, there should be: if you know how something works, you might be able to make it work better. Enter epistemic rationality, the art of doing better. We all want to do better, and we all believe that we can do better...

I like to think that I seek truth. Others are here to "win" or "be better". Maybe we're all talking about the same thing. Maybe not.

This comment is a bit off-topic from the rest of the post, and quickly becoming dangerously Zen, but I would much appreciate it if somebody more knowledgeable on the subject could offer some disambiguation either here or in a separate post.

Comment by kurige on Proposal: Use the Wiki for Concepts · 2009-04-22T09:50:43.030Z · LW · GW

Eliezer, I don't know if you're familiar with the CIA's Intellipedia, but you seem to have hit the nail on the head.

The CIA has had huge success doing exactly what you describe here. You can read more about it in the paper here. The basic idea is that the intelligence community should harness the synergy of the blog/wiki combo.

From the paper:

The Wiki and the Blog are complementary companion technologies that together form the core workspace that will allow intelligence officers to share, innovate, adapt, respond, and be—on occasion—brilliant. Blogs will cite Wiki entries. The occasional brilliant blog comment will shape the Wiki. The Blog will be vibrant, and make many sea changes in real-time. The Wiki, as it matures, will serve as corporate knowledge and will not be as fickle as the Blog. The Wiki will be authoritative in nature, while the Blog will be highly agile. The Blog is personal and opinionated. The Wiki is agreed-upon and corporate.

Comment by kurige on Well-Kept Gardens Die By Pacifism · 2009-04-21T05:14:02.481Z · LW · GW

The karma system is an integral part of the Reddit codebase that this site is built on top of. It's designed to do one thing - increase the visibility of good content - and it does that one thing very well.

I agree, though, that there is untapped potential in the karma system. Personally I would love to see - if not by whom - at least when my comments are up/down voted.

Comment by kurige on Individual Rationality Is a Matter of Life and Death · 2009-03-23T06:29:16.957Z · LW · GW

Also, for it to be an unbiased comparison, the two statements, "smart cars for all" and "cryopreservation for only the people who actually died that year", should be limited to the same domain.

If you compare different sets, one substantially larger than the other, then of course cryo is going to be cheaper!

A more balanced statement would be: "buying smart cars to save the lives of only the people who would have otherwise died by car accident in any given year would probably cost less than cryo-surance for the same set of people."

Plus you don't die. Which, for me, is preferable.
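The set-mismatch point can be made concrete with a quick sketch. Every figure below is hypothetical, chosen purely for illustration; none of these numbers come from the thread:

```python
# All figures are invented for illustration; none appear in the discussion.
drivers = 200_000_000          # everyone who would need a smart car
annual_crash_deaths = 40_000   # only the people who actually die in crashes
smart_car_cost = 20_000        # hypothetical per-person cost
cryo_cost = 80_000             # hypothetical per-person cost

# Biased comparison: cars for *all* drivers vs. cryo for the dead only.
cars_for_everyone = drivers * smart_car_cost
cryo_for_the_dead = annual_crash_deaths * cryo_cost
print(cars_for_everyone > cryo_for_the_dead)   # True: cryo looks cheap

# Same-set comparison, as the comment proposes: both limited to the
# people who would otherwise have died.
cars_for_the_dead = annual_crash_deaths * smart_car_cost
print(cars_for_the_dead < cryo_for_the_dead)   # True: the ordering flips
```

Whatever the real per-person costs, comparing a set of two hundred million people against a set of forty thousand guarantees the first total dwarfs the second; restricting both sides to the same set is what makes the comparison meaningful.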

Comment by kurige on Cached Selves · 2009-03-23T00:08:49.824Z · LW · GW

Great post.

Here's some additional reading that supports your argument:

Distract yourself. You're more honest about your actions when you can't exert the mental energies necessary to rationalize your actions.

And the (subconscious) desire to avoid appearing hypocritical is a huge motivator.

I've noticed this in myself often. I faithfully watched LOST through the third season, explaining to my friends who had lost interest around the first season that it was, in fact, an awesome show. And then I realized it kind of sucked.

Comment by kurige on You're Calling *Who* A Cult Leader? · 2009-03-22T20:33:10.698Z · LW · GW

Picture of Eliezer in monk's robes (that is you, right?), stories about Freemason-esque rituals, specific vocabulary with terms like "the Bayesian Conspiracy".

It's all tongue in cheek, and I enjoy it. But if you're trying to not look like a cult, then you're doing it wrong.

Comment by kurige on Individual Rationality Is a Matter of Life and Death · 2009-03-21T22:44:27.346Z · LW · GW

In the modern world, karate is unlikely to save your life. But rationality can.

The term "Bayesian black belt" has been thrown around a number of times on OB and LW... this, in my mind, seems misleading. As far as I can tell there are two ways in which Bayesian reasoning can be applied directly: introspection and academia. Within those domains, sure, the metaphor makes sense... in meatspace life-and-death situations? Not so much.

"Being rational" doesn't develop your quick-twitch muscle fibers or give you a sixth sense.

Perhaps, where you live, you are never in danger of being physically accosted. If so, you are in the minority. Rationality may help you avoid such situations, but never with a 100% success rate. When you do find yourself in such a situation, you may find yourself wishing you'd studied up on a little Tae Kwon Do.

On at least two occasions - one only a year past - my life was at serious risk because I was not thinking clearly. ... As a gambler I don't like counting on luck, and I'd much rather be rational enough to avoid serious mistakes.

Can you give an example of how being "more rational" could have avoided the accidents?

Of course, properly applying rational techniques will bleed over into all areas of your life. Having a more accurate map of the territory means that you will make better decisions. The vast majority of these decisions, however, can be written off as common sense. Just because I drink coffee when I drive at night to stay alert doesn't make me a master of the "martial art of rationality".

Comment by kurige on Counterfactual Mugging · 2009-03-19T11:08:10.502Z · LW · GW

Thank you. Now I grok.

So, if this scenario is logically inconsistent for all values of 'me' then there really is nothing that I can learn about 'me' from this problem. I wish I hadn't thought about it so hard.

Comment by kurige on Counterfactual Mugging · 2009-03-19T10:34:18.539Z · LW · GW

That's not the situation in question. The scenario laid out by Vladimir_Nesov does not allow for an equal probability of getting $10000 and paying $100. Omega has already flipped the coin, and it's already been decided that I'm on the "losing" side. Join that with the fact that me giving $100 now does not increase the chance of me getting $10000 in the future because there is no repetition.

Perhaps there's something fundamental I'm missing here, but the linearity of events seems pretty clear. If Omega really did calculate that I would give him the $100 then either he miscalculated, or this situation cannot actually occur.

-- EDIT --

There is a third possibility after reading Cameron's reply... If Omega is correct and honest, then I am indeed going to give up the money.

But it's a bit of a trick question, isn't it? I'm going to give up the money because Omega says I'm going to give up the money and everything Omega says is gospel truth. However, if Omega hadn't said that I would give up the money, then I wouldn't have given up the money. Which makes this a bit of an impossible situation.

Assuming the existence of Omega, his intelligence, and his honesty, this scenario is an impossibility.
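For reference, the arithmetic behind why a pre-flip commitment to pay looks attractive can be sketched with the thread's stakes; the policy is evaluated before the coin is flipped, which is the crux of the disagreement:

```python
# Stakes from the scenario: Omega pays $10,000 on heads, but only if it
# predicts you would hand over $100 on tails.
p_heads = 0.5
prize, cost = 10_000, 100

# Policy "pay when asked", evaluated before the flip:
ev_pay = p_heads * prize - (1 - p_heads) * cost

# Policy "refuse": Omega predicts the refusal, so heads pays nothing.
ev_refuse = 0.0

print(ev_pay)  # 4950.0
```

The objection in this comment is precisely that by the time you are asked, the flip has already happened, so the pre-flip expectation no longer seems to bind.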

Comment by kurige on Counterfactual Mugging · 2009-03-19T09:44:21.150Z · LW · GW

Can you please explain the reasoning behind this? Given all of the restrictions mentioned (no iterations, no possible benefit to this self) I can't see any reason to part with my hard-earned cash. My "gut" says "Hell no!" but I'm curious to see if I'm missing something.

Comment by kurige on How to Not Lose an Argument · 2009-03-19T08:35:38.030Z · LW · GW

This post goes hand in hand with Crisis of Faith. Eliezer's post is all about creating an internal crisis and your post is all about applying that to a real world debate. Like peanut-butter and jelly.

If you want to correct and not just refute then you cannot bring to the table evidence that can only be seen as evidence from your perspective. I.e., you cannot directly use evolution as evidence when the opposing party has no working knowledge of evolution. Likewise, a Christian cannot convince an atheist of the existence of God by talking about the wonders of His creation. If you picture your and your opponent's belief systems as Venn diagrams then the discussion must start where they overlap, no matter how small that sliver of common knowledge might be. Hopefully, if you and your opponent employ crisis-of-faith properly, those two circles will slowly converge.
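The Venn-diagram idea amounts to starting the argument from the intersection of two belief sets. A toy sketch (the set members are invented purely for illustration):

```python
# Toy belief sets; the members are invented for illustration only.
yours = {"logic", "observation", "evolution", "cosmology"}
theirs = {"logic", "observation", "scripture", "tradition"}

# The discussion must start where the two circles overlap.
common_ground = yours & theirs
print(sorted(common_ground))  # ['logic', 'observation']
```

Evidence drawn from outside the intersection (evolution for one party, scripture for the other) carries no weight for the listener until the overlap has grown to include it.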

Comment by kurige on Never Leave Your Room · 2009-03-18T04:14:16.643Z · LW · GW

There is an excellent example of "priming" the mind here.

The idea is that specific prior knowledge drastically changes the way we process new information. You listen to a sine-wave modulated recording that is initially unintelligible. You then listen to the original recording. You are now primed. Listen again to the modulated recording and suddenly the previously unintelligible recording is clear as day.

I first listened to all of the samples on December 8th, when the link was posted on kottke.org. If I'm not mistaken that means it's been exactly 100 days since I last heard, or even thought about, these recordings. I listened to them again just a few minutes ago and understood every single one of them perfectly.

I can't decide if this is impressive or terrifying.

Comment by kurige on The Most Important Thing You Learned · 2009-03-14T11:01:49.462Z · LW · GW

Just did a quick search of this page and it didn't turn up... so, by far, the most memorable and referred-to post I've read on OB is Crisis of Faith.

Comment by kurige on Don't Believe You'll Self-Deceive · 2009-03-10T09:15:45.846Z · LW · GW

His attribution of Orwellian doublethink to himself is far more confusing. I have no idea what to make of that. Maybe your advice in this post is on point there. But the "absolutely zero effect" quote seems unobjectionable.

From the original comment:

One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.

I don't have the original text handy, but a quick search on wikipedia brings up this quote from the book defining the concept:

The power of holding two contradictory beliefs in one's mind simultaneously, and accepting both of them. … To tell deliberate lies while genuinely believing in them, to forget any fact that has become inconvenient, and then, when it becomes necessary again, to draw it back from oblivion for just so long as it is needed, to deny the existence of objective reality and all the while to take account of the reality which one denies.

The first sentence and the first sentence alone is the definition I had in my mind when I wrote the comment. It has been quite a while since I last read 1984 and I had forgotten the connotation that to "double-think" is to "deny the existence of objective reality." This was not my intention at all, although, upon reflection, it should have been obvious.

This was bad homework on my part; I should have looked the quote up before writing the comment. Instead of focusing on the example of morality that I used in the original comment I'm going to try to step back a bit to clarify my original point... Instead of blind faith in religious tenets, my world-view currently accommodates two traditionally exclusive systems of belief: religion and science.

These two beliefs are not contradictory, but the complexity lies in reconciling the two.

If one does not agree with the other then my understanding of one or the other is flawed.

Comment by kurige on Don't Believe You'll Self-Deceive · 2009-03-10T00:31:46.144Z · LW · GW

I don't mean to seem like I'm picking on Kurige, but I think you have to expect a certain amount of questioning if you show up on Less Wrong and say:

One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.

I realize that my views do not agree with the large majority of those who frequent LW and OB - but I'd just like to take a moment to recognize that it's a testament to this community that:

A) There have been very few purely emotional or irrational responses.

B) Of those that fall into (A) all have been heavily voted down.

Comment by kurige on The Mystery of the Haunted Rationalist · 2009-03-09T00:23:05.627Z · LW · GW

No, my experience with alone/together situations is quite different.

I usually don't laugh when I'm watching a funny movie by myself and, although I might flinch during jump scenes, I don't normally find scary movies to be all that scary when I watch them by myself.

There are hotels that tout themselves as "haunted hotels" and even bring in teams of "ghost experts" to get an official certificate proudly declaring the amount and type of "haunting" taking place at that location.

If it's known to be a joke, then sure, it's all fun and games - just as there is a sense of security in walking through the woods with a group of friends. But if even one of those friends is genuinely terrified, then that's a whole other story. It's enough to put everybody in the group on edge. You would be much better off walking through the woods alone.

Perhaps it's herd mentality - but knowing that other people are genuinely scared has a way of bleeding into your own psyche. Even if you rationally know better.

Comment by kurige on Moore's Paradox · 2009-03-08T08:45:37.674Z · LW · GW

If you're reading this, Kurige, you should very quickly say the above out loud, so you can notice that it seems at least slightly harder to swallow - notice the subjective difference - before you go to the trouble of rerationalizing.

There seems to be some confusion here concerning authority. I have the authority to say "I like the color green." It would not make sense for me to say "I believe I like the color green" because I have first-hand knowledge concerning my own likes and dislikes and I'm sufficiently confident in my own mental capacities to determine whether or not I'm deceiving myself concerning so simple a matter as my favorite color.

I do not have the authority to say, "Jane likes the color green." I may know Jane quite well, and the probability of my statement being accurate may be quite high, but my saying it is so does not make it so.

I chose to believe in the existence of God - deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.

Critical realism shows us that the world and our perception of the world are two different things. Ideally any rational thinker should have a close correlation between their perception of the world and reality, but outside of first-hand knowledge they are never equivalent.

You are correct - it is harder for me to say "God exists" than it is for me to say "I believe God exists" for the same reason it is harder for a scientist to say "the Higgs boson exists" than it is to say "according to our model, the Higgs boson should exist."

The scientist has evidence that such a particle exists, and may strongly believe in its existence, but he does not have the authority to say definitively that it exists. It may exist, or not exist, independent of any such belief.

Comment by kurige on Moore's Paradox · 2009-03-08T07:31:49.945Z · LW · GW

Weasel words, as you call them, are a necessary part of any rational discussion. The scientific equivalent would be, "evidence indicates" or "statistics show".

Comment by kurige on Teaching the Unteachable · 2009-03-06T17:48:46.401Z · LW · GW

I was once told that half of Nobel laureates were the students of other Nobel laureates. ... Even after discounting for cherry-picking of students and political pull, this suggests to me that you can learn things by apprenticeship - close supervision, free-form discussion, ongoing error correction over a long period of time - that no Nobel laureate has yet succeeded in putting into any of their many books.

What is it that the students of Nobel laureates learn, but can't put into words?

You can't put mentorship in a book. When I face a problem that may or may not have a solution I find it useful to convince myself that there is a solution, and that I only need to find a path to it. Once I eliminate the doubt or fear that I might be wasting time I'm able to concentrate on the problem at hand. If you define "success" as a problem that may or may not have a solution (i.e., you may or may not be able to achieve it) then studying under a super-star may give you a psychological edge over others in the same field. It's a form of tacit permission by which you subconsciously feel entitled to success and may be more likely to take gainful risks or less likely to simply give up.

Comment by kurige on Information cascades · 2009-03-06T06:46:16.800Z · LW · GW

In other words, be aware that popularity breeds popularity.

Comment by kurige on No, Really, I've Deceived Myself · 2009-03-06T06:35:18.619Z · LW · GW

I just looked it up, and it looks like you were correct about the bonobos. Should have said "Pan prior".

Comment by kurige on No, Really, I've Deceived Myself · 2009-03-05T06:14:37.489Z · LW · GW

This I can understand.

I am a protestant Christian and your friend's experience with "belief" is similar to mine. Or seems to be, from what I gather in your post.

One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.

The double-think comes into play when you're faced with non-axiomatic concepts such as morality. I believe that there is a God - and that He has instilled a sense of right and wrong in us by which we are able to evaluate the world around us. I also believe a sense of morality has been evolutionarily programmed into us - a sense of morality that is most likely a result of the formation of meta-political coalitions in bonobo communities a very, very long time ago.

These two beliefs are not contradictory, but the complexity lies in reconciling the two. This post is not about the details of my Escher-esque brain, so suffice it to say there are questions unanswered by viewing only the scientific side and there are just as many unanswered if viewed only from the spiritual side.

Simply because your friend is not blind to contradictions in the Orthodox Jewish belief system does not mean she does not sincerely believe - or that she's deceived herself into believing that she believes. It means that she, as all intelligent believers who practice crisis of faith should, understands just how much she doesn't understand.