The pain of a crisis is in having to deal with it in the first place. What you are proposing is to deal with a crisis before it occurs, in order to prepare ourselves better for it. This means incurring the pain upfront even though there is a low probability that the crisis will ever occur. Now multiply this by the number of different crises - New Orleans, after all, is different from Bear Stearns - and we might spend all of our spare time preparing for low-likelihood catastrophes.
Perhaps you'll argue that not everyone needs to prepare, only a few people need to do it on everyone's behalf, but I don't think that's true. Preparedness has to be pervasive, or else we're not prepared.
With the last quotation especially, this is ceasing to be "Rationality Quotes" and is beginning to be "Idealist Quotes".
Eliezer: how does this square with Robin's recent "What Belief Conformity?"
He quoted:
"physicists and mathematicians perform best in terms of "rationality" (i.e. performance according to theory) and psychologists worst. However, since "rational" behavior is only profitable when other subjects also behave rationally ... the ranking in terms of profits is just the opposite: psychologists are best and physicists are worst."
I don't think that you'll be able to develop a wholly deterministic machine that surpasses a human mind, though. You need random search to find solutions to problems where a good-enough answer is relatively easy to reach, but an optimal one is impossibly tough.
It seems to me that the mind is just a generator of random ideas based on things experienced recently, where the ideas are checked in various layers for how much sense they make and passed to the conscious mind only once they have passed some filters. In essence, our thinking process is semi-random idea generation and tweaking, combined with validation and testing.
There seems to be no reason why the same could not be implemented in a machine. People who argue that machines cannot do random stuff have apparently never dealt with cryptography.
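To make the point concrete, here is a minimal sketch of the generate-and-test loop I have in mind. Everything in it is a placeholder assumption of mine - the toy objective (count of 1-bits), the bit-flip mutation, the step count - but the shape is the thing: semi-random generation, then a cheap validation filter, with the randomness ultimately drawn from a cryptographic source just to underline that machines do randomness fine.

```python
import random
import secrets  # cryptographically strong randomness from the OS


def fitness(candidate):
    # Placeholder "does this make sense?" check. In a mind this would be
    # layers of filters; here it is a toy objective: prefer more 1-bits.
    return sum(candidate)


def mutate(candidate, rng):
    # Semi-random tweaking of a recent idea: flip one random bit.
    tweaked = list(candidate)
    i = rng.randrange(len(tweaked))
    tweaked[i] ^= 1
    return tweaked


def random_search(n_bits=64, steps=10_000):
    # Seed an ordinary PRNG from a cryptographic source - machines have
    # no trouble producing randomness.
    rng = random.Random(secrets.randbits(128))
    best = [rng.randrange(2) for _ in range(n_bits)]
    for _ in range(steps):
        idea = mutate(best, rng)             # generate a variation
        if fitness(idea) >= fitness(best):   # validate / test the idea
            best = idea                      # keep it if it passes the filter
    return best


# Good enough almost every run, though never provably optimal.
print(fitness(random_search()))
```

A few thousand blind tweaks reliably get you a near-perfect score on this toy problem, which is exactly the "good enough is easy, optimal is tough" regime I mean.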
Eliezer: if the "ethical override" differs from culture to culture, and some people don't even have it, what's universal about it?
I'm not saying the phenomenon does not exist, but calling it an "ethical override" seems a misnomer. It might be more accurate to regard it as a form of hypnosis. If you're familiar with how hypnosis works, this seems similar to the environment impressing on you, as a child, that some arbitrary things should or should not be done. Since such instructions generally relay accumulated knowledge which one cannot acquire firsthand or safely test in one's own lifetime, it increases an individual's genetic fitness to heed them, i.e. to be "hypnotizable".
Eliezer: Are you really that sure that the ethical impulse you speak of is due to nature?
I am probably not alone in suggesting that it is due to nurture. It may seem to you that the ethical override is as hard-wired in you as hunger or thirst, but what is actually hard-wired may not be an ethical override at all. It may be a listen-to-your-parents override.
It is kind of peculiar, is it not, that ethical overrides such as you describe seem to be common among people who began their lives in religion, but not quite as common, and not quite as overriding, in people who did not. Contrast the principled attitudes of uptight religious people with those of people who were raised without stories of hell and damnation to scare them. Which type of person can be expected to avoid sex until they're married? And why? Because of a hard-wired ethical override? Or because evolution taught us that if our parents tell us not to eat certain berries, we should not, or we will die?
I don't think that the ethical override you speak of is nearly as common as you claim. You only need to venture into a suitable part of Africa, where your head will be removed for the slightest of reasons, or into communities which raise their children in ways quite dissimilar to how Catholic or Jewish children are raised.
Many of us have the ethical override because we are designed to internalize, on pain of death, the serious lessons taught by our environments. Remove the environmental lesson, and the ethical override disappears.
Eliezer: I don't get your altruism. Why not grab the crown? All things being equal, a future where you get to control things is preferable to a future where you don't, regardless of your inclinations.
Even if altruistic goals are important to you, it would seem like you'd have better chances of achieving them if you had more power.
Unless, I guess, you judge that the activities needed to keep power, and to remain alive while under increased threat, would be too much of an obstacle to your other goals.
The only valid reason I see not to grab power is a selfish one: if it would mean getting yourself into a mess that you don't really need or want. Which seems likely to be the case. But then this is a selfish motivation, not an altruistic one.
Vladimir Nesov: thanks for your comment. I found it insightful.
Eliezer,
what's with the ego?
In other words - why are you so driven?
I gather from your posts that you have metaphysical views which make you believe that solving the FAI problem is the most important thing you should be doing.
But is it really that important that you are the one to bring this work to fruition?
Do you think your life will have been unfulfilled, or your opportunity wasted, if you don't finish this, and finish it as soon as you can?
Would building an exceptional foundation, which future exceptional people can improve on, not be achievement enough?
What does it matter how smart you are, if you are doing what you love, and giving it your best effort?
Perhaps it is the fear of being too late that is causing you distress. Perhaps you fear that humanity is going to be destroyed because you didn't build an FAI soon enough. Perhaps you fear that your life will end some 10,000 years sooner than you'd like.
But it is not your responsibility to save the world. It can be fun if you contribute to the effort. But planets are a dime a dozen, and lives are even cheaper than that. We are not really that important. No one is. In the grand scheme of things, our dramas and concerns are lightweight fun.
One of the problems with always being the top banana is that you never learn that you don't have to be the top banana to be fulfilled in your life.
There's no need to worry so much about being on Jaynes's or Conway's level. Do what you do best, and do it because it's fun. If you've been given what it takes, then this is the fastest way to become the master of your field. And even if you didn't have what it takes - which in your case is unlikely - you would still be making a contribution and having fun.
Ian C.: Objective superiority is undefined unless you specify "Superiority in terms of what?"
In the example you gave, it appears as though you are using "superior" to mean "the one I like more" or "the one I think is worthy of praise" or "the one whose behavior should be encouraged".
Objectively, the lazy genius is, by definition, "superior" in terms of intelligence.
Tiiba: I did a poll asking people if they would be in favor of banning the eating of meat. I also asked if they were themselves already vegetarians.
Surprisingly, most vegetarians did NOT vote in favor of banning meat.
Apparently, being a vegetarian does not necessarily mean believing that it's wrong to slaughter other animals for consumption.