"Society begins to appear much less unreasonable when one realizes its true function. It is there to help everyone to keep their minds off reality." Celia Green, The Human Evasion.
As for Newton's exact mental processes, they are lost to history, and we are not going to get very specific theories about them. Newton can only give us an outside view of the circumstances of discovery. His most important finds were made alone in his private home and outside of academic institutions. Eliezer left school early himself. Perhaps a common thread?
Teachers select strongly for IQ among students when they have power to choose their students. This might be a more powerful aggregator of high-IQ individuals than transmission from parents to children. It might be the case that teachers don't transmit any special powers to their students, but just like to affiliate with other high-IQ individuals, who then go on to do impressive things.
At a certain level of IQ (that of Yudkowsky, Newton) pedagogy becomes irrelevant and a child will teach itself, given the necessary resources. At this point, teachers are more likely to take credit for natural talent while doing nothing to aid it than they are to "transmit intellectual power."
http://www.nytimes.com/2010/02/19/opinion/19brooks.html
Link is to David Brooks, an elite columnist for an elite paper, chiding "elites". He gets paid for this stuff, and is presumably read in earnest by millions of Americans.
Irony is a means of simultaneously signalling and countersignalling.
By ironically obeying correct social forms, it is possible to receive status from conventional culture and counter-culture. The conventional culture does not want to admit that it is the butt of irony, and the counterculture likes people who score points off of the conventional culture. Is anyone aware of research into irony as a signalling strategy?
You are right to be confused. The idea that the simulators would necessarily have human-like motives can only be justified on anthropocentric grounds - whatever is out there, it must be like us.
Anything capable of running us as a simulation might exist in any arbitrarily strange physical environment that allowed enough processing power for the job. There is no basis for the assumption that simulators would have humanly comprehensible motives or a similar physical environment.
The simulation problem requires that we think about our entire perceived universe as a single point in possible-universe-space, and it is not possible to extrapolate from this one point.
GDP per capita is a better predictor of fertility than access to contraceptives.
The rejection is only as flimsy as the contraceptive programs are effective, on the margin where increased funding might make a difference. Those programs may not be very effective at all while additional children are still profitable.
"Socioeconomic development is considered the main cause of a decline over time in the benefits of having children and a rise in their costs."
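To make the comparison above concrete: "better predictor" is usually cashed out as explaining more variance. A minimal sketch on invented country-level data (the numbers and effect sizes are made up for illustration, not real demographics), comparing the R² of two one-variable least-squares fits with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical data: fertility falls steeply with log GDP per capita
# and only weakly with contraceptive access.
log_gdp = rng.uniform(6, 11, n)              # log GDP per capita
contraceptive_access = rng.uniform(0, 1, n)  # fraction with access
fertility = (8 - 0.6 * log_gdp
             - 0.3 * contraceptive_access
             + rng.normal(0, 0.3, n))        # noise

def r_squared(x, y):
    """R^2 of a simple least-squares fit of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    return 1 - residuals.var() / y.var()

r2_gdp = r_squared(log_gdp, fertility)
r2_contraceptives = r_squared(contraceptive_access, fertility)
print(r2_gdp, r2_contraceptives)  # GDP explains far more variance here
```

On this synthetic data the GDP fit explains most of the variance while the contraceptive fit explains almost none, which is the shape of the claim; the real-world version would of course require actual cross-country data.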
And when one goeth through fire for his teaching--what doth that prove? Verily, it is more when one's teaching cometh out of one's own burning!
-Friedrich Nietzsche, Thus Spake Zarathustra
I read this last year. It contained many of the important insights from ev. psych, especially in the area of mating strategies. It was far too wordy and long to justify its informational content. Robert Wright snagged most of the ideas from scientists, but he is a journalist, so he tends to mangle concepts and play up spurious "angles" of the "story." This was the most tedious thing I've read on the subject. Pinker is somewhat better.
There are many small daily problems I can't imagine addressing with math, and most people just cruise on intuition most of the time. Where we set the threshold for using math concepts seems to vary a lot with cognitive ability and our willingness to break out the graphing calculator when it might be of use.
It might be useful to lay down some psychological triggers so that we are reminded to be rational in situations where we too often operate intuitively. Conversely, a systematic account of things that are too trivial to rationalize and best left to our unconscious would be helpful. I'm not sure either sort of rule would be generalizable beyond the individual mind.
If losing is a soul-crushing defeat to be avoided at all costs and winning is The Delicious Cake, not the icing, there is a much stronger incentive to win.
See the OB article on Lost Purposes: there's a distinct chance that a process optimized for fact-finding or interesting-fact gathering won't be optimized for winning. Sometimes our map needs to reflect the territory just enough for us to find the treasure.
In the real world where games have consequences, there is specialization, insofar as it is possible, in exploration and winning. Defense R&D is a function very separate from combat, and engineering is mostly separate from physics. Because there are limits to the scope of human attention, our sense of "this might be useful elsewhere" comes into conflict with the drive to start and finish projects.
"With that caveat, this summary and plenty of the posts contained within are damn useful!"
I resoundingly agree.
That said, Eliezer is attempting to leverage the sentiments we now call "altruistic" into efficient other-optimizing. What if all people are really after is warm fuzzies? Mightn't they then shrink from the prospect of optimally helping others?
Hobbes gives us several possible reasons for altruism, none of which seem to be conducive to effective helping:
"When the transferring of right is not mutual, but one of the parties transferreth in hope to gain thereby friendship or service from another, or from his friends; or in hope to gain the reputation of charity, or magnanimity; or to deliver his mind from the pain of compassion [self-haters give more?]; or in hope of reward in heaven; this is not contract, but gift, free gift, grace: which words signify one and the same thing."
There is also the problem of epistemic limitations around other-optimizing. Charity might remove more utilons from the giver than it bestows upon the receiver, if only because it's difficult to know what other people need and easier to know one's own needs.
"...then there's the idea that rationalists should be able to (a) solve group coordination problems, (b) care a lot about other people and (c) win..."
Why should rationalists necessarily care a lot about other people? If we are to avoid circular altruism and the nefarious effects of other-optimizing, the best amount of caring might be less than "a lot."
Additionally, caring about other people in the sense of seeking emotional gratification primarily in tribe-like social rituals may be truly inimical to dedicating one's life to theoretical physics, math, or any other far-thinking discipline.
Caring about other people may entail involvement in politics, and local politics can be just as mind-killing as national politics.
It might be prudent to avoid associating rationality with particular people or social institutions.
There's always the risk that particular instances of rationality will end in disaster, or that Bad Guys will be painstakingly rational; in the early stages, we wouldn't want rationality to suffer the fate of religions, which often take reputation hits when their followers do nasty things.
Rationality could be advertised as a morally neutral instrumental value, i.e., Better Living Through Rationality.
On the other hand, we could sell rationality as a tool for atheists, drug policy activists, and stockbrokers, and publicly associate with their successes.
I would venture that emotivism can be a way of setting up short-run incentives for the achievement of sub-goals. If we think "Bayesian insights are good," we can derive some psychological satisfaction from things which, in themselves, do not have direct personal consequences.
By attaching "goodness" to things too far outside our feedback loops, like "ending hunger," we get things like counterproductive aid spending. By attaching "goodness" too strongly to subgoals close to individual feedback loops, like "publishing papers," we get a flood of inconsequential academic articles at the expense of general knowledge.