"Promoting less than maximally accurate beliefs is an act of sabotage. Don't do it to anyone unless you'd also slash their tires."
I disagree with this one. If you scrupulously include every disclaimer and caveat, you'll be too boring for anyone to pay attention to. It's better to be pragmatic. Giving someone an improved but still not maximally accurate belief is still an improvement.
I propose that the author of this quote is placing a moral value on people possessing maximally accurate beliefs. If so, the author's moral system is incompatible with Standard Utilitarianism, is it not?
"Your denial of the importance of objectivity amounts to announcing your intention to lie to us. No-one should believe anything you say." -- John McCarthy
Gee--I wish I had this one when I was taking that anthropology professor's class!
"If you want to do good, work on the technology, not on getting power." - John McCarthy
Except for technologies with catastrophic potential (nanotech, biotech).
Maybe people have the idea that the line moves slowly, and that they can't cut in line. Thus if the front of the line gets past them, they have to wait until the entire line is gone before exiting.
You're probably right. But I can see a small benefit: we have become wary. There's still the possibility that someone will develop an effective defense system against the bomb. On the other hand, if we had never used the bomb, it would probably be less widely known and there would be the possibility of a sucker punch from a cult of mentally disturbed physicists.
"Morality" generally refers to guidelines on one of two things:
(1) Doing good to other sentients. (2) Ensuring that the future is nice.
If you wanted to make me stop caring about (1), you could convince me that all other sentients were computer simulations who were different in kind from me, and that their emotions were simulated according to sophisticated computer models. In that case, I would probably continue to treat sentients as peers, because things would be a lot more boring if I started thinking of them as mere NPCs.
If you wanted to make me stop caring about (2), you could tell me that I was living in a computer simulation that would grant my every request (similar to the plot of this novel). If that were the case, I would set up sophisticated games for myself. Just taking the path of least resistance and maximizing momentary dopamine release would get boring quickly. (There's a reason why you see more kids eating candy than adults.) I would think carefully before I even experimented with maximizing dopamine release, since it would make everything else seem petty by comparison.
Either way, you would be ruining the secret to happiness:
"The secret of happiness is to find something more important than you are and dedicate your life to it." - Dan Dennet
@poke:
I imagine Eliezer is more interested in doing what works than avoiding criticism. And the real danger associated with creating a superhuman AI is that things would spiral out of control. That danger is still present if humanity is suddenly introduced to 24th century science.
I started reading the first chapter of Structure and Interpretation of Computer Programs and was reminded of this post.
A computational process is indeed much like a sorcerer's idea of a spirit. It cannot be seen or touched. It is not composed of matter at all. However, it is very real. It can perform intellectual work. It can answer questions. It can affect the world by disbursing money at a bank or by controlling a robot arm in a factory. The programs we use to conjure processes are like a sorcerer's spells. They are carefully composed from symbolic expressions in arcane and esoteric programming languages that prescribe the tasks we want our processes to perform.
Makes you want to learn to program, doesn't it?
So, any ideas on how to become one of that incredibly tiny number of people who desperately want to learn?
Well, I guess I'm not talking about the learning process itself so much as what keeps you going. In a traditional school environment, grades are the de facto student motivator.
My old Creative Minds professor has plenty of anti-school arguments. But when he tried attending a school without grades, he learned that it sucked: many students didn't show up for class, and of the ones that did, the only ones who participated in classroom discussions were those who had strong opinions.
So my question is: when you're learning on your own, how do you find ways to motivate yourself? As I mentioned before, curiosity can be unreliable. Another technique is to think of what you're doing as special and unique, and to say to yourself, "Hardly anyone is teaching themselves using the direct, efficient methods that I'm using. I'm operating outside the system and learning things that very few others are learning. If I finish all the exercises in this book, I will be a Level 6 Probability Master."
The upside of this is that you're motivated to learn more. The downside is that it might make you arrogant.
billswift, that's a really good point. This explains why newspapers can be bad--they arouse your curiosity, but on many different subjects, many of which are completely unproductive (such as the status of the US presidential election; for some reason, extensive coverage of voter opinion trends counts as prestigious reporting).
Kaj Sotala:
Your story is written decently, but it sounds like a parody of pretty much any traditional exam. If you remove the write-in answers requirement, you can have much more colorful examination scenarios.
Eliezer:
This seems like a cool motivation tactic. At the same time, I'm a little afraid that thinking my knowledge makes me special and unique will cause me to be arrogant.
Any committed autodidacts want to share how their autodidacticism makes them feel compared to traditionally schooled learners? I'm beginning to suspect that maybe it takes a certain element of belief in the superiority of one's methods to make autodidacticism work. Otherwise you'd be running on pure curiosity, and in my experience that doesn't always hold out for long, especially when you're trying to tackle something more advanced.
In Nick Bostrom's paper on the survival of humanity, several potential catastrophe scenarios are technological ones. That makes me think that it might actually be a bad idea to popularize science.
The irony here is that information about how to create a catastrophe - how to make a nuke, how to construct viruses in a laboratory, how to make a nanobot - is just about the only scientific information that people are hiding. (Fortunately, though, they don't make a big deal about the fact they're hiding it.)
If you crunch the numbers differently, you can come to different conclusions. For example, if I choose 1B over 1A, I have a 1 in 34 chance of getting burned. If I choose 2B over 2A, my chance of getting burned is only 1 in 100.
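A quick sanity check of that arithmetic, assuming the standard Allais-paradox gambles (my assumption; the comment doesn't restate the payoffs): 1A is $24,000 with certainty, 1B is a 33/34 chance of $27,000, 2A is a 34% chance of $24,000, and 2B is a 33% chance of $27,000.

```python
# Sketch of the "chance of getting burned" arithmetic under the assumed
# Allais-paradox gambles (not stated in the comment itself):
#   1A: $24,000 with certainty        1B: 33/34 chance of $27,000, else $0
#   2A: 34% chance of $24,000         2B: 33% chance of $27,000

p_nothing_1A = 0.0        # 1A never leaves you with nothing
p_nothing_1B = 1 / 34     # 1B leaves you with nothing 1 time in 34
p_nothing_2A = 1 - 0.34   # 66% chance of nothing
p_nothing_2B = 1 - 0.33   # 67% chance of nothing

print(f"Extra risk from choosing 1B over 1A: {p_nothing_1B - p_nothing_1A:.4f}")  # ~0.0294, i.e. 1 in 34
print(f"Extra risk from choosing 2B over 2A: {p_nothing_2B - p_nothing_2A:.4f}")  # 0.0100, i.e. 1 in 100
```

Under those assumptions, the extra downside from taking the riskier option really is about three times larger in the first pair than in the second, which is the framing the comment is pointing at.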
Re: the Harris quote
The quote may be true. But if you passed up the opportunity to save a drowning child, then the ratio at which you value the happiness of yourself to the happiness of others is probably pretty damn high. So you're still an asshole.
BillK said:
"It really is the hardest thing in life for people to decide when to cut their losses."
No, it's not. All you have to do is periodically pretend that you were magically teleported into your current situation. Anything else is the sunk cost fallacy.