Comments
I'm not so sure I followed that. Do you still get tickets as long as you pledge $25 or higher? Or if you want the poster and a ticket do you have to make 2 pledges totaling $65?
Do you have a picture of the poster that comes with a $40 pledge? Also, do you still get the poster if you pledge more?
I'm the kid in the corner with the laptop
Probably what I'll end up doing. Just checking first is all.
Not sure if open thread is the best place to put this, but oh well.
I'm starting at Rutgers New Brunswick in a few weeks. There aren't any regular meetups in that area, but I figure there have to be at least a few people around there who read lesswrong. If any of you see this I'd be really interested in getting in touch.
I suppose modafinil should be in the same boat as caffeine for the purposes of this experiment.
I cried twice reading this. That puts it just below Humanism part 3 on my list of most touching chapters.
Quirrell in Methods has pretty much stated that he's trying to mold Harry into a dark lord. That requires Harry to be alive and is significantly more likely if he doesn't have Hermione's moral influence.
You will not be thrown in an asylum for discussing this with a professional
My experience disagrees. I went to see a professional for antidepressants, was emotionally stable at that moment, and was thrown in a psych ward for a week. I had to lie about my condition to be released. The whole affair failed to help in any way.
If my inhibitions regarding a certain course of action seem entirely internal, go through with it because I'm probably limiting my options for no good reason.
You would be correct. Thanks for the link.
How much money do you have to donate, if you don't mind my asking?
Kinda awkward to say aloud. I think Institute for the Research of Machine Intelligence would sound better. Minor nitpick.
No
This is a question about utilitarianism, not AI, but can anyone explain (or provide a link to an explanation) of why reducing the total suffering in the world is considered so important? I thought that we pretty much agreed that morality is based on moral intuitions and it seems pretty counterintuitive to value the states of mind of people too numerous to sympathize with as highly as people here do.
It seems to me that reducing suffering in a numbers game is the kind of thing you would say is your goal because it makes you sound like a good person, rather than something your conscience actually motivates you to do, but people here are usually pretty averse to conscious signaling, so I'm not sure that works as an explanation. I'm certain this has been covered elsewhere, but I haven't seen it.
You don't have to be specific, but how would grossing out the gatekeeper bring you closer to escape?
Like when you say "horrible, horrible things". What do you mean?
Driving a wedge between the gatekeeper and his or her loved ones? Threats? Exploiting any guilt or self-loathing the gatekeeper feels? Appealing to the gatekeeper's sense of obligation by twisting his or her interpretation of authority figures, objects of admiration, and internalized sense of honor? Asserting cynicism and general apathy towards the fate of mankind?
For all but the last one it seems like you'd need an in-depth knowledge of the gatekeeper's psyche and personal life.
But you wouldn't actually be posting it, you would be posting the fact that you conceive it possible for someone to post it, which you've clearly already done.
You really relish the whole "scariest person the internet has ever introduced me to" thing, don't you?
Could you give me a hypothetical? I really can't imagine anything I could say that would be so terrible.
Adult readers never seriously maintain that fictitious characters exist
A) "Never" is a strong word. I imagine there are all kinds of mental disorders that can lead certain adults to confuse fiction with reality.
B) "Existence" here is a cached term used for simplifying a concept to the point of being inaccurate. When a person says that, for instance, Frodo Baggins doesn't exist, he or she would be entirely incorrect to say that there is nothing in existence that matches the concept of Frodo Baggins. What the person is actually saying is that a description of the character of Frodo calls to mind an image of something resembling an organism, and that this image fits into a mental category that no actual organism can fit into. Whether or not Frodo "exists" is a pretty poor question unless fleshed out.
I can't imagine anything I could say that would make people I know hate me without specifically referring to their personal lives. What kind of talk do you have in mind?
Have there been any interesting AI box experiments with open logs? Everyone seems to insist on secrecy, which only serves to make me more curious. I get the feeling that, sooner or later, everyone on this site will be forced to try the experiment just to see what really happens.
Only read "External" so far, but I propose god(s) be divided into "trusted and idealized authority figures", "internalized sense of commitment to integrity of respected and admirable reputation (honor)", and "external personification of inner conscience".
If people cite God as the source of spiritual value, it's because he represents a combination of these things and the belief that their values are ingrained in reality. God isn't the root cause, and taking Him out of the equation still leaves the relevant feelings and commitment.
Also, "other humans" isn't relevantly different from "other agents".
Also, also, I'm not entirely clear on the point of this post (probably should've brought that up before correcting you, really). Are you citing actual sources of value, or the things people sometimes believe are the sources of value, whether or not they're correct? Value is necessarily formed from concepts in the mind, so the brain can be assumed to be the thing most usefully termed the origin.
Also, also, also, when you say "value" do you just mean moral value, or things people care about on the whole?
This post was from a while ago and I don't think anyone with access to the note is still around to supply it. You could try asking everyone here for a copy and see if anything comes of it.
yes actually
This seems interesting. Are you just doing the whole thing through email? Also, voluntary response isn't a great way to get accurate results, but I guess it's all you have to work with.
I squeed when I saw this post and you should have shown the .mov series, everyone finds those funny.
Also, I don't think I can say that the root cause of climate change denial and cartoon hatedom is the exact same bias. With cartoons, people mostly reject them for fear of falling out of line with a vague but undeniably present cultural standard that could cause them grief in the future. With climate change, the issue has become so muddled in politics that clear lines have been drawn and to cross them would be labeled betrayal. Also, there are various non-scientific authorities that support either side and sometimes have personal agendas, so anyone who doesn't have a particularly strong trust in scientists, enough to take shortened summaries at face value without suspicion, has to either put time into actual research, or default to one side based on political affiliation. And I might have breached the no-politics rule there, I'm not sure.
There are so many biases behind fundamental creationism that I'm not even going to touch them.
72% probability of welcoming you to the herd
That's... an interesting analysis. Can I ask whether you're speaking from experience, or is that too personal? If not, do you have any links for where you got your information? I myself feel self-destructive from time to time, and I think that's a pretty good description of the emotions involved, so I'm a bit curious here.
Freedom to make any sort of arrangement as long as all parties are willing. A "contract" would be a formal agreement. If you bring force into the mixture you'll end up with more problems than if you don't. You can't have everyone and their grandmother making arbitrary agreements and then using state power to coerce others into following through, so let them make arbitrary agreements and sort it out amongst themselves. Otherwise you get as much injustice as if you'd just allowed the government to dictate your affairs on a whim.
That still means he wanted to die, but the nature of his desire provokes extreme sympathy.
It seems to me the problem here is that the private contracts would be enforced in the hypothetical model. Libertarians seem to propose that the legal benefits of marriage as opposed to the arbitrary spiritual components are the aspect of marriage to be agreed upon. I disagree.
I think that people should be allowed to create private contracts for any issue, but only if those contracts are not enforced. Both parties must remain willing participants throughout the process. Also, if the state deems any contracts unacceptably offensive, or contrary to public interest, it obviously has the power to nullify them. "Contracts" without the intervention of force would be nothing more than symbols that a consensus for agreement has been reached. Of course, I imagine that this will be unstable in the long run as people will gradually seek government assistance in getting their way, and the state will become more and more involved as time goes on, and the system will get slower and more complicated with each new contract.
It is my opinion, if I may get just a tad bit political, that the state, having little to no interest in the more harmless dynamics of an individual's love life, should ignore any agreements on how a person conducts their marriage as long as none of the rights the state formally exists to secure are infringed beyond what the public will tolerate. The legal status of a married couple, being relevant because the lives of two people living as a partnership have different financial qualities than the life of a person living independently, should apply to any arrangement with the same configuration of economic impact as a married couple. I think this is the optimal system, but seeing as how most people appeal to libertarian ideals only to support their own fundamentalist political tribe, it may or may not actually arise.
I'm really not sure if the fact that he wanted to die makes it better or worse...
Good post, but I can easily imagine awesome ways to starve hundreds of children.
"Awesome" to me means impressive and exciting in a pleasant manner. You seem to use it to mean desirable. If morality just means desirability, then there's no reason to use the word morality. I think that for morality to have any use, it has to be a component of desirability, but not interchangeable with it.
You posted this here just for an excuse to ask the poll, didn't you?
This question is probably a violation of rule 4, but I think if we're discussing politics then it just has to be asked:
Regarding American politics, which party's general stance is more optimal for ensuring prosperity?
I realize that politicians often fail to meet the idealistic standards of their affiliations, and that both parties' positions are too biased, general, and simple to actually be correct, but which one do you think comes closer to the mark, overall?
I believe Eliezer said somewhere that, if you had to choose between one of the two major tribes, the Republican camp was slightly better. He may have updated since then. Politics may kill minds, but it is still important, and individual votes do influence it even on a national scale, so if we're going to talk politics, we may as well spend the time trying to figure out how best to apply our conclusions.
Again, I'm sorry if this question is too mindkill-y even for this thread, but I believe that this is the most relevant question we can ask if we're talking politics. Also, I don't mean to fence off non-Americans, but I am American, as are the majority of Lesswrongers, and in any case, America has enormous influence on other nations one way or the other. If you have an opinion on the matter, please share it, whether you're American or not.
I intend to live forever or die trying
-- Groucho Marx
Okay, thanks. That was really bothering me.
I felt extreme déjà vu when I saw the title for this.
I'm pretty sure I saw a post with the same name a couple of months ago. I don't remember what the post was actually about, so I can't really compare substance, but I have to ask. Did you post this before?
Again, sorry if this is me being crazy.
This made me laugh. Also, I knew someone would do this the second Eliezer proposed new boundaries for Lesswrongers.
The only problem I can think of with this experiment is that your post could have been deleted for one of your more overt offenses, and it just took until now for someone to actually get around to deleting it, especially with all the controversy. You have evidence that it was attacking Eliezer that broke the camel's back, but maybe not strong evidence. I don't think you can get anything conclusive from this.
What you should have done is post several different posts each violating one of the above rules and wait for something to happen, then post your conclusions in a day or two. (sigh) I guess it's too late for that now.
I think you mean the Litany of Gendlin, and I believe some of these rules are being newly implemented, but I could be wrong about that.
He can run his site any way he wants, and most of the ideas here are reasonable precautions given his values. That doesn't change the fact that I intuitively don't like them when I read them, and that gut reaction (or possibly its opposite) is probably shared with others here who probably allow it to color their arguments one way or the other. Just something to keep in mind, is all.
Intuitive gut reaction. If I had an argument to make I would have said so. Any case I make would have been formed from backtracking from my initial feeling, and I'm probably not the only commenter here arguing based on an "ick" or "yay" gut reaction to the idea of censorship. I thought it was worth pointing out.
Well...
I'm upset by this.
Not sure why, exactly, but yeah, definitely upset by this. Just felt like sharing.
In other words, I don't think there's a fact of the matter about "if people should die after 100 years, a thousand years, or longer or at all". The question assumes that there's some single answer that works for everyone. That seems unlikely.
Not necessarily true. The question posits the existence of an optimal outcome. It just neglects to mention what, exactly, said outcome would be optimal with respect to. It would probably be necessary to determine the criteria a system that accounts for immortality has to meet to satisfy us before we start coming up with solutions.
The obvious answer is "Everyone dies if and when they feel like it. If you want to die after 100 years, by all means; if you feel like living for a thousand years, that's fine too; totally up to you."
A limited distribution of resources somewhat complicates the issue, and even with nanotechnology and fusion power there would still be the problem of organizing a system that isn't inherently self-destructive.
I think I agree with the spirit of your answer: "We can't possibly figure out how to do that and in any case doing so wouldn't feel right, so we'll let the people involved sort it out amongst themselves," but there are a lot of problems that can arise from that. There would probably need to be some sort of system of checks and balances, but then that would probably deteriorate over time and has the potential to turn the whole thing upside down in itself. I doubt you'll ever be able to really design a system for all humanity.
And the idea that it's OK to impose a fixed lifespan on someone who doesn't want it is abhorrent.
To you, perhaps. Well, and me. Your intuitions on the matter are not universal, however. Far from it, as our friends' comments show.
My main problems (read: ones that don't rest entirely on feelings of moral sacredness) with such an idea would be the dangerous vulnerability of the system it describes to power grabs, its capacity to threaten my ambitions, and the fact that, if implemented, it would lead to a world that's all around boring (I mean, if you can fix the life spans then you already know the ending. The person dies. Why not just save yourself the trouble and leave them dead to begin with?)
Superhappy aliens, FAI, United Nations... There are multiple possibilities. One is that you stay healthy for, say, 100 years, then spawn once blissfully and stop existing (salmon analogy). Humans' terminal values are adjusted in a way that they don't strive for infinite individual lifespan.
Possible outcome; better than most; boring. I don't think that's really something to strive for, but my values are not yours, I guess. Also, I'm assuming we're just taking whether an outcome is desirable into account, not its probability of actually coming about.
I don't. Suffering is bad, finite individual existence is not necessarily so.
Did you arrive at this from logical extrapolation of your moral intuitions, or is this the root intuition? At this point I'm just curious to see how your moral values differ from mine.
I've never actually posted more than a comment here, so I'm all for the idea.
I don't know what to make of this. It means everything I'd pieced together about people is utterly, utterly wrong, because it assumed that they all valued truth, and understanding - the pursuits of intelligence when you don't have the political trait.
"Truth" and "understanding" seem to work as applause lights in this sentence. "Status" is used to the opposite effect throughout the post.
I think your premise is a little confused. It sounds like you previously viewed status-seeking as the emotional equivalent of immoral, but now you don't because you realize it has adaptational advantages. I find it strange that you feel evolutionary causation is adequate to justify something, but I guess I won't question that.
More to the point, I think you're misjudging status. Status isn't as simple as Machiavellian plays for power. It's generally assumed that only sociopaths play for dominance in and of itself. The term "status" feels kinda dirty when you analyze human interaction from afar. There's always the subtext that if you play for it, you're a bad person. That's not the way it feels when you're actually talking to other people.
Seeking status can feel like trying to live up to the expectations of people you care about. It can feel like trying to stand on equal ground with your friends. It can feel like trying to be comfortable talking to that girl at the grocery store.
When people look at status seeking under a microscope, they usually try to screen off the humanity of its experience and so it comes off as something a super villain would do. When you actually feel it, it feels right. It feels very human. If you interact with other people at all, I can almost (not quite) guarantee that you seek status, you just don't call it status.
Note: Not trying to attack your position, just curious.
but I cannot decide for sure if fixed lifespan is such a bad idea.
Fixed by whom, might I ask?
It seems to me that associating natural death of an individual with evil is one of those side effects of evolution humans could do without.
You seem to be implying that designed death is worse. How do you figure?