Comments

Comment by Weedlayer on Welcome to Less Wrong! (7th thread, December 2014) · 2015-04-11T15:28:01.054Z · LW · GW

Edit: I misunderstood what you meant by "rationalize", sorry.

As Polymath said, rationalization means "to try to justify an irrational position later", basically making excuses.

Anyway, I wouldn't worry about the downvotes. Based on this post, the people downvoting you probably weren't being passive-aggressive, but rather misinterpreted what you posted. It can take a little while to learn the local beliefs and jargon.

Comment by Weedlayer on Reply to Holden on 'Tool AI' · 2015-04-09T01:41:22.858Z · LW · GW

I would consider 3 to be a few.

Comment by Weedlayer on Twenty basic rules for intelligent money management · 2015-03-20T18:59:38.372Z · LW · GW

Do you feel confident that you could recognize a Bitcoin-like opportunity if one did appear, distinguishing it from countless other unlikely investments which go bust?

Comment by Weedlayer on Quotes Repository · 2015-02-24T17:46:01.239Z · LW · GW

You should definitely post the entire quote here, not just the snippet with a link to the quote. For a moment I thought the one sentence was the entire quote, and nearly downvoted it for being trite.

Comment by Weedlayer on Rationality Quotes Thread February 2015 · 2015-02-04T20:28:14.519Z · LW · GW

While the quote is anti-rationality, it IS satirical, so I suppose it's fine.

Comment by Weedlayer on The Importance of Sidekicks · 2015-01-08T17:00:29.567Z · LW · GW

I'm fairly confident it stands for "Society for Creative Anachronism".

Comment by Weedlayer on Rationality Quotes December 2014 · 2014-12-08T19:20:11.774Z · LW · GW

Too strong.

Nobody EVER became successful through luck? Not even people born billionaires or royalty?

Nobody can EVER be happy without using intelligence? Only if you're using some definition of happiness that includes a term like "Philosophical fulfillment" or some such, which makes the issue tautological.

Comment by Weedlayer on Rationality Quotes December 2014 · 2014-12-03T08:53:53.625Z · LW · GW

The quote always annoyed me too. People bring it up for ANY infringement on liberty, often leaving off the words "essential" and "temporary", making a much stronger version of the quote (and one that is, of course, obviously wrong).

Tangentially, "The Sword of Good" was my introduction to Yudkowsky and, by extension, LW.

Comment by Weedlayer on Rationality Quotes November 2014 · 2014-11-23T07:51:27.254Z · LW · GW

The tricky part is the "achievable levels of accuracy". It would probably be possible for, say, Galileo to invent general relativity using the orbit of Mercury. But from a pebble, you would need VERY precise measurements, to an absurd level.

Comment by Weedlayer on Rationality Quotes November 2014 · 2014-11-23T07:46:50.674Z · LW · GW

Honestly, I did read the source, and it's very difficult to get anything useful out of it. The closest interpretation I could manage is "theory (in what? political science?) had become removed from other fields (in political science? in science generally?)".

In general, if context is needed to interpret the quote (i.e., it doesn't stand on its own), it's good to mention that context in the post, rather than just linking to a source and expecting people to follow a comment thread to understand it.

Sorry if this is overly critical; that was not my intention. I just don't get what the "internecine conflict" you are referring to is.

Comment by Weedlayer on Rationality Quotes November 2014 · 2014-11-21T12:27:32.117Z · LW · GW

I'm not really getting anything from this other than "Mainstream philosophy, boo! Empiricism, yeah!"

Is there anything more to this post?

Comment by Weedlayer on A Cost-Benefit Analysis of Immunizing Healthy Adults Against Influenza · 2014-11-12T00:55:24.714Z · LW · GW

EV (Shot) = -$90

EV (No Shot) = -$104

Difference (Getting the shot minus not getting it) = -$90 - (-$104) = $14

Therefore, get the shot.

The first two values are in the tree. The difference can be figured out by mental arithmetic.
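If anyone wants to sanity-check the arithmetic, here is a minimal Python sketch. The two expected values are the ones read off the post's decision tree; the variable names and the comparison are mine, not part of the original analysis:

```python
# Expected values (in USD) read off the post's decision tree;
# negative numbers are expected costs.
ev_shot = -90      # expected cost if you get the flu shot
ev_no_shot = -104  # expected cost if you skip it

# A positive difference means getting the shot is the better choice.
difference = ev_shot - ev_no_shot
print(difference)  # 14 -> get the shot
```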

Comment by Weedlayer on A Cost-Benefit Analysis of Immunizing Healthy Adults Against Influenza · 2014-11-12T00:49:05.087Z · LW · GW

Would that be altruistic value? If I'm not mistaken, the cost of blood donation is generally just time, and the benefit goes to other people. I have heard that infrequent blood donation might have health benefits, but I don't know much about that.

Comment by Weedlayer on A Cost-Benefit Analysis of Immunizing Healthy Adults Against Influenza · 2014-11-11T19:28:20.862Z · LW · GW

Well, if you don't value your health at all, then this seems valid.

Comment by Weedlayer on A Cost-Benefit Analysis of Immunizing Healthy Adults Against Influenza · 2014-11-11T19:25:20.666Z · LW · GW

I have already gotten a flu shot this year, primarily because the cost of getting one is approximately 10 minutes and 0 USD (they're covered by the cost of attendance at my university, and offered in a very convenient location for me).

Comment by Weedlayer on Rationality Quotes November 2014 · 2014-11-03T17:01:06.511Z · LW · GW

Also more than have died from UFAI. Clearly that's not worth worrying over either.

I'm not terrified of Ebola because it's been demonstrated to be controllable in fairly developed countries, but as a general rule this quote seems incredibly out of place on Less Wrong. People here discuss the dangers of things which have literally never happened before almost every day.

Comment by Weedlayer on On Caring · 2014-10-10T05:02:53.011Z · LW · GW

"My moral position is different from (in fact, diametrically opposed to) Alice's, but I'm not going to say that Alice's morals are wrong"

You do realize she's implicitly calling you complicit in the perpetuation of the suffering and deaths of millions of animals, right? I'm having difficulty understanding how you can NOT say that her morality is wrong. Her ACTIONS are clearly unobjectionable (eating plants is certainly not worse than eating meat under the vast majority of ethical systems), but her MORALITY is quite controversial. I have a feeling you accept this case because she is not doing anything that violates your own moral system, while you are doing something that violates hers. To use a (possibly hyperbolic and offensive) analogy, this is similar to a case where a murderer calls the morals of someone who doesn't accept murder "just different", and something they have the full right to have.

"No, I don't think so." (and following text)

I don't think your example works. He values success, AND he values other things (family, companionship, etc.). I'm not sure why you're calling different values "different sides", as though they are separate agents. We all have values that occasionally conflict. I value a long life, even biological immortality if possible (I know, what am I doing on Less Wrong with a value like that? /sarcasm), but I wouldn't sacrifice 1000 lives a day to keep me alive atop a golden throne. This doesn't seem like a case of my "don't murder" side wanting me to value immortality less; it's more a case of considering the expected utility of my actions and coming to a conclusion about what collateral damage I'm willing to accept. It's a straight calculation, no value readjustment required.

As for your last point, I've never experienced such a radical change (I was raised religiously, but outside of weekly mass my family never seemed to take it very seriously, and I can't remember caring too much about it). I actually don't know what makes other people adopt ideologies. For me, I'm a utilitarian because it seems like a logical way to formalize my empathy and altruistic desires, and to this day I have difficulty grokking deontology like natural law theology (you would think being raised Catholic would teach you some of that. It did not).

So, to summarize my ramblings: I think your first example only LOOKS like reasonable disagreement because Alice's actions are unobjectionable to you, and you would feel differently if positions were reversed. I think your example of different sides is really just explaining different values, which have to be weighed against each other but need not cause moral distress. And I have no idea what to make of your last point.

If I ignored or misstated any of your points, or am just completely talking over you and not getting the point at all, please let me know.

Comment by Weedlayer on On Caring · 2014-10-09T21:53:24.733Z · LW · GW

There's no law of physics that talks about morality, certainly. Morals are derived from the human brain though, which is remarkably similar between individuals. With the exception of extreme outliers, possibly involving brain damage, all people feel emotions like happiness, sadness, pain and anger. Shouldn't it be possible to judge most morality on the basis of these common features, making an argument like "wanton murder is bad, because it goes against the empathy your brain evolved to feel, and hurts the survival chance you are born valuing"? I think this is basically the point EY makes about the "psychological unity of humankind".

Of course, this dream goes out the window with UFAI and aliens. Let's hope we don't have to deal with those.

Comment by Weedlayer on On Caring · 2014-10-09T21:40:08.118Z · LW · GW

This is a somewhat frustrating situation, where we both seem to agree on what morality is, but are talking over each other. I'll make two points and see if they move the conversation forward:

1: "There's no reason to consider your own value system to be the very best there is"

This seems to be similar to the point I made above about acknowledging on an intellectual level that my (factual) beliefs aren't the absolute best there is. The same logic holds true for morals. I know I'm making some mistakes, but I don't know where those mistakes are. On any individual issue, I think I'm right, and therefore, logically, if someone disagrees with me, I think they're wrong. This is what I mean by "thinking that one's own morals are the best". I know I might not be right about everything, but I think I'm right about every single issue, even the ones I might really be wrong about. After all, if I was wrong about something, and I was also aware of this fact, I would simply change my beliefs to the right thing (assuming the belief is binary. I have many beliefs I consider to be only approximations, the best of any explanation I have heard so far. Not perfect, but "least wrong").

Which brings me to point 2.

2: "Also don't forget that your ability to manipulate your own morals is limited. Who you are is not necessarily who you wish you were."

I'm absolutely confused as to what this means. To me, a moral belief and a factual belief are approximately equal, at least internally (if I've been equivocating between the two, that's why). I know I can't alter my moral beliefs on a whim, but that's because I have no reason to want to. Consider self-modifying to want to murder innocents. I can't do this, primarily because I don't want to, and CAN'T want to for any conceivable reason (what reason does Gandhi have to take the murder pill if he doesn't get a million dollars?). I suppose promoting instrumental values to terminal values (which morals are) to enhance motivation is a possible reason, but that's an entirely different can of worms. If I wished I held certain moral beliefs, I would already have them. After all, morality is just saying "you should do X". So wishing I had a different morality is like saying "I wish I thought I should do X". What does that mean?

Not being who you wish to be is an issue of akrasia, not morality. I consider the two to be separate issues, with morality being an issue of beliefs and akrasia being an issue of motivation.

In short, I'm with you for the first line and two following paragraphs, and then you pull a conclusion out in the next paragraph that I disagree with. Clearly there's a discontinuity either in my reading or your writing.

Comment by Weedlayer on On Caring · 2014-10-09T17:07:11.144Z · LW · GW

What basis do you have for judging others' morality other than your own morality? And if you ARE using your own morality to judge theirs, aren't you really just checking for similarity to your own?

I mean, it's the same way with beliefs. I understand not everything I believe is true, and I thus understand intellectually that someone else might be more correct (or less wrong, if you will) than me. But in practice, when I'm evaluating others' beliefs, I basically judge them by how similar they are to my own. On a particularly contentious issue, I consider reevaluating my beliefs, which of course is more difficult and involved, but for simple judgement I just use comparison.

Which of course is similar to the argument people sometimes bring up about "moral progress", claiming that a random walk would look like progress if it ended up where we are now (that is, progress is defined as similarity to modern beliefs).

My question, though, is: how do you judge morality/behavior, if not through your own moral system? And if that is how you do it, how is your own morality not necessarily better?

Comment by Weedlayer on On Caring · 2014-10-09T16:48:04.146Z · LW · GW

Ah, I understand. Thanks for clearing up my confusion.

Comment by Weedlayer on On Caring · 2014-10-09T15:49:49.242Z · LW · GW

Ah, that's probably not what the parent meant then. What he was referring to was analogous to sharing your burden with the church community (or, in context, the effective altruism community).

Comment by Weedlayer on On Caring · 2014-10-09T15:44:22.256Z · LW · GW

So if my morality tells me that murdering innocent people is good, then that's not worse than whatever your moral system is?

I know it's possible to believe that (it was pretty much used as an example in my epistemology textbook for arguments against moral relativism), I just never figured anyone actually believed it.

Comment by Weedlayer on On Caring · 2014-10-09T12:40:33.495Z · LW · GW

It's also worth mentioning that cleaning birds after an oil spill isn't always even helpful. Some birds, like gulls and penguins, do pretty well. Others, like loons, tend to do poorly. Here are some articles concerning cleaning oiled birds.

http://www.npr.org/templates/story/story.php?storyId=127749940

http://news.discovery.com/animals/experts-kill-dont-clean-oiled-birds.htm

And I know that the oiled birds issue was only an example, but I just wanted to point out that this intervention, much like the "food and clothing aid to Africa" examples you often see, isn't necessarily a good idea even ignoring opportunity cost.

Comment by Weedlayer on On Caring · 2014-10-09T08:49:05.550Z · LW · GW

Obviously your mileage may vary, but I find it helps to imagine a stranger as someone else's family/friend. If I think of how much I care about people close to me, and imagine that that stranger has people who care about them as much as I care about my brother, then I find it easier to do things to help that person.

I guess you could say I don't really care about them, but rather about the feelings of caring that other people have towards them.

If that doesn't work, this is how I originally thought of it. If a stranger passed by me on the street and collapsed, I would care about their well-being (I know this empirically). I know nothing about them; I only care about them due to proximity. It offends me rationally that my sense of caring is utterly dependent on something as stupid as proximity, so I simply create a rule that says "If I would care about this person if they were here, I have to act like I care when they are somewhere else". Thus, utilitarianism (or something like it).

It's worth noting that another, equally valid rule would be "If I wouldn't care about someone if they were far away, there's no reason to care about them when they happen to be right here". I don't like that rule as much, but it does resolve what I see as an inconsistency.

Comment by Weedlayer on On Caring · 2014-10-09T08:32:50.186Z · LW · GW

I'm actually having difficulty understanding the sentiment "I get annoyed at those who think their morals are better than mine". I mean, I can understand not wanting other people to look down on you as a basic emotional reaction, but doesn't everyone think their morals are better than other people's?

That's the difference between morals and tastes. If I like chocolate ice cream and you like vanilla, then oh well. I don't really care and certainly don't think my tastes are better for anyone other than me. But if I think people should value the welfare of strangers and you don't, then of course I think my morality is better. Morals differ from tastes in that people believe that it's not just different, but WRONG to not follow them. If you remove that element from morality, what's left? The sentiment "I have these morals, but other people's morals are equally valid" sounds good, all egalitarian and such, but it doesn't make any sense to me. People judge the value of things through their moral system, and saying "System B is as good as System A, based on System A" is borderline nonsensical.

Also, as an aside, I think you should avoid rhetorical statements like "call me heartless if you like" if you're going to get this upset when someone actually does.

Comment by Weedlayer on On Caring · 2014-10-09T08:14:55.195Z · LW · GW

I'm not sure what point is being made here. Distributing burdens is a part of any group, why is religion exceptional here?

Comment by Weedlayer on The Puzzle of Faith and Belief · 2014-09-29T05:11:51.369Z · LW · GW

I have to say I didn't find this post particularly useful.

On my first reading, I was having some difficulty understanding what point you were making. You seem to use some words or phrases in highly non-standard ways; I still have no idea what sentences like "Its fairly easy to make sense on a mid-level" mean. I get the general impression of a post by someone whose first language isn't English, or who didn't proofread their own work, and that makes reading it a chore, not predisposing me to like it. Cleaning the post up, using simpler language, and progressing from one idea to the next in an obvious and logical fashion would make it much easier to read.

After reading it again, it seems like the point you're making is that very few people have justifications for their beliefs, and thus there's not a significant difference between having a religious belief because your guru told you, and having a secular belief because a scientist told you. In other words, physics is basically a religion unless you can do the math. The things you say in support of this argument (if indeed it resembles in any way what you are trying to say) are strange, to say the least. You seem to conflate the belief "If I eat I will not be hungry" with religious beliefs by calling them both "faith". This seems disingenuous, like comparing the probability of winning the lottery with that of not being struck by lightning, and concluding both are "uncertain".

Your post offers no real suggestions for overcoming our biases, which you assert all people have. You say you "eliminat[e] false beliefs, and replac[e] them with more empowering true beliefs", followed by what seems to be an advertisement for contracting some type of service (counseling?). In other words, you say "everyone has false beliefs, you included, so hire me to help fix them and be happier".

There may be something of value here, but right now I can't see it.

Comment by Weedlayer on Rationality Quotes March 2014 · 2014-03-10T06:55:56.326Z · LW · GW

This quote reminded me of a quote from an anime called Kaiji, though yours is much more succinct.

Normally, those people would never wake up from their fantasy worlds. They live meaningless lives. They waste their precious days over nothing. No matter how old they get, they'll continue to say, "My real life hasn't started yet. The real me is still asleep, so that's why my life is such garbage." They continue to tell themselves that. They continue. And they age. Then die. And on their deathbeds, they will finally realize: the life they lived was the real thing. People don't live provisional lives, nor do they die provisional deaths. That's a simple fact! The problem... is whether they realize that simple fact.

  • Yukio Tonegawa in Kaiji