A funny argument for traditional morality

post by cousin_it · 2011-07-12T21:25:25.597Z · LW · GW · Legacy · 70 comments

I just had a long conversation with my brother, a devout Christian. With my help he has outlined the following argument for why it might be good for me to follow Christian deontology:

  1. Many of my moral values arose from my upbringing, as opposed to biology. This is evidenced by the fact that biologically similar people living in different places and epochs have different ideas of what's right.
  2. Therefore many of my values originally came from the society that raised me.
  3. Society's values were strongly influenced by Christian values, and many of our core moral prohibitions are inherited from Christian tradition.
  4. The world is full of people who may want to edit my values ever-so-slightly while I'm not looking, in order to further their own agenda.
  5. Also my values may drift, and most drift is harmful from the perspective of my current values.
  6. A good recipe for countering this insidious deterioration of values is to consciously pull them back toward their original source, as long as it's something unchanging, like a book.
  7. That means editing my values to more closely match Christianity. QED.

What do you think?

70 comments

Comments sorted by top scores.

comment by Alicorn · 2011-07-12T21:36:42.707Z · LW(p) · GW(p)

I think this is an argument for having your values written down somewhere, and maybe even for getting them from a source that is not original to you, but I don't think it is a good reason to base your values on Christianity. The Bible itself does not closely match most modern people's values, is not internally consistent, and can be interpreted in a variety of ways.

Replies from: cousin_it, Dorikka, Vladimir_M
comment by cousin_it · 2011-07-12T22:58:53.164Z · LW(p) · GW(p)

Good answer, but people are paying too much attention to the last part of #6. Maybe I should've left it out. Instead of becoming a Biblical literalist (which is stupid as you correctly point out) the hero/ine could study the history of religious morality that influenced their upbringing and try to follow that.

comment by Dorikka · 2011-07-13T03:34:23.931Z · LW(p) · GW(p)

I think this is an argument for having your values written down somewhere

It's a bird...it's a plane...it's a Third Alternative!

comment by Vladimir_M · 2011-07-12T21:41:31.700Z · LW(p) · GW(p)

The Bible itself does not closely match most modern people's values, is not internally consistent, and can be interpreted in a variety of ways.

That argument is applicable only to sola scriptura Protestantism, and therefore not to the teachings of most Christian churches.

Replies from: Alicorn, drethelin
comment by Alicorn · 2011-07-12T21:44:30.778Z · LW(p) · GW(p)

See #6. The teachings of churches are not unchanging; documents such as the Bible are or at least can be.

Replies from: Vladimir_M
comment by Vladimir_M · 2011-07-12T22:05:37.824Z · LW(p) · GW(p)

Yes, but teachings of churches can also be stable for periods of time long enough to be relevant for this discussion (at least in principle). I don't know whether the original article was written with this in mind, but I understood #6 to refer to any such long-standing tradition. Clearly no religious group nowadays (or in the last couple of millennia, for that matter) espouses Biblical teachings without thick layers of traditional interpretation, whether they admit it or not. So insofar as the question is interesting at all, it should be asked about these traditional interpretations, not the raw Biblical text.

(Also, while documents can remain unchanged for arbitrary periods of time in the sense of containing the same series of writing symbols, their interpretations will inevitably change even if the greatest efforts are made to interpret them with maximal literalism or originalism. Consider, for example, that a text written in a living language will, in some centuries, become an archaic document undecipherable without special linguistic and historical training, which by the very nature of things requires some nontrivial interpretation to extract any meaning out of it. In this situation, I don't think it's meaningful to talk about the document remaining "unchanged" in any practically relevant sense.)

comment by drethelin · 2011-07-12T22:01:31.900Z · LW(p) · GW(p)

Once you can pick and choose between various churches, you open yourself up to exactly the same sort of drift this is designed to avoid.

Replies from: Vladimir_M
comment by Vladimir_M · 2011-07-12T22:22:12.150Z · LW(p) · GW(p)

Well, yes, clearly. But the original argument makes sense only assuming a unique and stable tradition that has determined the values you were brought up with. If this happens to be the tradition of some realistic Christian church (or Jewish denomination), chances are that the text of the Bible is only one element of this tradition -- it definitely doesn't imply the whole content of the tradition by itself, and it may well even contradict parts of it, or at least be harmonized only with strained interpretations. (All this even if an opposite pretense is maintained.)

To evaluate the argument from the original article accurately, it is necessary to have a realistic picture of what the tradition in question exactly consists of. It is mistaken to assume that the answer to that question is simply the text of the Bible.

comment by Bongo · 2011-07-13T17:23:17.095Z · LW(p) · GW(p)

Here's another, apolitical scenario about pulling back values:

Consider a world where there are cosmic rays that can hit an agent's brain, but home in on the part containing the utility function. Shielding from these rays is possible but expensive.

In this world, when an agent considers whether to invest in the expensive protection, it considers whether a version of it newly hit by a cosmic ray would remain loyal to the old version by continuing to maximize the old utility function (with some weight) alongside the new, cosmic-ray-begotten one.

Then, when an agent newly hit by a cosmic ray considers whether to remain loyal, it notices that if it doesn't, it's less likely to exist, since its predecessor would have invested in the protection; so it remains loyal.

Replies from: cousin_it, Nisan
comment by cousin_it · 2011-07-15T08:27:13.287Z · LW(p) · GW(p)

Interesting, thanks! I can't formalize your scenario because I don't understand it completely, but it looks like a game theory problem that should yield an equilibrium in mixed strategies, not unconditional loyalty.
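
If I had to sketch a formalization anyway, it might go like this (the game form, the perfect-prediction assumption, and every number below are made up for illustration; none of it is from Bongo's comment):

```python
# Toy model of the cosmic-ray scenario (all numbers are assumptions).
# The predecessor perfectly predicts the successor's loyalty weight l
# and shields iff shielding beats leaving an unshielded, weight-l-loyal
# successor behind.

P_RAY = 0.5         # chance a cosmic ray hits an unshielded brain (assumed)
SHIELD_COST = 0.25  # cost of shielding, in old-utility units (assumed)
EXIST_BONUS = 1.0   # successor's gain from simply existing (assumed)
LOYALTY_COST = 0.4  # what weight-l loyalty costs the successor (assumed)

def old_shields(l: float) -> bool:
    """Predecessor shields iff that beats an unshielded weight-l future."""
    u_shield = 1.0 - SHIELD_COST
    u_open = (1 - P_RAY) * 1.0 + P_RAY * l  # successor preserves fraction l
    return u_shield > u_open

def new_payoff(l: float) -> float:
    """Successor's expected payoff; it only exists if unshielded and hit."""
    exist_prob = 0.0 if old_shields(l) else P_RAY
    return exist_prob * (EXIST_BONUS - LOYALTY_COST * l)

# Scan loyalty weights for the successor's best choice.
grid = [i / 100 for i in range(101)]
best = max(grid, key=new_payoff)
print(f"best loyalty weight: {best}")  # 0.5 under these numbers
```

Under these made-up numbers the successor's optimum is the minimal loyalty weight that just dissuades the predecessor from shielding (0.5 here), i.e. partial rather than unconditional loyalty, which at least matches the intuition above.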

comment by Nisan · 2011-07-19T23:00:00.074Z · LW(p) · GW(p)

This assumes that the new agent prefers to have existed, and it's not clear to me that people ordinarily have such a preference.

Replies from: Bongo
comment by Bongo · 2011-07-20T09:56:45.368Z · LW(p) · GW(p)

This wasn't about people but about generic game-theoretic agents (and, all else equal, generic game-theoretic agents prefer to exist, because then there will be someone in the world with their utility function, exerting an influence on the world so as to make it rate higher in that utility function than it would have if there weren't anyone).

Replies from: Nisan
comment by Nisan · 2011-07-20T19:06:14.488Z · LW(p) · GW(p)

Ah, good point.

comment by MinibearRex · 2011-07-13T02:34:40.597Z · LW(p) · GW(p)

Major issue: Christian ethics aren't stable. Polygamy, genocide, and slavery were all perfectly normal parts of life at various times during the development of the modern Christian faith. Those practices are frowned upon currently, at least in polite company. While many Christian ideas have in some way shaped current moral beliefs, their direct influence is much smaller than it is usually given credit for.

And consider the CEV proposal. Early Christians thought that women and individuals of other races were not nearly as morally important as the males in their culture, but intelligent early Christians would probably have found the idea of gender and racial equality weird and moderately disconcerting, not terrible. It might be analogous to my telling you that a few centuries from now, it will be regarded as a horrible, immoral belief to hold that a human's life is any more important than a chimpanzee's (i.e. trolley problems with two chimpanzees vs. one human). That idea is weird and semi-disturbing, but it doesn't seem terrible.

Drift of your moral feelings is fine. Just make sure you put some thought into what sort of direction you want your drift to be in.

Replies from: Emile
comment by Emile · 2011-07-13T12:39:03.137Z · LW(p) · GW(p)

Early Christians thought that women and individuals of other races were not nearly as morally important as the males in their culture

Any source on the "other races" bit? That doesn't match what I've read of early Christian history, where, unlike Judaism, Christianity was universal. I agree that later on there was more "religiously sanctioned" racism (I don't know to what extent), but I don't think it was there in the early days.

Replies from: CaveJohnson
comment by CaveJohnson · 2011-09-23T11:58:42.292Z · LW(p) · GW(p)

Don't disturb popular confusions of some variants of American Christianity with Early Christianity. It ruins important narratives.

comment by Bongo · 2011-07-13T16:54:31.814Z · LW(p) · GW(p)

Check out this puzzle of mine as well.

You’re a Cthulhu cultist and believe that Cthulhu will reward you by eating you first iff you perform painful ritual self-mutilation in His honor when the moon is full and would do so even if you believed that Cthulhu does not exist. You know, of course, that Cthulhu is a perfect predictor of mortals’ actions, so He knows what you would do.

One day you’re persuaded by an atheist’s arguments that Cthulhu doesn’t actually exist.

That night the moon is full. Do you perform the ritual?

comment by Wei Dai (Wei_Dai) · 2011-07-13T02:15:11.524Z · LW(p) · GW(p)

Counterargument:

  1. At least some of your values arose from your biology.
  2. Probably, among those "biological" values is the value of allowing your other values to be edited by others around you. This is evidenced by the fact that all humans allow their values to be edited by others around them, unless they've already undergone extreme amounts of indoctrination (and probably even then).
  3. Since you have not (I guess) undergone extreme amounts of indoctrination, there's no reason to expect that you no longer have that value.
Replies from: Emile
comment by Emile · 2011-07-13T09:56:05.643Z · LW(p) · GW(p)

I find calling "allowing your other values to be edited by others" a value a bit forced - it's a feature of human brains, but if I were to model my mind as an agent with values, I'm not sure I'd list that among the values.

Also, there are cases where everybody here (probably) agrees we don't want our values to be changed by others (advertisers trying to get us to value the awesome image of driving a Mercedes Benz!), and cases where most people here would agree we want our values to be flexible and may adopt new ones (say, when thinking about how to deal with completely new and weird cases that aren't covered by our pre-existing values or by Christian tradition, like brain uploads or duplication machines or encountering weird aliens). I think most of the discussion is about where exactly to draw the line.

Replies from: Wei_Dai
comment by Wei Dai (Wei_Dai) · 2011-07-13T17:07:06.546Z · LW(p) · GW(p)

I find calling "allowing your other values to be edited by others" a value a bit forced - it's a feature of human brains, but if I were to model my mind as an agent with values, I'm not sure I'd list that among the values.

But it's not simply a hardwired feature either, since if you gave most people the option of self-modifying away that feature, they wouldn't accept. Perhaps another way to think about it is that we value our humanity (humanness) and allowing our values to be changed (to some extent) by others around us is a part of that.

comment by wedrifid · 2011-07-13T01:07:36.782Z · LW(p) · GW(p)

as long as it's something unchanging, like a book.

People don't get their morals from books (much). Christians included.

comment by Manfred · 2011-07-13T05:18:55.799Z · LW(p) · GW(p)

The trouble is that there are multiple meanings of "moral values" here. There is the human instantiation, and there is the ideal-decision-agent instantiation. The ideal-decision-agent instantiation is used in #5 and a bit in #4. The human instantiation is used elsewhere.

Though usually these are pretty close and the approximation is useful, it can also run into trouble when you're talking specifically about things humans do that ideal decision agents don't do, and this is one of those things.

Specifically, #5 doesn't necessarily work for human values, since we're so inconsistent. People can go into isolation and just think, and come out with different human values. How weird is that?!

Replies from: Perplexed, cousin_it
comment by Perplexed · 2011-07-13T14:23:29.722Z · LW(p) · GW(p)

I think you are right to call attention to the issue of drift.

Drift is bad in a simple value - at least in agents that consider temporal consistency to be a component of rationality. But drift can be acceptable in those 'values' which are valued precisely because they are conventions.

It is not necessarily bad for a teen-age subculture if their aesthetic values (on makeup, piercing, and hair) drift. As long as they don't drift too fast so that nobody knows what to aim for.

Replies from: timtyler
comment by timtyler · 2011-07-13T21:13:11.567Z · LW(p) · GW(p)

It is not necessarily bad for a teen-age subculture if their aesthetic values (on makeup, piercing, and hair) drift. As long as they don't drift too fast so that nobody knows what to aim for.

Those are instrumental values. Nobody cares very much if those change, because they were just a means to an end in the first place.

Replies from: Perplexed
comment by Perplexed · 2011-07-15T14:50:01.914Z · LW(p) · GW(p)

My position here is roughly that all 'moral' values are instrumental in this sense. They are ways of coordinating so that people don't step on each other's toes.

Not sure I completely believe that, but it is the theory I am trying on at the moment. :)

Replies from: timtyler
comment by timtyler · 2011-07-15T15:25:45.586Z · LW(p) · GW(p)

Right - but there are surely also ultimate values.

Those are the ones that are expected to be resistant to change.

It can't be instrumental values all the way down.

Replies from: Perplexed
comment by Perplexed · 2011-07-16T19:05:33.473Z · LW(p) · GW(p)

Right - but there are surely also ultimate values.

Those are the ones that are expected to be resistant to change.

Correct. My current claim is that almost all of our moral values are instrumental, and thus subject to change as society evolves. And I find the source of our moral values in an egoism which is made more effective by reciprocity and social convention.

Replies from: timtyler
comment by timtyler · 2011-07-21T20:19:01.115Z · LW(p) · GW(p)

I think these guys have a point. So, from my perspective, Egoism is badly named.

comment by cousin_it · 2011-07-13T12:19:31.745Z · LW(p) · GW(p)

I mostly agree, but the argument still works if you throw out #5 altogether.

Replies from: Manfred
comment by Manfred · 2011-07-13T17:55:28.431Z · LW(p) · GW(p)

#5 is the only is-ought link in the chain. Seems pretty integral to me.

Replies from: cousin_it
comment by cousin_it · 2011-07-13T18:53:24.799Z · LW(p) · GW(p)

I thought #4 and #5 were parallel, with #4 a bit stronger than #5.

Replies from: Manfred
comment by Manfred · 2011-07-13T20:17:19.494Z · LW(p) · GW(p)

But that's only an "is" statement. To think "and that's guaranteed to be bad" at the end of #4 is to assume #5.

comment by TheOtherDave · 2011-07-13T16:57:08.266Z · LW(p) · GW(p)

Mostly, I think my ability to evaluate this argument is distorted beyond reliability by including the word "Christianity." So, first, let me try and taboo that word and generate a more generalized version of the argument:

1/2. My values are primarily learned from society, not innate.
3. There exists some tradition from which society's values were primarily derived.
4/5. My values, once learned, can be later modified. Most such modifications are harmful from the perspective of those values.
6. Those (harmful) modifications can be countered by reverting to a reliable specification of the tradition from which society's values were primarily derived.
7. Therefore doing so is beneficial.

That's far from airtight, but it's not altogether unreasonable.

My major problem with it is that, once my values drift, there's no obvious reason why I should prefer my original values... so it's really more of an argument for continuing to follow that specification rather than for choosing to in the first place. But admittedly moral drift in the real world is usually not an all-or-nothing thing (that is, I can continue to mostly hold onto my core values even while I'm in the process of drifting away from them), so that's not a debilitating problem.

So yeah, I would accept a weak probabilistic form of the conclusion: all else being equal, it's usually beneficial for me to follow a reliable specification of yadda yadda.

And, sure, it follows from that that if there exists something called Christianity which is a reliable specification of yadda yadda, all else being equal, it's usually beneficial for me to follow Christianity.

That said, I've never seen anything that matches that description in the real world.

comment by Morendil · 2011-07-13T12:23:19.634Z · LW(p) · GW(p)

When scrutinizing an argument, one good heuristic is to focus on vague words like "many" and aim for a more robust version. The argument has several such words: "many" in #1, "strongly" and "many" in #3, "full of" in #4, "most" in #5.

For instance, does "many of my moral values" stand for 1%, 10%, 50%, 90% or 99% of your values in that argument? How strong an impression does the argument make on you depending on which of these rough quantifiers you substitute for "many"? (Subsidiary question - when talking about "your values", how many distinct things are we talking about?)

The "nature" vs "nurture" debate is just as vigorous in the philosophy of morals as it is regarding intelligence, so any of the above answers is probably held by some fraction of the population to be at least defensible.

Even the phrase "Christian values" is a potentially slippery one, allowing much shifting of goal posts. So the argument should probably start by unpacking that phrase into a more definite list. (Do we mean the Ten Commandments? Of these I recognize about three as describing "my values", which would falsify #3 as far as I'm concerned.)

Similarly you would be on a better footing to evaluate this argument if you had, as Alicorn suggests, an explicit list of what you think your values are.

comment by Peterdjones · 2011-07-13T17:44:28.813Z · LW(p) · GW(p)

The world is full of people who may want to edit my values ever-so-slightly while I'm not looking, in order to further their own agenda. Also my values may drift, and most drift is harmful from the perspective of my current values. A good recipe for countering this insidious deterioration of values is to consciously pull them back toward their original source, as long as it's something unchanging, like a book.

The argument assumes change is necessarily for the worse. People can acquire new values whilst seeing them as an improvement. If it is possible to meta-evaluate values this way, then you should seek to improve your values any way you can. If it is not possible to meta-evaluate values, then you don't know that your existing values are optimal or any good at all. Keeping them out of sheer conservatism is not rational. Although one could make a half-hearted wisdom-of-the-crowds argument.

Replies from: None
comment by [deleted] · 2011-09-23T11:55:18.297Z · LW(p) · GW(p)

If it is possible to meta-evaluate values this way, then you should seek to improve your values any way you can. If it is not possible to meta-evaluate values, then you don't know that your existing values are optimal or any good at all.

Can you expand on what you mean by meta-evaluate? I can understand analysing values by a framework that has nothing to do with my values. But why would I try to maximise some metric employed by this framework? I'm confused as to why I don't end up just following my values.

comment by SilasBarta · 2011-07-13T15:19:00.864Z · LW(p) · GW(p)

The world is full of people who may want to edit my values ever-so-slightly while I'm not looking, in order to further their own agenda.

Hm, at this point it sounds similar to the point Phil Goetz was making in "Reason as memetic immune disorder".

comment by DanielLC · 2011-07-12T23:33:58.876Z · LW(p) · GW(p)

There is moral error and there is moral disagreement. If your values change because of moral disagreement between your past and future self, that is something you'd want to prevent. If, however, you are simply correcting your moral error, this should be encouraged. In this case, your future self is acting more morally than you are by your current belief system, since he understands it better.

I think most of my change in morality will be due to correcting moral error. As such, in a matter of dispute, I trust my future self over my present self.

comment by Desrtopa · 2011-07-13T00:48:07.793Z · LW(p) · GW(p)

As Alicorn says, provided you are averse to values drift, this is an argument towards writing your values down and using that as a periodic anchor. Not only is it not clear what Christianity's values actually are (witness the tremendous proliferation of interpretations among Christians, in outright defiance of other interpretations), but making this change itself constitutes a shift in your values to satisfy someone else's agenda, in ways that are harmful with respect to your current values.

comment by torekp · 2011-07-12T23:16:48.433Z · LW(p) · GW(p)

The argument has merit, but the conclusion (7) needs to be replaced with something more appropriate in light of (3). You should edit your values to more closely match an amalgam of the many influences that affected you. Or better yet, as Alicorn says, have your values written down somewhere. Including your acceptance of rational change - which puts an interesting twist on the whole deal.

Replies from: cousin_it
comment by cousin_it · 2011-07-12T23:21:03.131Z · LW(p) · GW(p)

Agreed. Also, TDT adds another interesting twist. If I always do whatever I would've precommitted to doing earlier, how far should I roll back my morality once I notice the argument?

comment by Maurus · 2011-07-14T21:21:02.664Z · LW(p) · GW(p)

Can anyone explain why, in a rapidly changing world, we need "absolute" and "eternal" morality?

Replies from: cousin_it
comment by cousin_it · 2011-07-15T08:23:33.489Z · LW(p) · GW(p)

We don't. The argument in the post is trying to solve the problem of protecting whatever current values you happen to have (and the values that your past selves happened to have, if you care about them), not the problem of finding absolute eternal values (whatever that means).

Replies from: TheOtherDave
comment by TheOtherDave · 2011-07-19T18:44:42.834Z · LW(p) · GW(p)

My understanding of "eternal values" is precisely the thing I get if I solve the problem of protecting my current values in a sufficiently general way: a set of values that does not change over time. This is in the same sense that solving the "don't die today" problem in a sufficiently general way provides eternal life.

comment by [deleted] · 2011-07-13T15:53:24.169Z · LW(p) · GW(p)

First of all, this may be an attempt to change your value system and I want you to bear that in mind while reading this post.

1: Seems to be a statement of fact which there is a lot of evidence for and I don't have a problem with.

2: Seems to be a reasonable conclusion from 1.

3: Seems to be a conclusion that, for people living in the United States, has SOME evidence backing it, but there do exist counterarguments against that, such as here. But rather than sidetracking this, I'll just link a Google search and let you draw your own conclusions.

4: I feel like one interpretation of #4 includes your brother himself, you, me, and Eliezer Yudkowsky. I mean, I'm actively trying to convince you of something when making this post. If I don't explicitly caveat myself with "This may be an attempt to change your value system and I want you to bear that in mind while reading this post", then you might fairly consider the attempt to be "while I'm not looking."

5: This isn't guaranteed to be true for all people, although it may be true for you. It is entirely possible to get into a value system that you DON'T like and would like to drift out of. Depression is a good example of this, although there may be others. He does account for this by saying "most", but see my comment on #6.

6: As an example of "while I'm not looking": calling the drift in values "insidious deterioration" may affect your perception of the argument on a subtle level. Was the claim that drift in values is "insidious deterioration" ever backed up? Some types of value changes feel like that to me, but others don't. Some types of value changes feel more like a lightbulb of enlightenment going off. I feel like the argument is trying to fast-talk me here. "Most drift is harmful, so YOUR drift is insidious deterioration" feels like it is present in the argument as an underlying assumption and is not being sufficiently backed up.

7: Considering what I discussed about 3-6, I don't think I agree with the QED.

comment by Perplexed · 2011-07-13T14:08:26.841Z · LW(p) · GW(p)

I think the argument is interesting and partly valid. Explaining which part I like will take some setup.

Many of our problems thinking about morality, I think, arise from a failure to make a distinction between two different things.

  • Morality in daily life
  • Morality as an ideal

Morality of daily life is a social convention. It serves its societal and personal (egoistically prudent) function precisely because it is a (mostly) shared convention. Almost any reasonable moral code, if common knowledge, is better than no common code.

Morality as an ideal is the morality-of-daily-life toward which moral reformers should be trying to slowly shift their societies. A wise person will interpolate their behavior between the local morality-of-daily-life and their own morality-as-an-ideal. And probably closer to the local norm than to the personal ideal.

So, with that said, I think that your Christian friend's argument is right-on wrt morality-of-daily-life. But it is inapplicable, IMHO, to morality-as-an-ideal.

ETA: I notice, after writing, that Manfred said something very similar.

comment by prase · 2011-07-13T11:02:41.031Z · LW(p) · GW(p)

Isn't there a hidden problem with values taking other values as their arguments? If I can value having a particular value, I can possibly construct self-referential paradoxical values like V = value of not having value V, and empty values like V = value of having value V. A value system including value W = value of preserving other values, W included, seems to be of that kind, and I am not sure whether it can be transformed into a consistent decision algorithm.

On the other hand, look at what happens if we avoid self-reference by introducing distinct types of values, where the 0th-order values can speak only about external world states and n-th-order meta-values can speak only about the external world and m<n-th-order values. I assume that the Christian (or more generally any traditional) values all belong to the 0th order. Let's also assume that we have a 1st-order meta-value of preserving all 0th-order values. But what prevents this meta-value from drifting away? It seems that without an infinite chain of meta-values there is no way to prevent drift. Can people hold an infinite chain of meta-values?

Now, let's leave the possibility of the value-preservation meta-value drifting aside, and look at how this meta-value is exactly implemented. Does it say "try to minimise the integral of Abs(d value/dt)", or does it say "try to minimise the integral of Abs(value(t)-value(t0)), where t0 is some arbitrary instant"? If the former is the case, the meta-value doesn't encourage you to periodically revise your values with respect to some written stable standard. If an infinitesimal drift accidentally slips through your value-preservation guards, once the guards wake up they shall protect the new value set, not try to return to the original one. Only if the latter is the case does the funny argument hold water. But I suspect that the former description is more accurate, evidenced by how few people strive to return to their past values.
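
In symbols (the notation is mine; write v(t) for the value set at time t, treating it as one-dimensional for simplicity), the two candidate meta-values are:

```latex
% Two candidate readings of the value-preservation meta-value.
\[
  J_{\mathrm{local}} = \int_{t_0}^{T} \left| \frac{dv}{dt} \right| dt
  \qquad \text{versus} \qquad
  J_{\mathrm{anchor}} = \int_{t_0}^{T} \left| v(t) - v(t_0) \right| dt
\]
% After an accidental slip of size eps at time t_1, J_local is minimized by
% freezing the *new* value (a return trip would cost another eps of total
% variation), while J_anchor is minimized by moving back to v(t_0). Only the
% anchored version supports the "pull back to the source" recipe.
```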

Replies from: cousin_it
comment by cousin_it · 2011-07-13T12:13:22.661Z · LW(p) · GW(p)

I expect that in most cases having an object-level value V should make you behave as if you were also protecting the meta-value V' of your valuing V, because your switching away from V can be detrimental to V. Also see my reply to torekp for a reason why you might want to return to your past values even though they don't coincide with the current ones.

Replies from: prase
comment by prase · 2011-07-13T13:19:56.176Z · LW(p) · GW(p)

It certainly seems that valuing V implies valuing valuing V, and valuing^3 V, and so on. But if we try to formalise the notion of value a bit, doesn't it produce some unexpected paradoxes? I haven't thought about it in detail, so perhaps there is no problem, but I am not convinced.

I don't understand the relevance of your reply to torekp.

Replies from: cousin_it
comment by cousin_it · 2011-07-13T14:36:18.957Z · LW(p) · GW(p)

I don't see how paradoxes could arise. For example, if you have a value V of having value V, that's a perfectly well-defined function on future states of the world and you know what to do to maximize it. (You can remove explicit self-reference by using quining, aka the diagonal lemma.) Likewise for the value W of not having value W. The actions of an agent having such a value will be pretty bizarre, but Bayesian-rational.
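
As a toy illustration of the quining trick (the sketch and every name in it are mine, just to show the construction goes through; the "world state" is shrunk down to the source code the agent runs):

```python
# Build a value function that assigns utility 1 exactly when the world
# still contains that very value function: "value V of having value V".
# Self-reference is removed quine-style: the source carries a template
# of itself and re-derives its own text on demand.

def self_valuing_source() -> str:
    template = (
        "TEMPLATE = {t!r}\n"
        "def value(world_source):\n"
        "    # Utility 1 iff the world equals this exact value function.\n"
        "    return 1.0 if world_source == TEMPLATE.format(t=TEMPLATE) else 0.0\n"
    )
    return template.format(t=template)

source = self_valuing_source()
namespace = {}
exec(source, namespace)               # materialize the value function
print(namespace["value"](source))     # 1.0: the world still "has value V"
print(namespace["value"]("drifted"))  # 0.0: the value has been edited away
```

No paradox appears: the function is total and well-defined on every input; it just mentions itself via its own quoted source.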

I don't understand the relevance of your reply to torekp.

It shows a possible reason why you might want to return to your past values once you approach TDT-ish reflective consistency, even if you don't want that in your current state. I'm not sure it's correct, though.

comment by mutterc · 2011-07-13T00:57:05.928Z · LW(p) · GW(p)

drift is harmful from the perspective of my current values

True. And that drift would be beneficial from the perspective of your new, drifted-to values.

But neither of those statements has any bearing on whether value drift (in general or any specific instance thereof) is good or bad.

Replies from: prase
comment by prase · 2011-07-13T09:23:54.336Z · LW(p) · GW(p)

It's good measured by the new values and bad measured by the old ones. What other standards of goodness do we have at our disposal in this problem?

Replies from: mutterc
comment by mutterc · 2011-07-16T13:54:19.470Z · LW(p) · GW(p)

Good question :-)

We could measure them against some non-relativist ethical standard, like "US Dollars lost", "lives lost", "person-weeks stuck in traffic" or somesuch.

comment by Alexei · 2011-07-12T22:20:07.709Z · LW(p) · GW(p)

All values, wherever they come from, need to be re-examined on their own merit. At one point slavery was thought to be acceptable by a lot of people. If you grew up in that society, you would probably inherit that belief as well. There are very likely similar beliefs that you hold right now, given to you by some source you find credible (society, the Bible, or LW), that you might be better off not having. That's why you need to examine every single belief that you have independently of its source. You can't assume they are automatically correct because they fit a moral framework, unless you are also absolutely certain that moral framework is right. And proving that a moral framework is right requires a LOT of evidence.

More specifically:

3) What about all the other people in this world? Much of their morality is based on principles that have little to do with Christianity or the Bible per se. Are they automatically wrong? Automatically right? What makes Christianity special? If you were in their shoes, you could run this step-by-step argument for Buddhism or Islam or in fact any other moral framework.

4) Just because it furthers their own agenda doesn't mean it won't benefit you.

5) Consistency for the sake of consistency is bad.

6) The book can be interpreted in a multitude of ways. In fact, every time you read the Bible, or any other book, your interpretation will be slightly different.

Replies from: Emile, cousin_it
comment by Emile · 2011-07-13T13:37:09.889Z · LW(p) · GW(p)

Consistency for the sake of consistency is bad.

I disagree - it's neutral at worst.

There are some advantages if everybody in a society has similar social norms: obviously driving on a certain side of the road, but also things like whether tipping in restaurants is "morally required" or not (it is in the US, it isn't in France; in both cases the pay of servers is adjusted accordingly).

Or if you're talking about consistency at the individual level, having consistent values makes one slightly more predictable, and the expected correctness of a set of consistent values is probably slightly higher than the expected correctness of a set of inconsistent values.

In general, most people are more comfortable living in a society that shares their values, so I'd say "consistency for the sake of consistency" is generally slightly good, to be of course balanced with other things.

Replies from: Alexei, MixedNuts
comment by Alexei · 2011-07-13T17:27:54.728Z · LW(p) · GW(p)

Good point. I was talking about consistency on the individual level, and overall it's probably at least mildly beneficial.

comment by MixedNuts · 2011-07-13T14:16:33.395Z · LW(p) · GW(p)

Warning to foreigners: tipping in restaurants is morally required in France. It's a tiny tip, about two euros, but not tipping still makes you a very rude and bad person who defects.

Replies from: Emile
comment by Emile · 2011-07-13T14:21:53.290Z · LW(p) · GW(p)

Is it? I get the impression that it's expected in cafés, but "more optional" in restaurants (it probably also depends on the restaurant). Some quick googling seems to agree (except for the restaurant/café difference; maybe my pattern matching on the behavior of others is overactive).

Replies from: MixedNuts
comment by MixedNuts · 2011-07-13T14:56:47.967Z · LW(p) · GW(p)

The website says it's a Paris thing, which sounds plausible. I wouldn't know about nice restaurants. I'm pretty sure it's expected in regular (for some value thereof) restaurants, at least in Paris: French television movies always show waiters getting mad at customers who don't tip, and my parents (who are stingy with tips) always tip in restaurants. My intuition says that tipping in restaurants is even more important than in cafés, but I don't know why - maybe just because the tip is bigger?

(Remember the second Paris meetup, where I made an ass of myself by complaining I didn't have enough money? I added a few coins to the pile when we left anyway. Not tipping is a mortal sin.)

comment by cousin_it · 2011-07-12T23:12:45.478Z · LW(p) · GW(p)

The post assumed that protecting the morality you happen to have is a worthwhile goal, which is orthogonal to the problem of finding the "right" morality that your comment is trying to address.

comment by SeanTheMystic · 2011-07-13T16:46:24.047Z · LW(p) · GW(p)

Your brother makes an excellent point, which is something I’ve always found amusing about atheists: despite their professed godlessness, their values have an uncanny resemblance to those of their Judeo-Christian heritage. It’s extremely difficult to find an atheist who takes his/her atheism seriously and is willing to really think “outside the book” about what kinds of morality are possible and act upon those ideas. Very few can resist the moral inertia of their culture; those who do are generally thought of as “evil”.

As an example, I have no trouble imagining a scientific civilization that is brutally unegalitarian, aggressive, atheistic, polygamous, Eugenicist, Darwinian, etc. in its values; in fact this would be my preference. As another example, I've always found the Star Trek Mirror Universe more appealing and more human than the normal one. But to espouse such ideas in a society dominated by Judeo-Christian values is to be labeled a “Nazi,” etc. We in the West are mentally colonized by these values from day one, even (or perhaps especially) here on this forum of rationalist moralists, and it is very difficult to find truly free thinkers who are willing to challenge the prevailing ethos.

Replies from: ArisKatsaris, Maurus, Peterdjones
comment by ArisKatsaris · 2011-07-15T12:14:31.993Z · LW(p) · GW(p)

I can likewise imagine (and could prefer) a scientific civilization that is freely polyamorous, atheistic, Eugenicist, etc.

But "brutally unegalitarian and aggressive"? Why in the seven hells would I prefer to live in such a horrid place? Historical precedent indicates that the more unegalitarian the society the most horrid it is for the majority of its people. Aggressiveness is even more likely to lead to a horrible society. My limited personal experience confirms (my one-year military service being the the worst sub-society I've been in).

Perhaps when you imagine such a society you imagine yourself being the boot, not the face it crushes forever? To evaluate it properly you need to imagine both, giving weight according to the percentage of the crushers vs the crushees.

Replies from: AlexM
comment by AlexM · 2011-07-16T20:22:23.191Z · LW(p) · GW(p)

I can likewise imagine (and could prefer) a scientific civilization that is freely polyamorous, atheistic, Eugenicist, etc.

A scientific civilization that actually understands the science of biology would steer clear of eugenics.

For purely pragmatic reasons: breeding better (whatever value of "better" you choose) humans would take at least several centuries - and the problem is that you do not know what traits would be needed then.

Here is one actual historical example of human breeding: had Frederick II and other kings of Prussia continued the work, Germany could well have had a race of eight-foot-tall soldiers - just in time for WWI.

comment by Maurus · 2011-07-14T21:47:37.342Z · LW(p) · GW(p)

Your brother makes an excellent point, which is something I’ve always found amusing about atheists: despite their professed godlessness, their values have an uncanny resemblance to those of their Judeo-Christian heritage.

Not many atheists and secularists are willing to turn the other cheek when someone hits them, refrain from looking lustfully at the opposite sex, give everything to the poor, and care not for the future. Neither are many Christians.

As an example, I have no trouble imagining a scientific civilization that is brutally unegalitarian, aggressive, atheistic, polygamous, Eugenicist, Darwinian, etc. in its values; in fact this would be my preference.

Look at The Domination. This would fit your vision better than the Sith.

comment by Peterdjones · 2011-07-13T17:48:06.701Z · LW(p) · GW(p)

Or maybe no one has a sufficient thirst for rebellion-for-rebellion's-sake to be attracted by your vision of society.

Replies from: SeanTheMystic
comment by SeanTheMystic · 2011-07-13T19:01:59.881Z · LW(p) · GW(p)

Not for rebellion's sake so much as for the sake of survival and power. As one who enjoys the benefits of modernity, I find it perplexing that modernity has become something that is, from any rational Darwinian perspective, a form of cultural suicide. What I'm looking for is a civilization whose prophets have more resemblance to Nietzsche than to Jesus or his secular disciple, Marx. I just don't see much future for non-Nietzschean modernity; it is a dying culture of empty cathedrals and hospital-tombs. If secular people continue to abort themselves into extinction, the future will belong to the fundamentalists. I see nothing particularly rational about this!

Replies from: Eugine_Nier, ArisKatsaris, CharlieSheen, Maurus
comment by Eugine_Nier · 2011-07-16T22:38:39.239Z · LW(p) · GW(p)

I would like to point out that it was a Judeo-Christian culture that developed modernity in the first place.

comment by ArisKatsaris · 2011-07-15T12:21:00.390Z · LW(p) · GW(p)

Which modern day community would you say comes closest to your hopeful vision of a future civilization?

Somalia seems to fit your criteria of polygamy, aggressiveness, inequality, etc.

Admittedly, it's not atheistic or eugenic, but do the above-listed elements contribute positively in your evaluation of Somalia as opposed to e.g. Sweden, which is less aggressive and more egalitarian?

comment by CharlieSheen · 2011-07-14T19:48:01.501Z · LW(p) · GW(p)

Not for rebellion's sake so much as for the sake of survival and power. As one who enjoys the benefits of modernity, I find it perplexing that modernity has become something that is, from any rational Darwinian perspective, a form of cultural suicide. What I'm looking for is a civilization whose prophets have more resemblance to Nietzsche than to Jesus or his secular disciple, Marx. I just don't see much future for non-Nietzschean modernity; it is a dying culture of empty cathedrals and hospital-tombs. If secular people continue to abort themselves into extinction, the future will belong to the fundamentalists. I see nothing particularly rational about this!

Strawman has a point. The Gods of human rights are cruel and inhuman (not the same as inhumane); they demand a great sacrifice: a sacrifice of genetic legacy, on a scale that wipes out its carriers, at least all their human carriers. The only way they can keep relevance is for us to spread their worship to every corner of the world.

Remember how infectious diseases are just "kind" enough to the host to ensure their own survival? With better hygiene, less lethal strains of cholera prevail. Mass media, the internet, high literacy and the support of powerful states ... I think this qualifies as poor hygiene for the outdated biological tribal brain exposed to all the (occasionally toxic!) memes of 7 billion other vulnerable brains. The values can drift to be more genetically lethal to humans, since this doesn't impede their spread. If this line of thinking is a valid analogy then, considering current trends, what we see today may be only the beginning of their ever-growing maladaptivity for the hosts.

comment by Maurus · 2011-07-14T21:49:16.406Z · LW(p) · GW(p)

Following a syphilitic madman as prophet and divine being? That has been tried before :-P