comment by Gordon Seidoh Worley (gworley) · 2018-05-23T21:39:56.174Z · LW(p) · GW(p)
Your initial reasoning seems sound, but your proposal seems entirely unrealistic. You can't control the environment enough to remove sources of status to make this stick. It's easy to create small places and award status within them, but making that status global is basically impossible unless you can control all aspects of social reality. Otherwise you always have additional avenues along which to gain status, since every person chooses how much status to grant another person, even if that choice is heavily influenced by information from other people about how much status they think people should have. Consider, as a counterexample, that no person today has globally maximal status along all dimensions: there are just too many dimensions and too many people. Even someone with high status along many dimensions, say the Queen of England, lacks high status along many others, where insiders would consider her low status, say in playing competitive DOTA 2 or solving open problems in mathematics.
↑ comment by [deleted] · 2018-05-25T16:24:55.070Z · LW(p) · GW(p)
Otherwise you always have additional avenues along which to gain status since every person is making a choice about how much status they consider a person to have even if it is heavily influenced by information they get from other people about how much status they think people should have.
True. But you still seem to implicitly assume people are maximizers, i.e. that they will capitalize on these opportunities.
But okay, let's grant that there will be differences. What if we ensured a minimum? Would that be enough?
Here's one data point: I no longer feel a strong longing for status, implying that there is indeed a threshold beyond which people are mostly fine. This contradicts my assumption that people want the maximum. Maybe they just want to reach an absolute threshold of social capital.
comment by MatthewHawkchurch · 2018-07-28T01:45:17.655Z · LW(p) · GW(p)
This seems like it's missing the point.
1) What would be the rewards from League membership, aside from status and respect? "The league would be an island within which people would be freed from these pressures to do whatever they want." --> Are non-League community members now obligated to give them resources? Or just to let them blog about whatever they want? (Can't they do that already?)
2) There's substantial precedent for great thinkers running off the rails over time, and thus not meriting sacrosanct status as great thinkers. This seems like it would lead directly to "X has thought of some really good stuff and proved themselves by passing through the Rational Ring of Fire, so we're going to respect them and assume all their ideas are good even when they seem kind of nuts". That's bad! That's bad rationality!
I like the emphasis on impact - the point of (instrumental) rationality is to use it to do things. Rewarding impact and assigning status to doing impactful things is good, as is inventing ways to systematize this. (I just read "Give praise" [LW · GW] and agree with it.)
comment by habryka (habryka4) · 2018-05-23T19:53:23.600Z · LW(p) · GW(p)
Academic Tenure tries to do a bunch of things in this space, as did the original Royal Society and the way it accepted members.
comment by SurvivalBias (alex_lw) · 2018-05-24T01:40:45.964Z · LW(p) · GW(p)
"Once you’re a member, you’re recognized as having proven your worth. It means that you’re deserving of the highest respect. Within the league, everyone has the maximum amount of social capital. No more or less. If you’re a member, you’re like a sibling to everyone else." - that part sounds like a great explanation of how small groups of very close friends function, where everyone really is like a sibling to everyone else and there's effectively no struggle for status (speaking from personal experience here). But it seems really doubtful that this can be achieved in a group of any significant size, as long as we're dealing with humans.
↑ comment by ialdabaoth · 2018-05-25T02:07:15.819Z · LW(p) · GW(p)
Yeah. Also, I've been actively kicked out of too many groups of close friends that I personally formed with my own agency and initiative.
"Once you're in, you're in for life" just doesn't work.
↑ comment by [deleted] · 2018-05-25T16:08:41.884Z · LW(p) · GW(p)
That's a horribly depraved thing to do. I wasn't even accounting for environments that are that low-trust. Those just can't work; it's a non-starter. If this really is the kind of thing you're dealing with, and I'm the exception rather than you, we should think about increasing trust in other ways.
Or (excuse me) you should move out of the US.
comment by Dagon · 2018-05-23T18:18:08.565Z · LW(p) · GW(p)
Do you have a description of your model/measurement for what status is? In my experience and thinking, it's not a scalar and not constant; it's more of a weighting of edges on a relationship graph between an individual and the MANY combinations of (sub)groups and interaction topics that the individual has.
ETA: Status is neither awarded nor received. Perception of status is a side-effect (or maybe a cognitive summary) of much more complicated interpersonal feelings and habits.
I have a better policy than "programming social reality": recognize that it's not reality. Play whatever games give you pleasure, but measure yourself rather than believing others' measures of you. To the extent that you're satisfied being a satisficer (heh), don't pick relative or changing measures for your satisfaction levels. Pick absolute values and just be happy. I don't think this is all that much more likely for any individual to choose than your recommendation, but it has the advantage that it's unilateral and doesn't require anyone else to cooperate.
Replies from: None↑ comment by [deleted] · 2018-05-25T16:44:47.198Z · LW(p) · GW(p)
Surely if you go down to the nuts and bolts of it, you get a graph with a "willingness to help" function from People x People -> R. And then you could break this down even further by adding "Time" and "Modality" to the domain, and so on...
But what I'm interested in is increasing the feeling of status, or more precisely, minimizing the felt lack of status. I do expect that variable to be a scalar. How reality maps to this scalar is an interesting question.
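To make this concrete, here is a minimal toy sketch of the model described above: a "willingness to help" weighting over pairs of people, collapsed into one felt-status scalar per person and compared against an absolute threshold. All names, numbers, the mean-of-incoming-edges aggregation rule, and the threshold value are illustrative assumptions, not anything established in the thread.

```python
people = ["ann", "bo", "cy"]

# willingness[(helper, helpee)] in [0, 1]: how willing `helper` is to
# help `helpee`. This is the People x People -> R edge weighting.
willingness = {
    ("ann", "bo"): 0.9, ("ann", "cy"): 0.2,
    ("bo", "ann"): 0.8, ("bo", "cy"): 0.3,
    ("cy", "ann"): 0.1, ("cy", "bo"): 0.3,
}

def felt_status(person, w):
    """Collapse incoming edges to one scalar: the mean willingness of
    everyone else to help `person` (one simple choice of mapping)."""
    incoming = [w[(helper, person)] for helper in people if helper != person]
    return sum(incoming) / len(incoming)

THRESHOLD = 0.4  # hypothesized absolute "enough status" level

for p in people:
    s = felt_status(p, willingness)
    print(p, round(s, 2), "ok" if s >= THRESHOLD else "lacking")
```

One could add Time and Modality as extra arguments to the function, as suggested; the point of the sketch is just that many relational edges can in principle be summarized into the single scalar being discussed.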
status is a side-effect (or maybe a cognitive summary) of much more complicated interpersonal feelings and habits.
I think it mostly boils down to a few simple acts that are all proxies of this "willingness to help" thing.
As a general principle, in altering the perception of Thing, I believe it's best to just alter Thing. In our case, that's altering the actual willingness to help each other.
I don't think this is all that much more likely for any individual to choose than your recommendation, but it has the advantage that it's unilateral and doesn't require anyone else to cooperate.
This looks like editing your utility function instead of satisfying it, which I think is a lot harder. Surely there is some low-hanging fruit in interpreting things differently to make yourself feel happier, but afaict we all learn this as kids and then get stuck in the failure mode of assuming that it's always about reinterpretation. That's what happened to me, anyway.
↑ comment by Dagon · 2018-05-25T18:43:06.598Z · LW(p) · GW(p)
But what I'm interested in is increasing the feeling of status, or to be more precise, minimizing the felt lack of status
Do you believe that felt lack of status is completely uncorrelated with others' willingness to cooperate? I have to admit that I care about my own status a whole lot less than many seem to, but I can't tell if this is just counter-signaling or a true reflection of the idea that status is complicated and intertwined with a lot of other real interpersonal relationship things, making it vary widely among individuals.
A more direct question about your model: would it be easier to just literally wirehead? Electricity to the part of the brain that seeks status?
↑ comment by [deleted] · 2018-05-25T19:41:56.205Z · LW(p) · GW(p)
Do you believe that felt lack of status is completely uncorrelated with others' willingness to cooperate?
I think it's strongly correlated, and causally bidirectional: higher status leads to better performance (for mental health reasons) leads to higher status.
The way I see it, high status is the baseline condition, and lack of status is a malfunction that makes one function below capacity, in the same way that needing to go to the toilet does.
would it be easier to just literally wirehead? Electricity to the part of the brain that seeks status?
If we could, yes. How many years until it's commercially available?
comment by Hazard · 2018-07-27T23:00:59.113Z · LW(p) · GW(p)
My gut response to the League of Rationalists is a resounding "Fuck yeah!", but on reflection, my current model of status doesn't predict it working.
You've identified a target of "reduce/eliminate the feeling of not having enough status".
I'm going to be affected by all of the little non-verbal signals that a person emits. A big part of my felt status seems to come from those. If I walk into a room and everyone sneers at me, I'm more likely to feel like shit.
But as others have stated, "how many people actually have my back?" is likely the chunk of reality that status grounds itself in. Imagine a scenario where the people around you send all the proper "You have status!" signals, but no one ever gets around to helping you when you need it. I'd predict that you might feel good at the beginning, but soon enough your "status detectors" would wise up, and you'd start feeling low status again.
I'm pretty sure that in any League one might make, you'd have to ensure that people actually unconditionally helped each other. That is a necessary (but maybe not sufficient?) condition. So the question is: why are people going to help each other unconditionally? The proposed plan was to have a really strict, high-cost admittance process, such that all members will consider it totally worthwhile to follow the decision rule "Unconditionally help all those in the League."
Okay, now that I've put this all out in order, I guess I just find it crazy unlikely that you could find an admission process that fits that criterion. My first impression is that there are too many dissenting opinions among "the rationalists" for there to be an admission process that would satisfy more than a few people. Though I'd be super interested to hear any thoughts you've had about what this process might look like. I'll give it some thought as well.