How concerned are you about LW reputation management?
post by DirectedEvolution (AllAmericanBreakfast) · 2021-05-17T02:11:38.718Z · LW · GW · 8 comments · This is a question post.
I've been considering why I sometimes vote, but don't comment on posts, even though I'd like to. Sometimes, I delete substantial comments that I've already written.
I think the reason is something like reputation management. Ideally, we'd all feel safe to debate and consider ideas here, without fearing looking stupid. After all, that's how we learn.
Yet in some cases I want to be able to signal "I've put a lot of work into this post, believe it really offers some value, and am prepared to stand behind it." I fear that commenting a lot, or posting imperfect posts, puts me at risk of losing some of that credibility. The more I comment or post (or at least the more above average I do so), the more I risk accumulating a track record of incorrect statements. Even participating in discussions outside my area of writerly focus feels like a credibility risk, analogous to how some scientists carefully avoid speaking about topics beyond their specialty.
A side issue is that public comments can sometimes activate my "called out in public" social sense. For that reason, I have started to use PMs and zoom calls more, both to de-confuse myself with someone else's post, and to have informal discussions off the public record. Those discussions often seem much more alive and productive than the ones I have through the comment section here.
While it seems beneficial to the community to have people willing to publicly list their mistakes and take the risks of free and open dialog, I wonder how much this reputation-management concern is felt by others?
Also, if it's not just me, what would be the best way to solve this problem? I can think of a few:
- Deciding that this is a feature, not a bug, and continuing to treat reputation as a fragile and valuable thing.
- LW developers create an official and convenient anonymizing option, where posters can choose not to have their profile name attached to a comment/post but can reveal it later if they choose.
- Actively taking steps to overcome this phobia, perhaps by creating a "ways I've been wrong" page, or making an effort to treat the comment section of posts more as a casual and informal space.
- Creating two accounts, one public and one pseudonymous.
- Using private messages, zoom calls, and chat for more informal conversation.
- Looking into guidance aimed at academics for managing their online presence.
Answers
Yeah, I absolutely feel like this. I also have a number of LW posts that are in a quarter-baked state. Every now and then I work on one of them a bit more. I have no idea when I'll feel comfortable posting any of them, because I want them to be well-received.
The reputation system really encourages pruning [LW · GW].
... with a little more reflection: I think this also subtly affects how I write on here. Little less playful, DEF fewer emojis. Tots don't wanna look basic in front of the other rats. Tots don't wanna seem like I'm trying too hard.
(Totally hoping I score big up votes with this first comment.)
(pant pant pant gasp wheeze !!)
↑ comment by ryan_b · 2021-05-21T16:47:45.512Z · LW(p) · GW(p)
Relax, you did great!
I encourage you to post the quarter-baked ones sooner rather than later.
- The community hasn't rejected something unless you land in the negatives, which is pretty rare.
- I've never been anyplace better at rewarding clear expressions of where thoughts are incomplete. Further, if you write incomplete thoughts accompanied by questions, there are often comments containing helpful further reading, even on issues not core to the community's interests.
- I do not have advanced training in formalism or anything, so earlier posts which spend more effort describing the intuitions and false starts are, for me, the most valuable stage of the intellectual pipeline to consume.
- Think of quarter-baked like a cheap test: if it gets some votes anyway, people are interested. Then the next post will be better, and most of us like ideas being carried farther along in development. I'm pretty sure this is an aesthetic thing entirely separate from the idea itself.
↑ comment by Randomized, Controlled (BossSleepy) · 2021-05-17T02:47:52.523Z · LW(p) · GW(p)
I wonder if making reputation multi-dimensional would help?
↑ comment by DirectedEvolution (AllAmericanBreakfast) · 2021-05-17T03:54:06.063Z · LW(p) · GW(p)
I always upvote liberally to create good incentives ;D
I generally avoid commenting only if I feel I have nothing relevant to say. The only thing that makes me delete a comment mid-writing is realising that I'm writing something that's wrong.
If I notice I made a mistake mid-discussion, or after I've already posted a comment people read, I admit it, and I've seen that usually up-votes show it's appreciated.
Usually when I comment it's because I have... let's call them "political beliefs", though they are always about concrete things and decisions, that are a lot more "left leaning" than the average position here. As long as I'm confident in my reasons for holding such beliefs, I don't seem to worry about my reputation at all, even if I think I'm about to say something "unpopular". As long as I'm willing to explain myself and change my mind if I'm wrong, I think that holding back on expressing such ideas would make the site weaker and betray its spirit (I do try to keep the discussion as apolitical as possible). I don't post unpopular opinions if I don't think I can put in the effort to explain them well.
Often commenting on LessWrong is a useful test of my belief in something: the thought of having to justify a disagreement with the "smart kids club" makes me check my reasons for believing things more carefully and do some research.
The reputation system seems to work fine for me, since it gets me to improve. The few times I tried discussing something in PMs, though, it turned out less confrontational and more productive, so I think that's a good approach (and it's much more enjoyable).
I also try to remember to make short comments of agreement to make our kind cooperate [? · GW].
I do feel stupid and irritated each time I get down-voted. Since I try not to comment on stuff I don't know about or write shallow statements, I can't help but think "wow, whoever this person was is very biased against my idea", which... likely isn't a mature reaction. I'd like to know why I get down-voted, though.
I'm a hundred times more self-conscious about making posts, though. I feel the stress of having a post come under the scrutiny of the community would make me obsessively edit and quadruple-check everything, so at least four ideas for posts have died this way without any good reason (so far I've managed to post just two questions).
Using an anonymous account or something like that wouldn't work at all. I'm not concerned about lesswrongers writing off Emiya as an idiot; I'm afraid of me thinking I'm an idiot because my ideas got shredded apart, which... is not a way of thinking about this that's in any way good or useful, and it's hindering my progress, so I should really try to break through it.
In my experience talking to authors, this definitely has a pretty large effect on what people write on the site. Though I think the effect is often not really dominated by worrying about the perception of other LessWrongers, but about the perception of other random people on the internet who might take what you say out of context, or potentially future organizations you might collaborate with, which might comb through everything you've said on the internet in search of red flags under a highly risk-averse hiring or collaboration policy.
At least within the rationality community, I view commenting almost always as a positive signal about someone, and think this is pretty common. My best guess is that the reputation management of this type doesn't actually have super much of an effect on other people in the community, though it does have an effect on the broader world, and it's also hard to be confident here.
LW developers create an official and convenient anonymizing option, where posters can choose not to have their profile name attached to a comment/post but can reveal it later if they choose.
I've been thinking about doing that for a while. It's a good amount of pain from an implementation perspective, if you want to avoid allowing indirect methods of deanonymization (like voting on an anonymous comment and checking which user's total karma changed, which you can either do programmatically for hundreds of users, or manually for a few users if you have a suspicion about which user it is).
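To make the threat concrete, here is a minimal sketch of the karma-diff attack described above. The function names and the idea of querying a public profile's karma total are hypothetical stand-ins rather than LessWrong's actual API; the point is only to show why hiding the display name alone is not enough.

```python
from typing import Callable, Optional

def deanonymize(
    candidate_users: list[str],
    get_total_karma: Callable[[str], int],          # hypothetical: read a public profile's karma total
    vote_on_anonymous_comment: Callable[[], None],  # hypothetical: cast one vote on the target comment
) -> Optional[str]:
    """Vote on the anonymous comment, then see whose public karma total moved."""
    before = {u: get_total_karma(u) for u in candidate_users}
    vote_on_anonymous_comment()                     # +1, or more with a strong vote
    after = {u: get_total_karma(u) for u in candidate_users}
    changed = [u for u in candidate_users if after[u] != before[u]]
    # Other votes land concurrently in practice, so an attacker would repeat this
    # and intersect the changed sets across runs to filter out the noise.
    return changed[0] if len(changed) == 1 else None
```

Breaking the karma link for anonymous content (or batching karma updates with a delay) would blunt this particular attack, though other side channels may remain.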
↑ comment by DirectedEvolution (AllAmericanBreakfast) · 2021-05-17T08:06:11.289Z · LW(p) · GW(p)
Thanks for your thoughts on this. Would a soft anonymization feature work, just hiding their name and warning them that there are still ways a determined person could find them out? Or perhaps disabling the link between post/comment karma and total user karma?
↑ comment by lsusr · 2021-05-17T16:12:01.774Z · LW(p) · GW(p)
I think the fact that you can create pseudonymous accounts already functions as a superior anonymizing option without cluttering up the UI, complicating the backend or permitting the attack habryka outlines.
Secondary accounts provide better security because they remove a broader attack vector. If you trust Less Wrong to anonymize you then you are vulnerable to Less Wrong getting hacked. If you manage two separate accounts then a compromise of the Less Wrong servers will not deanonymize you.
↑ comment by habryka (habryka4) · 2021-05-17T19:08:35.604Z · LW(p) · GW(p)
Yep, though UI affordances are surprisingly powerful and people seem to forget that this is an option pretty frequently. I do think the fact that it's super easy to create a new account is a reason why we haven't implemented this.
Sometimes people are also worried that they shouldn't use a second account because that's a bit like sockpuppeting. There might be some way of making it clear to people that we don't mind, as long as you don't upvote your own content.
How about using "epistemic status" [LW · GW]? Eg.
Epistemic status: First impression. Still figuring out my viewpoint.
Personally, I find that doing so mostly addresses my hesitancy, although I still feel some amount of hesitancy to post or comment bad quality stuff even if it had a label like "Epistemic status: I don't know what I'm talking about."
I also think that using qualifiers works similarly. Eg. instead of having "epistemic status" at the top of a post, use language like "My first impression is X, but I don't feel too confident in it."
PS: We made it to Urban Dictionary guys! https://www.urbandictionary.com/define.php?term=Epistemic%20Status
↑ comment by DirectedEvolution (AllAmericanBreakfast) · 2021-05-18T04:31:23.251Z · LW(p) · GW(p)
PS: We made it to Urban Dictionary guys! https://www.urbandictionary.com/define.php?term=Epistemic%20Status
YUSSS!
So I think there's an epistemic model that people in this community often strive for. It's that competence in one area does not translate much to others. Lack of knowledge in one area does not indicate a general lack of knowledge. Under this model, it's relatively safe to display your lack of knowledge in public.
There's a contrasting set of assumptions as well. It's the idea that demonstrable lack of knowledge in one area indicates a general lack of knowledge. Expertise in one area indicates general expertise. Under this model, it's unsafe to display any areas in which you lack knowledge, even if you couch it in "epistemic status" disclaimers, lest your credibility in areas where you really do know what you're talking about be compromised.
I try to judge other people by the first standard, but I fear that others will judge me by the second. Hence the concern, and the reason why epistemic status disclaimers don't do much to alleviate it.
↑ comment by Adam Zerner (adamzerner) · 2021-05-18T05:44:39.919Z · LW(p) · GW(p)
There's a contrasting set of assumptions as well. It's the idea that demonstrable lack of knowledge in one area indicates a general lack of knowledge.
Hm, I think it depends on the specifics. I strongly disagree with it, but I can see people thinking something like, "Oh, you don't know what a derivative is? That's a pretty basic thing. You must not be very smart." But for something else, eg. a monoid, it's obscure enough that I don't think anyone would penalize you.
Personally, I have a reasonably strong impression that LessWrongers are generally wise enough to not judge you for not knowing something "basic" like a derivative. But that's just my data point.
I try to judge other people by the first standard, but I fear that others will judge me by the second. Hence the concern, and the reason why epistemic status disclaimers don't do much to alleviate it.
I'm curious, what happens emotionally when you focus on the benefits of writing subpar content? I'm thinking of the points I made in Writing to Think [LW · GW], namely that writing is a great way to become a better thinker. "This is going to make me look stupid now, but it's also going to make me stronger in the long run."
A little? Over the years I've considered making a series of "shower thoughts" posts with collections of my not-really-fully-considered ideas. I really just do not have the time to write the posts I'd like to write, but I also don't really like just letting my thoughts languish in my head.
I've never done it because 1) it feels like wasting other people's time, and 2) what if I post bad ideas?!?!
LessWrong should seriously consider implementing a "4chan mode" where a post and all its replies are anonymous and not counted for karma. I'm imagining that the poster could choose one of two options:
- Anyone can reply
- Only users with >K karma can reply (but this is enforced by some cryptographically-secure protocol to avoid linking accounts together, because otherwise nobody will trust it)
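One way the karma gate in the second option could avoid linking accounts is a blind-signature credential: the server blind-signs a reply token for any logged-in account above the threshold, and the token is later redeemed from an anonymous session. Below is a toy sketch under that assumption, with deliberately tiny parameters and hypothetical names; it is not production cryptography and not anything LessWrong actually implements.

```python
import hashlib
import secrets
from math import gcd

# Toy RSA keypair held by the server (real code would use a vetted library and large keys).
p, q = 1_000_003, 1_000_033
n, e = p * q, 65537
d = pow(e, -1, (p - 1) * (q - 1))

def digest(msg: bytes) -> int:
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

# 1. A logged-in user with karma > K generates a reply token and blinds it.
token = secrets.token_bytes(16)
while True:
    r = secrets.randbelow(n - 2) + 2
    if gcd(r, n) == 1:
        break
blinded = (digest(token) * pow(r, e, n)) % n        # the server never sees the raw token

# 2. The server checks karma on the *named* account, then signs the blinded value.
def server_blind_sign(blinded_msg: int, user_karma: int, K: int = 100) -> int:
    assert user_karma > K, "not enough karma for a reply credential"
    return pow(blinded_msg, d, n)

blind_sig = server_blind_sign(blinded, user_karma=250)

# 3. The user unblinds; (token, sig) is a credential the server cannot tie back to the account.
sig = (blind_sig * pow(r, -1, n)) % n

# 4. Later, from an anonymous session, the user redeems the credential to post a reply.
def verify(token: bytes, sig: int) -> bool:
    return pow(sig, e, n) == digest(token)

assert verify(token, sig)
```

The unlinkability comes from the blinding factor r: the server only ever signs blinded values, so it cannot match the credential it later sees to the account it signed for.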
Why is this a good idea?
I submit: The more mature a system of ideas gets, the harder it becomes for non-anonymous discussions to yield any useful results. After 15-odd years, the "LessWrong rationality" idea-system is getting to that point, so having a way to post on LessWrong anonymously is increasingly important.
Let's first step back and ask: Why does anyone bother sharing ideas at all? What benefits does one derive from doing so that counterbalance the prima facie costs? Namely:
- (A) Communicating my ideas takes time and effort that I could spend elsewhere.
- (B) Knowing things about the world gives me a strategic advantage over people who don't know those things, which I lose if I reveal to them what I'm thinking.
These counterbalancing benefits include:
- (1) Sharing my ideas can cause others to act in accordance with my values.
- (2) If I gain a reputation for having good ideas, then people will give more credence to what I say in the future.
- (3) I can more effectively improve my world-model by getting feedback from others.
- (4) I can signal group membership by displaying familiarity with certain ideas.
- (5) (There may be more but let's leave it at that.)
For each of these negative and positive incentives, we can ask: Does this incentive work in favor of truth-telling ("Pro-truth") or against it ("Anti-truth")?
- (A) is Pro-truth, because lying imposes more cognitive load than sincerity. You already have a world-model that you live by, but if you tell lies you also have to keep a model-of-your-lies on top of that.
- (B) is Anti-truth, because by actively deceiving others, I can gain even more strategic advantage than if I had simply said nothing.
- (1) is Pro-truth if our values align, Anti-truth if they don't.
- (2) is Pro-truth when the system is immature, Anti-truth when it's mature.
- (3) is Pro-truth, because I can't get good feedback unless I honestly explain what I believe.
- (4) is Anti-truth, because the more unbelievable an idea is, the more effective it is as a signal.
Why (1)? Because having true beliefs is instrumentally useful for almost any goal. If my audience consists of friends, I'll want to give them as much true information as I can, so that they can be more effective at achieving their goals. But if my audience consists of enemies, I'll want to give them false information, so that their efforts will be ineffective, or misdirected to promote my values instead of theirs. If the audience is a mix of friends and enemies, I'll be reluctant to say anything at all. (This may explain OP's reluctance to post, now that the readership of LessWrong has grown.)
Why (2)? When the system of ideas is immature, this can be a powerful Pro-truth motive - arguably this was why the Scientific Revolution got off the ground in the first place. But as the system matures, the more I need to differentiate my ideas from others' in order to stand out from the crowd. If I simply repeat ideas that have already been said, that gains me no reputation, since there's no evidence that I can come up with good ideas on my own (even if in fact I did). Instead I need to plant my flag far away from others. At the immature stage, this is easy. Later on, when the landscape grows thick with flags, the value of truth-seeking gets thrown under the bus in favor of making-a-name-for-myself - it becomes comparatively easier to just make up stuff. (Maybe the reason no one else has thought of New Idea X is because it's not true!)
What can we do about this, if we want to promote truth-seeking? (A) and (3) are unaffected by external circumstances, so we can set them aside. (B) and (1) (which are really the same thing) can be mitigated by communicating in private walled gardens, as OP suggests, but that makes collaboration much more difficult.
This leaves us with (2) and (4). These incentives can be eliminated by forcing everyone to communicate only through anonymous, transient identities. When you're using your real name, your reward is negative for truth-telling, and positive for lying. Being anonymous sets the reward to 0 in both cases, which eliminates the bad incentive; but nobody would choose to do this sua sponte, because they'd be leaving positive utility on the table. Therefore, anonymity must be imposed externally, by cultivating a norm whereby any statement will be disregarded unless it is anonymous. In other words, the positive utility of (1) must be diminished by more than I'd gain by using my real name.
Accordingly, I have posted this comment under a throwaway account, which I precommit to never reusing outside this thread.
Our community tends to respect people who say "I was wrong". I'd rather publicly look stupid for an instant than privately remain stupid forever. If I say an incorrect statement then I can just correct it.
My primary fear is not putting incorrect statements in the public record (which can be corrected). I fear putting politically unpopular statements in the public record (which cannot be corrected).
Keeping my political opinions private doesn't bother me. Rather the opposite. It would scare me if I didn't have dozens of ideas I was afraid to state publicly. If I didn't have offensive thoughts then I would be a sheep.
However, I don't like being unclear to myself what things I am afraid to say. To fix this I write as if I am comfortable publishing the truth and then I obtrusively redact the most offensive bits like this.
↑ comment by Stuart Anderson (stuart-anderson) · 2021-05-18T01:33:40.174Z · LW(p) · GW(p)
-
Some time ago I noticed this trend among people I respect on Twitter (motivating examples). It seems to me that there is a consensus view that openness has a damaging effect on discourse.
This view does not seem to stem from the problem addressed by "Well-Kept Gardens Die By Pacifism [LW · GW]" and "Evaporative Cooling of Group Beliefs [LW · GW]" (the gradual decline of a community due to environmental exposure), but rather from the problem that you perceive: the reputational hazard of public fora.
My current stance on public discourse is that it serves as a discovery mechanism: writing and speaking in public serves to find people worth talking with in private.
↑ comment by DirectedEvolution (AllAmericanBreakfast) · 2021-05-19T01:09:38.290Z · LW(p) · GW(p)
That connects with my piece on the EA Forum, Articles are Invitations [EA · GW]. That's a nice way to look at it. Over time, I've come to see article writing, including scientific articles, primarily as a way to coordinate people to work together, with conveying information as a secondary purpose. That doesn't mean that the information-conveying purpose is unimportant, but rather that the coordination function is extremely important and often neglected.
My personal answer: yes, and more so than I thought I would be. I wish people would tell me why they downvoted things I write.
A feature that prompts voters to do that might help.
Are you asking about real reputation or about reputation points?
↑ comment by DirectedEvolution (AllAmericanBreakfast) · 2021-06-01T04:00:51.198Z · LW(p) · GW(p)
Primarily real reputation, but interested in both aspects!
I just deleted an entire essay on my thoughts about this subject, because I figured tl;dr is an issue for many people. I do worry about the quality of some of my posts, and the lack of succinctness, proper formatting and grammar; I know those are 'turn offs' for some people. I'm neither a scientist nor a journalist, so I wonder sometimes what I'm doing here besides failing at being both.
Anytime I post in public, I go through a sometimes horrible sequence of thoughts and emotions, worrying about how it will be received. Most of the time it's pretty anti-climactic though, and the responses I do get seem to ignore or miss what I feel are some of the most important points. It's DEF not a chan, and not a Peer-Reviewed Journal, but somewhere in the middle, and I figure I'm still trying to get my footing.
I've already spent a fair amount of time posting anonymously because I wanted to test out some of my ideas and thinking before I took credit for them. Problem is, other people were posting about the same ideas using their real names. So I'm attempting to make the switch to taking responsibility for my thoughts and opinions while I work on developing my concepts and ideas. In imitation - I like to think - of SpaceX's approach to prototyping Starship, I'm launching post after post, fully expecting most of them to fall to earth in flames, until that first one lands. Then I'll build on top of it. Wish I had several billion dollars to help, though.
↑ comment by DirectedEvolution (AllAmericanBreakfast) · 2021-05-17T23:22:30.820Z · LW(p) · GW(p)
I also often delete comments, and occasionally posts. I think it's interesting that you and I seem to delete, rather than save/draft/improve, this written material.
Part of my motivation for deleting is recognizing constraints on my time. It takes more willpower not to write than to write, and I'm in danger of sacrificing too much time if I don't keep the reins on it.
↑ comment by Josh Smith-Brennan (josh-smith-brennan) · 2021-05-18T00:03:50.104Z · LW(p) · GW(p)
I find that I can usually explain my ideas better in conversation than I can in writing, although once in a while I do impress myself. Mostly it's because I get feedback during the discussion which allows me to clarify confusing points, although I can be long winded.
Mostly though, I think I have to be in the mood to edit - I can write quite a bit as I'm intelligent - but I have untreated ADD, so while I can join lots of ideas together in a smart fashion, it's the adjusting of the timing and fit of various thoughts and progression of ideas that takes more effort than I can manage most of the time.
So I just delete it. It is a shame I think.
I'm curious at this point, considering I have a post now at -34 karma, what the [lower bound, upper bound] limits for Karma on LW are. Also, coming at this from a different angle, for me at this point the reputation system here is a good example of Inaccessible information [LW · GW].
There is accessible information about the posts, in the form of the Karma, but in cases of relatively large numbers of votes with no comments, the real meaning of the accessible information is sometimes lost as inaccessible information. Usually I would assume this is more the case with downvotes than with upvotes.
But if, in a system not run by AI, where we can easily ask for feedback about things like downvotes through the comment system, we still have issues with inaccessible information despite the ease of commenting, I don't think this bodes well for future cases where asking for feedback isn't as easy.
I have to admit I'm still somewhat confused about what this actually means, especially in a community of Rationalists who pride themselves on confronting and attempting to understand and overcome bias.
At this point I am a little more concerned about LW reputation management. I recently posted about the developing story of Bill Gates' divorce and the concerns over his relationship with Epstein. I did it tongue in cheek, as I trusted people would pick up on the sarcasm. No go, though. -22 Karma and counting, but only 3 comments. I'm confused now about what those -22 people think about Bill Gates and his divorce and how we should talk about it in a rational way.
Maybe restricting upvoting or downvoting to people who comment would help align the reputation system with some less ephemeral idea of what LW users think about a particular subject?
And although I'm not sure I need to, I do apologize ahead of time for using a sensational approach to engaging with users. I had a hunch the post would get a lot of hate with nothing to back it up. Hunch confirmed, so now I'm moving on to how best to address the hate. Any suggestions would be appreciated.
↑ comment by [deleted] · 2021-05-19T03:33:05.630Z · LW(p) · GW(p)
Note that karma is not a 1:1 map to users. The more karma your account has, the more heavily the system weighs your vote, so there are probably a lot fewer than 22 people behind those 22 downvotes.
↑ comment by DirectedEvolution (AllAmericanBreakfast) · 2021-05-19T05:17:00.951Z · LW(p) · GW(p)
You can find out the number of up/downvotes by hovering the mouse over the karma number between the arrows.
↑ comment by lsusr · 2021-05-19T04:04:36.472Z · LW(p) · GW(p)
To confirm: the -22 karma is the sum of just 8 votes. If the 8 votes include Josh Smith-Brennan's automatic self-upvote then it's a maximum of 7 downvotes.
↑ comment by Josh Smith-Brennan (josh-smith-brennan) · 2021-05-19T04:10:29.504Z · LW(p) · GW(p)
Well honestly, that puts my mind more at ease. I can deal with 7 people's dislike of my post better than 22 people's. My faith in this institution has been slightly restored.
↑ comment by Josh Smith-Brennan (josh-smith-brennan) · 2021-05-19T04:05:01.080Z · LW(p) · GW(p)
Thanks for the info. So that means there were potentially fewer voters, but ones who've been around awhile and have heavy votes that make up part of that metric. Not really possible to suss out the actual number, is it?
Out of all the commenters so far, you have the highest Karma with 280. How heavy is your vote if you don't mind my asking?
↑ comment by Zack_M_Davis · 2021-05-19T04:27:25.357Z · LW(p) · GW(p)
The karma-to-strong-vote-power-mapping can be found in the site's open-sourced codebase, and Issa Rice's alternative viewer has the list of actual user vote-powers.
↑ comment by lsusr · 2021-05-19T04:31:00.062Z · LW(p) · GW(p)
I'm confused. The source code seems to imply that anyone with 25,000 karma or more has a small vote power of 3 but the list of actual user vote-powers implies it maxes out at 2.
For those too lazy to read the source code.
Small Votes

| user karma | small vote weight |
| --- | --- |
| 0-999 | 1 |
| 1000-∞ | 2 |

Big Votes

| user karma | big vote weight |
| --- | --- |
| 0-9 | 1 |
| 10-99 | 2 |
| 100-249 | 3 |
| 250-499 | 4 |
| 500-999 | 5 |
| 1000-2499 | 6 |
| 2500-4999 | 7 |
| 5000-9999 | 8 |
| 10000-24999 | 9 |
| 25000-49999 | 10 |
| 50000-74999 | 11 |
| 75000-99999 | 12 |
| 100000-174999 | 13 |
| 175000-249999 | 14 |
| 250000-499999 | 15 |
| 500000-∞ | 16 |
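For illustration, the two tables reduce to a small lookup. This is a hypothetical Python re-implementation of the thresholds as listed; the real logic lives in the site's codebase.

```python
# Strong-vote ("big vote") weight thresholds, copied from the table above.
BIG_VOTE_THRESHOLDS = [
    (500_000, 16), (250_000, 15), (175_000, 14), (100_000, 13),
    (75_000, 12), (50_000, 11), (25_000, 10), (10_000, 9),
    (5_000, 8), (2_500, 7), (1_000, 6), (500, 5),
    (250, 4), (100, 3), (10, 2), (0, 1),
]

def big_vote_weight(karma: int) -> int:
    """Strong-vote power for a user with the given karma, per the table above."""
    for threshold, weight in BIG_VOTE_THRESHOLDS:
        if karma >= threshold:
            return weight
    return 1  # negative karma falls through to the minimum

def small_vote_weight(karma: int) -> int:
    """Normal-vote power: 1 below 1000 karma, 2 at or above."""
    return 2 if karma >= 1000 else 1

assert big_vote_weight(280) == 4    # the 250-499 bucket discussed in this thread
assert small_vote_weight(280) == 1
```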
↑ comment by Zack_M_Davis · 2021-05-19T04:37:42.642Z · LW(p) · GW(p)
Did ... did you save this table a long time ago?? Weak 3-votes have been gone since February 2020 for privacy reasons [LW(p) · GW(p)].
↑ comment by lsusr · 2021-05-19T04:46:09.776Z · LW(p) · GW(p)
Thanks. Fixed. I wrote the table just now. I was just reading the FAQ [LW · GW], which links to outdated source code.
↑ comment by habryka (habryka4) · 2021-05-19T18:48:45.986Z · LW(p) · GW(p)
Oops, sorry about that. Will fix that link.
↑ comment by Josh Smith-Brennan (josh-smith-brennan) · 2021-05-20T00:30:40.634Z · LW(p) · GW(p)
With the intention of taking the reputation system here as a guide rather than a 'brand' or scarlet letter, I'm wondering what people think about deleting posts that garner a lot of negative Karma. I'm not expecting my Karma to suddenly rebound, but since enough people feel strongly negative about the post in question, there is some peer pressure to remove it, unless I want to be seen as a rebel, contrarian, meta-contrarian, or intellectual hipster. I'm smart enough to recognize and interpret signaling, and not too set in my ways to attempt to go with the flow.
8 comments
comment by Shmi (shminux) · 2021-05-17T20:31:59.458Z · LW(p) · GW(p)
For comparison, I have over two dozen posts in Drafts, accumulated over several years, that are unlikely to ever get published. One reason for it is that there are likely plenty of regulars whose reaction to the previous sentence would be "And thank God for that!" Another is the underwhelming response to what I personally considered my best contributions to the site. Admittedly this is not a typical situation.
↑ comment by DirectedEvolution (AllAmericanBreakfast) · 2021-05-17T20:37:34.495Z · LW(p) · GW(p)
Curious to know which one that was, if you're willing to share?
Writing a good post on an important topic is only half the equation. It's also important for the audience to be ready and interested to hear it. This creates a real problem, I think, especially when you can't tell whether people ignore or disagree with the post's importance and relevance, or whether they disagree with its object-level thinking.
↑ comment by Shmi (shminux) · 2021-05-17T21:18:07.991Z · LW(p) · GW(p)
Sure, if you are interested, some of these are below in reverse chronological order, but I am quite sure your reaction would match that of the others: either a shrug or a cringe.
And yes, I agree that the reasons are related to both the writing style, and to the audience being "ready and interested to hear it."
- Uninformed Elevation of Trust [LW · GW] is a real and pervasive phenomenon whose manifestation we see all the time, such as people relying on their trusted friends for vaccination decisions, or believing Michio Kaku's musings about string theory, or r/WSB about "stonks".
- (Double-)Inverse Embedded Agency Problem [LW · GW] and Monkey Values [LW · GW] as approaches to embedded agency and human values.
- A bunch of posts from 2018 on toy models of a predictable universe and embedded agency, starting with Physics has laws, the Universe might not [LW · GW].
- Earlier posts on subjective vs objective.
↑ comment by DirectedEvolution (AllAmericanBreakfast) · 2021-05-17T22:17:07.661Z · LW(p) · GW(p)
I've been thinking recently that prolific LW writers should occasionally go back and do a literature review of their own posts. When I write, I'm usually (not always) presenting freshly-developed thoughts. It's like a miniature version of Kuhn, where one paradigm replaces another in rapid succession. Many threads are abandoned entirely. I assume others experience something like this too.
It amounts to research debt, where any individual article must be assumed to reflect something close to the primordial beginnings of thought on a subject, with a low prior likelihood of remaining relevant. Hence, there's a relatively low cost to ignoring individual articles.
But by going back and identifying the areas where you have made sustained intellectual progress, I think it could help address this problem of research debt. It might also help give others a reasonable cause to investigate those threads of yours more closely.
comment by Mo Putera (Mo Nastri) · 2021-05-17T03:01:57.496Z · LW(p) · GW(p)
Not an answer to your question, but Sarah Constantin's essay seems relevant. As usual it's hard not to just quote the entire piece:
But one thing I have noticed personally is that people have gotten intimidated by more formal and public kinds of online conversation. I know quite a few people who used to keep a “real blog” and have become afraid to touch it, preferring instead to chat on social media. It’s a weird kind of locus for perfectionism — nobody ever imagined that blogs were meant to be masterpieces. But I do see people fleeing towards more ephemeral, more stream-of-consciousness types of communication, or communication that involves no words at all (reblogging, image-sharing, etc.) There seems to be a fear of becoming too visible as a distinctive writing voice. ...
What might be going on here?
Of course, there are pragmatic concerns about reputation and preserving anonymity. People don’t want their writing to be found by judgmental bosses or family members. But that’s always been true — and, at any rate, social networking sites are often less anonymous than forums and blogs.
It might be that people have become more afraid of trolls, or that trolling has gotten worse. Fear of being targeted by harassment or threats might make people less open and expressive. I’ve certainly heard many writers say that they’ve shut down a lot of their internet presence out of exhaustion or literal fear. And I’ve heard serious enough horror stories that I respect and sympathize with people who are on their guard.
But I don’t think that really explains why one would drift towards more ephemeral media. Why short-form instead of long-form? Why streaming feeds instead of searchable archives? Trolls are not known for their patience and rigor. Single tweets can attract storms of trolls. So troll-avoidance is not enough of an explanation, I think.
It’s almost as though the issue were accountability.
A blog is almost a perfect medium for personal accountability. It belongs to you, not your employer, and not the hivemind. The archives are easily searchable. The posts are permanently viewable. Everything embarrassing you’ve ever written is there. If there’s a comment section, people are free to come along and poke holes in your posts. This leaves people vulnerable in a certain way. Not just to trolls, but to critics.
You can preempt embarrassment by declaring that you’re doing something shitty on purpose. That puts you in a position of safety. You move to a space for trashy, casual, unedited talk, and you signal clearly that you don’t want to be taken seriously, in order to avoid looking pretentious and being deflated by criticism. I think that a lot of online mannerisms, like using all-lowercase punctuation, or using really self-deprecating language, or deeply nested meta-levels of meme irony, are ways of saying “I’m cool because I’m not putting myself out there where I can be judged. Only pompous idiots are so naive as to think their opinions are actually valuable.”
Here’s another angle on the same issue. If you earnestly, explicitly say what you think, in essay form, and if your writing attracts attention at all, you’ll attract swarms of earnest, bright-but-not-brilliant, mostly young white male, commenters, who want to share their opinions, because (perhaps naively) they think their contributions will be welcomed. It’s basically just “oh, are we playing a game? I wanna play too!” If you don’t want to play with them — maybe because you’re talking about a personal or highly technical topic and don’t value their input, maybe because your intention was just to talk to your friends and not the general public, whatever — you’ll find this style of interaction aversive. You’ll read it as sealioning. Or mansplaining. Or “well, actually”-ing. And you’ll gravitate to forms of writing and social media where it’s clear that debate is not welcome.
I think what’s going on with these kinds of terms is something like:
Author: “Hi! I just said a thing!”
Commenter: “Ooh cool, we’re playing the Discussion game! Can I join? Here’s my comment!” (Or, sometimes, “Ooh cool, we’re playing the Verbal Battle game! I wanna play! Here’s my retort!”)
Author: “Ew, no, I don’t want to play with you.”
There’s a bit of a race/gender/age/educational slant to the people playing the “commenter” role, probably because our society rewards some people more than others for playing the discussion game. Privileged people are more likely to assume that they’re automatically welcome wherever they show up, which is why others tend to get annoyed at them and want to avoid them.
Privileged people, in other words, are more likely to think they live in a high-trust society, where they can show up to strangers and be greeted as a potential new friend, where open discussion is an important priority, where they can trust and be trusted, since everybody is playing the “let’s discuss interesting things!” game.
The unfortunate reality is that most of the world doesn’t look like that high-trust society.
There's more, do check it out.
↑ comment by DirectedEvolution (AllAmericanBreakfast) · 2021-05-17T04:24:06.486Z · LW(p) · GW(p)
Thanks for the recommendation! LW's been growing in content steadily since she wrote this essay, which is nice. I wonder why she herself stopped blogging. I feel sad when I see someone whose writing I enjoyed quit blogging. It's a strange middle ground: my brain interprets a fair number of bloggers as being in my tribe, even though I've never met them, and I feel genuine concern when they go silent.
comment by Stuart Anderson (stuart-anderson) · 2021-05-18T01:53:17.545Z · LW(p) · GW(p)
-
comment by Deborah Roux (deborah-roux) · 2021-05-19T00:45:15.895Z · LW(p) · GW(p)
It is impossible to be overwhelmed by the crowd if you do not first enter the town on a donkey. LS, don't be afraid to make mistakes; everyone will try to tell you how to do it better. It is right to be wrong in this monkey world. It doesn't matter what they say, it only matters to say it. (I'm new, sorry if I don't speak English well; I speak Spanish.)