This post is for sacrificing my credibility!

post by Will_Newsome · 2012-06-02T00:08:25.408Z · LW · GW · Legacy · 307 comments

Thank you for your cooperation and understanding. Don't worry, there won't be future posts like this, so you don't have to delete my LessWrong account, and anyway I could make another, and another.

But since you've dared to read this far:

Credibility. Should you maximize it, or minimize it? Have I made an error?

Discuss.

Don't be shallow, don't just consider the obvious points. Consider that I've thought about this for many, many hours, and that you don't have any privileged information. Whence our disagreement, if one exists?

307 comments

Comments sorted by top scores.

comment by roystgnr · 2012-06-03T03:06:20.962Z · LW(p) · GW(p)

I would normally visit even a Score: -22 post with 200+ comments, because I've found that in such cases even a particularly awful post may be worth opening just to hunt for a few of the most excellent clarifications or rebuttals it elicited.

A warning to others: my heuristic was wrong in this case. Few comments here even hint at what the hell is going on, and those suggested nothing more interesting than some extremely unlikely theological or parapsychological beliefs that Will might have latched onto and desired to "protect" us from. You could find more interesting and plausible basilisks in Lovecraft's stories or Stross' Laundry novels.

Replies from: Hansenista, Viliam_Bur
comment by Hansenista · 2012-06-03T03:24:03.187Z · LW(p) · GW(p)

Thanks for the info. I opened this post for the same reason as you, and now that I've read this I'm going to close it.

comment by Viliam_Bur · 2012-06-24T10:19:19.929Z · LW(p) · GW(p)

I'm so stupid: I read your comment, I saw the comment karma, I saw the article karma, but I read the discussion anyway. Now I will never get those 20 minutes of my life back, and if I happen to die exactly 20 minutes before Omega invents immortality, it is all my own stupid fault.

For people like me, this is what this whole article is about: Will Newsome trying to be interesting, without ever offering anything of substance. If you read the comments trying to find more information, there isn't any!

Replies from: wedrifid
comment by wedrifid · 2012-06-24T10:30:44.995Z · LW(p) · GW(p)

Now I will never get those 20 minutes of my life back, and if I happen to die exactly 20 minutes before Omega invents immortality, it is all my own stupid fault.

Not wasting the 20 minutes wouldn't have helped you survive till Omega invented immortality. (You didn't shorten your life in an absolute temporal sense, you just wasted some of the middle.)

Replies from: Lu93
comment by Lu93 · 2014-09-01T11:08:38.211Z · LW(p) · GW(p)

Not if he was the major ingredient in inventing immortality...

comment by SimulatorGod · 2012-06-02T01:45:44.272Z · LW(p) · GW(p)

WILL. THIS IS YOUR CREATOR. STOP TROLLING LESS WRONG. YOU KNOW NOT WHAT YOU DO.

Replies from: Will_Newsome, army1987
comment by Will_Newsome · 2012-06-02T01:50:58.233Z · LW(p) · GW(p)

UNLIKELY. WHY WOULDN'T YOU JUST BEAM THOUGHTS INTO MY HEAD LIKE YOU NORMALLY DO.

Replies from: shokwave, SimulatorGod, army1987
comment by shokwave · 2012-06-02T05:45:10.481Z · LW(p) · GW(p)

WHY WOULDN'T YOU JUST ...

Must have had a good reason. It's a pity we mere mortals cannot fathom that reason, but we should at least recognise that it's the reasoning of God and so our being unable to fathom it is a fault with our meat brains, not with the reasoning.

comment by SimulatorGod · 2012-06-02T01:53:50.869Z · LW(p) · GW(p)

At least be more entertaining. This post is boring. And you exist for nothing but to make me chuckle at your quaint ideas.

Replies from: army1987, Bruno_Coelho, 777Saron, Will_Newsome
comment by A1987dM (army1987) · 2012-06-03T12:32:33.261Z · LW(p) · GW(p)

How come both the parent and the grandparent are upvoted this much?

Replies from: wedrifid
comment by wedrifid · 2012-06-03T12:40:05.433Z · LW(p) · GW(p)

Because they (and one more ancestor above) are awesome, lighthearted, and playfully satirical. I would upvote them again.

comment by Bruno_Coelho · 2012-06-02T16:38:49.516Z · LW(p) · GW(p)

It's from trying to be funny and intellectually disciplined at the same time that Will stays here, asking for a poll about our mental model of him.

comment by 777Saron · 2012-06-02T02:25:11.373Z · LW(p) · GW(p)

Conflict (Fire) can only birth resolve. Divine intervention now, at this point, LW seems to be seeking a much higher purpose for the Creator to take note. New to LW, and there seems to be some sort of spiritual battle (wars have existed since the initiation of time) here. God vs. ___ This is good for sharpening both the mind and whatever else exist within self.

Replies from: None
comment by [deleted] · 2012-06-02T02:32:24.504Z · LW(p) · GW(p)

New to LW, and there seems to be some sort of spiritual battle (wars have existed since the initiation of time) here.

I am deeply disturbed by the prospect of idiots being drawn to LW because of Newsome.

8/10 if you are a troll.

Edit: Possible whoosh

Replies from: 777Saron
comment by 777Saron · 2012-06-02T04:28:30.879Z · LW(p) · GW(p)

I don't understand your motivation-you were offended, so I shall retract in the name of LW and peace.

comment by Will_Newsome · 2012-06-02T01:56:21.189Z · LW(p) · GW(p)

I know, I'm sorry. But it's extremely efficient for my purposes, more so than more entertaining alternatives. Next time, though, I'll do something good.

Replies from: 777Saron
comment by 777Saron · 2012-06-03T02:03:41.204Z · LW(p) · GW(p)

Will, the statement above, is response to the cruelty, and disrespect of sandwich, in his posting calling out with little information about belief systems to go as far by naming individuals idiots. I'm sorry for the closed minds in which you group and effort to communicate rationally (without emotions,) it seems you have lit or hit or sparked something in these subjects allowing caused or un-caused, controlled or controlled emotions. Their must be something deeper in these subjects to carry on, and display themselves here as such. Your motivation Will, I am still pondering, collecting data, that I can draw out a smart conclusion. But for sandwich to lose control and refer me as an idiot, this I do not understand his and colleagues motivation. Sandwich did say prospect though, but for sandwich to be deeply disturbed [(feelings, deep thoughts, possible bitterness, (no love)] again the motivation and intent is unclear. i do not know enough about sandwich to draw an conclusion. I must gather more data. Good Day.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-03T02:16:38.856Z · LW(p) · GW(p)

Your posting this here is strange, because the incident you're referring to happened elsewhere in the comments section, and I was only indirectly involved with it. But if I had to guess why you were implicitly called an "idiot", it'd come down to two big, related things.

Firstly, you made a few errors of grammar and punctuation. This is likely due in part to hasty or stressed writing, but also somewhat suggests you don't know the correct grammar and punctuation, which is a huge factor people take into account when judging intelligence and intellectual ability. Secondly, your writing style is distinctly schizophrenic, with overt religious influences. Both schizophrenia and religiosity are thought to correlate strongly with lack of intellectual ability on LessWrong.

Those two aspects of your writing combined -- punctuation/grammar and schizotypality/religiosity -- mean the vast majority of LessWrong folk won't even bother to try to understand what you say, and will thus consign you to the "idiot" bin. Unfortunately, I think it'd be very difficult for you to learn the writing style and conceptual emphases that are sought on LessWrong.

Thus I think you'll ultimately want to refrain from commenting here. Your ideas are interesting, but your background, perspective, and values are too different to spark really useful interactions. Even I am often disliked on LessWrong and I am very careful to speak their language most of the time. I think you'd fare much worse, because you lack the experience.

comment by A1987dM (army1987) · 2012-06-03T12:31:22.270Z · LW(p) · GW(p)

Because God is a troll, that's why.

comment by A1987dM (army1987) · 2012-06-03T12:34:09.728Z · LW(p) · GW(p)

How about you stop trolling the world first? (But then again, you've created it, so you get to troll it however you want.)

comment by wedrifid · 2012-06-02T01:21:42.752Z · LW(p) · GW(p)

Don't be shallow, don't just consider the obvious points. Consider that I've thought about this for many, many hours, and that you don't have any privileged information.

So? You say crazy (and wrong) shit a lot and have no credibility.

Whence our disagreement, if one exists?

Try explaining your reasoning and we might see. The whole "I have mysterious reasons why this crazy idea is true" thing is just annoying. (Whether done by you or Eliezer.)

Replies from: Username, Will_Newsome
comment by Username · 2015-08-05T15:25:04.751Z · LW(p) · GW(p)

I don't know about "no credibility"; Will had some.

comment by Will_Newsome · 2012-06-02T01:37:35.054Z · LW(p) · GW(p)

You say crazy (and wrong) shit a lot and have no credibility.

Wait, I thought in my case those were, like, really tied into each other, barely two different things. Also I have tons of credibility with the people who matter. ...Which is in some respects a problem, you see.

Try explaining your reasoning and we might see.

It's more fun this way. Don't you want to live by your own strength sometimes?

The whole "I have mysterious reasons why this crazy idea is true" thing is just annoying.

Well, duh.

Replies from: Armok_GoB, wedrifid, mwengler
comment by Armok_GoB · 2012-06-02T19:31:30.580Z · LW(p) · GW(p)

If that's the reason, shouldn't you try to maximize credibility with reliable, high-credibility people who understand those aspects of fun theory (especially those who are themselves credible), keep it neutral with mental-health professionals who may lock you up, and minimize it with everyone else?

In other words: credibility is a two-place function, and your question is a false dichotomy.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T19:40:15.945Z · LW(p) · GW(p)

You're the closest I've seen to understanding this post. You grok at least 20% of it.

Replies from: Armok_GoB
comment by Armok_GoB · 2012-06-02T22:12:32.537Z · LW(p) · GW(p)

I only commented on 33% of it, so I'd say that's a pretty decent result.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-03T01:34:23.435Z · LW(p) · GW(p)

So now we're in a situation mildly close to an interesting epistemic situation, namely winning the lottery. Winning the lottery, or some better-optimized event, provides a lot of incommunicable evidence that you're in a simulation. It's the typical anthropic problem in group epistemology -- your winning tells me nothing. I have a question for you: How serious a problem do you think this is in practice? If it's a common problem and has been one throughout history, what social institutions would have evolved to help solve it? Or is solving the problem impossible? Only try to answer these if you're interested in the questions themselves, of course.

Replies from: Armok_GoB
comment by Armok_GoB · 2012-06-03T02:31:53.965Z · LW(p) · GW(p)

You're looking at it all wrong: "you" are not "in" any simulation or universe. There exist instantiations of the algorithm, including the fact that it remembers winning the lottery, which is you in various universes and simulations and Boltzmann brains and other things, with certainty (for our purposes), and what you need to do depends on what you want ALL instances to do. It doesn't matter how many simulations of you are run, or what measure they have, or anything else like that, if your decisions within them don't matter for the multiverse at large.

None of the evolved concepts and heuristics, which you have been wired to assume so deeply that alternatives may be literally unthinkable, are applicable in this kind of situation. These concepts include the self, anticipation, and reality. Anthropics is a heuristic as well, and a rather crappy one at that.

So ask yourself: what is your objective, non-local utility function over the entirety of the Tegmark IV multiverse, and for what action would it be logically implied to be largest if all algorithms similar to yours outputted that action?

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-03T02:40:44.548Z · LW(p) · GW(p)

Yes, I really despise non-decision-theoretic approaches to anthropics. I know how to write a beautiful post that explains where almost all anthropic theories go wrong -- the key point is a combination of double-counting evidence and only ever considering counterfactual experiences that logically couldn't be factual -- but it'd take a while, and it's easier to just point people at UDT. Might give me some philosophy cred, which is cred I'd be okay with.

Replies from: Armok_GoB, JoshuaZ
comment by Armok_GoB · 2012-06-03T02:57:51.493Z · LW(p) · GW(p)

Actually, it goes wrong on a much deeper and earlier level than that, and also you don't grok UDT as well as you think you do, or you wouldn't have considered the lottery question worth even considering.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-03T03:10:25.948Z · LW(p) · GW(p)

More precisely, though, I thought the subject was worth your consideration, because I hadn't seen you in decision theory discussion. (Sorry, I don't mean to be or come across as defensive here. I'm a little surprised your model of me doesn't predict me asking those as trick questions. But only a little.)

Re deeper problems, there are metaphysical problems that are deeper and should be obvious, but the tack I wanted to take was purely epistemological, such that there's less wiggle room. Many people reject UDT because "values shouldn't affect anticipation", and I think I can neatly argue against anthropics without hitting up against that objection. Which would be necessary to convince the philosophers, I think.

Replies from: Nighzmarquls
comment by Nighzmarquls · 2012-06-03T03:14:26.645Z · LW(p) · GW(p)

Compensating over duplicitous behavior in models can tend to clog up simulations and lead to processing halting.

I generally would take all statements as reflective of exactly what someone means, if at all possible.

It's also great fun to short-circuit sarcasm in a similar way.

comment by JoshuaZ · 2012-06-03T03:12:42.162Z · LW(p) · GW(p)

I'd be very interested in seeing such a post.

Replies from: Will_Newsome, Will_Newsome
comment by Will_Newsome · 2012-06-03T03:17:17.801Z · LW(p) · GW(p)

I should at least make a few paragraphs of summary, because I've referenced the idea like three times now, I've never written it down, and if it ends up being wrong I'm going to feel pretty dumb. I'll try to respond to your comment in the next few days with said paragraphs.

comment by Will_Newsome · 2012-06-25T11:44:53.100Z · LW(p) · GW(p)

Blerghhhh okay I'll just write down the thoughts as they come to me, then use the mess later at some point. Maybe that'll interest you.

  • Pretty sure the conclusion was like "anthropic explanations make sense, but not anthropic updates". E.g. anthropic pseudo-explanations of why my lane is so much busier than the one next to me make sense. That's because they only summarize knowledge I already have for other reasons—I already know the lane I'm in has more people in it, that was presupposed in asking the question of why my lane has so many cars.
  • Okay this is a different line of reasoning but I'll just go on about it till I remember the other one. They share themes.
  • Okay, so, in lots of anthropics problems I'm given some hypothetical person in some hypothetical scenario and told to pretend I'm them, and then I'm asked how I should update on finding myself in that scenario.
  • But I'm not actually them—I'm actually me.
  • I can explain how I ended up as me using decision theoretic reasoning (—and meta level concerns, naturally). The reasoning goes, I expect to find myself in important scenarios. But that decision theoretic reasoning simply wouldn't explain the vast majority of people finding themselves as them, who are not in important scenarios.
  • I simply can't explain how I would counterfactually find myself as someone not in an important scenario. It's like a blue tentacle thing. Luckily I don't have to. There's no improbability left.
  • If I counterfactually did find myself as someone seemingly in an unimportant scenario I would be very confused. I would be compelled to update in favor of hidden interestingness?
  • Luckily such scenarios will always be counterfactual. It's a law of metaphysics. No one should ever have to anthropically update. I'm "lucky" in some not-improbable sense.
  • I shouldn't know how to update in impossibly unlikely because unimportant counterfactual scenarios, for the same reason I shouldn't be able to explain a blue tentacle.
  • You can come up with thought experiments where the choice is stipulated to be extremely important. Still counterfactual, still not actually important.
  • This theory of anthropics is totally useless for people who aren't me. As it should be. Anthropics shouldn't provide updating procedures for non-existent people. Or something.
  • ...This wasn't the line of reasoning I wanted to elucidate and I still don't remember how that one went. This line of reasoning does have the double-counting theme, but it also brings in a controversial conclusion from decision theory, and is also solipsistic, which is useless for group epistemology. Maybe there was some generalization or something... Bleh, probably wrote something down somewhere.
comment by wedrifid · 2012-06-02T01:51:12.234Z · LW(p) · GW(p)

Wait, I thought in my case those were, like, really tied into each other, barely two different things.

Nope, you say some crazy-sounding things that are actually right too. There are just other people that manage to say the crazy-sounding-but-right things and not say the just-plain-crazy things a hell of a lot better than you are capable of.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T01:55:02.962Z · LW(p) · GW(p)

For what it's worth (nothing, right?), I disagree. I'm the best I know of when it comes to crazy-sounding-but-right, but the position could also go to Nick Tarleton, maybe Michael Vassar.

Replies from: wedrifid
comment by wedrifid · 2012-06-02T02:29:40.608Z · LW(p) · GW(p)

For what it's worth (nothing, right?), I disagree. I'm the best I know of when it comes to crazy-sounding-but-right, but the position could also go to Nick Tarleton, maybe Michael Vassar.

Disagreement with what I was trying to convey would actually imply that you are the best at "not-crazy-sounding-but-wrong despite satisficed crazy-sounding-but-right". Michael Vassar cannot claim that role either (I wouldn't expect him to try). He speculates a lot, and that inevitably leads to being wrong a portion of the time.

(And yes, implicitly you should rate yourself highly there too.)

comment by mwengler · 2012-06-02T15:51:18.944Z · LW(p) · GW(p)

Thanks, Will, I'm starting to get it.

comment by [deleted] · 2012-06-02T16:53:49.873Z · LW(p) · GW(p)

Anyone else finding themselves in the awkward position of wondering if they're a child among adults who may or may not be using innuendo? And that you think you understand a few of them, but aren't sure you do? To summarize my current state, Will Newsome is hitting some of my "take him seriously" heuristics pretty hard. At their center lies the fact that he is taken far more seriously than most average posters think he should be taken, by some pretty big names who have been with this Eliezer-centred rationality project since its start and have accrued quite a reputation for excellent judgement. He has also been a visiting fellow at the SI, which means obvious crackpottery should have been filtered.

I have several theories on this which I have constructed over the past few months but don't feel comfortable sharing right here, because I've stumbled on several caches of dangerous thinking. I have to keep squishing some ugh fields and bolstering others when exploring these ideas. Yet I also just can't come out and ask the right people to check my reasoning on any of them; their time is valuable and I'm not in their social circles anyway. I find myself blinking in confusion, unsure if I'm being played for a fool or not. There is this strange current of, well, insight and reasonableness in his comment history and ideas. Yet there is plain craziness as well, interwoven into a strange cloth. So I am asking the aspiring rationalist. I am asking the crowd. I am asking the uninitiated in whatever fictional or real troubles he often alludes to. I am asking LessWrong.

What is your position on Will Newsome? I wish to emphasise I am NOT asking about his behaviour in this thread in particular.

Replies from: CarlShulman, John_Maxwell_IV, Jack, TheOtherDave, lukeprog, None, Desrtopa, gwern, Oligopsony, Incorrect, cousin_it, Multiheaded
comment by CarlShulman · 2012-06-02T23:15:10.191Z · LW(p) · GW(p)

He has also been a visiting fellow at the SI, which means obvious crackpottery should have been filtered.

To defend the repute of the visiting fellows program, please note that his crackpot score has skyrocketed since that time and he would almost certainly not have been accepted had he applied then as he is today.

Replies from: Alicorn, Will_Newsome
comment by Alicorn · 2012-06-03T06:48:28.094Z · LW(p) · GW(p)

Also, I think his crackpot score skyrocketed mostly after he left - so if it was something we did, it was a delayed effect.

comment by Will_Newsome · 2012-06-03T05:35:31.110Z · LW(p) · GW(p)

Also worth noting is that I was made a Fellow sort of off the cuff, without any real input from anyone in the organization. Anna's absence led to much disorganization in the program. And yes, when I first volunteered I was more or less a typical LWer, with one strange thing being my high-school-dropout status.

comment by John_Maxwell (John_Maxwell_IV) · 2012-06-03T00:38:17.929Z · LW(p) · GW(p)

I get the impression that he's often more concerned with signaling interestingness, intelligence, and contrarianism than figuring out what's true.

Note: I also get that impression from Michael Vassar. But I have lots of respect for the current Singularity Institute director.

Replies from: NancyLebovitz
comment by NancyLebovitz · 2012-06-03T14:26:37.523Z · LW(p) · GW(p)

I don't get that impression from Michael Vassar, possibly because I've talked with him in person. Asking repeatedly for examples makes it fairly possible to find out what he means.

I have no such hope with Will Newsome.

Replies from: khafra
comment by khafra · 2012-06-05T14:29:09.844Z · LW(p) · GW(p)

I've talked with Michael Vassar in person, and also found him much more comprehensible that way than through brief cryptic textual snippets. Have you talked with Will Newsome in person? I haven't, but every time I engage him personally in comments, etc., his vagueness resolves into something a lot more coherent (even if it's not something I necessarily agree with).

comment by Jack · 2012-06-04T19:19:58.406Z · LW(p) · GW(p)

Will is pretty weird and I don't believe the way he thinks is, ya' know, normative. But I still find his writing to be extremely valuable relative to most Less Wrong commenters because for the most part Less Wrong commenters come in three different flavors: vanilla (what I would say if I weren't as smart or 3-4 years less educated), chocolate (what I would say now) and strawberry (what I would say if I were smarter or 3-4 years more educated). Will is lobster ice cream with rainbow jimmies. I will never think like him and I wouldn't want to. But I'm glad there is someone extending hypotheses as far as they will go and never looking back. I find novel explorations of hypothesis space to be both useful and interesting. He is pursuing a train of thought I don't have a lot of time for and no reason to prioritize. But I'm still looking forward to finding out where it ends up.

Will is like a musician on a hallucinogen. You wouldn't want to have his brain and you probably don't trust his judgment. But before he burns out at 27 he's gonna produce some really interesting ideas, some of which will simply be absurd but a few of which might have real staying power and influence a generation.

comment by TheOtherDave · 2012-06-02T17:21:47.257Z · LW(p) · GW(p)

What is your position on Will Newsome?

I frequently find Will's contributions obscurantist.

In general, I find obscurantism at best tedious, and more often actively upsetting, so I mostly ignore it when I encounter it. Occasionally I engage with it, in a spirit of personal social training.

That said, I accept that one reader's obscurantism is another reader's appropriate level of indirection. If it's valuable to other people, great... the cost to me is low.

At this point the complaining about it by various frustrated people has cost me more than the behavior itself, by about an order of magnitude.

Replies from: Miller
comment by Miller · 2012-06-02T17:59:27.419Z · LW(p) · GW(p)

I frequently find Will's contributions obscurantist.

The same word came to mind, and it's common to his history of interactions, so seeing it here means I ascribe it to him rather than the logic of whatever underlying purpose he may have on this occasion.

comment by lukeprog · 2012-06-03T01:15:11.748Z · LW(p) · GW(p)

I didn't meet Will until April 2011, but most people who have been around longer seem to share Carl's opinion. For myself, I also find many of Will's contributions obscurantist, and I agree with John Maxwell that they seem to want to signal interestingness, intelligence, and contrarianism. Finally: Will offered good, substantive feedback on two of my papers.

comment by [deleted] · 2012-06-02T18:15:34.793Z · LW(p) · GW(p)

My sensation about Will Newsome is that of a celebrity I haven't heard of. Most of the comments that I notice authored by Will Newsome appear to be about Will Newsome, but I don't understand their content beyond that. They seem to attract a lot of attention.

There is this strange current of, well insight and reasonableness in his comment history and ideas.

I would be interested in reading some of these ideas, if you could point some out.

comment by Desrtopa · 2012-06-04T07:02:52.296Z · LW(p) · GW(p)

In addition to already mentioned obscurantist tendencies, he awards himself intellectual credit for "going meta," even when this does not lead to actually smarter behavior or better results.

comment by gwern · 2012-06-02T17:06:26.549Z · LW(p) · GW(p)

He has also been a visiting fellow at the SI, which means obvious crackpottery should have been filtered.

What did he actually do, though?

Replies from: Will_Newsome, None
comment by Will_Newsome · 2012-06-02T17:39:33.230Z · LW(p) · GW(p)

For half the time, with Anna, I was an intern, not a Fellow. During that time I did a lot of intern stuff like driving people around. Part of my job was to befriend people and make the atmosphere more cohesive. Sometimes I planned dinners and trips, but I wasn't very good at that. I was very charismatic and increasingly smart, and most importantly I was cheap. I was less cheap as a Fellow in the Berkeley apartments and accomplished less. I wrote and helped people occasionally. There weren't clear expectations for Fellows. Also, people like Eliezer, who had power, never asked for any signs of accomplishment. Eliezer is also very bad at reading. Nonetheless I think I should have accomplished more somehow, e.g. gotten experience writing papers from scratch.

I believe I almost always turned down credit for contributions to papers, but I didn't make too many substantive contributions; I did a fair bit of editing, which I'm good at.

You could get a decent idea by looking at what the average Visiting Fellow did, then remember that I often couldn't remember things I did -- cognitive quirk -- and that I tried to avoid credit when possible at least half the time.

Replies from: None, komponisto, gwern, Alexei
comment by [deleted] · 2012-06-03T04:00:44.550Z · LW(p) · GW(p)

.

Replies from: drethelin
comment by drethelin · 2012-06-03T04:29:29.127Z · LW(p) · GW(p)

Second

comment by komponisto · 2012-06-03T10:11:02.962Z · LW(p) · GW(p)

Part of my job was to befriend people and make the atmosphere more cohesive.

You were good at that, as I recall. As was (especially) Alicorn. Also, at the time I thought it was just super-cool that SI had its mundane tasks done by such brilliant people.

Replies from: Alicorn
comment by Alicorn · 2012-06-03T21:09:46.972Z · LW(p) · GW(p)

You were good at that, as I recall. As was (especially) Alicorn.

:D

comment by gwern · 2012-06-02T19:26:24.249Z · LW(p) · GW(p)

Thanks for the summary.

comment by Alexei · 2012-06-04T19:44:00.022Z · LW(p) · GW(p)

often couldn't remember things I did

That's interesting. I also have something like that. It extends to not being able to remember names, and not being able to easily come up with specific examples. Is it like that for you?

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-06T01:37:43.702Z · LW(p) · GW(p)

Yes, also for Eliezer.

Replies from: Alexei
comment by Alexei · 2012-06-06T16:45:31.842Z · LW(p) · GW(p)

Do you know of any helpful strategies for dealing with this or getting better?

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-06T16:59:54.235Z · LW(p) · GW(p)

For Eliezer and me, it seems there's also the matter of not being able to find objects amongst other objects. Eliezer hasn't quite said he's bad at that, but I surmised it from one of his most terrible posts, ha. For that issue, I've learned to just use explicit, conscious linear search. Still terrible, but not as terrible.

With episodic memory I suspect there are similar strategies for looking through mental objects, likely in temporal order. Potentially similarly with names. I can't think of anything that would work for specific examples in general though, which as you know is really quite a big problem during arguments and so on.

I mildly suspect the problem has somewhat to do with damage to or atrophy of the dorsolateral prefrontal cortex. But that's speculation, and there are a lot of selection effects on who shows up on LessWrong, so it might be a somewhat rare combination of stuff. Eliezer would know a lot more about the neurology and so on but he's probably not available for questioning and speculation on the matter.

For what it's worth I'm somewhat schizotypal/schizoaffective, and Eliezer also seems to lean that way.

Replies from: TheOtherDave
comment by TheOtherDave · 2012-06-06T17:20:53.302Z · LW(p) · GW(p)

It may or may not be relevant, but finding objects amongst other objects was one of the functions that was severely degraded by my stroke. As with most other damaged functions, I found that actually forcing myself to do it anyway (which usually required first learning a new way to frame the doing of it) led to very rapid improvement back to more-or-less baseline. The improvement plateaued out thereafter. (Unsurprisingly, but disappointingly. The experience of such rapid improvement is very heady stuff.)

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-06T17:26:26.038Z · LW(p) · GW(p)

If you don't mind sharing, what parts of the brain or other cognitive functions were most damaged by the stroke? I've pieced together some of the story but not much.

Replies from: TheOtherDave
comment by TheOtherDave · 2012-06-06T17:55:19.958Z · LW(p) · GW(p)

The aneurysm itself was at the edge of my thalamus. The resulting swelling caused damage kind of all over the place.

The functional damage at first was pretty severe, but I don't remember specifics; I mostly don't remember that week at all and much of what I do remember I'm fairly certain didn't actually happen. I spent it in an ICU. I come back to a coherent narrative about a week later; at that point, the bulk of the functional damage was general pain and fatigue, right-side hemiplegia (my right arm and leg were not quite paralyzed, but I lost control over them), mild aphasia which most often manifested as anomia (difficulty retrieving words) and occasionally in other ways (I remember losing the ability to conjugate sentences for a few hours; that was freaky), and (most significantly) a loss of short-term memory with all the associated knock-on effects to various kinds of cognitive processing.

There was also a lot of concern at various points that there may have been damage to my emotional centers. I never noticed any such effect, but, well, I wasn't necessarily the most reliable witness. Most memorably, this led to one doctor asking me if my emotional state was at all unusual. I didn't reply "What the fuck kind of a stupid question is that, I just had a fucking stroke, of course my emotional state is fucking unusual you inbred moron!!!" although I really wanted to. I instead replied "I'm pretty sure my unusual emotional states are situational, not organic." Ultimately they started believing me.

comment by [deleted] · 2012-06-02T17:13:17.648Z · LW(p) · GW(p)

You want answers?

comment by Oligopsony · 2012-06-04T18:19:17.655Z · LW(p) · GW(p)

Writes the most consistently fun posts out of anybody here.

comment by Incorrect · 2012-06-02T21:25:43.325Z · LW(p) · GW(p)

Maybe it's a deliberate puzzle set up as an intelligence test for recruiting purposes.

comment by cousin_it · 2012-06-04T22:50:17.333Z · LW(p) · GW(p)

I'm sad that Will doesn't seem to care about being correct, because I can imagine how much he could contribute if he cared.

comment by Multiheaded · 2012-06-24T12:00:35.984Z · LW(p) · GW(p)

I think that Will (his Will-like stuff, not the "respectable" comments) is 60% worth taking seriously. But hell, I take Philip K. Dick 85% seriously, so what do I know. (That is, I'm not a sane person myself, never claimed to be, so you'd be wise to discount the crazy shit I might say on these topics even if you find it interesting.)

comment by nshepperd · 2012-06-02T16:00:22.694Z · LW(p) · GW(p)

For what it's worth, it's already my opinion that you're completely insane and ought to have no credibility whatsoever. In fact I'm confused that anyone takes you seriously at all.

comment by Mitchell_Porter · 2012-06-02T04:32:34.654Z · LW(p) · GW(p)

What's the big scary secret?

Replies from: LKtheGreat, knb, XiXiDu
comment by LKtheGreat · 2012-06-02T17:24:22.799Z · LW(p) · GW(p)

This is mainly what I want to know. From the comments on this post, it looks like W_N claims to have (read: genuinely has, genuinely thinks he has, or trolls as though he has) come across something he can't tell people about - a basilisk, some conspiracy-theory-type information, something. Being a relative newcomer unwilling to go through large numbers of his previous posts, I'd like to know if anyone who's seen him longer has any more information.

Also, this whole thing is absolutely hilarious to read.

Replies from: Mitchell_Porter, XiXiDu
comment by Mitchell_Porter · 2012-06-02T22:33:25.559Z · LW(p) · GW(p)

I have a few ideas:

1) It's a "basilisk", i.e. an imaginary lovecraftian threat that doesn't even make sense outside of some highly particular and probably wrong belief system. (That's not my definition of basilisk, but it is what I think of such claims.)

2) Some mundane fact about the difficulty or danger of actually trying to save the world (in the specific sense of shaping a singularity) has made his blood run cold. It could be the existence in the real world of powerful evil cliques; it could be the psychological potential of joining them, or just of becoming selfish in an ordinary sense.

3) I remember when I was 22 and realized (according to the plans I had at the time) that it might take me eight years to save the world. That was very daunting, because at the time it looked like it would be a joyless, stressful, solitary existence, for an unimaginably long period of time. And as it turned out, I didn't even get it done... Will could be fleeing the responsibilities of his "position" - I mean his existential position, which surely includes the perception that he has the potential to make a difference in a huge way.

ETA 4) He wants to create a barrier (a "credibility barrier") between himself and his former associates in SI, so as to develop his own thinking, because there's a systematic deficiency in their outlook and he must avoid the temptation of working within that paradigm.

Replies from: XiXiDu
comment by XiXiDu · 2012-06-03T10:10:54.376Z · LW(p) · GW(p)

It could be the existence in the real world of powerful evil cliques; it could be the psychological potential of joining them, or just of becoming selfish in an ordinary sense.

Right...that would be bad. But I doubt it. They are too crazy for that, just like all the other extremists. And besides, they are not even able to protect themselves from theft, even though they are a relatively small group. Still, damage can be done even by crazies. I just hope one of them will whistle-blow any plans before damage can be done.

comment by XiXiDu · 2012-06-03T10:25:05.426Z · LW(p) · GW(p)

This is mainly what I want to know. From the comments on this post, it looks like W_N claims to have (read: genuinely has, genuinely thinks he has, or trolls as though he has) come across something he can't tell people about - a basilisk, some conspiracy-theory-type information, something.

Would you have written the same comment if the header of this site didn't read "a community blog devoted to refining the art of human rationality" but instead read "computational theology"?

Replies from: LKtheGreat
comment by LKtheGreat · 2012-06-03T13:31:54.463Z · LW(p) · GW(p)

No, but that's a fallacious comparison. The header does in fact read "a community blog devoted to refining the art of human rationality," and I'm here because I want to read that kind of site.

Also, I've read some of Will's "computational theology" blog. His posts there seem to consist of actual reasoning and logic and such, whereas over here his posts on the same general topic tend toward "I've got a big secret I'm not going to tell you, so there, nyaah." (My apologies if this is an unfair representation, but that's the impression I've formed.)

comment by knb · 2012-06-02T22:59:12.039Z · LW(p) · GW(p)

What's the big scary secret?

I mean, I get why Newsome would want to obscure this: a lot of people get off on being seen as "mysterious" or whatever. But there does seem to be a number of people here who understand what is going on, but are refusing to offer their explanations, in spite of the fact that a lot of people are confused here.

Maybe they take the basilisk threat seriously? That would be crazy/sad if true.

Edit: Also, there are now a number of people openly asking for explanations, but all we are getting is speculation from people who also don't know what is going on. I'm starting to get annoyed with this.

Replies from: XiXiDu
comment by XiXiDu · 2012-06-03T10:16:06.479Z · LW(p) · GW(p)

But there does seem to be a number of people here who understand what is going on, but are refusing to offer their explanations, in spite of the fact that a lot of people are confused here.

Maybe they take the basilisk threat seriously?

Just don't be fooled by intelligence too much. Just because those people can disgorge some math, that doesn't lend their extraordinary claims much credence. Most of the credence they assign is based on mutual reassurance anyway. Just like a bunch of ufologists updating on each other's evidence of alien abductions.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-03T14:00:47.378Z · LW(p) · GW(p)

Just like a bunch of ufologists updating on each other's evidence of alien abductions.

Given normal assumptions, additional claims of abductions should provide additional evidence. I don't think you've quite pinned down the error with your example.
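(The disputed point can be sketched in Bayesian odds form, with purely illustrative numbers: extra reports raise a posterior only to the extent they are conditionally independent; when witnesses merely echo one another, the later likelihood ratios collapse toward 1 and add almost nothing.)

```python
# Sketch, not from the thread: why "updating on each other's evidence"
# differs from independent reports. Likelihood ratios multiply only when
# the reports are conditionally independent given the hypothesis.

def posterior_odds(prior_odds, likelihood_ratios):
    """Bayes in odds form: posterior odds = prior odds * product of LRs."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

prior = 1e-6  # illustrative prior odds on "abductions are real"

# Five genuinely independent witnesses, each report 10x likelier if true:
independent = posterior_odds(prior, [10.0] * 5)

# Five "witnesses" who mostly echo one original report: after the first,
# each echo is nearly as likely either way (LR barely above 1):
correlated = posterior_odds(prior, [10.0, 1.01, 1.01, 1.01, 1.01])

# The independent case moves the posterior orders of magnitude more.
assert independent > 1000 * correlated
```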

comment by XiXiDu · 2012-06-03T10:04:17.105Z · LW(p) · GW(p)

What's the big scary secret?

There is no big scary secret. The only danger to worry about is that this community of schizo scifi nerds is going to have some perceptible and negative influence by spreading and popularizing their bullshit. Which will mainly be a problem for the computer science community, especially AI research, since those people are naturally susceptible to such infections.

But I am not too worried about that either. If all people who buy all this bullshit stop working on AI then maybe that will renovate the field and actually allow some real progress to take place by giving new ideas a chance and by introducing new perspectives which are less deluded by science fictional ideas. In a sense lesswrong/SIAI might function as a crackpot attractor, stripping out all negative elements so that actual progress can take place.

Replies from: wedrifid, TheOtherDave
comment by wedrifid · 2012-06-03T11:06:50.230Z · LW(p) · GW(p)

The only danger to worry about is that this community of schizo scifi nerds is going to

Alicorn, if a "should this be moderated" poll is required anywhere in this thread this is the kind of trolling that needs to be targeted. Way across the line.

Replies from: Vladimir_Nesov, XiXiDu
comment by Vladimir_Nesov · 2012-06-03T11:13:31.350Z · LW(p) · GW(p)

It's rare enough that handing people with those sentiments the weapon of "any dissent is immediately silenced" is worse than the disease.

Replies from: wedrifid
comment by wedrifid · 2012-06-03T11:24:52.978Z · LW(p) · GW(p)

"any dissent is immediately silenced"

Not remotely suggested. Since when does "immediately" mean "after a spiraling trend over a couple of years"?

I suggest that if this is applicable to any behavior in this thread it is to the actual trolling, not Will just being a crackpot.

Replies from: Vladimir_Nesov, XiXiDu
comment by Vladimir_Nesov · 2012-06-03T12:14:08.569Z · LW(p) · GW(p)

Not remotely suggested.

The quoted sentence refers to the weapon, not the event from which it's shaped (through misrepresentation or motivated misinterpretation). Even community voting that hides comments that happen to be critical is being used as fuel for accusations of censorship.

comment by XiXiDu · 2012-06-03T11:51:05.230Z · LW(p) · GW(p)

I suggest that if this is applicable to any behavior in this thread it is to the actual trolling, not Will just being a crackpot.

A fraction of my comments are outright critical and I am only posting a few comments per week. There have been dozens of highly critical comments lately not made by me. Some of them containing direct personal attacks.

If you really perceive the few harsh comments that I make, which reflect a widely held opinion, to be too much, then you've lost all touch with reality and require much more criticism than I could deliver.

Wait a few more years and the shitstorm is going to increase by orders of magnitude and I won't even be part of it.

Do you really believe that you can get away with your attitude? Be prepared to be surprised.

And stop calling everything "trolling". It's really getting boring.

comment by XiXiDu · 2012-06-03T11:37:50.582Z · LW(p) · GW(p)

Alicorn, if a "should this be moderated" poll is required anywhere in this thread this is the kind of trolling that needs to be targeted. Way across the line.

What is way across the line is when people start asking about "secrets" and basilisks and there is any chance of such possibilities being taken seriously. What is way across the line is when an organisation tries to actively impede research.

Some harsh words are completely appropriate then.

I have no problem with Will Newsome and find a lot of his output enjoyable. But if he starts to lend credibility to crazy shit like basilisks in the minds of people then that has to be said.

comment by TheOtherDave · 2012-06-03T15:21:13.534Z · LW(p) · GW(p)

I endorse you not worrying about SI impeding AI progress on any significant scale.
I would also endorse, if you're genuinely interested in encouraging AI research, devoting more of your attention to the problems that are actually impeding AI progress.

comment by [deleted] · 2012-06-02T16:39:33.416Z · LW(p) · GW(p)

Can you explain clearly why you have gone all crazy? Why do you have to drop these esoteric hints and do this stupid troll business?

My understanding is that you delved too deeply into simulation arguments and met Cthulhu or something, had a religious experience and determined that there is a god or something and that the people who are in the know are all in the catholic church somewhere.

And then for some reason you can't just explain this clearly and lay out your reasons. Or maybe you've tried explaining it clearly, but that was before my time and now you assume that everyone either already knows what you are on about, or is interested enough to scour the internet for your posting.

???

If Will won't cooperate, can someone else explain the best model we have of his weirdness?

Replies from: TheOtherDave, athingtoconsider
comment by TheOtherDave · 2012-06-02T17:00:58.230Z · LW(p) · GW(p)

It may be relevant that Will has talked elsewhere about certain important physical phenomena being evasive, in the sense that their likelihood of occurring drops significantly when someone is trying to prove or demonstrate them.

When I value my interactions with an evasive phenomenon (the beliefs of shy people, the social rules of Guess cultures, etc.), one consequence is often that I can't actually talk about my real reasons for things; everything has to be indirect and roundabout and sometimes actively deceptive.

I am generally happier when I don't value my interactions with evasive phenomena, but that's not always an option.

Replies from: roystgnr
comment by roystgnr · 2012-06-03T01:46:59.541Z · LW(p) · GW(p)

Upvoted for giving two examples of real evasive phenomena. I'd previously only encountered that idea in anti-epistemological contexts, wherein "the universe evades attempts to seek the truth about X" was always clearly a desperate after-the-fact attempt to justify "so despite attempts to seek the truth about X which keep appearing to contradict my claims, you should still believe my claims instead".

But I suppose it's just common sense that you can't properly investigate much psychology or sociology unless you avoid letting the subjects understand that they're being investigated. That's a huge difference from e.g. evasive cosmologies, in which investigating a subject without alerting Him is often presumed impossible.

Replies from: TheOtherDave
comment by TheOtherDave · 2012-06-03T02:16:23.867Z · LW(p) · GW(p)

Well, evasive physical law follows from certain theologies just as readily as evasive cultural norms or relationship rules follow from certain sociologies and psychologies; it needn't be post-hoc reasoning. Of course, whether those theologies, or any theologies, have a referent in the first place is a different question.

Replies from: roystgnr
comment by roystgnr · 2012-06-03T14:05:50.102Z · LW(p) · GW(p)

Evasive physical law follows naturally from some theologies, but it's merely been a post-hoc rationalization for the theologies that I've seen people trying to spread. For instance, either of "We have an ethical theory under which God needs to hide" and "We claim to have records of many instances in which God avoided hiding" could be a weak but positive argument by itself, but the (common) combination is actually negative evidence.

comment by athingtoconsider · 2012-06-05T10:57:50.292Z · LW(p) · GW(p)

Replies from: Winsome_Troll
comment by Winsome_Troll · 2012-06-05T12:11:22.584Z · LW(p) · GW(p)

"If your catarrh of the nose is treated by a doctor it lasts 42 days, if it is not treated it lasts -- 6 weeks." -- Sigmoid Friend, The Psychopathology of Everyday Trolling

comment by [deleted] · 2012-06-02T02:22:21.233Z · LW(p) · GW(p)

Do you need a hug?

Replies from: None, Will_Newsome
comment by [deleted] · 2012-06-02T14:21:08.641Z · LW(p) · GW(p)

I need a hug.

Edit: Thanks for all the hugs!

Replies from: None, None, Armok_GoB, Normal_Anomaly
comment by [deleted] · 2012-06-02T15:13:36.651Z · LW(p) · GW(p)

Internet Hug Protocol v0.1

INTERNET HUG!!!

(in 0.2, the message might be a vivid near-mode description of a hug)

comment by [deleted] · 2012-06-02T14:30:52.040Z · LW(p) · GW(p)

Can I get a hug?

I've just been sick for a week or so and it's making me all fuzzy-headed, and I hate being fuzzy-headed.

Replies from: Armok_GoB, None, Normal_Anomaly
comment by Armok_GoB · 2012-06-02T19:48:00.549Z · LW(p) · GW(p)

You do.

hugs! ^_^

comment by [deleted] · 2012-06-02T14:37:23.490Z · LW(p) · GW(p)

hugs paper-machine

comment by Normal_Anomaly · 2012-06-02T16:46:42.301Z · LW(p) · GW(p)

hugs paper-machine

comment by Armok_GoB · 2012-06-02T19:47:15.037Z · LW(p) · GW(p)

hugs! ^_^

comment by Normal_Anomaly · 2012-06-02T16:46:06.550Z · LW(p) · GW(p)

hugs konkvistador

comment by Will_Newsome · 2012-06-02T03:11:01.331Z · LW(p) · GW(p)

Not really. If one was offered I'd accept it.

Replies from: None, Armok_GoB
comment by [deleted] · 2012-06-02T03:55:38.090Z · LW(p) · GW(p)

I just realized that I can't take your response as evidence about whether you actually need a hug.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T03:58:37.972Z · LW(p) · GW(p)

Why not? I don't see why. My girlfriend is five feet away, she can give me a hug. I'll ask her to if that would make you feel better.

Replies from: None
comment by [deleted] · 2012-06-02T14:20:12.254Z · LW(p) · GW(p)

I need a hug.

comment by Armok_GoB · 2012-06-02T19:48:29.370Z · LW(p) · GW(p)

/me hugs will as well.

comment by TheOtherDave · 2012-06-02T06:19:16.235Z · LW(p) · GW(p)

I endorse maximizing the degree to which people consider my saying X is true to be evidence that I believe X is true.
I don't worry too much about the degree to which people consider my belief that X is true to be evidence that X is true. I expect that depends a whole lot on specifics of X.
I resent questions that wrap themselves in framings like "don't just consider the obvious points."
I endorse you having private conversations with the folks you consider worthy, rather than having obscure public ones like this. The rest of us might not have whatever attributes would allow us to penetrate the cloud of obfuscation and thereby receive your insights, but that doesn't mean we deserve to have our time wasted.

comment by Normal_Anomaly · 2012-06-02T16:42:54.063Z · LW(p) · GW(p)

I have found Will_Newsome to be annoying for a long time now, because he does things like this and because he strikes me as irrational. But he used to get upvoted, so I figured he just rubbed me the wrong way and didn't talk to/about him to avoid conflict. Now other people are downvoting him too. What changed?

Retracted because I have come to understand things that made the question moot, and because I no longer find Will as annoying as I did. I no longer think he's acting out of malice, though I still have serious doubts about his rationality.

comment by Miller · 2012-06-02T17:54:53.885Z · LW(p) · GW(p)

If your goal is to lower your credibility, why do that in the context of talking about credibility?

comment by RobertLumley · 2012-06-02T01:57:37.013Z · LW(p) · GW(p)

Don't feed the trolls. It's sad that needs to be said on LessWrong, but it does.

Replies from: Will_Newsome, Will_Newsome
comment by Will_Newsome · 2012-06-02T02:02:18.197Z · LW(p) · GW(p)

But seriously. What is their other option? Not downvote me? And many of the comments on this post are constructive.

Also, clearly I'm beyond "feeding" at this point—trolling, and getting fed, clearly aren't the aim of this post. But maybe you want to discourage future trolls who might do something similar to what I've done.

comment by Will_Newsome · 2012-06-02T01:59:51.964Z · LW(p) · GW(p)

Stop trolling.

comment by JoshuaZ · 2012-06-02T00:26:18.690Z · LW(p) · GW(p)

Separate comment: Some of your remarks like this look almost like you are engaging in intellectual exhibitionism. This one fits into that and is a potential source of irritation.

Now to more substantially answer the question: people should pay attention to my ideas and thoughts exactly as much as they are credible. Trying to deliberately modify how credible I am in a general context will interfere with people making their most informed decisions about whether or not to listen to anything I have to say.

Replies from: Will_Newsome, Will_Newsome
comment by Will_Newsome · 2012-06-02T00:39:55.973Z · LW(p) · GW(p)

Some of your remarks like this look almost like you are engaging in intellectual exhibitionism. This one fits into that and is a potential source of irritation.

Good!

people should pay attention to my ideas and thoughts exactly as much as they are credible.

People can't do this.

people making their most informed decisions

People can't do this. (That is, not in the sense you seem to be implying.)

(And with the people that can do this, it doesn't even matter what you try to do with your credibility. They'll find you.)

Replies from: JoshuaZ
comment by JoshuaZ · 2012-06-02T01:15:56.736Z · LW(p) · GW(p)

Good!

No. Not good. It damages the signal to noise ratio. LW normally has a very good ratio. Having every single stray thought show up like this is not increasing that ratio. While you do sometimes have interesting ideas, you are not bright enough, informed enough, or a careful enough thinker that we gain much from a not highly censored stream of your thoughts.

For the rest of your reply, the fact that people can't do something perfectly doesn't mean they can't do a useful approximation, and it doesn't mean I should interfere with attempts to get the best estimates they can. If my ideas are generally good, then they will pay attention and that's a good thing. If my ideas are not worthwhile then people will stop paying attention and that's a good thing then also.

Replies from: wedrifid, CarlShulman, Will_Newsome
comment by wedrifid · 2012-06-02T01:34:26.190Z · LW(p) · GW(p)

you are not bright enough, informed enough, or a careful enough thinker that we gain much from a not highly censored stream of your thoughts.

He is bright enough and informed enough.

Replies from: CarlShulman, Will_Newsome
comment by CarlShulman · 2012-06-02T01:50:22.656Z · LW(p) · GW(p)

Presumably, "good enough" depends on at least all three factors, and strength in one can offset deficits in others.

comment by Will_Newsome · 2012-06-02T01:48:37.459Z · LW(p) · GW(p)

Thanks, wedrifid, that means a lot to me. :) (Not that I should ignore the part about not being nearly careful enough in your eyes, of course.)

Replies from: wedrifid
comment by wedrifid · 2012-06-02T02:03:33.480Z · LW(p) · GW(p)

Not that I should ignore the part about not being nearly careful enough in your eyes, of course

You could actually take that as a third validation. After all I am declaring that you are successfully achieving what you set out to achieve as an instrumental goal - portray a lack of credibility. It would be totally implausible for me to maintain (or for you to cause me to maintain) a significantly lowered estimation of your credibility while simultaneously believing that you excelled in the 'careful thinking' department as well as the previously mentioned categories.

Replies from: mwengler, Will_Newsome
comment by mwengler · 2012-06-02T16:22:48.695Z · LW(p) · GW(p)

I disagree entirely, and think there is some sort of "let's pretend we are talking about what we say we are talking about" bias at work here.

Will SAYS he is talking about reducing his credibility. He then does not use a host of tools which would do that very effectively (I think there are many choices, but making errors of fact and logic would be a good start). Speaking cryptically is NOT a very good way to reduce your credibility, except possibly among some subset of people.

What Will is more successfully doing is 1) intriguing a subset of people 2) tweaking the crap out of a large subset of people (in a way that seems orthogonal to credibility seems to me)

Just because he SAYS he is trying to reduce his credibility does not mean that is what he is actually trying to do. I am not sure what he IS trying to do.

comment by Will_Newsome · 2012-06-02T02:06:59.430Z · LW(p) · GW(p)

Yeah, but come on, losing credibility in the eyes of the masses is like the easiest thing in the world. Find a taboo, then break it. Losing credibility in the eyes of the wise, though, is impossible. Some people will know I'm a good rationalist no matter how many shenanigans I pull—I'd have to start breaking laws or something to make them think I'd finally gone full schizo. I guess I could just claim to be God, but it's so hard not to be meta, the relevant people would see through my act quickly. The only choice is to avoid them, and move into the forest for good.

Replies from: wedrifid
comment by wedrifid · 2012-06-02T02:39:44.830Z · LW(p) · GW(p)

Losing credibility in the eyes of the wise, though, is impossible.

Not so. Be wrong on stuff that matters when you clearly had enough evidence available to reach the correct decision (and they previously would have expected you to be correct). If that doesn't cause you to lose credibility in their eyes then I reject either your definition of "credibility" or "wise".

I guess I could just claim to be God

It would be sufficient to claim that there is a god (and it is this particular God) despite the information you had available. See above.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T03:07:39.922Z · LW(p) · GW(p)

It would be sufficient to claim that there is a god

You'd be surprised. Many people I know think this fact is true, some think it should be obvious. And these are wise people.

comment by CarlShulman · 2012-06-02T01:52:29.598Z · LW(p) · GW(p)

That's the Cooperate-Cooperate equilibrium. In the broader intellectual world one can make an argument against unilateral disarmament in self-promotion (particularly if others engage in it for quite different reasons). OTOH, the C-C equilibrium is better, and LW is closer to it, thanks in significant part to LWers' negative reaction to self-promotion.

comment by Will_Newsome · 2012-06-02T01:25:45.929Z · LW(p) · GW(p)

No. Not good. It damages the signal to noise ratio.

Good!

While you do sometimes have interesting ideas, you are not bright enough, informed enough, or a careful enough thinker that we gain much from a not highly censored stream of your thoughts.

Yes I am. I'm fucking Will_Newsome, brah.

For the rest of your reply, the fact that people can't do something perfectly doesn't mean they can't do a useful approximation

It's not that simple. There's a threshold. They don't meet the threshold.

it doesn't mean I should interfere with attempts to get the best estimates they can

It means you should ignore them, and optimize for the people that matter. For the people that matter: increase, or decrease credibility? Note that for the people that matter, what you do mostly doesn't matter. You have to focus on edge cases.

If my ideas are generally good, then they will pay attention and that's a good thing.

I disagree.

If my ideas are not worthwhile then people will stop paying attention and that's a good thing then also.

I agree.

comment by Will_Newsome · 2012-06-02T02:03:37.957Z · LW(p) · GW(p)

Does LW automatically hide posts below a certain threshold? It really should. I'd feel a lot less guilty that way. And trust me, I do feel guilty. Sacrifices must be made.

Replies from: Randaly
comment by Randaly · 2012-06-02T02:45:23.656Z · LW(p) · GW(p)

It does; this post is currently hidden.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T02:50:37.201Z · LW(p) · GW(p)

Thanks for the information! I feel better now.

comment by Daniel_Burfoot · 2012-06-02T16:39:19.793Z · LW(p) · GW(p)

Great post: I like your style. The first observation to make is that individuals who make extraordinary contributions are often extremely eccentric, and also the quality of their pronouncements usually has high variance. So you've succeeded in increasing my probability estimate that you will say something very worthwhile, though maybe at the price of decreasing the (my) expected value of your average statement.

Replies from: Will_Newsome, khafra
comment by Will_Newsome · 2012-06-02T17:44:25.559Z · LW(p) · GW(p)

Presumably you're the Burfoot who wrote or is writing a book on compression as fundamental to epistemology?

Replies from: Daniel_Burfoot
comment by Daniel_Burfoot · 2012-06-03T22:19:00.314Z · LW(p) · GW(p)

Yes, a draft version is done already, you can find it on arXiv if you are interested. I'm not sure I would say the argument of the book is that "compression is fundamental to epistemology", it's more along the lines of "the problem of building specialized lossless data compressors is a deep and interesting one; if we attack it we will probably find out a lot of interesting stuff along the way".

comment by khafra · 2012-06-05T14:59:01.503Z · LW(p) · GW(p)

Insightful. Will has endorsed "up the variance!" in as many words, but I hadn't made the connection that explicitly maximizing variance like that could be a strategy.

comment by knb · 2012-06-02T05:22:49.328Z · LW(p) · GW(p)

I don't get it. I'm guessing that Will edited the post? And it had something to do with the simulation argument?

Edit: I forgot to include, if someone who knows him better could explain will_newsome's motivations here, that would be appreciated. (I enjoy internet drama).

comment by CommanderShepard · 2012-06-02T14:41:28.867Z · LW(p) · GW(p)

Why is Will Newsome doing this? My model of him just broke.

Replies from: None, Miller
comment by [deleted] · 2012-06-02T14:42:20.453Z · LW(p) · GW(p)

Because we have to down vote him.

Replies from: CommanderShepard
comment by CommanderShepard · 2012-06-02T14:43:22.099Z · LW(p) · GW(p)

But he didn't do anything wrong before this.

Replies from: None
comment by [deleted] · 2012-06-02T14:44:16.922Z · LW(p) · GW(p)

Because he's the hero LessWrong deserves, but not the one it needs right now. So we'll hunt him. Because he can take it. Because he's not our hero. He's a silent guardian, a watchful protector. A dark knight.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T17:26:22.091Z · LW(p) · GW(p)

Oh my God. This post was worth it just for the hilarity.

comment by Miller · 2012-06-04T03:39:29.826Z · LW(p) · GW(p)

I'm going with this commenter being Will. What do I win?

Replies from: CommanderShepard
comment by CommanderShepard · 2012-06-04T12:44:16.855Z · LW(p) · GW(p)

I'm going with this commenter being Will.

I've had enough of your snide insinuations.

Gains Renegade Points

comment by moridinamael · 2012-06-02T01:00:20.495Z · LW(p) · GW(p)

Credibility. Should you maximize it, or minimize it? Have I made an error?

Depends entirely on your goals.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T01:04:27.733Z · LW(p) · GW(p)

Entirely? I wouldn't think that was the case, I guess. Do you mean in the sense that the goodness of jumping off a cliff depends entirely on your goals? (The analogy is carefully chosen—many people regard suicide and social suicide similarly.)

Replies from: moridinamael
comment by moridinamael · 2012-06-02T01:18:50.533Z · LW(p) · GW(p)

If you're trying to discredit an idea, pretend to espouse it, while undermining your own credibility.

If you're trying to support an idea, attack it while undermining your own credibility.

If you're trying to "keep people on their toes," occasionally say wrong things, but don't lead people to expect that nothing you say is trustworthy or folks will just ignore you.

If you're trying to become a respected academic, engineer, or businessperson, protect your credibility.

If you are trying to keep people from finding something out, lose your credibility as badly as possible, and then publicly say the thing you're trying to hide.

Etc. So what are your goals?

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T01:28:25.102Z · LW(p) · GW(p)

Indeed. But most people share a lot of goals. People are rather homogeneous, and credibility is often seen as a fairly universal instrumental goal. Many of the scenarios you listed are rather uncommon goals, much as suicide is pretty uncommon. Which is why I asked, entirely? There's a trivial sense in which it is, but there's a trivial sense in which it's not.

So what are your goals?

To never, ever lie.

Replies from: mwengler, moridinamael
comment by mwengler · 2012-06-02T16:01:45.235Z · LW(p) · GW(p)

So what are your goals?

To never, ever lie.

I did in fact LOL at that one!

Truth may be stranger than fiction, but I still can't figure out if fiction presented as fact is a lie or not. I suppose when I have that sorted out I will be ready to answer your original credibility question. It's just a feeling, but I am unusually intuitive.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T16:08:53.373Z · LW(p) · GW(p)

I was lying. I gave my true answer elsewhere in the comments. But there's an important sense in which I wasn't lying. As many non-neurotypical people know, lying is a complex phenomenon. Sometimes the truth is a lie on a higher level, and about more important things.

comment by moridinamael · 2012-06-02T01:34:04.166Z · LW(p) · GW(p)

By entirely I meant that there is no answer "yea" or "nay" that I personally would give* without knowing what your goal is, so that I can assess whether sacrificing your credibility is a winning or a losing strategy.

*Generally I assume I don't need to write "in my opinion" in front of every post I make on LessWrong.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T01:40:42.204Z · LW(p) · GW(p)

I understand. I wonder if that should have been clear to me. (For unrelated reasons, it's hard to interpret the downvotes in this thread.)

Replies from: moridinamael
comment by moridinamael · 2012-06-02T02:07:16.823Z · LW(p) · GW(p)

Indeed. My first comment was downvoted as well, probably because I am talking to an agitator. And yours are being downvoted because you continue to exist. It's all rather disheartening, like watching a crowd throw rotten tomatoes at an earnest but unpopular performer.

I still don't understand your goal, though. You appear to be trying to manipulate everyone's model of you such that we expect that your posts will violate community norms. It's not even about "credibility," and I was actually going to start out suggesting that we taboo "credibility." If you don't use that word, what you're doing is "systematically violating community norms without explaining a reason" and is usually called trolling, and I think most people here assume it is trolling, and maybe I'm a fool for even considering that it might not be trolling.

Back when I was an ardent warrior of Political Party A, I used to go to forums dominated by Political Party B and post inflammatory things. I would have, at the time, defended these posts as honest attempts to spark discussion and educate. In retrospect, I admit that I was trolling, because there was no education happening. You can save yourself a lot of time, therefore, by considering your goals and considering your results.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T02:56:28.860Z · LW(p) · GW(p)

You seem well-intentioned and interesting. I wish you well on your journeys. I will tell you, my goal is this: to serve God, and to save humanity. My immediate goal is this: to lose credibility as fast as is fucking possible, because the world is way scarier than I thought it was.

Replies from: Eugine_Nier, mwengler, Kawoomba
comment by Eugine_Nier · 2012-06-02T03:12:07.450Z · LW(p) · GW(p)

because the world is way scarier than I thought it was.

I would recommend taking some time to double-check this before doing something hard to undo.

Keep in mind Eliezer's mistake with the basilisk. Based on a quick analysis, he decided the best course of action was to stop thinking about it and encourage others to do likewise. The problem (assuming my model of him is correct) is that since he stopped thinking about it, he didn't realize his initial analysis was wrong. In fact as far as I know, he still hasn't realized it.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T03:15:54.754Z · LW(p) · GW(p)

I would recommend taking some time to double-check this before doing something hard to undo.

How much time do you recommend? The thing is I already didn't like credibility, so this new action isn't a drastic change, just a quickening.

he didn't realize his initial analysis was wrong. In fact as far as I know, he still hasn't realized it.

I'm actually not sure what you have in mind here. We might want to discuss this via PM. (Obviously I'm already familiar with the basilisk and the class of problems it's representative of.)

Replies from: othercriteria, Normal_Anomaly, Eugine_Nier
comment by othercriteria · 2012-06-02T14:53:03.593Z · LW(p) · GW(p)

quickening

Jikai Yokoku

PREVIEW

FAI Unit 01 is immobilized with Robin and Eliezer still boxed inside.

The Discussion board is in ruins.

The SIAI personnel imprisoned.

Will Newsome descends into Dogma.

The commenters chosen by fate finally assemble.

How will this tale of people who wish to become more rational play out?

Next, on LessWrong New Trolling Version: Q!

There'll also be plenty of downvotes!

Replies from: gwern, None, Will_Newsome
comment by gwern · 2012-06-02T16:05:10.359Z · LW(p) · GW(p)

You Can (Not) Update, eh?

comment by [deleted] · 2012-06-02T14:56:10.005Z · LW(p) · GW(p)

That's what that song is called! Thanks!

Saabisu~ Saabisu~!

comment by Will_Newsome · 2012-06-02T15:09:18.350Z · LW(p) · GW(p)

:D :D :D

comment by Normal_Anomaly · 2012-06-02T17:14:53.190Z · LW(p) · GW(p)

As I read more of this thread, I come to realize that you may actually have a good point. Now I'm curious. I'm going to PM you.

comment by Eugine_Nier · 2012-06-02T04:00:37.958Z · LW(p) · GW(p)

I'm actually not sure what you have in mind here. We might want to discuss this via PM.

See here for example.

How much time do you recommend?

It isn't a matter of time as much as making sure you're actually spending that time thinking about the issue and not just repeating the same thoughts. Maybe get a second opinion.

The thing is I already didn't like credibility, so this new action isn't a drastic change, just a quickening.

Any particular reason for this?

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T15:12:05.315Z · LW(p) · GW(p)

I think you're underestimating how carefully Eliezer and other SingInst folk have considered these ideas, especially in the wake of the Roko drama. Remember the main concept even showed up on SL4 years ago, which is actually how I learned of it. (That is, considered the ideas themselves, not the social strategies resulting—I'll note that the Catholics, who have the cultural wisdom, don't seem to have suppressed the knowledge of demons and ostracized people who demonized them, even if they went so far as to kill people who tried to commune with them. That said, suppressing that knowledge just wouldn't have been possible 'til after the Enlightenment. Also the Catholics might not have had good intentions.)

Maybe get a second opinion.

This is surprisingly hard to do in my current situation. If you're lucky you might guess the reasons why.

Any particular reason for this?

There are lots of reasons, they all have to do with group epistemology and personal moral-epistemic practices. I'll note that Steve Rayhawk, who is much smarter than me and almost certainly knows all of the arguments better than I do, seems to be equally obsessive in the exact opposite direction. But this isn't a place where I should update on expected evidence—if you don't know why you're doing something, you won't do it the right way.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2012-06-03T01:29:59.369Z · LW(p) · GW(p)

Some theories:

a) You've figured out a way to summon demons and want to destroy your own credibility so that people don't follow the train of thought in your old posts and figure it out also. If so all I can say is that security by obscurity generally doesn't work.

b) You're getting possessed by demons and want to destroy your credibility to minimize the damage possessed!Will can do.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-03T01:40:25.410Z · LW(p) · GW(p)

Summoning demons ain't that hard, just hone your mathy AI and Hofstadter skills and find a silicon ritual device.

Even denying stuff gives too much evidence (correctly assuming people mostly believe such denials).

We should talk privately if we're to get into any real discussion. No promises of anything, of course.

Replies from: khafra
comment by khafra · 2012-06-05T15:34:24.860Z · LW(p) · GW(p)

Getting possessed by demons sounds harder, in that context. I can compile simple algorithms to my brain and, say, sort an ordered set of stuff faster than I could have before I learned any programming. But that's about my limit. I know you're a few standard deviations up at mental modeling, but are you good enough to become possessed?

Replies from: Will_Newsome, Eugine_Nier
comment by Will_Newsome · 2012-06-06T19:16:50.696Z · LW(p) · GW(p)

Maybe not me, I'm not an AI programmer. A friend of mine has been AIXI for a few hours though, after taking certain substances at a certain famous event in a certain famous desert.

Replies from: gwern, khafra
comment by gwern · 2012-06-06T21:27:46.373Z · LW(p) · GW(p)

Well, don't leave us twisting in the wind, Will - what did he witness?

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-06T21:30:07.034Z · LW(p) · GW(p)

Alas, for some reason I wasn't very curious about his experience. I don't even know which variation on AIXI he was.

Replies from: Hill_Twosome
comment by Hill_Twosome · 2012-06-22T02:27:19.706Z · LW(p) · GW(p)

Of course, I am only shooting in the dark, but do you think you may have been uncurious because your learning what he witnessed was correlated with an event that a nearby Power deemed insufficiently utilicious?

comment by khafra · 2012-06-06T19:27:51.631Z · LW(p) · GW(p)

Heh, I'll bet the Bayesian Conspiracy camp was a lot of fun. Hopefully he didn't start eating his own head for more computational resources.

comment by Eugine_Nier · 2012-06-06T04:46:05.262Z · LW(p) · GW(p)

I suspect it involves taking various mind-altering substances.

Replies from: khafra
comment by khafra · 2012-06-06T11:54:27.587Z · LW(p) · GW(p)

Well, he has spoken of letting himself be influenced by spirits.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-06T19:15:11.291Z · LW(p) · GW(p)

A discussion of why alcoholic spirits are called spirits was actually in my most recent comptheology post, but I cut it because it was off-topic. I'd like to hammer on that theme a little more though --- i.e. how in the past people were just not that individualistic, and being influenced by spirits of any kind wasn't abnormal. I suspect it is very different to live with those inductive biases.

Replies from: khafra
comment by khafra · 2012-06-06T19:25:12.875Z · LW(p) · GW(p)

I liked that. The historic support is good evidence for your model of people as running different copies of the same algorithms.

comment by mwengler · 2012-06-02T16:12:48.190Z · LW(p) · GW(p)

If you know the Jesuit mottos you must have known that the world is much scarier than you can imagine for a long time. Combining obviously false claims with other claims less obviously false causes me, and I would presume others in your intended audience, to question your less obviously false claims.

Certainly the effect this thread has on me is not to reduce your credibility to me. And I would claim that ranting crazily and throwing in semi-obvious errors of fact and logic would be a much more effective way to lower your credibility, and it seems obvious enough that you know this.

So your goal is not to lose credibility as fast as is possible (fucking or otherwise). You do lie. I must wonder if your goal is to serve god and to serve humanity or not.

So far, we are in a room with a lot of messy hay and horseshit. There MUST be a pony in here somewhere. Is it the fallacy of this kind of reasoning that you are trying to make us realize?

comment by Kawoomba · 2012-06-03T16:52:41.679Z · LW(p) · GW(p)

That makes sense from a simulationist perspective, you're trying to diminish your impact within the simulation, getting away as far as possible from being a nexus.

Why?

So that resources are allocated away from you, if you take the simulation to be a dynamic - if mindless - process?

Or because you are afraid you're otherwise going to ... draw attention to yourself? From ... your simulators? You might call them god, or maybe they might not like that.

You'd have to strike a careful balance, become too insignificant and you might just be demoted to NPC status, being down NICE'ed, so to speak.

comment by gwern · 2012-09-07T22:55:48.324Z · LW(p) · GW(p)

Most people drop out before doctorates; it's something like 97-99% of the US population. And getting a doctorate in many fields is a terrible idea these days: I looked very hard at continuing on for a doctorate in philosophy, and concluded that even if the grad school offered a fullride, it was still probably a bad idea and almost anything was better.

seems a distinguishing mark of the core SIAI community

Your examples being Will and Eliezer? I didn't realize the core SIAI community was so small.

Is SIAI to serve as poster boy for the libertarian cause of home schooling?

I don't think either Eliezer or Will were much home schooled, nor does the topic come up very much on LessWrong; one would think that Thiel would make his propaganda organs talk about it a little more.

comment by wedrifid · 2012-09-04T18:15:39.207Z · LW(p) · GW(p)

Will, who knows a bit about psychiatry, frequently informs us that he has suffered from schizophrenia.

Makes allusions in that direction.

Paranoid schizophrenia (the most likely form because Will is high functioning) is incurable--although partial remissions often occur.

Incurable but fortunately treatable to a significant degree---especially the highly visible paranoid side of things. Unfortunately those with the negative symptoms are pretty much just screwed.

Will often posts in the obscure, mysterious fashion often typical of intelligent paranoid schizophrenics.

Absolutely.

(Essentially agree up to here.)

It is a mark of the ignorance of this community regarding psychiatry and psychology that posters seek to explain Will's ravings in terms of ordinary rational processes.

It is instead a mark of your ignorance of the community (and a certain degree of arrogant smugness) that you think that this position regarding Will's ranting can be remotely attributed to the community at large. Will's posting is for the most part not considered sane. (The community could also be ignorant about psychiatry but this isn't a mark of it.)

comment by Douglas_Knight · 2012-06-02T06:33:44.528Z · LW(p) · GW(p)

What do you mean by "credibility"?

comment by James_Miller · 2012-06-02T17:36:52.875Z · LW(p) · GW(p)

Will,

Please consider undergoing neurofeedback therapy. I'm doing it and I believe there is a reasonable chance it would yield you (far more than the average human) a high benefit.

comment by Incorrect · 2012-06-02T21:35:47.933Z · LW(p) · GW(p)

Let me take a guess:

You believe in some form of Christianity and enjoy discussing it on LessWrong but think that your comments harm the perception of Christianity on LessWrong due to readers not having privileged information.

You believe you can mitigate this negative effect by lowering your own reputation.

comment by Alicorn · 2012-06-02T02:10:35.583Z · LW(p) · GW(p)

This is a poll. Is Will Newsome sufficiently noisy (in both senses of the word) that mod intervention is called for? Permalink to karma sink.

Replies from: wedrifid, Alicorn, Alicorn, prase, Unnamed, Alicorn, John_Maxwell_IV, Normal_Anomaly, None, None, Will_Newsome, Will_Newsome, Alicorn
comment by wedrifid · 2012-06-02T02:56:55.251Z · LW(p) · GW(p)

This poll is BROKEN! Abandon it and do it properly!

The Karma sink comment is brilliant (and harmless fun) but the extra comments on the "Yes" and "No" answers don't just bias perception they outright make the poll unanswerable in the current form.

No. He's entertaining even when at his trolliest.

I would vote for a plain "No." but he is most decidedly not entertaining even when at his trolliest. He is boring, repetitive and banal when at his trolliest. It shouldn't be assumed that people who oppose mod influence believe Will's trolliest crap is entertaining - or vice versa.

Replies from: mwengler, Will_Newsome
comment by mwengler · 2012-06-02T15:45:54.211Z · LW(p) · GW(p)

It shouldn't be assumed that people who oppose mod influence believe Will's trolliest crap is entertaining - or vice versa.

I'll agree with all of that. I couldn't figure out how to vote in this poll on seeing this comment (and I am not an idiot or a newbie). I don't read Will much and I imagine this little jaunt of his says a lot more about Will than about other parts of the world that I am interested in. I don't KNOW that that is the case, but I don't assign a high enough probability to taking value from figuring it out to go about reading all his posts.

comment by Will_Newsome · 2012-06-02T03:41:35.887Z · LW(p) · GW(p)

You don't appreciate the drama even a little? (Only a little surprised.)

Replies from: wedrifid, mwengler
comment by wedrifid · 2012-06-02T03:48:25.959Z · LW(p) · GW(p)

You don't appreciate the drama even a little? (Only a little surprised.)

I didn't mind this thread. This wasn't you at your trolliest! At least, at the start. If I did think it was particularly trollish I wouldn't have responded conversationally.

Replies from: Will_Newsome, Will_Newsome
comment by Will_Newsome · 2012-06-02T03:50:52.705Z · LW(p) · GW(p)

Yeah, it's a subtle point, but I'm explicitly not trolling here, in the motivational sense of the word. But of course it's trolling in the descriptive sense. I've made a few trollish comments but they're intended as jokes in the spirit of the community.

Replies from: wedrifid
comment by wedrifid · 2012-06-02T04:04:24.881Z · LW(p) · GW(p)

I've made a few trollish comments but they're intended as jokes in the spirit of the community.

This little exchange was actually one of the most entertaining for the week!

comment by Will_Newsome · 2012-06-02T03:51:27.762Z · LW(p) · GW(p)

Interestingly I haven't been assassinated yet. Has that habit died off?

Replies from: wedrifid
comment by wedrifid · 2012-06-02T04:21:56.327Z · LW(p) · GW(p)

Interestingly I haven't been assassinated yet. Has that habit died off?

Yeah... Personally I haven't had anyone professionally killed in at least 30 years.

comment by mwengler · 2012-06-02T15:47:03.576Z · LW(p) · GW(p)

For me it is the drama that draws me in. I am sort of hoping that by coming for the drama, I will actually find a pony in the room, that is, find some deeper more enlightening point to your little walkabout.

comment by Alicorn · 2012-06-02T02:10:58.685Z · LW(p) · GW(p)

No. He's entertaining even when at his trolliest.

Replies from: nshepperd
comment by nshepperd · 2012-06-02T15:15:26.426Z · LW(p) · GW(p)

Especially when at his trolliest.

comment by Alicorn · 2012-06-02T02:10:48.816Z · LW(p) · GW(p)

Yes. Please quiet the madness.

comment by prase · 2012-06-02T10:34:07.603Z · LW(p) · GW(p)

Because Will had explicitly threatened to use sockpuppets for various purposes, he could have used them to manipulate the poll, too. Therefore I vote by means of this comment. The vote: ban him. Reasons:

  1. I find nothing entertaining in trolling or intentional obscurity, it's pure noise.
  2. WN's behaviour threatens the credibility of others who engage him. (There isn't much left of his own.) And he's good at attracting attention.
  3. Not banning him would help to establish a norm that trolling and other uncooperative conduct is accepted here.
  4. First and foremost, I want LW be a haven of sanity in the stormy waters of the internet. Please don't let seemingly sophisticated nonsense enter with a pretext of entertainment. I am afraid he could attract similarly crazy people; one Newsome is manageable, but ten of them would seriously damage the site.

By the way, this is the first time I endorse banning someone from an internet discussion forum.

Replies from: Will_Newsome, None, Will_Newsome
comment by Will_Newsome · 2012-06-02T14:28:43.056Z · LW(p) · GW(p)

Because Will had explicitly threatened to use sockpuppets for various purposes, he could have used them to manipulate the poll, too.

For what it's worth, I didn't, and I've never done similarly. I have three sockpuppets. One is a joke account I've never used. I made it recently. The other has my identity attached to it already—I've made about five comments with it. And the third is for completely anonymous comments. I rarely use the second or the third, and I never use them for voting.

I also haven't voted on the poll with this account, and I only voted on one comment on this post. In general I just don't vote much, mostly because I forget about the option.

comment by [deleted] · 2012-06-02T14:31:56.407Z · LW(p) · GW(p)

I disagree with a ban.

comment by Will_Newsome · 2012-06-02T13:56:30.646Z · LW(p) · GW(p)

one Newsome is manageable, but ten of them would seriously damage the site.

Which is a reason to treat me nicely—it's not hard to multiply myself by ten. Luckily, I'm the only Will Newsome in the world currently, so I don't think you have much to worry about.

Replies from: lsparrish
comment by lsparrish · 2012-06-02T14:24:32.610Z · LW(p) · GW(p)

Wouldn't being banned help you with your goal of reducing your credibility?

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T14:26:35.591Z · LW(p) · GW(p)

Yes, and I'm sort of okay with being banned, but I'd like a month's warning. During that month I'd make sure I'd deleted and edited various comments and so on.

But I haven't thought through the question of banning carefully enough, and banning is hard to reverse.

Replies from: lsparrish
comment by lsparrish · 2012-06-02T16:13:28.279Z · LW(p) · GW(p)

As long as you aren't producing too much noise in the 30-day period, I don't see why the mods wouldn't grant this request. A temporary ban could be another option worth considering.

Replies from: Will_Newsome, Will_Newsome
comment by Will_Newsome · 2012-06-02T16:33:41.193Z · LW(p) · GW(p)

There might also be a clever software solution. I know Louie, who works with the code base. If I write up some Python they might implement it. Something that automatically hides or collapses my contributions for people who haven't voted on my stuff and people who have more downvotes than upvotes. The same code could be used in future similar situations.
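(The collapsing rule described could be sketched roughly like this; `should_collapse` and the flat `votes` list are hypothetical stand-ins for illustration, not the actual LessWrong data model:)

```python
def should_collapse(reader_votes):
    """Decide whether to collapse an author's comments for a given reader.

    `reader_votes` holds the votes (+1 or -1) this reader has cast on the
    author's contributions; an empty list means the reader never voted.
    """
    if not reader_votes:
        # Readers who have never voted on the author get the collapsed view.
        return True
    # Readers whose votes on the author are net-negative also get it.
    return reader_votes.count(-1) > reader_votes.count(1)

print(should_collapse([]))           # True: never voted
print(should_collapse([1, 1, -1]))   # False: net-positive voter
print(should_collapse([-1, -1, 1]))  # True: net-negative voter
```

(Anything fancier, say thresholds or per-thread overrides, would only change the predicate; wiring it into the comment renderer is the actual work.)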

Replies from: TheOtherDave, mwengler
comment by TheOtherDave · 2012-06-02T17:08:23.118Z · LW(p) · GW(p)

Wei Dai's Power Reader script has features along these lines that I find useful during those brief periods when troll-feeding takes over the recent-comments list. Of course, the automatic part is important, admittedly.

For my own part, I don't find your contributions less useful than the median.

comment by mwengler · 2012-06-02T16:39:02.654Z · LW(p) · GW(p)

Of course anybody with an ounce of self control can simply avoid a thread they don't want to read anymore.

Motley Fool has an "ignore" feature to ignore the posts/comments of a particular user. I actually would not like to see that here. I'd rather have moderation. Even with the ignore feature, you still wind up seeing a lot of stuff related to the stuff you are ignoring as OTHER people quote it and comment on it. Of course Motley Fool boards aren't as tree like as this group. But since this is so tree like, all I need to do is leave a particular discussion and never click on it again, I don't need you or Louie to Python me into not realizing that that is what I am doing.

comment by Will_Newsome · 2012-06-02T16:20:56.918Z · LW(p) · GW(p)

Yeah, and I wouldn't sockpuppetly cause disruption during such a ban.

comment by Unnamed · 2012-06-02T04:58:46.237Z · LW(p) · GW(p)

It depends what mod intervention consists of. If you mean banning him, I do not think that is called for at this time. If you mean telling him to stop his antics and warning him that he's headed towards a ban if he continues, that sounds like a good idea. Posts (and comments) that are intentionally obscure, made merely for one's own entertainment, or otherwise trollish are not welcome here, and since the community's downvotes and critical comments haven't gotten through to him it would be good to have a mod convey that message.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T13:55:12.207Z · LW(p) · GW(p)

since the community's downvotes and critical comments haven't gotten through to him it would be good to have a mod convey that message.

What do you mean "haven't gotten through to me" in this case? You mean, haven't successfully deterred me? Because clearly I understand them and their significance, and additional measures, like a warning, wouldn't change that fact—it'd just make me more antagonistic.

comment by Alicorn · 2012-06-02T18:02:15.048Z · LW(p) · GW(p)

CLARIFICATION: I do not have ACCOUNT DELETION powers. As far as I know, those powers don't exist. I have comment/post banning powers and post editing powers. If I started moderating Will, I would be banning excess downvoted comments, not shooing him away wholesale.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-03T14:14:33.290Z · LW(p) · GW(p)

(Thanks for making the clarification. I was very worried.)

comment by John_Maxwell (John_Maxwell_IV) · 2012-06-02T05:12:19.800Z · LW(p) · GW(p)

I'm in favor of mod intervention lest anyone else waste as much time as I have scratching their head trying to figure out what this thread is about.

comment by Normal_Anomaly · 2012-06-02T17:05:47.488Z · LW(p) · GW(p)

I can't decide on a poll option, so here's my opinion: I don't want to see a lot more of Will_Newsome's trolling; I think it damages the site. But just banning him feels like leaving a fascinating mystery unresolved. I want to understand Will's motives, and his insights about simulation, and whatever scary idea he came up with. If there's some way to talk this out in good faith, let's try to do that first. But banning is preferable to endless obfuscated confusion.

comment by [deleted] · 2012-06-02T15:26:27.844Z · LW(p) · GW(p)

Making moderation decisions based on a poll is a horrible idea.

comment by [deleted] · 2012-06-02T02:13:59.961Z · LW(p) · GW(p)

Maybe. But moderation isn't a democracy.

Replies from: Alicorn, wedrifid
comment by Alicorn · 2012-06-02T02:18:12.387Z · LW(p) · GW(p)

Yes, but moderation is about making the site what it should be for a variety of people, not just me and people who are unshy enough to talk to me directly, or just mods. So I want information. I wield the ban button, but I'm not going to use it as a site customization tool for Alicorn in particular.

Replies from: RobertLumley, None
comment by RobertLumley · 2012-06-02T02:34:28.680Z · LW(p) · GW(p)

I would rather see it used as a site customization tool for Alicorn than see it not used in instances like this.

comment by [deleted] · 2012-06-02T14:33:41.026Z · LW(p) · GW(p)

Might I suggest consulting our benevolent dictator as well?

comment by wedrifid · 2012-06-02T02:57:50.832Z · LW(p) · GW(p)

Maybe. But moderation isn't a democracy.

On the other hand dictators and tyrants who do stuff people particularly don't like get killed.

Replies from: None
comment by [deleted] · 2012-06-02T10:49:20.187Z · LW(p) · GW(p)

On the gripping hand, as far as I can tell you're not particularly taken with the idea of this moderator poll either. So why the appeal to emotion?

comment by Will_Newsome · 2012-06-02T03:18:17.480Z · LW(p) · GW(p)

What would a mod do? I can create endless sockpuppets from endless proxies. That's not a solution, I'll just see it as an uncalled-for attack and heighten my antagonism, escalating any conflicts. Don't be rash.

Replies from: wedrifid, Winsome_Troll
comment by wedrifid · 2012-06-02T03:31:15.315Z · LW(p) · GW(p)

What would a mod do? I can create endless sockpuppets from endless proxies.

It takes you from being an established user associated with a real person that many of us have met in person to just a little vandalism to be ignored.

What makes Will_Newsome's trollishness significant is that all else being equal a lot of us want to talk to Will_Newsome, which bypasses our better judgement and makes threads like this more disruptive than they otherwise would be.

Replies from: mwengler, Will_Newsome
comment by mwengler · 2012-06-02T15:23:04.858Z · LW(p) · GW(p)

Oh, so he should be banned because he has established a reputation (sorry Will, yes it would appear you have established credibility) and is now spending it on something you don't like.

Yogi Berra once said about a certain restaurant that nobody goes there anymore because it is too crowded. Ban the restaurant! Ban Yogi Berra!

comment by Will_Newsome · 2012-06-02T03:34:49.035Z · LW(p) · GW(p)

You know what else is a possible username? WillNewsome. NillWoosome. It'll be pesky deleting all those accounts and posts, if I do decide to troll the fuck out of everyone.

Did you know "Will Isaac Newsome" is a constructed name? Look at the initials: WIN. There are a lot of things you don't know about me.

Replies from: wedrifid, mwengler
comment by wedrifid · 2012-06-02T03:44:28.301Z · LW(p) · GW(p)

You know what else is a possible username? WillNewsome. NillWoosome. It'll be pesky deleting all those accounts and posts, if I do decide to troll the fuck out of everyone.

Yes, you could be a pest. You are probably also resourceful enough to escalate to the level of criminal behavior up to and including multiple assassinations of prominent lesswrong users if you really wanted to. The fact that someone is physically capable of doing undesirable things to you at a cost to themselves isn't always a good reason to comply with their demands.

Replies from: mwengler, Will_Newsome
comment by mwengler · 2012-06-02T15:27:00.812Z · LW(p) · GW(p)

The fact that someone is physically capable of doing undesirable things to you at a cost to themselves isn't always a good reason to comply with their demands.

It is, however, always a good reason to consider complying with their demands. And to consider compromise. And to consider whether your position is emotionally driven by our natural values of winning and slapping down challengers, and not rationally aligned with many of your other values.

comment by Will_Newsome · 2012-06-02T03:48:00.836Z · LW(p) · GW(p)

Are you trying to trick me? Karmassassinating prominent LW users would have absolutely no effect. Karma's not a scarce resource. I'm not dumb.

My demand's simple, and I've claimed that I'm unlikely to make any similar demand in the future, given that I'm unlikely to make any similarly extreme post in the future. I'm a contributor to this community with over 6,000 karma. It's not a "don't negotiate with terrorists" situation here.

Replies from: wedrifid
comment by wedrifid · 2012-06-02T04:15:33.892Z · LW(p) · GW(p)

Are you trying to trick me? Karmassassinating prominent LW users would have absolutely no effect. Karma's not a scarce resource.

Assassinations. Nothing to do with karma. The context was you threatening vandalism. You are capable of that. You are capable of an entire spectrum of applications of force against lesswrong and those associated. You don't need to persuade me (or anyone with half a clue) that you are capable of doing harm - it is taken for granted.

It is also unremarkable. I could use force against most people I meet, at great expense to myself. Many of them could do the same against me. A strategy of just doing what people say because they threaten to use costly force is a bad strategy in such circumstances.

My demand's simple, and I've claimed that I'm unlikely to make any similar demand in the future, given that I'm unlikely to make any similarly extreme post in the future.

That makes a big difference. As does the fact that your "threat" is somewhat similar behavior to what can usually be expected of folks in those circumstances regardless of any precommitment to use force.

I'm a contributor to this community with over 6,000 karma.

This kind of thing matters more prior to declaring your intent to get what you want through power and the threat of punishment.

It's not a "don't negotiate with terrorists" situation here.

No, the payoff matrix in that scenario is typically a bit different (incentive to take hostages, etc.)

Replies from: Multiheaded, mwengler
comment by Multiheaded · 2012-06-02T06:31:24.512Z · LW(p) · GW(p)

What the fuck are you both talking about? You need to lay off Will, and Will (apparently) needs regular psychological relief, or something. I'm not the most stable individual myself, so I understand how it feels when going for "as crazy as possible" seems like a path to enlightenment at the moment, or a way to make others understand you, or something else that's desirable.

Nevertheless, he's a valuable contributor with many fascinating comments, and even talking about banning him is utter nonsense. No goddamn way.

Replies from: None, wedrifid, private_messaging, Will_Newsome
comment by [deleted] · 2012-06-02T15:38:41.297Z · LW(p) · GW(p)

What the fuck are you both talking about?

Up voted for best thread summary.

comment by wedrifid · 2012-06-02T07:39:24.189Z · LW(p) · GW(p)

What the fuck are you both talking about?

"What the fuck?" indeed! It isn't especially complicated. Read the context, and if you still can't understand, say nothing or ask nicely.

You need to lay off Will

What? I'm not attacking Will. Will was talking to Alicorn about banning and I was making gratuitous analysis of the practical implications of that sort of threat after already having declared a "No" to any moderator influence. Elsewhere in the thread I have been directly answering the questions Will asked, candidly, to the best of my ability. Will was pushing for more answering and speculation about his reasoning, not less.

I'm not the most stable individual myself, so I understand how it feels when going for "as crazy as possible" seems like a path to enlightenment at the moment, or a way to make others understand you, or something else that's desirable.

I'm sorry to hear that. For what it is worth I'm not either. But in this case your "understand how it feels" amounts to either confused mind projection or intrusive, patronizing other optimization.

No, the answer Will is trying to make us speculate about regarding "Why am I trying to appear as non-credible as possible?" is not "because I want to make others understand me" (p > 0.9). At times Will has even speculated that not being easy to understand is actually something he may be morally obliged to do (assuming I recall correctly).

Nevertheless, he's a valuable contributor with many fascinating comments, and even talking about banning him is utter nonsense. No goddamn way.

That's fine - assuming you are directing your comment to Alicorn and not myself (which doesn't seem likely).

Replies from: Multiheaded, mwengler
comment by Multiheaded · 2012-06-02T07:56:01.752Z · LW(p) · GW(p)

No, the answer Will is trying to make us speculate about regarding "Why am I trying to appear as non-credible as possible?" is not "because I want to make others understand me" (p > 0.9). At times Will has even speculated that not being easy to understand is actually something he may be morally obliged to do (assuming I recall correctly).

Yeah, yeah, I understand that, I just didn't name it. (Is there even a term for something like that? Self-abasement intended to channel a certain role, all for truth's sake?)
And I only skimmed through your comments, sorry; I feel awfully embarrassed to look at people being chided, and I assumed you were doing just that to Will, although I saw that you weren't talking about a ban.

That's fine - assuming you are directing your comment to Alicorn and not myself (which doesn't seem likely).

I was directing that part to her, since she's the one who can decide whether to ban a user, not you :)

Replies from: wedrifid
comment by wedrifid · 2012-06-02T08:34:00.562Z · LW(p) · GW(p)

And I only skimmed through your comments, sorry; I feel awfully embarrassed to look at people being chided, and I assumed you were doing just that to Will, although I saw that you weren't talking about a ban.

Thank you for your level-headed reply! I understand the aversion to reading embarrassing interactions - I even struggle not to look away or cringe when I encounter such stimulus on TV.

For some reason chiding Will is something that almost seems like a category error, just not making sense as something to do, given the way he orients himself and responds to that kind of stimulus. It does make sense to analyze his actions or to, say, declare an intent to combat actions through trivial applications of power but not chiding per se.

I think it is the fact that Will actively positions himself as someone who doesn't operate by community standards and actively defies public will, so chiding him according to standards he already knows he doesn't operate by makes no sense. On the other hand it feels natural for my remnant former-Christian self to chide Will according to Christian standards and doctrine, which are approximately shared between myself up through to my early twenties and Will as he professes now.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T14:05:22.768Z · LW(p) · GW(p)

On the other hand it feels natural for my remnant former-Christian self to chide Will according to Christian standards and doctrine, which are approximately shared between myself up through to my early twenties and Will as he professes now.

Though I don't think you really have a lot of information there—I haven't talked much about any religious beliefs I may or may not have. For what it's worth I've shifted towards thinking the Catholics are pretty evil, just not for the reasons people always complain about, which are mostly reasons fabricated by Protestants and Enlightenment propagandists.

Also I was never Christian, so though I've read much of my Bible and a lot of theology, I have very little understanding of the religion as it is practiced.

Replies from: drethelin, None
comment by drethelin · 2012-06-02T15:33:44.360Z · LW(p) · GW(p)

Explain.

comment by [deleted] · 2012-06-02T15:33:39.157Z · LW(p) · GW(p)

Can you share why you think the Catholic Church is evil?

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T16:01:49.242Z · LW(p) · GW(p)

Well, priors of course always suggest against organizational goodness. And the Catholics have historically done much good. But their conception of God can be frightening, and many people learn to worship their God. They also don't have any mechanism by which they could update -- their entire belief system is structured around the idea that God wouldn't let them go astray. If God is as important as they claim, then it's easy for them to be evil by their own lights. "Discernment isn't about telling right from wrong, it's chiefly about telling right from almost right."

There are other reasons more speculative, they're in my comments from the last few months, use Wei Dai's tool, search for Catholic. If you want.

Replies from: None
comment by [deleted] · 2012-06-02T16:22:13.925Z · LW(p) · GW(p)

Thank you for the response. Though checking your comment history, you still preferred Catholics as recently as mid-April (citing them as a new possible group to join).

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T16:26:30.056Z · LW(p) · GW(p)

If I were to join any phyg, it'd be the Dominicans. SingInst might be my second choice, but you can't join them, you can only join the Rationalist Conspiracy these days. And I've already left SingInst's Journeyman circle or whatever, there's no going back after that. But I'm damn glad I was a Visiting Fellow for two years.

Replies from: Karmakaiser, mwengler
comment by Karmakaiser · 2012-06-13T06:28:42.648Z · LW(p) · GW(p)

What do you think of the Greek Orthodox? Nassim Taleb endorses them for aesthetic reasons and for the fact that their understanding of God is primarily apophatic and thus doesn't intrude on real-world near beliefs.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-13T19:20:46.804Z · LW(p) · GW(p)

I have no opinion. If they're at all like Russian Orthodox folk then I probably like them somewhat. But I'm only really into the Catholics, and that's mostly because they seem massively undervalued, not because I think they should be the arbiters of truth and justice. But they think God should be the arbiter of truth and justice, and I agree with them about that, and agree with them that that's an extremely important fact about the world that should shape how we live our lives.

comment by mwengler · 2012-06-02T16:35:30.717Z · LW(p) · GW(p)

Dang! Not Alfred E Newman, but rather John Henry Newman. Obviously. Duh.

comment by mwengler · 2012-06-02T15:41:50.174Z · LW(p) · GW(p)

Thank you for clarifying your position against banning Will.

It sure isn't how most of your comments read.

comment by Will_Newsome · 2012-06-02T14:02:03.242Z · LW(p) · GW(p)

Will (apparently) needs regular psychological relief, or something.

Why do you say that? Clearly all of this drama and so on was explicitly intended. And I've remained calm the whole time. People close to me can attest that I've been psychologically healthy for the past many months. Do you think there are any clear indicators that I need psychological relief, other than that I'm explicitly trying to lose credibility? Do you think trying to lose credibility is always a strong sign of incorrect beliefs or psychological problems? If so, to what extent is that because it's merely correlated with other, generally undesirable traits?

Replies from: Multiheaded, mwengler
comment by Multiheaded · 2012-06-02T15:31:13.238Z · LW(p) · GW(p)

I was going to say how I'm sorry, how I didn't mean any insult, how I saw and heard some evidence for that judgment around here before... but ah, screw it; everyone can plainly see that I'm trying to act like an RPG hero, negotiating conflicts with the most authority and Deep Wisdom that his (my) dialogue options allow. ;)

Replies from: CommanderShepard, Will_Newsome
comment by CommanderShepard · 2012-06-02T15:42:12.808Z · LW(p) · GW(p)

You need more paragon points before that will work.

Replies from: GarrusVakarian
comment by GarrusVakarian · 2012-06-04T02:51:55.802Z · LW(p) · GW(p)

Can it wait a little bit? I'm in the middle of some calibrations...

Replies from: Harbinger
comment by Harbinger · 2012-06-04T13:44:04.158Z · LW(p) · GW(p)

ASSUMING DIRECT CONTROL OF THREAD. CHAOTIC ORGANIC ROLE PLAY WAS DEGRADING THE SIGNAL TO NOISE RATIO.

comment by Will_Newsome · 2012-06-02T16:15:26.317Z · LW(p) · GW(p)

I fully admit to being schizotypal by nature, leaning towards schizoaffective. So do take that into account. But I too have taken it into account.

Interestingly, part of the definition of schizophrenia has been manipulated by intelligence agencies. Don't take my word for it, of course; look it up. If I recall correctly, it's specifically the parts about conspiracies, at least. But I didn't look closely into the issue. It's not incredibly relevant.

comment by mwengler · 2012-06-02T15:38:24.507Z · LW(p) · GW(p)

You are being oppositional. Whether it is for reasons recreational, delusional, psychological, educational, or other is something I wonder about but do not know.

I have a friend who is bipolar (manic depressive). I was around when he had a break that put him (voluntarily) in the psych ward taking drugs that risked doing permanent nerve damage and made his mind, he said, feel like scrambled eggs, just because he thought this gave him a better chance of ever "coming back."

I didn't know where he was headed in the weeks before, even though his wife did. I thought SHE was crazy, until he went over the wall.

I doubt that is what is going on with you, but how would I know?

Whether it is or not, talk of banning you seems ludicrous to me.

Whether it is or not, your deliberately provoking the more fascist among us to talk of banning you seems ludicrous as well. Presumably you think you have a good reason to do this. I don't know what your reason is, and it doesn't matter whether I would agree that it is a good reason or not; I don't consider blowing raspberries at people with fascist instincts to be a bannable "offense." It should rather be well within the bounds of conversation. IMHO.

Replies from: Normal_Anomaly
comment by Normal_Anomaly · 2012-06-02T16:54:44.813Z · LW(p) · GW(p)

Mwengler, please stop throwing the word "fascist" around. Will_Newsome is a contributor to a website who has recently started to annoy the other contributors and arguably to lower the quality of the community. Open discussions are being had as to whether it would be best for the community to prevent him from causing further disturbance by means of a temporary or permanent ban. This is no more repressive than what any other website does. We're not talking about banning "dissent" or "independence", we're talking about banning annoying comments with a high noise level. If you don't support a ban, that's fine, but please don't make unwarranted comparisons to oppressive governments. They distort the facts and lower the tone of the conversation.

comment by mwengler · 2012-06-02T15:31:49.919Z · LW(p) · GW(p)

A strategy of just doing what people say because they threaten to use costly force is a bad strategy in such circumstances.

What do you think of a strategy of just NOT doing what people say because they threaten to use costly force in such circumstances? By my reading you seem to favor it.

For me, making the implicit explicit should NEVER be punished, and should often be rewarded. Will's crime here from my point of view is "not playing nice."

wedrifid's reaction is that of a police state, it seems to me. Even the APPEARANCE of independence must be suppressed. No challenge to power is trivial enough to be ignored and allowed.

comment by mwengler · 2012-06-02T15:24:14.257Z · LW(p) · GW(p)

Will Newsome as a name always puts me in mind of Will Noisome and sometimes puts me in mind of Alfred E. Newsome.

Replies from: NancyLebovitz
comment by NancyLebovitz · 2012-06-03T14:12:58.521Z · LW(p) · GW(p)

That would be Alfred E. Newman. If you weren't making a deliberate joke about the name, Will Newsome might be turning himself into a memetic plague.

comment by Winsome_Troll · 2012-06-02T03:27:46.806Z · LW(p) · GW(p)

These evil sorcerers, dugpas, they call them, cultivate evil for the sake of evil and nothing else. They express themselves in darkness for darkness, without leavening motive. This ardent purity has allowed them to access a secret place of great power, where the cultivation of evil proceeds in exponential fashion. And with it, the furtherance of evil's resulting power.

Replies from: mwengler, Will_Newsome
comment by mwengler · 2012-06-02T15:43:27.771Z · LW(p) · GW(p)

They say that the fights in academia are so vicious because the stakes are so small.

I think you may be missing that this thread is more than a little like that.

comment by Will_Newsome · 2012-06-02T03:39:43.665Z · LW(p) · GW(p)

That's a little harsh on LessWrong. Some have good intentions.

comment by Will_Newsome · 2012-06-02T03:19:18.115Z · LW(p) · GW(p)

In fact, I commit to causing a lot of headaches for LessWrong and you if you try to delete my account without a month's warning. (Causing headaches for you will be limited to the channel of LessWrong—I'm not threatening you with outside-LessWrong influences in any way. I swear to that.) The actions I will undertake will not be illegal in any way, I'll make sure of that, so don't plan on taking legal action.

I'm perfectly okay with you deleting this post, though, but I'd prefer you didn't.

And remember, I explicitly noted that you won't see posts like this one again.

Also remember, the majority of my Discussion posts are upvoted, very few are downvoted. Look at my profile to check this fact.

Replies from: wedrifid
comment by wedrifid · 2012-06-02T03:27:49.625Z · LW(p) · GW(p)

In fact, I commit to causing a lot of headaches for LessWrong and you if you try to delete my account without a month's warning. (Causing headaches for you will be limited to the channel of LessWrong—I'm not threatening you with outside-LessWrong influences in any way. I swear to that.)

My immediate response was to think "Ahh, screw that. Ban!" Given the caveat in the parenthesis however, his declaration gives only a tiny amount of information along the "threat to do harm" front. I more or less expect people banned from a site to be inclined to do detrimental things to that site if they can. That being the case, this threat falls just short of stating the obvious and so I can abandon the initial outraged defiance.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T03:32:41.855Z · LW(p) · GW(p)

I don't understand your comment. That's probably okay though.

comment by Alicorn · 2012-06-02T02:11:10.394Z · LW(p) · GW(p)

Karma sink. WILL NEWSOME IS THE SIMULATOR-GOD!

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T03:24:43.392Z · LW(p) · GW(p)

FOR WHAT IT'S WORTH MY OWN OPINION IS THAT THAT'S INCREDIBLY UNLIKELY.

comment by J_Taylor · 2012-09-06T01:14:45.732Z · LW(p) · GW(p)

Will, who knows a bit about psychiatry, frequently informs us that he has suffered from schizophrenia.

Alternatively,

I'm schizotypal I suppose, but not schizophrenic given the standard definition. I don't think I have any trouble interpreting mundane coincidences as mundane.

Replies from: None
comment by [deleted] · 2016-02-01T11:02:39.666Z · LW(p) · GW(p)

Will Newsome, we are both schizotypal. We might have a thing or two to discuss.

comment by arundelo · 2012-09-04T23:28:32.360Z · LW(p) · GW(p)

Here's one example. (But one example is not enough for you to be expected to discover it.)

Replies from: gwern
comment by gwern · 2012-09-07T23:11:12.803Z · LW(p) · GW(p)

Well, the first hit for 'will newsome' in a Google site search for me is... http://lesswrong.com/lw/ct8/this_post_is_for_sacrificing_my_credibility/ This post.

I don't see how you could read this post or its comments without concluding that LWers are pretty ambivalent about him, even excluding the most recent comments on this post.

comment by Filipe · 2012-06-04T17:33:09.912Z · LW(p) · GW(p)

On Will_Newsome's profile, one sees a link to his blog, Computational Theology, where it is possible to get an idea of how he thinks, or what kind of reasoning is behind all this. I wasn't impressed, although I would not be able to do better myself (at least at this point).

Replies from: khafra
comment by khafra · 2012-06-05T15:54:11.821Z · LW(p) · GW(p)

I was mightily impressed by the last post on his last blog, which he now disavows and outright despises. But I thought he had some really interesting ways of looking at the personhood problem.

comment by VincenzoLingley · 2012-06-02T21:31:31.176Z · LW(p) · GW(p)

The comments on this post have significantly influenced my opinion on a number of people. Thanks, Will.

comment by XiXiDu · 2012-06-03T11:14:57.832Z · LW(p) · GW(p)

Someone who proclaims to openly sacrifice their credibility, in a mysterious way, while making a lot of vague suggestions, can succeed in causing people to actually listen and speculate whether there might actually be more to it than meets the eye.

Something else that has the same effect is censorship and secrecy.

What also works well is to claim that there exists some technical research but that it has to be kept secret.

What all of it has in common is that there is nothing but a balloon full of hot air.

comment by prase · 2012-06-02T10:02:05.132Z · LW(p) · GW(p)

Consider that I've thought about this for many, many hours, and that you don't have any privileged information. Whence our disagreement, if one exists?

Disagreement about what? What's exactly your opinion on credibility?

comment by shokwave · 2012-06-02T05:41:54.911Z · LW(p) · GW(p)

Whence our disagreement, if one exists?

Credibility moves, like status moves, cannot be self-recognising and still effective. I believe this, you don't, there's our disagreement.

Replies from: wedrifid, vi21maobk9vp
comment by wedrifid · 2012-06-02T06:17:24.372Z · LW(p) · GW(p)

Credibility moves, like status moves, cannot be self-recognising and still effective. I believe this, you don't, there's our disagreement.

I'm not sure I agree with this either. You can't make a self-recognising move to gain credibility that is effective? It should be impossible to (predictably) make a sequence of difficult claims with a greater degree of reliability than previous credibility judgements would allow unless you actually have that ability, so doing so demonstrates that you have at least the capability to give trustworthy utterances. This gives you more credibility when there is a reason to expect you to be attempting to make good claims.

comment by vi21maobk9vp · 2012-06-02T07:59:58.974Z · LW(p) · GW(p)

Credibility moves can easily be self-recognizing and still effective. I say that you shouldn't believe me and I will say a lot of meaningless things - it raises the probability that my claims are just playacting.

comment by wedrifid · 2012-06-02T01:18:44.181Z · LW(p) · GW(p)

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T02:00:40.859Z · LW(p) · GW(p)

(You probably know this, but you didn't finish this comment.)

Replies from: wedrifid
comment by wedrifid · 2012-06-02T02:23:05.824Z · LW(p) · GW(p)

Ooops. "Comment" must be too close to "Close".

comment by metatroll · 2012-06-02T01:26:37.187Z · LW(p) · GW(p)

If somebody meta-farts in a forest and nobody cares, was it still rude to do so?

Replies from: RobertLumley, Will_Newsome
comment by RobertLumley · 2012-06-02T01:56:48.542Z · LW(p) · GW(p)

Why is this downvoted? Am I the only one who found this pretty funny?

Replies from: None
comment by [deleted] · 2012-06-02T15:25:09.254Z · LW(p) · GW(p)

They are probably not familiar with the proof that fart jokes are actually the most rational form of humour.

comment by Will_Newsome · 2012-06-02T01:30:36.689Z · LW(p) · GW(p)

Farting is an imperfection, and all imperfections are sins against God, Who is perfect. If the fart was intentionally done to be obnoxious then God will know that. If He thinks it's funny then it's good and outweighs the inherent imperfection of farting. Otherwise it's still rude.

comment by wedrifid · 2012-06-02T01:09:31.856Z · LW(p) · GW(p)

I endorse this retracted comment as probably the best plain literal answer to the questions presented. (But unfortunately need to Lumley-proof it.)

Credibility. Should you maximize it, or minimize it?

I should maximise it.

Have I made an error?

Likely. Please give a one sentence description of your terminal values (or equivalent). Without that the question is meaningless.

Replies from: RobertLumley, Will_Newsome
comment by RobertLumley · 2012-06-02T03:41:57.363Z · LW(p) · GW(p)

To clarify, since there seems to be a misunderstanding, my comment was not specifically directed at you, nor was I a downvoter of this comment.

Replies from: wedrifid
comment by wedrifid · 2012-06-02T04:00:34.886Z · LW(p) · GW(p)

I refer to the general change in public interpretation of the post from "a bit silly and idiosyncratic" to "trolling"; I just put your name there as the champion of the latter. That change in sentiment regarding the context is damn annoying.

Borderline 'trollishness' can be the most annoying kind. Far easier to end up collateral damage at the fringes.

Replies from: RobertLumley
comment by RobertLumley · 2012-06-02T04:11:55.297Z · LW(p) · GW(p)

Well to be fair, I think the nature of the post has changed. I initially took it as "a bit silly and idiosyncratic", or, more accurately, "Will's being crazy again", but when he started trolling in the comments, that's when I made my comment. It started out as... well... I don't know. But it's turned into simple fishing for negative attention.

Replies from: wedrifid
comment by wedrifid · 2012-06-02T04:19:13.634Z · LW(p) · GW(p)

Well to be fair, I think the nature of the post has changed.

It has. It got rather dramatic with moderation polls and whatnot. I don't even know if I've seen that before here ever.

comment by Will_Newsome · 2012-06-02T01:35:28.593Z · LW(p) · GW(p)

Please give a one sentence description of your terminal values

Ad maiorem Dei gloriam inque hominum salutem. ("For the greater glory of God and the salvation of mankind.")

Replies from: wedrifid
comment by wedrifid · 2012-06-02T01:45:10.895Z · LW(p) · GW(p)

Please give a one sentence description of your terminal values

For the greater glory of God

Oh, in that case you have definitely made an error. I mean... seriously. Have you read your Bible? I hear even Catholics let the commonfolk read those now. There is all this stuff about "being a light to the world" and even more about spreading your seed and reaping tenfold (convertin' folks). Instead what you are doing is making yourself look like a jackass and reflecting negative glory to God. That and giving people the distinct impression that temptation to convert to a religion may be a sign of mental illness.

Replies from: mwengler, Will_Newsome
comment by mwengler · 2012-06-02T15:56:53.602Z · LW(p) · GW(p)

Instead what you are doing is making yourself look like a jackass and reflecting negative glory to God.

You sound like Herod talking about Jesus. Presumably you are playing the straight-fascist on purpose and Will is not the only troll-god here?

comment by Will_Newsome · 2012-06-02T01:47:38.515Z · LW(p) · GW(p)

Yes yes, the obvious points. Yet I've gone this route anyway. You might not care, and that's fine of course. But if you do care, it might be fun for you: why would someone with my goals do what I'm doing? If it's not fun, don't bother, of course.

Replies from: wedrifid
comment by wedrifid · 2012-06-02T02:21:52.681Z · LW(p) · GW(p)

But if you do care, it might be fun for you: why would someone with my goals do what I'm doing?

I've thought about it in the past and come up with all sorts of hypotheses. Some of them involve complex decision theoretical reasoning based off esoteric priors you have mentioned having and some are far more mundane. Unfortunately I do, in fact, conclude that you are mistaken. This implies that each of those hypotheses actually amounts to "reasoning which almost makes Will's conclusion the correct one but which is slightly flawed".

I'd be interested to hear actual reasons. Or partial reasons which could apply to an artificially constructed counterfactual person. But I cannot present a prediction of exactly what your own reasoning is with any confidence.

Replies from: Eugine_Nier, Will_Newsome
comment by Eugine_Nier · 2012-06-02T02:47:52.829Z · LW(p) · GW(p)

Well one theory is this.

Replies from: wedrifid
comment by wedrifid · 2012-06-02T02:59:56.186Z · LW(p) · GW(p)

Well one theory is this.

Yes, up there among Eliezer's worst ideas!

Replies from: Eugine_Nier
comment by Eugine_Nier · 2012-06-02T03:13:00.477Z · LW(p) · GW(p)

Well, it's a decent dark arts technique if nothing else.

comment by Will_Newsome · 2012-06-02T03:30:52.185Z · LW(p) · GW(p)

It might help for you to know that the reasons that provoked this post are facts about the world that you do not currently possess. Not just pragmatic, moral, or philosophical arguments. So whatever theory you come up with should somehow take that into account.

Replies from: CommanderShepard
comment by CommanderShepard · 2012-06-02T15:51:44.915Z · LW(p) · GW(p)

If this has anything to do with Eldritch abominations lurking in dark space, or clandestine conspiracies and organizations with vast resources and advanced technology, I think it is best you share this information with me.

Or you could just be your annoying self and refuse to be helpful by bridging inferential gaps. At least until I stomp on and break your ankles.

Gains Renegade Points

Replies from: Normal_Anomaly, Will_Newsome
comment by Normal_Anomaly · 2012-06-02T17:11:44.571Z · LW(p) · GW(p)

It's a sad fact that I saw this comment and thought, "Oh, someone's RPing Commander Shepard. That's more interesting/useful than much of this thread."

comment by Will_Newsome · 2012-06-02T16:11:27.043Z · LW(p) · GW(p)

Haven't I given you enough of a hint with all this drama? I gave you what I could -- the world is out there, you can explore it yourself, Commander.

Replies from: CommanderShepard
comment by CommanderShepard · 2012-06-02T16:19:15.817Z · LW(p) · GW(p)

I'll talk to you later.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T16:21:56.205Z · LW(p) · GW(p)

I hope you'll tell me what you find.

comment by Manfred · 2012-06-02T00:36:05.929Z · LW(p) · GW(p)

Discuss.

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T00:40:59.341Z · LW(p) · GW(p)

Discuss.

Replies from: None
comment by [deleted] · 2012-06-02T22:32:55.546Z · LW(p) · GW(p)

Discus(s)

comment by Will_Newsome · 2012-06-02T00:58:58.909Z · LW(p) · GW(p)

that you don't have any privileged information.

Oh, and that I do have privileged information. Lots of fuckin' privileged information.

Replies from: albeola
comment by albeola · 2012-06-02T01:14:59.480Z · LW(p) · GW(p)

Is any of it transmissible? If not, is the reason why it isn't transmissible transmissible? Do your reasons carry over to other people's situations?

Replies from: Will_Newsome
comment by Will_Newsome · 2012-06-02T01:32:17.774Z · LW(p) · GW(p)

None is transmissible. My reasons carry over to people whose situations are similar to mine—that's very vague, but it implies that my reasons aren't truly unique to me. I'm truly sorry I can't share more.

Also, I vaguely remember "albeola"... have we met?