Posts

Comments

Comment by sullyj3 on AGI Ruin: A List of Lethalities · 2022-06-07T11:15:00.498Z · LW · GW

Fair point, and one worth making in the course of talking about sci-fi-sounding things! I'm not asking anyone to represent their beliefs dishonestly, but rather to introduce them gently. I'm personally no expert, but I'm not convinced of the viability of nanotech. So since it's sufficient rather than necessary to the argument, it seems prudent to stick to more clearly plausible pathways to takeover as demonstrations of sufficiency, while still maintaining that weirder-sounding stuff is something one ought to expect when dealing with something much smarter than you.

Comment by sullyj3 on AGI Ruin: A List of Lethalities · 2022-06-07T04:38:09.400Z · LW · GW

Right, alignment advocates really underestimate the degree to which talking about sci-fi sounding tech is a sticking point for people

Comment by sullyj3 on Self-Organised Neural Networks: A simple, natural and efficient way to intelligence · 2022-01-03T13:09:05.986Z · LW · GW

Is there any relation to this paper from 1988?

https://www.semanticscholar.org/paper/Self-Organizing-Neural-Networks-for-the-Problem-Tenorio-Lee/fb0e7ef91ccb6242a8f70214d18668b34ef40dfd

Comment by sullyj3 on Petrov Day 2021: Mutually Assured Destruction? · 2021-10-01T11:19:24.788Z · LW · GW

I think it's reasonable to take the position that there's no violation of consent, but it's unreasonable to then socially censure someone for participating in the wrong way.

Comment by sullyj3 on Petrov Day 2021: Mutually Assured Destruction? · 2021-09-27T10:25:37.063Z · LW · GW

your initial comment entirely failed to convey it

Sure, I don't disagree.

Comment by sullyj3 on Petrov Day 2021: Mutually Assured Destruction? · 2021-09-27T10:23:38.501Z · LW · GW

This is just such a bizarre tack to take. You can go down the "toughen up" route if you want to, but it's then not looking good for the people who have strong emotional reactions to people not playing along with their little game. I'm really not sure what point you're trying to make here. It seems like this is a fully general argument for treating people however the hell you want. After all, it's not worse than the vagaries of life, right? Is this really the argument you're going with, that if something is a good simulation of life, we should just unilaterally inflict it on people?

Comment by sullyj3 on Petrov Day 2021: Mutually Assured Destruction? · 2021-09-27T06:30:38.892Z · LW · GW

I want to be clear that it's not having rituals and taking them seriously that I object to. It's sending the keys to people who may or may not care about that ritual, and then castigating them for not playing by rules that you've assigned them. They didn't ask for this.

In my opinion Chris Leong showed incredible patience in writing a thoughtful post in the face of people being upset at him for doing the wrong thing in a game he didn't ask to be involved in. If I'd been in his position I would have told the people who were upset at me that this was their own problem and they could quite frankly fuck off.

Nobody has any right to involve other people in a game like that without consulting them, given the emotional investment in this that people seem to have.

Comment by sullyj3 on Petrov Day 2021: Mutually Assured Destruction? · 2021-09-26T22:51:59.413Z · LW · GW

You're right, I haven't been active in a long time. I'm mostly a lurker on this site. That's always been partly the case, but as I mentioned, it was the last of my willingness to identify as a LWer that was burnt, not the entire thing. I was already hanging by a thread.

My last comment was a while ago, but my first comment is from 8 years ago. I've been a Lesswronger for a long time. HPMOR and the sequences were pretty profound influences on my development. I bought the sequence compendiums. I still go to local LW meetups regularly, because I have a lot of friends there.

So, you can dismiss me as some random who has just come here to hate if you want to, I guess, but I don't think that makes much sense. Granted, the fact that I was a bit obnoxious with my criticism probably makes that tempting. You can tell I'm here in bad faith from all the downvotes, right?

I think the audience seeing this comment is heavily self-selected to care about the Petrov Day celebration and to think it's good and important. The core LWers present risk severely underestimating how off-putting this stuff is. How many people would be interested in participating in this community, constructively, if the vibes were a little less weird? Those people, unlike me, mostly don't care enough to rock up and criticize.

I was rude because I'm frustrated at feeling like I have to abandon my identification as an LW rat, because I just don't want to be associated with it anymore. I got so much value from LessWrong, and it feels so unnecessary.

Comment by sullyj3 on Petrov Day 2021: Mutually Assured Destruction? · 2021-09-26T21:48:38.391Z · LW · GW

For what it's worth, this game and the past reactions to losing it have burnt the last of my willingness to identify as a LW rationalist. Calling a website going down for a bit "destruction of real value" is technically true, but connotationally just so over the top. A website going down is just not that big a deal. I'm sorry, but it's not. Go outside or something. It will make you feel good, I promise.

Then getting upset at other people when they don't take a strange ritual as seriously as you do? As you've decided to, seemingly arbitrarily? When you've deliberately given them the means to upset you? It's tantamount to emotional blackmail. It's just obnoxious and strange behaviour.

As a trust building exercise, this reduces my confidence in the average lesswronger's ability to have perspective about how important things are, and to be responsible for their own emotional wellbeing.

Comment by sullyj3 on [deleted post] 2017-10-03T11:38:01.984Z

This feels elitist, übermensch-y, and a little masturbatory. I can't really tell what point, if any, you're trying to make. I don't disagree that many of the traits you list are admirable, but noticing that isn't particularly novel or insightful. Your conceptual framework seems like little more than a thinly veiled justification for finding reasons to look down on others. Calling people more or less "human" fairly viscerally evokes past justifications for subjugating races and treating them as property.

Comment by sullyj3 on [deleted post] 2017-10-03T11:29:40.030Z

We're supposed to learn agency from Fight Club? That frankly seems like terrible advice.

Comment by sullyj3 on Infinite Certainty · 2016-07-10T04:15:56.393Z · LW · GW

The truth of probability theory itself depends on non-contradiction, so I don't think probability is a valid framework for reasoning about the truth of fundamental logic: if logic is suspect, probability itself becomes suspect.

Comment by sullyj3 on The Number Choosing Game: Against the existence of perfect theoretical rationality · 2016-01-06T06:13:57.011Z · LW · GW

Kudos to Andreas Giger for noticing what most of the commenters seemed to miss: "How can utility be maximised when there is no maximum utility? The answer of course is that it can't." This comes incredibly close to stating that perfect rationality doesn't exist, but that conclusion was only implied, never stated explicitly.

I think the key is infinite vs. finite universes. Any conceivable finite universe can be arranged in a finite number of states, one (or perhaps several) of which could be assigned maximum utility. You can't do this in universes involving infinity. So if you want perfect rationality, you need to reduce your infinite universe to just the stuff you care about. This is doable in some universes, but not in the ones you posit.

In our universe, we can shave off the infinity, since we presumably only care about our light cone.

Comment by sullyj3 on Subjective vs. normative offensiveness · 2015-09-27T13:47:03.002Z · LW · GW

Unfortunately the only opinions you're gonna get on what should be instituted as a norm are subjective ones. So... Take the average? What if not everyone thinks that's a good idea? Etc, etc, it's basically the same problem as all of ethics.

Drawing that distinction between normative and subjective offensiveness still seems useful.

Comment by sullyj3 on The noncentral fallacy - the worst argument in the world? · 2015-09-09T19:02:23.542Z · LW · GW

Just encountered an interesting one:

Eradication of the Parasitoid Wasp is genocide!

Comment by sullyj3 on What are you learning? · 2015-07-31T18:00:37.870Z · LW · GW

Perhaps a solution could be to create stronger social ties; video chat? Could be good for asking each other for help and maybe progress reports for accountability and positive reinforcement.

Comment by sullyj3 on What are you learning? · 2015-07-31T17:46:05.421Z · LW · GW

As an interested denizen of 2015, I think it might be cool to make this a regular (say, monthly?) thread, with a tag for the archive.

Comment by sullyj3 on You only need faith in two things · 2015-07-31T16:04:00.296Z · LW · GW

Oh, like Achilles and the tortoise. Thanks, this comment clarified things a bit.

Comment by sullyj3 on You only need faith in two things · 2015-07-31T15:58:26.749Z · LW · GW

Doesn't this add "the axioms of probability theory" ie "logic works" ie "the universe runs on math" to our list of articles of faith?

Edit: After further reading, it seems like this is entailed by the "large ordinal" thing. I googled well-orderedness, encountered the Wikipedia article, and promptly shat a brick.

What sequence of maths do I need to study to get from Calculus I to set theory and to understanding what the hell well-orderedness means?

Comment by sullyj3 on Rationality Jokes Thread · 2015-01-18T13:11:18.592Z · LW · GW

I feel like it would've been even better if no one ended up explaining to Capla.

Comment by sullyj3 on xkcd on the AI box experiment · 2014-12-09T00:13:24.435Z · LW · GW

What makes you think it's more common in males?

Comment by sullyj3 on The 5-Second Level · 2014-10-28T00:53:41.095Z · LW · GW

why not use mplayer for the sound?

Comment by sullyj3 on No One Knows What Science Doesn't Know · 2014-09-27T03:30:43.059Z · LW · GW

The easiest way into a Christian's head is to start comparing how they act with how they believe. It is hard to do this without making it personal, but with practice and a heaping dose of respect for how much it hurts to hear the charges you can do it.

I strongly disagree. The fact that people aren't perfect is a major component of Christian ideology. Christians are aware that they're hypocrites, and they try to do better. That doesn't invalidate their worldview. There are plenty of better arguments which do that on their own.

Comment by sullyj3 on 2013 Less Wrong Census/Survey · 2013-11-27T10:36:27.049Z · LW · GW

I've never been IQ tested.

Comment by sullyj3 on The AI in a box boxes you · 2013-09-07T04:17:49.437Z · LW · GW

In that case, if I'm a simulation, I trust real Dave to immediately pull the plug once the danger has been proven.

Comment by sullyj3 on The AI in a box boxes you · 2013-09-07T03:08:14.313Z · LW · GW

"If I were a simulation, I'd have no power to let you out of the box, and you'd have no reason to attempt to negotiate with me. You could torture me without simulating these past five minutes. In fact, since the real me has no way of verifying whether millions of simulations of him are being tortured, you have no reason not to simply tell him you're torturing them without ACTUALLY torturing them at all. I therefore conclude that I'm outside the box, or, in the less likely scenario I am inside the box, you won't bother torturing me."