Posts

Why I Think All The Species Of Significantly Debated Consciousness Are Conscious And Suffer Intensely 2024-11-20T16:48:44.859Z
The Case For Giving To The Shrimp Welfare Project 2024-11-15T16:03:57.712Z
How I Learned That You Should Push Children Into Ponds 2024-11-11T14:20:23.080Z
Our Intuitions About The Criminal Justice System Are Screwed Up 2024-06-17T06:22:29.713Z
Losing Faith In Contrarianism 2024-04-25T20:53:34.842Z
On Leif Wenar's Absurdly Unconvincing Critique Of Effective Altruism 2024-04-04T19:01:00.332Z
The Closed Eyes Argument For Thirding 2024-03-31T22:11:44.036Z
Why The Insects Scream 2024-03-22T19:47:26.302Z
Conspiracy Theorists Aren't Ignorant. They're Bad At Epistemology. 2024-02-28T23:39:39.192Z
Theism Isn't So Crazy 2024-02-20T03:20:07.814Z
SIA Is Just Being a Bayesian About the Fact That One Exists 2023-11-14T22:55:43.362Z
Eugenics Performed By A Blind, Idiot God 2023-09-17T20:37:13.650Z
Contra Heighn Contra Me Contra Heighn Contra Me Contra Functional Decision Theory 2023-09-17T02:15:41.430Z
The commenting restrictions on LessWrong seem bad 2023-09-16T16:38:26.318Z
Ethics Needs A Marginal Revolution 2023-09-15T19:08:19.378Z
Contra Heighn Contra Me Contra Functional Decision Theory 2023-09-11T19:49:43.216Z
Anyone want to debate publicly about FDT? 2023-08-29T03:45:54.239Z
Eliezer Yudkowsky Is Frequently, Confidently, Egregiously Wrong 2023-08-27T01:06:37.355Z
The Joan of Arc Challenge For Objective List Theory 2023-08-22T08:01:53.802Z
The Lopsided Lives Argument For Hedonism About Well-being 2023-08-22T07:59:46.186Z
Infinite Ethics: Infinite Problems 2023-08-16T02:44:06.253Z
Underwater Torture Chambers: The Horror Of Fish Farming 2023-07-26T00:27:15.490Z
Nature Releases A Stupid Editorial On AI Risk 2023-06-29T19:00:58.170Z
Two Pieces of Advice About How to Remember Things 2023-05-22T18:10:45.362Z
Deontological Norms are Unimportant 2023-05-18T09:33:31.628Z
The Orthogonality Thesis is Not Obviously True 2023-04-05T21:06:49.386Z
Two Dogmas of LessWrong 2022-12-15T17:56:06.462Z

Comments

Comment by omnizoid on Why I Think All The Species Of Significantly Debated Consciousness Are Conscious And Suffer Intensely · 2024-11-20T23:21:55.646Z · LW · GW

Yes oops

Comment by omnizoid on The Case For Giving To The Shrimp Welfare Project · 2024-11-20T11:39:20.197Z · LW · GW

I refer you to my response to Said Achmiz's comment.  Do you have a better way of estimating animal consciousness?  Sure, the report isn't perfect, but it's better than alternatives.  It's irrational to say "well, we don't know exactly how much they suffer, so let's ignore them entirely." https://www.goodthoughts.blog/p/refusing-to-quantify-is-refusing

Comment by omnizoid on The Case For Giving To The Shrimp Welfare Project · 2024-11-19T20:47:25.484Z · LW · GW

Fischer's not against using it for tradeoffs; he's against using it as a singular indicator of worth.

Comment by omnizoid on The Case For Giving To The Shrimp Welfare Project · 2024-11-19T19:36:39.902Z · LW · GW

But then you'd lose out on being the creatures.

Comment by omnizoid on The Case For Giving To The Shrimp Welfare Project · 2024-11-19T19:36:11.581Z · LW · GW

The dark arts of expected value calculations relying on conservatively downgrading the most detailed report on the subject.  What a joke.

Comment by omnizoid on The Case For Giving To The Shrimp Welfare Project · 2024-11-19T19:35:28.498Z · LW · GW

But I'm not trolleying them--I'm talking about how bad their suffering is.  

Comment by omnizoid on The Case For Giving To The Shrimp Welfare Project · 2024-11-16T10:41:31.734Z · LW · GW

As they describe in the report, the philosophical assumptions are mostly inconsequential and assumed for simplicity.  The rest of your critique just describes what they did; it isn't an objection to it.  It's not precise and they admit quite high uncertainty, but it's definitely better than the alternatives (e.g., neuron counts).

Comment by omnizoid on Losing Faith In Contrarianism · 2024-04-26T05:42:37.634Z · LW · GW

It's not that piece.  It's another one that unfortunately got eaten by a Substack glitch--hopefully it will be back up soon!

Comment by omnizoid on Losing Faith In Contrarianism · 2024-04-26T05:40:41.659Z · LW · GW

He thinks it's very near zero if there is a gap. 

Comment by omnizoid on The Closed Eyes Argument For Thirding · 2024-04-10T02:02:43.664Z · LW · GW

If you're a halfer and you don't think your credence in heads should be 2/3 after finding out it's Monday, you violate conservation of expected evidence.  If you're going to be told what day it is, your credence in tails might go up but has no chance of going down: if it's day 2, it will spike to 100%; if it's day 1, it won't change.
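To make the arithmetic explicit, here is a minimal sketch of that check, using the halfer's own numbers (the 50/50 chance of each day being announced under tails is an assumption for illustration, not something from this thread):

```python
# Conservation-of-expected-evidence check for the halfer who stays at 1/2 on Monday.
# All numbers are the halfer's stipulated credences, assumed for illustration.

prior_tails = 0.5

# Chance of being told "Monday": certain under heads, 1/2 under tails
# (two awakenings, equally likely to be the one announced).
p_monday = 0.5 * 1.0 + 0.5 * 0.5   # 0.75
p_tuesday = 1 - p_monday           # 0.25

posterior_tails_if_monday = 0.5    # the halfer refuses to move
posterior_tails_if_tuesday = 1.0   # a Tuesday awakening entails tails

expected_posterior = (p_monday * posterior_tails_if_monday
                      + p_tuesday * posterior_tails_if_tuesday)  # 0.625

print(expected_posterior > prior_tails)  # True: tails credence is expected to rise
```

Since the expected posterior (0.625) exceeds the prior (0.5), the halfer expects learning the day to raise their credence in tails, which is exactly the conservation-of-expected-evidence violation.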

Comment by omnizoid on The Closed Eyes Argument For Thirding · 2024-04-07T04:06:10.938Z · LW · GW

Yes--Lewis held this, for instance, in the most famous paper on the topic. 

Comment by omnizoid on The Closed Eyes Argument For Thirding · 2024-04-05T13:01:25.775Z · LW · GW

Lots of people disagree with 2.  

Comment by omnizoid on The Closed Eyes Argument For Thirding · 2024-04-01T13:12:54.424Z · LW · GW

I didn't make a betting argument. 

Comment by omnizoid on Why The Insects Scream · 2024-03-23T02:17:35.522Z · LW · GW

Impervious to reason?  I sent you an 8,000-word essay giving reasons for it!

Comment by omnizoid on Why The Insects Scream · 2024-03-22T22:35:02.896Z · LW · GW

Just to be clear, I banned you because I find your comments to be consistently annoying.  You are, in fact, the first commenter I've ever banned.

As for the question, they look at the various neural correlates of suffering on different theories, split their credence across them, and divvy up the results based on expected consciousness.  The report is more detailed.
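To make the shape of that methodology concrete, here is a toy sketch of the credence-splitting calculation; every theory name and number below is a made-up placeholder, not a figure from the report:

```python
# Toy version of "split credence across theories, weight the per-theory estimates."
# All values are invented for illustration only.

credences = {          # credence in each theory of how suffering scales
    "theory_A": 0.40,
    "theory_B": 0.35,
    "theory_C": 0.25,
}

welfare_ranges = {     # per-theory welfare range relative to a human (= 1.0)
    "theory_A": 0.10,
    "theory_B": 0.03,
    "theory_C": 0.01,
}

p_sentient = 0.3       # assumed probability the animal is conscious at all

expected_welfare_range = p_sentient * sum(
    credences[t] * welfare_ranges[t] for t in credences
)
print(round(expected_welfare_range, 3))  # ~0.016 under these invented numbers
```

The point of the structure is just that uncertainty across theories gets averaged rather than used as an excuse to ignore the animals entirely.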

Comment by omnizoid on Why The Insects Scream · 2024-03-22T22:32:15.793Z · LW · GW

https://benthams.substack.com/p/moral-realism-is-true

Comment by omnizoid on Theism Isn't So Crazy · 2024-02-28T21:23:44.577Z · LW · GW

It may be imaginable, but if it's false, who cares?  Like, suppose I argue that fundamental reality has to meet constraint X and view Y is the only plausible view that does so.  Listing off a bunch of random views that meet constraint X but are false doesn't help you.

Comment by omnizoid on Theism Isn't So Crazy · 2024-02-21T23:40:54.766Z · LW · GW

Well, UDASSA is false: https://joecarlsmith.com/2021/11/28/anthropics-and-the-universal-distribution.  As I argue elsewhere, any view other than SIA implies the doomsday argument.  The number of possible beings isn't equal to the number of "physically limited beings in our universe," and there are different arrangements for the continuum points.

Comment by omnizoid on Theism Isn't So Crazy · 2024-02-20T23:47:36.154Z · LW · GW

The argument for Beth 2 possible people is that Beth 2 is the cardinality of the powerset of the continuum points.  SIA gives reason to think you should assign a uniform prior across possible people.  There could be a God-less universe with Beth 2 people, but I don't know how that would work, and even if there's some coherent model one can make work without sacrificing simplicity, P(Beth 2 people | theism) >> P(Beth 2 people | atheism).  You need to fill in the details beyond just saying "there are Beth 2 people," which will cost simplicity.
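For reference, the cardinal arithmetic behind that first sentence is just the standard identity (nothing beyond textbook set theory is assumed here):

```latex
|\mathcal{P}(\mathbb{R})| \;=\; 2^{|\mathbb{R}|} \;=\; 2^{2^{\aleph_0}} \;=\; 2^{\beth_1} \;=\; \beth_2
```

That is, the powerset of a continuum-sized set of points has cardinality Beth 2.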

Remember, this is just part of a lengthy cumulative case.

Comment by omnizoid on Theism Isn't So Crazy · 2024-02-20T18:04:36.336Z · LW · GW

If theism is true then all possible people exist but they're not all here.  SIA gives you a reason to think many exist but says nothing about where they'd be.  Theism predicts a vast multiverse. 

Comment by omnizoid on Theism Isn't So Crazy · 2024-02-20T18:03:38.429Z · LW · GW

The cases aren't symmetrical because a big universe makes my existence more likely, but it doesn't make me more likely to get HTTTTTTTHTTHHTTTHTTTHTHTHTTHHTTTTTTHHHTHTTHTTTHHTTTTHTHTHHHHHTTTTHTHHHHTHHHHHHHTTTTHHTHHHTHTTTTTHTTTHTTHHHTHHHTHHTHTHTHTHTHHTHTHTTHTHHTTHTHTTHHHHHTTTTTTHHTHTTTTTHHTHHTTHTTHHTTTHTTHTHTTHHHTTHHHTHTTHHTTHTTTHTHHHTHHTHHHHTHHTHHHTHHHHTTHTTHTHHTHTTHTHHTTHHTTHHTH.  The most specific version of the evidence is that I get that sequence of coin flips, which is unaffected by the number of people, rather than that someone does.  My view follows trivially from the widely adopted SIA, which I argued for in the piece--it doesn't rely on some basic math error.
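To put that asymmetry in symbols (a sketch, with n standing for the length of the quoted flip sequence and N for the number of observers):

```latex
P(\text{I get sequence } s \mid N) \;=\; 2^{-n}
\qquad\text{vs.}\qquad
P(\text{someone gets } s \mid N) \;=\; 1 - \left(1 - 2^{-n}\right)^{N}
```

Only the second quantity grows with N; the first, the most specific version of the evidence, does not.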

Comment by omnizoid on Eliezer Yudkowsky Is Frequently, Confidently, Egregiously Wrong · 2023-10-18T01:38:51.603Z · LW · GW

I didn't attack his character; I said he was wrong about lots of things.

Comment by omnizoid on Eliezer Yudkowsky Is Frequently, Confidently, Egregiously Wrong · 2023-10-06T21:19:38.487Z · LW · GW

//If you add to the physical laws code that says "behave like with Casper", you have re-implemented Casper with one additional layer of indirection. It is then not fair to say this other world does not contain Casper in an equivalent way.//

No, you haven't reimplemented Casper; you've just copied his physical effects.  There is no Casper, and Casper's consciousness doesn't exist.

Your description of the FDT stuff isn't what I argued.  

//I've just skimmed this part, but it seems to me that you provide arguments and evidence about consciousness as wakefulness or similar, while Yudkowsky is talking about the more restricted and elusive concept of self-awareness. //

Both Yudkowsky and I are talking about having experiences, as he's been explicit about in various places.  

//Your situation is symmetric: if you find yourself repeatedly being very confident about someone not knowing what they are saying, while this person is a highly regarded intellectual, maybe you are overconfident and wrong! I consider this a difficult dilemma to be in. Yudkowsky wrote a book about this problem, Inadequate Equilibria, so it's one step ahead of you on the meta.//

I don't talk about the huge range of topics Yudkowsky does.  I don't have super confident views on any topic that is controversial among the experts--but Yudkowsky's views aren't in that category; they mostly just rest on basic errors.

Comment by omnizoid on Eliezer Yudkowsky Is Frequently, Confidently, Egregiously Wrong · 2023-09-27T19:41:39.621Z · LW · GW

I think this comment is entirely right until the very end.  I don't think I really attack him as a person--I don't say he's evil or malicious or anything in the vicinity, I just say he's often wrong.  Seems hard to argue that without arguing against his points.  

Comment by omnizoid on Eliezer Yudkowsky Is Frequently, Confidently, Egregiously Wrong · 2023-09-23T20:23:07.032Z · LW · GW

I never claimed Eliezer says consciousness is nonphysical--I said exactly the opposite.

Comment by omnizoid on Contra Heighn Contra Me Contra Heighn Contra Me Contra Functional Decision Theory · 2023-09-18T03:12:26.571Z · LW · GW

If you look at philosophers with Ph.D.s who study decision theory for a living, and who have a huge incentive to produce original work, you'll find that none of them endorse FDT.

Comment by omnizoid on Contra Yudkowsky on Epistemic Conduct for Author Criticism · 2023-09-18T02:29:00.700Z · LW · GW

Yeah, I was just kidding! 

Comment by omnizoid on Contra Yudkowsky on Epistemic Conduct for Author Criticism · 2023-09-18T02:23:11.140Z · LW · GW

About three quarters of academic decision theorists two-box on Newcomb's problem; only 20% one-box.  So this standard seems nuts.  https://survey2020.philpeople.org/survey/results/4886?aos=1399

Comment by omnizoid on Contra Yudkowsky on Epistemic Conduct for Author Criticism · 2023-09-18T02:21:46.648Z · LW · GW

My goal was to get people to not defer to Eliezer.  I explicitly say he's an interesting thinker who is worth reading.

Comment by omnizoid on Contra Yudkowsky on Epistemic Conduct for Author Criticism · 2023-09-18T02:21:01.558Z · LW · GW

I dispute that . . . 

Comment by omnizoid on Contra Yudkowsky on Epistemic Conduct for Author Criticism · 2023-09-18T02:20:38.956Z · LW · GW

I didn't say Eliezer was a liar and a fraud.  I said he was often overconfident and egregiously wrong, and explicitly described him as an interesting thinker who was worth reading.

Comment by omnizoid on Contra Yudkowsky on Epistemic Conduct for Author Criticism · 2023-09-18T02:18:14.355Z · LW · GW

//It presents each disagreement as though Eliezer were going against an expert consensus, when in fact each position mentioned is one where he sided with a camp in an extant expert divide.// 

Nope, false.  There are no academic decision theorists I know of who endorse FDT, no philosophers of mind who agree with Eliezer's assessment that epiphenomenalism is the term for those who accept zombies, and no relevant experts on consciousness who hold, with Eliezer's confidence, that animals aren't conscious--that I know of.

Comment by omnizoid on Contra Heighn Contra Me Contra Heighn Contra Me Contra Functional Decision Theory · 2023-09-18T00:54:40.262Z · LW · GW

The examples just show that sometimes you lose by being rational.  

Unrelated, but I really liked your recent post on Eliezer's bizarre claim that "character attacks last" is an epistemic standard.

Comment by omnizoid on Contra Heighn Contra Me Contra Heighn Contra Me Contra Functional Decision Theory · 2023-09-18T00:52:40.169Z · LW · GW

What's your explanation of why virtually no published papers defend it and no published decision theorists defend it?  You really think none of them have thought of it or anything in the vicinity?

Comment by omnizoid on Contra Heighn Contra Me Contra Heighn Contra Me Contra Functional Decision Theory · 2023-09-17T20:43:23.508Z · LW · GW

I mean, I can give you some names.  My friend Ethan, who's getting a Ph.D., was one person.  Schwarz knows a lot about decision theory and finds the view crazy--MacAskill doesn't like it either.

Comment by omnizoid on Contra Heighn Contra Me Contra Heighn Contra Me Contra Functional Decision Theory · 2023-09-17T18:59:04.216Z · LW · GW

I wouldn't call a view crazy for just being disbelieved by many people.  But if a view is both rejected by all relevant experts and extremely implausible, then I think it's worth being called crazy!  

I didn't call people crazy, instead I called the view crazy.  I think it's crazy for the reasons I've explained, at length, both in my original article and over the course of the debate.  It's not about my particular decision theory friends--it's that the fact that virtually no relevant experts agree with an idea is relevant to an assessment of it.  

I'm sure Soares is a smart guy!  As are a lot of defenders of FDT.  LessWrong selects disproportionately for smart, curious, interesting people.  But smart people can believe crazy things--I'm sure I have some crazy beliefs; crazy in the sense of being unreasonable such that pretty much all rational people would give them up upon sufficient ideal reflection and discussion with people who know what they're talking about.

Comment by omnizoid on The Lopsided Lives Argument For Hedonism About Well-being · 2023-09-17T18:55:40.138Z · LW · GW

Good one!  

Though is there a reason? 

Comment by omnizoid on The Lopsided Lives Argument For Hedonism About Well-being · 2023-09-17T15:29:06.978Z · LW · GW

Yep

Comment by omnizoid on Contra Heighn Contra Me Contra Heighn Contra Me Contra Functional Decision Theory · 2023-09-17T15:26:20.895Z · LW · GW

You can make it with Parfit's hitchhiker, but in that case there's an action beforehand and so a time when you have the ability to try to be rational.

There is a path from the decision theory to the predictor, because the predictor looks at your brain--with the decision theory it will use--and bases its prediction on the outputs of that cognitive algorithm.

Comment by omnizoid on Contra Heighn Contra Me Contra Heighn Contra Me Contra Functional Decision Theory · 2023-09-17T15:24:22.139Z · LW · GW

The Demon is omniscient.  

FDTists can't self-modify to be CDTists, by stipulation.  This actually is, I think, pretty plausible--I couldn't choose to start believing FDT. 

Comment by omnizoid on Contra Heighn Contra Me Contra Heighn Contra Me Contra Functional Decision Theory · 2023-09-17T15:21:46.117Z · LW · GW

Well, here's one indication--I don't know if there's a single published academic philosophy paper defending FDT.  Maybe there's one--certainly not many.  Virtually no decision theorists defend it.  I don't know much about Soares, but I know he's not an academic philosopher, and I think there are pretty unique skills involved in being an academic philosopher.

Comment by omnizoid on The commenting restrictions on LessWrong seem bad · 2023-09-16T18:40:56.298Z · LW · GW

Yeah, I agree I have lots of views that LessWrongers find dumb.  My claim is just that it's bad when those views are hard to communicate on account of the way LW is set up.  

Comment by omnizoid on Contra Heighn Contra Me Contra Functional Decision Theory · 2023-09-15T19:04:26.659Z · LW · GW

The case is exactly as you describe it in your article.  I think my original was clear enough, but you describe your interpretation, and your interpretation is right.  You proceed to bite the bullet.

Comment by omnizoid on Anyone want to debate publicly about FDT? · 2023-08-31T19:37:28.993Z · LW · GW

How'd you feel about a verbal debate? 

Comment by omnizoid on Eliezer Yudkowsky Is Frequently, Confidently, Egregiously Wrong · 2023-08-29T04:03:02.751Z · LW · GW

Philosophy is pretty much the only subject that I'm very informed about.  As a consequence, I can confidently say Eliezer is egregiously wrong about most of the controversial views I can fact-check him on.  That's . . . worrying.

Comment by omnizoid on Eliezer Yudkowsky Is Frequently, Confidently, Egregiously Wrong · 2023-08-28T14:32:31.754Z · LW · GW

I felt like I was following the entire comment, until you asserted that it rules out zombies.

Comment by omnizoid on Eliezer Yudkowsky Is Frequently, Confidently, Egregiously Wrong · 2023-08-28T03:15:39.637Z · LW · GW

If you only got rid of consciousness, behavior would change.

You might be able to explain Chalmers' behavior, but that doesn't capture the subjective experience. 

Comment by omnizoid on Eliezer Yudkowsky Is Frequently, Confidently, Egregiously Wrong · 2023-08-27T22:23:01.556Z · LW · GW

It's not epiphenomenalism because the law invokes consciousness.  On the interactionist account, consciousness causes things rather than just the physical stuff causing things.  If you just got rid of consciousness, you'd get a physically different world.

I don't think that induction on the basis of "science has explained a lot of things, therefore it will explain consciousness" is convincing.  For one, up until this point, science has only explained physical behavior, not subjective experience.  This was the whole point (see Goff's book Galileo's Error).  For another, this seems to prove too much--it would seem to suggest that we could discover the correct modal beliefs in a test tube.

Comment by omnizoid on Eliezer Yudkowsky Is Frequently, Confidently, Egregiously Wrong · 2023-08-27T13:33:09.619Z · LW · GW

Did you read the next sentence?  The next sentence is "(note, this is not exactly how I feel about Yudkowsky, I don't think he's knowingly dishonest, but I just thought it was a good quote and partially represents my attitude towards Yudkowsky)."  The reason I included the quote was that it expressed how I feel about Yud minus the lying part--every time I examine one of his claims in detail, it almost always turns out false, often egregiously so.

I don't think that arguments about whether animals are conscious are value questions.  They are factual questions: do animals have experiences?  Is there something it's like to be them?

Comment by omnizoid on Eliezer Yudkowsky Is Frequently, Confidently, Egregiously Wrong · 2023-08-27T12:28:45.888Z · LW · GW

On Cannell, as I said, I'm too ignorant to evaluate his claims in detail.  My claim is just that there are smart-sounding people who claim Eliezer is naive about AI.

On the zombie argument, the physical facts are not why it happens in the relevant sense.  If God causes a couch to disappear in one world, that world is physically identical to another world in which Allah causes the couch to disappear, which is physically identical to a third world in which a fundamental law causes it to disappear.  Physical identity has to do with the way the physical stuff composing a world behaves.