There's a vast difference between being "almost god-like" and being God, and as long as you don't equate the two then there's no contradiction.
I don't think I've ever seen the paradox of tolerance used that way. Even in the original formulation from Popper, it's specifically an argument for restricting the principle of tolerance, based on the consequences of society being too tolerant.
The problem with the paradox of tolerance (as I've seen it used) is that people use it as an argument to justify putting limits on the principle which are in fact arbitrary and unjustified; they just say "we can't tolerate the intolerant" as a cached excuse for doing violence to political enemies while still professing a belief in tolerance.
As such, your defence sounds to me like it's ceding the ground. I don't believe in tolerance-conditional-on-reciprocity, I believe in tolerance.
Downvoting is not an argument, because downvoting is a judgement that an idea is not worthy of "intellectually addressing" (on this forum). That's not the same as addressing the idea.
I have taken the survey.
That's the reason she liked those things in the past, but "achieving her goals" is redundant; she should have known about that years in advance, so it's clear that she's grown so attached to self-improvement that she sees it as an end in itself. Why else would anyone ever, upon deciding to look inside themselves instead of at expected utility, replace thoughts of paragliding in Jupiter with thoughts of piano lessons?
Hedonism isn't bad, orgasmium is bad because it reduces the complexity of fun to maximising a single number.
I don't want to be upgraded into a "capable agent" and then cast back into the wilderness from whence I came, I'd settle for a one-room apartment with food and internet before that, which as a NEET I can tell you is a long way down from Reedspacer's Lower Bound.
By believing it's important enough that when you come up with a system of values, you label it a terminal one. You might find that you come up with those just by analysing the values you already have and identifying some as terminal goals, but "She had long been a believer in self-perfection and self-improvement" sounds like something one decides to care about.
Serves her right for making self-improvement a foremost terminal value even when she knows that's going to be rendered irrelevant; meanwhile, the loop I'm stuck in is the first six hours spent in my catgirl volcano lair.
Seems heavy on sneering at people worried about AI, light on rational argument. It's almost like a RationalWiki article.
I'll add a datapoint to that and say an anonymous site like that would tempt me enough to actively go and troll, even though I'm not usually inclined towards trolling.
Although I picture it getting so immediately overwhelmed by trolls that the fun would disappear; "pissing in an ocean of piss" as 4chan calls it.
What is the value of this link supposed to be?
So, uh, are people honestly reporting that they got a "rationalist" result from this, or are they just thinking "well, I'm a rationalist, so..."?
"Oh, that's nice."
They wouldn't exactly be accepting the belief as equally valid; religious people already accept that people of other religions have a different faith than they do, and on at least some level they usually have to disagree with "other religions are just as valid as my own" to even call themselves believers of a particular religion, but it gets you to the point of agreeing to disagree.
Since my comment was vague enough to be misunderstood, I'll try to clarify what I thought the first time.
The dialogue reads as a comedy skit where the joke is "theists r dum". The atheist states beliefs that are a parody of certain attitudes of religious believers, and then the theist goes along with an obvious setup they should see coming a mile away. It doesn't seem any more plausible than the classic "rabbit season, duck season" exchange in Looney Tunes, so it's not valuable.
I think an overall decrease in activity on Less Wrong is to blame - "the death of Less Wrong" has been proclaimed for a while now. In which case, decreasing the frequency of the quotes thread seems like it would add to the downward spiral if it did anything at all.
Don't feel I have the attention span (and/or spoons) right now to actually look through the draft, but I note that you mis-spelled "embarrass" while talking about whether you'd embarrassed yourself, which I thought was kinda funny.
Um, not intending to mock - just coincidental placing of a typo, I'm sure.
Believer: "I say it's duck season, and I say fire!"
Yeah, I don't see any real intellectual value to this.
The usual rule is to identify as an "aspiring rationalist"; identifying rationality as what you are can lead to believing you're less prone to bias than you really are, while identifying it as what you aspire to reminds you to maintain constant vigilance.
I think I can conceive of things that are logically inconsistent. I might just be ignoring the details that make it inconsistent when I do, but other cases where I conceive of a concept but don't keep every detail in mind at once don't seem examples of inconceivability.
Wouldn't the ability to have a false positive for a paradox itself be a sign that people can conceive of things that are paradoxical?
I like "effective egoism" enough already, the alternatives I've seen suggested sound dumb and this one sounds snappy. It might not be perfect for communicating exactly the right message of what the idea is about, but you can do that by explaining, and having a cool name can only be achieved within the name itself.
I don't quite see the connection between the title and first sentence and the rest of the post you have there; logically inconsistent is not the same as inconceivable.
I accept that meat is more environmentally damaging per calorie (or similar such measures), and with the scale of the meat and dairy industry I'd accept saying it has a huge effect on the environment, but there are several steps between that and "if humanity doesn't go vegan soon, we will probably go extinct".
It's not actually an article, rather a structured debate formatted after a wiki, so that particular problem is kind of inherent.
I didn't click through and there might be more context than this, but "chances only increase by 2 to 5 percent" is ambiguous between percentage points (an absolute increase in probability) and percent of the chance it was before (a relative increase) - for example, from a 10% baseline the former gives 12-15% while the latter gives only 10.2-10.5%. I'm not sure it qualifies as an "irrationality quote"; it's just unclear and could be confusing. But /u/PhilGoetz's version is a step up.
(I'd maybe not use "odds ratio multiplier", because we're not just concerned about clarity, but clarity to people who might be statistically illiterate)
The way the problem reads to me, choosing dust specks means I live in a universe where 3^^^3 of me exist, and choosing torture means 1 of me exist. I prefer that more of myself exist than not, so I should choose specks in this case.
In a choice between "torture for everyone in the universe" and "specks for everyone in the universe", the negative utility of the former obviously outweighs that of the latter, so I should choose specks.
I don't see any incongruity or reason to question my beliefs? I suppose it's meant to be implied that it's other selves that exist because of the size of the universe, so there's either one of "everyone in the universe" or 3^^^3 copies of everyone, but in that case my other selves are too far outside my light-cone for "iff you are alone" to be a prediction that makes sense.
It sounds like you expect it to be obvious, but nothing springs to mind. Perhaps you should actually describe the insane reasoning or conclusion that you believe follows from the premise.
I unironically love how highly upvoted this post is - it's just so much my tribe, bonobo rationalist tumblr notwithstanding.
Guy who doesn't know much about startups here - "launched the first version" and "want [it] to become" sound indicative of something more like an "outline of a novel" - can you elaborate on how big of an accomplishment it was to get it off the ground in the first place?
I'll come in to say yes, I agree these problems are confusing, although my ethics are weird and I'm only kind of a consequentialist.
(I identify as amoral, in practice what it means is I act like an egoist but give consequentialist answers to ethical questions)
She fangirls over the remake? I've never heard the remake described as anything other than some variant of "lifeless", especially from fans of classic Sailor Moon.
EDIT: Forgot it was the positivity thread for a second, let me have another go at that: So I guess maybe I should have another go at the remake! I actually really like being convinced to like a show I was previously "meh" about. Some shows it's more fun to get a hateboner/kismesis thing going for, but Sailor Moon Crystal isn't one of them.
The problem is that ethics can work with other axioms. Someone might be a deontologist, and define ethics around bad actions e.g. "murder is bad", not because the suffering of the victim and their bereaved loved ones is bad but because murder is bad. Such a set of axioms results in a different ethical system than one rooted in consequentialist axioms such as "suffering is bad", but by what measure can you say that the one system is better than the other? The difference is hardly the same as between attempting rationality with empiricism vs without.
Well, I don't think "a bit of a middle-ground" justifies taking a stance calling full-on moral relativism "immoral, pointless & counterproductive".
"Suffering is bad" seems a lot easier to agree on as a premise than it actually is - taken by itself, just about anyone will agree, but taken as a premise for a system it implies a harm-minimising consequentialist ethical framework, which is a minority view.
And it's simple enough to consistently be pro-life but also support the death penalty: if one believes a fetus at whatever stage of development is a human life and killing it is equivalent to murder, as many pro-lifers ostensibly do, one must simply have consistent standards for when killing is okay, that include a government convicting someone of a capital crime but exclude a mother not wanting to drop out of college.
We use analogies and the occasional bit of mysticism often enough that I think references are consistent, although the term has entered the popular consciousness and become divorced enough from the original religious concept that worrying about its origins seems to be mostly an ideological purity issue, a kind of worrying that's itself pretty irrational to engage in.
You could probably have just covered Ubuntu with "I'm not talking about the OS, I'm talking about a philosophy/ideology used by Mugabe".
Although as for moral relativism... bad idea by whose standard? By what logic? If it's irrational nonsense to be a moral relativist, do you have a rational argument for moral realism?
I have taken the survey.
I see 20-30 (didn't count) comments in the thread so far; I think it's more that people are too lazy to upvote every one than that they're vetting who they upvote here.
Downvoted for the kind of attitude actually described in Politics is the Mind-Killer; the NRxs historically tending to be the worst offenders is irrelevant.
I didn't downvote because it was already at minus one, but it seemed to apply mainly to government policies rather than private donations and be missing the point because of it, and "miss the point so as to bring up politics in your response" is not good.
The statements being believed in don't have to be on continuums (continua?) for belief in them to be represented as probabilities on a continuum; "I am X% certain that Y is always true".
If they know that few names from my era, they probably know similarly little about each one. I play "Albert Einstein", but it's obvious to any popsicles from the same era that I'm actually Rick Sanchez. This develops into an in-joke where basically every "Albert Einstein" is really playing Rick Sanchez. We ruin everything with drunken debauchery, then ???, profit, take over the degenerate binge-drinking wasteland society becomes.
If you think this has non-negligible negativity*probability, you've got the conjunction fallacy up the wazoo. Although what it actually reads as is finding a LessWrong framing and context to post the kind of furry hate you'd see in any other web forum, not very constructive.
So I'll respond at the same level of discourse to the scenario: "Bitch, I watched Monster Musume. My anaconda don't want none unless she's part anaconda. Your furfags are tame. Didn't you at least bring back any pegasisters? IWTCIRD!"
Now, not so much being inclined towards those fetishes as simply not being so stupidly fussy about it that I'd rather kill myself, I have a less immediate reaction that's more about dismantling the scenario: When I'm emulated, I'll ask about their criteria for printing me out into meatspace, and point out "if it's an interesting challenge you want and resurrections are conditional on that, why not just get creative and weird with the internal biology but challenge yourself to keep the exterior looking as human as possible? Like, what if you make my bones out of an entirely different material?"
I mean, if I didn't make an argument like that, wouldn't I either be woken up in an anthropomorphic animal body or not be woken up at all, in this scenario?
"So, specifically my generation, not my parents' or Queen Victoria's or... yours? That's a bold strategy, let's see if it pays off."
Maybe I have to spend a thousand years entertaining myself by making up total bullshit about my culture to troll the scientists, but eventually some group with completely different political beliefs will take over, and maybe I'll share the same fate as the zookeepers, but I'll damn sure be beaming the smuggest shiteating I-told-you-so grin at the zookeeper while the 41st-century neonazis hang us both on their day of the rope.
But ok, sure, maybe it'd really suck, but the plausibility? Future generations collectively decide that punishing individuals for the crimes of the generation they were born in makes sense, future generations believe my generation committed crimes worth being that harsh in punishing, future generations think it's plausible they might accidentally commit said crimes but still find members of past generations culpable, criminals don't have rights in the future, future generations fail at between-generations prisoner's dilemmas, somehow the best way to learn about a previous generation is to examine in vitro an extremely eccentric sample of said generation... there could be more, but that's already enough conjunctions to flush the probability down the wazoo.
Wouldn't that be subjectively equivalent on the cryo-patient's end to "cryonics doesn't work, you just stay dead"?
You think I have friends and/or loved ones who are going into cryonics? Hahahaha!
Would seem to imply memories don't make up who you are - I mean, what I'm inclined to read into it is "there are souls and they got moved around", but it could be anything - in which case, if there's a way to cause myself amnesia (and with this level of tech why wouldn't there be?) I should just wipe out my memories and find out who I am. Ideally it'll also be possible to save the memories in backups somehow, or I'll have "external memory" like diaries and such, in case I start regretting the decision.
That scenario still sounds awesome, as long as I'm comparing it to "no cryonics" instead of "best-case cryonics scenarios". I get to be dropped into a completely unfamiliar world with just my mind, a small sum of money, and a young healthy body? Sounds like a fun challenge; I mean, I died once, what have I got to lose?
Well, my current self and associated memories/opinions are fine with the second part; this is basically just a Buddhist hell where afterwards I get reincarnated into the post-singularity future.
ETA: also highly unlikely, since it happening to me is conditional on the scenario happening to anyone.
Yes - I mean existential crisis in the sense of dread and terror from letting my mind dwell on my eventual death, convincing myself I'm immortal is a decisive solution to that insofar as I can actually convince myself. I don't mind existence being meaningless, it is that either way, I care much more about whether it ends.
I consciously will myself to believe in big world immortality, as a response to existential crises, although I don't seem to have actual reasons not to believe such besides intuitions about consciousness/the self that I've seen debated enough to distrust.
Storing data that might be used to reconstruct someone in the future isn't really objectionable, but that seems separate from actually using that data to create the resurrection. And it probably works out fine in the utilitarian calculus unless you count the sunk cost vs creating a "better" new person or a utility monster, but bringing someone back to life just because they didn't mention that they didn't want it, or you thought the reason they gave for not wanting it was irrational, sounds really skeevy. We have rules about consent for interacting with other people's bodies, I think that includes implanting their consciousness in new bodies.
I believe the accepted plural of "waifu" is "waifus".
I know that, at least in our specific community, wanting to be resurrected rather than not, especially into a techno-utopian future, almost goes without saying, but it still worries me that you don't seem to mention consent. At least the top paragraph suggests a third party collecting information about someone else so that they can be resurrected after their death, and even if we skip over the more normal issues with doing that, resurrecting someone without their permission seems like a violation.
In the mix with the problems you've listed under 1. is whether this kind of resurrection is even necessary. Personally, I doubt those identity problems can be conclusively solved even in principle, at least to a level where people's intuitions don't dominate, although I'm inclined to give up on what's actually factual there and try to convince myself of the weakest notion of identity I can find believable. I can't do much pushing there, but the notion I default to using based on my intuitions (sleeping doesn't kill you, uploading does kill you) is hard to justify, so I don't mind trying to push away from it. Should really be stepping up my DI efforts.