Posts

Comments

Comment by alexu on Tallinn-Evans $125,000 Singularity Challenge · 2010-12-27T23:31:28.241Z · score: 2 (16 votes) · LW · GW

Why has this comment been downvoted so much? It's well-written and makes some good points. I find it really disheartening every time I come on here to find that a community of "rationalists" is so quick to muffle anyone who disagrees with LW collective opinion.

Comment by alexu on Were atoms real? · 2010-12-14T14:59:11.621Z · score: 8 (13 votes) · LW · GW

Half the more "philosophical" posts on here seem like they're trying to reinvent the wheel. This issue has been discussed a lot by philosophers and there's already an extensive literature on it. Check out http://plato.stanford.edu/entries/scientific-realism/ for starters. Nothing wrong with talking about things that have already been talked about, of course, but it would probably be good at least to acknowledge that this is a well-established area of thought, with a known name, with a lot of sophisticated thinking already underway, rather than having the mindset that Less Wrong is single-handedly inventing Western philosophy from scratch.

Comment by alexu on Pain · 2009-08-03T14:11:36.719Z · score: 0 (6 votes) · LW · GW

The difficulty of answering this question suggests one possibility: pain might very well be the only intrinsically bad thing there is. Pain is bad simply because it is bad, in a way that nothing else is. It could be argued that the "goodness" or "badness" of everything else is reducible to how much pain qualia it causes or prevents.

Comment by alexu on It's all in your head-land · 2009-07-24T14:31:16.351Z · score: 1 (1 votes) · LW · GW

Lots of great stuff in this post. Don't have time to comment on anything in particular, but just wanted to say: this is the best-written piece I've ever seen on Less Wrong. Keep writing.

Comment by alexu on Atheism = Untheism + Antitheism · 2009-07-01T13:54:37.309Z · score: 3 (3 votes) · LW · GW

Quick: is there an 85-year-old former plumber by the name of Saul Morgan eating a breakfast of steak and eggs in a diner on the North Side of Chicago right now? Who knows, right? You certainly don't have an affirmative belief that there is, but it's also true that, perhaps up until this moment, you didn't affirmatively believe that there wasn't such a man either. Lacking a belief in something is not the same as believing in its converse. To affirmatively believe in the non-existence of every conceivable entity or the falsity of every proposition would require an infinite number of beliefs.

Comment by alexu on Essay-Question Poll: Dietary Choices · 2009-05-03T19:59:47.728Z · score: 0 (14 votes) · LW · GW

I eat anything. I make a conscious choice to eat healthy stuff and avoid junk food and simple carbs when convenient. My preferred eating pattern is to basically graze all day long. That, along with a general indifference toward food (I find eating to be a bit of an irritating necessity, and never have cravings for anything), is enough to keep me trim. Probably worth noting that I wasn't always this way; up through college, I loved eating crap foods, sweets, carbs, soda, etc. Permanent preference changes take time, but can happen.

Most vegetarians/vegans strike me as sanctimonious twits, who are more often than not no healthier than anyone else.

Comment by alexu on Open Thread: May 2009 · 2009-05-02T04:10:54.569Z · score: 3 (3 votes) · LW · GW

I get that you're being sarcastic, but I'm not sure what you're driving at.

Comment by alexu on Open Thread: May 2009 · 2009-05-02T03:37:51.424Z · score: -1 (7 votes) · LW · GW

However, if the holodeck hypothesis is true, then someone outside the simulation might decide to be nice to me, so the probability that it will win is more like 10^-3.

Um, what?

Comment by alexu on Open Thread: May 2009 · 2009-05-02T02:56:06.351Z · score: -9 (9 votes) · LW · GW

But that's pretty much what LW is, no? I've long suspected that "rationality," as discussed here, was a bit of a ruse designed to insinuate a (misleading) necessary connection between being rational and supporting transhumanist ideals.

Comment by alexu on Theism, Wednesday, and Not Being Adopted · 2009-05-01T18:39:46.320Z · score: 0 (4 votes) · LW · GW

Your conception of "theism" -- a tremendously broad concept -- is laughably caricatured and narrow, and it pollutes whatever argument you're trying to make: absolutely none of the logic in the above post follows in the way you think it does.

Comment by alexu on Theism, Wednesday, and Not Being Adopted · 2009-05-01T18:36:01.436Z · score: 2 (2 votes) · LW · GW

Discounting an argument because of the person making it is pretty much the textbook definition of ad hominem fallacy.

Also, it should go without saying that being a theist doesn't automatically mean one believes in a loving and all-powerful god watching over us. And anyway, I still don't follow the logic that being a theist means one can't make sensible decisions about the Singularity (insofar as one can say there are "sensible decisions" to be made about something that's basically a sci-fi construct at this point).

Comment by alexu on Theism, Wednesday, and Not Being Adopted · 2009-05-01T18:32:39.508Z · score: 1 (3 votes) · LW · GW

God's non-existence isn't predicated on any positive evidence for the proposition, but on lack of any evidence whatsoever, which was just as lacking in previous centuries as it is today.

Anyway, a list of Nobel Prize winners in the sciences is going to have a substantial number of theists on it (probably a majority).

Comment by alexu on Theism, Wednesday, and Not Being Adopted · 2009-05-01T16:25:33.646Z · score: 1 (7 votes) · LW · GW

You've never heard of the ad hominem fallacy, I take it?

Comment by alexu on Theism, Wednesday, and Not Being Adopted · 2009-05-01T16:20:42.246Z · score: 2 (6 votes) · LW · GW

You do realize that on any list of historically significant "geniuses," the majority are going to be theists, right? I'm sure it must be nice to pat yourself on the back for being "smarter" than people like Goethe, Thomas Aquinas, and Kierkegaard, but that would seem to be a reductio ad absurdum against the use of theism as an automatic disqualifier for "smartness," to my mind.

Comment by alexu on Generalizing From One Example · 2009-04-30T13:53:23.579Z · score: 6 (8 votes) · LW · GW

Isn't there an equally well-known bias toward thinking we'll react differently to future events (or behave differently) than most people? That is, we observe that most people don't become happier when they become rich, but we convince ourselves that we're "different" enough that we nonetheless will? I think Dan Gilbert wrote pretty extensively on this in one of those recent "happiness studies" books. Anyway, it seems like there's an obvious tension between the two tendencies.

Comment by alexu on Theism, Wednesday, and Not Being Adopted · 2009-04-29T15:43:28.536Z · score: 1 (1 votes) · LW · GW

Of course, different worldviews may be qualitatively very different, but the point I'm making is that our personal reasons for adopting one over the other aren't all that different. My reasons for believing various scientific findings have much more to do with the sociology of my upbringing and current environment than with the actual truth or falsity of those findings. I did some lab experiments in high school and college, but to extrapolate from those personal verifications to the truth of all scientific findings is to make quite an inductive leap.

Comment by alexu on Theism, Wednesday, and Not Being Adopted · 2009-04-29T15:34:22.581Z · score: 1 (1 votes) · LW · GW

I'm not sure those categories are as meaningful as you think. How many scientific findings are you capable of verifying personally, right now? And believing you're capable of verifying them, "in principle," is quite different altogether...

Comment by alexu on Theism, Wednesday, and Not Being Adopted · 2009-04-29T15:14:55.131Z · score: 5 (5 votes) · LW · GW

Right. The idea that we as individuals arrive at our scientific beliefs via perfect rationality is a fiction. It's good to keep in mind that our scientific beliefs are a product of a particular social network -- we believe things largely because people and institutions we trust believe those things. The difference between being a Mormon and being a scientific materialist is less a qualitative difference (i.e., one person is rational, the other is not) than one of degree, circumstance, and where you place your faith.

Comment by alexu on Theism, Wednesday, and Not Being Adopted · 2009-04-28T21:30:18.347Z · score: 7 (7 votes) · LW · GW

A problem I have with the LW community is this background assumption that infinite life somehow equals infinite utility, that living forever is clearly the rational goal, and that anyone (the vast majority of people, it seems) who doesn't express any particular zeal for this notion is deluded, irrational, or under religion's spell. A long, healthy life is certainly desirable to most people, but I think there are good, irreligious, perfectly sensible reasons for not placing any great value on immortality or living to see the distant future.

Comment by alexu on Theism, Wednesday, and Not Being Adopted · 2009-04-28T21:24:19.347Z · score: 1 (1 votes) · LW · GW

Pretty doubtful, especially controlling for IQ and education...

Comment by alexu on Theism, Wednesday, and Not Being Adopted · 2009-04-27T21:33:12.226Z · score: 14 (14 votes) · LW · GW

This is a great post because it shows just how hard one has to stretch the meaning of "win" to find a way in which atheism always "wins." In the example, it seems that Wednesday "wins" by remaining a Mormon, unless she just happens to place some kind of high personal value on metaphysical truth that can only be satisfied by holding the epistemically correct belief. There's no reason why that should be true for everyone, though -- there's a pretty strong case both for not caring at all about these questions, and for accepting one's "default" view if it's too costly to shed. Say Wednesday never becomes a philosopher, but instead goes into business, or becomes a journalist, or a doctor. It's difficult to imagine how the "less wrong" position of atheism would help her "win" in any of these endeavors, and, in all likelihood, the practical costs incurred by deconverting would swamp any marginal gains she'd get from changing her metaphysical stance on God.

I think people on LW are very hesitant to admit that their strong attachment to "true" metaphysical beliefs may have nothing to do with "winning," but rather, could just be an idiosyncratic personal preference (which is perfectly OK).

Comment by alexu on Well-Kept Gardens Die By Pacifism · 2009-04-21T15:04:56.971Z · score: 0 (12 votes) · LW · GW

Are you so confident in your perfect, unerring rationality that you'll consider that particular proposition completely settled and beyond questioning? I'm about as certain that there is no God as one can get, but that certainty is still less than 100%, as it is for virtually all things I believe or know. Part of maintaining a rational outlook toward life, I'd think, would be keeping an attitude of lingering doubt about even your most cherished and long-held beliefs.

Comment by alexu on Well-Kept Gardens Die By Pacifism · 2009-04-21T13:21:55.741Z · score: 13 (13 votes) · LW · GW

The site is about rationality, not dogma -- I think. Posts should be judged on the strength and clarity of their ideas, not the beliefs of the individual posters who espouse them. To categorically exclude an entire class of people -- some of whom are very good rationalists and thinkers -- simply because they don't subscribe to some LW party line is not only short-sighted but, perversely, seems to run entirely counter to the spirit of a site devoted to rationality.

The consequences, I imagine, would be less interesting and less broad discussion, with a constricting of perspective and a tendency to attract the same fairly narrow range of people who want to talk about the same fairly narrow range of topics. It will select not for good rationalists per se, but for some mix of people who overly fancy themselves good rationalists, as well as the standard transhumanism/Singularity crowd that's here because of EY.

Comment by alexu on Well-Kept Gardens Die By Pacifism · 2009-04-21T13:10:09.425Z · score: -7 (17 votes) · LW · GW

"Still, we can agree that Aumann is not on board with the programme..."

What on earth are you talking about? A legendary rationalist is "not on board with the programme" here at a website ostensibly devoted to the discussion of rationality because he might be a theist? Get a grip. There is no such "programme" that would exclude him.

The site would be helped most not by categorically excluding theists, but by culling out all the blinkered and despicable cult-like elements that seem to worm their way persistently into the manner of speaking around here.

Comment by alexu on Well-Kept Gardens Die By Pacifism · 2009-04-21T13:03:15.357Z · score: -8 (12 votes) · LW · GW

Outreach? For someone who seems so avowedly anti-religious, you seem very eager to appropriate all the trappings of classical, unthinking religion. I'm fine discussing rationality here, but talk of proselytizing makes me nauseous.

Comment by alexu on Well-Kept Gardens Die By Pacifism · 2009-04-21T12:59:50.120Z · score: -5 (17 votes) · LW · GW

This is one of the more asinine things I've seen on here. There are many, many brilliant people who happen to be theists, and to categorically exclude their contributions and viewpoints would be doing the community a grave disservice. I'm an atheist myself, but I've never thought for a second that "God doesn't exist" is any kind of fundamental, unassailable axiom of rationality. It's not.

Comment by alexu on The Sin of Underconfidence · 2009-04-20T16:32:59.214Z · score: 1 (1 votes) · LW · GW

How is it unfair to him in any way? He's free to choose whether to debate or not debate you; I doubt any reasonable person would be offended by the mere contemplation of a future debate. And any sort of advantage or disadvantage that might be gained or lost by "tipping him off" could only be of the most trivial sort, the kind any truth-seeking person should best ignore. All this does is make it a bit difficult to talk about the actual substance and ideas underlying the debate, which seems to me the most important stuff anyway.

Comment by alexu on The Sin of Underconfidence · 2009-04-20T14:40:00.181Z · score: 3 (3 votes) · LW · GW

I agree, but the anthropic principle has always seemed like a bit of a cheat -- an explanation that really isn't much of an explanation at all.

Comment by alexu on The Sin of Underconfidence · 2009-04-20T14:28:31.766Z · score: 3 (7 votes) · LW · GW

Can someone explain why we can't name the theist in question, other than sheer silliness?

Comment by alexu on Of Gender and Rationality · 2009-04-16T22:31:00.051Z · score: 8 (12 votes) · LW · GW

1). There is a lot of, for want of a better term, "mental masturbation" around here: arguing for the sake of arguing, debating insignificant points, flashy but ultimately useless displays of intellect etc. Men tend to enjoy this sort of thing much more than women. Perhaps the female equivalent would be "social masturbation" -- endless gossiping about other people's trivia.

2). There's a major bias toward discussing math and science topics on here, and objective rather than subjective experience. Rationality, as a meta-construct, arguably isn't necessarily limited to these domains. I don't see why it can't be applied to equally good effect to literature and the humanities, art, interpersonal relationships, etc. Broaden your conversations to include some more of these topics (but, of course, with the same characteristic rational approach) and you may win over more female participants.

Comment by alexu on Bayesians vs. Barbarians · 2009-04-15T14:32:00.038Z · score: -6 (12 votes) · LW · GW

1). The post isn't directly about the singularity (though he does bring up AI); nonetheless, he writes the same way about it. The point stands.

2). The post may not be directly about the Singularity, but in some sense, isn't everything Eliezer writes about the Singularity? It's disingenuous to say there's no connection between the "Art of rationality" and the Singularity/AI as far as he's concerned. A cynic might say Eliezer's ulterior motive in propounding "rationalism" is simply to garner more Singularity/AI supporters, even if there's no necessary connection between the two.

3). I'm not upset by the topic per se. I mostly just think it's a ludicrous way to be writing. And with one of Eliezer's avowed goals being to recruit more followers (even expressing comfort with the idea of being a cult-leader), I could see this sort of rhetoric laying the foundation for something much scarier and more serious down the line. What happens 30 years from now, if some group thinks it's on the verge of activating the first super-intelligent AI, but the "anti-rationalists" in Congress and around the country want to pass laws restricting it? At such a (seemingly) momentous point in history, what action could that justify taking? You can see where this is heading.

Comment by alexu on Bayesians vs. Barbarians · 2009-04-15T01:37:54.696Z · score: -11 (21 votes) · LW · GW

Fighting wars and having to do nasty things? Please. The martial language is absurd for a site that's basically a gathering spot for sci-fi fans and math enthusiasts. And yes, Eliezer's overheated, apocalyptic diatribe bears a striking resemblance to the kind of rhetoric employed by cults and fringe groups of all stripes, not least among them the fundamentalist Christians so reviled on here. It's exactly this sort of thing that prevents the Singularity from being seen as anything but a quasi-religious "rapture for nerds" by most educated people.

Comment by alexu on Bayesians vs. Barbarians · 2009-04-15T00:31:56.029Z · score: -35 (37 votes) · LW · GW

This reads like a précis of Mein Kampf for the next generation. Well done!

Comment by alexu on Declare your signaling and hidden agendas · 2009-04-14T13:03:23.248Z · score: 3 (3 votes) · LW · GW

"Declare your hidden agendas" is somewhat of an oxymoron -- obviously anyone with a true hidden agenda isn't going to declare it. Seems like this idea of disclaimers in front of LW posts is a non-starter.

Comment by alexu on GroupThink, Theism ... and the Wiki · 2009-04-14T01:29:48.493Z · score: 1 (1 votes) · LW · GW

Your best guesses seem pretty close to how the terms are used on here; I think the community at large should be wary of appropriating terms that already have long histories in certain fields.

Comment by alexu on GroupThink, Theism ... and the Wiki · 2009-04-13T21:04:49.909Z · score: 0 (0 votes) · LW · GW

Be careful about how you define those terms, as they may be idiosyncratic. "Rationalism" and "Empiricism" have long philosophical histories, and are typically seen as parallel, not-quite-rival schools of thought, with the rationalists striving to root all knowledge in a priori rational inquiry (Descartes' Meditations is the paradigm example). I'm not sure it's wise to flip that on its head by redefining such a common, well-denoted term.

Comment by alexu on Persuasiveness vs Soundness · 2009-04-13T14:36:18.670Z · score: 3 (5 votes) · LW · GW

I'm certainly not against using chunked concepts on here per se. But I think associating this community too closely with sci-fi/fantasy tropes could have deleterious consequences in the long run, as far as attracting diverse viewpoints and selling the ideas to people who aren't already predisposed to buying them. If Eliezer really wanted to proselytize by poeticizing, he should turn LW into the most hyper-rational, successful PUA community on the Internet, rather than the Star Wars-esque roleplaying game it seems to want to become.

Comment by alexu on Persuasiveness vs Soundness · 2009-04-13T14:08:41.312Z · score: 3 (5 votes) · LW · GW

What the hell are the "dark arts"? Could we quit playing super-secret dress-up society around here for one day and just speak in plain English, using terms with known meanings?

Comment by alexu on Marketing rationalism · 2009-04-13T04:11:35.462Z · score: 0 (6 votes) · LW · GW

People are irrational largely because they're stupid. I have yet to be convinced that "rationality" is something entirely distinct from intelligence itself, such that you can appeal to someone to become significantly more "rational" without simultaneously effecting the seemingly tougher feat of boosting IQ a standard deviation or so.

Comment by alexu on The Unfinished Mystery of the Shangri-La Diet · 2009-04-10T22:01:14.118Z · score: 0 (10 votes) · LW · GW

I won't dispute this. For some people, a calculated decision to remain overweight in today's world in order to focus on other things may be the best course of action.

Alternatively, if losing weight is that important to you, you can alter your environment so "today's world" doesn't make it so tempting to eat crappy foods. Your body can be screaming out "eat more food!" all it wants, but if you're living in a cabin in some remote corner of Alaska, there's only so much damage that can do.

Comment by alexu on Akrasia and Shangri-La · 2009-04-10T21:58:39.057Z · score: -5 (25 votes) · LW · GW

Yes, and the 40% or whatever of America that's obese are all "immune" to exercise. That's surely it.

Funny how we were just discussing on LW people who self-handicap and make excuses in order to justify their failures. Might a bit of that be going on here as well?

Comment by alexu on The Unfinished Mystery of the Shangri-La Diet · 2009-04-10T21:56:02.496Z · score: 0 (6 votes) · LW · GW

Oh come on. If Eliezer eats fewer calories than he expends, he's not going to die of hunger. I fully buy that will-power is a legitimate issue, but bringing up extreme cases like this to make your point doesn't enhance the conversation.

Comment by alexu on The Unfinished Mystery of the Shangri-La Diet · 2009-04-10T21:45:20.536Z · score: 0 (6 votes) · LW · GW

I'm sure you've seen the psych research suggesting people have a finite amount of "willpower" they can exercise at a given time. It probably does make sense for some people to worry more about hard thinking (or other endeavors) than about staying in top shape.

Comment by alexu on The Unfinished Mystery of the Shangri-La Diet · 2009-04-10T21:43:09.148Z · score: -1 (7 votes) · LW · GW

Of course not, but you've contrived an odd corner-case that, in fact, doesn't exist in reality. I'm not sure what that goes to show.

Comment by alexu on The Unfinished Mystery of the Shangri-La Diet · 2009-04-10T21:40:38.908Z · score: 2 (6 votes) · LW · GW

Are you saying it didn't work because it didn't curb your hunger or your desire for other, less healthy foods? Or it didn't work because you stuck to the diet of healthy foods and gained weight nonetheless? The latter seems hard to believe, though I suppose it's technically possible to accumulate an excess of calories via turkey and bananas...

Comment by alexu on The Unfinished Mystery of the Shangri-La Diet · 2009-04-10T21:37:16.388Z · score: 0 (6 votes) · LW · GW

So, maybe staying thin requires Herculean effort for some. Why turn your back on that particular challenge? Elsewhere you seem to take a lot of pride in your determination to "save the world," which seems like no small feat. Don't try to lose weight -- lose weight!

Comment by alexu on The Unfinished Mystery of the Shangri-La Diet · 2009-04-10T21:31:35.549Z · score: 0 (6 votes) · LW · GW

Diet (singular) does work in the sense of consistently, indefinitely eating healthier foods.

Comment by alexu on The Unfinished Mystery of the Shangri-La Diet · 2009-04-10T21:22:12.869Z · score: 1 (7 votes) · LW · GW

Yeah, and I realize that simply recommending "diet and exercise" is a bit too pat. Getting oneself into virtuous cycles, with extremely short-term rewards and consequences, is the most effective meta-tactic I know. There are various ways to do this; the key is just to render willpower moot.

Comment by alexu on How theism works · 2009-04-10T21:11:47.121Z · score: 0 (4 votes) · LW · GW

You raise an interesting point I've considered before in relation to Bostrom's simulation argument: if we're living in a simulation, wouldn't that effectively make God real? I can't see a way to deny this without some linguistic legerdemain. It seems like one's probability assignment to the proposition "God is real" should be lower-bounded by one's probability assignment to the proposition "we're living in a simulation."

Comment by alexu on The Unfinished Mystery of the Shangri-La Diet · 2009-04-10T21:05:31.100Z · score: -1 (13 votes) · LW · GW

It seems like you're questioning the value of diet and exercise -- almost as if they don't work for all people, or they only work for limited amounts of time. This is, of course, untrue, and I know you know this. The real key is to put yourself into a virtuous cycle, where the rewards (or negative consequences) of diet and exercise make themselves apparent to you every day, rather than months down the line, effectively circumventing akrasia.