Comments

Comment by non-expert on Politics Discussion Thread January 2013 · 2013-02-10T18:52:54.085Z · LW · GW

Interesting, so you are dividing morality into the impact on immigrants and the idea that they should be allowed to join us as a moral right, with the former included in your analysis and the latter not.

Putting aside positions, from a practical perspective it seems that drawing that line will remain difficult, because "impact on immigrants" likely informs the very moral arguments I think you're trying to avoid. In other words, putting that issue (the effect on immigrants) within the cost/benefit analysis requires some of the same subjective considerations that plague the moral argument, both in terms of the difficulty of resolving it with certainty and the goal of avoiding morality.

Regardless, it seems the horse has been dead for hours (my fault!). Thanks for engaging with me.

Comment by non-expert on Politics Discussion Thread January 2013 · 2013-02-10T17:01:35.751Z · LW · GW

Nope, I'm just asking why you think the moral argument should be ignored, and why that position is obvious. We're talking about a group of humans and what laws and regulations will apply to their lives, likely changing them radically. These decisions will also affect their relatives, who may or may not be in similar positions themselves. When legislating about persons, it seems there is always some relevance to how the laws will affect those people's lives, even if broader considerations (their value to us, their cost to us as a country) are also relevant.

To be clear, I'm NOT saying you're wrong. I'm asking why you think you're right, particularly since you find it so obvious.

EDIT: I appreciate that I jumped in mid-conversation and asked a question which is now a chain, and that this might come off as odd to you, so sorry. You asked about my point -- fair question. I'm not sure I really have one other than understanding your point of view. Perhaps silly, but I thought you made an interesting point and wanted to see how you thought through the issue before you made it. A "non-expert" can't tell anyone they're wrong; he can only try to learn why others think they are right :).

Comment by non-expert on By Which It May Be Judged · 2013-02-10T03:37:36.202Z · LW · GW

if we confess that 'right' lives in a world of physics and logic - because everything lives in a world of physics and logic - then we have to translate 'right' into those terms somehow.

A different perspective I'd like people's thoughts on: is it more accurate to say that everything WE KNOW lives in a world of physics and logic, and thus that translating 'right' into those terms is correct assuming right and wrong (fairness, etc.) are defined within the bounds of what we know?

I'm wondering if you would agree that you're making an implicit philosophical argument in the quoted language -- namely, that the knowledge necessary for right/wrong (or anything else) is within human comprehension. To say it differently, by ignoring philosophical questions (e.g., who am I and what is the world, among others), you are effectively saying those questions and their potential answers are irrelevant to the idea of right/wrong.

If you agree, then that position, though most definitely reasonable, cannot be proven within the standards set by rational thought. Doesn't the presence of that uncertainty necessitate considering it as a possibility, and how do you weigh that uncertainty against the assumption that there is none?

To be clear, this is not a criticism. It is an observation I think is reasonable, but I'm interested to see how you would respond to it.

Comment by non-expert on Politics Discussion Thread January 2013 · 2013-02-09T20:15:50.968Z · LW · GW

Look, there is no doubt an equivalence in your method, in that "they should join us" is put on the back burner along with "we should penalize them." I'm simply highlighting this point.

Or to put it another way, the moral statement I'm trying to make is that the moral value of absolutist moral considerations is less than utilitarian concerns in regards to costs/benefits. I don't actually care about moral arguments for or against immigration that aren't consequentalist.

In limiting the "consequentialist" argument to the "home country's" benefits and costs, you've by default given credence to the idea that "they should be penalized," in that you're willing to avoid penalizing them if they add value to your country -- another way of looking at it is that those who want the immigrants to "join us" aren't helped in any way by the fact that the opposite moral argument was also ignored. You've softened your statement now by using "moral value....is less," but you're actually going further than that -- you're saying that the utilitarian concerns on costs/benefits are SO GREAT relative to the moral issues that the moral issues should be ignored completely (or that's how your solution plays out). This is a bold statement, irrespective of its merits. How else would you interpret your statement?:

Immigration would be much better if we approached the issue of "How much do immigrants cost us vs how much do we benefit from them" and made laws in light of this, instead of approaching it from the moral difference between "This is our home and we shouldn't let strangers in" or "Freedom means allowing anyone to join us".

Your point only works if you completely ignore the moral argument. Once it matters even a little, the luxuries offered by cost/benefit analysis are thrown out the window, because you now have a subjective consideration to incorporate that makes choices difficult. Again, I'm just highlighting the consequences of your argument; I don't really have an opinion on it.

Part of the problem with politics is that we just say things without thinking about what they mean, since our focus is more on being right and presuming certainty is attainable than on understanding the sources and consequences of various political arguments and appreciating the uncertainty that is unavoidable with any governance regime (or so I would argue).

Comment by non-expert on Politics Discussion Thread January 2013 · 2013-02-09T02:55:54.861Z · LW · GW

I think you're implicitly making a moral statement (putting aside whether it's "correct"). Your focus on "costs to us and how much we benefit" means we downplay or eliminate any consideration of the moral question. However, ignoring the moral question has the same effect as losing the moral argument to "this is our home and we shouldn't let strangers in" -- in both cases the moral argument for "joining us" is treated as irrelevant. I'm not making an argument, just an observation I think is relevant when considering the issue.

Comment by non-expert on Avoiding Your Belief's Real Weak Points · 2013-02-07T17:07:31.775Z · LW · GW

DeFranker -- many thanks for taking the time, very helpful.

I spent last night thinking about this, and now I understand your (LW's) points, and my own, better. To start, I think the ideas of epistemic rationality and instrumental rationality are unassailable as ideas -- there are few things that make as much sense as what rationality is trying to do, in the abstract.

But when we say "rationality" is a good idea, I want to understand two fundamental things: in what context does rationality apply, and, where it applies, what methodologies, if any, exist to actually practice it? I don't presuppose any answers to the above -- at the same time, I don't want to "practice rationality" before I understand how those two questions are answered or dealt with (I appreciate it's not your responsibility to answer them; I'm just expressing them as things I'm considering).

"Weaknesses" of rationality is not an appropriate question -- I now understand the visceral reaction -- However, by putting rationality in context, one can better understand its usefulness from a practical perspective. Any lack of usefulness, or lack of applicability would be the "weakness/criticism" I was asking about, but upon reflection, I get to the same place by talking about context.

Let me step back a bit to explain why I think these questions are relevant. We all know the phrase "context matters" in the abstract -- I would argue that epistemic rationality, in the abstract, is relevant for instrumental rationality because if our model of the world is incorrect, the manner in which we choose to reach our goals in that world will be affected. All I'm really saying here is that "context matters." Now, while most agree that context matters with respect to decision making, there's an open question as to what context actually matters. So there is always a potential debate regarding whether the world is understood well enough, and to the extent necessary, to successfully practice instrumental rationality -- this is clearly a relative/subjective determination.

With that in mind, any attempt to apply instrumental rationality would require some thought about epistemic rationality, and about whether my map is sufficient to make a decision. Does rationality, as it is currently practiced, offer any guidance on this? Let's pretend the answer is no -- that's fine, but then that's a potential "flaw" in rationality, or a hole where rationality alone does not help with a relevant open question.

I'm not trying to knock rationality, but I'm not willing to coddle it and pretend it's all there is to know if that comes at the cost of minimizing knowledge.

Comment by non-expert on Avoiding Your Belief's Real Weak Points · 2013-02-07T16:36:40.880Z · LW · GW

Great, thanks, this is helpful. Is the answer to the above questions, as far as you practice rationality, the same for instrumental rationality? It is an idea, but with no real methodology? In my mind, it would seem decision theory could be a methodology by which someone could practice instrumental rationality. To the extent it is, the above questions remain relevant (only in the sense that they should be considered).

I now have an appreciation of your point -- I can definitely see how the question "what are the flaws with epistemic rationality" could be viewed as a meaningless question. I was thinking about epistemic rationality as more than just an idea -- as an idea WITH a methodology. Clearly the idea is unassailable (in my mind anyway), but methodologies (whether for rationality or some other purpose) could at least in concept have flaws, or perhaps fail to apply universally -- it was this that I was asking about.

Interestingly, your response raises a different question. If epistemic rationality is an idea, and not a methodology, then rationality (as it is discussed here) leaves open the possibility that there could be a methodology that helps with practicing epistemic rationality (i.e., one consistent with the IDEA of rationality, but by which you can put it into practice).

As I think most appreciate, ideas (not necessarily with respect to rationality, but generally) suffer from the fact that they are general and don't give a user a sense of "what to do" -- obviously, getting your map to match reality is not an easy task, so methodologies for epistemic rationality in the abstract could be helpful for putting the idea into practice.

This is particularly important if you're practicing instrumental rationality -- this type of rationality is practiced "in the world," so having an accurate (or accurate enough) model seems important to ensure that the manner in which you practice instrumental rationality makes sense.

Thus, a possible shortcoming of instrumental rationality could be that it depends on epistemic rationality; because there isn't a clear answer to the question of "what is real," instrumental rationality is limited by the extent to which our beliefs regarding "what is real" are actually correct. You could say that instrumental rationality, depending on the circumstances, does not require a COMPLETE understanding of the world, and so my observation, even if fair, must be applied on a sliding scale.

Comment by non-expert on Avoiding Your Belief's Real Weak Points · 2013-02-07T04:42:48.945Z · LW · GW

No -- I'm not saying your goals ought to be anything, and I'm not trying to win an argument, but I appreciate you will interpret my motives as you see appropriate.

Let me try this differently -- there is an idea on LW that rationality is a "good" way to go about thinking [NOTE: correct me if I'm wrong]. By rationality, I mean exactly what is listed here:

Epistemic rationality: believing, and updating on evidence, so as to systematically improve the correspondence between your map and the territory. The art of obtaining beliefs that correspond to reality as closely as possible. This correspondence is commonly termed "truth" or "accuracy", and we're happy to call it that.

Instrumental rationality: achieving your values. Not necessarily "your values" in the sense of being selfish values or unshared values: "your values" means anything you care about. The art of choosing actions that steer the future toward outcomes ranked higher in your preferences. On LW we sometimes refer to this as "winning".

My question relates to putting these two ideas/points into context, with more of a focus on epistemic rationality (because it seems you need to know the world (i.e., the context) in which you're making decisions before you apply instrumental rationality). Is epistemic rationality practiced through a methodology (probability theory, decision theory, something else?), or is the description above just an idea to be applied generically, e.g., by taking cognitive biases into account? If it's just a description of an idea, does that mean you cannot really "apply" it, and instead just try to keep the general tenets in mind when thinking about things?

If there's a methodology (or several) to be used to practice epistemic rationality, do those methodologies help in understanding all aspects of "reality" (again, keying off EY's definition)? [NOTE: It seems reality, if it could be understood, would mean the broadest understanding of who we are, why we are here, and how our world works day to day. Is LW using a different definition of reality?] If more than one methodology could apply depending on the situation, how do you distinguish between those methodologies?

If the "chosen" methodology(ies) for epistemic rationality is NOT appropriate for certain decisions, what alternatives are to be used? Also, how do you describe the distinction between the decisions for which the chosen methodology(ies) works and those decisions for which it does not?

To be clear, I'm asking in order to get context for how rationality fits within the larger picture of the universe, including all of its uncertainty. I realize you may not have answers to all these questions and that there may not be consensus about any of it -- that's more than fine, since all I'm looking for is responses; I don't care what they actually are. For example, you or others may make certain assumptions on some of the questions as necessary simplifications -- all of that is fine. I just think the questions need to be considered before you can credibly apply (or seek to apply) rationality, and I want to see if you've thought about them and, if so, how you've handled them. If I'm being unreasonable or missing something with my questions, so be it, but I'd be interested in your thoughts.

Comment by non-expert on Avoiding Your Belief's Real Weak Points · 2013-02-06T20:52:12.416Z · LW · GW

DeFranker, thanks for the detailed note -- I take your points, they are reasonable and fair, but I want to share a different perspective.

The problem I'm having is that I'm not actually presenting any arguments as "correct" or saying any of you are wrong. Making an observation/statement for the sake of discussion does not mean there is a conclusory judgment attached to it. Now, to the extent you say I need a better understanding to make dissenting points, fair; but all I want to know is what the weakest arguments against rationality are, and what relevance those weaknesses, if any, have to the determination of how much time and energy to spend on rational choice theory as opposed to another theory, or no theory at all. This seems particularly appropriate with respect to THIS article, which asks that believers in a theory question the weakest positions of that theory, whether in application or otherwise. This is an analysis for believers to perform. Again, I'm not saying you don't have strong arguments for your weaker positions, or that you even have weak positions -- I'm asking how those who follow rationality have approached this question and how they've disposed of it.

It would seem those who follow a theory have the greatest responsibility to consider the strongest arguments against that very theory (which is exactly why EY posted the article re: Judaism). Why is it so inappropriate to hold rationality to the same standard? I'm not presupposing an answer; I just want to know what YOUR answer is so I can better understand your point of view. Perhaps your answer is "it's obvious this theory is correct," without more. I would be fine with that, simply because you've answered the question -- you've given me your perspective. Sure, I may ask additional questions, but the goal is not to be right or win some online war; the goal is to learn (my effing name is "non-expert" -- you don't have to worry about me telling you that you're wrong, but I may question your logic/reasoning/etc.). I cannot learn unless I understand the perspectives of those who disagree with me.

And regarding the quoted text -- yes, while I appreciate that I did not follow the "culture" or norms of this site, I had looked at this site as a place for substantive answers/discussions. I'm not making a fully general counterargument -- I'm simply pointing out that attacking my jokes/jabs allows you to avoid my question. Again, to be clear, I didn't ask the question to prove you're wrong; I asked the question to hear your answer!

Comment by non-expert on Avoiding Your Belief's Real Weak Points · 2013-02-06T20:25:04.752Z · LW · GW

How has Rationality, as a universal (or near-universal) theory of decision making, confronted its most painful weaknesses? What are rationality's weak points? The broader a theory is claimed to be, the more important it seems to really test that theory's weaknesses -- that is why I assume you bring up religion, but the same standard should apply to rationality. This is not a cute question from a religious person, but an intellectual inquiry from a person hoping to learn. In honor of the granddaddy of cognitive biases, confirmation bias, doesn't rational choice theory need to be vetted?

HungryTurtle makes an attempt to get at this question, but he gets too far into the weeds -- this allowed LW to simply compare the "cons" of religion with the "cons" of rationality, which is a silly inquiry. I don't care how the weaknesses of rationality compare to the weaknesses of Judaism, because rational theory, if claimed to be universally applicable with no weaknesses, should be tested on that claim alone, and not on its weaknesses relative to some other theory.

NOTE: Re-posting without the offending language in the hope that I don't need to create a new name. Looks like I lost on my instrumental rationality point; I got downvoted enough to be restricted. On the bright side, I am learning to admit I'm wrong (I was wrong to misread whether I'd offend LW, which prevented me from engaging with others on the substantive points I'm trying to learn more about).

Comment by non-expert on Avoiding Your Belief's Real Weak Points · 2013-02-06T19:58:11.391Z · LW · GW

This is a response to theOtherDave -- I can't respond to threads anymore! You guys win! Crush dissent based on superficial factors that "automatically result in downvotes" and thus ignore criticism! Foolproof!

Is that [understanding reality] the goal? I'm not sure it is. As above, I neither agree that understanding reality is a singularly important terminal goal, nor that finding the "best theory" for achieving my goals is a particularly high-priority instrumental goal.

OK, sorry to put words in your mouth -- what is your goal then? Is it not fair to say the goal is to "understand reality" and "achieve your goals"? I'm ignoring the second because it's personal -- the first goes to a normative understanding of reality, which presumably applies equally to each of us.

Perhaps your definition is different, but my understanding is that epistemic rationality is focused on understanding reality, and that it uses rational choice theory as a means to understand that reality.

Comment by non-expert on Avoiding Your Belief's Real Weak Points · 2013-02-06T19:44:46.031Z · LW · GW

Thanks, EY. I am asking a real question, in that I want to know what people think of it.

As a person who does not think rationality is as useful or as universal as people on this site do, I am at a disadvantage in that I'm in the minority here; however, I'm still posting/reading to question myself by engaging with those I disagree with. I seek the community's perspective, not necessarily to believe it or label it correct/wrong, but simply to understand it. My personal experience (with this name and old ones) has been that people generally do not respond to viewpoints that are contrary to the conventional thought -- this is problematic, because this community is best positioned to defend against claimed or real weaknesses regarding rationality. Looking at it another way, if I believe that rationality has serious flaws, I need to be able to defend myself against YOUR best arguments, but I can only do that if someone engages with me so I understand those arguments first.

The point of my post was to ask a serious question and poke you guys with a stick, hoping the poke would elicit a response to the question -- frankly, it worked, and now I will swallow, learn from, and hopefully respond to the various comments. So long as negative points don't prevent me from reading and posting, I couldn't care less about what points I have -- I also note that I was clear about my intention to goad an answer.

Perhaps you disagree with my methods, but since my goal was to hear multiple perspectives, and I got more than I usually do, I see it as a win for instrumental rationality. And if my follow-ups suggest I'm not a troll/general a**, perhaps I won't have to use dirty tricks going forward!

Comment by non-expert on Avoiding Your Belief's Real Weak Points · 2013-02-06T19:08:21.686Z · LW · GW

Thanks. I don't mean any weaknesses in particular; the idea laid out by EY was to confront your greatest weaknesses, so that is something for those who follow the theory to look into -- I'm just exploring :).

I guess what I'm not following is this idea of "choosing" an approach. Implicit in your answer, I think, is the idea that there is a "best" approach that must be discovered among the various theories on living life -- why is the existence of a theory that is the "best" indicative that it is universally applicable? The goal is to "understand reality," not to choose a methodology that is the "best" under the assumption that the "best" theory can then be followed universally.

Put differently, to choose rationality as a universal theory notwithstanding its flaws, you're saying more than "it's the best of all the available theories" -- I think you must also believe that the idea of having a set theory to guide life, notwithstanding its flaws, is the best way to go about understanding reality. What is the basis for the belief in the second prong?

Saying "well i have to make a decision," so i need to find the best theory doesn't cut it. It is clear there are times we must make a decision, but you are left with a similar question -- why are humans entitled to know what to do simply because they need to make a decision? Perhaps in "reality" is there is no answer (or no answer within the limits of human comprehension) -- it is true you're stuck not knowing what to do but you surely have a better view of reality (if that is the reality).

The implications of this are important. If you agree that rational choice theory is the "best" of all theories, but also agree that there is (or may be) a distinction between "choosing/applying a set theory" and "understanding reality" to the greatest extent humanly possible, it suggests one would need more than rationality to truly understand reality.

Comment by non-expert on Avoiding Your Belief's Real Weak Points · 2013-02-06T18:02:49.392Z · LW · GW

How has Rationality, as a universal (or near-universal) theory of decision making, confronted its most painful weaknesses? What are rationality's weak points? The broader a theory is claimed to be, the more important it seems to really test that theory's weaknesses -- that is why I assume you bring up religion, but the same standard should apply to rationality. This is not a cute question from a religious person, but an intellectual inquiry from a person hoping to learn. In honor of the granddaddy of cognitive biases, confirmation bias, doesn't rational choice theory need to be vetted?

HungryTurtle makes an attempt to get at this question, but he gets too far into the weeds -- this allowed LW to simply compare the "cons" of religion with the "cons" of rationality, which is a silly inquiry. I don't care how the weaknesses of rationality compare to the weaknesses of Judaism, because rational theory, if claimed to be universally applicable with no weaknesses, should be tested on that claim alone, and not on its weaknesses relative to some other theory.

Please note that negative points on this post, or a failure to respond, will only provide further evidence that LW is guilty of confirmation bias. It's sweet when you get to use cognitive biases against those who try to weed them out. (Yes, I'm trying to goad someone into answering, but only because I really want to know your answer, not because I'm trying to troll.)

Comment by non-expert on Rationality Quotes February 2013 · 2013-02-05T23:30:17.718Z · LW · GW

OK, I wasn't trying to play "gotcha," just answering your question. Good chat; thanks for engaging with me.

Comment by non-expert on Rationality Quotes February 2013 · 2013-02-05T23:19:45.321Z · LW · GW

You suggested that emotion hacking is more of an issue for instrumental rationality and not so much for epistemic rationality. To the extent that is wrong, you're omitting emotion hacking (a subjective factor) from your application of epistemic rationality.

Comment by non-expert on Rationality Quotes February 2013 · 2013-02-05T22:46:08.101Z · LW · GW

Sure. Note that I don't offer this as conclusive or correct, just as something I'm thinking about. Also, let's assume rational choice theory is universally applicable for decision making.

Rational choice theory gives you an equation to use, and all we have to do is fill that equation with the proper inputs, value them correctly, and we get an answer. Obviously this is more difficult in practice, particularly where inputs (as is to be expected) are not easily convertible to probabilities/numbers -- I'm worried this is actually more problematic than we think. Once we have an objective equation as a tool, we may be biased to assume objectivity and truth in our answers, even though that belief is often based on the strength of the starting equation and not on our ability to accurately value and include the appropriate subjective factors. To the extent answering a question becomes difficult, we manufacture "certainty" by ignoring subjectivity or assuming it is less relevant than it is.

Simply put, the belief that we have a good and objective starting point biases us to believe we can and will derive an objectively correct answer, affecting the accuracy with which we fill in the equation.
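To make the worry concrete, here is a toy sketch (a hypothetical decision with invented numbers, not anyone's actual analysis) of how the clean equation and the messy inputs come apart:

```python
# Toy sketch (all numbers invented): the expected-value equation is exact,
# but the conclusion turns entirely on subjective inputs.

def expected_value(outcomes):
    """Standard expected value: sum of probability * utility over outcomes."""
    return sum(p * u for p, u in outcomes)

# Two honest attempts to quantify the same fuzzy decision,
# each listing (probability, utility) for success and failure.
estimate_a = [(0.60, 100), (0.40, -120)]
estimate_b = [(0.50, 100), (0.50, -120)]

print(expected_value(estimate_a))  # 12.0  -> "act"
print(expected_value(estimate_b))  # -10.0 -> "don't act"
# Identical machinery, opposite conclusions, on a 0.10 shift in a
# probability we had no principled way to pin down in the first place.
```

The formal step is the easy, objective-looking part; everything that decides the answer happened when we picked the numbers.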

Comment by non-expert on Rationality Quotes February 2013 · 2013-02-05T21:15:50.561Z · LW · GW

Thanks for the clarification; now I understand.

Going back to the original comment I commented on:

emotion-hacking is mostly an instrumental technique (although it is also epistemically valuable to notice and then stop your brain from flinching away from certain thoughts).

Particularly with your third type of emotion hacking ("hacking your emotional responses to external stimuli"), it seems emotion hacking is vital for epistemic rationality -- I guess that relates to my original point: hacking emotions is at least as important for epistemic rationality as it is for instrumental rationality.

I raised the issue originally because I worry that rationality, to the extent it must value subjective considerations, tends to minimize the importance of those considerations in order to yield a clearer inquiry.

Comment by non-expert on Rationality Quotes February 2013 · 2013-02-05T20:38:31.442Z · LW · GW

I suppose there is a third kind of emotion-hacking, namely hacking your emotional responses to external stimuli.

Isn't this the ONLY kind of emotion-hacking out there? What emotions are expressed irrespective of external stimuli? It seems like a small or insignificant subset.

But it's not as if I can respond to other people's thoughts, even in principle: all I have access to are sounds or images which purport to be correlated to those thoughts in some mysterious way.

The latter two paragraphs above are responding to this. Sorry to throw it back at you, but perhaps I'm misunderstanding the point you were trying to make here? I thought you were questioning the value of considering/responding to others' thoughts, arguing that even if you could, you would need to rely on their words and expressions, which may not be correlated with their "true" state of mind.

Comment by non-expert on Rationality Quotes February 2013 · 2013-02-05T19:26:34.676Z · LW · GW

All emotions are responses to external stimuli, unless your emotions relate only to what is going on in your head, without reference to the outside (i.e. outside your body) world.

I agree you can't respond to others' thoughts unless they express them such that they become "behaviors." Interestingly, the "problem" you have with the sounds or images (or words?) which purport to be correlated to others' thoughts is the exact same issue everyone is having with you (or me).

If we're confident in our own ability to express our thoughts (i.e., the correlation problem is not an issue for you), then how much can we dismiss others' expressions on account of that very same issue?

Comment by non-expert on Rationality Quotes February 2013 · 2013-02-05T16:45:18.420Z · LW · GW

Whose thoughts and whose behaviors? Not disagreeing, just asking.

Comment by non-expert on Dissolving the Question · 2013-02-04T17:54:42.096Z · LW · GW

The best evidence that confirmation bias is real and ever-present is a website of like-minded people that values comments based on those very users' reactions. Perhaps unsurprisingly, those who conform to the conventional thought are rewarded with points. So I guess that while the point system doesn't actually work as a substantive matter, at least we are afforded a constant reminder that confirmation bias is a problem even among those who purport to take it into account.

Of course, my poking fun will only work so long as I don't get so many negative points that I can no longer question the conventional thought (gasp!). What is my limit? I'll make sure to conform just enough to stay on here. :) The worst part is I'm not even trying to troll; I'm trying to listen and question at the same time, which is how I thought I was supposed to learn!

Comment by non-expert on Dissolving the Question · 2013-02-04T17:45:28.865Z · LW · GW

What you're saying is obviously true, but it goes beyond the information available. The question, limited to the facts given, is representative of a larger point, which is the one I'm trying to explain as a general observation, and which is not limited to whether in fact that tree fell and made a noise.

By the way, I never thanked you for our previous back and forth -- it was actually quite helpful, and your last comment in our discussion has kept me thinking for a couple of weeks now; perhaps in a couple more I will respond!

Comment by non-expert on Against Modal Logics · 2013-02-04T17:36:20.983Z · LW · GW

What is the basis for the position that knowledge of the world must come from analytical/probabilistic models? I'm not questioning the "correctness" of your view, only wondering about your basis for it. It seems awfully convenient that a type of model that yields conclusions is in fact the correct one -- put another way, why is the availability of a clear methodology that gives you answers indicative of its universal applicability in attaining knowledge?

Traditional philosophy, as you correctly point out, has failed to bridge its theory to practice -- but perhaps that is a flaw of the users and not of the theory. Rationalists generally believe the use of probabilities is sound methodology, and that the problems regarding decision-making are a flaw of the practitioners. Though I appreciate you likely disagree, perhaps we have the same situation with philosophy: though there are no clear answers, the models of thought it provides could effectively apply in practical situations; it's just that no philosopher has been able to get there.

Comment by non-expert on Dissolving the Question · 2013-02-04T17:22:36.604Z · LW · GW

EDIT: made small edits.

In my opinion, the question is brilliant and its importance is misunderstood, though EY somewhat dances around it.

Whether or not the tree makes a noise is irrelevant once no one can hear it, and thus the tree's being heard is a precondition to knowledge that it has fallen and made noise. The point, then, is that (i) the lack of truth of a statement and (ii) the truth of a statement that cannot be understood are effectively the same thing.

In other words, what is pointless is trying to pin down truths that cannot be conclusively proven within the bounds of human comprehension (e.g., is there free will, what is the meaning of life), because practically speaking you're in the same place you would be if there were no answer -- just arguing amongst those who choose to consider the question in the first place.

Comment by non-expert on Rationality Quotes February 2013 · 2013-02-04T16:52:27.932Z · LW · GW

Emotion-hacking seems far more important for epistemic rationality, since your understanding of the world is the setting in which you use instrumental rationality, and your "lens" (which presumably encompasses your emotions) is the key hurdle (assuming you are otherwise rational) preventing you from achieving the objectivity necessary to form true beliefs about the world.

Comment by non-expert on The Fallacy of Gray · 2013-01-14T18:42:50.438Z · LW · GW

With respect to your example, I can only play with the facts you have given me. In your example, I assumed that which vial has poison could not be known, and that the best information we had was our collective beliefs (which are based on certain factors you listed). I agree with the task at hand as you put it, but the devil is of course in the details.

Which vial contains poison is a fact about the world, and there are a million other contingent facts about the world that go one way or another depending on it. Maybe the air around the vial smells a little different. Maybe it's a different temperature. Maybe the poisoned vial weighs more, or less. All of those contingent facts means that there are different ways I can approach the vials, and if I approach the vials one way I am more likely to live than if I approach the vials a different way.

But as noted above, if we cannot derive the truth, it is just as good as not existing. If the "vial picker" knows the truth beforehand, or is able to derive it, so be it; but immediately before he picks the vial, the Truth, as the vial picker knows it, is of limited value -- he is unsure, and everyone around him thinks he's an idiot. After the fact, everyone's opinion will change in accordance with the results. By creating your own example, you're presupposing (i) that an answer to your question exists AND (ii) that we can derive it -- we don't have that luxury in real life, and even if we know an "answer" exists, we don't know whether the vial picker can accurately pick the appropriate vial based on the information available.

The idea of subjective truth (or subjective reality) doesn't rely solely on the claim that reality doesn't exist; most generally, it is based on the idea that there may be cases where a human cannot derive what is real even where there is some answer. If we cannot derive that reality, the existence of that reality must also be questioned. We of course don't have to worry about these subtleties if the examples we use assume that an answer to the issue exists.

The upshot is that rationality, in my mind, is helpful only to the extent that (i) an answer exists and (ii) it can be derived. If the answers to (i) and (ii) are yes, rationality sounds great. If the answer to (i) is no, or the answer to (i) is yes but (ii) is no, rationality (or any other system) has no purpose other than to give us a false belief that we're going about things in the best way. In such a world, there will be great uncertainty as to the appropriate human course of action.

This is why I'm asking why you are confident the answer to (i) is yes for all issues. You're describing a world that provides a level of certainty such that the rationality model works in all cases -- I'm asking how you know that amount of certainty exists in the world; its convenience is precisely what makes its universal application suspect. As noted in my answer to MugaSofer, perhaps your position is based on assumption/faith without substantiation, which I'm comfortable with as a plausible answer, but I'm not sure that is the basis you are using for the conclusion. (For the record, my personal belief is that any sort of theory or basis for going about our lives requires some type of faith/assumptions, because we cannot have 100% certainty.)

Comment by non-expert on The Fallacy of Gray · 2013-01-14T18:04:03.349Z · LW · GW

We can know what is Right, as long as we define it as "right according to human morals." Those are an objective (if hard to observe) part of reality. If we built an AI that tries to figure those out, then we get an ethical AI - so I would have a hard time calling them "subjective"

I don't dispute the possibility that your conclusion may be correct; I'm wondering on what basis you believe your position to be correct. Put another way, why are moral truths NOT relative? How do you know this? Thinking something can be done is fine (AI, etc.), but without substantiation it introduces a level of faith into the conversation -- I'm comfortable with that as the reason, but I wonder whether you are, or whether you have a different basis for the position.

In my view, moral truths may NOT be relative, but I have no basis on which to know that, so I've chosen to operate as if they are relative, because (i) if moral truths exist but I don't know what they are, I'm in the same position as if they didn't exist or were relative, and (ii) moral truths may not exist. This doesn't mean you don't use morality in your life; it's just that you need to believe, without substantiation, that the morals you subscribe to conform with universal morals, if they exist.

OK, I'll try to search for those EY writings, thanks.

Comment by non-expert on The Fallacy of Gray · 2013-01-14T08:01:43.161Z · LW · GW

What do you disagree with? That "truth is relative" applies only to moral questions, or that it applies to more than moral questions?

If instead your position is that moral truths are NOT relative, what is the basis for that position? No need to dive deep if you know of something I can read... even EY :)

Comment by non-expert on The Fallacy of Gray · 2013-01-14T07:55:45.180Z · LW · GW

I actually don't think we're using the word differently -- the issue was premised solely on cases where the answer cannot be known even after the fact. In that case, our use of "confidence" is the same -- it simply helps you make decisions. Once the value of the decision is limited to the belief in its soundness, and not the ultimate "correctness" of the decision (because that cannot be known), rationality is important only if you believe it to be the correct way to make decisions.

Comment by non-expert on The Fallacy of Gray · 2013-01-14T07:48:20.918Z · LW · GW

Roughly speaking, I understood Mugasofer to be referring to a calculated value with respect to a proposition that ought to control my willingness to expose myself to penalties contingent on the proposition being false.

How is this different from being "comfortable" on a personal level? If it isn't, the only value of rationality where the answer cannot be known is simply the confidence it gives you. Such a belief requires rationality only if you believe rationality provides the best answer -- the "truth" is irrelevant. For example, as previously noted in the thread, if I'm super religious, I could use scripture to guide a decision and have the same confidence (in a subjective, personal way). Once the belief cannot be determined to be right or wrong, the manner in which the belief is created becomes irrelevant, EXCEPT to the extent laws/norms change because other people agree. I've taken the idea of absolute truth and simply converted it to social truth, because I think that's a more appropriate term (more below).

You are suggesting that rationality provides the "best way" to get answers short of perfect knowledge. Reflecting on your request for a comparatively better system, I realized you are framing the issue differently than I am. You are presupposing that the world has certainty, and are only concerned with our ability to derive that certainty (or those answers). In that model, looking for the "best system" to find answers makes sense. In other words, you assume answers exist, and only the manner in which to derive them is unknown. I am proposing that there are issues for which answers do not necessarily exist, or at least do not exist within the world of human comprehension. In those cases, any model by which someone derives an answer is equally ridiculous. That is why I cannot give you a comparison. Again, this is not to throw up my hands; it's a different way of looking at things. Rationality is important, but a smaller part of the bigger picture in my mind. Is my characterization of your position fair? If so, what is the basis for your position that all issues have answers?

So, I have two vials in front of me, one red and one green, and a thousand people are watching. All thousand-and-one of us believe that the red vial contains poison and the green vial contains yummy fruit juice. You are arguing that this is all I need to know to make a decision, because the relevance of the truth about which vial actually contains poison is limited to the extent to which other people agree that it does.

I am only talking about the relevance of truth, not the absolute truth, because the absolute truth cannot necessarily be known beforehand (as in your example!). Immediately before the vial is chosen, the only relevance of the Truth (referring to the actual truth) is the extent to which the people and I believe something consistent with it. Related to the point I made above, if you presuppose that Truth exists, it is easy to question or point out how people could be wrong about what it is. I don't think we have the luxury of knowing the Truth in most cases. Until future events prove otherwise, truth is just what we humans make of it, whether or not it conforms with the Truth -- thus I am arguing that the only relevance of Truth is the extent to which humans agree with it.

In your example, immediately after the vial is taken, we find out we're right or wrong, and our subjective truths may change. They remain subjective truths so long as future facts could further change our conclusions.

Comment by non-expert on The Fallacy of Gray · 2013-01-10T07:49:24.979Z · LW · GW

I suspect that the word "confidence" is not being used consistently in this exchange, and you might do well to replace it with a more explicit description of what you intend for it to refer to.

I referenced confidence only because MugaSofer did. What was your understanding of how MugaSofer used "confident as we should be"? Regardless, I am still wondering: what is the value of being "right" if we can't determine what is in fact right? If it gives you confidence/ego/comfort that you've derived the right answer, being "right" in actuality is not necessary to have those feelings.

To say that "rationality falls short" in these cases suggests that it's being compared to something.

Fair. The use of rationality and the belief in its merits generally bias the decision maker to form a belief that rationality will yield a correct answer, even when it does not -- it seems rationality always errs on the side of applying probabilities (and forming a judgment), even if they are flawed (or you don't know whether they are accurate). To say it differently, to the extent a question has no clear answer (for example, because we don't have enough information or it isn't worth the cost), I think we'd be better off withholding judgment altogether than forming a judgment for the sake of having an opinion.

Rumsfeld had this great quote -- "we don't know what we don't know" -- and we also don't know the importance of what we don't know relative to what we do know when forming judgments. From this perspective, having an awareness of how little we know seems far more important than creating judgments based on what we know. Rationality cannot take into account information that is not known to be relevant -- what is the value of forming a judgment in this case? To be clear, I'm not "throwing my hands up" at all of life's questions and saying we don't know anything -- I'm trying to see how far LW is willing to push rationality as a universal theory (or the best theory in all cases short of perfect knowledge, whatever that means).

Truth is relative because its relevance is limited to the extent other people agree with that truth, or so I would argue. This is because our notions of truth are man-made, even if we account for the possibility that there are certain universal truths (what relevance do those truths have if only you know them?). Despite the logic underlying probability theory and science in general, truths derived therefrom are accepted as such only because people value and trust probability theory and science. All other matters of truth are even more subjective -- this does not mean that contradicting beliefs are equally true or equally valid; instead, truth is subjective precisely because we cannot even attempt to prove anything as true outside of human comprehension. We're stuck debating and determining truth only amongst ourselves. It's the human paradox of freedom of expression/reasoning trapped within an animal form that is fallible and will die. From my perspective, determining universal truth, if it exists, requires transcending the limitations of man -- which of course I cannot do.

Comment by non-expert on The Fallacy of Gray · 2013-01-10T06:45:44.585Z · LW · GW

Perspectivism provides that all truth is subjective, but in practice this characterization has no relevance to the extent there is agreement on any particular truth. For example, "murder is wrong," even if a subjective truth, is not so in practice, because there is collective agreement that murder is wrong. That is all I meant, but I agree that it was not clear.

Comment by non-expert on The Fallacy of Gray · 2013-01-09T14:22:36.518Z · LW · GW

Throwing your hands in the air and saying "well we can never know for sure" is not as accurate as giving probabilities of various results. We can never know for sure which answer is right, but we can assign our probabilities so that, on average, we are always as confident as we should be. Of course, humans are ill-suited to this task, having a variety of suboptimal heuristics and downright biases, but they're all we have. And we can, in fact, assign the correct probabilities / choose the correct choice when we have the problem reduced to a mathematical model and apply the math without making mistakes.

If all you're looking for is confidence, why must you assign probabilities? I'm pushing you in hopes of understanding, not necessarily disagreeing. If I'm very religious and use that as my life-guide, I could be extremely confident in a given answer. In other words, the value of using probabilities must extend beyond confidence in my own answer -- confidence is just a personal feeling. Being "right" in a normative sense is also relevant, but as you point out, we often don't actually know what answer is correct. If your point instead is that probabilities will yield the right answer more often than not, fine, but then accurately identifying the proper inputs and valuing them correctly is of the utmost importance -- and this is simply not practical in many situations, precisely because the world is so complex.

I guess it boils down to this -- what is the value of being "right" if what is "right" cannot be determined? I think there are decisions where what is right can be determined, and there rationality and the Bayesian model work quite well. I think far more decisions (social relationships, politics, economics -- particularly decisions that do not directly affect the decision maker) are too subjective to know what is "right" or to model the inputs accurately. In those cases, I think rationality falls short, and the attempt to assign probabilities can give false confidence that the derived answer has a greater value than simply providing confidence that it is the best one.

I think I'm the only one on LessWrong who finds EY's writing maddening -- mostly the style -- I keep screaming to myself, "get to the point!" -- though, as noted, perhaps it's just me. His examples from the cited article miss the point of perspectivism, I think. Perspectivism (or at least how I am using it) simply means that truth can be relative, not that it is relative in all cases. Rationality does not seem to account for the possibility that it could be relative in any case.

Comment by non-expert on The Fallacy of Gray · 2013-01-09T04:15:20.904Z · LW · GW

OK, yes, the idea of using probabilities raises two issues -- knowing you have the right inputs, and having the right perspective. Knowing and valuing the proper inputs to most questions seems impossible because of the subjectivity of most issues -- while Bayesian judgments may still hold in the abstract, they are often not practical to use (or so I would argue). Second, what do you think about the idea of "perspectivism" -- that there is only subjective truth in the world? You don't have to sign on completely to Nietzsche's theory to see its potential application, even if limited in scope. For example, a number of communication techniques employ a type of perspectivism because different people view issues through an "individual lens." In either case, seeing the world as constructed of shades of grey seems more practical and accurate than using probabilities. This seems at odds with Bayesian judgments, which assume that probabilities yield one correct answer AND that a person can and should be able to derive that correct answer.
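As a toy illustration of that point (invented numbers, nothing more): two reasoners can apply Bayes' rule flawlessly to the same evidence and still walk away with different answers if their priors -- their "individual lens" -- differ.

```python
# Toy sketch (invented numbers): flawless Bayesian updating on shared
# evidence still leaves reasoners apart if their priors differ.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule for a binary hypothesis H after observing evidence E."""
    numerator = prior * p_e_given_h
    return numerator / (numerator + (1 - prior) * p_e_given_not_h)

evidence = (0.8, 0.3)  # P(E|H) = 0.8, P(E|not H) = 0.3 -- agreed by both

print(posterior(0.50, *evidence))  # ~0.727: started agnostic
print(posterior(0.05, *evidence))  # ~0.123: started skeptical
# Same evidence, same arithmetic, different lens going in --
# the updating is objective, but the answer each walks away with is not.
```

The updating rule itself is not in dispute; my question is where the starting probabilities are supposed to come from when the subject matter is this subjective.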

The point I raise about communication techniques relates to your "offtopic" point. I assume you are a rationalist, and thus believe yourself to have superior decision-making skills (at least relative to those who are not students (or masters) of rationality). If so, what is the value of your "offtopic" point? You clearly were able to answer my question despite its shortcomings -- why belittle someone who is trying to understand an article that is well received by LW? Is the petty victory of pointing out my mistakes, from your perspective, the most rational way to answer my comment? I'm not insulted personally (this type of pettiness always makes me smile), but I'm interested in understanding the logic of your comments. From my perspective, rationality failed you in communicating effectively. It seems your arrogance could keep many from following and learning from LW -- unless of course the goal is to limit the ranks of those who employ rationality. What am I missing? (And the answer is no, I haven't considered using a spell or grammar checker other than the one provided by this site.)

Comment by non-expert on The Fallacy of Gray · 2013-01-08T08:52:20.508Z · LW · GW

I don't follow the relevance of the article, as it seems quite obvious. The real problem with black and white in the world of rationality is the assumption that there is a universal answer to all questions. The idea of "grey" helps highlight that many questions have no one correct universal answer. What I don't understand about rationalists (LW rationalists) is that they live in a world in which everything is either right or wrong. This simplifies a world that is not so simple. What am I missing?