Self-conscious ideology
post by casebash · 2017-06-28T05:32:23.146Z
Operating outside of ideology is extremely hard, if not impossible. Even groups that see themselves as non-ideological still seem to end up operating within an ideology of some sort.
Take for example Less Wrong. It seems to operate within a few assumptions:
- That studying rationality will provide us with a greater understanding of the world.
- That studying rationality will improve you as a person.
- That science is one of our most important tools for understanding the world.
...
These assumptions are also subject to some criticisms. Here's one criticism for each of the previous points:
- But will it, or are we dealing with problems that are simply beyond our ability to understand (see epistemic learned helplessness)? Do we really understand how minds work well enough to know whether an uploaded mind would still be "you"?
- But religious people are happier.
- Hume's critique of induction
I could continue discussing assumptions and possible criticisms, but that would be a distraction from the core point, which is that there are advantages to having a concrete ideology that is aware of its own limitations, as opposed to an implicit ideology that is beyond all criticism.
Self-conscious ideologies also have other advantages:
- Quick and easy to write since you don't have to deal with all of the special cases.
- Easy to share and explain. Imagine trying to explain to someone, "Rationality gives us a better understanding of the world, except when it does not". Okay, I'm exaggerating; epistemic humility typically isn't explained that badly, but it certainly complicates sharing.
- Easier for people to adopt the ideology as a lens through which to examine the world, without needing to assume that it is literally true.
Comments sorted by top scores.
comment by cousin_it · 2017-06-28T19:03:13.820Z · LW(p) · GW(p)
My favorite self-conscious ideology is Continental Rationalism. The main idea is that the world is built on logic and harmony which can be understood by an individual human mind. It was born from religious mysticism (Descartes, Leibniz) but somehow became very fruitful in science. Competing ideologies like "the world is built on chance" or "all understanding is social" don't bear nearly as much fruit, though they sound more sophisticated. Maybe it's because they discourage you from trying to understand things by reason, while CR encourages it. Heck, I think even LW ideology loses out to CR, because self-improvement feels like a grind, while understanding the world feels like a quest. Maybe if LW focused for a while on understanding things instead of defeating akrasia and such, it'd be a happier place.
↑ comment by entirelyuseless · 2017-06-29T13:28:30.907Z · LW(p) · GW(p)
Maybe if LW focused for a while on understanding things instead of defeating akrasia and such, it'd be a happier place.
Completely agree. (Username is not by chance.)
↑ comment by bogus · 2017-06-28T19:22:37.423Z · LW(p) · GW(p)
The main idea is that the world is built on logic and harmony which can be understood by an individual human mind. It was born from religious mysticism (Descartes, Leibniz)
Erm, Pythagoras was around a lot earlier than the likes of Descartes or Leibniz. Even the competing ideas that "the world is built on chance" or else that "all understanding is social" (or, to put it another way, "man is the measure of all things") are of comparable antiquity and not really more 'sophisticated' in any relevant way - except perhaps in an overly literal sense, being more conducive to "sophistry"!
comment by ChristianKl · 2017-06-29T10:10:50.938Z · LW(p) · GW(p)
I think it would be very worthwhile to study which assumptions are actually shared. We could have a poll where we list 50 assumptions and everyone states on a Likert scale to what extent they agree.
It would also be interesting to see whether there are other clusters besides a basic "rational ideology" cluster.
comment by tadasdatys · 2017-06-29T16:52:07.981Z · LW(p) · GW(p)
If you take a random set of people, they will have various beliefs, and some of those will be more common than others. Calling that an ideology seems unfair. By the way, all beliefs have criticisms, and yet some beliefs are more correct than others.
Also, "it's likely that some of the beliefs I hold are wrong" is already one rationalist assumption, or at least it should be. What are you adding to that?
↑ comment by ChristianKl · 2017-06-29T18:00:21.620Z · LW(p) · GW(p)
It's not about fairness.
Being self-conscious of the beliefs that one holds and that one uses to operate is useful.
comment by satt · 2017-06-28T20:23:34.174Z · LW(p) · GW(p)
You reminded me of a tangentially related post idea I want someone to steal: "Ideologies as Lakatosian Research Programmes".
Just as people doing science can see themselves as working within a scientific research programme, people doing politics can see themselves as working within a political research programme. Political research programmes are scientific/Lakatosian research programmes generalized to include normative claims as well as empirical ones.
I expect this to have some (mildly) interesting implications, but I haven't got round to extracting them.
↑ comment by Jayson_Virissimo · 2017-06-29T04:44:25.085Z · LW(p) · GW(p)
You've already been scooped. The "research programme" that Lakatos talks about was designed to synthesize the views of Kuhn and Popper, but Kuhn himself modeled his revolutionary science after constitutional crises, and his paradigm shifts after political revolutions (and, perhaps more annoyingly to scientists, religious conversions). Also, part of what was so controversial (at the time) about Kuhn was the prominence he gave to non-epistemic (normative, aesthetic, and even nationalistic) factors in the history of science.
comment by Viliam · 2017-06-28T10:17:14.425Z · LW(p) · GW(p)
I think Eliezer once wrote something about things becoming clearer when you think about how you would program a computer to do it, as opposed to e.g. just throwing some applause lights to a human. So, how specifically would you implement this kind of belief in a computer?
Also, should we go meta and say: "'Rationality gives us a better understanding of the world, except when it does not' is a good ideology, except when it is worse" et cetera?
What exactly would that actually mean? (Other than verbally shielding yourself from criticism by endless "but I said 'except when not'".) Suppose a person A believes "there is a 80% probability it will rain tomorrow", but a person B believes "there is a 80% probability it will rain tomorrow, except if it is some different probability". I have an idea about how A would bet about tomorrow's weather, but how would B?
↑ comment by TheAncientGeek · 2017-06-29T11:21:55.002Z · LW(p) · GW(p)
I think Eliezer once wrote something about things becoming clearer when you think about how you would program a computer to do it, as opposed to e.g. just throwing some applause lights to a human. So, how specifically would you implement this kind of belief in a computer?
First solve natural language...
No one has used Eliezer's technique much, and there may be a reason for that.
↑ comment by casebash · 2017-06-29T06:59:25.522Z · LW(p) · GW(p)
"'Rationality gives us a better understanding of the world, except when it does not"
I provided this as an exaggerated example of how aiming for absolute truth can mean that you produce an ideology that is hard to explain. More realistically, someone would write something along the lines of "rationality gives us a better understanding of the world, except in cases a), b), c)...", but if there are enough of these cases and they are complex enough, then in practice people round it off to "X is true, except when it is not", i.e. they don't really understand what is going on, as you've pointed out.
The point was that there are advantages to creating a self-conscious ideology that isn't literally true but has known flaws, such as it being much easier to actually explain, so that people don't end up confused as above.
In other words, as far as I can tell, your comment isn't really responding to what I wrote.