What is your personal totalizing and self-consistent worldview/philosophy?
post by lsusr · 2024-12-27T23:59:30.641Z · LW · GW · 6 comments
This is a question post.
Every major author who has influenced me has "his own totalising and self-consistent worldview/philosophy" [LW · GW]. This list includes Paul Graham, Isaac Asimov, Joel Spolsky, Brett McKay, Shakyamuni, Chuck Palahniuk, Bryan Caplan, qntm, and, of course, Eliezer Yudkowsky, among many others.
In this post, I'll attempt to lay out my own worldview and philosophy.
If you have one too, then it's a good exercise to put it into words. What are your most important Litanies? What are your noble truths? Here's mine.
The Problem
Evolution is not random. Evolution is an optimizer, which means that it can, from a certain point of view, be seen to have intentions. Evolution's optimization target is "inclusive genetic fitness". "Inclusive genetic fitness" and "good" are different things. In other words, you and I were created by an epiphenomenon with values fundamentally orthogonal to our own.
There are some ways our values are aligned with evolution. For example, both you and evolution want you to have good eyesight. But in other ways, your values are unaligned with evolution. For example, you want to suffer less. Evolution doesn't care how much conscious suffering you experience. It would happily torture you and a million of its other children for a million years if that bought it a 0.001% increase in inclusive genetic fitness.
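To make that concrete, here is a minimal toy sketch (mine, in Python, with made-up numbers; obviously not a model of biology): a selection rule whose objective is fitness alone, so suffering never enters the calculation.

```python
# Toy sketch, not a biological model: all numbers are made up.
# The point is only that selection "sees" fitness and nothing else.

def select(population, n_survivors):
    # Survival depends solely on the fitness score; the suffering
    # field exists in every individual but never enters the criterion.
    return sorted(population, key=lambda ind: ind["fitness"], reverse=True)[:n_survivors]

population = [
    {"name": "baseline", "fitness": 1.00000, "suffering": 0},
    {"name": "tormented", "fitness": 1.00001, "suffering": 10**6},  # 0.001% fitter
]

print(select(population, n_survivors=1))
# -> the "tormented" variant wins; its suffering is invisible to the objective
```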
The result is horrors like genocide and factory farms. Genocide and factory farms are good, from evolution's point of view, because they are instrumentally useful. But this creates a bioengineering problem for evolution, because compassion is instrumentally useful too. Putting cruelty and compassion into the same brain produces a mess of contradictions incompatible with sanity. This isn't just misinformation in your environment. It's part of the genetically-programmed biophysical architecture of your brain. In this way, evolution has intentionally distorted your perception of good and evil.
Making things worse is a fundamental conflict between world modeling and world optimization. Your brain creates a real-time simulation of your local environment. This simulation feels like external physical reality, but it is merely conscious reality. Your brain attempts to maintain a 1-to-1 correspondence between the simulation and the real world. If correspondence is maintained, then the simulation not only feels like reality―it can be treated as such too. That's why it usually isn't necessary to make a distinction between reality and the simulation.
Evolution doesn't want you to notice that these are different things. Evolution wants you to believe that the simulation reflects external physical reality. Everything you consciously perceive is part of the simulation, and exists only within the tiny slice of physical reality that is your brain.
These are the two most important ways evolution has intentionally designed our brains to create a simulation that deviates from reality. There are many others, but most are downstream of these two problems.
Nations, religions, products, and other competitive memetic forces hijack evolution's distortions and make them 1,000× worse.
And yet… there is a way out of this mess. Because every one of these memeplexes has to have some accurate sense of the truth to not be outcompeted. It may be warped and distorted, but it's there. And if you seize on that kernel of truth, protect it, nurture it, and be forever vigilant that you, yourself, may have been corrupted, then it will grow. This is the ultimate battle inside of every conscious being and every unconscious thinking machine.
And we're going to win.
Answers
answer by Gordon Seidoh Worley
Mine:
The world is perfect, meaning it is exactly as it is and always was going to be. However, the world as we know it is an illusion in that it only exists in our minds. We only know our experience, and all (metaphysical) claims to know reality, no matter how useful and predictive they are, are contingent and not fundamental. But we get confused about this because those beliefs are really useful and really predictive, and we separate ourselves from reality by first thinking the world is real, and then thinking our beliefs are about the world rather than of the world itself.
Thus the first goal of all self-aware beings is to get straight in their mind that everything is an illusion. This changes nothing about daily life because everything adds up to normality, but we are no longer confused. Knowing that all is illusion eliminates our fundamental source of suffering that's created by seeing ourselves as separate from the world, and thus we allow ourselves to return to the original joy of experience.
Having gotten our minds straight, now we can approach the task of shaping the world (which is, again, an illusion we construct in our minds, and is only very probably a projection of some external reality into our minds) to better fit our preferences. We can take our preferences far. They weren't designed to be maximized, but nonetheless we can do better than we do today. We can build the machines and social technologies and networks of other people (or at least create the illusion of these things in the very ordinary way we create all our illusions) that make possible the world we more want to live in. And everyone can do this, for they are not separate from us. Their preferences are our own; ours theirs. Together we can create a beautiful illusion free of pain and strife and full of flourishing.
6 comments
comment by L Rudolf L (LRudL) · 2024-12-28T13:35:51.800Z · LW(p) · GW(p)
If you have [a totalising worldview] too, then it's a good exercise to put it into words. What are your most important Litanies? What are your noble truths?
The Straussian reading of Yudkowsky is that this does not work. Even if your whole schtick is being the arch-rationalist, you don't get people on board by writing out 500 words explicitly summarising your worldview. Even when you have an explicit set of principles [LW · GW], it needs to have examples and quotes to make it concrete (note how many people Yudkowsky quotes and how many examples he gives in the 12 virtues [LW · GW] piece), and be surrounded by other stuff that (1) brings down the raw cognitive inferential distance [LW · GW], and (2) gives it life through its symbols / Harry-defeating-the-dementor stories / examples of success / cathedrals / thumos.
It is possible that writing down the explicit summary can be actively bad for developing it, especially if it's vague / fuzzy / early-stages / not-fully-formed. Ideas need time to gestate, and an explicit verbal form is not always the most supportive container.
comment by CstineSublime · 2024-12-28T06:48:28.628Z · LW(p) · GW(p)
I'm not sure if I'm persuaded that we can say that "evolution has intentions" - isn't evolution just a convenient word to describe a pattern? Evolution isn't an entity, it isn't a system, it doesn't have an identity. It is a quality or pattern that we notice in certain systems, and at this point we risk getting into some kind of Hegelian matryoshka doll about Weltgeist - something which I'm afraid to even bring up.
I also feel like I'm missing something in the claim that "genocide and factory farms are instrumentally useful", especially if you're anthropomorphising evolution.
That being said, if the use of the Pathetic Fallacy allows you to make your beliefs pay rent - then I retract the above!
At any rate, thank you for posting this - because I realized something!
I think what this has made me realize is that I DO NOT have a conscious totalizing worldview or philosophy. But obviously I must have a worldview, I assume everyone does; all human beings have a myriad of beliefs that lie on a continuum between an eclectic and byzantine (and very inconsistent) hodge-podge and the kind of totalizing and self-consistent system you describe.
Now, when I say I don't have a "conscious totalizing worldview or philosophy", I am saying I don't have the self-awareness to know how self-consistent my beliefs are, or where on the continuum they lie, and this is partly because I couldn't summarize them. As such I'm guessing I'm somewhere on the inconsistent, eclectic, hodge-podge side - perhaps unconsciously my beliefs are actually really self-consistent and I'm more on the totalizing side, but I doubt it.
This realization is surprising to me, because one thing I am is an aesthete with a curatorial bent - or in layman's terms, "I know what I like, and I know what I don't like and avoid it like the plague". An aesthete is someone who is especially sensitive to (or excessively concerned with) the beautiful, especially in art. And by curatorial, I mean someone who wants those beautiful art things to be coordinated in a certain way, to exclude anything which isn't beautiful, rah rah rah. In this regard I have a very bright but narrow spotlight of self-awareness.
I don't mean to say that I am some kind of superior tastemaker or have a better sense of what is beautiful than others, but I do believe I am especially attuned to my own, idiosyncratic and totally subjective sense of beauty or what I enjoy experiencing. In fact if you ask me "what do most people like?" I would throw up my hands. If pushed, I would mumble something about Taylor Swift, Jeff Koons, and Michael Kors - the kind of answer you give when you have no idea what you're talking about.
To put it another way, I couldn't possibly be an aesthetic elitist, because I don't even know what most people like, so I couldn't even have something to compare my own aesthetics against.
I don't know if that counts as a totalizing worldview, since it is only a partial worldview - it is a hyperacuity about art, fashion, music, narrative, the written word, performance etc. that I "like". Me, myself, and only I.
↑ comment by lsusr · 2024-12-28T09:19:47.105Z · LW(p) · GW(p)
Regarding genocide and factory farms, my point was just that abusing others for your self-benefit is an adaptive behavior. That's all. Nothing deeper than that.
By the way, I appreciate you trying to answer the crux of my question to the extent that makes sense. This is exactly the kind of thinking I was hoping to provoke.
As for being attuned to your own taste, it is an especially necessary component of a totalizing worldview for artists, e.g. Leonardo, Miyazaki, Eiichiro Oda.
↑ comment by CstineSublime · 2024-12-28T11:13:30.259Z · LW(p) · GW(p)
my point was just that abusing others for your self-benefit is an adaptive behavior.
Thank you for clarifying that, I got confused about whom it benefited.
This is exactly the kind of thinking I was hoping to provoke.
That is excellent to know. And thank you for providing that provocation. It could become valuable self-knowledge for me.
And yes, agreed, it can be a very necessary component for artists. While I have no doubt there are a lot of artists who spend their artistic lives exploring and discovering a worldview that is unbeknownst to themselves (Picasso perhaps? Fernando Pessoa too? The Moonage Daydream documentary begins with Bowie saying that all artists are attempting to define their relationship with the world), one of the most repeated things said about writers, filmmakers, and other creatives is that the adored ones have a distinct "voice" or "point of view".
This even works in the inverse: fashion designer Miuccia Prada once opined that "to hate something is the origin of my work".
comment by L Rudolf L (LRudL) · 2024-12-28T13:20:25.055Z · LW(p) · GW(p)
Every major author who has influenced me has "his own totalising and self-consistent worldview/philosophy" [LW · GW]. This list includes Paul Graham, Isaac Asimov, Joel Spolsky, Brett McKay, Shakyamuni, Chuck Palahniuk, Bryan Caplan, qntm, and, of course, Eliezer Yudkowsky, among many others.
Maybe this is not the distinction you're focused on, but to me there's a difference between thinkers who have a worldview/philosophy, and ones whose worldview is a totalising one - an entire system of the world.
Of your list, I only know of Graham, Asimov, Caplan, and, of course [LW · GW], Yudkowsky. All of them have a worldview, yes, and Caplan is maybe a bit of the way towards a "system of the world" because he does seem to have an overall coherent perspective on economics, politics, education, and culture (though perhaps not very differentiated from other libertarian economists?).
Paul Graham definitely gets a lot of points for being right about many startup things before others and contrarian in the early days of Y Combinator, but he seems to me mainly an essayist with domain-specific correct takes about startups, talent, aesthetics, and Lisp [LW · GW] rather than someone out to build a totalising philosophy of the world.
My impression of Asimov is that he was mainly a distiller and extrapolator of mid-century modernist visions of progress and science. To me, authors like Vernor Vinge are far more prophetic, Greg Egan is far more technically deep, Heinlein was more culturally and politically rich, Clarke was more diverse, and Neal Stephenson just feels smarter while being almost as trend-setting as Asimov.
I'd be curious to hear if you see something deeper or more totalising in these people?
comment by Jonas Hallgren · 2024-12-28T11:21:11.086Z · LW(p) · GW(p)
I resonate with this framing of evolution as an optimizer and I think we can extend this perspective even further.
Evolution optimizes for genetic fitness, yes. But simultaneously, cultural systems optimize for memetic fitness, markets optimize for economic fitness, and technological systems increasingly optimize for their own forms of fitness. Each layer creates selection pressures that ripple through the others in complex feedback loops. It isn't that evolution is the only thing happening; it may be the outermost value function, but there's so much nesting here as well.
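To gesture at what I mean, here's a purely illustrative toy in Python (my own construction, with made-up dynamics, not a claim about real biology or culture): two coupled selection layers, where the memetic layer modifies genetic fitness and the genetic layer modifies memetic spread.

```python
import random
random.seed(0)

# Each individual carries a gene value and a meme value. The meme changes
# its host's reproductive fitness, and the host's fitness changes how far
# the meme spreads, so the two layers exert selection pressure on each other.
def step(population):
    scored = []
    for gene, meme in population:
        host_fitness = gene + 0.3 * meme         # memetic layer feeds back into the genetic one
        meme_spread = meme * (1 + host_fitness)  # genetic layer feeds back into the memetic one
        scored.append((host_fitness * meme_spread, (gene, meme)))
    scored.sort(reverse=True)
    survivors = [ind for _, ind in scored[:len(scored) // 2]]
    # survivors reproduce with small mutations at both layers
    children = [(g + random.gauss(0, 0.01), m + random.gauss(0, 0.01))
                for g, m in survivors]
    return survivors + children

population = [(random.random(), random.random()) for _ in range(10)]
for _ in range(20):
    population = step(population)
print(population[:3])  # both layers drift upward together
```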
There's only modelling and what is being modelled, and these things are happening everywhere all at once. I feel like I fully agree with what you said, but I guess for me the interesting question is what basis to look at it from.