Posts

StartAtTheEnd's Shortform 2024-01-11T17:52:24.084Z

Comments

Comment by StartAtTheEnd on What are Emotions? · 2024-11-15T10:51:52.623Z · LW · GW

I think there's a problem with the entire idea of terminal goals, and that AI alignment is difficult because of it.

"What terminal state does you want?" is off-putting because I specifically don't want a terminal state. Any goal I come up with has to be unachievable, or at least cover my entire life, otherwise I would just be answering "What needs to happen before you'd be okay with dying?"

An AI does not have a goal, but a utility function. Goals have terminal states: once you achieve them you're done, the program can shut down. A utility function goes on forever. But wanting just one thing so badly that you'd sacrifice everything else for it seems like a bad idea. Such a bad idea that no person has ever been able to define a utility function which wouldn't destroy the universe when fed to a sufficiently strong AI.

I don't wish to achieve a state, I want to remain in a state. There's actually a large space of states that I would be happy with, so it's a region that I try to stay within. The space of good states forms a finite region, meaning that you'd have to stay within this region indefinitely, sustaining it. But something which optimizes seeks to head towards a "better state"; it does not want to stagnate, but this is precisely what makes it unsustainable. Something unsustainable is finite, something finite must eventually end, and something which optimizes towards an end is just racing to die. A human would likely realize this if they had enough power, but because life offers enough resistance, none of us ever win all our battles. The problem with AGIs is that they don't have this resistance.
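Here's a toy sketch of the difference (made-up dynamics, purely illustrative):

```python
# Contrast between an optimizer (always heads toward a "better" state)
# and a homeostat (only acts to stay inside a region of acceptable states).
# The numbers and update rules are invented for illustration.

def optimizer_step(state: float) -> float:
    """Always moves toward a higher score; never satisfied."""
    return state + 1.0  # "better" without bound

def homeostat_step(state: float, low: float = 0.0, high: float = 10.0) -> float:
    """Nudges the state back toward the acceptable region [low, high]."""
    if state < low:
        return state + 1.0
    if state > high:
        return state - 1.0
    return state  # inside the region: nothing to chase, just sustain

state_a = state_b = 5.0
for _ in range(100):
    state_a = optimizer_step(state_a)  # diverges without end: 105.0 and counting
    state_b = homeostat_step(state_b)  # stays at 5.0 indefinitely
```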

The afterlives we have created so far are either sustainable or express the wish to disappear. Escaping samsara means disappearing, heaven is eternal life (stagnation), and Valhalla is an infinite battlefield (a process which never ends). We wish for continuance. It's the journey which has value, not the goal. But I don't wish to journey faster.

Comment by StartAtTheEnd on Anvil Problems · 2024-11-15T08:54:14.442Z · LW · GW

I meant that they were functionally booleans, since a single condition is either fulfilled or not: "is rich", "has anvil", "AGI achieved". In the anvil example, any number past 1 corresponds to true. In programming, casting an integer count to a boolean results in "true" for every positive count and "false" for zero, just like in the anvil example. The intuition carries over too well for me to ignore.

The first example which came to mind for me when reading the post was confidence, which is often treated as a boolean "Does he have confidence? yes/no". So you don't need any countable objects, only a condition/threshold which is either reached or not, with anything past "yes" still being "yes".

A function where everything past a threshold maps to true, and anything before it maps to false, is similar to the anvil example, and to a function like "is positive" (since a more positive number is still positive). But for the threshold to be exactly 1 unit, you need to choose a unit which is large enough. $1 is not rich, and having one water droplet on you is not "wet", but with the appropriate unit (exactly the size of the threshold/condition) these should be functionally similar.
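In Python terms, the analogy looks like this (the $1,000,000 cutoff is an arbitrary stand-in for the threshold):

```python
# Casting a count to a boolean: zero is False, any positive count is True,
# which is the "has anvil" pattern.
print(bool(0), bool(1), bool(37))  # False True True

# The same shape as a threshold predicate: pick a unit exactly the size of
# the condition, and everything at or past 1 unit maps to "yes".
def is_rich(dollars: int, threshold: int = 1_000_000) -> bool:
    return dollars >= threshold  # $1 is not rich; past the threshold, more is still "yes"

print(is_rich(1), is_rich(10**6), is_rich(10**9))  # False True True
```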

I'm hoping there is a simple and intuitive mathematics for generalizing this class of problems. And now that I think about it, most of these things (the ones which can be used for making more of themselves) are catalysts (something used but not consumed in the process of making something): using money to make more money, anvils to make more anvils, breeding more of a species before it goes extinct.
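A minimal sketch of the catalyst dynamic (invented numbers):

```python
# The catalyst is needed for production but not consumed by it, so zero is
# an absorbing state and anything >= 1 can grow without limit.
def forge(anvils: int) -> int:
    if anvils >= 1:
        return anvils + 1  # the old anvil survives the forging
    return anvils          # no anvil, no forging

for start in (0, 1):
    n = start
    for _ in range(5):
        n = forge(n)
    print(start, "->", n)  # 0 -> 0, 1 -> 6
```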

Comment by StartAtTheEnd on Anvil Problems · 2024-11-14T17:37:05.166Z · LW · GW

This probably makes more sense if you view it as a boolean type: you either "have an anvil" or you don't, and you either have access to fire or you don't. We view a lot of things as booleans (if your clothes get wet, then wet is a boolean). This might be helpful? It connects what might seem like a sort of edge case to something familiar.

But "something that relies on itself" and "something which is usually hard to get, but easy to get more of once you have a bit of it" are a bit more special I suppose. "Catalyst" is a sort of similar yet different idea. You could graph these concepts as dependency relations and try out all permutations to see if more types of problems exists

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-14T14:47:42.460Z · LW · GW

The short version is that I'm not sold on rationality, and while I haven't read 100% of the sequences it's also not like my understanding is 0%. I'd have read more if they weren't so long. And while an intelligent person can come up with intelligent ways of thinking, I'm not sure this is reversible. I'm also mostly interested in tail-end knowledge. For some posts, I can guess the content by the title, which is boring. Finally, teaching people what not to do is really inefficient, since the space of possible mistakes is really big.

Your last link needs an s before the dot.

Anyway, I respect your decision, and I understand the purpose of this site a lot better now (though there's still a small, misleading difference between the explanation of rationality and how users actually behave. Even the name of the website gave the wrong impression).

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-13T20:06:39.037Z · LW · GW

Yes, intuitions can be wrong, welcome to reality

But these ways of looking at the world are not factually wrong, they're just perverted in a sense.
I agree that schools are quite terrible in general.

how could I have come up with this myself?

That helps for learning facts, but one can teach the same things in many different ways. A math book from 80 years ago may be confusing now, even if the knowledge it covers is something that you know already, because the terms, notation and ideas are slightly different.

we need wisdom because people cannot think

In a way. But some people who have never learned psychology have great social skills, and some people who are excellent with psychology are poor socializers. Some people also dislike "nerdy" subjects, and it's much more likely that they'd listen to a TED talk on body language than read a book on evolutionary psychology and non-verbal communication. Having an "easy version" of knowledge available which requires 20 IQ points less than the hard version seems like a good idea.
Some of the wisest and most psychologically healthy people I have met have been non-intellectual and non-ideological, and even teenagers or young adults. Remember your "Things to unlearn from school" post? Some people may have less knowledge than the average person, and thus fewer errors, making them clear-sighted in a way that makes them seem well-read. Teaching these people philosophy could very well ruin their beautiful worldviews rather than improve on them.

if you know enough rationality you can easily get past all that.

I don't think "rationality" is required. Somebody who has never heard about the concept of rationality, but who is highly intelligent and thinks things through for himself, will be alright (outside of existential issues and infohazards, which have killed or ruined a fair share of actual geniuses).
But we're both describing conditions which apply to less than 2% of the population, so at best we have to suffer from the errors of the 98%.

I'm not sure what you mean by "when you dissent when you have an overwhelming reason". The article you linked to worded it "only when", as if one should dissent more often, but it also warns against dissenting since it's dangerous.
By the way, I don't like most rationalist communities very much, and one of the reasons is that they have a lot of snobs who will treat you badly if you disagree with them. The social mockery I've experienced is also quite strong, which is strange, since you'd expect intelligence to correlate with openness, and the high rate of autistic people to counteract some of the conformity.

I also don't like activism, and the only reason I care about the stupid ideas of the world is that all the errors are making life harder for me and the people I care about. Like I said, not being an egoist is impossible, and there's no strong evidence that all egoism is bad, only that egoism can be bad. The same goes for money and power: I think they're neutral, and both potentially good/bad. But being egoistic can make other people afraid of me if I don't act like I don't realize what I'm doing.

It's more optimal to be passionate about a field

I think this is mostly correct. But optimization can kill passion (since you're just following the meta and not your own desires). And common wisdom says "Follow your dreams" which is sort of naive and sort of valid at the same time.

Believing false things purposefully is impossible

I think believing something you think is false, intentionally, may be impossible. But false beliefs exist, so believing in false things is possible. For something where you're between 10% and 90% sure, you can choose whether you want to believe in it or not, and then use the following algorithm:
Say "X is true because" and then allow your brain to search through your memory for evidence. It will find some.

The articles you posted on beliefs are about the rules of linguistics ("belief in belief" is a valid string) and logic, but how belief works psychologically may be different. I agree that real beliefs are internalized (exist in system 1) to the point that they're just part of how you anticipate reality. But some beliefs are situational and easy to consciously manipulate (example: self-esteem. You can improve or harm your own self-esteem in about 5 minutes if you try, since you just pick a perspective and set of standards in which you appear to be doing well or badly). Self-esteem is subjective, but I don't think the brain differentiates between subjective and objective things; it doesn't even know the difference.

And it doesn't seem like you value truth itself, but that you value the utility of some truths, and only because they help you towards something you value more?

Ethically yes, epistemically no

You may believe this because a worldview has to be formed through interactions with the territory, which means that a worldview cannot be totally unrelated to reality? You may also mean this: that if somebody has both knowledge and value judgements about life, then the knowledge is either true or false, while the value judgements are a function of the person. A happy person might say "Life is good" and a depressed person might say "Life is cruel", and they might even know the same facts.

Online "black pills" are dangerous, because the truth value of the knowledge doesn't imply that the negative worldview of the person sharing it is justified. Somebody reading the vasistha yoga might become depressed because he cannot refute it, but this is quite an advanced error in thinking, as you don't need to refute it for its negative tone to be false.

Rationality is about having cognitive algorithms which have higher returns

But then it's not about maximizing truth, virtue, or logic.
If reality operates by different axioms than logic, then one should not be logical.
The word "virtue" is overloaded, so people write like the word is related to morality, but it's really just about thinking in ways which makes one more clear-sighted. So people who tell me to have "humility" are "correct" in that being open to changing my beliefs makes it easier for me to learn, which is rational, but they often act as if they're better people than me (as if I've made an ethical/moral mistake in being stubborn or certain of myself).
By truth, one means "reality" and not the concept "truth" as the result of a logic expression. This concept is overloaded too, so that it's easy for people to manipulate a map with logical rules and then tell another person "You're clearly not seeing the territory right".

physics is more accurate than intuitive world models

Physics is our own constructed reality, which seems to act a lot like the actual reality. But I think an infinite number of physics could exist which predict reality with high accuracy. In other words, "There's no one true map". We reverse engineer experiences into models, but one experience can create multiple models, and multiple models can predict the same experiences.
One of the limitations is "there's no universal truth", but this is not even a problem, as the universe is finite. But "universal" in mathematics is assumed to be truly universal, covering all things, and it's precisely this which is not possible. But we don't notice, and thus come up with the illusion of uniqueness. And it's this illusion which creates conflict between people, because they disagree with each other about what the truth is, claiming that conflicting things cannot both be true. I dislike the consensus because it's the consensus and not a consensus.

A good portion of hardcore rationalists tend to have something to protect, a humanistic cause 

My bad for misrepresenting your position. Though I don't agree that many hardcore rationalists care for humanistic causes. I see them as placing rationality above humanity, and thus preferring robots, cyborgs, and AIs above humanity. They think they prefer an "improvement" of humanity, but this functionally means the destruction of humanity. If you remove negative emotions (or all emotions entirely; after all, these are the source of mistakes, right?), subjectivity, and flaws from humans, and align them with each other by giving them the same personality, or get rid of the ego (it's also a source of errors and unhappiness), what you're left with is not human. It's at best a sentient robot. And this robot can achieve goals, but it cannot enjoy them.
I just remembered seeing the quote "Rationality is winning", and I'll admit this idea sounds appealing. But a book I really like (EST: Playing the Game the New Way, by Carl Frederick) is precisely about winning, and its main point is this: you need to give up on being correct. The human brain wants to have its beliefs validated, that's all. So you let other people be correct, and then you ask them for what you want, even if it's completely unreasonable.

Rationality doesn't necessarily have nature as a terminal value

I meant nature as its source (of evidence/truth/wisdom/knowledge). "Nature" meaning reality/the dao/the laws of physics/the universe/GNON. I think most schools of thought draw their conclusions from reality itself. The only kind of worldview which seems disconnected from reality is religions which create ideals out of what's lacking in life and make those out to be virtue and the will of god.

None of that is incompatible with rationality

What I dislike might not be rationality, but how people apply it, and psychological tendencies in the people who apply it. But upvotes and downvotes seem very biased in favor of consensus and verifiability, rather than simply being about getting what you want out of life. People also don't seem to like being told accurate heuristics which seem immoral or irrational (by the colloquial definition that regular people use) even if they predict reality well. There's also an implicit bias towards altruism which cannot be derived from objective truth.

About my values: they already exist even if I'm not aware of them, they're just unconscious until I make them conscious. But if system 1 functions well, then you don't really need to train system 2 to function well, and it's a pain to force system 2 rationality onto system 1 (your brain resists most attempts at self-modification). I like the topic of self-modification, but that line of studies doesn't come up on LW very often, which is strange to me. I still believe that the LW community downplays the importance of human nature and psychology. It may even undervalue system 1 knowledge (street smarts and personal experiences) and overvalue system 2 knowledge (authority, book smarts, and reasoning).

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-13T13:16:15.095Z · LW · GW

There's a lot to unpack for this first point:

Another issue with teaching it academically is that academic thought, like I already said, frames things in a mathematical and thus non-human way. And treating people like objects to be manipulated for certain goals (a common consequence of this way of thinking) is not only bad taste, it makes the game of life less enjoyable.
Learning how to program has harmed my immersion in games, and I have a tendency to powergame, which makes me learn new videogames way faster than other people, but also with the result that I'm having less fun than them. I think rationality can result in the same thing. Why do people dislike "sellouts" and "car salesmen" if not for the fact that they simply optimize for gains in a way which conflicts with taste? But if we all just treat taste like it's important, or refuse to collect so much information that we can see the optimal routes, then Moloch won't be able to hurt us.

If you want something to be part of you, then you simply need to come up with it yourself; it will be your own knowledge. Learning other people's knowledge, however, feels to me like consuming something foreign.

Of course, my defense of ancient wisdom so far has simply been to translate it into an academic language in which it makes sense. "Be like water" is the street smarts, and "adaptability is a core component of growth/improvement/fitness" is the book smarts. But the "street smarts" version is easier to teach, and now that I think about it, that's what the Bible was for.

Most things that society wastes its time discussing are wrong. And they're wrong in the sense that even an 8-year-old should be able to see that all controversies going on right now are frankly nonsense. But even academics cannot seem to frame things in a way that isn't riddled with contradictions and hypocrisy. Does "We are good, but some people are evil, and we need to fight evil with evil, otherwise the evil people will win by being evil while we're being good" not sound silly? A single thought will get you Karl Popper's "paradox of tolerance", and a single thought more will make you realize that it's not a paradox but a kind of neutrality/reflexivity which makes both sides equal, and that "We need to fight evil" means "We want our brand of evil to win" as long as people don't dislike evil itself but rather how it's used. Again, this is not more complicated than "I punched my little brother because I was afraid he'd punch me first, and punching is bad", which I expect most children to see the problem with.

astrology

The thought experiment I had in mind was limited to a single isolated situation; you took it much further, haha. My point was simply "If you use astrology for yourself, the outcomes are usually alright". Same with tarot cards: as far as I'm concerned, it's a way to talk with your subconscious without your ego getting in the way, which requires acting as if something else is present. Even crystal balls are probably a kind of Rorschach test, and should not be used to "read other people" for this reason. Finally, I don't disagree with the low utility of astrology, but false hope gives people the same reassurance as real hope. People don't suffer from the non-existence of god, but from the doubt of his existence. The actual truth value of beliefs has no psychological effects (proof: otherwise we could use beliefs to measure the state of reality).

are more rational w.r.t. to that goal

I disagree, as I know of counter-examples. It's more likely for somebody to become rich making music if their goal is simply to make music and enjoy themselves than if their goal is to become rich making music. You see similar effects for people who try to get girlfriends, or happiness for that matter. If X results in Y, then you should optimize for X and not for Y. Many companies are dying because they don't realize such a simple thing (they try to exploit something pre-existing rather than making more of what they're exploiting, for instance the trust in previous IPs). Ancient wisdom tackles this: Wu Wei is about doing the right thing by not trying to do it. I don't know how often this works, but it sometimes does.

I have to disagree that anyone's goal is truth. I've seen strong evidence that knowledge of an environment is optimal for survival, and that knowledge-optimizing beats self-delusion every time, but even in this case, the real goal is "survival" and not "truth". And my proof is the following: if you optimize for truth because it feels correct or because you believe it's what's best, then your core motivation is feelings or beliefs respectively. For similar reasons, non-egoism is trivially impossible. But the "Something to protect" link you sent seems to argue for this as well?
And truth is not always optimal for goals. The belief that you're justified and the belief that you can do something are both helpful. The average person is 5/10 but tends to rate themselves as 7/10, which may be around the optimal bias.
By the way, most of my disagreements so far seem to be "Well, that makes sense logically, but if you throw human nature into the equation then it's wrong"

Some people may find fulfillment from that

I find myself a little doubtful here. People usually chase fame not because they value it, but because other people seem to value it. They might even agree cognitively on what's valuable, but it's no use if they don't feel it.

I think you would need to provide evidence for such claims

How many great people's autobiographies and life stories have you read? The nearer you get to them, the more human they seem, and if you get too close you may even find yourself crushed by pity. Of Isaac Newton, it was even said: "As a man he was a failure; as a monster he was superb". Boltzmann committed suicide, John Nash suffered from schizophrenia. Philosophy is even worse off; titles like "suicide or coffee?" do not come from healthy states of mind. And have you read the Vasistha Yoga? It's basically poison. But it's ultimately a projection: a worldview does not reveal the world, but rather the person with the worldview.

Then you weren't thinking rationally

But what saved me was not changing my knowledge, but my interpretation of it. I was right that people lie a lot, but I thought it was for their own sake, when it's mostly out of consideration for others. I was right that people were irrational, but I didn't realize that this could be a good thing.

No one can exempt you from laws of rationality

That seems like it's saying "I define rationality as what's correct, so rationality can never be wrong, because that would mean you weren't being rational". By treating rationality as something which is discovered rather than created (by creating a map and calling it the territory), any flaw can be justified as "that wasn't real rationality, we just didn't act completely rationally because we're flawed human beings! (our map was simply wrong!)".
There can be no universal knowledge; maps of the territory are inherently limited (and I can prove this). Insofar as rationality uses math and verbal or written communication, it can only approximate something which cannot be put into words. "The dao which can be spoken is not the dao" simply means "the map is not the territory".

By the way, I think I've found a big difference between our views. You're (as far as I can tell) optimizing for "Optimization power over reality / a more reliable map", while I'm optimizing for "Biological health, psychological well-being and enjoyment of existence".
And they do not seem to have as much in common as rationalists believe.

But if rationality in the end worships reality and nature, that's quite interesting, because that puts it in the same boat as Taoism and myself. Some people even put Nature=God.  

Finally, if my goal is being a good programmer, then a million factors will matter, including my mood, how much I sleep, how much I enjoy programming, and so on. But somebody who naively optimizes for programming skills might practice at the cost of mood, sleep, and enjoyment, and thus ultimately end up with a mediocre result. So in this case, a heuristic like "Take care of your health and try to enjoy your life" might not lose out to a rat-race mentality in performance. Meta-level knowledge might help here, but I still don't think it's enough. And the tendency to dismiss things which seem unlikely, illogical or silly is not as great a heuristic as one would think, perhaps because any beliefs which manage to stay alive despite being silly have something special about them.

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-12T10:51:32.407Z · LW · GW

I think the majority of people aren't aware of psychology and various fields under it

I don't think there's a reason for most people to learn psychology or game theory, as you can teach basic human behaviour and such without the academic perspective. I even think it's a danger to be more "book smart" than "street smart" about social things. So rather than teaching game theory in college, schools could make children read and write a book report on "How to Win Friends & Influence People" in 4th grade or whatever. Academic knowledge which doesn't make it to 99% of the population doesn't help ordinary people much. But a lot of this knowledge is simple and easier than the math homework children tend to struggle with.

I don't particularly believe in morality myself, and I also came to the conclusion that having shared beliefs and values is really useful, even if it means that a large group of people are stuck in a local maximum. As a result of this, I'm against people forcing their "moral" beliefs on foreign groups, especially when these groups are content and functional already. So I reject any global consensus of what's "good". No language is more correct than another language, and the same applies for cultures and such. 

Well it depends on your definition of inhuman

It's funny that you should link that post, since it introduces an idea that I already came up with myself. What I meant was that people tend to value what's objective over what's subjective, so that their rational thinking becomes self-destructive or self-denying in a sense. Rationality helps us to overcome our biases, but thinking of rationality as perfect and of ourselves as defective is not exactly healthy. A lot of people who think they're "super-humans" are closer to being "half-humans", since what they're doing is closer to destroying their humanity than overcoming or going beyond it. And I'm saying this despite the fact that some of these people are better at climbing social hierarchies or getting rich than me. In short, the objective should serve the subjective, not the other way around. "The lens which sees its own flaws" merely conditions itself to see flaws in everything. Some of my friends are artists, and they hate their own work because they're good at spotting imperfections in it; I don't consider this level of optimization to be any good for me. When I'm rational, it's because it's useful for me, so I'm not going to harm myself in order to become more rational. That's like wanting money thinking it will make me happy, and then sacrificing my happiness in order to make money.

But the fields like cognitive biases etc are not

I'll agree as long as these fields haven't been subverted by ideologies or psychological copes against reality yet (as that's what tends to make soft sciences pathetic). The "tall poppy syndrome" has warped the public's perception of the "Dunning-Kruger effect", so that it becomes an insult you can use against anyone you disagree with who is certain of themselves, especially in a social situation in which a majority disagree.

Astrology

Astrology is wrong and unscientific, but I can see why it would originate. It's a kind of pattern recognition gone awry. Since everything is related, and the brain is sometimes lazy and thinks that correlation=causation and that "X implies Y" is the same as "Y implies X", people use patterns to predict things, and assume that recreating the patterns will recreate the things. This is mostly wrong, of course, but not always. People who are happy are likely to smile, but smiling actually tends to make you happier as well. Do you know the tragic story of the person who invented handwashing? He found the right pattern, and the results were verifiable, but because his idea sounded silly, he ended up suffering.

If you had used astrology yourself, it might have ended better, as you'd be likely to interpret what you wanted to be true, and your belief that your goal in life was fated to come true would help against the periodic doubt that people face in life.

I would strongly disagree on the front of intelligence

Intelligence is not something you are, it's something you have. Identifying with your intelligence is how you disown 90% of yourself. Seeing intelligence as something available to you, rather than as something you are, helps eliminate internal conflict. Every "gifted kid burnout" and "depressed intelligent person" situation I have seen was partly caused by this dangerous identification. Even if you dismiss everything else I've said so far, I want to stress the importance of this one thing. Lastly, "systematic optimality" seems to suffer from something like Goodhart's law: when you optimize for one variable, you may harm 100 other variables slightly without realizing it (paperclip optimizers seem like the mathematical limit of this idea). Holistic perspectives tend to go wrong less often.
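A toy version of that variable-count point (all numbers invented):

```python
# Each step improves the tracked metric by 1 while quietly costing 0.01
# across each of 100 untracked variables.
tracked = 0.0
untracked = [1.0] * 100

for _ in range(50):
    tracked += 1.0
    untracked = [v - 0.01 for v in untracked]

print(tracked)         # 50.0: the metric looks great
print(sum(untracked))  # about 50.0, down from 100.0: half the untracked value is quietly gone
```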

I like the Internal Family Systems view. I think the brain has competing impulses whose strength depends on your physical and psychological needs. But while I think your brain is rational according to what it wants, I don't think it's rational according to what you want. In fact, I think people's brains tend to toy with them completely. It creates suffering to motivate you, it creates anxiety to get you to defend yourself, it creates displeasure and tells you that you will be happy if you achieve your goals. Being happy all the time is easy, but our brain makes this hard to realize so that we don't hack our own reward systems and die. If you only care about a few goals, your worldview is extremely simple. You have a complex life with millions of factors, but you only care about a few objective metrics? I'm personally glad that people who chase money or fame above all end up feeling empty, for you might as well just replace humanity with robots if you care so little for experiencing what life has to offer.

there is a good amount of correlation with IQ

Oh, I know, I have a few bans from various websites myself (and I once got rate-limited on here). And intelligence correlates with nihilism, meta-thinking, systemization, and anxiety (I know a study found the correlation with mental illness to be false, but I think the correlation is negative until about 120 IQ and then positive after). But why did Nikola Tesla's intelligence not prevent him from dying poor and lonely? Why was Einstein so awkward? Why do so many intelligent people not enjoy life very much? My answer is that these are consequences of lacking humanity / healthy ways of thinking. It's not just that stupid people are delusional. I personally like the idea that intelligence comes at the cost of instinct. For reference, I used to think rationally: I hated the world, I hated people, I couldn't make friends, I couldn't understand myself. Now I'm completely fine, I even overcame depression. I don't suffer and I don't even dislike suffering, I love life, I like socializing. I don't worry about injustice, immorality or death.

I just found a highlight of the sequences, and it turns out that I have read most of the posts already, or had discovered the principles myself previously. And I disagree with a few of the moral rules because they decrease my performance in life by making me help society. Finally, my value system is what I like, not what is mathematically optimal for some metric which people think could help society experience fewer negative emotions (I don't even think this is true or desirable).

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-12T06:36:46.760Z · LW · GW

There's an entire field of psychology, yes, but most men are still confused by women saying "it's fine" when they are clearly annoyed. Another thing is women dressing up because they want attention from specific men. Dressing up in a sexy manner is not a free ticket for any man to harass them, but socially inept men will say "they were asking for it" because the whole concept of selection and standards doesn't occur to them in that context. And have you read Niccolò Machiavelli's "The Prince"? It predates psychology, but it is psychology, and it's no worse than modern books on office politics and such, as far as I can tell. Some things just aren't improving over time.

wisdom goes wrong a lot of time

You gave the example of the ayurvedic textbook, but I'm not sure I'd call that "wisdom". If we compare ancient medicine to modern medicine, then modern medicine wins in like 95% of cases. But for things relating to humanity itself, I think that ancient literature comes out ahead. Modern hard sciences like mathematics are too inhuman (autistic people are worse at socializing because they're more logical and objective). And modern soft sciences are frankly pathetic quite often (Gardner's Theory of Multiple Intelligences is nothing but a psychological defense against the idea that some people aren't very bright. Whoever doesn't realize this should not be in charge of helping other people with psychological issues)

I don't understand where it may apply other than being a nice way to say "be more adaptive"

It's a core concept which applies to all areas of life. Humans won against other species because we were better at adapting. Nietzsche wrote: "The snake which cannot cast its skin has to die. As well the minds which are prevented from changing their opinions; they cease to be mind". This community speaks a lot about "updating beliefs" and "intellectual humility", because thinking that one has all the answers, and not updating one's beliefs over time, leads to cognitive inflexibility/stagnation, which prevents learning. Principles are incredibly powerful, and most human knowledge probably boils down to about 200 or 300 core principles.

I have found that I can bypass a lot of wisdom by using these axioms

Would I be right to guess that ancient wisdom fails you the most in the objective areas of life, and that it hasn't failed you much in the social parts? I don't disagree that modern axioms can be useful, but I think there are many areas where "intelligent" approaches lead to worse outcomes. For the most part, attempting to control things leads to failure. I've had more unpleasant experiences on heavily moderated platforms than I have had in completely unmoderated spaces. I think it's because self-organization can take place once disturbance from the outside ceases. But we will likely never know.

I think the failure to overcome akrasia in a general-purpose way is a failure of rationality

You could put it like that. I'd say something like "The rules of the brain are different from those of math; if you treat the brain like it's supposed to be rational, you will always find it to be malfunctioning for reasons that you don't understand". Too many geniuses have failed at living good lives for me to believe that intelligence is enough. I have friends with IQs above 145 who are depressed because they think too rationally to understand their own nature. They reject the things which could help them, because they look down on them as subjective/silly/irrational.
David Goggins' story is pretty interesting. I can't say I went through as much as him, but we do have things in common. This might be why I have the courage to criticize science on LW in the first place.

Comment by StartAtTheEnd on Spade's Shortform · 2024-11-11T15:38:05.784Z · LW · GW

No problem! Little note though, your psychiatrist might doubt you if it seems like you're trying to self-diagnose because of something you read online. It may be better not to name it directly unless they bring it up first

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-11T15:19:24.409Z · LW · GW

Some false beliefs can lead to bad actions, but I don't think it's all of them. After all, human nature is biased, because having a bias aided in survival. The psyche also seems like it deceives itself as a defense mechanism fairly often. And I think that "believe in yourself" is good advice even for the mediocre.

I'm not sure which part of my message each part of your message is in response to exactly, but some realizations are harmful because they're too disillusioning. It's often useful to act like certain things are true - that's what axioms and definitions are, after all. But these things are not inherently true or real, they become so when we decide that they are, but in a way it's just that we created them. But I usually have to not think about that for a while before these things go back to looking like they're solid pieces of reality rather than just agreements.

Ancient wisdom can fail, but it's quite trivial for me to find examples in which common sense can go terribly wrong. It's hard to fool-proof anything, be it technology or wisdom.

Some things progress. Math definitely does. But like you said, a lot of wisdom is rediscovered periodically. Science hasn't increased our social skills nor our understanding of ourselves; modern wisdom and life advice is not better than it was 2000 years ago. And it's not even because science cannot deal with these things. The whole "Be like water" thing is just flexibility/adaptability. Glass is easier to break than plastic. What's useful is that somebody who has never taken a physics class or heard about Darwinism can learn and apply this principle anyway. And this may still apply to some wisdom which accidentally reveals something beyond the current standard of science.

As for that which is not connected to reality much (wisdom which doesn't seem to apply to reality), it's mostly just the axioms of human cognition/nature. It applies to us more than to the world. "As within, so without": in short, internal changes seem to cause external changes. If you're in a good mood then the external world will seem better too. A related quote is "As you think, so you shall become", which is oddly similar to the idea of cognitive behavioural therapy.

Comment by StartAtTheEnd on Spade's Shortform · 2024-11-11T06:04:43.134Z · LW · GW

I see this problem quite often in communities for people with ADHD. People describe being unable to relax or start any task if they have any plans later, seemingly going into a sort of "waiting mode" until that event happens. This may be a common problem which is simply stronger in people with ADHD, I'm not sure. 

If you Google "ADHD Waiting mode", you should be able to find posts on this. I don't know how many of these are scientific or otherwise high-quality, and how many of them are unhealthy self-victimzation and other such things. I'm not judging, as I'm diagnosed with ADHD and a few other things myself, I just don't recommend identifying as ones medical diagnoses nor considering them as inherently impossible to overcome. 

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-11T05:48:55.997Z · LW · GW

Then, I'd argue, they're being wrong or pedantic. Since I don't believe my evidence is wrong, it's at most incomplete, and one could argue that an incomplete answer is incorrect in a sense, not because it says anything wrong, but because it doesn't convey the whole truth. If either reason applied to anyone reading that comment, I'd have loved to discuss it with them, which is why I wrote that initial comment in a slightly provocative or cocky way (which I believe is not inappropriate, as it reflects my level of confidence quite accurately). This may conflict with some people's intellectual virtues, but I think a bit of conflict is healthy/necessary for learning.

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-11T02:44:42.402Z · LW · GW

Maybe people care way less about the difference between the two kinds of downvotes than I do. Even if the comment was bad or poorly communicated, I don't think the disagree downvote is appropriate as long as the answer is correct. I see the votes as being "subjective" and "objective" respectively. I agree about the noise thing

Comment by StartAtTheEnd on Poll: what’s your impression of altruism? · 2024-11-09T23:22:56.804Z · LW · GW

I don't think any one option is precise enough that it's correct on its own, so I will have to say "5" as well.

Here's my take:

  • Altruism can be a result of both good and bad mental states.
  • Helping others tends to be good for them, at least temporarily.
  • Helping people can prevent them from helping themselves, and from growing.
  • Helping something exist which wouldn't exist without your help is to get in the way of natural selection, which over time can result in many groups who are a net negative for society in that they require more than they provide. They might also remain dependent on others.
  • Finally (and I expect some people to disagree with this), I think that moral good is a luxury. Luxuries are pleasant, but expensive, so when you engage in more luxury than you can afford, it stops being sustainable. And putting luxuries above necessities seems to me a good definition of decadence.

Everything has dose-dependent and context-dependent pros and cons.

I think you're expecting too much of the word "good". I don't think any "good" exists such that more of it is always better, so I think "good" is a region of space rather than a direction. If optimization is gradient descent, then the "good" direction might change with every step you take. But if optimization means "what metric should we optimize for?", then we don't know (we have yet to find a single metric which an AGI could maximize without destroying humanity; heading too far in any direction seems dangerous). So I think many people's intuition of the word "good" can prevent them from ever hitting a satisfactory answer (as they're actually searching for something which can be taken to infinity without anything bad happening as a result, and not even considering the context in question).
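A toy illustration of "good as a region" (the interval [2, 8] is arbitrary):

```python
# Below the region, more x is better; past it, more x is worse; inside it,
# "more" gains nothing. The gradient flips as you pass through.
def goodness(x: float, low: float = 2.0, high: float = 8.0) -> float:
    if x < low:
        return x - low   # negative: too little
    if x > high:
        return high - x  # negative: too much
    return 0.0           # inside the region: already good

print(goodness(1.0), goodness(5.0), goodness(9.0))  # -1.0 0.0 -1.0
```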

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-09T01:19:16.839Z · LW · GW

That sounds about right. And "people sometimes feel that way" is a good explanation for the downvote in my opinion. I was arguing the object-level premises of the post because the "disagree" downvote was factually wrong, and this factual wrongness, I argue, is caused by a faulty understanding of how truth works, and this faulty understanding is most common in the western world and in educated people, and in the ideologies which correlate with western thought and academia.

If you disagree with something which is true, I think the only likely explanations are "Does not understand" and "Has a dislike of", and the bias I pointed out covers both of these possibilities (the former is a "map vs territory" issue and the latter is a "morality vs reality" issue).

I think you figured out what went wrong nicely, but in the end the disagreement remains. I still consider my point likely. If somebody comes along and tells me that they disagreed with it for other reasons, I might even argue that they're lying to themselves, as I'm way too disillusioned to think that a "will to truth" exists. I think social status, moral values and other such things are stronger motivators than people will admit even to themselves.

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-08T05:28:30.635Z · LW · GW

I referred to that too (specifically, the assumption). By "true" I meant that the bias which I think is to blame certainly exists, not that it was certain to be the main reason (but I'd like to push against this bias in general, so even if this bias only applies to some of the people who see my comment, I think it's an important topic to bring up, and that it likely has enough indirect influence to matter).

To address your points:

1: Of course it's mixed. But the mixed advice averages out to be "wise", something generally useful.
2: I think it's necessarily trial and error, but a good question is "does the wisdom generalize to now?". 
3: This of course depends on the examples that you choose. A passage on the ideal age of marriage might generalize to our time less gracefully than a passage on meditation. I think this goes without saying, but if we assume these things aren't intuitive, then a proper answer would be maybe 5 pages long.
4: Would interpreting it as "negative" not mean that it has been misunderstood? That one can learn without understanding is precisely why they could prosper with a level of education which pales to that of modern times. We learned that bad smells were associated with sickness way before we discovered germs. If our tech requires intelligence to use, then the lower quartile of society might struggle. And with the blind approach you can use genius strategies even if you're mediocre.

5: Along with 4, I think this is an example of the bias that I talked about above. What we think of as "real" tends to be sufficiently disconnected from humanity. Religion and traditional ways of living seem to correlate with mental health, so the types of people who think that wealth inequality is the only source of suffering in the world are too materialistic and disconnected. Not to commit the naturalistic fallacy, but nature does optimize in its own way, and imitating nature tends to go much better than "correcting" it.

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-08T00:57:43.724Z · LW · GW

You don't think the entire western world is biased in favor of science to a degree which is a little naive? In addition to this, I think that people idolize intelligence and famous scientists, that they largely consider people born before the 1950s to have repulsive moral values, that they dislike tradition, that they consider it very important to be "educated", that they overestimate book smarts and underestimate the common sense of people living simple lives, and that they believe that things generally improve over time (such that older books are rarely worth bothering with), and I believe that social status in general makes people associate with newer ideas over older ones. There are also a lot of people who have grown up around old, strict and religious people and who now dislike them. It doesn't help that more intelligent people are higher in openness in general, and that rationalism correlates with a materialistic and mechanical worldview.

Many topics receive a lot more hostility than they deserve because of these biases, usually because they're explained in a crazy way (for instance, Carl Jung's ideas are often called pseudoscience, and if you take the Bible literally then it's clearly wrong) or because people associate them with immorality (say, the idea that casual sex was disliked by traditional people because they were mean and narrow-minded, and not because casual sex caused problems for them, or because it might cause problems for us).

A lot of things are disliked or discarded despite being useful, and a lot of wisdom is in this category. All of this was packed into the message that "people dislike old things because they sound irrational or immoral" (people tend to dislike long comments).

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-08T00:07:35.417Z · LW · GW

I don't see it as unkind, and I don't think "trial and error" is a wrong explanation either. It seems very unlikely that ideas which are strictly harmful stick around for a very long time. So much so that it must necessarily tend in the other direction (I won't attempt to prove this though).

I'm good at navigating hypothesis space, so any difficulties are likely related to theory of mind of people who are very different from myself (being intelligent but out of sync in a way). Still, I don't buy the idea that people can't or shouldn't do this. You're even guessing at my intentions right now, and if somebody is going to downvote me for acting in bad faith, they'll also need to guess at my intentions. So this seems like a common and sensible thing to do in moderation, rather than an intellectual sin of sorts

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-07T23:32:12.834Z · LW · GW

They did answer the question, there's just a little bit of deduction required? I understood it at a glance and didn't even notice any typos. Situations in which agents can learn something without understanding the reasons behind what they learn are quite common; it's not a novel idea, it just raises a red flag in people who are used to scientific thinking. The general bias in society against tradition/spirituality/religion is too strong compared to the utility (even if not the correctness) of these three.

That useless extra text in my previous comment saves a future comment or two by taking things into account in advance. I even wrote the "I didn't understand the explanation" reaction above (as something one might have thought before downvoting the comment), so it's not that I didn't think of it, I just considered it an unlikely reaction, as I disagree with it.

Comment by StartAtTheEnd on Alexander Gietelink Oldenziel's Shortform · 2024-11-07T23:09:57.356Z · LW · GW

This seems like an argument in favor of:

Stability over potential improvement, tradition over change, identical offspring over mutation, settling in a local maximum over shaking things up, and specialization over generalization.

It seems like a hyperparameter. A bit like the learning rate in AI, perhaps? Echo chambers are a common consequence, so I think the optimal ratio of preaching to the choir is something like 0.8-0.9 rather than 1. In fact, I personally prefer the /allPosts suburl over the LW frontpage, because the first few votes result in a feedback loop of engagement and upvotes (forming a temporary consensus on which new posts are better, in a way which seems unfairly weighted towards the first few votes). If the posts chosen for the frontpage used the ratio of upvotes to downvotes rather than the absolute amount, then I don't think this bias would occur (conformity might still create a weak feedback loop though).
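To make the ratio point concrete, here's a toy comparison (vote counts invented):

```python
# Absolute score snowballs with exposure; the approval fraction doesn't.
posts = {"A": (40, 10), "B": (4, 1)}  # same 80% approval, 10x the exposure

def absolute_score(up: int, down: int) -> int:
    return up - down

def ratio_score(up: int, down: int) -> float:
    """Fraction of votes that are upvotes."""
    return up / (up + down) if up + down else 0.0

for name, (up, down) in posts.items():
    print(name, absolute_score(up, down), ratio_score(up, down))
# A 30 0.8
# B 3 0.8  (same ratio, so early engagement gives no head start)
```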

I'm simplifying some of these dynamics though.

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-07T19:04:56.310Z · LW · GW

I worded that a bit badly. I meant I had a hard time thinking of better (meaning kinder) explanations, not better (meaning more likely) explanations. Across all the websites I've been on in my life, I have posted more than 100,000 comments (resulting in many interactions), so while things like psychoanalyzing people, assuming intentions, and making stereotypes are "bad", I simply have too much training data, and too few incorrect guesses, not to do this. I do, however, intentionally overestimate people (since I want to talk to intelligent people, I give people the benefit of the doubt for as long as possible), but this means that mistakes are attributed to their intentions, personality or values, rather than to careless mistakes or superficial heuristics. In this situation, I've assumed that they're offended by the idea that traditional societies rival the scientific method in some situations. But it may be something more superficial, like "I find short comments to be effortless", "somebody else already said that" or "I didn't understand your explanation and I consider it your fault". But like I said in another comment, I remember the first downvotes being disagreements (red X) rather than regular downvotes, so I took it as meaning "this is wrong" rather than "I don't like this comment". Not that any of this matters very much, admittedly.

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-07T18:42:07.916Z · LW · GW

That makes sense, I just evaluated the comment in isolation. But I believe that the first few downvotes were "incorrect" votes (the red X) rather than regular downvotes (the down arrow), which is why the feedback occurred to me as simply mistaken (as the comment is not false).

I've noticed, by the way, that most comments posted tend to get downvoted initially and then return to 0 over time. There may be a few regular, highly active users with high standards or something, and more casual users with lower standards who balance them out over time. I've gone to -10 and back before.

Comment by StartAtTheEnd on The Case Against Moral Realism · 2024-11-07T18:19:18.688Z · LW · GW

I don't think good and evil are objectively real as moral terms, but if something makes us select against certain behaviour, it may be because said behaviour results in organisms deleting themselves from existence. So "evil" actually means "unsustainable". But this makes it situational (your sustainable expenditure depends on your income, for instance, so spending $100 cannot be objectively good or evil).

Moral judgments vary between individuals, cultures and societies


Yes, and which actions result in you not existing will also vary. There's no universal morality for the same reason that there's no universal "best food" or "most fitting zoo enclosure", for "best" cannot exist on its own. Calling something "best" is a kind of shortcut; there are implicit things being referred to.
What's the best move in Tetris? The correct answer depends on the game state. When you're looking for "objectively correct universal moral rules" you might also be throwing away the game state on which the answer depends.

I'd go as far as to say that all situations where people are looking for universal solutions are mistaken, as there may (necessarily? I'm not sure) exist many local solutions which are objectively better in their smaller scopes. For instance, you cannot design a tool which is the best tool for fixing any machine; instead you will have to create hundreds of tools which are the best for each part of each machine. So hammers, saws, wrenches, etc. exist, and you cannot unify all of them to get something which is objectively better than any of them in any situation. But does this imply that tools are not objective? Does it not rather imply that good is a function taking at least two inputs (tool, object) and outputting a value based on the relation between the two? (A third input could be context, i.e. water is good for me in the context that I'm thirsty.)

If my take is right, then something like 80% of all philosophical problems turn out to be nonsense. In other words, most unsolved problems might be due to flawed questions. I'm fairly confident in this take, but I don't know if it's obvious or profound.

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-07T17:46:36.185Z · LW · GW

Yeah, I'm asking because downvotes are far too ambiguous. I think they're ambiguous to the point that they don't make for useful feedback (you can't update a worldview for the better if you don't know what's wrong with it). I don't think downvotes are necessarily bad as a concept, though. And about humanity - sure, and on any other website I'd largely have agreed with your view, but when I talk about intellectual things I largely push my own humanity to the side. And even if somebody downvotes because of irrational feelings, I'm interested in what those feelings are.

But I know that people on here frequently value truth, and I'm quite brutal to those values, as I think truth is about as valid a concept as a semicolon (the language is just math/logic rather than English). And if we are to talk about Truth with a capital T, then we're speaking about reality, which is more fundamental than language (the territory, reality, is important, but I rarely see any good maps, even on this website; so when Taoists seem to suggest throwing the map away entirely, I do think that's a good idea for everyday life - it's only for science, research and tech that I value maps). That makes me an outlier though, haha.

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-05T20:28:50.928Z · LW · GW

I'm curious why you were downvoted, for you hit the nail on the head. As short and concise answers go, yours is the best.

Does anyone know? Otherwise I will just assume that they're rationalists who dislike (and look down on) traditional/old things for moral reasons. This is not very flattering of me, but I can't think of better explanations.

Comment by StartAtTheEnd on Does the "ancient wisdom" argument have any validity? If a particular teaching or tradition is old, to what extent does this make it more trustworthy? · 2024-11-05T20:24:24.942Z · LW · GW

Ancient wisdom is not scientific, and it might even be false, but the benefits are very real, and these benefits sort of work to make the wisdom true.

The best example I can give is placebo: the belief that something is true helps make it true, so even if it's not true, you get the benefits of it being true. The special trait ancient wisdom has is this: the outcome is influenced by your belief in the outcome. This tends to be true for psychological things, and advice like "belief can move mountains" is entirely true in the psychological realm. But scientific people, who deal with external reality, tend to reject all of this and consider it nonsense, as the problems they're used to aren't influenced by belief.

Another case in which belief matters involves treating things with weight/respect/sacredness/divinity. These things are just human constructs, but they have very real benefits. Of course, you can be an obnoxious atheist and break these illusions all you want, but the consequence of doing so will be nihilism. Why? Because treating things as if they have weight is what gives them weight, and nihilism is basically the lack of perceived weight. There's nothing objectively valid about filial piety, but it does have benefits, and acting as if it's something special makes it so.

Ancient wisdom often gets the conclusions right but the explanations wrong, and this is likely in order to make people take the conclusions seriously. Meditation has been shown to be good for you. Are you feeling "Ki", or does your body just feel warm when you concentrate on it? Do you become "one with everything", or does your perception just discard duality for a moment? Do you "meet god", or do you merely experience a peace of mind as you let go of resistance? The true answer is the boring one, but the fantastical explanation helps make these ideas more contagious, and it's likely that the false explanations have stuck around because they're stronger memetically.

Ancient wisdom has one advantage that modern science does not: it can deal with things which are beyond our understanding. The opposite approach is dangerous: if you reject something just because you don't understand why it might be good (or because the people who like it aren't intellectual enough to defend it), then you're being rational in the map rather than in the territory. Maybe the thing you're dismissing is actually good for reasons that we won't understand for another 20 years.

You can compare this with money; money is "real but not real" in a similar way. And this all generalizes far beyond my examples, but the main benefits are found, like I said, in everything human (psychological and spiritual) and in areas in which the consensus has an incomplete map. I believe that nature has its own intelligence in a way, and that we tend to underestimate it.

Edit: Downvotes came fast. Surely I wrote enough that I've made it very easy to attack my position? This topic is interesting and holds a lot of utility, so feel free to reply. 

Comment by StartAtTheEnd on What are some good ways to form opinions on controversial subjects in the current and upcoming era? · 2024-10-29T02:40:20.751Z · LW · GW

While you could format questions in such a way that you can divide them into A and B in a sensible manner, my usual reaction to thought experiments which seem to make naive assumptions about reality is that the topic isn't understood very deeply. The problem with looking only at the surface (and this is mainly why average people don't hold valuable opinions) is that people conclude that solar panels, windmills and electric cars are 100% "green" without taking into account the production and recycling of these things. Many people think that charging stations for electric cars are green, but they don't see the coal power plant which supplies power to the charging station. In other words, "Does X solution actually work?" is never asked. Society often acts like me when I'm being neurotic: when I say "I will clean my house next week", I allow my house to stay messy while also helping myself forget the matter for now. But this is exactly like saying "We plan to be carbon neutral by 2040" and then doing nothing for another 5 years.

And yes, that does clarify things!

  • Valid, but knowing what's important might require understanding the problem in the first place. A lot of people want you to think that the thing they're yelling about is really important.
  • Then the axioms do not account for a lot of controversial subjects. I think the abortion debate also depends on definitions: "At how many weeks can the child be said to be alive?" "When is it your own body, and when is it another body living inside you?"
  • I'm afraid it doesn't. I believe that morality has little to no correlation with intelligence, and that truth has little to do with morality. I'd go as far as to say that morality is one of the biases that people have, but you could call these "values" instead of biases.

To actually answer your question, I think understanding human nature and the people you're speaking to is helpful. Also the advantages of pushing certain beliefs, and the advantages of holding certain beliefs.

If somebody grew up with really strict parents, they might value freedom, whereas somebody who lacked guidance might recognize the danger of doing whatever one feels like doing. And whether somebody leans left or right economically seems influenced by their own perceived ability to support themselves. One's level of pity for others seems to be influenced by one's own confidence, since there's a tendency to project one's own level of perceived fragility.

If you could measure a group's biases perfectly, then you could subtract them from the position the group holds (see the sketch below). If there are strong reasons to lean towards X, but X is only winning by a little bit, then X might not be true. You can often also use reason to find inconsistencies; I'd go as far as saying that inconsistencies are obvious everywhere unless you unconsciously try to avoid seeing them. Discrimination based on inherent traits is wrong, but it's socially acceptable to make fun of stupid people, ugly people, short people and weirdos? The real rule is obviously closer to something like "discrimination is only acceptable towards those who are either perceived to be strong enough to handle it, or those who are deemed to be immoral in general". If you think about it enough, you will likely find that most things people say are lies. There are also some who have settled on "it's all social status games and signaling", which is probably just another way of looking at the same thing. Speaking of thinking: if you start to deconstruct and question morality, you might put yourself out of sync with other people permanently, so you've been warned.
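
The subtraction itself is trivial; here's a minimal sketch with invented numbers:

```python
# Toy model, all numbers invented: observed support for X = evidence-driven
# support + the group's systematic bias toward X.
observed_support = 0.55  # X "wins by a little bit" in this group
estimated_bias = 0.15    # measured tendency to favor X regardless of truth

debiased_support = observed_support - estimated_bias
print(f"debiased support for X: {debiased_support:.2f}")  # 0.40 - weaker than it looked
```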

But the best advice I can give is likely just to read the 10 or so strongest arguments you can find on both sides of the issue and then judge for yourself. If you can't trust your own judgement, then you likely also can't trust your own judgement about whom you can trust to judge for you. And if you can judge this comment of mine, then you can likely judge people's takes on things in general; if you can't judge this comment of mine, then you won't be able to judge the advice you get about judging advice, and you're stuck in a sort of loop.

I'm sometimes busy for a day or two, but I don't think my replies will be delayed longer than that.

Comment by StartAtTheEnd on What are some good ways to form opinions on controversial subjects in the current and upcoming era? · 2024-10-27T18:57:32.342Z · LW · GW

I misread a small bit, but I still stand by my answer. It is, however, still unclear to me whether you value truth or not. You mention moral frameworks and opinions, but also sound like you want to get rid of biases? I think these conflict.

I guess I should give examples to show how I think:

  • Suppose that climate change is real, but that the proposed causes and solutions are wrong. Or that for some problem X, people call for solution Y, but you expect that Y will actually only make X worse (or be a pretend-solution which gives people a false sense of security and which is only adopted because it signals virtue)
  • Suppose that X is slightly bad, but not really worth bothering about; however, team A thinks that X is terrible, and team B thinks that X is the best thing ever.
  • Suppose that something is entirely up to definition, such that truth doesn't matter (for instance, whether X is a mental illness or not). Also, suppose that whatever definition you choose will be perceived as either hatred or support.
  • I don't think it's good to get any opinions from the general population. If actually intelligent people are discussing an issue, they will likely have more nuanced takes than both the general population and the media.
  • Let's say that personality trait X is positively correlated with both intelligence and sexual deviancy. One side argues that this makes such people good, another side argues that it makes them bad. Not only is this subjective, people would be confusing the utilitarian "good/bad" with the moral "good/bad" (easy example: breaking a leg is bad, but having a broken leg does not make you a bad person).

I think being rational/unbiased results in breaking away from society's opinions about almost everything. I also think that being biased is to be human. The least biased of all is reality itself, and a lot of people seem really keen on fixing/correcting reality to be more moral. In my worldview, a lot of things stop making sense, so I don't bother with them, and I wonder why other people are so bothered by so many things.

I might be unable to respond for a little while myself, sorry about that.

Comment by StartAtTheEnd on What are some good ways to form opinions on controversial subjects in the current and upcoming era? · 2024-10-27T17:55:08.734Z · LW · GW

I think it's often the case that neither A nor B is true. Common opinions are shallow, often simplified and exaggerated, or even entirely beside the point.
Now, you're asking what a good way to form opinions is, well, it depends on what you want.

Do you want to know which side you should vote for to bring the future towards the state that you want?
Do you want to figure out which side is the most correct?
Do you want to figure out the actual truth behind the political issue?
Do you want to hold an opinion which won't disrupt your social life too much or make you unpopular?

I expect that these four will bring you to different answers.

(While I think I understand the problem well, I can't promise that I have a good solution. Besides, it's subjective. Since the topic is controversial, any answer I give will be influenced by the very biases that we're potentially interested in avoiding)

By the way, personally, I don't care much what foreign actors (or team A and B) have to say about anything, so it's not a factor which makes a difference to me.

Edit: I should probably have submitted this as a comment and not an answer. Oh well, I will think up an answer if you respond.

Comment by StartAtTheEnd on how to rapidly assimilate new information · 2024-10-25T00:01:44.892Z · LW · GW

Most of my learning took place in my head, causing it to be isolated from other senses, so that's likely one of the reasons. In some of the examples I know of people forgetting other things, they did things like learning 2000 digits of pi in 3 days, which is exactly the kind of thing that doesn't really connect to anything else. So you're likely correct (at least, I don't know enough instances of forgetting to make any counter-arguments).

most of it isn't really useful at helping you address the problems that you're facing

This is a rather commonly known technique, but you can work backwards from the problems, learning everything related to them, rather than learning a lot and hoping that you can solve whatever problems might appear.

What I personally did, which might have been unhealthy, was wanting to fully understand whatever I was working with in general. So I'd always throw myself at material 5 years of study above what I currently understood. When introduced to the Bayes chain rule, I started looking into the nature of chain rules, wanting to know how many existed across mathematics and if they were connected with one another. Doing things like this isn't always a waste of time, though; sometimes you really can skip ahead. If you Google summaries of about 100 different books written by people who are experts in their fields or highly intelligent in general, you will gain a lot of insights into things.

Comment by StartAtTheEnd on Word Spaghetti · 2024-10-24T23:11:08.907Z · LW · GW

I have the same problem. I think my non-verbal IQ might be about 45 points above my verbal IQ, so that could be a factor. I also think mostly in concepts, since I'm afraid that thinking in words would blind me to insights which do not yet have words to describe them.

But translation from "idea in my mind" to "words that others can understand" is hard. I hear that information in the mind has a relational (mindmap) structure, while writing is linear and left-to-right. So the data structures are quite different.
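
A toy model of that mismatch (the graph and the ordering below are invented): treat ideas as a graph and an essay as one linear walk through it; every link whose endpoints don't end up adjacent in the walk has to be re-established with explicit signposting.

```python
# Ideas as an undirected graph (a mindmap); prose as one linear ordering of them.
ideas = {
    "maps": {"territory", "models", "language"},
    "territory": {"maps", "reality"},
    "models": {"maps", "prediction"},
    "language": {"maps", "prediction"},
    "reality": {"territory"},
    "prediction": {"models", "language"},
}

order = ["maps", "territory", "reality", "models", "prediction", "language"]
position = {idea: i for i, idea in enumerate(order)}

# Count the links whose endpoints aren't neighbours in the linear order -
# these are the cross-references prose must restate ("as mentioned above...").
edges = {frozenset((a, b)) for a, nbrs in ideas.items() for b in nbrs}
broken = [tuple(e) for e in edges
          if abs(position[tuple(e)[0]] - position[tuple(e)[1]]) > 1]
print(f"{len(broken)} of {len(edges)} links don't survive linearization")
```

No ordering of a richly connected graph keeps every link adjacent (a linear order only has n-1 adjacent pairs), which is one way of stating why the translation is hard.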

I'm autistic, which harms my ability to communicate. I also tend to create my own vocabulary, and to use grammar in a mathematical sense. I might add "un-" or "-izable" affixes to words which shouldn't have them, or use set-builder notation in my personal notes, even if they contain no mathematics at all. This causes me to have my own efficient symbolic language which is incompatible with other people's models/associations/tokens.

There are two other things I try to avoid:

1: Subvocalization (it slows me down)
2: Explaining things to myself. I know what I mean, always. If I catch myself thinking to myself as if other people were listening, I stop. Is this a natural habit meant to improve communication, or is it caused by trauma and fear of being misunderstood (like imagining social scenarios while in the shower)? I imagine that it causes a dramatic reduction in thinking speed, even if you get the benefits of rubberducking.

In short, I'm guessing that people with high verbal intelligence, and those who tend to think purely in words, don't have much difficulty writing. I don't have any contrary evidence in any of my memories, so I will believe this for now.

Comment by StartAtTheEnd on how to rapidly assimilate new information · 2024-10-24T22:50:02.136Z · LW · GW

I will not argue that any of what you said is wrong, because I don't believe it is, but I've personally found that learning too much too fast makes me sick. "Rapid" learning may be fine, but anything faster could have serious trade-offs. Consuming and digesting food is similar enough to consuming and digesting knowledge that many intuitions carry over (like nausea from overeating, or getting tired of eating the same thing for too long, etc.).

When cramming for exams, I would sometimes go through 4000 pages in about two weeks, and it would result in a sort of confusion and nausea, and I'd have lots of loose ends and scattered thoughts floating around. Now, I didn't always do the exercises like I should have, and my learning was more theoretical than practical, so it may just be that I didn't finish anything before moving on to the next part, leaving my knowledge unsolidified. So "sort of understanding" is definitely not a good stage to stop at; that's my mistake, and most people here probably know better than to do that.

However, I've heard of people trying really hard to study or remember something for hours a day, and then forgetting other important events going back something like two weeks - memories of last weekend just disappearing and such. I'm not sure whether older knowledge is at risk (whether you can accidentally erase important things if you're too aggressive in your learning).

Maybe some people on here have stories to share? Not that it's likely. You need to be really low in conscientiousness to be as unstructured as I am: to have a messy desk, a messy house, messy notes, and to be obsessed enough with something that you forget to eat, or forget whether it's currently morning or evening. And people like 'us' don't fit in here, since we avoid "tedious" things, leading to messy and informal writing, and leading us to avoid knowledge which doesn't interest us but which is relevant if one wants to write an article on a subject. Perhaps med students have enough of a workload to understand the consequences of excessive learning, but I don't know how many of them use this site.

Apologies if your areas of interest don't extend to what I'm discussing here.

Comment by StartAtTheEnd on What is autonomy? Why boundaries are necessary. · 2024-10-22T16:58:08.232Z · LW · GW

I thought this was quite obvious. It's why skin exists, and why internal bleeding is bad, and it's also why borders are a good idea. That last statement will probably make a lot of people angry with me, but that's because a moral ideal clashes with reality. I'm personally on the side of reality, since I know the consequences of opposing it.

There's a social hierarchy which looks a bit like the following:

Self > Family > Friends > Community > Country > Nation > World

And these layers of separation protect the inside against negative change or entropy or whatever. This website also has its own border, which I consider a good thing (as much as society wants to convince me that gatekeeping is a sin).

What I hope society will realize soon is that dissolving these boundaries can have terrible consequences. I do include top-down moderation in that (as inheriting rules from upper layers interferes with the agency of lower layers). For instance, if this website has to censor certain information because an authority 2 or 3 levels further up in the hierarchy demands it, then this website will lose some of its agency and thus its individuality (uniqueness/difference). I consider it a danger when structures impose moral criteria on entities further down the hierarchy. "Thought crimes" are one example, but another is when Mojang says "You cannot allow swearing on your Minecraft realm server", or when Google says "Your website has offensive content, we need to delist it", or when the world tells Japan "Strong borders are immoral".

Now, you could make a case for the upper layers rejecting something inside them for a good reason (like the body expelling harmful things). For instance, society might tell parents "You can generally parent how you want, but violence against children is not allowed". But in this instance, we could simply justify the rule by saying "because harming children compromises their agency".

Interference seems to be harmful in general, though - by which I mean this seems like an axiomatic truth. The best teacher tries to aid the growth of students, but does not try to control the direction of growth. Being controlling in relationships is also harmful. I've heard this described as "don't mess with other people's destiny". There are also Daoist ideals which say not to interfere. And what's the best advice for people with social anxiety? "Just be yourself". In short, people screw up when they try to control themselves too much rather than letting go and letting things flow naturally. It's as if everything in life evolves in a good direction by itself as long as you don't mess with it. Even some meditation techniques can be described as "shut up and just listen/observe". Tyrannies impose excess control. Communism might have failed because of similar issues with control versus letting people (or complex systems) organize themselves.

TL;DR: Interference might be, in a mathematical sense, harmful, because it prevents... something akin to self-organization/self-assembly. (Or perhaps because "inheriting" restrictions from every upper layer cuts off too many possibilities for the lower layers, like when the educational system fails to foster each student's individuality because it only fosters what students have in common (∪ vs ∩).) And somehow, borders (agency) seem to help against this problem.
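
The ∪ vs ∩ point can be made concrete with toy skill sets (invented for illustration):

```python
# Toy skill sets for three students.
students = {
    "a": {"math", "music", "writing"},
    "b": {"math", "chess", "drawing"},
    "c": {"math", "music", "athletics"},
}

shared = set.intersection(*students.values())  # what one-size-fits-all schooling fosters
combined = set.union(*students.values())       # what fostering individuality preserves

print(shared)    # {'math'} - adding more students can only shrink this set
print(combined)  # six distinct skills across just three students
```

Every extra layer of inherited restriction is another set to intersect with, so what's permitted can only shrink as you go down the hierarchy.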

Comment by StartAtTheEnd on Nathan Young's Shortform · 2024-10-15T11:01:36.696Z · LW · GW

I think it's necessarily true, given the statistical distribution of things. If I say "there are necessarily fewer people with PhDs than with master's degrees, and fewer with master's degrees than college graduates", you'd probably agree.

The theory that "if you understand something, you can explain it simply" is mostly true, but this does not make the thing easy to understand, as simplicity is not ease (just try to explain enlightenment / the map-territory distinction to a stupid person). What you understand will seem trivial to you, and what you don't understand will seem difficult. This is just the mental representation of things getting more efficient as we build mental shortcuts and get used to patterns.

Proof: there are people who understand high-level mathematics, so they must be able to explain these concepts simply. In theory, they should be able to write a book of these simple explanations which even 4th graders can read. Thus, we should already have plenty of 4th graders who understand high-level mathematics. But this is not the case; most 4th graders are still 10 years of education away from understanding things on a high level. Ergo, either the initial claim (that what you understand can be explained simply) is false, or else "explained simply" does not imply "understood easily".

The excessive humility is a kind of signaling, or a defense mechanism against criticism and excessive expectations from other people, and it's rewarded because of its moralistic nature. It's not true; it's mainly pleasant-sounding nonsense originating in herd morality.

Comment by StartAtTheEnd on Overview of strong human intelligence amplification methods · 2024-10-15T10:40:44.254Z · LW · GW

Right, I agree with that.

A right shift by 2SDs would make people like Hawking, Einstein, Tesla, etc. about 100 times more common, and make it likely that a few people 1-2SDs above them will appear soon. I think this is sufficient, but I don't know enough about human intelligence to guarantee it.
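
A quick tail-ratio check supports this (a sketch, assuming IQ stays normally distributed and the whole curve shifts 2 SDs to the right; the ~100x figure corresponds to roughly a +3 SD cutoff, and the factor grows quickly above that):

```python
from scipy.stats import norm

# How much more common does a +k SD person (by today's norms) become
# if the whole distribution shifts 2 SDs to the right?
for k in (3, 4, 5):
    before = norm.sf(k)      # fraction above the threshold now
    after = norm.sf(k - 2)   # fraction above the same absolute threshold after the shift
    print(f"+{k} SD: {after / before:,.0f}x more common")
# prints roughly 118x, 718x and 4,709x respectively
```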

I think it depends on how the SD is increased. If you "merely" create a 150-IQ person with a 20-item working memory, or with an 8SD processing speed, this may not be enough to understand the problem and to solve it. Of course, you can substitute with verbal intelligence, which I think a lot of mathematicians do. I can't rotate 5D objects in my head, but I can write equations on paper which can rotate 5D objects and get the right answer. I think this is how mathematics is progressing past what we can intuitively understand. Of course, if your non-verbal intelligence can keep up, you're much better off, since you can combine insights from any area of life and get something new out of it.

Comment by StartAtTheEnd on Overview of strong human intelligence amplification methods · 2024-10-15T02:05:58.130Z · LW · GW

You're correct that the average IQ could be increased in various ways, and that increasing the minimum IQ of the population wouldn't help us here. I was imagining shifting the entire normal distribution two SDs to the right, so that those who are already +4-5SDs would become +6-7SDs.

As far as I'm concerned, the progress of humanity stands on the shoulders of giants, and the bottom 99.999% aren't making much of a difference.

The threshold for recursive self-improvement in humans, if one exists, is quite high. Perhaps if somebody like von Neumann lived today it would be possible. By the way, most of the people who look into nootropics, meditation and other such things do so because they're not functional, so in a way it's a bit like asking "Why are there so many sick people in hospitals if it's a place for recovery?" - though you could make the argument that geniuses would be doing these things if they worked.

My score on IQ tests has increased by about 15 points since I was 18, but it's hard to say whether I succeeded in increasing my intelligence or whether it's just a result of improving my mental health and actually putting a bit of effort into my life. I still think that very high levels of concentration and effort can force the brain to reconstruct itself, but that this process is so unpleasant that people stop once they're good enough. (For instance, most people can't read all that fast, despite having read texts for thousands of hours. But if they spend just a few weeks practicing, they can improve their reading speed by a lot, which kind of shows how improvement stops once you stop applying pressure.)

By the way, I don't know much about neurons. It could be that 4-5SD people are much harder to improve, since the ratio of better states to worse states is much lower.

Comment by StartAtTheEnd on Nathan Young's Shortform · 2024-10-15T01:47:43.476Z · LW · GW

Gwern and Scott are great writers, which is different from writing great things. It's like high-purity silver rather than rough gold, if that makes sense.

I do think they write a lot of great things, but not excellent things. Posts like "Maybe Your Zoloft Stopped Working Because A Liver Fluke Tried To Turn Your Nth-Great-Grandmother Into A Zombie" are probably around the limit of how difficult an idea somebody can communicate while retaining some level of popularity. Somebody wanting to communicate ideas one or two standard deviations above this would find themselves in obscurity. I think there are more intelligent people out there sharing ideas which don't really reach anyone. Of course, it's hard for me to provide examples, as obscure things are hard to find, and I won't be able to prove that said ideas are good, for if they were easy to recognize as such, then they'd already be popular. And once you get abstract enough, the things you say will basically be indistinguishable from nonsense to anyone below a certain threshold of intelligence.

Of course, it may just be that high levels of abstraction aren't useful, leading intelligent people towards breadth and expertise with the mundane, rather than down rabbit holes. Or it may be that people give up attempting to communicate certain concepts in language, and just attempt to show them instead.

I saw a biologist on here comparing people to fire (as chemical processes) and immediately found the idea familiar as I had made the same connection myself before. To most people, it probably seems like a weird idea?

Comment by StartAtTheEnd on Overview of strong human intelligence amplification methods · 2024-10-14T22:09:03.770Z · LW · GW

Short note: We don't need 7SDs to get 7SDs.

If we could increase the average IQ by 2SDs, then we'd have lots of intelligent people looking into intelligence enhancement. In short, intelligence feeds into itself; it might be possible to start the equivalent of an AGI intelligence explosion in humans.

Comment by StartAtTheEnd on Advice for journalists · 2024-10-13T05:55:04.288Z · LW · GW

Generally, some of the ideas here are still potentially useful; they just don't get you any guarantees.

When I say "there's nothing you can do about journalists screwing you over", I mean it like "there's nothing you can do about the police screwing you over". In 90% of cases, you probably won't be screwed over, but the distribution of power makes it easy for them to make things difficult for you if they hate you enough. Another example is "unprotected Wi-Fi isn't secure": you can use McDonald's internet for your online banking for years without being hacked, so in practice you're only a little insecure, but the statement "it's insecure" just means "whether or not you're safe no longer depends on yourself, but on other people's intentions".

From this perspective, I'm warning against something which may never even happen, merely because a bad actor could exploit these attack vectors. I'm also speaking very generally, in a larger scope than just LessWrong users talking to journalists. This probably adds to the feeling of our conversations being disconnected.

But I will have to disagree about nobody being naive. When two entities interact, and one of them is barely making an effort to please the other party, it's because of a difference in power. A small company may go out of its way to help you if you call its customer support line, whereas even getting in touch with a website like Facebook (unless it's through the police) is genuinely hard.

The post says "Journalists exist to help us understand the world. But if you are a journalist, you have to be good enough to deserve the name", which seems to mean "if you're going to trade, you need to provide something of value yourself, like offering a service". I think this is true for journalists as individuals, but not for companies which employ journalists. If these people won't treat you with respect, it's because they don't have to, and arguing with them is entirely pointless, even if you're right. Nothing but power will guarantee a difference, and if a journalist treats you kindly it's probably because they have integrity (which is one of the forces capable of resisting Moloch).

Repeating myself a bit here, but hopefully I made my position clearer in the process.

Comment by StartAtTheEnd on sarahconstantin's Shortform · 2024-10-12T11:57:53.543Z · LW · GW

You seem to dislike reality. Could it not be that the worldview which clashes with reality is wrong (or rather, in the wrong), rather than reality? For instance, that "nothing is forever" isn't a design flaw, but one of the required properties that a universe must have in order to support life?

Comment by StartAtTheEnd on Advice for journalists · 2024-10-12T07:08:11.868Z · LW · GW

Right, a constraint is power, and this constraint is actually the most important one. In case of a power imbalance, though, there's nothing the weaker party can really do but rely on the goodwill of the other party. It's the stronger party's choice how things work out, to the extent that the game board favors them.

If the journalist isn't too powerful, and if they benefit from listening to you, and they're not entirely obsessed with pushing a narrative which goes against your interests or knowledge, then things are favorable and more likely to turn out well.

My argument is that we can consider these things (power difference, alignment of views, the good/bad faith of the journalist in question, etc.) as parameters, and that the outcome depends entirely on these parameters, and not on the things that we pretend are important.

Is it, for instance, good advice to say "word yourself carefully so that you cannot be misinterpreted"? For all the effort Jordan Peterson put into this, it didn't do much to help his reputation.

Reputation, power and interests matter; they are the real factors. Things like honesty, truthfulness, competence and morality are the things that we pretend matter, and it's even a rule that we must pretend they matter, as breaking the fourth wall (as I'm doing here) is considered bad taste. But the pretend-game gets in the way of thinking clearly. And I think this "advice for journalists" post was submitted in the first place because somebody noticed that the game being played didn't align with what it was "supposed" to be. The reason they noticed is that journalists aren't putting much effort into their deception anymore, which is because the balance of power has been skewed so much.

Comment by StartAtTheEnd on Advice for journalists · 2024-10-11T01:55:26.436Z · LW · GW

If it's not about truth value, then it's not about misinformation. It's more about manipulation and the harmfulness of certain information, no?

My point is about the imperfections/limitations of language. If I say "the vaccine is safe", how safe does it have to be for my statement to be true? Is a one-in-a-million risk a proof by contradiction, or is it evidence of safety? Where's the cut-off for 'safety'?

I do think fighting "bad-faith manipulation" is doable at times, but I don't think you can label anything as being true/false for certain.

Another point, which I should have mentioned earlier, is that removing false information can be harmful. Better to let it stay along with the counter-arguments which are posted, so that observers can read both sides and judge for themselves. Believing in something false is a human right. Imagine, for instance, if believing (or not believing) in God was actually illegal.

Comment by StartAtTheEnd on Nathan Young's Shortform · 2024-10-11T00:23:14.348Z · LW · GW

Why does it confuse you? The attention something gets doesn't depend strongly on its quality, but on how accessible it is.

If I get a lot of karma/upvotes/thumbs/hearts/whatever online, then I feel bad, because it means I probably wrote something poor.
My best comments are usually ignored, with the occasional reply from somebody who misunderstands me entirely, and the even rarer event that somebody understands me (this type is usually so aligned with what I wrote that they have nothing to add).

The nature of the normal distribution makes it so that popularity and quality never correlate very strongly. This is discouraging to people who do their best in some field in the hope that they will be recognized for it. I've seen many artists troubled by this as well: everything they consider a "masterpiece" is somewhat obscure, while most popular things go against their taste. An example that many can agree with is probably pop music, but I don't think any example exists which more than 50% of people agree with, because then said example wouldn't be popular in the first place.
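
A toy simulation of that intuition (all parameters are invented, and `statistics.correlation` needs Python 3.10+): if popularity loads mostly on accessibility and only weakly on quality, the most popular items end up above average in quality, but nowhere near the best.

```python
import random
import statistics

random.seed(0)
n = 100_000
quality = [random.gauss(0, 1) for _ in range(n)]
accessibility = [random.gauss(0, 1) for _ in range(n)]
# Assumed weighting: popularity is 80% accessibility, 20% quality.
popularity = [0.2 * q + 0.8 * a for q, a in zip(quality, accessibility)]

print(statistics.correlation(quality, popularity))  # weak, ~0.24

top100 = sorted(range(n), key=popularity.__getitem__)[-100:]
print(statistics.mean(quality[i] for i in top100))  # ~+0.8 SD: good, not masterpiece-level
```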

Comment by StartAtTheEnd on Advice for journalists · 2024-10-11T00:10:09.483Z · LW · GW

But things like that happen all the time, and most things that people know about most topics are superficial, meaning that they've only heard the accusations, and will only encounter the correction if they care to have a conversation about the topic. If the topic is politically charged, and these people spend time in politically biased communities, then it's unlikely that anyone is going to show them the evidence that they're wrong. You're not incorrect, but think about the ratio of rationalists to non-rationalists - the reach of the media versus the number of people who will bother to correct those who don't know the full story.

It would also be easy for the website in question to say "You've been accused of doing X, which is bad. We don't tolerate bad behaviour on our platform" and ban you before you get to defend yourself. If the misunderstanding is bad enough, online websites can simply decide that even talking about you, or "defending you", is a sign of bad behaviour. (I think this sort of happened to Kanye West because he had a manic episode in which he communicated things which are hard to understand and easy to misunderstand.)

we could collectively keep one wiki page containing all of this

There's a Wikipedia page on "Gamergate", written largely by people who don't know what happened. And there's a "Gamergate Wiki" with tons of information (44 pages) with every detail documented in chronological order. I want to ask you two questions about this Wiki with the "other side of the story":

1: Have you ever heard of it?
2: Can you even find it? (the only link I have myself is an archived page)

Comment by StartAtTheEnd on Advice for journalists · 2024-10-10T23:57:15.045Z · LW · GW

"Fighting misinformation" often means "Rejecting views which goes against ones own political narrative". Scientists care the most about truth, and they aren't afraid of challenging and questioning what is known. People heavily invested in politics don't actually care all the much about truth, they just pretend to do so because it sounds noble. The "truthseeking" kind of person is a bit of a weirdo, not a lot of them exists.

It's sad that the problem has gotten bad enough that even people on here have recognized it, but it's nice not seeing comments which essentially say "Official sources are untrustworthy? That's a bold claim. Give me evidence - from a source that I consider official and trustworthy, of course."

But I really want to point out that it's mathematically impossible to combat falsehood, and that it doesn't matter if you call it "misinformation" or "disinformation". The very approach fundamentally misunderstands how knowledge works.

1: Science is about refining our understanding, which means challenging it rather than attacking anyone who disagrees. It must be "open to modification" rather than "closed".
2: It's impossible to know if there are any unknown unknowns that one is missing. Absolute certainty does not seem to exist in knowledge.
3: Many things depend on definitions, which are arbitrary. "Is X a mental illness?" is decided not by X, but by a person's relationship to X.
4: Any conversation which has some intellectual weight is going to be difficult, and unless you can understand what the other person is saying, you cannot know if they're wrong.
5: Language seems to have a lot of relativity and unspoken assumptions. If you say "death is bad", you may mean "for a human, the idea of death is uncomfortable if it prevents them from something that they're capable of doing". Arguing against "death is bad" is trivial: "if death didn't exist, neither would the modern man, for we evolved through Darwinism".
6: Those with only superficial understandings always outnumber those who know better. Consensuses favor quantity over quality, but those who are ahead of the rest must necessarily be a minority who possess obscure knowledge which is difficult to communicate. I'd go as far as saying that getting average people involved in science was a mistake. 99% of people aren't knowledgeable enough to understand the vaccine, so their position on the matter depends on the political bias of the authority they trust, which makes everything they have to say about the topic worthless - except of course statements like "the government stated X, now they're saying Y, so they either lied in the past or they're lying now", which is just basic logic.

Comment by StartAtTheEnd on Advice for journalists · 2024-10-09T20:52:48.717Z · LW · GW

If the journalist acts in good faith, I think you will be alright. If not, there's nothing you can do, whatsoever.

Coming up with reasons is almost too easy:
1: The journalist can write an article about you even if you've never talked to them
2: A journalist can start out trustworthy and then change for the worse (most untrustworthy authorities today grew powerful by being trustworthy; now that they've created their public image of impartiality and fairness, they can burn it for years - examples include Google and Wikipedia)
3: If you record me saying "I wouldn't say I'm very interested in cars", you can just cut out the first part of the video, and now you have me saying "I'm very interested in cars". If I quote another person - "X said that Y people are bad" - you could cut out just the part of me saying "Y people are bad". The deeper and more complex a subject you can get me to talk about, the easier it is to take me out of context. Making Jordan Peterson look bad is trivial, for instance.
4: Even if you have evidence that your words were twisted, you'll lose if your evidence can't reach other people. So if your values don't align with the average journalist's, or if your reputation is bad, you might find yourself relying on getting the word out by having a social media post go viral or something.

Personally, if I see a journalist or website treating anyone unfairly, I make a mental note that they're inherently untrustworthy. I'd contact such people only if they had a stake in releasing my story (so that our goals align). As you may imagine, my standards result in me not bothering with about 90% of society. I rarely attempt to solve problems like this, because I have solved them in the past and realized that the solution is actually unwanted (that many things are flawed on purpose, and not because they lack intelligent people to help fix them).

Things would be better if society as a whole valued truthfulness, and if winning directly (rather than with underhanded tricks) was associated with higher social status. These are the upstream changes I'd like to see in the world.

Comment by StartAtTheEnd on Advice for journalists · 2024-10-08T21:16:57.428Z · LW · GW

You can still lie by omission: allowing evidence that shows person A's wrongdoings, while refuting evidence that shows either person A's trustworthy behaviour or person B's wrongdoings.

If I do 10 things, 8 of which are virtuous and 2 of which are bad, and you communicate only those two to the world, then you will have deceived your listeners. Meanwhile, if another person does 8 things which are bad and 2 which are virtuous, you could share just the two virtuous ones. One-sidedness can be harmful and biased without a single lie (negative people tend to be in this group, I think, especially if they're intelligent).

A lot of online review sites are biased, despite essentially being designed to represent regular people rather than some authority which might lie to you. They silently delete reviews, selectively accuse reviews of breaking rules (holding a subset of them to a much higher standard, or claiming that reviews are targeted harassment by some socially unappealing group), add fake votes themselves, etc.

Comment by StartAtTheEnd on Advice for journalists · 2024-10-08T18:10:51.648Z · LW · GW

to make a public list of journalists who are or aren't trustworthy

This doesn't work, as you don't know if the list (or its creators) is trustworthy. This is a smaller version of an unsolvable problem (you need an absolute reference point but only have relative reference points). An authority can keep an eye on everything under its control, but it cannot keep an eye on itself: "Who watches the watchers?" This is why a ministry of truth is a bad idea and why misinformation is impossible to combat.
It's tempting to say that openness of information is a solution (that if everyone can voice their opinions, observers can come to a sound conclusion themselves), and while this does end better, you don't know if, for instance, a review site is deleting user reviews or not. (I just realized this is why people value transparency. But you don't know if a seemingly transparent entity is actually transparent or just pretending to be. You can use technology which is fair or secure by design, but authorities (like the government) always make sure that such technology can't exist.)

Comment by StartAtTheEnd on Information dark matter · 2024-10-03T06:43:55.979Z · LW · GW

I like the concept, but I don't want to make "information dark matter" any more accessible, and I think it's a bad idea to think about how it may be done. It seems like the kind of optimization which increases the legibility of society, which makes it easy to automate and exploit things, and, perhaps most importantly, allows an authority which doesn't want these things to exist to find them and remove them effectively.

I will not be able to defend this point very well, as I don't think words exist for most of the ideas that I have in mind. I can only vaguely point at it, with patterns which are possibly so abstract that they don't make sense to most people:

  • What's optimal is context-dependent, and thus local. If you read 100 books and write your own book, the resulting book will be smaller (and not only because redundancies have been removed, or because you've thrown away objectively wrong information). Because of a fractal-like structure, mutually exclusive information can exist as long as it's isolated. When you force interaction between different things, you basically destroy their differences, arriving at a more "general" result which is their average or their intersection. Enforcing any kind of consensus is therefore destructive.
  • I don't believe that sharing any positive thing really increases it. If anything, it probably makes for a more even distribution, or a faster exhaustion of what's good (everything good is a resource, and every resource is limited). In fact, the "goodness" of a resource is probably a byproduct of burning that resource, so that what we consider good is actually the destruction of good. If you watch a good movie, you're having fun, but if you watch it again you will find that it's less fun than the first time. I think your concept of "value" might have similar issues, and that "efficiency" might just mean "exploitability" or "step-size taken towards equilibrium". Nothing is entirely good or entirely bad, and I'd like to warn against optimizing towards any metric whatsoever. I think the alignment problem shows that we fundamentally misunderstand what optimization is. We cannot come up with a single metric to optimize for that doesn't result in the destruction of society, and I doubt this is merely because we're having trouble defining "ethical", "good", "moral", etc.
  • The idea that mixing things is good is antibiological, and people who come to such conclusions tend to be, from my own observation, transhumanist in some sense (weaker sense of shame, less need for personal space, dislike of discrimination, dislike of borders, little need for privacy, smaller amygdalas, etc.)
    We treat family differently than strangers, and we tell our friends information which we wouldn't tell others. This all creates a kind of "closedness", and I very much doubt that we evolved this tendency just because we're irrational beings who don't want a prosperous society. Even this community is a bit of a walled garden, and that's precisely because walls, or at least discriminatory access, are essential for the survivability of a system (and I do see the humor in the possibility of getting banned or rate-limited for this comment)
  • It's often more frustrating to interact with bigger companies, since you'll encounter bureaucracy and automated systems which never have the answer you're looking for. It's only when you can get in contact with another human being that the process tends to be pleasant. But mindsets in which optimization, globalization, automation and other such things are regarded as positive cause the world to tend towards a less biological/human state. Even though society is getting more connected, loneliness is on the rise, because the efficiency of the system is optimizing away human interactions and freedom. This is another reason I'd like to reduce the systemization of society.

Comment by StartAtTheEnd on Honest science is spirituality · 2024-09-25T16:12:13.708Z · LW · GW

That's alright! And I'm happy some of it resonated with you.

While science seems to solve a lot of problems, I think it creates new problems as it solves the old ones, and most of the problems we're trying to solve now wouldn't be as bad if they weren't amplified by technology. I think science will always both solve and create problems, and that this is unavoidable because science itself is unbiased and free-for-all, and the problems will get worse, since technology is a power-amplifier. (By the way, as technology allows for stronger tools over time, society needs more laws and restrictions in order to keep people from being able to harm one another, and if you try to design a video game in which players progress like this, you will notice that no player is going to enjoy the end-game.)

I also don't think we should give science the entire credit. The knowledge of humanity is mainly improved by a few extremely intelligent people (Einstein, Newton, Tesla, Hawking, and so on). Even before the scientific method, a few highly influential people accounted for most changes in the world. Most people's education consists of studying other people's discoveries and theories, with the hope that they can reach a level where they provide more benefit than harm. This almost makes common people sound superficial, but that's also because science is getting harder to use. 200 years ago you'd likely be alright if you could operate a shovel and a wheelbarrow, whereas you're at a disadvantage today if you can't make it through college.

It is impossible to understand the world. At best you can make a mental model which can predict it because, in some sense, your internal model is a bisimulation (I'm not sure if that's the right term, but the idea should be close). We refine theories so that they predict the world with less and less error. But the human brain already does this on its own, and intelligence isn't actually required, nor is understanding. It's just trial and error: you try things, and if their outcome is good, you keep them, and otherwise you get rid of them. But isn't this how Darwinism works? And how cultures and traditions originate? They don't need to know why something works, only that it does. We don't give tradition much credit because traditional explanations are irrational, but as far as pure results go, I think traditions do quite well. Most arguments against tradition aren't rational but moralistic, and society doesn't seem aware of this, even though morality and truth are in great conflict.

Besides the psychological consequences of understanding something (a kind of disillusionment), I think Moloch might be the greater danger. Moloch seems to me a result of legibility, and glorifying science makes people think that legibility (and order) are good in themselves, i.e. something where more is always better. This is not the case. I didn't yet know this when I wrote my previous comment, but the issue is known as "high modernism". Nassim Taleb has written about how forcing orderliness on society is dangerous, and it's my personal belief that a lack of legibility is what holds back Moloch. The game-theoretical collapse of society is only happening now, in the modern age, because the modern age has made it possible: by creating enough order and simplicity (or the illusion of these) that enough information is available for the dilemmas to become visible. And now that they're visible, you're either required to join them or to put yourself at a disadvantage by not joining them. I believe that online social media is unhealthy, whereas real-life socialization is often healthy, because the latter is more chaotic (literally) and has fewer visible metrics that one could start optimizing for.

To generalize, most undesirable mechanics in life are caused by excess legibility, and by the belief that some singular thing is good and ought to be optimized for. Rationality, legibility, morality, equality, happiness... whatever single metric we choose, the outcome will be terrible. The alignment problem in AI should perhaps teach us that focusing on anything singular is bad, i.e. that balance is key. But Taoists knew this more than 2500 years ago. I've often been told that religious people are just stupid, and that the category of people who speak of Heaven/Nature/GNON and warn against "playing god" doesn't include hyper-intelligent people, but from my limited experience with rationalism and science, this is wrong.

Of course, this community will largely disagree with me, since it believes that society is clearly improving while I believe it's clearly getting worse.