My default frame: some fundamentals and background beliefs

post by Pontor · 2020-11-10T00:04:46.519Z · LW · GW · 4 comments

Following John Nerst’s noble example: https://everythingstudies.com/2018/07/16/30-fundamentals

Only items after #18 are due to me; the rest are copied from the link above.

  1. I subscribe to scientific materialism as a tentative metaphysics. Physical reality is what there is, and everything in the universe is a manifestation, however indirect, of the laws of physics. Ideas about human-centric phenomena (thought, emotion, choice, morality) being ontologically fundamental aren’t, in my view, worth taking seriously other than as interesting social or psychological phenomena. It ought to go without saying that I still appreciate art and the humanities on their own terms as well as in naturalistic terms.
  2. Many of the things we believe are not facts, they’re narratives — stories that are interpretations and generalizations of many facts. Narratives translate a strange, bewildering world into mentalese by using the building blocks of thought: causation, intention, agents, relationships, importance, good and evil, promises, rights and obligations.
  3. The supposed facts that make up narratives can themselves be seen as sub-narratives, all in a fractal structure. The difference between a fact and a small narrative isn’t well defined. That’s because of how words work — they inherently contain different levels of generalization, interpretation and valuation. Thus, the structure and adoption of narratives shapes our collective worldviews and our societies, all the way up the chain from single words to full-blown paradigms. The struggle for the rights to shape them resembles war.
  4. The study of rhetoric as usually defined is far too limited. We focus on rhetoric as trying to convince people to make a specific decision, but the constant reshaping of our shared conceptual world by the process of discourse is a more important arena for rhetoric.
  5. The truth and accuracy of claims must take precedence over their social and political implications (in the public sphere, that is, because in personal situations tact is often more important). This is all for Kantian reasons: if we stop evaluating claims by their accuracy we can no longer trust them. Academic research that rejects this and sees itself as engaging in a political project (no matter how laudable) rather than a scientific one has severe legitimacy problems for this reason. Having a fundamentally political, strongly values-oriented way of seeing the world runs counter to the dispassionate rationality required for generating reliable knowledge.
  6. My personal disposition is often towards decoupling and I maintain that requests for it ought to be heeded.
  7. The human brain is a biologically evolved organ and nothing about human behavior makes sense except in the light of evolution. Human nature matters. Furthermore, sex differences exist and are not conjured out of nothing. Intelligence tests measure something real, and it is significantly heritable. As are many, many other traits. Denying all this fundamentally corrupts your capability to think about anything relating to human beings.
  8. Whether it’s personalities, behavior, history, technology, social arrangements or culture, there’s a certain level of freedom and contingency in how it develops. There are also restrictions of varying hardness that make some configurations and structures more stable and thus likelier than others. “It’s fixed” and “it’s arbitrary” are complementary partial narratives over a complex reality where the social world semi-contingently grows inside a cost-space defined by biology and material constraints. Easy peasy.
  9. In practice, when assigning responsibility for life outcomes, there are two poles: one being a high-agency stance that treats individual freedom and responsibility as absolute, and the other a low-agency stance that treats choices as constrained so much by social structures that responsibility is best placed on the system as a whole. I don’t have much of a dog in that fight and I think people who favor one of these approaches across the board are doing it wrong; they’re appropriate in different measures depending on the specific context.
  10. Humans lived in small hunter-gatherer groups for almost all of their evolutionary history. Agriculture and industry are late inventions that put humans in unnatural conditions, leading to many, many good things but also to a lot of problems, including large scale war, oppression, regimentation, boredom, alienation etc. Lots (not all) of social problems are both “innate” and “environmental” at once by being a product of our nature having to adapt to conditions it didn’t evolve to fit into.
  11. Way more social and political problems, dissatisfactions, internal dissonances etc. than we think stem from us living in very large (as opposed to small sub-Dunbarian) societies. When everyone no longer knows everyone else, we can’t rely on intuition to run society any more. Complicated issues will have to be managed, and we can’t just go by what feels right (or dismiss something because it doesn’t), because on this scale there are dynamics we don’t intuitively comprehend. The result has been the emergence of impersonal, formal institutions like laws, bureaucracy, currency, property and contracts — everything hippies hate. I sympathize. Really, I do. But it’s sheer size that demands those things, not artificial imposition. We can’t have large societies that “feel right”.
  12. War, violence, genocide, conquest, exploitation and poverty are not the products of modernity, capitalism or the industrial revolution. They are as old as civilization (some are older). Western civilization as an escape from a violent, oppressive, dirt-poor past is an extraordinary accomplishment, and it has every right to feel good about itself for that. Similarly, modernity was and is a great thing that’s led to prosperity beyond anyone’s wildest dreams, and we should bow down and thank it for what it has given us and continues to give us, before we start legitimately complaining about downsides that certainly do exist.
  13. Consensus morality is downstream from practicality. It does not grow into a predetermined shape dictated by some elusive “correct” ethical theory, instead it thrives and withers selectively based on the shape of the space available. We will not have morality that demands too much of us and we will not have morality that prevents us from picking low-hanging fruit. We find ways to justify the rules we need.
  14. Consequentialism, deontology and virtue ethics are not substitutes. They are complements, dealing with different facets of a morality full of internal tensions. The notion of a “true” ethical theory and strong moral realism makes no sense to me.
  15. Much of our thinking is half-conscious at best and our powers of introspection are way weaker than we think. Intuitions about our own motives and behaviors are not to be trusted, and powerful feelings are to be acknowledged but viewed with suspicion. Only by studying psychology (for understanding minds and emotions), economics (for understanding behavior), math/machine learning/cognition (for understanding properties, categories and concepts) and philosophy (for when the rest isn’t enough) can we even begin to understand what we’re thinking and why.
  16. I identify with nerds (especially of the STEM variety) and instinctively tend to take their side in any disagreement or conflict. However, I’ve noted that the closer I get to such nerds the less I feel like one of them.
  17. Ideologies describe the world but also work to reshape it in their own image. For that to yield good results an ideology must expect a little bit more of the world than it currently delivers, but not too much more, lest the cable pulling us forward breaks from the strain. Good ideology is practical but not cynical, optimistic but not utopian.
  18. Everyone arguing has a point, even if often not a very strong one. Charity and civility in disagreement are essential virtues; make an effort to understand under what conditions something seemingly wrong would make sense. People are more different in how they see the world than we think, and we should practice telling ourselves stories that make the other sympathetic.
  19. Every modern intellectual citizen ought to become familiar with at least some of the major ideas in the rationalist canon. This includes R:AZ, The Codex, Superforecasting, How to Measure Anything, Inadequate Equilibria, and Good and Real.
  20. News and social media have developed serious downsides for both individuals and groups. Until healthier social media and cleaner news sources are created and adopted, it is worth taking great personal pains to mitigate these downsides. The main reason people do not do this is status quo bias. Most people are very irresponsible about their mental exposure to media. Attention and awareness are among the very most valuable resources a person has.
  21. Our current civilization is a dystopia stuck in some inadequate equilibria. We know what better incentives and saner institutions [LW · GW] might look like, but we don't know how to put them in place.
  22. There is no guarantee that the arc of history will bend toward things we like. If it does, it will involve struggle and tribulation.
  23. Technological change in the 21st century is massively underappreciated in general discourse. As examples, take the current shift to remote work and the widespread mental health effects of social media. It is nearly impossible to predict these things, but having high uncertainty does not excuse status quo bias. There is an impending cyberpunk weirdtopia of unknown specifics, and this ought to loom large in the collective consciousness.
  24. Transhumanism is a reasonable generalization of standard humanism in light of technology. Transhumanism also raises genuine moral and aesthetic dilemmas that should not be ignored. Any discussion of ethics or philosophy in the modern age ought to be informed by transhumanist thought. Those that don’t are out-of-date. I consider transhumanist interventions to include brain emulation, AI, nanotechnology, genetic engineering, personalized medicine, embryo screening, cryonics, psychedelics, nootropics, orthodontics, organ transplants, cosmetic surgery, CPAP machines, contact lenses, antidepressants, steroids, vaccines, soap, the scientific method itself, advanced meditation, cognitive-behavioral conditioning, and fitness training.
  25. Contrary to what Wikipedia says, there is nothing pseudoscientific about cryonics. Mainstream science provides no strong reasons to think that cryonically vitrifying a person’s brain will not also vitrify their memory and personality with high fidelity--even if there are strong reasons why resurrection will require very powerful technology. Resurrection, if possible, is much more likely to be done via scanning and uploading rather than by thawing and resuscitation, you unimaginative dullards.
  26. Nature is neither kind nor loving. Not only did she decide to give us a taste for meat, reflexive hostility toward outsiders, and an egalitarian instinct in a world of deep inequalities, but we also got weird stuff like the Repugnant Conclusion of population ethics, Arrow’s impossibility theorem, the prisoner’s dilemma, and that goddamn Goodhart thing rearing its head everywhere. The human will comprises a thousand shards of desire [LW · GW], adapted for reproduction in the ancestral environment, hence the deep trickiness of the question, “what do I really want?”
  27. Morality is in the mind and moral realism is mistaken. “What should I want” collapses into “what do I want?” “What is good?” collapses into “what do I like?”
  28. There are no sacred values. Supposedly sacred values may be traded off against mundane values in what is called a “taboo tradeoff”. For example, the life of one versus the convenience of very many. Taboo tradeoffs are common but people and societies manage to avoid talking about them in public, hence the name.
  29. People’s intuitions around happiness and wealth are reliably wrong. This fact is well-established in the scientific literature and readily apparent to anyone who introspects, yet it is underappreciated in public discourse, policy questions, and philosophical discussions. It is worth acknowledging the Hedonic Treadmill every day of your life.
  30. Absolute poverty is more morally compelling than wealth inequality. People often conflate the two, and this muddies the waters, to the detriment of the absolutely poor.
  31. Private property rights, free(ish) markets, and a predictable legal environment are some of the most important factors in general prosperity. On the margin, pretty much every country would benefit from more of those things.
  32. Being raised with responsible spending habits is one of the most underrated privileges. Weak personal financial skills ought to be seen as a glaring deficiency. Excusing bad financial decisions of poor people is both condescending and actively harmful.
  33. Most of the ostensibly factual claims people make are largely attempts to--broadly speaking--manipulate others or game the social fabric. The attempt is being made by a subconscious (often semi-intentional) process which is outside of conscious control and is mostly adapted to the ancestral environment. I basically buy the claims of Robert Cialdini and Robin Hanson.
  34. People do not shop for education and healthcare using the commonly professed rubrics. This mismatch can be explained in some part by signaling, as in The Elephant in the Brain.

4 comments


comment by Pontor · 2020-11-23T02:14:48.440Z · LW(p) · GW(p)

Alright, I think it'll make me a more responsible intellectual citizen if I try to distinguish these items a bit based on how I expect to view them in a decade or two. Let's do it.

Well overall, I expect that my current attentional foci are substantially influenced by current news, political narratives, and intellectual fads. I look back at what things I was saying and paying attention to in 2010, and I see few major differences and hard reversals, but I do see a lot of noteworthy omissions, changes of emphasis, and different compressions.


I think (34) will be fairly obsolete in 15 years. I dunno how remote learning and telemedicine have impacted things in the wake of covid, but it's plausible to me that the signaling equilibria will change enough that (34) will at least sound like an outdated opinion.

(29-32) are fairly timeless, but I wouldn't be surprised if fads in news and politics change enough in 15 years that they seem like a questionable focus.

Gods, I hope (23-25) become less necessary to say in 15 years. How much of this incipient cyberpunk weirdtopia do folks need to experience before they expand their horizons a couple centimeters?

I anticipate (21) being painfully more relevant in only 10 years. Unless we somehow get a lot of lucky breaks in a row.

The toxic status quo around news and (social) media just seems entirely unsustainable to me. I expect (20) to be fully out-of-date in 10 years, for better and/or for worse.

It's hard to imagine changing my mind about (19) any time soon, but it's possible. Perhaps I'll want to change the list to include/exclude different works. Or maybe I'll update hard against the value of mainstream mindshare. I doubt it though. See my response to niplav's comment for the generator behind (19).

(To reiterate the disclaimer: items (1-18) were adopted unmodified from John Nerst's blog post)

I get the feeling that (5-9), (18), and maybe (12) and (16) will feel less relevant in 15 years than they do right now. I think their loading with certain culture-war-related valence makes them feel more relevant right now, which is probably partly why they are on Nerst's mind (and mine).

Okay, so that's the pre-hindsight about what I originally wrote. But what about things I omitted?

I could see a world 15 years from now where it looks utterly ignorant to not include a whole paragraph about privacy.

Developments around self-driving cars triggered a gout of Trolley Problem memes. This hasn't actually been such a big deal, but I could imagine some other technology requiring a deep examination and refactoring of our moral intuitions. I tried to keep it pretty broad, but it's possible this refactoring will make my current list look a little weird.

Maybe China will be culturally ascendant in the next 20 years and I will feel the need to explicitly say something about individualism vs collectivism or something.

I might eventually be compelled to put more focus on lifestyle stuff. For example, I might dedicate several bullet points to the importance of diet, exercise, contemplative practice, work-life balance, and writing.

Some number of my family and friends will perma-die in the next 20 years, after which I may be compelled to push the cryonics stuff harder.

In the age of automation, I may feel the need to express niche opinions about economics and political philosophy. I do not yet know what these niche opinions might be.

Echo-chamber awareness, bad-faith detection, the principle of charity, asymmetrical weapons, and so on may become even more important as tools in my everyday epistemic toolkit. In contrast to the more eternal, abstract epistemic principles.

I hope not, but the need to resist Dark Side Epistemology [LW · GW] may become urgent and take up a few bullet points.

comment by niplav · 2020-11-10T11:02:21.523Z · LW(p) · GW(p)

Nice post.

I mostly agree, but this bit stood out to me:

  19. Every modern intellectual citizen ought to become familiar with at least some of the major ideas in the rationalist canon. This includes R:AZ, The Codex, Superforecasting, How to Measure Anything, Inadequate Equilibria, and Good and Real.

I am not sure what exactly you mean by "modern intellectual citizen". At the broadest, it could encompass all adults; at the narrowest, it would be limited to college professors & public intellectuals.

I also doubt that this is a productive method of raising the Sanity Waterline. We're here in a place where many people have had their minds pretty strongly changed by these texts, but reading e.g. the reviews of R:AZ on Amazon & Goodreads, I observe that many people read it, say "meh" and go on with their lives – a pretty disappointing result for reading ~2000 pages!

Furthermore, aren't sufficiently intellectual people already familiar with some of the ideas in the "rationalist canon", just by exposure to the scientific method? I think yes, and I also think that the most valuable aspect of these texts is not the ideas in and of themselves, but rather the type/structure of thinking they demand? (E.g. scout vs. soldier mindset).

Replies from: Pontor
comment by Pontor · 2020-11-10T15:43:23.677Z · LW(p) · GW(p)

Thanks, good questions. I had originally written "every responsible intellectual citizen" but that didn't feel quite right. I didn't want so much to morally condemn people who haven't read what I find important, but to highlight the fact that news of general intellectual progress does not seem to move as fast as news of progress in science. So I could forgive someone for not knowing about Fun Theory calculations nowadays, even if they were a circumspect philosopher in the 1980s--they've let themselves fall out of touch, but news travels slowly and communication has changed so it's not totally their fault.

  I also doubt that this is a productive method of raising the Sanity Waterline. We're here in a place where many people have had their minds pretty strongly changed by these texts, but reading e.g. the reviews of R:AZ on Amazon & Goodreads, I observe that many people read it, say "meh" and go on with their lives – a pretty disappointing result for reading ~2000 pages!

Yeah, you're probably right about the Sanity Waterline. I didn't know about those Amazon reviews though :[

 

  Furthermore, aren't sufficiently intellectual people already familiar with some of the ideas in the "rationalist canon", just by exposure to the scientific method?

Well, to illustrate my motivation here: I've occasionally made bets with my most infovorous coworkers, but they would always insist on even odds. I tried to explain odds ratios, loss aversion, and the approximately linear utility of small amounts of money, but of course that never worked. But when I'm hanging out with rationalists, this problem doesn't happen.
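
To make the odds-ratio point a bit more concrete, here is a minimal sketch of the expected-value arithmetic, with made-up probabilities and stakes (nothing here comes from the actual bets): when two people assign different probabilities to the same event, there is a whole range of odds at which both of them expect to profit, so defaulting to even odds is arbitrary and usually lopsided.

```python
# Minimal illustration of why non-even betting odds matter.
# All probabilities and stakes are hypothetical, chosen only for the example.

def expected_profit(p_win: float, stake: float, payout: float) -> float:
    """Expected profit for someone who risks `stake` to win `payout`
    and assigns probability `p_win` to winning the bet."""
    return p_win * payout - (1 - p_win) * stake

# Suppose I think an event is 80% likely and a coworker thinks it's 40% likely.
p_me, p_coworker = 0.80, 0.40

# At even odds (each stakes 10), I expect +6 while the coworker, who bets
# against the event and so wins with probability 0.6 by their own lights,
# expects only +2.
print(round(expected_profit(p_me, stake=10, payout=10), 2))            # 6.0
print(round(expected_profit(1 - p_coworker, stake=10, payout=10), 2))  # 2.0

# At 2:1 odds (I stake 20 against their 10), both of us still expect a profit,
# but the odds shift more of the expected surplus toward the skeptic.
print(round(expected_profit(p_me, stake=20, payout=10), 2))            # 4.0
print(round(expected_profit(1 - p_coworker, stake=10, payout=20), 2))  # 8.0
```

(The exact numbers don't matter; the point is just that "fair" odds depend on the size of the disagreement, not on a 50/50 default.)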

EDIT: Here's the same frustration from a different angle: suppose I have these three intellectual friends. Alice is a normal-ish physics student who likes to feed her extra-curricular curiosity mostly by reading and listening to Sean Carroll. Bob is a reddit junkie who watches Science YouTube and supplements with Sam Harris and Eric Weinstein's podcasts. Charlie is a tech worker who likes to read Vox for infotainment, and he's been exposed to a handful of EA ideas and a few SSC posts, all of which made him think "whoa, cool", but none of which made him slide down the rabbit hole. 
Maybe I can get Alice to make bets with me and to agree that anthropics is an important part of the frontier of philosophy, but for some reason she is still just so weak with futurism--she seems to still be leaning on Jetsons-style archetypes without realizing it. I can have serious, productive conversations with Bob about our coming cyberpunk weirdtopia, but when I bring up prediction markets, he--lacking the background knowledge about EV, odds ratios, and betting--doesn't really seem to get it. I'm arguing politics with Charlie, and I make reference to "the naive view of free will" and he asks me to stop and explain. Oh right, I think, and start looking for an alternate approach to what I'm trying to say.
Alice, Bob, and Charlie are all getting some relatively high-quality exposure to the scientific method in action, but whenever I talk to one of them, I end up thinking, gods, when will this concept become more widespread?

comment by TAG · 2020-11-11T23:29:04.952Z · LW(p) · GW(p)
  27. Morality is in the mind and moral realism is mistaken. “What should I want” collapses into “what do I want?” “What is good?” collapses into “what do I like?”

That's not implied by 13. 13 indicates that morality, while not fully realist, is defined at the group level.