Thanks very much for your work in this area, and for being so willing to engage in this discussion. I'm personally disappointed that the original post got so much engagement and yet this excellent reply and follow up has not.
Thanks Nora. Your first example especially resonated with the kind of work I do, where we try to understand what the client wants and needs - often with limited background info, and when the client often struggles to articulate those wants and needs.
Great post. A naive thought is that this could be a useful analogy for understanding how complex systems are resistant to outside intervention, and why reform can seem so much harder than wholesale reinvention/disruption/destruction.
This cost of rules and restrictions seems highly underestimated. Rules and regulations crowd out a lot of private action. When we ask whether rules or private choices are most responsible for keeping us apart, we shouldn't neglect the full extent of that crowding out. Even without accounting for it, this new study finds private action is mostly responsible.
I think Zvi's skepticism about the proper role of government and the moral right of coercion has hardened into cynicism about leadership and state capacity being fundamentally insufficient to the task.
I would caveat that the primary data reported has almost no evidentiary value because of the small sample size (n = 41).
I feel that you have a separate issue beyond the existence of scope insensitivity as a phenomenon, which is that Yudkowsky made a value judgement when he labelled the phenomenon a product of systematic error. The article linked above describes how scope insensitivity differs from an unbiased utilitarian perspective on aid and concern (it is this latter approach that Yudkowsky would presumably consider correct):
In the specific case of valuations underlying public policy decisions, one would expect that each individual life at risk should be given the same consideration and value, which is a moral principle to which most individuals in western countries would probably agree. Nonetheless, intuitive tradeoffs and the limits of moral intuitions underlying scope insensitivity in lifesaving contexts can often lead to non-normative and irrational valuations (Reyna & Casillas, 2009).
I appreciate this post, and I think this is an insightful take on these much-discussed books and widely used phrase.
One ongoing (endless) tension in sustainability transitions - ie the study and acceleration of societal changes towards better social, environmental, and economic systems - is between improving our knowledge of "what works" through rigorous, centralised testing and evaluation vs. approaches that emphasise local knowledge, practices, and structures.
If you're interested, some of the terms that have been devised to try to grapple with this are "transdisciplinary"; "place-based approaches"; and "community-based participatory research".
Comment by areiamus on [deleted post]
Is there any way to have posts like these hidden from the lesswrong RSS feed?
Thanks Vipul. I agree that the time horizons that people at low personal risk are working on are very short, eg 2-4 weeks.
I would say that if you are in a highly secure position, then also schedule some time to explicitly reflect on your work and life thus far. Are you trying to solve the most important problems in your work? Are you lonely because the people you would otherwise spend time with aren't reaching out to you, or because you don't derive enough social support or enjoyment to make the effort of reaching out to them? Do you know how to rest if you don't have events and obligations to fill all your time?
I appreciate you raising this issue, Evan. The clarity with which you lay out the trade-off between instrumental and epistemic rationality especially brings into focus a sense of discomfort I have felt about a lot of the recent activity on LW critical of the CDC.
I think it's especially important to keep our egos small and remember that expertise does not generalise.
Yes. It was meant to imply a comparison set against which your post should be considered - e.g., if I read about 1-10 articles like yours every day, then your post was among the best of about 100-1000 (possibly an exaggeration for effect).
Please do not post links without any description or context. As an RSS subscriber I am unable to even see (or open) linkposts without opening the LW post in my browser. A description of the link (and ideally a repetition of the link in the post text) is very useful for helping readers understand why you linked it and who may wish to read it.
Meta: are you republishing this piece from somewhere else? I subscribe to LW (and EAF) with RSS and over the past few days I've had all of your previous posts inserted into my feed three times. Is this likely to be some issue with LW, or an integration with your personal blog?
It seems to me that there's a third key message, or possibly a reframing of #1, which is that people without power should be considered less morally culpable for their actions - eg, the Wells Fargo employees should be judged less harshly.
The concept of "human error" is often invoked to explain system breakdown as resulting from individual deficiencies (eg, early public discussion of the Boeing 737 MAX crashes had an underlying theme of "Ethiopian and Indonesian pilots are just not as skilled as American pilots") - but a human factors / resilience engineering perspective recognises that humans' roles in technical systems can be empowered or constrained by the system design. And of course it was other humans who designed (approved, built, ...) the system in the first place.