Comments
Huh, you might want to get your virus-checker checked - it's just a link to a substack page
https://www.secretorum.life/p/life-on-the-grid-part-2
Very interesting, and yes I think I'm getting at something like that here as well.
No I think it does - almost like free-range foraging vs. being spoon-fed information (wild animal vs. domesticated). In the former you learn how to quickly discriminate between good/useful food and bad and develop a kind of intuition for how to efficiently find the good stuff, whereas in the latter you do not.
then read it again but non-ironically
ughh you are right, missed opportunity
Fair enough, but as I said not all writing has to be aimed at maximum concision and clarity (and insisting that it should be is bad for our collective creativity). One may choose to write in a less direct manner in order to briefly present numerous tangentially related ideas (which readers may follow up on if they choose) or simply to provide a more varied and entertaining reading experience. Believe it or not, there are other goals that one can have in writing and reading besides maximally efficient communication/intake of information.
What about this is hard to read? I'm confused.
So there is really no purpose to ever reading something non-fiction besides efficient intake of information? Is that what we really believe?
I could not care less whether or not anyone reads this.
Not going to be everyone's vibe, and that's fine, but if you've forgotten that there are other reasons to read something besides maximally efficient intake of information then that's a problem...
Efficient communication/intake of information is not the only reason that people write or read...
there is no main idea and I'm not trying to convince anyone of anything. It is nothing.
"probably wrong" thank you so much for this
Thank you for your review!
They do spend considerable time discussing that in the article
Yes
To be clear, I am not the author - this is an article that was submitted to the journal. If you want to read the article, just reach out to the email above (if you want to take a look without registering to be a gardener, that is okay).
I'll just post a twitter thread here that I wrote in response to criticism. Maybe this will clarify my goals/intentions
I want “nerds” to realize that we are not above performative attention-seeking behavior, that we can really easily slip into a failure mode of “write a blog post that embraces some high-minded ideal that no one disagrees with, then propose some law/policy/program that will supposedly increase this thing, then pat yourself on the back and move on”. I wanted to expose my own emotions and insecurities around really caring about science/progress/altruism/etc. while also really wanting people to read and praise my writing, because I think other people struggle with this too. I want people to reflect on the fact that writing =/= thinking, and if you aren’t careful it’s easy to forget that. I wanted people to consider that explicitly trying to be innovative, creative, or smart might not work as well as being as earnest as possible in your pursuit of curiosity, love, beauty, etc., and that the internet/social media can make it really hard to be earnest in your intellectual pursuits. I also just wanted to entertain and make people laugh; of course I was making arguments, but I also view this essay as art (pretentious, I know). I want people to know it’s okay to do both; not everything has to be a Very Serious Essay That Convinces You of Something. In fact, I would argue that there is a dearth of aesthetic sensibility in the science/progress/EA space and that we all might benefit from a little more style, emotion, beauty, and humor.
Also, check out the substack :) - https://rogersbacon.substack.com/
Being a little tongue-in-cheek with this one, but I think recent US history shows racial preferences are more malleable than we might think. Will there be a tipping point, when everyone is either mixed or has a close relative who is mixed, where it will seem a little more silly to argue about race? I don't know about Brazil and, like Ben Pace, would be curious to hear more.
Thanks! Yup, just finished and enjoyed DoE :)
A good reminder, I'll start getting worried when discussion of these heresies moves beyond niche internet message boards.
I don't think anything - this is a heresy, not something I believe in (I would argue your question is evidence that this view is a modern heresy). "politicians were generally older"... the average age of senators, for example, is 57.
I hope it goes without saying that this is a heresy and not something I actually believe. A recent article in the Journal of Controversial Ideas makes the case for animal-rights terrorism.
"There is widespread agreement that coercive force may be used to prevent people from seriously and wrongfully harming others. But what about when those others are non-human animals? Some militant animal rights activists endorse the use of violent coercion against those who would otherwise harm animals. In the philosophical literature on animal ethics, however, theirs is a stance that enjoys little direct support. I contend that such coercion is nevertheless prima facie morally permissible."
Interesting. I guess in some ways yes, because it's giving people access to another form of identity, but it's also kind of orthogonal in that the identity is only used in a virtual environment and is pseudonymous. The argument in this heresy is that being less attached to our names IRL would cause a shift of some kind in cognition/consciousness.
Yea that's fair, I didn't write this with LW in mind, but I should have considered dropping/trimming the introduction as it's not as necessary for this audience.... Interesting, I've heard similar thoughts to yours regarding music from quite a few people. This makes me think that the ubiquity of art in the modern world is affecting us more than we may realize. Curious what research exists on the long-term effects of music/art consumption, although this would be hard to study I guess (which is why I suspect there is something here we haven't yet appreciated).
I was in Denver recently and there is street art (giant colorful murals) everywhere. It seems like this is almost universally regarded as a good thing vs. looking at concrete walls, but now I'm skeptical. There is at least some evidence that students learn less and get more distracted in busy classrooms for what it's worth.
Sure, I don't deny that there are some ideas which should be kept secret for at least some time so that you can better capitalize on them. But I think for most people this category of ideas is much smaller than they think and that it would serve them better in the long run to be less stingy with their ideas. This kind of gets to the crux of my thesis - if you have a scarcity mindset with ideas then they probably will be scarce for you. Maybe you will end up losing out on an opportunity or some concrete short-term benefit, but there are more intangible, long-term benefits to be had by being open with your ideas - the difficulty is that these benefits are inherently more nebulous/illegible and therefore easier to discount.
You are right about the use of impact as a metric - definitely not perfect - and I think both of those sources probably oversell how poor scientific evaluation is in general. Some of the problem is that people are not incentivized to really care that much and they don't specialize in grant/paper evaluation. The idea of having "professional reviewers" is interesting, but I'm not sure how practically achievable it is.
I hadn't heard about the idea of depth-first search, but it is exactly what I am talking about and you explained it very well - thank you for sharing.
"We often observe that the solutions found by genetic algorithms, or NNs, or cats, are strange, perverse, unexpected, and trigger a reaction of 'how did it come up with that?'; one reason is just that they are very thorough about exploring the possibility space"
Do you have any specific examples in mind here that you are willing to share? None are coming to mind off the top of my head and I'd love to have some examples for future reference.
I'm a little confused by what you are referring to here so if you are willing to spell it out I would appreciate it but no worries either way. Many very fascinating ideas in your other comment, I'll try to respond in a day or two.
Not intentional - thanks!
And he disclosed his name because the New York Times published it - https://www.nytimes.com/2021/02/13/technology/slate-star-codex-rationalists.html
I've also discussed the paper with him and he didn't seem to have an issue with it.
Ha I like the Einstein example! I think about the "bold leaps" thing a lot - we may be in kind of "epistemic hell" with respect to certain ideas/theories i.e. all small steps in that direction will seem completely false/irrational (the valley between us and the next peak is deep and wide). Maybe not perfect but I think the problem of inheritance as you describe in the Bakewell article fits as an example here. Heredity was much more complex than we thought and the problem was complicated by the fact that we had lots of wrong but vaguely reasonable ideas that came from essentially mythical figures like Aristotle. The idea that we should study a very simple system and collect huge amounts of data until a pattern emerges and then go from there instead of armchair theorizing was kind of a crazy idea, which is why a monk was the one to do it and no one realized how important it was until 40 years later.
The question is how do we create individuals that are capable of making huge jumps in knowledge space and environments that encourage them to do so. Anything that sounds super reasonable is probably not radical enough (which is why this is so difficult). Like you say, it can't be too crazy, but we need people who will go incredibly far in one direction while starting with a premise that is highly speculative but not outright wrong. One example might be panpsychism - we need an Einstein who takes panpsychism as brute fact and then attempts to reconstruct physics from there. My own wild offering is that ideas are alive, not in the trivial sense of a meme, but as complex spatiotemporal organisms, or maybe they are endosymbionts that are made of consciousness in the same way we are made of matter (see Ideas are Alive and You are Dead). Before the microscope we couldn't really conceive how a life form could be that small, maybe there is something like that going on here as well and new tools/theories will lead to the discovery of an entirely new domain of life. Obviously this is crazy but maybe this is an example of the general flavor of crazy we need to explore.
…one reason is just that they are very thorough about exploring the possibility space, where a human would have long since gotten bored, said "this is stupid", and moved on - it was stupid, but it wasn't stupid enough, and if they had persisted long enough, it would've wrapped around from idiocy to genius. Our satisficing nature undermines our search for truly novel solutions; we aren't inhumanly patient enough to find them.
One reason that people might persist in something way past boredom or reasonable justification is religious faith or some kind of irrational conviction arising from a spiritual experience. From a different angle, Tyler Cowen also offers some thoughts on why the important thinkers of the future will be religious:
Third, religious thinkers arguably have more degrees of freedom. I don’t mean to hurt anybody’s feelings here, but…how shall I put it? The claims of the religions are not so closely tied to the experimental method and the randomized control trial. (Narrator: “Neither are the secular claims!”) It would be too harsh to say “they can just make stuff up,” but…arguably there are fewer constraints. That might lead to more gross errors and fabrications in the distribution as a whole, but also more creativity in the positive direction. And right now we seem pretty hungry for some breaks in the previous debates, even if not all of those breaks will be for the better.
I don’t think Mendel was particularly inspired by his religious faith to study heredity (I might be wrong) but it certainly didn’t stop him and in the broad sense it enabled him to be an outsider who could dedicate extended study to something seemingly trivial. As you pointed towards, being an outsider is crucial if someone is to take these kinds of bold leaps. Among other things, being an insider makes it harder to get past what you described at the end of the Origins of Innovation article:
Perhaps there is some sort of psychological barrier, where the mind flinches at any suggestion bubbling up from the subconscious that conflicts with age-old tradition or with higher-status figures. Should any new ideas still manage to come up, they are suppressed; “don’t rock the boat”, don’t stand out (“the innovator has for enemies all those who have done well under the old conditions”)
This is the fundamental reasoning behind an article I wrote that was recently published in New Ideas in Psychology – "Amateur hour: Improving knowledge diversity in psychological and behavioral science by harnessing contributions from amateurs" (author access link). Amateurs can think and do research in ways that professionals can't by virtue of not facing the incentives and constraints that come with having a career in academia. We identify six "blind spots" in academia that amateurs might focus on: long-term research, interdisciplinary research, speculative research, uncommon or taboo topics, basic observational research, and aimless projects. This led us to write:
Taken together, our discussion of blind spots highlights one overarching direction in “research-space” that may be especially promising: long, aimless, speculative, and interdisciplinary research on uncommon or taboo subjects. Out of all amateur contributions to sciences so far, Darwin's achievements may be the primary exemplar of this type of endeavor. As aforementioned, at the time of his departure on the HMS Beagle in 1831 he was an independent scientist—a 22-year-old Cambridge graduate with no advanced publications who had to pay his own way on the voyage (Bowlby, 1990; Keynes & Darwin, 2001). Darwin's work on evolution certainly took a long time to develop (the Beagle's voyage took 5 years and he did not publish On the Origin of Species until 23 years after he returned). It was aimless in the sense that he did not set out from the beginning to develop a theory of evolution. His work was highly interdisciplinary (Darwin drew on numerous fields within the biological sciences in addition to geology and economics), was the culmination of a huge amount of basic observational work, and was not necessarily an experimental contribution (though he did make those as well), but primarily theoretical (and sometimes more speculative) in nature. Darwin's theories were taboo in the sense that they went against the prevailing theological ideas of the time and caused significant controversy (and still do). We speculate that there may one day be a “Charles Darwin of the Mind” who follows a similar path. Indeed, it seems that the state of theorizing in psychology today is at an early stage comparable to evolutionary theorizing at the time of Darwin (Muthukrishna & Henrich, 2019), and the time may be ripe for an equally transformative amateur contribution in psychology. We hope that this paper provides the smallest nudge in this direction.
I actually just posted about the article here because we mention LessWrong as an example of a community where amateurs make novel research contributions in psychology – “LessWrong discussed in New Ideas in Psychology article”.
So if I had to guess – the next Darwin/Einstein/Newton will be an amateur/outsider; will, whether from religion or some other source, have some weird idea that they pursue to the extreme; and will have some kind of life circumstance that allows them to do this (maybe, like Darwin, they come from money).
I also touch on this theme in my article “The Myth of Myth of the Lone Genius”. Briefly, we have put too much cultural emphasis in science on incrementalism, on standing on the shoulders of giants. Sure, most discoveries come from armies of scientists making small contributions, but we need to also cultivate the belief that you can make a radical discovery by yourself if you try really really hard. I also quote you at the beginning of the article.
“The Great Man theory of history may not be truly believable and Great Men not real but invented, but it may be true we need to believe the Great Man theory of history and would have to invent them if they were not real.”
Thanks for catching the grammar mistake - fixed! These are interesting extensions of the basic idea of using more randomness in science, thanks for sharing. Your last point makes me think about the use of prediction markets to guess which studies will replicate, something that people have successfully done.
Your point is well taken, and we should definitely keep in mind that randomness can also create perverse incentives and can easily be overdone. However, I would argue that there is virtually no randomness in science now and ample evidence that we are bad at evaluating grants, papers, and applicants, and are generally overly conservative when we do evaluate (see Conservatism in Science for a review). In rare cases I might advocate for pure randomness, but, like you suggest, I think some kind of mixed strategy is probably the way to go in most cases. For example, with grants we can imagine a strategy where a quick review rules out obvious nonsense, proposals are then sorted into high-quality and low-quality tiers, and lottery slots are allocated to each tier accordingly (you could also just limit people to one submission to get rid of the spamming problem).
A few examples of us being bad at evaluating things:
"I just did a retrospective analysis of 2014 NeurIPS ... There was no correlation between reviewer quality scores and paper's eventual impact."
“Analysing data from 4,000 social science grant proposals and 15,000 reviews, this paper illustrates how the peer-review scores assigned by different reviewers have only low levels of consistency (a correlation between reviewer scores of only 0.2).” From: Are peer-reviews of grant proposals reliable? An analysis of Economic and Social Research Council (ESRC) funding applications
For hiring decisions, it might be even worse - is this person truly a better scientist or did they just happen to land in a more productive research lab for their PhD? Will this person make a better graduate student or did they just go to a better undergraduate college? I would advocate for a threshold (we are fine with hiring any of these people) and then randomness in some hiring situations.
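To make the mixed strategy above concrete, here is a minimal sketch of threshold-then-lottery selection. Everything in it is an illustrative assumption on my part - the function names, the "high"/"low" tier labels, the slot fractions, and the toy proposal data are all made up, not taken from any real funding process.

```python
import random

def lottery_select(proposals, n_awards, screen, tier, tier_slots):
    """Threshold-then-lottery selection (illustrative sketch only).

    screen(p)  -> bool   quick review: rule out obvious nonsense
    tier(p)    -> str    coarse quality label, e.g. "high" or "low"
    tier_slots -> dict   fraction of awards allocated to each tier
    """
    eligible = [p for p in proposals if screen(p)]
    awarded = []
    for t, frac in tier_slots.items():
        pool = [p for p in eligible if tier(p) == t]
        k = min(len(pool), round(n_awards * frac))
        awarded.extend(random.sample(pool, k))  # pure lottery within a tier
    return awarded

# Toy usage: 100 hypothetical proposals with random quality scores.
random.seed(0)
proposals = [{"id": i, "score": random.random()} for i in range(100)]
winners = lottery_select(
    proposals, n_awards=10,
    screen=lambda p: p["score"] > 0.1,  # quick review: drop obvious nonsense
    tier=lambda p: "high" if p["score"] > 0.6 else "low",
    tier_slots={"high": 0.7, "low": 0.3},  # more slots for the stronger tier
)
```

The same skeleton covers the hiring case: `screen` is the "we are fine with hiring any of these people" threshold, and the lottery replaces fine-grained ranking below that threshold.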
Great post - similar to Adam Shai's comment, this reminds me of a discussion in psychology about the over-application of the scientific paradigm. In a push to seem more legitimate/prestigious (physics envy), psychology has pushed 3rd-person experimental science at the expense of more observational or philosophical approaches.
(When) should psychology be a science? - https://onlinelibrary.wiley.com/doi/abs/10.1111/jtsb.12316
You wouldn't have gotten this at all from what I wrote, but we are definitely not saying that it will be easy to integrate "blind spot" research into academia or that it will happen overnight. A significant portion of the paper is spent providing examples of amateur psychology work (from the past and the present - we reference some of the work on LW), discussing why it is difficult to integrate this knowledge into modern academia, how academia might benefit from doing so, and how we might actually accomplish this over the long run. Certainly we are under no illusions that academics will suddenly wake up to all of the valuable intellectual work that happens outside of the confines of academia, but maybe at the very least they will become a little more aware of the limitations of their own work and the value that can be added by engaging with these outsiders.
I just don't really see it as that problematic if a small percentage of scientists spend their time thinking about and working on the paranormal/supernatural because (1) scientists throughout history did this and we still made progress. Maybe it wasn't necessary that Newton believed in alchemy/theology, but he did, and belief in these things is certainly compatible with making huge leaps in knowledge like he did. (2) I'm not sure if believing in the possibility of ghosts is more ridiculous than the idea that space and time are the same thing and can be warped (I'm not a physicist :). UFOs would probably have been lumped into these categories as well, and now we know that there are credible reports of anomalous phenomena. Whether they are aliens or not, who knows, but it is possible that studying them could lead to an understanding of new phenomena (I think it has already led us to understand new rare forms of lightning, but I'm forgetting the specifics).
Look, I don't really believe in these things and I don't behave as if I did, but I am open to the possibility. The main argument here is that being open to the possibility, having a sense of mystery and epistemic humility, does make a difference in how we think and do science. This kind of goes back to the discussion of paradigm-shifting science/normal science. If absolutely no one believes that a paradigm shift is possible then it will never happen. I'm of the opinion that it's important for us to maintain a kernel of doubt in the hard-headed materialist atheist perspective. In truth, I think we are pretty closely aligned and I am just playing devil's advocate :)
certainly the authoritarian link is highly speculative, but I think in general we underestimate how politics/culture/psychology influence what we care about and how we think in science. A more extreme version of the question is: how similar would we expect alien science to be to ours? Obviously it would be different if they were much more advanced, but assuming equal levels of progress, how would their very different minds (who knows how different) and culture lead them to think about science differently? In an extreme version, maybe they don't even see and use something like echolocation - how would this influence their scientific investigation?
"Certainly, we see many examples of both theoretical and applied work in many sciences, showing that in this regard the diversity is enough.
About the unifying theory of physics, I'm not that sure about the link with authoritarian culture. But once again, in actual science, there are so many viewpoints and theories and approaches that it would take days to list them for only the smallest subfield of physics. So I'm not convinced that we are lacking diversity in this regard."
I don't see how you can make this conclusion, we don't know what the counterfactual is. Obviously there is a lot of diversity of theories/approaches but that doesn't mean that we wouldn't have different theories/approaches if science was born in a different cultural background.
Again, I think these are all open questions, but I think it is reasonable to conclude that it might make a difference on the margins. Really we are asking - how contingent is scientific progress? The answer might be "not very much" but over the long-run of history it may add up.
“So little actual knowledge that almost everyone was a "Renaissance man" (and so they literally all shared the same sources)”
Interesting thought - now that everyone has to specialize, there are fewer people who have different combinations of knowledge in a given discipline. Like I talked about with education, I think it's worth thinking more about how our education systems homogenize our portfolio of minds.
Re: tenure - it's a good point, and certainly we do have some diversity of scientific niches. It's an open question whether we have enough or not; my point, more than anything, is just that this form of diversity also matters.
Radical proposal: we need scientific monasteries, isolated from the world, with celibate science monks dedicated to growing knowledge above all else :)
One point of confusion that I think is running through your comments (and this is my fault for not being clear enough) is how I am conceiving of "mind". In my conception, a mind is the product of genetics and all of the environment/past experiences, but also the current context the mind finds itself in. So, for example, yes, you would still have the same mind in one sense whether you were doing science in a university or as an independent scientist, but in another sense no, because the thoughts you are willing and able to think would be different given the very different constraints/incentives you face. Hope this helps.
I actually would disagree with your last point. Certainly cultural/political diversity will matter more for psych/social sciences but I think it will have an effect on what kinds of topics people care about in the first place when it comes to harder sciences and math. I can imagine a culture which has a more philosophical bent to it leading to more people doing theoretical work and a culture which has a greater emphasis on engineering and practicality doing more applied work. I could also imagine a more authoritarian culture leading to people doing physics in a certain style - perhaps more of a search for unifying "theory of everything" type ideas vs. a more democratic and diverse culture leading to a more pluralistic view of the universe. Not saying these would be huge effects necessarily but on the margins it could make a difference.
Hmm yea I see your point. I guess what I was saying is that there are certain thought patterns and styles of cognition which may be more likely to stumble on the kind of ideas or do the kind of work that can potentially lead to paradigm shifts. Whether or not we are less able to think in this way now is definitely an open question but I think one we should worry about.
Glad you liked it! I certainly think there is a lot of room for disagreement, I'll respond to a few of your comments