Posts

Comments

Comment by alkexr on Why do you (not) use a pseudonym on LessWrong? · 2020-05-07T22:03:29.018Z · score: 6 (4 votes) · LW · GW

I live in a social environment where expressing opinions or otherwise giving information about myself could have negative consequences, ranging from mild inconvenience to serious discrimination. I have no intention of hiding my real identity from those who know the account, but I do want to hide the account from those who know my real identity (and aren't close friends). I use this name for most of my online activity.

Comment by alkexr on My experience with the "rationalist uncanny valley" · 2020-04-23T22:00:52.635Z · score: 3 (2 votes) · LW · GW

I've been aware for a while now that having enough awareness to notice being trapped is not enough to step outside the pattern, and yet I can't step outside this pattern. I also believe that admitting there is no substitute for practice isn't going to be causally linked to me actually practicing (due to a special case of the same trap), so I suppose I'll just go on staying trapped for now.

Comment by alkexr on What will happen to supply chains in the era of COVID-19? · 2020-04-08T01:29:40.471Z · score: 2 (2 votes) · LW · GW

Being self-sufficient and robust as a national economy means accepting a competitive disadvantage relative to a global just-in-time supply chain in times of prosperity, in exchange for a competitive advantage during a crisis. Selection pressures will therefore push economies that accept this tradeoff toward being actively interested in a world with more crises.

Comment by alkexr on Outline of Metarationality, or much less than you wanted to know about postrationality · 2018-10-15T10:02:33.784Z · score: 3 (2 votes) · LW · GW

Question: how do postrationality and instrumental rationality relate to each other? To me it appears that you are simply arguing for instrumental rationality over epistemic rationality; am I missing something?

Comment by alkexr on Outline of Metarationality, or much less than you wanted to know about postrationality · 2018-10-15T10:00:31.797Z · score: 3 (2 votes) · LW · GW

> However, if this is really what 'postrationality' is about, then I think it remains safe to say that it is a poisonous and harmful philosophy that has no place on LW or in the rationality project.

It feels like calling someone's philosophy poisonous and harmful doesn't advance the conversation, regardless of whether the claim is true, and this rather proves the point of the main post.

Comment by alkexr on Tradition is Smarter Than You Are · 2018-09-19T23:07:07.413Z · score: 3 (5 votes) · LW · GW

Being able to speak is probably more important than being as smart as a human. Cultural / memetic evolution is orders of magnitude faster than biological evolution, but its ability to function depends on having a memory better than mortal minds. Speech provides some limited non-mortal memory, as do writing, the printing press, and the internet. These inventions enable more efficient evolution. AI will ramp up evolution to even higher speeds, since external memory will be replaced with internal memory that is 1) lossless and 2) intelligent. As such, I am unconvinced that this would mean slower takeoff speeds. (You just explained that the most important factor in humans doing well is something humans are not especially good at, rather than some special magic that only humans possess.)

Comment by alkexr on Physics has laws, the Universe might not · 2018-06-10T17:02:27.792Z · score: 2 (1 votes) · LW · GW

People not being able to come up with any idea other than that diseases are a curse of the gods is strong evidence not for diseases being a curse of the gods, but for the ignorance of those people. The most likely answer to that question is either something no one will think of for centuries to come, or simply that the model of separating objects into "sorts of things" is not useful for deciphering the mysteries of the universe, despite being an evolutionary advantage on the ancestral savanna.

Comment by alkexr on Of Two Minds · 2018-05-26T00:45:46.857Z · score: 13 (3 votes) · LW · GW

You might have gone too far with speculation: your theory can be tested. If your model were true, I would expect a correlation between, say, the ability to learn ball sports and the ability to solve mathematical problems. It is not immediately obvious how to run such an experiment, though.
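As a rough sketch of what the statistical side of such a test might look like (not a claim about how the experiment would actually be designed), here is a minimal Python example assuming paired per-participant scores were somehow collected; all variable names and numbers below are hypothetical placeholders:

```python
# Minimal sketch: test for a correlation between motor-skill learning and
# mathematical problem solving, given paired scores for each participant.
# The data and variable names here are hypothetical illustrations only.
from scipy.stats import pearsonr

# One pair of measurements per participant:
# ball_sport_score - e.g. rate of improvement on a throwing/catching task
# math_score       - e.g. number of problems solved on a timed test
ball_sport_score = [0.42, 0.55, 0.31, 0.68, 0.50, 0.47, 0.60, 0.38]
math_score       = [12,   15,   9,    18,   14,   13,   16,   10]

# Pearson correlation coefficient and two-sided p-value
r, p_value = pearsonr(ball_sport_score, math_score)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
```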