Comments

Comment by Mervinkel (mikhail-larionov) on Thou Art Godshatter · 2024-07-10T17:56:34.479Z · LW · GW

I was heavily thinking about this topic in the past few weeks before stumbling across this post and your comment, and I appreciate both.

Ultimately, I agree with your conclusion. What’s more, I think this (becoming a pure reproductive consequentialist) is also inevitable from the evolutionary standpoint.

It’s already clear that purely hedonistic societies (“shards of desire” and the like) are in steep decline. The collective West, with a US fertility rate of roughly 1.6 births per woman, is on track to die off quickly.

But the gap will be filled, and it will be filled with the programming that re-enables higher reproductive fitness.

My take, though, is that you don’t have to be radical about either of these strategies. You don’t have to maximize your fertility to the absolute limit by sacrificing all joy; you just have to maximize it to some reasonable subjective degree. Arguably, having fun should have a positive impact on your gene propagation — as long as you propagate efficiently!

So my personal choice is to follow all the strategies from your comment, and some more — except the ones that are not fun — and to treat the remaining activities (fun but pointless) as an inevitable cost of slow evolution, without blaming myself, since this is not really my fault.

This excludes sperm banks but includes maximizing offspring through various other, more joyous means.

This poses some interesting challenges, though. Even if you brute-force the problem of limited resources to pass on to your offspring, you still face the challenge of limited bonding opportunities with the mothers, which may be detrimental to the children and hurt their own reproduction (which is critical, as also mentioned in the comments).

I wonder what the optimal number of human offspring for one male is, given that beyond some point, further increases seem detrimental to the group’s total fitness.

Comment by Mervinkel (mikhail-larionov) on LLM Generality is a Timeline Crux · 2024-06-25T03:34:58.301Z · LW · GW

Thanks for the thoughtful article. Intuitively, LLMs resemble our own internal verbalization. We often turn to verbalizing to handle problems we can't keep hold of by other means. However, verbalization clearly covers only a subset of problems; many others can't be tackled this way. Instead, we lean on intuition, a much more abstract and less understood process that generates outcomes from even more compressed knowledge. It feels like the same is true for LLMs: without fully understanding intuition, and the kinds of data transformations and compressions it involves, reaching true AGI could be impossible.