Here is my much shorter guide, which I wrote a year or so ago. I guess I'd call it the shortest incomplete guide: it's geared towards an audience that wants to do much less thinking about these things.
Newbie here.
In the AI Timeline post, one commenter says it's likely that in 8 years we will consume 1000x as much energy as we do today. (And another commenter says it's plausible.)
How would that happen? I guess the idea is that over the next 3-5 years we discover that plowing compute into AI is hugely beneficial, and we then race to build hundreds or thousands of nuclear reactors?
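As a sanity check on what that claim implies, here's a quick back-of-envelope in Python. The 1000x and 8-year figures are from the post; everything else is just arithmetic:

```python
import math

# Back-of-envelope: what growth rate does "1000x energy in 8 years" imply?
# The 1000x / 8-year figures are the claim from the post; the rest is arithmetic.
factor = 1000
years = 8

annual_growth = factor ** (1 / years)
print(f"Implied growth: {annual_growth:.2f}x per year")  # ~2.37x per year

doubling_months = 12 * math.log(2) / math.log(annual_growth)
print(f"Implied doubling time: ~{doubling_months:.1f} months")  # ~9.6 months
```

That's world energy consumption more than doubling every year for eight straight years, which is part of why I'm asking how it could possibly happen.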
Intuitively, I assume that LLMs trained on human data are unlikely to become much smarter than humans, right? At least not without some additional huge breakthrough beyond just being a language model?