Use computers as powerful as in 1985, or AI controls humans, or ?
post by jrincayc (nerd_gatherer) · 2025-02-03T00:51:05.706Z
A way to prevent AGI from taking over or destroying humanity is to strictly limit the computing power used on unknown AI algorithms. My back-of-the-envelope calculations[1] show that restricting hardware to 64 KiB of total storage is definitely sufficient to prevent an independence-gaining AGI, and that restricting it to 2 MiB of storage is very likely sufficient. State-of-the-art AI, on the other hand, tends to use at least 1 GiB of RAM (and often much more) and processing power in the teraflops range or beyond. As for an upper limit before we get AGI, whole brain emulation provides one, but that is on the order of 1 exaflops and 1 petabyte, so we do not have a precise idea of where the limits for AGI are. We also do not have a way to make sure that AI software is aligned with human goals and ethics.[2]
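To make the gaps between these figures concrete, here is a rough Python sketch (using only the numbers quoted above, which are themselves rough estimates, not measurements) of the ratios between the proposed safe limits, current AI memory use, and the whole brain emulation estimate:

```python
# Orders-of-magnitude comparison of the storage figures discussed above.
# All values come from the rough estimates in this post, not measurements.

KIB = 1024
MIB = 1024 ** 2
GIB = 1024 ** 3
PETABYTE = 10 ** 15

definitely_safe = 64 * KIB      # definitely too little storage for an independence-gaining AGI
likely_safe = 2 * MIB           # very likely still too little
modern_ai_floor = 1 * GIB       # low end of state-of-the-art AI memory use
brain_emulation = 1 * PETABYTE  # rough whole brain emulation upper bound

print(f"likely-safe limit / definitely-safe limit: {likely_safe / definitely_safe:,.0f}x")
print(f"modern AI floor / likely-safe limit:       {modern_ai_floor / likely_safe:,.0f}x")
print(f"brain emulation / modern AI floor:         {brain_emulation / modern_ai_floor:,.0f}x")
```

The point is just that the band between "very likely safe" and the whole brain emulation bound spans many orders of magnitude, which is why we do not know where in that band the real threshold sits.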
So here are some options:
- Only use really weak computers (midrange 1985 computers like a Mac 512K or an Atari 520ST would almost certainly be safe)
- Just let AGI take control.
- Hope that AGI really requires very powerful computers: ban those, but allow less powerful computers that are still well above the level we are sure cannot support an AGI.
- Hope there is outside intervention that prevents dangerous AGI (space aliens, divine intervention, dark lords of the matrix, etc.)
- ???
So what should humanity do? I talked to a non-computer scientist about this, and his answer was that restricting us to circa-1985 computing power was the best choice, which actually surprised me a little. Letting AGI take control can result in extinction, or in the AGI imposing rules that we don't like.[3]
The problem is that, because we have such a weak understanding of AI safety, we are metaphorically experimenting with 15 kilovolt AC when we really should be experimenting with 5 volt DC.
I don't know what humanity should do. As for me personally, if I had the choice between the 1.6 GHz, 4-core, 24 GB RAM computer I am typing this on, versus living in a world where we had eliminated existential risk from things like uncontrolled AGI and nuclear bombs, I would gladly trade my computer for a 512 KB, 8 MHz computer with a floppy drive, a CD-R, and a modem-level network connection, if that is what we all need to do. I am curious what others think.
*These are my own opinions and not those of my employer. This document may be distributed verbatim in any media. The preview image is licensed under the Creative Commons Attribution-Share Alike 4.0 International license and is originally from https://commons.wikimedia.org/wiki/File:Atari_520_ST_%2B_with_monochrome_monitor_SM_124.jpg *
1. https://www.researchgate.net/publication/388398902_Memory_and_FLOPS_Hardware_Limits_to_Prevent_AGI and, for an earlier draft, https://www.lesswrong.com/posts/9kvpdK9BLSMxGnxjk/thoughts-on-hardware-limits-to-prevent-agi ; if you see any mistakes I made or have questions, please tell me. ↩︎
2. https://intelligence.org/2023/04/21/the-basic-reasons-i-expect-agi-ruin/ and https://www.lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities ↩︎
3. For example, two rules that I can imagine humans disliking would be 1. no eating vertebrates or cephalopods and 2. no going farther than 1 million km from Earth. I am not even sure that trying to change the AGI's mind on this would be a good idea (since we are vertebrates that do not like to be eaten, and we might want the AGI to impose a restriction like rule 2 on dangerous aliens). ↩︎