John von Neumann on how to safely progress with technology

post by Dalton Mabery (dalton-mabery) · 2022-07-13T11:07:00.904Z


From a 1955 article by von Neumann titled "Can We Survive Technology?":

What safeguard remains? Apparently only day-to-day — or perhaps year-to-year — opportunistic measures, a long sequence of small, correct decisions. And this is not surprising. After all, the crisis is due to the rapidity of progress, to the probable further acceleration thereof, and to the reaching of certain critical relationships. Specifically, the effects that we are now beginning to produce are of the same order of magnitude as that of "the great globe itself." Indeed, they affect the earth as an entity. Hence further acceleration can no longer be absorbed as in the past by an extension of the area of operations. Under present conditions it is unreasonable to expect a novel cure-all. For progress there is no cure. Any attempt to find automatically safe channels for the present explosive variety of progress must lead to frustration. The only safety possible is relative, and it lies in an intelligent exercise of day-to-day judgment […]

The one solid fact is that the difficulties are due to an evolution that, while useful and constructive, is also dangerous. Can we produce the required adjustments with the necessary speed? The most hopeful answer is that the human species has been subjected to similar tests before and seems to have a congenital ability to come through, after varying amounts of trouble. To ask in advance for a complete recipe would be unreasonable. We can specify only the human qualities required: patience, flexibility, intelligence.

This sounds an awful lot like what AI alignment has been trying to do. Granted, as Scott Alexander notes in his review of a biography of von Neumann:

This sounds suspiciously like the smartest man in the world admitting he’s not sure what to do.
