The Power of Intelligence - The Animation

post by Writer · 2023-03-11T16:15:16.705Z · LW · GW · 3 comments

This is a link post for https://youtu.be/q9Figerh89g


This video is an animation of The Power of Intelligence [LW · GW], by @Eliezer Yudkowsky [LW · GW].

The Sorting Pebbles Into Correct Heaps [LW · GW] video, coupled with this video, makes a very basic case for the importance of AGI alignment.

Here's most of the pinned comment under the video, also in the description:

The script used for this video is an essay published by Eliezer Yudkowsky in 2007.

Now, a few points:

Sorting Pebbles Into Correct Heaps was about the orthogonality thesis. A consequence of the orthogonality thesis is that powerful artificial intelligence will not necessarily share human values.

This new video is about just how powerful and dangerous intelligence is. These two insights put together are cause for concern.

If humanity doesn't solve the problem of aligning AIs to human values, there's a high chance we won't survive the creation of artificial general intelligence. This issue is known as "The Alignment Problem". Some of you may be familiar with the paperclips scenario: an AGI created to maximize the number of paperclips uses up all the resources on Earth, and eventually outer space, to produce paperclips. Humanity dies early in this process. But, given the current state of research, even a simple goal such as "maximize paperclips" is already too difficult for us to program reliably into an AI. We simply don't know how to aim AIs reliably at goals. If tomorrow a paperclip company managed to program a superintelligence, that superintelligence likely wouldn't maximize paperclips. We have no idea what it would do. It would be an alien mind pursuing alien goals. Knowing this, solving the alignment problem for human values in general, with all their complexity, appears to be a truly daunting task. But we must rise to the challenge, or things could go very wrong for us.

You can read The Power of Intelligence and many other essays by Eliezer Yudkowsky on this website: https://www.readthesequences.com/

3 comments

Comments sorted by top scores.

comment by Writer · 2023-03-11T17:18:33.858Z · LW(p) · GW(p)

Of all the videos we've done, this one elicits, by far, the strongest emotional reaction in me. Part of it is due to the essay. I found it invigorating when I first read it, but then the thought of an alien intelligence taking apart our planet, almost as inevitable as a law of physics, haunted me for a good while. Part of it is also the animation and visuals. The colors are intense, and the scenes are sometimes unsettling and violent, but still beautiful. Nature, but softened.

Replies from: niplav
comment by niplav · 2023-03-11T18:44:03.133Z · LW(p) · GW(p)

I found it good to be reminded of this essay, and will probably link to the video a bunch of times in discussions.

comment by Kevin Imes (kevin-imes) · 2023-03-13T15:40:08.767Z · LW(p) · GW(p)

Glad to see this channel here. I highly recommend their videos on grabby aliens for those unfamiliar with the concept.