Comments

Comment by marek_rosa on HLAI 2018 Field Report · 2018-09-03T19:13:05.500Z

A great way to visualize the risks of unaligned AGI is the alien life form in the film Life: https://www.youtube.com/watch?v=cuA-xqBw4jE

It starts as a seed entity, quickly adapts, learns new tricks, and grows bigger and stronger, ruthlessly oblivious to the human value system. Watch it, and instead of the alien, imagine a child AGI.

Comment by marek_rosa on HLAI 2018 Field Report · 2018-09-03T18:52:44.748Z

Hi Gordon!

Thanks for writing this. I am glad you enjoyed HLAI 2018.

I agree: many AI/AGI researchers partially or completely ignore AI/AGI safety. But I have noticed a trend in recent years: it is possible to "turn" these people and get them to take safety more seriously.

Usually the reason for their "safety ignorance" is simply insufficient insight; they have not spent enough time on the topic. Once they learn more, they quickly see how things can go wrong. Of course, this does not work for everyone.

Hope this helps.

Best,

Marek