Posts

Comments

Comment by FR_Max on 7 traps that (we think) new alignment researchers often fall into · 2022-09-29T17:11:06.817Z · LW · GW

Thank you for your post! One trap new alignment researchers often fall into is assuming that AI systems can be aligned with human values solely by optimizing for a single metric. Thanks again for the deep insight into the topic and for the recommendations.

Comment by FR_Max on [deleted post] 2022-09-26T19:00:24.618Z

It's true that change is the only constant in the universe, and yet we often act as if things will always stay the same. We get comfortable with the status quo and resist change, even when it's clearly for the better. The ability to see beyond our current moment is what allows us to progress as a species. So even though futurists may not always be correct about the details, we should all aspire to be wrong about the future in just the same way. Similarly, postmodernism in art embraces exploring various styles, genres, forms, and designs without limits.

Comment by FR_Max on [deleted post] 2022-09-26T18:58:43.453Z