You can now listen to the “AI Safety Fundamentals” courses

post by PeterH · 2023-06-09T16:45:35.666Z · LW · GW · 0 comments

This is a link post for https://forum.effectivealtruism.org/posts/vxpqFFtrRsG9RLkqa/announcement-you-can-now-listen-to-the-ai-safety


The AI Safety Fundamentals courses are one of the best ways to learn about AI safety and prepare to work in the field.

BlueDot Impact facilitates the courses several times per year, and the curricula are available online for anyone to read. 

The “Alignment” curriculum was created and is maintained by Richard Ngo (OpenAI), and the “Governance” curriculum was developed in collaboration with a wide range of stakeholders.

You can now listen to most of the core readings from both courses:

AI Safety Fundamentals: Alignment
Gain a high-level understanding of the AI alignment problem and some of the key research directions which aim to solve it.
Listen online or subscribe:
Apple Podcasts | Google Podcasts | Spotify | RSS

AI Safety Fundamentals: Governance
Gain foundational knowledge for doing research or policy work on the governance of transformative AI.

Listen online or subscribe:
Apple Podcasts | Google Podcasts | Spotify | RSS

We've also recorded narrations for some readings from the advanced “Alignment 201” course, and we may record more later this year:

AI Safety Fundamentals: Alignment 201
Gain enough knowledge about alignment to understand the frontier of current research discussions. 

Listen online or subscribe:
Apple Podcasts | Google Podcasts | Spotify | RSS

Apply to join the “AI Safety Fundamentals Governance Course” July cohort!

Gain foundational knowledge for doing research or policy work on the governance of transformative AI.

Successful applicants will participate in the AI Governance course with weekly virtual classes, and join the AI Safety Fundamentals community.

Apply before 26th June 2023!

https://apply.aisafetyfundamentals.com/governance
Thoughts, feedback, suggestions?

These narrations were created by Perrin Walker (TYPE III AUDIO) on behalf of BlueDot Impact.

We would love to hear your feedback. Do you find the narrations helpful? How could they be improved? What other AI safety material would you like to listen to? Please comment below, complete our feedback form, or write to team@type3.audio.
