Creating 'Making God': a Feature Documentary on risks from AGI

post by Connor Axiotes (connor-axiotes-1) · 2025-04-15


Please donate to our Manifund (as of 14.04.25 we have two more days of donation matching up to $10,000). Email me at connor.axiotes or DM me on Twitter for feedback and questions.

Project summary:

  1. To create a cinematic, accessible, feature-length documentary. 'Making God' is an investigation into the controversial race toward artificial general intelligence (AGI).
  2. Our audience is largely non-technical, so we will give them a thorough grounding in recent advances in AI before exploring the race toward the most consequential piece of technology ever created.
  3. Following in the footsteps of influential social documentaries like Blackfish, Seaspiracy, The Social Dilemma, and An Inconvenient Truth, our film will shine a light on the risks associated with the development of AGI.
  4. We are aiming for film festival acceptances, nominations, and wins, and to be streamed on the world's biggest streaming platforms.
  5. This will give the non-technical public a strong grounding in the risks of a race to AGI. If successful, hundreds of millions of streaming subscribers will be better informed about the risks and more likely to take action when a moment presents itself.

Rough narrative outline:

Our basic model for why this is needed:

***

Update [14.04.25]

1) Prof. Rose Chan Loui is the Founding Executive Director of the Lowell Milken Center on Philanthropy and Nonprofits at UCLA.

2) Prof. Ellen Aprill is Senior Scholar in Residence at UCLA and taught Political Activities of Nonprofit Organizations there in 2024.

3) Holly Elmore is the Executive Director of Pause AI US.

4) Eli Lifland is the Founding Researcher at the AI Futures Project and a top forecaster.

5) Heather-Rose is Government Affairs Lead in LA for the labor union SAG-AFTRA.

Civil Society

Upcoming Interviews

  1. Cristina Criddle, Financial Times tech correspondent covering AI, who recently broke the Financial Times story that OpenAI gave new models days of safety testing rather than months.
  2. David Duvenaud, former Anthropic team lead.
  3. John Sherman, of Dads Against AI, and podcaster.

Potential Interviews

  1. Jack Clark (we are in touch with Anthropic Press Team).
  2. Gary Marcus (said to get back to him in a couple of weeks).

Interviews We’d Love

  1. Kelsey Piper, Vox.
  2. Daniel Kokotajlo, formerly OpenAI.
  3. AI Lab employees.
  4. Lab whistleblowers.
  5. Civil society leaders.

Points to Note:

Project Goals:

  1. We are aiming for film festival acceptances, nominations, and wins, and to be streamed on the world's biggest streaming platforms, like Netflix, Amazon Prime, and Apple TV+.
  2. To give the non-technical public a strong grounding in the risks of a race to AGI.
  3. If successful, hundreds of millions of streaming subscribers will be better informed about the risks and more likely to take action when a moment presents itself.
  4. Timelines are shortening, technical alignment bets look less likely to pay off in time, and international governance mechanisms seem to be breaking down. Our goal is therefore to influence public opinion on the risks, so that people might take political or social action before the arrival of AGI. If we do this right, we could have a high chance of moving the needle.

Some rough numbers:

How will this funding be used?

To have a serious chance of being picked up by streaming services, the production quality and entertainment value have to be high. We would need the following funding over the next three months to create a product of that standard.

Accommodation [Total: £30,000]

Travel [Total: £13,500]

Equipment [Total: £41,000]

Production Crew (30 Days of Day Rate) [Total: £87,000]

Director (3 Months): [Total: £15,000]

Executive Producer (3 months): [Total: £15,000]

Misc [Total: £25,000] (to cover unforeseen costs, legal advice, insurance, and other practical necessities).

TOTAL: £226,500 ($293,046)

Who is on your team? What's your track record on similar projects?

Mike Narouei [Director]:

Watch 'Your Identity Isn't Yours', which Mike filmed, produced, and edited while he was at Control AI.

Connor Axiotes [Executive Producer]:

Donate to our Manifund (as of 14.04.25 we have two more days of donation matching up to $10,000). Email me at connor.axiotes or DM me on Twitter for feedback and questions.
