
Funding requirements

  • Sign grant agreement
  • Reach min funding
  • Get Manifund approval

Development of a Cautionary Tale Feature Film about Gradual Disempowerment

AI governance · Global catastrophic risks
Petr Salaba

Proposal · Grant
Closes August 25th, 2025
$87,500 raised
$100,000 minimum funding
$200,000 funding goal


Working title: Seductive Machines and Human Agency

Funding for film producer Petr Salaba (fiscally sponsored by Epistea, a registered charity) for Development of a Cautionary Tale Feature Film about Gradual Disempowerment.

Why

Beyond the risk of a dramatic robot takeover, humanity faces the systemic risk of extinction through gradual disempowerment, as we offload more and more agency to AIs across culture, the economy, and politics.

This warrants a well-researched cautionary tale that also offers actionable pathways of hope.

What
We want to make a feature-length film on this topic for global theatrical distribution. The desired outcome is a public with a clearer understanding of these systemic risks, as well as of the positive roadmaps and promising pathways humanity could take to keep its meaningful agency.

The final film may be a hybrid of AI-generated dramatization and real-life documentary.

Phase 1 - Film Development (USD 100,000-200,000)

To produce a film development package: script prototype, production plan, test video, public communication strategy, articulated theory of change, and a pitch deck for funding Phase 2.

Intended completion: end of Q4 2025


Phase 2 - Film Production and Distribution (estimated budget: USD 2M for production + USD 3M to 5M for campaign and distribution)

To produce and distribute the final feature film on mainstream platforms (theatres, then online).

Intended for Q3 2026

Who and how
Producer and film director Petr Salaba has a track record of producing high-quality dramatic educational video content. By leveraging generative AI, costs can be roughly 20x lower than with traditional production methods, so with the right ideas, USD 2M could produce a spectacle comparable to a traditional USD 40M production.


In close collaboration with the team of researchers and writers behind the Gradual Disempowerment paper (already on board), we plan to write a script that clearly and plausibly explains the potential perils of near-future AI systems and inspires meaningful positive pathways to take.

Other collaborators include audiovisual media artists and producers for prototyping and consultations.

Intended actions for Phase 1

  • Script drafting and iterations (collaboration with writers)

  • Consultations with “third wave AI safety” experts

  • Audience research (it is likely that by Q3 2026 many AI disempowerment cases will be far more publicly salient than they are today, in July 2025)

  • Producing video tests with generative AI technology

  • Drafting a public campaign strategy around the film release (marketing and campaigning consultations)

  • Production strategy for Phase 2 (collaboration with European and US film producers)

  • Funding strategy for Phase 2

Risks and pitfalls to avoid:

  1. Scope creep - USD 2M for production seems like a sweet spot for making something impactful yet creatively agile; we should not let it balloon into development hell while the form and content become outdated. Q3 2026 might be around the time when hybrid human-AI cinematography peaks, before AI culture starts taking over. If a theatrical feature film turns out to be too difficult, we might reroute towards a shorter online format or series, which could be meaningfully produced even on a smaller budget.

  2. Let’s not oversaturate the AI doom memeplex; it is hard to control. The positive roadmaps should be drafted from the outset, not as an afterthought.
