Hi, I'm Gaetan Selle. I am seeking $60,000 to spend 12 months working full time on The Flares, a Francophone media project that I co-founded, focused on the future of humanity, with a recent editorial pivot towards AI safety. The project addresses a major gap in the French-speaking information ecosystem: there are very few established creators producing serious, accessible, high-quality content on advanced AI risk for a broad audience.
The Flares has already demonstrated substantial reach and traction, with nearly 80k subscribers on YouTube, 8 million views, multiple videos reaching over 100k views, and strong engagement on recent AI safety content. I have also interviewed leading figures in the field and was selected for the Frame Fellowship in San Francisco, the first fellowship dedicated to creators communicating on AI risk.
I'm unusually positioned to do this, not only because I'm passionate, but because I combine professional filmmaking ability, proven audience traction, access to AI safety guests, a French-language advantage (few creators operate in this space), the trust of organisations such as the Future of Life Institute and Coefficient Giving, and an existing platform.
The goal of this project is to strengthen awareness and understanding of AI safety, the risks of superintelligent AI, and AI governance in the French-speaking world. More broadly, it aims to help build a more informed public conversation around advanced AI and its implications for humanity’s long-term future.
Funding would allow me to produce at least two in-depth video essays and podcast episodes per month, significantly increasing the amount of high-quality Francophone AI risk communication and helping move viewers from awareness to action. It would also give me the freedom to think outside the box with special projects such as a documentary or short films.
A further goal is to encourage civic engagement, not only awareness. One of my main calls to action is to direct viewers who are concerned about AI risk to the Pause IA website and its tools, which allow them to contact their elected officials in France. In this way, the project aims not only to inform audiences, but also to help convert concern into concrete democratic action and shift the Overton window.
I see this as a field-building and public-understanding project. The long-term goal is to help create a more informed French-speaking public sphere around advanced AI risk, so that concern about these issues can become mainstream.
This funding would primarily be used to allow me to work full time on The Flares for 12 months. The largest share would support my compensation, with the remainder covering modest operating and production costs such as software, hosting, transcription, subtitles, graphics, and occasional production-related expenses.
The funding would therefore convert an already proven but constrained part-time project into a stable full-time operation, allowing greater output, depth, and consistency.
The project is currently founder-led. I am responsible for the core work on the YouTube channel: editorial direction, research, scripting, guest outreach, interviewing, hosting, filming, editing, publishing, and overall project management. The newsletter, website, and other systems and operations are managed by my co-founder.
Occasional outside help may be used for limited support tasks, but there are currently no other salaried staff.
My background in filmmaking allows me to be very versatile, both creatively and technically. My track record includes award-winning short films and building The Flares to nearly 80k subscribers and around 8 million total views over roughly eight years. Several videos have reached well over 100k views, including a recent video on the METR graph that has reached about 104k views and continues to grow. I have also interviewed prominent AI safety figures, including Malo Bourgon, CEO of MIRI, David Krueger (Evitable), Max Winga (ControlAI), and others.
The most likely cause of failure is insufficient funding to sustain the project at a full-time level. Throughout last year, the channel was maintained part time with very limited hours, and as a result I was only able to produce one video essay and around ten podcast episodes over the entire year.
By contrast, since the beginning of this year, with part-time support from the Future of Life Institute and the two-month full-time incubator-style Frame Fellowship, I was able to produce five video essays and five podcast episodes. This is a significant increase compared with last year and suggests that with full-time capacity I can produce substantially more content, reach more people, and grow the channel faster. Since increasing output, the channel has resumed growth and gained nearly 2,000 new subscribers in roughly two to three months.
If this project fails to secure funding, I will most likely have to return to a much more limited part-time model. That would slow output, slow growth, reduce reach, and ultimately reduce impact at a time when public understanding of increasingly powerful AI systems may be especially urgent. The main outcome would therefore be a missed opportunity to expand serious Francophone communication on AI safety when the need for it may be growing quickly. This is particularly important because Europe plays a major role in AI regulation, so strengthening informed public understanding in the French-speaking world could have broader relevance for how these technologies are governed.
Over the last 12 months, the project has raised very limited funding. In January 2026, it received a six-month grant of USD 6,000 from the Future of Life Institute, covering approximately one day per week of work on The Flares. In addition, the project has generated roughly USD 500 from small crowdfunding contributions and YouTube revenue over the same period.
Overall, external funding has so far been modest, and the project has largely been sustained through the founders' labour rather than significant financial support.