Manifund

The market for grants

Manifund helps great charities get the funding they need. Discover amazing projects, buy impact certs, and weigh in on what gets funded.

Comments
'Making God': a Documentary on AI Risks for the Public

Jisoo Kim

about 2 hours ago

Very important and timely work. Education is empowerment and people need to know about what AI - including AGI and superintelligence - may do to their lives if we're not careful now. Good luck, Connor!

Split Personality Training

Austin Chen

about 8 hours ago

Approving and making a small ($1k) donation as well, as the priors on this kind of project seem reasonable (MATS mentee asking to continue their project), and Marius and Evan's support gives me some confidence. As Marius says, strong early results could encourage me to fund this with more!

Visions from the Archive

Tobias Stevenson

about 10 hours ago

because I was told I need to show some self love ... no one else did

AI forecasting and policy research by the AI 2027 team

Mudit Arun Tulsianey

about 16 hours ago

I'm excited for what the team does next!

Personally loved AI 2027, and am exploring ways to contribute positively to the field :)

One thing on my mind:

As a physicist, I've observed that our most innovative solutions often emerge when diverse perspectives converge. I'd love to see more initiatives actively recruiting non-STEM majors into AI safety. Humanities scholars bring unique analytical frameworks and ethical considerations that complement technical approaches, as brilliantly illustrated in Gillian Tett's 'Anthrovision'. Perhaps dedicated fellowships or collaborative projects pairing technical and humanities researchers could catalyse these valuable cross-disciplinary insights?

AI Safety Reading Group at metauni [Retrospective]

Matthew Cameron Farrugia-Roberts

about 18 hours ago

Final report

Thanks to everyone for the retrospective support!

As a final note, though unrelated to this funding: I recently moved to Oxford and started a new AI safety reading group with some students here.

Finishing The SB-1047 Documentary

Michaël Rubens Trazzi

about 20 hours ago

Final report

Closing the project.

Making 52 AI Alignment Video Explainers and Podcasts

Michaël Rubens Trazzi

about 20 hours ago

Final report

Closing the project.

Create open source predictors for various genetically influenced traits such as intelligence and disease risk

Gene Smith

about 21 hours ago

Progress update

What progress have you made since your last update?

This project has slowed down a bit because all the volunteers working on it became busy with other work. We have made some additional progress since then, most notably some work by Kman to make a predictor that we think should be able to explain roughly 10% of the within-family variance in intelligence.

What are your next steps?

Kman needs to finish the predictor and Franz needs to get a website up where people can upload genomes.

Is there anything others could help you with?

We could use some help finishing the front end for the website. There is a partly finished front end, and a dev who is willing to work with anyone who wants to help get this project across the finish line. We still have about half the funds left, so we can provide some money to devs who want to help out with this (though probably not a market-rate salary).

'Making God': a Documentary on AI Risks for the Public

Connor Axiotes

2 days ago

@jeff3454 thanks for your donations! Can you please (if you'd like) email me at connor.axiotes@gmail.com - we'd love to say thanks.

'Making God': a Documentary on AI Risks for the Public

Connor Axiotes

2 days ago

@Austin thank you! We appreciate it.

'Making God': a Documentary on AI Risks for the Public

Austin Chen

3 days ago

@Connoraxiotes Just wanted to say that I very much appreciate this level of transparency and openness about your thinking & finances; I think it's super helpful for others embarking on similar projects!

'Making God': a Documentary on AI Risks for the Public

Connor Axiotes

3 days ago

@michaeltrazzi hey!

All in all, just above $17,000. The 9-10 people on set also included myself, Mike, Gary, and our film photographer.

A Netflix shoot is usually minimum $20k, and we want our shoots to be at that level as our aim is to break into film festivals and then streaming services. That's why we had 4 cameras! (Three FX6s and one FX3).

I would pay this price again, though, because we have an interview that is intriguing and super cinematic-looking, which was our purpose with this documentary.

It was a particularly expensive shoot because:

  • We had less than one week's notice to sort the whole shoot, fly over from SF to NYC, and hire a local crew.

  • Because Gary Marcus was coming to NYC and no longer has an office here, we had to pay for a filming location (when hitherto we've been using people's offices and homes).

  • Our Airbnb was expensive because there was a chance we might have had to house crew and do some b-roll filming there, so it had to suit both those things.

  • We had to ship extra luggage on our flights.

I will note that I've been quite successful at getting price reductions on things like crew and gear hire. And I'll continue to fight for value. Mike is also quite experienced with these bigger shoots so he has a good intuition for what should cost what.

Regarding the burn: we also raised $50,000 privately in the last few weeks. So although the interview was a significant cost, we thought it was worth it. But to finish the documentary we still need to raise a lot.

Michael, you should come help us fundraise, haha! We could use your expertise.

[Costings below if you'd like a look!]

'Making God': a Documentary on AI Risks for the Public

Michaël Rubens Trazzi

3 days ago

@Connoraxiotes curious: how much did flying to NYC and having 9-10 people on set cost?

With that burn, how many more interviews can you shoot?

Coordinal Research: Accelerating the research of safely deploying AI systems.

Michael Chen

4 days ago

Relevant: AIs at the current capability level may be important for future safety work

'Making God': a Documentary on AI Risks for the Public

Connor Axiotes

4 days ago

Progress update

'Making God' Update [12th May 2025] - Gary Marcus Interview, NYC

***Watch our exclusive teaser clip of the interview on X/Twitter and LinkedIn***

We spent the last couple weeks in New York and hired a full crew (around 9-10 people on set) to film a professional cinematic interview with Gary Marcus.

EVN General Support Application

Elizabeth Van Nostrand

4 days ago

Work funded with this grant:
- extended my work on chaos theory, which would otherwise have run out of money in August (https://acesounderglass.com/2024/09/20/applications-of-chaos-saying-no-with-hastings-greer/, https://acesounderglass.com/2024/11/01/11924/)
- my VO2max research was covered by a client; however, tutoring on the write-up was covered by this grant (https://acesounderglass.com/2025/03/09/11954/)
- luck-based medicine updates, including a prediction market on the outcome (https://acesounderglass.com/2025/04/11/journal-of-null-results-ezmelt-sublingual-vitamins/, https://acesounderglass.com/2024/12/01/luck-based-medicine-no-good-very-bad-winter-cured-my-hypothyroidism/)
- unpublished drafts on cults, abusive relationships, and high-investment groups
- AI research tool comparisons: https://acesounderglass.com/2024/10/04/ai-research-assistants-competition-2024q3-tie-between-elicit-and-you-com/

EVN General Support Application

Elizabeth Van Nostrand

4 days ago

This eventually led to https://www.lesswrong.com/posts/son5eEGymm4h856J9/estimating-the-benefits-of-a-new-flu-drug-bxm

'Making God': a Documentary on AI Risks for the Public

Seldon

4 days ago

🏴‍☠️ We are staunch supporters of creating a global movement for existential security and, if this is a success, it will be one of the best ways to kickstart it. We will be watching from the sidelines!

'Making God': a Documentary on AI Risks for the Public

Stephan Wäldchen

4 days ago

This is a great initiative! Sounds like the crew knows what they are doing!

The new pope is also into AIS, maybe you could get him to interview? :D

[CLOSED] Arkose may close soon

Vael Gates

5 days ago

Stating interest in pledging $8k if we're close to the funding bar!

Split Personality Training

Marius Hobbhahn

5 days ago

Funding with $2000 to get the project off the ground.

I have talked to Florian about this project during the last MATS cohort presentation day. I felt like his conceptual considerations were good, and the motivation makes sense.

I have no clear evidence for or against his ability to execute projects quickly, which is why I'm keeping it at $2k.

I might consider more funding if there are good early results or other strong evidence of progress that I can easily verify. I'd recommend trying to sprint to a 4-6 week MVP and publishing or at least writing up and privately sharing the results.
