Manifund

Reflective altruism

EA community

David Thorstad

Active grant
$2,000 raised
$1,000 funding goal
Fully funded and not currently accepting donations.

Project summary

I work on longtermism at the Global Priorities Institute. I've recently started communicating my research and related thoughts on my blog, Reflective Altruism, and associated social media accounts. I'd like to continue this outreach work, and possibly to expand it.

Project goals

The goal of my work is to use academic research to drive positive change within and around the effective altruism movement.

How will this funding be used?

I'm seeking funding to support my blog and associated social media accounts. At a minimum, I'd like to support existing activities (blog, social media). I might branch out into further offerings, such as a podcast.


I've found myself spending more money than I expected on my Reflective Altruism project. Currently, I'm spending about $250/year on web hosting, $100/year on Twitter, and $200/year on text-to-speech. I've also had to rely on some free services; for example, a very good graphic designer donated their time because I couldn't afford a proper logo.

I think that $500 could cover my expenses for this year, and $1-2k could cover my expenses for two years. I'm asking for the lower amount, $1k, to cover two years of operating costs.

All funding will go toward covering Reflective Altruism's operating expenses; I won't take any salary from it.

How could this project be actively harmful?

I'm often critical of longtermism, and sometimes more broadly critical of effective altruism. If you like any of the views or movements I'm criticizing, you might think I'm doing exactly the wrong thing. If you think that levels of near-term existential risk are very high, I might be among the worst people ever born.

What other funding is this person or project getting?

I'm not taking any other funding for this project. I'm funded for unrelated work from the Templeton Foundation's Humility in Inquiry project, and funding for previously completed projects is listed on my CV.

Similar projects
Michaël Rubens Trazzi

Making 52 AI Alignment Video Explainers and Podcasts

EA Community Choice
$15.3K raised

Hein de Haan

Write and publish an e-book advocating for longtermism and sentientism.

How can we build an awesome civilization for all sentient life, and ensure it will be (even more) awesome in the future?

ACX Grants 2024
$0 raised

Felipe Doria

Global Priorities Research at ANU

2-month support to do research on consequentialism and decision theory at ANU

$0 raised

David Moss

Experiments to test EA / longtermist framings and branding

Testing responses to “effective altruism”, “longtermism”, “existential risk” etc.

EA community
$26.8K raised

Jørgen Ljønes

80,000 Hours

We provide research and support to help people move into careers that effectively tackle the world’s most pressing problems.

Technical AI safety, AI governance, Biosecurity, EA Community Choice, EA community, Global catastrophic risks
$4.92K raised

Aryeh L. Englander

Continued funding for a PhD in AI x-risk decision and risk analysis

Continuation of a previous grant to allow me to pursue a PhD in risk and decision analysis related to AI x-risks

Technical AI safety
$0 raised

Michael Dello-Iacovo

Space Science Guy (An EA-aligned YouTube channel)

Animal welfare, Long-Term Future Fund, EA Infrastructure Fund, Global catastrophic risks
$0 raised