Keep Apart Research Going: Global AI Safety Research & Talent Pipeline

Technical AI safety · AI governance · EA community

Apart Research

Active Grant
$130,786 raised
$954,800 funding goal


Project summary

Apart Research is at a pivotal moment. In the past 2.5 years, we've built a global pipeline for AI safety research and talent that has produced 22 peer-reviewed publications in venues like ICLR, NeurIPS, ICML, and ACL, engaged 3,500+ participants in 42 research sprints across 50+ global locations, and helped launch top talent into AI safety careers.

Our impact spans research excellence, talent development, and policy influence: Two of our recent publications received Oral Spotlights at ICLR 2025 (top 1.8% of accepted papers) and our research has been cited by leading AI and AI safety labs. Our participants have landed jobs at METR, Oxford, Far.ai, and in impactful founder roles, while our policy engagement includes presenting at premier forums like IASEAI and serving as expert consultants to the EU AI Act Code of Practice. Major tech publications have featured our work, extending our influence beyond academic circles. Without immediate funding, this momentum will stop in June 2025.

Read our Impact Report here: https://apartresearch.com/donate

What are this project's goals? How will you achieve them?

Our primary goals are to:

  1. Convert untapped technical talent into AI safety researchers: Through our global research sprints (200+ monthly participants), we identify individuals with exceptional potential from tech and science backgrounds and enable them to contribute to AI safety immediately

  2. Produce high-impact technical AI safety research: Publish 10-15 new peer-reviewed papers on critical challenges including interpretability, evaluation methodologies for critically dangerous capabilities, and AGI security and control; enable horizon-scanning for important research topics in AI safety via our open-ended research hackathons

  3. Place trained researchers at key organizations: Support 30+ Lab Fellows at any given time, preparing them for roles at leading AI safety institutions, non-profits, and startups

We'll achieve these through our proven three-part model:

  • Global Research Sprints: Weekend-long events across 50+ locations identifying promising researchers and novel approaches

  • Studio Program: 4-week accelerator developing the best sprint ideas into substantive research proposals

  • Lab Fellowship: 3-6 month intensive global program for publication-quality work with compute resources, project management, and mentorship

Our model excels at rapidly identifying and developing talent with significant counterfactual impact. For example, one of the participants of our March 2024 METR x Apart hackathon, a serial entrepreneur with a physics and robotics background, joined METR as a full-time member of technical staff largely because of our event. Shortly after our event, he also contributed to a research project in our lab, which he presented at ICLR 2025 (and which received an oral spotlight). Similar success stories have occurred for fellows landing jobs at Oxford, Far.ai, founding impactful AI safety startups, or establishing new AI safety teams in high-growth organizations.

How will this funding be used?

The funding will directly support our talent and research acceleration pipeline. Our 12-month budget breaks down as follows (scale down proportionally for shorter funding periods):

Staff Compensation for 8 FTE ($691,200, 73%):

  • Research Project Management ensuring fellows produce publication-quality work

  • Research Engineering providing technical support and automation across projects and talent pipelines

  • Sprint & core operations ensuring program effectiveness, follow-up, and impact

Program Related Costs ($156,000, 16%):

  • Direct Program Expenses ($54,000): Lab & Studio infrastructure, research software, fellow conference travel and attendance

  • Travel Costs ($102,000): Travel, conference attendance, meals, and accommodation for the Apart team and Lab Fellows presenting their work

Indirect Costs & Fiscal Sponsorship ($107,600, 11%):

  • Indirect Expenses ($60,000): Software & subscription costs, office rental and other necessary operational expenses

  • Fiscal Sponsorship ($47,600): Costs incurred through our agreement with Ashgro for accounting, legal support, and retention of non-profit status

Our current ask of $954,800 covers a 12-month budget. Toward that total, we have multiple funding milestones:

  • $120,000 will be enough to maintain our position in AI safety and expand our automated field-building and research tooling, though we would need to cut staff and all programs.

  • $238,700 is the minimum amount we need to continue our research and events work for three months, providing opportunity for the Apart community.

  • $477,400 will carry us to the end of the year, giving hundreds of people a chance to participate in and contribute to AI safety.

$954,800 will enable us to continue into 2026 with our research and events work, creating impact for thousands of people.
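The budget figures above are internally consistent; here is a quick, purely illustrative sketch checking that the three category subtotals sum to the full ask and that the milestones are quarter- and half-year fractions of it:

```python
# Illustrative check of the 12-month budget above (all figures in USD).
budget = {
    "Staff compensation (8 FTE)": 691_200,
    "Program related costs": 156_000,
    "Indirect costs & fiscal sponsorship": 107_600,
}

total = sum(budget.values())
print(f"Total: ${total:,}")  # Total: $954,800 -- matches the funding goal

# Each category's share of the total, rounded to the nearest percent.
for name, amount in budget.items():
    print(f"{name}: {100 * amount / total:.0f}%")

# The funding milestones are simple fractions of the annual budget:
# three months is a quarter, and "until the end of the year" is half.
assert total // 4 == 238_700
assert total // 2 == 477_400
```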

How much is your donation worth?

  • $100,000 enables the publication of 4 peer-reviewed AI safety papers

  • $50,000 enables 3 global research sprints identifying new safety approaches

  • $5,000 supports a Lab Fellow producing publication-quality research (including conference attendance)

  • $200 enables 3 people to participate in a hackathon

Who is on your team? What's your track record on similar projects?

Our team combines research expertise and operational excellence, with the following key members:

  • Jason Hoelscher-Obermaier (Research Director): Quantum optics PhD, AI engineer at multiple startups, and PIBBSS fellow

  • Natalia Pérez-Campanero (Research Project Manager): PhD in Bioengineering, former program manager at Royal Society's talent accelerator

  • Archana Vaidheeswaran (Community Program Manager): Board member at Women in ML, experienced in organizing workshops with 2,000+ participants co-located with major ML conferences

  • Jaime Raldúa (Research Engineer): 8+ years ML engineering experience with multiple key contributions to software stacks at impactful EA orgs

  • Jacob Haimes (Research Assistant): MS from CU Boulder, AI Safety Specialist at the Odyssean Institute, founder of the Into AI Safety podcast

  • Clement Neo (Research Assistant): Research Engineer at the Singapore AISI and former research intern at Oxford, supporting Apart researchers part-time

Advisors:

  • Esben Kran: Co-founder and advisor

  • Finn Metz: Operations and funding advisor

  • Christian Schroeder de Witt: Research advisor

  • Eric Ries: Strategic advisor

  • Nick Fitz: Organizational development advisor

Track Record:

  • 22 peer-reviewed AI safety publications, including at ICLR, NeurIPS, and ACL

  • Two papers receiving Oral Spotlights at ICLR 2025 (top 1.8% of accepted papers)

  • 42 global research sprints engaging 3,500+ participants

  • 105 researchers incubated through our Lab Fellowship, plus 40 through our Studio Program since December

  • 26 placements at 20+ organizations including METR, Oxford, & Far.ai.

  • Research cited by OpenAI's Superalignment team and other major AI labs

What are the most likely causes and outcomes if this project fails?

The most likely failure modes are:

Insufficient funding: Without adequate resources, we would be forced to disband a high-functioning team built over 2.5 years, losing a proven talent pipeline at a critical time for AI safety and abandoning valuable talent and research projects. Mitigation: We have already diversified our funding substantially, including partnerships and sponsorships.

Research relevance and impact: Our research may not keep up with rapidly evolving field priorities and we could face diminishing returns on novelty for our research hackathon model. Mitigation: We maintain close collaboration with leading AI labs and safety organizations to continuously align our research priorities, while our model allows for rapid adaptation to emerging safety concerns and to previously neglected topic areas. 

Opportunity cost: With AI capabilities advancing rapidly, moving fast now is necessary to keep critical momentum at precisely the time when safety research is most needed. Mitigation: Our model is designed for efficiency and rapid adaptation, allowing us to maximize impact per dollar invested while prioritizing time-sensitive work on urgent and impactful research areas, such as by prioritizing research inputs for the General-Purpose AI Code of Practice.

Talent pipeline execution risk: Challenges in maintaining quality across a talent pool that spans global mid-career professionals and early-career researchers, and in avoiding overlap with other programs. Mitigation: We have systematic evaluation metrics for participants and a strategic focus on technical backgrounds and locations where we complement rather than compete with existing programs. Examples of differentiation include being remote-first and part-time, which is essential for helping mid-career individuals transition, and focusing on strong research management, which helps non-academics succeed in research.

Industry and partnership challenges: Difficulties in launching new programs, ensuring partner alignment, and continuously facilitating high-quality connections between stakeholders. Mitigation: We've built strong connections with leaders and researchers at key organizations, established formal partnership agreements with clear expectations, and designed our talent pipeline to align with the needs of the AI safety field. We are also expanding our sponsorship setup, through which, for example, Lambda Labs provides $5k in compute to every team for free.

Broader ecosystem risks: Public skepticism of AI safety work could negatively impact donor perception and fundraising efforts. Mitigation: We maintain transparent operations, publish our research openly, engage constructively with diverse perspectives, and focus our messaging on concrete technical contributions.

If we fail to maintain Apart Research, the field would lose:

  • A proven pipeline for identifying and developing global technical talent in AI safety

  • An efficient mechanism for exploring novel research directions at scale

  • A bridge between diverse technical communities and established AI safety organizations

How much money have you raised in the last 12 months, and from where?

In the past 12 months, Apart Research has raised approximately $680,000 from:

  • Survival and Flourishing Fund (SFF)

  • AI Safety Tactical Opportunities Fund (AISTOF)

  • Open Philanthropy

We've also previously received support from the Long-Term Future Fund (LTFF), Foresight & ACX. This funding has enabled us to build our team and infrastructure, but our current funding expires in June 2025, necessitating this fundraising round to maintain our operations.

View our donations page, read our impact report and find more testimonials here:

https://apartresearch.com/donate

Check our website for the total current amount raised (including other sources).

Comments (36) · Donations (26) · Similar (8)

Austin Chen

6 days ago

Approving this grant as part of our portfolio on AI safety fieldbuilding! I'm happy to see such widespread praise for Apart's work, and to see that Richard, Anton, Ryan, Soroush, and Gavin (among many others) want to fund Apart to continue with its mission.

Some reflections:

  • Gavin & Anton have called out that larger funders seem not to be funding Apart. In my head, I sometimes think of Apart as part of & supporting a "shadow AI safety scene", my made-up distinction to contrast Apart and things like AISC, aisafety.world, etc from the "mainstream AI safety scene", which is more typified by OpenPhil, Constellation, MATS, and the AI labs. My sense is that the mainstream scene is top down, high-trust, densely-networked in the Bay (plus London, DC); while the shadow scene is bottoms-up, loosely confederated, remote. The mainstream scene takes a while to build up the trust to dedicate large amounts of funding to people outside of it; I suspect that's one of the key reasons for Apart's funding difficulties. Part of why I'm excited to run Manifund is to support projects outside the "mainstream" (where I/Manifold/Manifund began, and arguably still reside); I feel that Apart is in some sense kindred to Manifund, and I'm particularly glad we can be helping them here.

  • I also just really like hackathons (see this), and so I'm glad that somebody else in the ecosystem is systematically pushing for more hackathons!

  • My biggest uncertainty with Apart is that Esben himself is stepping back into an advisory role, to focus more on their new for-profit incubator Seldon. I'm excited for Seldon (we're hosting their first batch out of Mox!), but from an outside view, having a founder depart means that Apart now sits at a fairly critical juncture for having continued impact. As a funder, my best guess is that I'd prefer to fund Seldon over Apart (though I recognize that these cater to different audiences and funders will have different preferences.)

  • My unsolicited advice is that Apart should try to expand their revenue base beyond soliciting donations, to encourage discipline, financial resilience, and incentive alignment. For example, Apart seems to do well at placing great candidates into new jobs, and a natural move would be to charge employers recruiters' fees to fund Apart's continued operations. "Alumni donations" would be another angle on this, similar to how universities raise from those they've taught.

donated $1,000

Peter Barnett

6 days ago

@Austin I would be a bit scared if Apart began relying on recruiter's fees because this has a very strong incentive for training people to work at AI labs (and not necessarily working on good safety things).

donated $10,000

Soroush Pour

6 days ago

(see my comment further below)


Artem Karpov

9 days ago

I upvoted because Apart Research offered me help with my research, which we successfully finished and which was accepted at three workshops. Apart Research gave me invaluable support and help throughout the process. I hope Apart Research continues.


Siya Singh

14 days ago

I don't think anything has helped my career in AI Safety more than Apart; if it weren't for their hackathons (I participated in the 'Women in AI Safety' one), my contribution to the community might have remained passive for a long time. Their hackathon is what spurred action for me; the resources and help they provided were instrumental for the creation of the resource that I eventually managed after the hackathon. Apart has continued to incubate support for my work in AI safety by accepting me into their Studio program, where they have provided me with ample feedback and connected me to many skilled researchers and teams working on similar projects. I'm eternally grateful to Apart, and it receives my highest recommendation; they are the very definition of philanthropy. If you're reading this, you should seriously consider donating to this wonderfully altruistic organisation.

donated $10,000

Soroush Pour

15 days ago

Apart has been working for many years on building a strong, meritocratic pipeline of AIS talent, drawing from some of the most untapped pools of talent globally. The quality and pace of their work is incredibly impressive. I'm donating to their cause and hope others who care about AIS do so as well -- it would be a massive shame for an org of their impact and value not to continue on.


Helios

15 days ago

Apart incubates curiosity into impact. As an embedded software engineer with a minimal ML background, targeted learning and feedback opportunities were essential to building a working understanding of LLMs -- Apart's Hackathons are this platform. I went from writing Linux drivers to co-authoring a mechanistic interpretability paper that landed at ICLR '25 Building Trust Workshop in 8 months. Apart introduced me to the teammates that made that possible, provided the compute resources (and additional resources for follow-up research through Lambda), and funded the conference visit. They don't just talk about bridging technical talent into AI safety -- they actually do it.

donated $100

Suhas Hariharan

17 days ago

Apart's talent pipeline is very differentiated from other offerings in the AI Safety ecosystem and seems to have already produced real outcomes.

donated $100

Michal Bravansky

17 days ago

I believe Apart’s individualized fellowships are particularly valuable in a world increasingly dominated by generic, MATS-like alternatives.

donated $1,000

Peter Barnett

18 days ago

Apart's hackathons and other community building seem pretty great, and often produce actually useful work.

donated $100,000

Richard Ngo

18 days ago

Building a culture of hands-on experimentation is probably the best way to do AI safety outreach, and Apart seems to have executed on it really well.

donated $4,500

Gavin Leech

19 days ago

I like Apart; I like their results.

The top-of-funnel reference class they're in includes Bluedot, Nontrivial, AISC, aisafety.info. Three of those groups also struggle for institutional funding. Two of those groups are also trying to put out real research as legible output. (MATS isn't top-of-funnel in this sense.)

What do we want from this kind of effort?

  1. leaders who know what they're talking about technically (check)

  2. friendly (check)

  3. otherwise good vibes and epistemics (check)

  4. actually doing stuff; empirical; e.g. seeking (proxy) feedback for ideas from conference reviews (check)

  5. throughput (check)

  6. buy-in from ML or safety or governance or whatnot (mixed?)

I find it difficult to estimate the value of openness and friendliness in field-building but it's not small, and this is not their only selling point. Good luck!

donated $4,500

Gavin Leech

19 days ago

@gleech I wonder if putting the full $1m target up on Manifund suppresses donors by making their donation seem smaller. But it has the advantage of being fully honest.


Jaeson Booker

20 days ago

Strongly endorse this!

donated $150

Josh Landes

20 days ago

It's been great collaborating with the Apart team on top-of-the-funnel fieldbuilding!

Apart has a good way to make many different kinds of bets and test ideas relatively cheaply (something I think they should lean into more!). I'd overall like for their work to continue.

donated $10
🌷

24 days ago

Apart Research publishes multiple papers in AI safety. I don't like the entrepreneurial vibe, but they have results.


Anton Makiievskyi

24 days ago

With such an amazing track record and previous support from big funders, I wonder why they (the big funders) are letting you run out of money. Did they all refuse additional support?


Apart Research

24 days ago

@AntonMakiievskyi

TL;DR: no big funders have retracted funding; we have undersold our impact in grant applications by not focusing on the metrics that we now know work best; our network within SF isn't as strong as other orgs'; and the AIS funding ecosystem faces general problems. See a longer response to this question here.


Anton Makiievskyi

14 days ago

@Apart Sending $100k through every.org. Good luck! I hope OP or SFF will step up in the upcoming months and cover the rest of your budget.

donated $10

Charbel-Raphael Segerie

25 days ago

Apart has been useful for me to quickly experiment with ideas and improve through quick iteration. I organised multiple hackathons before knowing Apart, and their format is vastly more effective at converting talent per unit of effort. While I was head of EffiSciences' AI Safety Unit, this was one of my favorite event formats, and it is one of the formats that I encourage alumni of ML4Good to run. Empirically, each Apart hackathon that I organized in Paris enabled the long-term careers of 0.6 people in AI safety (see the table). This means that, on average, 0.6 new full-time people started working on AI safety after each Apart hackathon event in Paris.

donated $20

Bart Bussmann

26 days ago

During an Apart Research hackathon I got my first hands-on experience with mechanistic interpretability and fell in love with it. Now I'm working on mechinterp full-time and have published mechinterp papers at ICLR and ICML. Generally their hackathons are very accessible, well-organized, and a great entry-point for people interested in AI safety.

I believe this is a great funding opportunity for any funders interested in getting more people into working on AI safety!


Brian Tan

27 days ago

I've been impressed with Apart Research for a while now, and I think more funders should consider filling their funding gap. Their Apart Sprints are a great global on-ramp for many people into AI safety research, and some of our fellows at WhiteBox Research have participated in them. I also got to listen to both of their ICLR orals in-person, which were insightful.

donated $400

Jim Chapman

27 days ago

I think Apart Research's hackathons are a great place for community and skill building. I want your work to continue.

donated $10

Felix Michalak

about 1 month ago

AI Safety is suffering from elitism, induced by scarce financial resources that then also have to be used to offer high-paying jobs to make career switches more attractive. Even though interest in AI Safety has grown tremendously, these dynamics are still in place, and almost no one offers low entry points for early-career individuals (e.g. students in undergraduate studies).

Apart is the only organization I know that goes against this elitist trend by offering support and guidance to virtually anyone who is seriously interested in AI Safety research, no matter their previous qualifications. The (AI Safety) world needs more research and field-building organizations like Apart, as otherwise the field of AI Safety will not manage to develop fast enough to solve fundamental issues before it's too late.

Apart had a tremendous impact on the development of my research skills and interests, and I am grateful for their existence. This organization cannot cease to exist, as it would be detrimental to the AI Safety landscape.

donated $26

Evelyn Ciara

about 1 month ago

I participated in Apart's security evaluations hackathon with a friend last May, and it helped me get my foot in the door and start exploring the AI safety technical research space. I believe that Apart's hackathons and other programs are a critical part of the talent pipeline for the AI safety community, and it would be a shame if they had to shut down due to lack of funding.

donated $50

Devina Jain

about 1 month ago

Apart Research offers an accessible entry point to AI safety research, and it would be a tremendous loss if it didn't exist anymore. I don't believe there's another organization that does what they do!

🦄

about 1 month ago

I first got into mech interp by going to an Apart Hackathon. I have enjoyed every Apart Hackathon that I've participated in and found them to be extremely productive.


Abby Lupi

about 1 month ago

Apart Research forms a critical entry point to contributing meaningfully to research into the powerful technologies poised to shape our future. It is absolutely essential that we expand participation in AI ethics and safety research beyond the institutions that have the most money and power. As a widely recognized and increasingly accessible nonprofit, Apart is uniquely positioned to elevate global, diverse, and underrepresented voices in this critical conversation. I'm personally thankful for Apart as an early-career researcher without a graduate degree who is now equipped to contribute to the research community, something I previously thought was out of reach.

donated $60

Jeremias Ferrao

about 1 month ago

Apart has been instrumental in my early career as an AI researcher. Its hackathons and Lab program significantly boosted my research skills and confidence, culminating in my first publication at a workshop of a top-tier conference. This pivotal achievement, in particular, has profoundly shaped my future aspirations and motivated me to aim much higher. Beyond the technical growth, I greatly valued engaging with like-minded individuals and becoming aware of the broader AI safety community and opportunities. Moreover, Apart's highly accommodating programs provided essential flexibility, a key factor that makes participation feasible for many members. I am very optimistic about the organization and wholeheartedly believe in its continued ability to produce world-class, socially impactful research.


Mindy Ng

about 1 month ago

Apart Research has immensely helped me in my AI safety/alignment journey. Coming from industry, where I knew about AI development but not necessarily its consequences, the experience has been enlightening. Through Apart, I have gained a deeper understanding of AI safety through their Hackathon, Studio, and Fellowship. Not only am I able to connect with others in the lab through a shared mission, but I can also help make an impact through rigorous research. The lab has helped develop my research skills so that I can contribute to issues that badly need exploring yet have few people working on them. Apart Research helps build up the talent needed so that AI safety labs can continue to ensure safe AI. If Apart Research shuts down, it would be a huge loss not just to the research field, but to society.

donated $30

Jord Nguyen

about 1 month ago

Love the hackathons and the Apart community! They were very useful when I first started safety research.


Eitan Sprejer

about 1 month ago

Apart makes a huge impact by giving early-career aspiring AI safety researchers the opportunity to do a research project over a weekend and have something to show when applying to research internships.


Markela Zeneli

about 1 month ago

Apart Research is the most accessible AI Safety research group I have encountered. Their pipeline of hackathons -> studio -> lab fellow is unique, and given the number of successful papers that come out of the lab, the pipeline works. Apart also acts as a catalyst for people that want to career-switch into AI Safety, and the mentorship I have received from them has been the most impactful and valuable contribution to my personal journey.

Balancing accessibility with ops can be incredibly tricky, but Apart have a proven track record of being able to balance it. Every member is dedicated to producing -exceptional- research, whilst bringing people up to speed with the latest developments and unbiased viewpoints. This alone is priceless, but without funding, they will not be able to continue fostering such an amazing community, and that would be a massive loss to the AI Safety field.

donated $30

Nyasha Duri

about 1 month ago

Apart Research is an exceptional, vital, and impactful community to say the least. Without it, I would not have been able to connect with such a transformative, inclusive, and unique network now. Having experienced and or evaluated many other formats, I believe that their model is the most effective and efficient.

The experiences I gained have been very helpful for my career overall, and instrumental in focusing more on AI safety. To name just one example of the many ways in which it has been extremely valuable: it has led to inbound opportunities with multinationals, national firms, and a leading local education institution.

I 100% feel this is one of the best things I have ever been a part of, where so much talent abounds from fellow participants, mentors, speakers and so on. As for the great minds behind the scenes powering things, I have had the privilege of collaborating with many incredible teams over the past decade: they truly stand out as among the best of the best.

Also, I know how meaningful it is to other people, including those who tell me they have yet to take part in a research sprint but aspire to do so. Not being able to continue this essential work would be a huge loss, not to mention at such a crucial time for the field.

What I have written above in yet another failed attempt to summarise (should have gotten an LLM involved but didn't think of it until now) does not do Apart justice; I would be happy to expand anytime.

donated $20

Auguste Baum

about 1 month ago

Apart is a great community to get up to speed in AI safety

donated $20

Lucie Philippon

about 1 month ago

Apart Sprints were useful in bootstrapping my AI safety career. The hackathon reports were my first publications. They also gave me concrete experience thinking about LLMs and AI strategy.