Saul Munn


Co-Organizer of OPTIC



Saul Munn

2 months ago

it'd be great to have a clear theory of change, if you have one — if you don't, that's okay, but if the goal is (e.g.) "get political researchers to use Estimaker in their research," then you might consider writing out a plan on how you intend to (e.g.) get political researchers to use Estimaker in their research.

tldr: these projects look cool; what's your concrete plan to turn them from "oh cool, this little app thingy" into "a bunch of {key decisionmakers, relevant academics, etc} are using this"?

also, smaller comments:

  • i just signed up for estimaker... but i have no idea how to use it? i'm just shown a snowflake, a percentage, and a blinking cursor for code. was there some tutorial or onboarding process that i missed? i don't even know what language to use :( it'd be really helpful to at least have documentation! (and if there is some & i couldn't find it in 2-3 minutes of looking, that makes it an easy problem to fix — just make it more prominent!)

  • on viewpoints, it'd be awesome if i could use keypresses instead of clicks (maybe up/down/left/right arrows?)

  • viewpoints sorta feels like i'm filling out a census report. it almost feels like a game, but there could be many more gamified elements to make it more fun to fill out; after 2-3 minutes, it gets to be pretty much exactly the same, over & over again.

good luck nathan! always here if you want or need help :D


Saul Munn

3 months ago

Here's a Notion doc of the below :)

ACX Forecasting Mini-Grants Update

Hey! We’re posting this update as ACX Forecasting Mini-Grants begins the retroactive evaluation period. A quick timeline:

  • We ran a pilot intercollegiate forecasting competition in April 2023. It went really well!

  • We wrote a (detailed) postmortem on the pilot competition in May 2023.

  • We kept working on OPTIC over the summer, and have a lot of plans for the future (some specifics are below).

Rachel & Austin wanted all of the projects to answer the questions below. If you want the details on the pilot competition, check out the postmortem, but it doesn’t include our future plans.

We’d also be happy to meet — online or in person! Reach out, or come chat with us at Manifest. Tom is based in the Bay, Jingyi is based in Boston, and Saul is in the Bay for September and Boston for October.

  1. How much money have you spent so far? Have you gotten more funding from other sources? Do you need more funding?

    Answers detailed in the table below. We received funding from Manifund and the Long Term Future Fund (LTFF).

                          Total   Manifund   LTFF
        Original funding   6001       3901   2100
        Used               2208
        Committed*         3000
        Leftover            792

    *The prize money has not yet been given out, but has been committed — it’s sitting in our bank account waiting to go to the winners.

    Our original Manifund project was funded to run a pilot intercollegiate forecasting competition. This original, pilot competition doesn’t need more funding — it’s already been completed — but OPTIC as an organization has expanded. We’re seeking funding to run additional competitions and help support forecasting clubs. We are likely to receive a moderate amount of funding through a corporate sponsor and some forecasting organizations, and have pending applications for more funding through Open Philanthropy, the Long-Term Future Fund, and Manifund. We also applied to Lightspeed and received no funding.

  2. How is the project going? (a few paragraphs)

    1. The pilot went well (see the postmortem for a detailed account), and we’re planning to run 2-4 tournaments in the fall. (This is dependent primarily on funding & high-quality hires.)

    2. We’ve adjusted the mission of OPTIC from purely running forecasting competitions to generally promoting collegiate forecasting. At this point, that means expanding our reach to support about 2-4 forecasting clubs this semester, and more in future semesters. We’re providing fiscal support (fundraising now!), organizational mentorship, instructional content, and outreach materials.

  3. How well has your project gone compared to where you expected it to be at this point? (Score from 1-10, 10 = Better than expected)

    1. Our scoring rule:

      1. 1 = much worse than expected

      2. 5 = exactly as expected

      3. 10 = much better than expected

    2. Average response: 7.83

      1. Saul’s response: 8

      2. Tom’s response: 8

      3. Jingyi’s response: 7.5

  4. Are there any remaining ways you need help, besides more funding?

    1. Yes! We need:

      1. competitors! Sign up on our website :)

      2. organizers! We’re running competitions in the fall in at least Boston and the Bay Area, and potentially DC and London. We’re bottlenecked on local organizing capacity, and finding good organizers will determine which competitions we can run.

      3. speakers/panelists! We are still looking for speakers/panelists, especially for the non-Bay Area competitions.

  5. Any other thoughts or feedback?

    1. Nothing other than gratitude toward Rachel, Austin, and Scott for setting this up :)


Saul Munn

3 months ago

@joel_bkr Hey Joel! Thanks so much for your comments — really appreciate your thoughts. I've responded to each of your two reasons for skepticism below.

TLDR: regulatory approval is overrated as a bottleneck for (most) forecasting use-cases, and creating a large pool of forecasters is largely instrumental to acceptance inside influential decision-making institutions. On potentially more effective ways of creating top forecasters (e.g. job postings, public examples), I think it's important to note that universities offer a fairly unique opportunity to immerse students in a shared experience, culture, etc. Meeting for an hour a week with friends is a totally different level of commitment than learning forecasting because it was listed on a job posting. (Also, Misha himself has chatted with us and is very bullish on university-level forecasting!)

The above TLDR probably covers about 60-70% of the content below. Happy to talk more about these and/or answer any other concerns you might have, either in writing or in a chat! :)


It doesn't seem like having a larger pool of forecasters is an important bottleneck for use-cases I am aware of. "Regulatory approval" and "acceptance inside prestige institutions" feel like better candidates.

I think this actually might not be true, or at least not how I'm understanding it.

  1. Re: regulatory approval, this is only the limiting factor for real-money prediction markets. Although it's hugely important in that particular use-case, I think it's important to note that many use-cases of forecasting besides real-money prediction markets do not rely on regulatory approval. Metaculus (and even Manifold!) are great examples of platforms which provide incredible value, and Open Philanthropy's explicit use of forecasting in their grant-making is a great example of decision-makers using forecasting — neither of these required regulatory approval, and regulatory approval would not have improved their impact. If (e.g.) Metaculus had 10x the superforecasters, we would probably have substantially more forecasts on more topics, leading to more accurate forecasts on a wider variety of important areas.

    IMHO, real-money prediction markets are unlikely to be the source of the majority of impact from the field of forecasting. Much more likely, it'd come from people identifying talented forecasters and using those individuals in key situations. This is a hotter take, but I do think it's a crux — real-money prediction markets would be great, but in terms of impact, I'd far prefer a widespread Metaculus to a widespread Kalshi.

  2. Re: "acceptance inside prestige institutions," I'm not entirely sure what you mean.

    If you mean "acceptance from influential decision-makers/key decision-making bodies, like politicians, big NGOs, etc," that makes sense, and I agree!

    Note that:

    1. current students = future key-decision-makers

    2. acceptance from influential decision-makers would be significantly easier if forecasting was a generally accepted way of doing things (see our 2nd goal under our theory of change)

    3. in order for forecasting to be desirable to influential decision-makers, it needs to first work well — one of the best ways for forecasting to work better is if more people are doing it. This is true for classic "wisdom of the crowds" reasons, but also because if 10x people are doing forecasting, we'll likely discover 10x superforecasters, forecasts will be much more accurate, we'll have forecasts on a wider range of topics, etc.

    If, however, you literally mean "acceptance inside prestige institutions," I don't quite agree — I don't think that's a bottleneck to impactful forecasting. Regardless, I do still think college forecasting clubs solve for this — universities are some of the most prestigious institutions in the US, and high-quality clubs (with associated professors, speakers, etc) at those institutions are acceptance.

Another few comments:

  • Michael Story (Swift Centre) wrote an essay on where forecasting is & isn't useful. Would recommend!

  • I'm in a bit of a unique position to do this, compared to almost all of the rest of the forecasting community — this sort of thing pretty much only works when it's student-led.

  • University forecasting seems substantially cheaper to implement than regulatory approval or broad acceptance at prestigious institutions.

  • This proposal doesn't trade off with regulatory approval or acceptance from prestigious institutions, except with funding efforts. To my knowledge, although there are efforts that could improve the regulatory regime of prediction markets or the institutional acceptance of forecasting, they require social & legal capital, not money (or at least, not money on the order of $8.4k).


I would guess that university forecasting clubs are a less beneficial means of creating top forecasters than "jobs listing forecasting skill as desired qualification," "excellent public examples of forecasting to emulate" (e.g. Misha's AI bio report). Not sure about cost-effectiveness, though.

  • There are a lot of potentially effective approaches, and I think a lot of them should be tried. The space of possible strategies is huge, and we ought to start trying low-cost stuff and seeing what works and what doesn't.

  • University clubs are in a pretty unique position. They have the opportunity to very significantly influence someone's life. Students often structure their friend groups around, and spend a lot of time in, university clubs — this isn't the case with the other means you mentioned.

    • This is one of the main reasons that university EA groups have been so incredibly effective at building the EA community. I recently chatted with Jessica McCurdy (who started & runs UGAP), and her perspective was (paraphrasing) that university groups are a good fit if you want students to learn & explore things collaboratively, to make significant changes to their lives, and to group together in a way that allows them to signal-boost a particular idea. All of these apply to forecasting, in a similar way to EA. She was "very excited" about the idea of university forecasting clubs!

  • Again, I'm in a fairly unique position — my comparative advantage is that I can start clubs, while others can try other strategies (that they might have a comparative advantage in).

    • E.g. Rethink Priorities might be able to make jobs listings with forecasting skills as a desired qualification, but I probably can't; on the other hand, I can start university forecasting clubs, but they probably can't.

  • Forecasting clubs are very measurable, compared to some of the strategies you mentioned. 3 forecasting clubs with about 10 active members each (a roughly median outcome) would mean about 30 new forecasters per year, and probably about 50-100 people who've "heard of it" (friends of friends, those who dropped after half a semester, etc). How many people have gotten into forecasting through jobs listings, or public examples of forecasting? It seems pretty hard to say.

  • Also, Misha Yagudin himself has chatted with us and is very bullish on university-level forecasting!

Just to reiterate from above: thank you for commenting & for your thoughts! I'm happy to talk more about these and/or answer any other concerns you might have, either in writing or in a chat! :)


Saul Munn

3 months ago

I’d be happy to chat more about this over a call with any regrantors who’re interested. Book a 25 or 50 minute call here: :)


Saul Munn

7 months ago

Update: we held the (successful!!) pilot competition! A postmortem of the pilot competition is below — take a look and share your thoughts!



Saul Munn

9 months ago

We have a website:!

You can contact us directly at :)