@joel_bkr Hey Joel! Thanks so much for your comments; I really appreciate your thoughts. I've responded to each of your two reasons for skepticism below.
TLDR: regulatory approval is overrated as a bottleneck for (most) forecasting use-cases, and creating a large pool of forecasters is largely instrumental to acceptance inside influential decision-making institutions. On potentially more effective ways of creating top forecasters (e.g. job postings, public examples), I think it's important to note that universities offer a fairly unique opportunity to immerse students in a shared experience, culture, etc. Meeting for an hour a week with friends is a totally different level of commitment than learning forecasting because it was listed on a job posting. (Also, Misha himself has chatted with us and is very bullish on university-level forecasting!)
The above TLDR probably covers about 60-70% of the content below. Happy to talk more about these and/or answer any other concerns you might have, either in writing or in a chat! :)
(1) LARGER POOL OF (SUPER)FORECASTERS NOT A BOTTLENECK
> It doesn't seem like having a larger pool of forecasters is an important bottleneck for use-cases I am aware of. "Regulatory approval" and "acceptance inside prestige institutions" feel like better candidates.
I think this actually might not be true, or at least not as I'm understanding it.
Re: regulatory approval, this is only the limiting factor for real-money prediction markets. Although it's hugely important in that particular use-case, many use-cases of forecasting besides real-money prediction markets do not rely on regulatory approval. Metaculus (and even Manifold!) are great examples of platforms that provide incredible value, and Open Philanthropy's explicit use of forecasting in its grant-making is a great example of decision-makers using forecasting; neither of these required regulatory approval, and regulatory approval would not have improved their impact. If (e.g.) Metaculus had 10x the superforecasters, we would probably have substantially more forecasts on more topics, leading to more accurate forecasts across a wider variety of important areas.
IMHO, real-money prediction markets are unlikely to be the source of the majority of impact from the field of forecasting. Much more likely, it'd come from people identifying talented forecasters and using those individuals in key situations. This is a hotter take, but I do think it's a crux — real-money prediction markets would be great, but in terms of impact, I'd far prefer a widespread Metaculus to a widespread Kalshi.
Re: "acceptance inside prestige institutions," I'm not entirely sure what you mean.
If you mean "acceptance from influential decision-makers/key decision-making bodies, like politicians, big NGOs, etc," that makes sense, and I agree!
- current students = future key decision-makers
- acceptance from influential decision-makers would be significantly easier if forecasting were a generally accepted way of doing things (see our 2nd goal under our theory of change)
- in order for forecasting to be desirable to influential decision-makers, it first needs to work well, and one of the best ways for forecasting to work better is for more people to do it. This is true both for classic "wisdom of the crowds" reasons and because if 10x the people are forecasting, we'll likely discover 10x the superforecasters, forecasts will be much more accurate, we'll have forecasts on a wider range of topics, etc.
If, however, you literally mean "acceptance inside prestige institutions," I don't quite agree; I don't think that's a bottleneck to impactful forecasting. Regardless, I do still think college forecasting clubs solve for this: universities are some of the most prestigious institutions in the US, and having high-quality clubs (with associated professors, speakers, etc.) at those institutions is acceptance.
Another few comments:
- Michael Story (Swift Centre) wrote an essay on where forecasting is & isn't useful. Would recommend!
- I'm in a bit of a unique position to do something here, compared to almost all of the rest of the forecasting community; this sort of thing pretty much only works when it's student-led.
- University forecasting seems substantially cheaper to pursue than regulatory approval or broad acceptance at prestigious institutions.
- This proposal doesn't trade off with regulatory approval or acceptance from prestigious institutions, except in competing for funding. To my knowledge, although there are efforts that could improve the regulatory regime of prediction markets or the institutional acceptance of forecasting, they require social & legal capital, not money (or at least, not money on the order of $8.4k).
(2) OTHER WAYS OF IDENTIFYING TOP FORECASTERS
> I would guess that university forecasting clubs are a less beneficial means of creating top forecasters than "jobs listing forecasting skill as desired qualification," "excellent public examples of forecasting to emulate" (e.g. Misha's AI bio report). Not sure about cost-effectiveness, though.
There are a lot of potentially effective approaches, and I think a lot of them should be tried. The space of possible strategies is huge, and we ought to start trying low-cost approaches and seeing what works and what doesn't.
University clubs are in a pretty unique position: they have the opportunity to significantly influence someone's life. Students often structure their friend groups around university clubs and spend a lot of time in them; this isn't the case with the other means you mentioned.
This is one of the main reasons that university EA groups have been so incredibly effective at building the EA community. I recently chatted with Jessica McCurdy (who runs & started UGAP), and her perspective was (paraphrasing) that university groups are a good fit if you want students to learn & explore things collaboratively, to make significant changes to their lives, and to group together in a way that allows them to signal-boost a particular idea. All of these apply to forecasting, in a similar way to EA. She was "very excited" about the idea of university forecasting clubs!
Again, I'm in a fairly unique position — my comparative advantage is that I can start clubs, while others can try other strategies (that they might have a comparative advantage in).
E.g. Rethink Priorities might be able to post job listings with forecasting skills as a desired qualification, but I probably can't; on the other hand, I can start university forecasting clubs, but they probably can't.
Forecasting clubs are very measurable compared to some of the strategies you mentioned. Three forecasting clubs with about 10 active members each (a roughly median outcome) would mean about 30 new forecasters per year, and probably about 50-100 people who've "heard of it" (friends of friends, those who dropped out after half a semester, etc.). How many people have gotten into forecasting through job listings or public examples of forecasting? It seems pretty hard to say.
Also, Misha Yagudin himself has chatted with us and is very bullish on university-level forecasting!
Just to reiterate from above: thank you for commenting & for your thoughts! I'm happy to talk more about these and/or answer any other concerns you might have, either in writing or in a chat! :)