Apply to Effective Altruism Funds now
post by Jonas V (Jonas Vollmer) · 2021-02-13T13:36:39.977Z · 5 comments
This is a link post for https://forum.effectivealtruism.org/posts/NfkdSooNiHcdCBSJs/apply-to-ea-funds-now-1
I expect EA Funds – and the Long-Term Future Fund in particular – to be of interest to people on LessWrong, so I'm crossposting my EA Forum post with the excerpts that seem most relevant:
Summary
- The Animal Welfare Fund, the Long-Term Future Fund, and the EA Infrastructure Fund (formerly the EA Meta Fund) are calling for applications.
- Applying is fast and easy – it typically takes no more than a few hours. If you are unsure whether to apply, simply give it a try.
- The Long-Term Future Fund and EA Infrastructure Fund now support anonymized grants: if you prefer not to have your name listed in the public payout report, we are still interested in funding you.
- If you have a project you think will improve the world, and it seems like a good fit for one of our funds, we encourage you to apply by 7 March (11:59pm PST). Apply here. We’d be excited to hear from you!
Recent updates
- The Long-Term Future Fund and EA Infrastructure Fund now officially support anonymized grants. To be transparent towards donors and the effective altruism community, we generally prefer to publish a report about your grant with your name attached. But if you prefer that we not disclose any of your personal information, you can now choose one of the following options: 1) Request that the public grant report be anonymized. We will consider your request, but in some cases we may end up asking you to choose between a public grant and no grant at all. 2) Request that we not publish a grant report of any kind. In this case, if we think the grant is above our threshold for funding, we will refer it to private funders.
(…)
Long-Term Future Fund
The Long-Term Future Fund aims to positively influence the long-term trajectory of civilization, primarily by making grants that help mitigate global catastrophic risks. Historically, we’ve funded a variety of longtermist projects, including:
- Scholarships, academic teaching buy-outs, and additional funding for academics to free up their time
- Funding to make existing researchers more effective
- Direct work in AI, biosecurity, forecasting, and philanthropic timing
- Up-skilling in a field to prepare for future work
- Seed money for new organizations
- Movement-building programs
See our previous grants here. Most of our grants are reported publicly, but we also give applicants the option to receive an anonymous grant, or to be referred to a private donor.
The fund has an intentionally broad remit that encompasses a wide range of potential projects. We strongly encourage anyone who thinks they could use money to benefit the long-term future to apply.
(…)
What types of grants can we fund?
For grants to individuals, all of our funds can likely make the following types of grants:
- Events/workshops
- Scholarships
- Self-study
- Research projects
- Content creation
- Product creation (e.g., a tool/resource that can be used by the community)
We can refer applications for for-profit projects (e.g., seed funding for start-ups) to EA-aligned investors. If you are a for-profit, simply apply through the standard application form and indicate your for-profit status in the application.
For legal reasons, we will likely not be able to make the following types of grants:
- Grantseekers requesting funding for a list of possible projects
- In this case, we would fund only one of the proposed projects. Feel free to apply with multiple projects, but we will have to confirm a specific project before we issue funding.
- Self-development that is not directly related to the common good
- For us to make a grant, the public benefit must be greater than the private benefit to any individual. So we cannot make grants that focus on helping a single individual in a way that is not directly connected to a public benefit.
Please err on the side of applying, as it is likely we will be able to make something work if the fund managers are excited about the project. We look forward to hearing from you.
5 comments
comment by magfrump · 2021-02-14T05:34:32.191Z
I am currently writing fiction that features protagonists who are EAs.
This seems at least related to the infrastructure fund goal of presenting EA principles and exposing more people to them.
I think receiving a grant would make me more likely to aggressively pursue options to professionally edit, publish, and publicize the work. That feels kind of selfish and makes me self-conscious, but also wouldn't require a very large grant. It's hard for me to unwrap my feelings about this vs. the actual public good, so I'm asking here first.
Does this sound like a good grant use?
↑ comment by habryka (habryka4) · 2021-02-14T07:22:36.535Z
I am reasonably excited about fiction (and am on the Long-Term Future Fund). I have written previously about my thoughts on fiction here:
The track record of fiction
In a general sense, I think that fiction has a pretty strong track record of both being successful at conveying important ideas, and being a good attractor of talent and other resources. I also think that good fiction is often necessary to establish shared norms and shared language.
Here are some examples of communities and institutions that I think used fiction very centrally in their function. Note that after the first example, I am making no claim that the effect was good, I’m just establishing the magnitude of the potential effect size.
- Harry Potter and the Methods of Rationality (HPMOR) was instrumental in the growth and development of both the EA and Rationality communities. It is very likely the single most important recruitment mechanism for productive AI alignment researchers, and has also drawn many other people to work on the broader aims of the EA and Rationality communities.
- Fiction was a core part of the strategy of the neoliberal movement; fiction writers were among the groups referred to by Hayek as “secondhand dealers in ideas.” An example of someone whose fiction played a large role in both the rise of neoliberalism and its eventual spread is Ayn Rand.
- Almost every major religion, culture and nation-state is built on shared myths and stories, usually fictional (though the stories are often held to be true by the groups in question, making this data point a bit more confusing).
- Francis Bacon’s (unfinished) utopian novel “The New Atlantis” is often cited as the primary inspiration for the founding of the Royal Society, which may have been the single institution with the greatest influence on the progress of the scientific revolution.
On a more conceptual level, I think fiction tends to be particularly good at achieving the following aims (compared to non-fiction writing):
- Teaching low-level cognitive patterns by displaying characters that follow those patterns, allowing the reader to learn from very concrete examples set in a fictional world. (Compare Aesop’s Fables to some nonfiction book of moral precepts — it can be much easier to remember good habits when we attach them to characters.)
- Establishing norms, by having stories that display the consequences of not following certain norms, and the rewards of following them in the right way
- Establishing a common language, by not only explaining concepts, but also showing concepts as they are used, and how they are brought up in conversational context
- Establishing common goals, by creating concrete utopian visions of possible futures that motivate people to work towards them together
- Reaching a broader audience, since we naturally find stories more exciting than abstract descriptions of concepts
(I wrote in more detail about how this works for HPMOR in the last grant round.)
↑ comment by RedMan · 2021-02-23T20:53:12.084Z
I've got some partial outlines for what I think are interesting sci-fi stories that I've wanted to pay to have ghostwritten or turned into a short film. Is this the right place for that?
↑ comment by habryka (habryka4) · 2021-02-24T07:20:13.493Z
Maybe, but it really depends on whether you have a good track record or there is some other reason why it seems like a good idea to fund from an altruistic perspective.
↑ comment by Jonas V (Jonas Vollmer) · 2021-02-14T16:10:36.785Z
I largely agree with Habryka's perspective. I personally (not speaking on behalf of the EA Infrastructure Fund) would be particularly interested in such a grant if you had a track record of successful writing, as this would make it more likely you'd actually reach a large audience. E.g., Eliezer did not just write HPMoR; he was also a successful blogger on Overcoming Bias and wrote the Sequences.