Choose an annual donation amount:
AI safety is funding-limited at the moment and every bit counts.
If you have 5 minutes–1 hour:
You can delegate to experienced grantmakers who know of more good opportunities than they can fund.
Donate to a fund, such as the AI Risk Mitigation Fund. Grantmakers evaluate projects on behalf of donors and choose the projects they think are most effective to fund. Alternatively, you can donate to a donor lottery which gives you a chance to direct a larger amount of money.
If you have 1–50 hours:
With some time, you might be able to find good opportunities yourself. Otherwise you can delegate to experienced grantmakers.
You have several options:
- Donate to a fund, such as the AI Risk Mitigation Fund.
- Donate to a specific project that you think effectively tackles the issues of AI safety, if you have one in mind.
- Delegate to someone in your network whose opinions in this area you trust.
- Send money via a donor lottery for a chance to direct a much larger amount, dedicating your time to researching where it should go.
We recommend exploring Manifund, which allows you to "invest" in projects you think will be impactful, or nominate regrantors to decide on your behalf. GiveWiki and the Nonlinear Network are other platforms where you can choose projects to fund.
If you have an ongoing commitment:
By studying more before donating, you donate not just your money but your cognition to find and assess opportunities which big grantmakers – who have less time per unit of money – might miss.
Read up to understand the problem, get involved in the community, and fund projects or individuals who you think are doing the best work.
Consider engaging with researchers in the comments sections of their research posts on LessWrong.
You have several options:
- Donate to a specific project that you think effectively tackles the issues of AI safety, if you have one in mind.
- Delegate to someone in your network whose opinions in this area you trust.
- Send money via a donor lottery for a chance to direct a much larger amount, dedicating your time to researching where it should go.
We recommend exploring Manifund, which allows you to "invest" in projects you think will be impactful, or nominate regrantors to decide on your behalf. GiveWiki and the Nonlinear Network are other platforms where you can choose projects to fund.
Donating to a fund such as the AI Risk Mitigation Fund is a good fallback option.
If this is a major focus:
By studying more before you donate, you donate not just your money but your cognition to find and assess opportunities which big grantmakers – who have less time per unit of money – might miss.
Read up to understand the problem, get involved in the community, and fund projects or individuals who you think are doing the best work.
Consider self-funding to try to tackle the problem yourself, either directly as a researcher or by using your existing skills to support the field.
You have several options:
- Donate to a specific project that you think effectively tackles the issues of AI safety, if you have one in mind.
- Delegate to someone in your network whose opinions in this area you trust.
- Send money via a donor lottery for a chance to direct a much larger amount, dedicating your time to researching where it should go.
We recommend exploring Manifund, which allows you to "invest" in projects you think will be impactful, or nominate regrantors to decide on your behalf. GiveWiki and the Nonlinear Network are other platforms where you can choose projects to fund.
Donating to a fund such as the AI Risk Mitigation Fund is a good fallback option.
AI safety is funding-limited at the moment and notable donations can make a big difference.
If you have 5 minutes–1 hour:
You can delegate to experienced grantmakers who know of more good opportunities than they can fund.
Donate to a fund, such as the AI Risk Mitigation Fund. Grantmakers evaluate projects on behalf of donors and choose the projects they think are most effective to fund.
If you have 1–50 hours:
With some time, you might be able to find good opportunities yourself. Otherwise, you can delegate to experienced grantmakers.
You have several options:
- Donate to a fund, such as the AI Risk Mitigation Fund.
- Donate to specific projects that you think effectively tackle the issues of AI safety, if you have some in mind.
- Delegate to someone in your network whose opinions in this area you trust.
- Send money via a donor lottery for a chance to direct a much larger amount, dedicating your time to researching where it should go.
We recommend exploring Manifund, which allows you to "invest" in projects you think will be impactful, or nominate regrantors to decide on your behalf. GiveWiki and the Nonlinear Network are other platforms where you can choose projects to fund.
If you have an ongoing commitment:
By studying more before you donate, you donate not just your money but your cognition to find and assess opportunities which big grantmakers – who have less time per unit of money – might miss.
Read up to understand the problem, get involved in the community, and fund projects or individuals who you think are doing the best work.
Consider engaging with researchers in the comments sections of their research posts on LessWrong.
Donating to a fund such as the AI Risk Mitigation Fund or to a donor lottery is a good fallback option.
If this is a major focus:
By studying more before you donate, you donate not just your money but your cognition to find and assess opportunities which big grantmakers – who have less time per unit of money – might miss.
Read up to understand the problem, get involved in the community, and fund projects or individuals who you think are doing the best work.
Consider self-funding to try to tackle the problem yourself, either directly as a researcher or by using your existing skills to support the field.
Donating to a fund such as the AI Risk Mitigation Fund or to a donor lottery is a good fallback option.
At this scale of donation, you could enable grantmakers to support someone working on AI safety full-time, or provide significant support to an organization.
If you have 5 minutes–1 hour:
Donate to a fund, such as the AI Risk Mitigation Fund. Grantmakers evaluate projects on behalf of donors and choose the projects they think are most effective to fund.
If you have 1–50 hours:
You have several options:
- Donate to a fund, such as the AI Risk Mitigation Fund.
- Donate to specific projects that you think effectively tackle the issues of AI safety, if you have some in mind.
- Delegate to someone in your network whose opinions in this area you trust.
- Send money via a donor lottery for a chance to direct a much larger amount, dedicating your time to researching where it should go.
We recommend exploring Manifund, which allows you to "invest" in projects you think will be impactful, or nominate regrantors to decide on your behalf. GiveWiki and the Nonlinear Network are other platforms where you can choose projects to fund.
If you have an ongoing commitment:
Read up to understand the problem, get involved in the community, and fund projects or individuals who you think are doing the best work.
Consider engaging with researchers in the comments sections of their research posts on LessWrong.
Donating to a fund such as the AI Risk Mitigation Fund is a good fallback option.
If this is a major focus:
Read up to understand the problem, get involved in the community, and fund projects or individuals who you think are doing the best work.
Consider self-funding to try to tackle the problem yourself, either directly as a researcher or by using your existing skills to support the field.
Donating to a fund such as the AI Risk Mitigation Fund is a good fallback option.
You can provide significant support to several organizations, and could also support many full-time researchers. The total funding for AI safety was around $150 million in 2023; if you choose to dedicate the funds, your donations could make up a notable fraction of the funding ecosystem.
If you have 5 minutes–1 hour:
Either delegate to someone in your network who you think has a good understanding of the challenge (possibly by sponsoring them as an S-process recommender, which will give them infrastructure and a menu of applications), or donate to a fund such as the AI Risk Mitigation Fund, where grantmakers evaluate projects on behalf of donors.
If you have 1–50 hours:
Some high-impact ideas include sponsoring an S-process recommender whose judgment you trust, which will give them infrastructure and a menu of applications, or donating to a donor lottery, which can amplify your donation.
Alternatively, donate to a fund – such as the AI Risk Mitigation Fund – or to specific projects that you think effectively tackle the issues of AI safety, or select individuals to donate to directly. We recommend exploring Manifund, which allows you to "invest" in projects you think will be impactful, or nominate regrantors to decide on your behalf. GiveWiki and the Nonlinear Network are similar platforms where you can choose projects to fund.
If you have an ongoing commitment:
Read up to understand the problem, get involved in the community, and fund projects or individuals who you think are doing the best work.
Consider engaging with researchers in the comments sections of their research posts on LessWrong.
Some high-impact ideas for donating include sponsoring an S-process recommender whose judgment you trust, which will give them infrastructure and a menu of applications, or donating to a donor lottery, which can amplify your donation.
Alternatively, either donate to a fund – such as the AI Risk Mitigation Fund – or to specific projects that you think effectively tackle the issues of AI safety, or select individuals to donate to directly. We recommend exploring Manifund, which allows you to "invest" in projects you think will be impactful.
If this is a major focus:
Read up to understand the problem, get involved in the community, and fund projects or individuals who you think are doing the best work.
Consider self-funding to try to tackle the problem yourself, either directly as a researcher or by using your existing skills to support the field. You can also skill up to improve your abilities as a grantmaker.
Consider engaging with researchers in the comments sections of their research posts on LessWrong.
Some high-impact ideas for donating include sponsoring an S-process recommender whose judgment you trust, which will give them infrastructure and a menu of applications, or donating to a donor lottery, which can amplify your donation.
Alternatively, either donate to a fund – such as the AI Risk Mitigation Fund – or to specific projects that you think effectively tackle the issues of AI safety, or select individuals to donate to directly. We recommend exploring Manifund, which allows you to "invest" in projects you think will be impactful.
(ɔ) 2024 · This site is released under a CC BY-SA license