We’re open to receiving applications from individuals who are already pursuing careers related to reducing global catastrophic risk (or otherwise improving the long-term future), who are looking to transition into such careers from other lines of work, or who are only just starting their careers. We think there are many career tracks that are potentially promising from this perspective (including many of the ones in this list from 80,000 Hours), so there is a correspondingly wide range of proposals we would consider funding.
We’re open to supporting a variety of career development and transition activities, including (but not necessarily limited to) graduate study, unpaid internships, independent study, career transition and exploration periods, postdocs, professional certifications, online courses, and other one-off career-capital-building activities.
To name a few concrete examples of the kinds of applicants we’re open to funding, in no particular order:
- A final-year undergraduate student who wants to pursue a master’s degree or PhD in machine learning in order to contribute to technical research that helps mitigate risks from advanced artificial intelligence.
- An individual who wants to do an unpaid internship at a think tank focused on biosecurity, with the aim of pursuing a career dedicated to reducing global catastrophic biological risk.
- A former senior ML engineer at an AI company who wants to spend six months on independent study and career exploration in order to build context on AI risk mitigation and investigate career options in the field.
- An individual who wants to attend law school or obtain an MPP, with the aim of working in government on policy issues relevant to improving the long-term future.
- A recent physics PhD who wants to spend six months going through a self-guided ML curriculum and working on projects in interpretability, in order to transition to contributing to technical research that helps mitigate risks from advanced AI systems.
- A software engineer who wants to spend the next three months doing independent study in order to gain relevant certifications for a career in information security, with the longer-term goal of working for an organization focused on reducing global catastrophic risk.
- An experienced management consultant who wants to spend three months exploring different ways to apply their skill set to reducing global catastrophic risk and applying to relevant jobs, with an eye to transitioning to a related career.
- A PhD graduate in an unrelated sub-area of computational biology who wants to spend four months getting up to speed on DNA synthesis screening in order to transition to working on this topic.
- A professor in machine learning, theoretical computer science, or another technical field who wants funding to take a one-year sabbatical to explore ways to contribute to technical AI safety or AI governance.
- An individual who wants to attend journalism school, with the aim of covering topics relevant to the long-term future (potentially among other important topics).