BASE Fellowship Launches to Support Black Researchers in AI Safety, Governance, and Security

Black in AI Safety and Ethics launches the BASE Fellowship, a 12-week remote program for Black professionals in AI safety, security, and governance, with applications closing January 9, 2026.

Black in AI Safety and Ethics (BASE) has announced the opening of applications for its inaugural BASE Fellowship, a twelve-week, part-time, fully remote program aimed at supporting Black researchers, professionals, and emerging talent in the fields of AI safety, AI security, and AI governance. The fellowship is set to run from April to July 2026, amidst increasing scrutiny over the governance and accountability of artificial intelligence systems.

As demand for expertise in AI oversight intensifies across educational, industrial, and governmental sectors, BASE emphasizes the fellowship as a vital pathway into research and policy-oriented roles. The program seeks to address significant gaps in representation within these crucial areas of technology and governance.

The BASE Fellowship is organized around three distinct tracks: AI alignment, AI security, and AI governance. Participants begin with a foundational curriculum covering core principles of AI safety and ethics, then move into guided mentorship, culminating in a capstone research project tailored to their selected track. The technical tracks, alignment and security, address subjects such as interpretability, oversight, adversarial robustness, and risk management, while the governance track focuses on policy analysis, standards development, and systemic risk evaluation. Fellows are expected to produce a research deliverable by the program's conclusion.

In a LinkedIn post marking the fellowship’s launch, Lawrence Wagner emphasized the program’s commitment to helping Black scholars and professionals contribute substantively to the domains of AI safety, security, and governance. Wagner noted that the fellowship combines foundational training with structured mentorship and applicable research outcomes, aiming to foster a new generation of experts in these vital fields.

Applications close on January 9, 2026, and selection will follow a multi-stage procedure. Applications are first reviewed, after which successful candidates are invited to complete either a coding assessment or a work-based task, depending on their chosen track. Those who advance then apply to specific mentor-led projects, with final decisions to be announced in early March.

BASE is positioning the fellowship as part of a broader initiative to cultivate a global community of Black practitioners and researchers committed to responsible AI deployment. The organization underscores the ongoing underrepresentation in AI governance and safety roles as a potential risk to developing equitable and accountable technologies.

As concerns about the ethical implications of AI systems continue to grow, the BASE Fellowship represents a proactive step towards diversifying the talent pool in this critical area. By investing in Black scholars and professionals, the program not only addresses issues of representation but also aims to enhance the overall integrity and fairness of AI research and application.

Written By David Park

At AIPressa, my work focuses on discovering how artificial intelligence is transforming the way we learn and teach. I've covered everything from adaptive learning platforms to the debate over ethical AI use in classrooms and universities. My approach: balancing enthusiasm for educational innovation with legitimate concerns about equity and access. When I'm not writing about EdTech, I'm probably exploring new AI tools for educators or reflecting on how technology can truly democratize knowledge without leaving anyone behind.


© 2025 AIPressa · Part of Buzzora Media · All rights reserved.