The Institute for Trustworthy AI in Law & Society (TRAILS) has announced 11 Broader Impact Awards designed to expand access to, participation in, and understanding of trustworthy artificial intelligence.
Unveiled March 11, the awards—up to $25,000 each—will support seed projects that help diverse stakeholder communities engage with and influence the future of AI. TRAILS leaders say the funding is intended to spark grassroots initiatives connecting academia, industry and local communities while expanding access to AI education and governance tools.
“Our goal is to help close the loop among academia, industry and communities, all of whom are focused on harnessing the power of AI to advance the social good,” said Hal Daumé III, professor of computer science at the University of Maryland and director of TRAILS.
Advancing AI Literacy in Schools and Colleges
Several funded projects focus on improving AI literacy across K–12 and higher education settings.
“AI Learning Labs: Cross-Sector Seminars & Participatory Design Tools for Community Colleges,” led by Krishna Bista of Morgan State University, will launch a three-week online seminar series featuring TRAILS researchers and external experts. The seminars will engage community college and higher education leaders in applied learning and participatory design activities focused on responsible AI adoption. The project will also produce a curated repository of tools to guide ethical, context-sensitive AI implementation across diverse campuses.
“AI Literacy Codesign Workshops with Social Studies Teachers,” led by Sarah McGrew of UMD and Virginia Byrne of Morgan State, will convene middle and high school educators to develop lessons that help students critically evaluate AI-generated text and media. Through collaborative workshops, teachers will align AI literacy concepts with existing curricular standards and pilot classroom-ready materials.
Another education-focused initiative—“Better Together: Building AI Capacity Through Joint Learning for Mentors and Teacher Candidates,” led by Martha James of Morgan State—will host a four-day workshop pairing teacher candidates with mentor teachers. The program aims to build shared expertise in evaluating AI accuracy, fairness and risk, particularly in under-resourced schools.
In elementary education, “High-Dosage Human-Enabled AI Tutoring to Build Trust in Using AI to Teach SoR with Young Learners,” led by Valerie Riggs of Morgan State, will pair an adaptive AI reading tutor with educators who provide emotional support at a 1:4 educator-to-student ratio. The project will measure gains in literacy, student engagement and trust in AI-assisted instruction while developing guidance for scaling human-centered AI in schools.
Engaging Youth, Families and Local Communities
The broader impact awards also prioritize outreach to younger learners and local communities.
“Digital Futures: AI and Business Analytics Workshops for Baltimore Youth,” led by Maxim Bushuev of Morgan State, will launch a summer camp for high school students in the greater Baltimore area. Building on previous programming that reached more than 200 students and counselors, the initiative connects AI and business analytics skills to real-world career pathways.
“Training Researchers in AI-Enabled Learning & Systems: A Summer Academy for G9–12 Students,” led by Ekundayo Shittu of George Washington University, will host a summer camp in Washington, D.C., where students will develop technical and analytical skills while exploring the social and ethical implications of AI. Through a scaffolded curriculum, participants will learn how data, mathematical models and AI systems support decision-making in engineering, business and public policy.
The trustworthy AI curricula for these camps are being co-designed alongside TRAILS’ annual AI Summer Camp at the University of Maryland, with the goal of eventually expanding the program to similar initiatives across the country.
Meanwhile, “AI for IA: Artificial Intelligence for Inclusion and Access,” led by Elizabeth Morgan of Morgan State, will work with families of children receiving special education services. Through workshops in California and Maryland, families will learn how AI tools can support advocacy related to individualized education plans (IEPs) and 504 plans. The project will also produce a publicly available AI for Family Advocacy Toolkit.
Another initiative—“AI-Powered Search: Pilot Education for Public Librarians,” led by Ryan O’Grady of UMD—will develop training programs to help librarians navigate emerging AI-driven search tools such as Google Gemini and Perplexity AI. Working with a Maryland county library system, the project will use focus groups and surveys to refine educational materials and to better understand how librarians integrate AI-enhanced search into information retrieval.
“Creating an AI Trust Game Show,” led by Jordan Boyd-Graber of UMD, will help members of the public learn how to calibrate trust in AI through a game-show-style competition. Teams will craft questions, evaluate AI outputs and compare human strengths with machine capabilities, offering a public window into where AI systems succeed—and where they fall short.
Strengthening AI Oversight and Participatory Design
Additional projects address AI governance and participatory design.
“AI Governance Assessment Toolkit for Resource-Limited Organizations,” led by Brandeis Marshall, founder and CEO of DataedX Group, will develop an explainability-focused assessment tool to help small and midsize enterprises evaluate AI risks, strengthen safeguards and align digital tools with organizational values.
Rounding out the portfolio, “Exemplars of Participatory AI Success,” led by Katie Shilton and María Isabel Magaña from UMD, will create success metrics for participatory AI initiatives and evaluate projects from the TRAILS database and other National Science Foundation AI Institutes. The team will publish case studies and share findings with research and industry communities.
Collectively, the 11 awards reflect TRAILS’ mission to broaden participation in trustworthy AI and ensure that communities most affected by emerging technologies have a voice in shaping them.
By investing in educators, students, families, librarians and small businesses, TRAILS leaders hope to build sustainable pipelines of talent and informed stakeholders who can guide AI innovation responsibly.
“The Broader Impact Program is designed not only to support promising pilot projects, but also to build lasting connections that ensure AI technologies are developed with transparency, accountability and community trust at their core,” said David Broniatowski, professor of engineering management and systems engineering at GW and deputy director of TRAILS.
—Story by UMIACS communications group
About TRAILS: TRAILS is a coalition of four academic institutions—the University of Maryland, George Washington University, Morgan State University, and Cornell University—focused on transforming the field of AI from “tech first” to “people first,” where AI systems are developed and governed in a way that promotes human rights and serves the interests of the people who use them. TRAILS receives administrative and technical support from the University of Maryland Institute for Advanced Computer Studies (UMIACS) and from staff at George Washington University.