We are developing methods to ensure that our machine-learning algorithms and data-driven products are fair, transparent, accountable, and useful.
The Research team develops machine-learning algorithms that can improve the coverage and quality of Wikimedia projects and enhance the user experience for readers and contributors. Wikimedia Foundation product teams, as well as external researchers and community developers, use the Research team's machine-learning resources to build tools that connect readers with articles that interest them, help new editors find content and collaborators, and support the volunteers who work to improve the quality of Wikipedia.
The aim of this project is to develop best practices and pilot new techniques to effectively prevent, identify, and address bias in the technologies we build, at every stage of the design process.
Over the past decade, scientists, journalists, policymakers, and citizens from around the world have grown increasingly concerned about the risks that machine learning and artificial intelligence (AI) systems present. Research has shown how these systems can violate personal privacy, discriminate against disadvantaged groups, perpetuate harmful biases, and disrupt social processes. These harmful impacts can often be traced back to the design of these systems—the data used to train them, the algorithmic techniques that allow the system to learn, the way the system is tested and refined, and the way information is presented to end users.
As a mission-driven organization that is committed to openness, transparency, and social good, the Wikimedia Foundation has a mandate to ensure that the AI systems we build are ethical and human-centered—that they reflect our values, empower our users, and improve our projects.
Ethical AI White Paper
We released a white paper that lays out some possible risk scenarios and process proposals for ethical and human-centered AI at Wikimedia.
Battle of the Feeds
We tested whether mobile app users preferred a "top article" feed ranked by page views or by editing trends. Overall, raters were more interested in, and more familiar with, articles that appeared in the "top read" list.
Ellery Wulczyn (Wikimedia Foundation)