December 8, 2021 / 1:00-2:00 p.m. ET
Expeditions in Experiential AI
As machine learning is incorporated into more critical socio-technical systems, valid concerns have been raised about the potential for these models to perpetuate and entrench unjust social biases. These concerns are exacerbated by a lack of transparency: rarely do we know whether a given application of machine learning was intentionally designed with fairness in mind, or tested to ensure that espoused fairness objectives are being achieved in practice. In this study, we partnered with Pymetrics, a startup that offers candidate screening services to employers, to conduct an algorithm audit of their machine learning pipeline. We discuss how we structured our audit, how we maintained independence as auditors, and the results of our audit.
Christo Wilson is an Associate Professor in the Khoury College of Computer Sciences at Northeastern University. He is a founding member of the Cybersecurity and Privacy Institute at Northeastern and serves as director of the BS in Cybersecurity program. Professor Wilson's research focuses on online security and privacy, with a specific interest in algorithmic auditing. Algorithmic auditing is an emerging, interdisciplinary area that uses experimental techniques to measure the black-box algorithmic systems that pervade daily life, with the goal of increasing the transparency and accountability of these systems. His work is supported by the U.S. National Science Foundation, a Sloan Fellowship, the Mozilla Foundation, the Knight Foundation, the Russell Sage Foundation, the Democracy Fund, the Data Transparency Lab, the European Commission, Google, Pymetrics, and Verisign Labs.