
Londa Schiebinger Presents Research on Gender and Fairness in Machine Learning

Mar 4 2018


Former Clayman Institute Director Londa Schiebinger Analyzes Gender Biases and Cultural Assumptions in Algorithms

Former Clayman Institute Director and current Faculty Research Fellow Londa Schiebinger discussed her latest research with the Gendered Innovations Project on how machine learning systems can inherit human biases and increase inequality. In her presentation at the convening of fellows, Schiebinger posed the question that has guided Gendered Innovations since its inception: “Can we harness the creative power of sex & gender analysis for discovery and innovation?” She explored this question through an examination of biases in machine learning.

Schiebinger pointed out that, despite the belief that algorithms are value neutral, these automated processes can acquire human biases. When trained on historical data, systems inherit unconscious gender bias from the past in ways that can amplify gender inequality in the future, even when governments, universities, and companies themselves have implemented policies to foster equality. “Machine learning captures embedded gendered associations and often the conscious and unconscious bias embodied in them,” Schiebinger said.
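To make that mechanism concrete, here is a minimal sketch of how word embeddings, one common machine learning representation of text, can carry gendered associations. The four-dimensional vectors below are invented purely for illustration; real embeddings such as word2vec or GloVe are learned from large text corpora and have hundreds of dimensions.

```python
import numpy as np

# Toy word vectors, invented for illustration only -- real embeddings
# are learned from large historical text corpora.
vectors = {
    "he":       np.array([ 0.9, 0.1, 0.3, 0.2]),
    "she":      np.array([-0.9, 0.1, 0.3, 0.2]),
    "engineer": np.array([ 0.5, 0.7, 0.1, 0.3]),
    "nurse":    np.array([-0.6, 0.6, 0.2, 0.3]),
}

def cosine(a, b):
    """Cosine similarity: +1 means same direction, 0 means unrelated."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# A simple "gender direction": the difference between "he" and "she".
gender_direction = vectors["he"] - vectors["she"]

# Occupation words that co-occurred more with one gender in the
# training text end up projecting strongly onto that direction,
# even though the words themselves should be gender-neutral.
for word in ("engineer", "nurse"):
    print(f"{word:>8}: gender projection = {cosine(vectors[word], gender_direction):+.2f}")
```

With these toy numbers, “engineer” projects toward “he” and “nurse” toward “she”: the embedding has absorbed the association from its (here, simulated) training data.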

Gendered Innovations is devoted to positive solutions, such as the “hard de-biasing” of these technologies. Schiebinger gave the example of changes at Google that show how gendered analyses can change digital culture: in its early days, Google Translate defaulted to the masculine pronoun. As Google engineers became aware of this bias, they were able to move toward more gender-inclusive language.
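The “hard de-biasing” mentioned above is commonly associated with Bolukbasi et al. (2016), whose core “neutralize” step removes a word vector’s component along the learned gender direction. Below is a minimal sketch of that step, reusing the illustrative toy vectors from the earlier example (the numbers are not from any real model):

```python
import numpy as np

def neutralize(v: np.ndarray, g: np.ndarray) -> np.ndarray:
    """Remove the component of v that lies along the gender direction g.

    This is the 'neutralize' step of hard de-biasing: after the
    projection is subtracted, v is orthogonal to g, so the word no
    longer leans toward either gender along that direction.
    """
    projection = (v @ g) / (g @ g) * g
    return v - projection

# Illustrative toy vectors (same as the earlier sketch).
he = np.array([0.9, 0.1, 0.3, 0.2])
she = np.array([-0.9, 0.1, 0.3, 0.2])
engineer = np.array([0.5, 0.7, 0.1, 0.3])

g = he - she
debiased = neutralize(engineer, g)

print("before:", float(engineer @ g))  # nonzero: gendered association
print("after: ", float(debiased @ g))  # ~0: neutralized
```

This is only the neutralize step; the full hard de-biasing procedure also equalizes explicitly gendered word pairs, and it addresses the representation rather than the biased source data itself.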

More information is available at the Clayman Institute website.