WHEN GOOD ALGORITHMS GO SEXIST: WHY AND HOW TO ADVANCE AI GENDER EQUITY
By Genevieve Smith, Associate Director, and Ishita Rustagi, Analyst, at the Center for Equity, Gender & Leadership (UC Berkeley Haas School of Business). Both authors are Members of the Women4AI Daring Circle.
Many institutions make decisions based on artificial intelligence (AI) systems using machine learning (ML), whereby a series of algorithms takes and learns from massive amounts of data to find patterns and make predictions. Yet gender bias in these systems is pervasive. Our analysis finds that gender-biased AI has profound impacts on women’s psychological, economic, and health security. It can also reinforce and amplify existing harmful gender stereotypes and prejudices.
Social change leaders and ML systems developers alike must ask: How can we build gender-smart AI that advances gender equity rather than embedding and scaling gender bias? There are roles for social change leaders as well as for leaders at organizations developing ML systems.

Our article published with Stanford Social Innovation Review presents seven actions that social change leaders and ML developers can take to make AI gender-smart, along with new analysis of the impacts of gender-biased AI systems.
Better representation of women and of diverse mindsets in the development and deployment of AI would reduce gender stereotypes and the negative consequences they produce. Absent such change, evidence suggests that by 2022, 85% of AI projects will deliver erroneous outcomes due to bias.