New Feminist Studies in Audiovisual Industries | Algorithmic Gender Bias and Audiovisual Data: A Research Agenda

Authors

  • Miren Gutierrez, University of Deusto, Bilbao, Spain

Keywords:

algorithmic bias, big data, women, equality, audiovisual data, media

Abstract

Algorithms are increasingly used to offer jobs, loans, medical care, and other services, as well as to influence behavior. The decisions that shape algorithms, the data sets that feed them, and the outputs of algorithmic decision-making can all be biased, potentially harming women and perpetuating discrimination. Audiovisual data are especially challenging because of their rapid growth online and their impact on women's lives. While scholarship has acknowledged data divides and cases of algorithmic bias, mostly in online texts, it has yet to explore the relevance of audiovisual content for algorithmic gender bias. Based on previous guidelines and the literature on algorithmic bias, this article (a) connects different types of bias with their contributing factors and harmful outcomes for women; (b) examines challenges around the lack of clarity about which data are necessary for fair algorithmic decision-making, the lack of understanding of how machine learning algorithms work, and the lack of incentives for corporations to correct bias; and (c) offers a research agenda to address the algorithmic gender discrimination prevalent in audiovisual data.

Author Biography

Miren Gutierrez, University of Deusto, Bilbao, Spain

Associate Professor, University of Deusto, Bilbao, Spain

Published

2021-01-06

Issue

Section

Special Sections