Research Area:  Machine Learning
Mitigating algorithmic bias during the development life cycle of AI-enabled software is crucial, given that any bias in the underlying algorithms is inherited by the software systems using them. At the Hot-off-the-Press GECCO track, we aim to disseminate our article "Multi-objective search for gender-fair and semantically correct word embeddings", Applied Soft Computing, 2023 [5]. In this work, we exploit multi-objective search to strike an optimal balance between reducing gender bias and improving the semantic correctness of word embedding models, which are at the core of many AI-enabled systems. Our results show that, while single-objective search approaches are able to reduce the gender bias of word embeddings, they also reduce their semantic correctness. Multi-objective approaches, in contrast to existing work that focuses solely on reducing gender bias, succeed in improving both goals. These results show that multi-objective evolutionary approaches can be successfully exploited to address bias in AI-enabled software systems, and we encourage the research community to further explore opportunities in this direction. (A minimal illustrative sketch of such a bi-objective formulation is given after this record.)
Keywords:  
Author(s) Name:  Max Hort, Rebecca Moussa, Federica Sarro
Journal name:  Applied Soft Computing
Conference name:  
Publisher name:  Association for Computing Machinery (ACM)
DOI:  10.1145/3583133.3595847
Volume Information:  Volume 83, Pages 23-24 (2023)
Paper Link:   https://dl.acm.org/doi/10.1145/3583133.3595847
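The following is a minimal sketch, not the authors' implementation: it illustrates the kind of bi-objective formulation described in the abstract (minimise a gender-bias measure while maximising a semantic-correctness proxy) using NSGA-II as provided by the DEAP library. The toy embedding matrix, the gender direction, both scoring functions, and all parameter values are placeholder assumptions, not taken from the paper.

import random
import numpy as np
from deap import base, creator, tools, algorithms

RNG = random.Random(42)
np.random.seed(42)
DIM = 50                                  # assumed embedding dimensionality
EMB = np.random.rand(100, DIM)            # placeholder embedding matrix (100 "words")
GENDER_DIR = np.random.rand(DIM)          # placeholder gender direction (e.g. "he" - "she")
GENDER_DIR /= np.linalg.norm(GENDER_DIR)

# Two objectives: minimise gender bias (-1.0), maximise semantic correctness (+1.0).
creator.create("FitnessBiasSem", base.Fitness, weights=(-1.0, 1.0))
creator.create("Individual", list, fitness=creator.FitnessBiasSem)

def evaluate(individual):
    """Apply a candidate linear debiasing transform and score both objectives."""
    transform = np.array(individual).reshape(DIM, DIM)
    debiased = EMB @ transform
    norms = np.linalg.norm(debiased, axis=1) + 1e-9
    # Objective 1 (minimise): mean absolute cosine with the gender direction.
    bias = float(np.mean(np.abs(debiased @ GENDER_DIR) / norms))
    # Objective 2 (maximise): cosine similarity to the original vectors, a crude
    # stand-in for semantic-correctness benchmarks such as word-similarity tests.
    sem = float(np.mean(np.sum(debiased * EMB, axis=1)
                        / (norms * (np.linalg.norm(EMB, axis=1) + 1e-9))))
    return bias, sem

toolbox = base.Toolbox()
toolbox.register("attr", RNG.uniform, -1.0, 1.0)
toolbox.register("individual", tools.initRepeat, creator.Individual,
                 toolbox.attr, n=DIM * DIM)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
toolbox.register("evaluate", evaluate)
toolbox.register("mate", tools.cxTwoPoint)
toolbox.register("mutate", tools.mutGaussian, mu=0.0, sigma=0.1, indpb=0.05)
toolbox.register("select", tools.selNSGA2)

pop = toolbox.population(n=20)
for ind in pop:
    ind.fitness.values = toolbox.evaluate(ind)
for _ in range(10):                       # a few NSGA-II generations (toy budget)
    offspring = algorithms.varAnd(pop, toolbox, cxpb=0.9, mutpb=0.1)
    for ind in offspring:
        ind.fitness.values = toolbox.evaluate(ind)
    pop = toolbox.select(pop + offspring, k=len(pop))

front = tools.sortNondominated(pop, len(pop), first_front_only=True)[0]
print("Pareto front (bias, semantic score):",
      [tuple(round(v, 3) for v in ind.fitness.values) for ind in front])

In practice, the random matrix would be replaced by real pre-trained embeddings and the two scoring functions by the bias and semantic-correctness measures used in the article; the resulting Pareto front then exposes the trade-off between the two objectives rather than collapsing it into a single score.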