Gender role bias in AI algorithms

Should it surprise us that human biases find their way into human-designed AI algorithms trained using data sets of human artifacts?

Machine-learning software trained on those photo datasets didn't just mirror those biases; it amplified them. If a photo set generally associated women with cooking, software trained by studying those photos and their labels created an even stronger association.
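A deliberately simplified sketch of why this happens (with made-up numbers, not the study's data or method): a model rewarded only for accuracy on a skewed label distribution will over-commit to the majority association, so the correlation in its predictions can exceed the correlation in the training data.

```python
# Toy illustration of bias amplification (hypothetical numbers).
# If 2/3 of "cooking" photos are labeled "woman", a predictor that
# maximizes accuracy using no other signal will always guess "woman",
# pushing the association from 67% in the data to 100% in its output.
from collections import Counter

def majority_predictor(labels):
    """Return the most common label: the accuracy-maximizing
    constant guess when no other features are used."""
    return Counter(labels).most_common(1)[0][0]

# Hypothetical training set: gender labels on "cooking" photos.
training_labels = ["woman"] * 67 + ["man"] * 33

guess = majority_predictor(training_labels)
rate_in_data = training_labels.count("woman") / len(training_labels)
rate_in_predictions = 1.0 if guess == "woman" else 0.0

print(f"association in data:        {rate_in_data:.0%}")
print(f"association in predictions: {rate_in_predictions:.0%}")
```

The real systems in the study are far more complex than a constant guesser, but the incentive is the same: when a correlation helps accuracy, the model leans on it harder than the data warrants.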

https://www.wired.com/story/machines-taught-by-photos-learn-a-sexist-view-of-women

About Mark H

Information technologist, knowledge management expert, and writer. Academic background in knowledge management, social and natural sciences, information technologies, learning, educational technologies, and philosophy. Married with one adult child who's married and has a teenage daughter.

One thought on “Gender role bias in AI algorithms”

  1. What is surprising is that the biased data associations are bolstered by the machine-learning programs. I question the ethics of misrepresenting the data “to change reality to make our systems perform in an aspirational way”. In philosophy, would this be considered a noble lie? 😉
