Gender role bias in AI algorithms
Should it surprise us that human biases find their way into human-designed AI algorithms trained on datasets of human artifacts?
Machine-learning software trained on such datasets didn't just mirror those biases; it amplified them. If a photo set generally associated women with cooking, software trained by studying those photos and their labels created an even stronger association.
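A toy sketch of how amplification can happen (the numbers here are hypothetical, not from the study): if two-thirds of cooking photos are labeled "woman," a model that minimizes error can learn to predict the majority label every time, pushing the association from 66% in the data to 100% in its output.

```python
from collections import Counter

# Hypothetical training labels for photos of cooking scenes:
# a biased, but not absolute, association with "woman".
train_labels = ["woman"] * 66 + ["man"] * 34

# A trivial "model": predict the majority label for every
# cooking photo, as an error-minimizing classifier might.
majority = Counter(train_labels).most_common(1)[0][0]
predictions = [majority for _ in train_labels]

train_rate = train_labels.count("woman") / len(train_labels)
pred_rate = predictions.count("woman") / len(predictions)
print(f"association in data:        {train_rate:.0%}")   # 66%
print(f"association in predictions: {pred_rate:.0%}")    # 100%
```

Real image classifiers are far more complex, but the underlying pressure is the same: skewed correlations in the training data can be sharpened, not just reproduced, by the learned model.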
https://www.wired.com/story/machines-taught-by-photos-learn-a-sexist-view-of-women
What is surprising is that the biased associations in the data are amplified by the machine-learning programs. I question the ethics of misrepresenting the data "to change reality to make our systems perform in an aspirational way." In philosophy, would this be considered a noble lie? 😉