Gender role bias in AI algorithms

Should it surprise us that human biases find their way into human-designed AI algorithms trained using data sets of human artifacts?

Machine-learning software trained on such datasets didn’t just mirror those biases; it amplified them. If a photo set generally associated women with cooking, software trained by studying those photos and their labels created an even stronger association.

https://www.wired.com/story/machines-taught-by-photos-learn-a-sexist-view-of-women?mbid=nl_82117_p2&CNDID=24258719
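The amplification effect can be sketched with a toy example (all numbers and labels here are hypothetical, chosen only to illustrate the mechanism): a classifier that resolves uncertainty by predicting the label most common in its training data will report a skewed association even more strongly than the data itself does.

```python
# Toy sketch of bias amplification (hypothetical numbers).
# Suppose 67% of photos tagged "cooking" in a training set are
# labeled "woman". A naive model that, when unsure, predicts the
# majority label for that activity turns a 67% skew into 100%.
from collections import Counter

# Hypothetical training labels for photos tagged "cooking"
train_labels = ["woman"] * 67 + ["man"] * 33
dataset_rate = train_labels.count("woman") / len(train_labels)

# Naive classifier: always output the majority label seen in training
majority = Counter(train_labels).most_common(1)[0][0]
predictions = [majority for _ in range(100)]  # 100 new "cooking" photos
predicted_rate = predictions.count("woman") / len(predictions)

print(f"association in the data:      {dataset_rate:.0%}")   # 67%
print(f"association in model output:  {predicted_rate:.0%}")  # 100%
```

Real models are far subtler than a majority-vote rule, but the direction is the same: optimizing accuracy against skewed labels rewards leaning into the skew.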

Comment from Brent:
What is surprising is that the biased data associations are bolstered by the machine-learning programs. I question the ethics of misrepresenting the data “to change reality to make our systems perform in an aspirational way”. In philosophy, would this be considered a noble lie? 😉