Gender role bias in AI algorithms

Should it surprise us that human biases find their way into human-designed AI algorithms trained using data sets of human artifacts?

Machine-learning software trained on those datasets didn’t just mirror the biases; it amplified them. If a photo set generally associated women with cooking, software trained by studying those photos and their labels created an even stronger association.
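The amplification effect can be illustrated with a toy sketch. The data below is hypothetical (not the study's actual image dataset): a classifier that simply predicts the majority gender for each activity turns a 70/30 skew in the training data into a 100/0 skew in its predictions.

```python
from collections import Counter

# Hypothetical toy dataset of (activity, gender) labels with a 70/30 skew.
# Illustrative only -- not the actual photo data from the study.
data = [("cooking", "woman")] * 70 + [("cooking", "man")] * 30

def train_majority_classifier(samples):
    """Return a predictor that outputs the most common gender per activity."""
    counts = {}
    for activity, gender in samples:
        counts.setdefault(activity, Counter())[gender] += 1
    return lambda activity: counts[activity].most_common(1)[0][0]

predict = train_majority_classifier(data)

# Association in the training data: 70% of cooking images show a woman.
train_rate = sum(g == "woman" for _, g in data) / len(data)   # 0.70

# Association in the predictions: the model says "woman" every time.
pred_rate = sum(predict(a) == "woman" for a, _ in data) / len(data)  # 1.00
```

The point of the sketch is that nothing in the learner is malicious: maximizing accuracy on skewed data is exactly what pushes a 70% correlation toward a 100% one.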


1 Comment

What is surprising is that the biased data associations are bolstered by the machine-learning programs. I question the ethics of misrepresenting the data “to change reality to make our systems perform in an aspirational way”. In philosophy, would this be considered a noble lie? 😉
