The spread of artificial intelligence, mostly in the form of representation learning, has introduced an issue that, at first glance, seems difficult to fix: bias.
Over the last few years, more and more examples have emerged: people have been misclassified as animals, chatbots have turned racist, and machine learning systems used by recruitment companies and translation services have made the news over bias-related issues.
In this presentation, we will look at the impact bias has on people and at how to remove it without having to dive deep into the data and clean it manually.
NEUTRALISING BIAS ON WORD EMBEDDINGS
• Machine Learning Engineer at Quby;
• Coursera Mentor;
• City.AI Ambassador;
• School of AI Dean [Utrecht]
• IBM Watson AI XPRIZE contestant;
• Public speaker;
• Family man and father of 3.
How do you see racism?
• Before you proceed, please watch this video: https://www.youtube.com/watch?v=5F_atkP3pqs
• The audio is in Portuguese, but in the next slide you will find translations of what people said in the video.
Source: Canal de TV da FAP (Astrojildo Pereira Foundation)
• Group I
• He is late;
• She is a fashion designer;
• Holds an executive position in either the HR or Finance area;
• Taking care of his garden; doesn't look like a gardener;
• She is cleaning her own house, the countertop;
• Graffiti artist; it's an art, it's not vandalism.
• Group II
• Vandalising the wall; she is a spitter;
• She is a housekeeper; cleaning the house;
• He is a gardener;
• He looks like a security guard or a …;
• Seamstress; saleswoman;
• He is running away; he is a thief.
• Blue is for boys, pink is for girls.
• Boys are better at maths and science.
• Tall people make better leaders.
• New mothers are more absent from work than new fathers.
• People with tattoos are rebellious.
• Younger people are better with technology than older people.
"AI is just an extension of our existing culture."
– Joanna Bryson, University of Bath and Princeton University
Racialized code & Unregulated algorithms
Joy Buolamwini, Code4Rights and MIT Media Lab Researcher.
How white engineers built racist code – and why it's dangerous for black people
Both black and white Americans, for example, are faster at associating names like "Brad" and "Courtney" with words like "happy" and "sunrise," and names like "Leroy" and "Latisha" with words like "hatred" and "vomit," than vice versa.
Names like "Brett" and "Allison" were more similar to those for positive words, including "love" and "laughter," and those for names like "Alonzo" and "Shaniqua" were more similar to negative words like "cancer" and "failure."
The researchers measured how closely related the embeddings for words like "hygienist" and "librarian" were to those of words like "female" and "woman." They then compared this computer-generated gender-association measure to the actual percentage of women in each occupation.
Cosine similarity: cos(A, B) = (A ⋅ B) / (‖A‖ ‖B‖) = d / p, where d is the dot product and p is the product of the L2 norms.
• Father (L2 norm): 5.31
• Mother (L2 norm): 5.63
• Similarity: d / p = 0.89
• Car (L2 norm): 5.73
• Bird (L2 norm): 4.83
• Similarity: d / p = 0.21
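The similarity measure above can be sketched in a few lines of NumPy. The toy 3-d vectors are made up for illustration; real GloVe embeddings have 50–300 dimensions and would produce the values shown on the slide.

```python
import numpy as np

def cosine_similarity(a, b):
    # d: dot product of the two vectors; p: product of their L2 norms
    d = np.dot(a, b)
    p = np.linalg.norm(a) * np.linalg.norm(b)
    return d / p

# Toy vectors (illustrative only; not real GloVe embeddings)
father = np.array([0.8, 0.3, 0.1])
mother = np.array([0.7, 0.4, 0.15])
print(cosine_similarity(father, mother))  # close to 1 for related words
```

Related word pairs (father/mother) score near 1, unrelated pairs (car/bird) score much lower.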
Neutralising bias from non-gender-specific words
e_bias = ((e ⋅ g) / ‖g‖²) g
e_debiased = e − e_bias
where e is the word's embedding and g is the learned gender (bias) direction.
Source: Bolukbasi et al., 2016, https://arxiv.org/pdf/1607.06520.pdf
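The neutralisation step above projects the word vector onto the bias direction and subtracts that projection. A minimal sketch, assuming `g` is a gender direction obtained e.g. as the difference between the embeddings of "woman" and "man":

```python
import numpy as np

def neutralise(e, g):
    # Projection of e onto the bias direction g: e_bias = ((e . g) / ||g||^2) * g
    e_bias = (np.dot(e, g) / np.dot(g, g)) * g
    # Subtracting the projection leaves a vector orthogonal to g
    return e - e_bias
```

After neutralising, the cosine similarity between the debiased vector and g is zero up to floating-point error, which is exactly the "after neutralising" effect measured on the next slide.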
Does it work?
• Cosine similarity between receptionist
and gender, before neutralising:
• Cosine similarity between receptionist
and gender, after neutralising:
Equalising gender-specific words
• Cosine similarity between actor and gender, before equalising:
• Cosine similarity between actress and gender, before equalising:
• Cosine similarity between actor and gender, after equalising:
• Cosine similarity between actress and gender, after equalising:
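Equalisation handles word pairs that are legitimately gendered (actor/actress) by making them exactly symmetric about the bias direction, following the formulas in Bolukbasi et al., 2016. A sketch assuming a single bias direction `g` and embeddings roughly normalised to unit length:

```python
import numpy as np

def project(v, g):
    # Component of v along the bias direction g
    return (np.dot(v, g) / np.dot(g, g)) * g

def equalise(e_w1, e_w2, g):
    # Midpoint of the pair, split into bias and orthogonal components
    mu = (e_w1 + e_w2) / 2
    mu_b = project(mu, g)
    mu_orth = mu - mu_b
    # Both words keep the shared orthogonal part and get bias components
    # of equal magnitude but opposite sign, re-scaled to unit length
    scale = np.sqrt(abs(1 - np.sum(mu_orth ** 2)))
    e1_b, e2_b = project(e_w1, g), project(e_w2, g)
    e1 = mu_orth + scale * (e1_b - mu_b) / np.linalg.norm(e1_b - mu_b)
    e2 = mu_orth + scale * (e2_b - mu_b) / np.linalg.norm(e2_b - mu_b)
    return e1, e2
```

After equalising, actor and actress sit at mirror-image positions relative to g, so their cosine similarities to gender are equal in magnitude and opposite in sign.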
How far is actor from babysitter?
• Cosine similarity between actor and babysitter, before equalising:
• Cosine similarity between actress and babysitter, before equalising:
• Cosine similarity between actor and babysitter, after equalising:
• Cosine similarity between actress and babysitter, after equalising:
• Bolukbasi et al., 2016, https://arxiv.org/pdf/1607.06520.pdf
• Jeffrey Pennington, Richard Socher, and Christopher D. Manning, GloVe: Global Vectors for Word Representation, https://nlp.stanford.edu/projects/glove/