AI and Ethics - We are the guardians of our future
We're in charge, as data scientists and software engineers, of software where mistakes carry a very high price. It's time to look at the impact we have, how our software will be used, and how to avoid creating unfair, biased, and dangerous software.
Transcript
1.
WE ARE THE
GUARDIANS
OF OUR FUTURE
Tess Ferrandez
Photo: Porapak Apichodilok
2.
Photo: Rosemary Ketchum
GANG CRIME
CLASSIFICATION
Partially Generative Neural Networks For Gang
Crime Classification With Partial Information
3.
Photo: Perry Wilson
42
BABIES
28
ADMITTED
GANG MEMBERS
31.
Photo: Fadil Elmansour
AREA W. HIGH CRIME
SEND MORE POLICE
DO MORE ARRESTS
APPEARS TO HAVE MORE CRIME
SEND MORE POLICE
DO MORE ARRESTS
RUNAWAY FEEDBACK LOOP
35.
FAIR AND INCLUSIVE
TRANSPARENT
ACCOUNTABLE
SAFE AND RELIABLE
SECURITY AND PRIVACY
36.
CONSIDER HOW
THE TECHNOLOGY
COULD BE USED
Photo: Pixabay
45.
WE ARE THE
GUARDIANS
OF OUR FUTURE
Tess Ferrandez
Photo: Porapak Apichodilok
Editor's notes
23% no proof. Living on a certain block can label you as a Crip or a Hells Angel. Used to justify arrests and support maximum penalties. No right to know if you're on it, or to challenge it. Used for background checks for jobs.
Quote from a song by Tom Lehrer
The AI Gaydar paper
Wang, Y., & Kosinski, M. (2018). Deep neural networks are more accurate than humans at detecting sexual orientation from facial images. Journal of Personality and Social Psychology, 114(2), 246–257.
50 clicks on Facebook. Used by Cambridge Analytica.
81% for men, 74% for women. Data from dating websites.
Fluidity: what determines someone's preferences? Bi? Bi-curious? Outspoken?
Long tail: criminal in k countries, death penalty.
Criminal or Not = 90%
1856 images of Chinese men aged 18–55, no facial hair or scars, faces cut out.
730 criminals, ID photos from police departments: 235 violent crimes; 536 theft, fraud, corruption, forgery, racketeering.
1126 random images from the internet ("corporate photos") as non-criminals.
Note: 90%. The same network architecture was only able to pick up gender in 86%.
Criminals / Non-criminals
Criminal or Not = 90%
Gender w. AlexNet = 86%
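As a quick arithmetic sanity check on the figures above (only the 730/1126/1856 counts come from the notes; the baseline comparison is my own):

```python
# Majority-class baseline for the dataset described above.
criminals = 730          # ID photos from police departments
non_criminals = 1126     # "corporate photos" from the internet
total = criminals + non_criminals             # 1856 images

baseline = max(criminals, non_criminals) / total
print(f"always guessing 'non-criminal' scores {baseline:.1%}")  # ~60.7%
```

Any accuracy reported on this split should be read against that ~60.7% floor, not against 50%.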
Harm can further be classified into five types: stereotyping, recognition, denigration, under-representation and ex-nomination.
Classification is always a product of its time. We are currently in the biggest experiment of classification in human history.
Kate Crawford
Wouldn't be able to tell one Wookiee from the next. Wife: Malla; son: Lumpy.
Areas with more crime get more policing => more crimes reported => considered a worse neighborhood => more policing.
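A minimal sketch of this runaway loop (a toy model of my own, assuming districts with identical true crime rates and dispatch that over-reacts to reported crime; none of these numbers come from the talk):

```python
import numpy as np

# Two districts with IDENTICAL true crime rates; a small initial
# imbalance in patrols. Crime is only observed where police patrol,
# and next round's patrols over-react (exponent > 1) to reports.
true_rate = np.array([0.10, 0.10])   # same underlying crime everywhere
patrols = np.array([0.55, 0.45])     # slight initial policing imbalance

for _ in range(20):
    reported = true_rate * patrols   # you only see crime where you look
    weight = reported ** 1.2         # "high crime" areas get extra weight
    patrols = weight / weight.sum()  # reallocate patrols from reports

print(patrols.round(3))  # ~[1. 0.]: all policing drifts to one district
```

The only asymmetry here is the initial 55/45 patrol split; the over-reaction amplifies it until one district receives effectively all the policing despite equal underlying crime.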
Scrubbing to neutral. What is neutral: gender, ethnicity (are people from Lapland counted as Finnish?), sexual orientation, political orientation? What about words like pregnancy, or color blindness? Queen? Who decides? Everyone is a majority somewhere and a minority somewhere else.
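One concrete reason scrubbing is hard, beyond deciding what counts as neutral: removing the protected attribute itself does not remove its proxies. A hedged sketch with invented toy data (the 90% proxy correlation is an assumption, not a figure from the talk):

```python
import numpy as np

# Invented toy data: a protected attribute and a 90%-correlated proxy
# (think neighborhood or zip code). A model "scrubbed" of the protected
# column can still effectively see it through the proxy.
rng = np.random.default_rng(1)
n = 10_000
protected = rng.integers(0, 2, n)    # the scrubbed attribute
flip = rng.random(n) < 0.10          # 10% disagreement with the proxy
proxy = np.where(flip, 1 - protected, protected)

agreement = (proxy == protected).mean()
print(f"proxy recovers the scrubbed attribute {agreement:.0%} of the time")
```

A model trained on the proxy alone behaves almost as if it still had the scrubbed column.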