5. Case study: Recidivism risk prediction
COMPAS recidivism prediction tool
– Built by a commercial company, Northpointe, Inc.
Estimates the likelihood that defendants will re-offend in the future
– Inputs: answers to a long questionnaire
– Outputs: risk scores used across the US by judges and parole officers
Are COMPAS’ estimates fair to salient social groups?
Machine Bias: There’s software used across the country to predict future criminals. — ProPublica
6. Case study: Recidivism risk prediction
Is the algorithm fair to all groups?
When base rates differ, no non-trivial classifier can simultaneously equalize FPR, FNR, FDR, and FOR (false positive rate, false negative rate, false discovery rate, false omission rate) across groups!
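The impossibility claim above can be illustrated with a small numeric sketch (the base rates and error rates below are hypothetical, not COMPAS or ProPublica figures): give two groups with different base rates the same TPR and FPR, derive the confusion-matrix cells, and the false discovery and false omission rates necessarily diverge.

```python
# Illustrative sketch with hypothetical numbers: when base rates differ,
# equal FPR/FNR across groups forces FDR and FOR to differ.

def group_rates(base_rate, tpr, fpr):
    """Confusion-matrix cell masses for one group, then the four rates."""
    tp = base_rate * tpr              # true positives
    fn = base_rate * (1 - tpr)        # false negatives
    fp = (1 - base_rate) * fpr        # false positives
    tn = (1 - base_rate) * (1 - fpr)  # true negatives
    return {
        "FPR": fp / (fp + tn),  # false positive rate
        "FNR": fn / (fn + tp),  # false negative rate
        "FDR": fp / (fp + tp),  # false discovery rate
        "FOR": fn / (fn + tn),  # false omission rate
    }

# Same classifier behaviour (TPR=0.8, FPR=0.2) applied to two groups
# whose base rates of re-offending differ (30% vs 60%).
a = group_rates(0.3, tpr=0.8, fpr=0.2)
b = group_rates(0.6, tpr=0.8, fpr=0.2)

print(f"Group A: FDR={a['FDR']:.2f}, FOR={a['FOR']:.2f}")
print(f"Group B: FDR={b['FDR']:.2f}, FOR={b['FOR']:.2f}")
```

FPR and FNR match across the groups by construction, yet FDR comes out as roughly 0.37 vs 0.14 and FOR as roughly 0.10 vs 0.27: equalizing one family of error rates pushes the other apart whenever base rates differ.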
7. Open invitation to join the P7003 working group
http://sites.ieee.org/sagroups-7003/
8. Key question when developing or
deploying an algorithmic system
Who will be affected?
What are the decision/optimization criteria?
How are these criteria justified?
Are these justifications acceptable in the context where the system is used?
9. P7003 foundational sections
Taxonomy of Algorithmic Bias
Legal frameworks related to Bias
Psychology of Bias
Cultural aspects
10. P7003 algorithm development sections
Algorithmic system design stages
Person categorization and identifying affected population groups
Assurance of representativeness of testing/training/validation data
Evaluation of system outcomes
Evaluation of algorithmic processing
Assessment of resilience against external manipulation intended to introduce bias
Documentation of criteria, scope and justifications of choices