CS 6006, October 2023
With Haym Hirsch and Anil Damle
Bias: prejudice, usually in a way considered to be unfair
(mathematical) systematic distortion of a statistical result
Prejudice: preconceived opinion
(legal) harm or injury resulting from some action or judgment
Designing recommendations which optimize engagement leads to over-recommending the most prevalent types (Steck, 2018)
rom-com 80% of the time
horror 20% of the time
optimize for probable click
$$\mathbb P(\mathsf{click})= \mathbb P(\mathsf{click}~\text{and}~\mathsf{romcom}) + \mathbb P(\mathsf{click}~\text{and}~\mathsf{horror})$$
$$= 0.8\times \mathbb P(\mathsf{romcom}) + 0.2\times \mathbb P(\mathsf{horror}) = 0.8\times \mathbb P(\mathsf{romcom}) + 0.2\times \left(1- \mathbb P(\mathsf{romcom})\right)$$
recommend rom-com 100% of the time
$$\max_{0\leq x\leq 1}0.8\times x + 0.2 \times (1-x)$$
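The objective is linear in \(x\) (it simplifies to \(0.2 + 0.6x\)), so the maximum sits at the boundary \(x = 1\). A minimal numerical sketch, with the recommendation probability \(x\) swept over a grid:

```python
# Expected click-through if we recommend rom-com with probability x:
#   P(click) = 0.8 * x + 0.2 * (1 - x) = 0.2 + 0.6 * x
# Linear in x, so the maximum is attained at the boundary x = 1.
def expected_clicks(x):
    return 0.8 * x + 0.2 * (1 - x)

# Grid search over x in {0.00, 0.01, ..., 1.00}
best_x = max((x / 100 for x in range(101)), key=expected_clicks)
print(best_x)                   # 1.0 -> recommend rom-com 100% of the time
print(expected_clicks(best_x))  # 0.8
```

The engagement-maximizing policy never recommends horror at all, even though 20% of clicks come from it: naive optimization collapses onto the most prevalent type.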
Data-driven machine translation perpetuates gender bias.
Machine translation works by maximizing the probability of an English sentence given a Hungarian sentence. Hungarian pronouns are gender-neutral, so the model must guess a gender in English, and it guesses whichever rendering is most probable in its training data.
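A toy sketch of how picking the single most probable translation bakes in the bias. The corpus counts below are hypothetical, chosen only to illustrate the mechanism:

```python
# Toy sketch (hypothetical counts): the Hungarian pronoun is gender-neutral,
# but a translator that outputs argmax_e P(e | hungarian) must commit to one
# English pronoun, and it commits to the one most frequent in training data.
corpus_counts = {                # hypothetical counts from a training corpus
    "he is a doctor": 90,
    "she is a doctor": 10,
}
total = sum(corpus_counts.values())
probs = {e: c / total for e, c in corpus_counts.items()}

# Gender-neutral source sentence -> single highest-probability translation
translation = max(probs, key=probs.get)
print(translation)  # "he is a doctor": the corpus skew becomes the output
```

Even a mild skew in the data produces a deterministic, 100%-gendered output, for the same boundary-solution reason as the recommender example above.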
Data-driven resume screening algorithms repeat pre-existing bias.

[Diagram: resume → screening model trained on past hiring records → "send to recruiter", with past accept/reject decisions (✅/❌) as training labels]
Risk algorithms used in healthcare under-predict the needs of Black patients compared with White patients (Obermeyer et al., 2019).
[Diagram: medical records → risk score, where the score is trained to predict healthcare costs ($, $$, $$$) as a proxy for health needs]
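Obermeyer et al. traced the under-prediction to the training target: the algorithm predicted healthcare *costs* as a proxy for health *needs*, and historically less money was spent on Black patients at the same level of need. A toy numeric sketch of the proxy failure, with all numbers hypothetical:

```python
# Toy sketch of the cost-as-proxy failure (all numbers hypothetical).
# Two patients with identical true medical need, but unequal access to
# care means less money was historically spent on patient B.
need = {"A": 1.0, "B": 1.0}        # true health need (identical)
spend_rate = {"A": 1.0, "B": 0.6}  # historical dollars spent per unit need

# A "risk score" trained to reproduce cost reproduces the spending gap:
risk_score = {p: need[p] * spend_rate[p] for p in need}
print(risk_score)  # B scores lower despite identical need
```

The model can be a perfect predictor of its target (cost) and still be badly miscalibrated with respect to the quantity that matters (need).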
Pattern recognition (i.e. machine learning) replicates and amplifies bias in the data
[Diagram: data → decisions]
Naive optimization under uncertainty leads to miscalibration and bias
"Technologies are developed and used within a particular social, economic, and political context. They arise out of a social structure, they are grafted on to it, and they may reinforce it or destroy it, often in ways that are neither foreseen nor foreseeable."
Ursula Franklin, 1989
I think the computer has from the beginning been a fundamentally conservative force. It has made possible the saving of institutions pretty much as they were, which otherwise might have had to be changed. [...] If it had not been for the computer [...] it might have been necessary to introduce a social invention, as opposed to the technical invention.
Joseph Weizenbaum, 1985
The Fire Next Time, James Baldwin (1963)
13th, Ava DuVernay (2016)
I Am Not Your Negro, Raoul Peck (2017)
"Do Artifacts Have Politics?", Langdon Winner (1980)
"Bias in Computer Systems", Batya Friedman and Helen Nissenbaum (1996)