Module 1: Foundations of Justice
Lesson 6
Justice and Technology — Algorithms, AI, and the End of Fairness?
1. Can a Machine Be Just?
In an era where algorithms decide everything — from who gets a loan to who goes to jail — we must ask:
Can code deliver justice?
Or is justice being quietly replaced by convenience?
2. What Is Algorithmic Decision-Making?
An algorithm is a set of instructions used to process data and make decisions. Today, algorithms are embedded in systems that:
• Predict how likely someone is to reoffend
• Recommend bail amounts or sentencing durations
• Screen job applications
• Flag suspected welfare fraud
• Select who gets searched or stopped at borders
These systems are often described as “neutral,” but in reality, they reflect the values and biases of those who create and train them.
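To make the idea concrete, here is a deliberately simplified sketch of what such a decision rule can look like. Every feature, weight, and threshold below is invented for illustration; it does not describe COMPAS or any other real system.

```python
# A purely illustrative sketch of an automated decision rule.
# The features, weights, and threshold are invented for teaching purposes;
# they do not describe any real system.

def risk_score(prior_arrests: int, age: int, employed: bool) -> float:
    """Combine a few inputs into a single 'risk' number."""
    score = 0.0
    score += 2.0 * prior_arrests          # weight chosen by the designer
    score += 1.5 if age < 25 else 0.0     # another designer choice
    score -= 1.0 if employed else 0.0
    return score

def recommend_detention(score: float, threshold: float = 4.0) -> bool:
    """Turn the score into a yes/no recommendation."""
    return score >= threshold

# Every number above encodes a human value judgment, even though the
# output looks like neutral arithmetic.
print(recommend_detention(risk_score(prior_arrests=2, age=22, employed=False)))
```

Notice that nothing in the code is "biased" in an obvious way; the values chosen by the designer simply determine who crosses the threshold.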
3. Bias Built into the Machine
“Garbage in, garbage out.”
If an algorithm is trained on biased data, it will replicate and amplify that bias.
Examples:
• A predictive policing system trained on arrest data from over-policed neighborhoods will recommend more patrols in those same areas, reinforcing over-policing.
• Facial recognition software has been shown to misidentify people of color at higher rates than white individuals.
• Hiring algorithms trained on male-dominated résumés may learn to favor male candidates, perpetuating gender inequality.
Even when no human makes the final decision, injustice can hide within the code — disguised as mathematics.
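The predictive-policing example above can be simulated in a few lines. The sketch below assumes two neighborhoods with identical underlying crime rates but an unequal arrest record to start; all numbers are invented for illustration.

```python
import random

# Illustrative feedback-loop simulation; every number is invented.
# Neighborhoods A and B have the SAME underlying crime rate, but A starts
# with more recorded arrests because it was patrolled more heavily in the past.

random.seed(0)
true_crime_rate = {"A": 0.1, "B": 0.1}    # identical underlying rates
recorded_arrests = {"A": 60, "B": 30}     # biased historical record

for year in range(1, 6):
    total = sum(recorded_arrests.values())
    # "Predictive" allocation: patrols proportional to past recorded arrests
    patrols = {hood: round(100 * count / total)
               for hood, count in recorded_arrests.items()}
    for hood, n_patrols in patrols.items():
        # More patrols mean more crime gets observed and recorded as arrests,
        # regardless of the true underlying rate.
        observed = sum(random.random() < true_crime_rate[hood]
                       for _ in range(n_patrols))
        recorded_arrests[hood] += observed
    print(f"Year {year}: patrols={patrols}, recorded arrests={recorded_arrests}")

# A and B are equally "criminal", yet the recorded gap never closes:
# the biased starting data keeps sending patrols to A, and the extra
# patrols keep confirming the bias.
```

Because patrols are allocated in proportion to past arrests, the neighborhood that starts with more recorded arrests keeps receiving more patrols, which in turn produces more recorded arrests: the bias in the data feeds itself.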
4. Can Justice Be Automated?
Different theories of justice offer different perspectives on AI in law and punishment:
• Utilitarianism supports algorithms if they make decisions faster and more cheaply while reducing overall harm.
• Kantian Ethics opposes any system that treats people as data points instead of moral beings with inherent dignity.
• Rawlsian Theory is skeptical of opaque systems, especially if the people most affected cannot understand or challenge the results.
• Libertarianism resists algorithmic surveillance and data collection, viewing it as a violation of personal liberty.
• Critical Theory argues that algorithms often reproduce and entrench existing inequalities of race, gender, and class.
5. Case Study: COMPAS in the U.S. Courts
The COMPAS software (Correctional Offender Management Profiling for Alternative Sanctions) was developed to predict the likelihood that a defendant will commit future crimes. Courts across the United States have used its risk scores to guide bail, sentencing, and parole decisions.
However, independent investigations, most prominently ProPublica's 2016 "Machine Bias" report, found that Black defendants who did not go on to reoffend were nearly twice as likely as white defendants to be labeled high risk; the scores were harsher than their actual outcomes warranted.
Adding to the controversy, Northpointe (now Equivant), the company that created COMPAS, declined to reveal how the score is calculated, citing "trade secrets."
“A secret formula decides your freedom. Is that justice?”
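To see in numbers what the investigators' claim about unsupported scores means, here is a small worked example of one common fairness measure, the false positive rate: the share of people who did not reoffend but were still labeled high risk. The counts below are invented for illustration and are not the actual COMPAS or ProPublica figures.

```python
# Invented counts for illustration; not the real COMPAS/ProPublica data.
# A "false positive" here means: labeled high risk, but the person did NOT reoffend.

def false_positive_rate(high_risk_no_reoffend: int, total_no_reoffend: int) -> float:
    """Share of non-reoffenders who were nonetheless labeled high risk."""
    return high_risk_no_reoffend / total_no_reoffend

group_a = false_positive_rate(high_risk_no_reoffend=45, total_no_reoffend=100)
group_b = false_positive_rate(high_risk_no_reoffend=23, total_no_reoffend=100)

print(f"Group A false positive rate: {group_a:.0%}")   # 45%
print(f"Group B false positive rate: {group_b:.0%}")   # 23%

# Two groups can face very different error rates even when the tool's overall
# accuracy looks similar for both; the harm of a wrong "high risk" label
# falls far more heavily on one group.
```

A tool can be "accurate on average" and still distribute its mistakes unevenly, which is precisely what the critics of COMPAS argued.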
6. Global Risks
Algorithmic injustice is not confined to one country:
• In China, the “social credit” system scores citizens on behavior and restricts access to services based on their rating.
• In the U.S. and UK, predictive policing relies on flawed historical data, often reinforcing biased enforcement patterns.
• In Estonia and China, officials have reportedly experimented with AI systems that decide small-claims disputes, with human judges involved only on appeal, if at all.
“When decisions become automated, accountability disappears.”
Discussion Questions
1. Should algorithms ever be allowed to make decisions in the justice system?
2. How can we ensure fairness and transparency in algorithmic tools?
3. Is algorithmic injustice more dangerous because it’s harder to see?
4. Who should be responsible when a biased algorithm causes harm — the coder, the user, or society?
Assignment (Optional)
Choose one real-world algorithm or AI system used today in law, employment, finance, or government.
In your response, answer:
• What is the tool and what does it do?
• Who uses it, and why?
• Has it been accused of bias or injustice?
• Would you trust your freedom, job, or health to this tool?
Use one or more theories of justice to defend or criticize the system.
Next Lesson Preview:
Lesson 7 – Justice and Identity: Race, Gender, and the Struggle for Equal Protection