Module 2 - Justice in Practice
Lesson 9
Technology and Justice: Who Controls the Code?
Guiding Questions
• Can algorithms be just?
• Who is responsible when machines make unjust decisions?
• Is justice evolving — or disappearing — in the digital age?
Justice by Algorithm?
Today, technology doesn’t just assist the justice system — it shapes it. Across the world:
• Predictive policing tools decide where police patrol
• AI software recommends prison sentences
• Facial recognition tools identify — and often misidentify — suspects
• Social media influences juries, elections, and public perception
• Mass surveillance monitors behavior, often without consent
Technology is no longer separate from justice — it is embedded in it. The key question becomes: who programs the rules, and who pays the price?
Invisible Bias, Visible Harm
Many assume algorithms are neutral — machines that follow logic. But all algorithms are made by humans. And humans carry bias, assumptions, and power.
Some real-world examples:
• COMPAS, a recidivism “risk score” tool used in U.S. courts, was found to rate Black defendants as higher risk than white defendants with similar charges and records.
• Facial recognition is far less accurate for women and people of color, leading to wrongful arrests.
• Hiring algorithms, trained on biased data, often replicate discrimination rather than eliminate it.
These systems are often proprietary and closed to public review — what some call “black-box justice.” If you can’t see the rules, how can you challenge them?
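The claim that biased data produces biased predictions can be made concrete with a toy sketch. Everything here is invented for illustration: the group labels, the numbers, and the model (a simple group-rate predictor) stand in for far more complex real systems, but the mechanism is the same.

```python
# Toy sketch (hypothetical data): a "risk score" learned from biased
# historical labels reproduces the bias baked into those labels.
from collections import defaultdict

# Historical records as (group, labeled_reoffense) pairs. Suppose past
# policing over-flagged group "B", skewing its labels upward.
history = ([("A", 0)] * 70 + [("A", 1)] * 30 +
           [("B", 0)] * 40 + [("B", 1)] * 60)

def train_rates(records):
    """Learn each group's labeled 'reoffense' rate from history."""
    counts = defaultdict(lambda: [0, 0])  # group -> [positives, total]
    for group, label in records:
        counts[group][0] += label
        counts[group][1] += 1
    return {g: pos / total for g, (pos, total) in counts.items()}

rates = train_rates(history)

def risk_score(group):
    """The model's 'risk' for a new person is just the learned group rate."""
    return rates[group]

# Two people with identical facts, differing only in group membership,
# receive different scores: the skew in the labels becomes the output.
print(risk_score("A"))  # 0.3
print(risk_score("B"))  # 0.6
```

The model never "sees" race or intent; it simply optimizes against skewed labels, which is why such systems can discriminate while appearing mechanical and neutral.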
Philosophical Perspectives
John Rawls might ask: Would you approve of algorithmic justice if you didn’t know your race, class, or gender? If the answer is no, the system fails the “veil of ignorance” test.
Michel Foucault warned that modern surveillance systems don’t just watch — they control. Digital tools may be the new “panopticon,” always watching, always judging.
Langdon Winner argued that technology can carry politics. A machine isn’t neutral — it can encode power, advantage, and exclusion.
Two Perspectives on Technology and Justice
Some believe that algorithms are a sign of progress. They can make systems faster, cheaper, and — potentially — more objective than human judges.
Others see them as a threat. Automation can hide bias behind a screen, reduce accountability, and centralize control in the hands of the few.
The truth may be somewhere in between — and it depends on how technology is designed, used, and governed.
A Thought Experiment
Imagine you’re arrested.
Instead of a judge, an algorithm determines whether you get bail, how long your sentence is, and whether you can appeal.
You don’t get to know how the system made its decision. No explanation. No questions allowed.
Would you trust that system?
Would you call that justice?
Building Just Technology
If technology is to serve justice, it must be designed with justice in mind. Some key reforms include:
• Auditable algorithms – systems should be open to public inspection
• Human oversight – machines can assist, but not replace human judgment
• Ethical AI standards – enforceable by law, not just corporate guidelines
• Diverse design teams – reflecting the full range of human experience
• Right to explanation – people must know how decisions about them are made
• Digital literacy – all citizens should understand how tech affects their rights and freedoms
Technology can serve justice — but only if we demand it.
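What an "auditable algorithm" with a "right to explanation" might look like can also be sketched in a few lines. The rule, its thresholds, and the factor names below are invented for illustration, not drawn from any real system; the point is that the decision returns its reasons alongside its outcome, so the person affected can inspect and contest each factor.

```python
# Minimal sketch of an explainable decision rule (hypothetical thresholds):
# the outcome ships with the factors that produced it.

def bail_decision(prior_offenses, failed_appearances):
    """Return a decision plus a human-readable list of the factors used."""
    reasons = []
    strikes = 0
    if prior_offenses > 2:
        strikes += 1
        reasons.append(f"prior offenses ({prior_offenses}) exceed 2")
    if failed_appearances > 0:
        strikes += 1
        reasons.append(f"{failed_appearances} failure(s) to appear in court")
    return {
        "bail_granted": strikes < 2,
        "factors_considered": reasons or ["no negative factors found"],
    }

# Someone denied bail can see, and therefore challenge, each factor.
print(bail_decision(prior_offenses=3, failed_appearances=1))
```

Contrast this with a black-box score: the logic here can be published, audited for bias, and overridden by a human reviewer, which is exactly what the reforms above demand.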
Reflect and Discuss
• Should artificial intelligence be used to decide who goes to jail or gets parole?
• When an algorithm causes harm, who should be held responsible?
• Can a justice system be fair if it becomes faster, cheaper — but less human?
Suggested Readings
• Weapons of Math Destruction – Cathy O’Neil
• Automating Inequality – Virginia Eubanks
• Do Artifacts Have Politics? – Langdon Winner
• The Age of Surveillance Capitalism – Shoshana Zuboff
• European Union – GDPR’s “Right to Explanation” provision
Next Lesson Preview
Lesson 10: The Environment and Justice – Who Speaks for Future Generations?
Does justice include those who haven’t been born yet — or species that cannot speak?
“Injustice coded in algorithms is still injustice — just faster, and harder to see.”