Philosopher Dr. Laura Kim argues that algorithms making high-stakes decisions (loans, sentencing) should be explainable to those affected, even if this reduces predictive accuracy.
Which analysis would best support Kim's explainability argument?
(A) Black-box algorithms denied loans to applicants whose situations were identical to those of approved applicants, based on inexplicable factors; affected individuals had no way to understand or challenge the decisions, undermining their procedural justice rights regardless of aggregate accuracy
(B) Algorithms are complex
(C) Some decisions are important
(D) Technology advances rapidly
Correct Answer: A
Choice A is the best answer. It supplies concrete evidence of a rights violation: arbitrary outcomes combined with no mechanism to challenge them demonstrate procedural injustice, which supports Kim's claim that explainability matters even at some cost to predictive accuracy.
💡 Strategy: Rights-based arguments need evidence of a rights violation, not just efficiency concerns.