
Automated Decision-Making and Relational Justice: Credit and Justice for Low-Income People

Patricia Cochran, Assistant Professor, Faculty of Law, University of Victoria

Freya Kodar, Associate Professor, Faculty of Law, University of Victoria

Published April 19, 2018 by Technologies of Justice.

On January 26, 2018, during the Technologies of Justice Conference hosted at the University of Ontario Institute of Technology, Patricia Cochran and Freya Kodar discussed automated decision-making and relational justice in a session entitled "Terms of Data Mining and Justice Outcomes."

Cochran and Kodar argued for a robust, reflective, interdisciplinary conversation about the regulation of algorithmic decision-making, situated in the context of high-cost online lending and income inequality. They explained the complexities of equality issues in both human and algorithmic contexts.

The researchers discussed the role of algorithmic decision-making in online lending and the justice and equality considerations that arise. They described automated processes in which software algorithms make complex decisions (e.g. credit risk assessments) by drawing predictive inferences from large sets of data about human behaviour. They asked whether these practices can be tested against legal norms of equality and justice, and questioned claims of neutrality and objectivity in algorithmic decision-making. They also pointed out the problems that can arise when humans are unable to understand the basis on which algorithms make decisions.
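To make the idea of predictive inference concrete, the sketch below (Python with scikit-learn) shows how a credit-risk classifier might be trained on behavioural data and then used to make an automated approve/decline decision. It is a hypothetical illustration, not anything presented at the session: the features, data, and approval threshold are all invented for the example.

```python
# Minimal sketch of automated credit-risk assessment (hypothetical, for illustration only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Invented behavioural features for 1,000 past applicants:
# [monthly income (thousands), recent online purchases, years at current address]
X = rng.normal(loc=[3.0, 20.0, 4.0], scale=[1.0, 10.0, 3.0], size=(1000, 3))
# Invented repayment outcomes (1 = repaid, 0 = defaulted), loosely tied to income.
y = (X[:, 0] + rng.normal(scale=1.0, size=1000) > 3.0).astype(int)

# The "algorithmic decision-maker": a model that infers repayment risk from behaviour.
model = LogisticRegression().fit(X, y)

# An automated decision for a new applicant: approve if the predicted repayment
# probability clears a threshold chosen by the lender.
applicant = np.array([[2.5, 5.0, 1.0]])
probability = model.predict_proba(applicant)[0, 1]
decision = "approve" if probability >= 0.5 else "decline"
print(f"predicted repayment probability: {probability:.2f} -> {decision}")
```

Even in this toy form, the decision rests entirely on correlations the model finds in past behavioural data, which is precisely what raises the questions of equality, neutrality, and explainability discussed in the session.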

The researchers showed how algorithmic decision-making shapes the online lending environment in particular, although it can also reach brick-and-mortar lending, making products such as pre-authorized loans available entirely online with very little human contact. They discussed the effect of these techniques on both high-cost and low-cost loans, and how data-driven credit risk assessment can benefit both lender and borrower, with algorithms opening up credit to applicants who might otherwise have been turned down.

Cochran and Kodar also pointed out problems with this lending strategy. Because these lenders often do not rely on traditional credit reports and credit scores, consumers face new risks, such as being penalized for having a limited data footprint. The researchers also discussed how technical bias can become ingrained in this type of learning algorithm over time.
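The feedback-loop concern can be sketched just as simply. In the hypothetical simulation below (again an illustration, not the researchers' analysis), a lender scores "thin-file" applicants below its approval threshold; because repayment outcomes are only observed for approved loans, the lender never collects the data that would show declined applicants repaying at the same rate, so the initial bias has nothing to correct it.

```python
# Minimal sketch of how bias can become ingrained through a data feedback loop
# (hypothetical, for illustration only).
import numpy as np

rng = np.random.default_rng(1)

def simulate_round(score_thin, n=1000):
    """Approve applicants whose score clears 0.5; return observed repayment outcomes."""
    thin_file = rng.random(n) < 0.3               # 30% of applicants have limited data footprints
    true_repay = rng.random(n) < 0.8              # same true repayment rate for everyone
    score = np.where(thin_file, score_thin, 0.8)  # thin-file applicants receive a penalized score
    approved = score >= 0.5
    # Repayment outcomes are only observed for approved applicants.
    return true_repay[approved & thin_file], true_repay[approved & ~thin_file]

# Thin-file applicants are scored below the approval threshold, so they are all declined.
thin_outcomes, thick_outcomes = simulate_round(score_thin=0.4)
print("observed thin-file outcomes:", thin_outcomes.size)            # 0 -> no data to revise the score
print("observed thick-file repayment rate:", round(thick_outcomes.mean(), 2))
```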

They left us with a set of closing questions:

  • What can we do with law in cases such as this?
  • What should we do?
  • How do we regulate algorithmic decision-making?
  • What is the role of law?

Emphasizing a relational understanding of law, they highlighted the need for further research on how these algorithms might be used to counter cultural and sociological stereotypes, as well as the inequalities and biases within lending and legal systems.