
Automated Decision-Making and Relational Justice: Credit and Justice for Low-Income People

Patricia Cochran, Assistant Professor, Faculty of Law, University of Victoria

Freya Kodar, Associate Professor, Faculty of Law, University of Victoria

Published April 19, 2018 by Technologies of Justice.

On January 26, 2018, during the Technologies of Justice Conference hosted at the University of Ontario Institute of Technology, Patricia Cochran and Freya Kodar discussed automated decision-making and relational justice in a session entitled Terms of Data Mining and Justice Outcomes.


Cochran and Kodar argued for a robust, reflective, interdisciplinary conversation about the regulation of algorithmic decision-making in the context of high-cost online lending and income inequality. They explained the complexities of equality issues in both human and algorithmic contexts.

The researchers discussed the role of algorithmic decision-making in online lending and the justice and equality considerations that arise. They showed cases of automated processes relying on algorithms in software to make complex decisions (e.g. credit risk assessments), and how predictive inferences are drawn from large data sets about human behaviour. They asked whether these practices can be tested against legal norms of equality and justice, and questioned claims of neutrality and objectivity in algorithmic decision-making. They also pointed out the problems that can arise when humans are unable to understand the basis on which algorithms make decisions.

The researchers showed how algorithmic decision-making shapes the online lending environment in particular, though it can also apply to brick-and-mortar lending, making products such as pre-authorized loans available entirely online with very little human contact. They discussed the effect of these techniques on both high-cost and low-cost loans, and how data-driven algorithms for assessing credit risk can benefit both lenders and borrowers, with algorithms opening access to credit for parties who might otherwise have been turned down.

Cochran and Kodar also pointed out some of the problems with this lending strategy, noting that these systems often bypass traditional credit reports and credit scores, creating consumer risks such as penalties for those with limited data footprints. They also discussed how technical bias can become ingrained in this type of learning algorithm over time.
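The feedback-loop problem behind this kind of technical bias can be illustrated with a toy simulation (a minimal sketch with entirely hypothetical group names, numbers, and thresholds, not figures from the talk): a model that only observes repayment outcomes for the loans it approves can never correct its pessimistic estimate for an applicant group with a thin data record, even when that group's true repayment behaviour is identical.

```python
import random

random.seed(0)

# Hypothetical: both groups repay at the same true rate
TRUE_REPAY = {"thick_file": 0.9, "thin_file": 0.9}

# Historical outcome records (1 = repaid, 0 = defaulted);
# the "thin file" group has almost no data on record
history = {
    "thick_file": [1] * 80 + [0] * 9,
    "thin_file": [1] * 2 + [0] * 1,
}

THRESHOLD = 0.8  # approve only if estimated repayment probability >= 0.8

def estimated_repay(group):
    # Laplace-smoothed estimate built from observed outcomes only
    obs = history[group]
    return (sum(obs) + 1) / (len(obs) + 2)

def run_round():
    for group in history:
        if estimated_repay(group) >= THRESHOLD:
            # Approved: an outcome is observed, so the record grows
            outcome = 1 if random.random() < TRUE_REPAY[group] else 0
            history[group].append(outcome)
        # Rejected: no outcome is ever observed, so the estimate never updates

for _ in range(50):
    run_round()

print(round(estimated_repay("thick_file"), 2))  # stays near 0.9
print(round(estimated_repay("thin_file"), 2))   # stuck at 0.6: never approved
```

The thin-file group starts below the approval threshold, is never approved, and therefore never generates the data that could raise its estimate, which is one way bias becomes self-reinforcing in a learning system.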

They left us with these final questions:

  • What can we do with law in cases such as this?
  • What should we do?
  • How do we regulate algorithmic decision making?
  • What is the role of law?

Emphasizing the use of law in relational terms, they highlighted the need for further research on how these algorithms might be used to counteract cultural and sociological stereotypes, as well as the inequalities and biases within lending and legal systems.