
Emilee Rader

Associate Professor @ the University of Wisconsin-Madison


Explanations as Mechanisms for Supporting Algorithmic Transparency

by: Emilee Rader, Kelley Cotter and Janghee Cho

Abstract

Transparency can empower users to make informed choices about how they use an algorithmic decision-making system and judge its potential consequences. However, transparency is often conceptualized in terms of the outcomes it is intended to bring about, not the specific mechanisms for achieving those outcomes. We conducted an online experiment focusing on how different ways of explaining Facebook’s News Feed algorithm might affect participants’ beliefs and judgments about the News Feed. We found that all explanations caused participants to become more aware of how the system works, and helped them to determine whether the system is biased and whether they can control what they see. The explanations were less effective for helping participants evaluate the correctness of the system’s output and form opinions about how sensible and consistent its behavior is. Based on these results, we present implications for the design of transparency mechanisms in algorithmic decision-making systems.

Reference

Emilee Rader, Kelley Cotter and Janghee Cho. “Explanations as Mechanisms for Supporting Algorithmic Transparency.” CHI 2018. Montreal, Quebec, Canada. April 2018.

Download: PDF