
Emilee Rader

Associate Professor @ the University of Wisconsin-Madison


Projects

Managing Privacy of Derived Data

Sensors, usage logs, and other forms of automated collection of personal data are becoming harder, if not impossible, to avoid; from personal health and fitness trackers to city-wide surveillance cameras to web application and server logs, they are a pervasive aspect of the physical and digital infrastructure around us. In isolation, sensor data may seem non-sensitive and harmless; however, aggregation produces derived data consisting of new insights and inferences that are not obvious to users and can be surprising, unsettling, or harmful when used for unexpected purposes.

Consent (notice and choice) is the typical framework for data sharing rights and permissions regarding technology use. But when sensor data collection is automatic and requires no manual interaction from the user, it is difficult to imagine how people can make informed decisions about their preferences. I am studying how to help people become better able to recognize situations when their behaviors produce data that might be used to infer information they and others might prefer not to reveal, and how to design mechanisms that support coordination and social awareness about acceptable uses of derived data.

This project was supported by the National Science Foundation under Grant No. CNS-1524296.

  • Emilee Rader. “Data Privacy and Pluralistic Ignorance.” Symposium on Usable Privacy and Security (SOUPS). 2023. ( Abstract, PDF )

  • Emilee Rader. “Normative and Non-Social Beliefs about Sensor Data: Implications for Collective Privacy Management.” Symposium on Usable Privacy and Security (SOUPS). 2022. ( Abstract, Link, PDF )

  • Hautea, S., Nthala, N., Kollig, F., Ferraz, J.M., and Rader, E. ““Assertive driver, I can imagine that”: Interpretations of Inferences from Driving Data.” Poster in 2021 Symposium on Usable Privacy and Security. 2021. ( Abstract, PDF, Poster )

  • Nthala, N. and Rader, E. “Towards a Conceptual Model for Provoking Privacy Speculation.” Poster in 2020 Symposium on Usable Privacy and Security. 2020. ( Abstract, Link, PDF, Poster )

  • Emilee Rader, Samantha Hautea and Anjali Munasinghe. “I Have a Narrow Thought Process: Constraints on Explanations Connecting Inferences and Self-Perceptions.” Symposium on Usable Privacy and Security (SOUPS). 2020. [IAPP SOUPS Privacy Award] ( Abstract, Link, PDF )

  • Hautea, S., Munasinghe, A., and Rader, E. “That’s Not Me: Surprising Algorithmic Inferences.” Poster in Extended Abstracts of the 2020 CHI Conference On Human Factors In Computing Systems. 2020. DOI: 10.1145/3334480.3382816. ( Abstract, PDF, Poster )

  • Nthala, N. and Rader, E. “Towards a Conceptual Model for Provoking Privacy Speculation.” Poster in Extended Abstracts of the 2020 CHI Conference On Human Factors In Computing Systems. 2020. DOI: 10.1145/3334480.3382816. ( Abstract, PDF )

  • Emilee Rader and Janine Slaker. “The Importance of Visibility for Folk Theories of Sensor Data.” Symposium on Usable Privacy and Security (SOUPS). Santa Clara, CA. July 2017. ( Abstract, Link, PDF )

  • Yumi Jung and Emilee Rader. “The Imagined Audience and Privacy Concern on Facebook: Differences Between Producers and Consumers.” Social Media + Society. Vol. 2 No. 2. 2016. DOI: 10.1177/2056305116644615. ( Abstract, Link, PDF )

Misdirected Email

Email is an essential tool for communication and social interaction that has been repurposed over the years as a broadcast medium connecting businesses with their customers, and as an authentication mechanism. These uses are enabled by the fact that the only barrier to reaching someone by email is knowing their email address. This feature has made email an attractive platform for scammers and a vector for security threats, but it also has another side effect that is becoming increasingly common: misdirected email, or legitimate email that is intended for somebody else but is sent to the wrong recipient. At best, misdirected email contributes to email overload; at worst, it can reveal sensitive private information to unintended third parties. I am studying what causes misdirected email to be sent, how people on the receiving end deal with it, and what might be done to help both senders and recipients prevent it.

Algorithmic Curation in Social Media

Socio-technical systems provide access to ever-increasing quantities of information online. To help people cope with information overload, these systems implement algorithmic curation: automated selection of what content should be displayed to users, what should be hidden, and in what order it should be presented. Virtually every Internet user who reads online news, visits social media sites, or uses a search engine has encountered algorithmic curation at some point, probably without even realizing it.

Personalization algorithms are a necessary and beneficial part of the infrastructure of a socio-technical system. But because algorithmic curation is invisible, users do not know the extent to which their choices about what to read or whom to communicate with in social media are constrained. For example, a feedback loop inherent in the rules that prioritize Facebook posts for display could unintentionally shape which Facebook Friends users stay in touch with. How might properties of human communication and behavior interact with filtering algorithms to shape information access and use in increasingly connected and automated online environments?

This project was supported by the National Science Foundation under Grant No. IIS-1217212.

  • Emilee Rader, Kelley Cotter and Janghee Cho. “Explanations as Mechanisms for Supporting Algorithmic Transparency.” CHI 2018. Montreal, Quebec, Canada. April 2018. DOI: 10.1145/3173574.3173677. ( Abstract, PDF, ACM DL, Appendix )

  • Kelley Cotter, Janghee Cho, and Emilee Rader. “Explaining the News Feed Algorithm: An Analysis of the News Feed FYI Blog.” Poster in CHI 2017 Extended Abstracts. Denver, CO. May 2017. DOI: 10.1145/3027063.3053114. ( Abstract, PDF )

  • Emilee Rader. “Examining User Surprise as a Symptom of Algorithmic Filtering.” International Journal of Human-Computer Studies. Vol. 98 pp. 72-88. 2017. DOI: 10.1016/j.ijhcs.2016.10.005. ( Abstract, Link, PDF )

  • Emilee Rader and Rebecca Gray. “Understanding User Beliefs About Algorithmic Curation in the Facebook News Feed.” CHI 2015. Seoul, Korea. April 2015. DOI: 10.1145/2702123.2702174. ( Abstract, PDF, ACM DL )

  • Emilee Rader, Velasquez, A., Hales, K., and Kwok, H. “The Gap Between Producer Intentions and Consumer Behavior in Social Media.” Proceedings of the 17th ACM International Conference on Supporting Group Work (GROUP). Sanibel Island, FL. October 2012. DOI: 10.1145/2389176.2389213. ( Abstract, PDF, ACM DL )

Mental Models of Computer Security

Everyone who installs software or apps on a computing device, uses email and social media, and surfs the web makes computer security decisions as part of these activities, whether or not they are aware they are doing so. However, it is very difficult for users to learn about the consequences of these decisions. Users may not know when an action has an adverse security consequence, because outcomes are often delayed or invisible.

For example, when a user clicks on a shady link in an email message, they may not immediately recognize this as a risky activity. Later on, if they learn that their account has been compromised, they may not be able to associate that feedback with the action that triggered the breach. If users can’t learn from direct feedback as part of their experiences, then how do they develop the mental models they use to make security-related decisions, and how do these mental models correspond with security-related behaviors?

This project was supported by the National Science Foundation under Grant No. CNS-1115926.

  • Wash, R. and Rader, E. “Prioritizing security over usability: Strategies for how people choose passwords.” Poster in 2021 Symposium on Usable Privacy and Security. 2021. ( Poster )

  • Rick Wash and Emilee Rader. “Prioritizing Security over Usability: Strategies for How People Choose Passwords.” Journal of Cybersecurity. Vol. 7 No. 1. 2021. DOI: 10.1093/cybsec/tyab012. ( Abstract, Data )

  • Rick Wash, Emilee Rader, and Chris Fennell. “Can People Self-Report Security Accurately? Agreement Between Self-Report and Behavioral Measures.” CHI 2017. Denver, CO. May 2017. DOI: 10.1145/3025453.3025911. ( Abstract, PDF, ACM DL, Data )

  • Emilee Rader and Rick Wash. “Influencing Mental Models of Security.” Poster in NSF Secure and Trustworthy Cyberspace PI Meeting. Arlington, VA. January 2017. ( PDF )

  • Rick Wash and Emilee Rader. “Human Interdependencies in Security Systems.” CCC Visioning Workshop on Grand Challenges in Sociotechnical Cybersecurity. 2016. ( PDF )

  • Rick Wash, Emilee Rader, Ruthie Berman, and Zac Wellmer. “Understanding Password Choices: How Frequently Entered Passwords are Re-used Across Websites.” Symposium on Usable Privacy and Security (SOUPS). Denver, CO. June 2016. ( Abstract, Link, PDF, Data )

  • Emilee Rader and Rick Wash. “Identifying Patterns in Informal Sources of Security Information.” Journal of Cybersecurity. Vol. 1 No. 1. 2015. DOI: 10.1093/cybsec/tyv008. ( Abstract, Link, PDF )

  • Rick Wash and Emilee Rader. “Too Much Knowledge? Security Beliefs and Protective Behaviors Among United States Internet Users.” Symposium on Usable Privacy and Security (SOUPS). Ottawa, Canada. July 2015. ( Abstract, Link, PDF, Data )

  • Katie Hoban, Emilee Rader, Rick Wash, and Kami Vaniea. “Computer Security Information in Stories, News Articles, and Education Documents.” Poster in Symposium on Usable Privacy and Security (SOUPS). July 2014. [Distinguished Poster Award] ( PDF )

  • Rick Wash, Emilee Rader, Kami Vaniea, and Michelle Rizor. “Out of the Loop: How Automated Software Updates Cause Unintended Security Consequences.” Symposium on Usable Privacy and Security (SOUPS). Menlo Park, CA. July 2014. ( Abstract, Link, PDF )

  • Kami Vaniea, Emilee Rader and Rick Wash. “Mental Models of Software Updates.” International Communication Association. Seattle, WA. May 2014. ( Abstract, PDF )

  • Kami Vaniea, Emilee Rader, and Rick Wash. “Betrayed By Updates: How Negative Experiences Affect Future Security.” CHI 2014. Toronto, Canada. April 2014. DOI: 10.1145/2556288.2557275. ( Abstract, PDF, ACM DL )

  • Rizor, M., Vaniea, K., Rader, E., and Wash, R. “Out of the Loop: How Automated Software Updates Cause Unintended Security Consequences.” Poster in MSU Cyberinfrastructure Days 2013. East Lansing, MI. October 2013. [Winner of the 2nd place prize for Best Poster] ( Poster )

  • Rick Wash and Emilee Rader. “Folk Models of Home Computer Security.” In The Death of the Internet, Edited by Markus Jakobsson. Wiley. June 2012. ISBN 978-1118062418 ( Link )

  • Emilee Rader, Rick Wash and Brandon Brooks. “Stories as Informal Lessons About Security.” Symposium on Usable Privacy and Security (SOUPS). Washington, DC. July 2012. DOI: 10.1145/2335356.2335364. ( Abstract, PDF, ACM DL, Data )

  • Rick Wash and Emilee Rader. “Influencing Mental Models of Security.” Proceedings of the New Security Paradigms Workshop (NSPW). Marshall, CA. September 2011. DOI: 10.1145/2073276.2073283. ( Abstract, PDF, ACM DL )