

Search for: All records where Creators/Authors contains: "Schaub, Florian"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. Companies' privacy policies and their contents are analyzed for many reasons: to assess the readability, usability, and utility of privacy policies; to extract and analyze the data practices of apps and websites; to assess companies' compliance with relevant laws and with their own privacy policies; and to develop tools and machine learning models that summarize and read policies. Despite strong interest in studying privacy policies among researchers, regulators, and privacy activists, few best practices or common approaches have emerged, and infrastructure and tool support are scarce or scattered. To provide insight into how researchers study privacy policies and the challenges they face in doing so, we conducted 26 interviews with researchers from various disciplines who have conducted research on privacy policies. We provide insights on a range of challenges around policy selection, policy retrieval, and policy content analysis, as well as multiple overarching challenges researchers experienced across the research process. Based on our findings, we discuss opportunities to better facilitate privacy policy research, including research directions for methodologically advancing privacy policy analysis, potential structural changes around privacy policies, and avenues for fostering an interdisciplinary research community and maturing the field.
    Free, publicly-accessible full text available October 1, 2024
  2. Free, publicly-accessible full text available May 1, 2024
  3. Workplaces are increasingly adopting emotion AI, promising benefits to organizations. However, little is known about the perceptions and experiences of workers subject to emotion AI in the workplace. Our interview study with US adult workers (n=15) addresses this gap, finding that (1) participants viewed emotion AI as a deep violation of privacy over workers' sensitive emotional information; (2) emotion AI may function to enforce workers' compliance with emotional labor expectations, and workers may engage in emotional labor as a mechanism to preserve privacy over their emotions; and (3) workers may be exposed to a wide range of harms as a consequence of emotion AI in the workplace. These findings reveal the need to recognize and define an individual right to what we introduce as emotional privacy, and they raise important research and policy questions about how to protect and preserve emotional privacy within and beyond the workplace.
  4. Technology plays an increasingly salient role in facilitating intimate partner violence (IPV). Customer support teams at computer security companies are receiving cases that involve tech-enabled IPV but might not be well equipped to handle them. To assess customer support's existing practices and identify areas for improvement, we conducted five focus groups with professionals who work with IPV survivors (n=17). IPV professionals made numerous suggestions, such as using trauma-informed language, avoiding promises to solve problems, and making referrals to resources and support organizations. To evaluate the practicality of these suggestions, we conducted four focus groups with customer support practitioners (n=11). Support practitioners expressed interest in training agents for IPV cases, but noted challenges in identifying potential survivors and frontline agents' limited capacity to help. We conclude with recommendations for computer security companies to better address tech-enabled IPV by training support agents, tracking the prevalence of these cases, and establishing partnerships with IPV advocates.
  5. Increasingly, icons are being proposed to concisely convey privacy-related information and choices to users. However, complex privacy concepts can be difficult to communicate. We investigate which icons effectively signal the presence of privacy choices. In a series of user studies, we designed and evaluated icons and accompanying textual descriptions (link texts) conveying choice, opting out, and sale of personal information — the latter an opt-out mandated by the California Consumer Privacy Act (CCPA). We identified icon-link text pairings that conveyed the presence of privacy choices without creating misconceptions, with a blue stylized toggle icon paired with "Privacy Options" performing best. The two CCPA-mandated link texts ("Do Not Sell My Personal Information" and "Do Not Sell My Info") accurately communicated the presence of do-not-sell opt-outs with most icons. Our results provide insights for the design of privacy choice indicators and highlight the necessity of incorporating user testing into policy making.
  6. Website privacy policies sometimes provide users the option to opt-out of certain collections and uses of their personal data. Unfortunately, many privacy policies bury these instructions deep in their text, and few web users have the time or skill necessary to discover them. We describe a method for the automated detection of opt-out choices in privacy policy text and their presentation to users through a web browser extension. We describe the creation of two corpora of opt-out choices, which enable the training of classifiers to identify opt-outs in privacy policies. Our overall approach for extracting and classifying opt-out choices combines heuristics to identify commonly found opt-out hyperlinks with supervised machine learning to automatically identify less conspicuous instances. Our approach achieves a precision of 0.93 and a recall of 0.9. We introduce Opt-Out Easy, a web browser extension designed to present available opt-out choices to users as they browse the web. We evaluate the usability of our browser extension with a user study. We also present results of a large-scale analysis of opt-outs found in the text of thousands of the most popular websites. 
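The abstract above describes combining heuristics for conspicuous opt-out hyperlinks with supervised machine learning for less obvious instances. As a rough illustration only — not the authors' actual pipeline — the two-stage idea can be sketched with made-up patterns and toy weights standing in for a trained classifier:

```python
# Illustrative sketch of a two-stage opt-out detector (hypothetical
# patterns and weights; the real system trains a classifier on
# labeled corpora of privacy policy text).
import re

# Stage 1: heuristics for commonly found opt-out hyperlinks.
OPT_OUT_LINK = re.compile(r"opt[\s-]?out|do not sell|unsubscribe", re.I)

def heuristic_links(html: str) -> list[str]:
    """Return anchor texts that match common opt-out phrasings."""
    anchors = re.findall(r"<a[^>]*>(.*?)</a>", html, re.I | re.S)
    return [a.strip() for a in anchors if OPT_OUT_LINK.search(a)]

# Stage 2: a keyword-weight scorer standing in for a supervised model
# that catches less conspicuous opt-out descriptions in policy text.
WEIGHTS = {"choice": 0.4, "opt": 0.6, "email": 0.3, "disable": 0.5,
           "preferences": 0.4, "cookies": 0.2}

def classify_sentence(sentence: str, threshold: float = 0.5) -> bool:
    """Score a policy sentence; True means 'likely describes an opt-out'."""
    tokens = re.findall(r"[a-z]+", sentence.lower())
    score = sum(WEIGHTS.get(t, 0.0) for t in tokens)
    return score >= threshold

html = '<a href="/privacy">Privacy Policy</a> <a href="/dns">Do Not Sell My Info</a>'
print(heuristic_links(html))  # ['Do Not Sell My Info']
print(classify_sentence("You may opt out of marketing email"))  # True
```

A real implementation would replace the keyword weights with a model trained on the labeled opt-out corpora the abstract describes, which is how the reported precision/recall figures become meaningful.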
  7. Trustworthy data repositories ensure the security of their collections. We argue they should also ensure the security of researcher and human subject data. Here we demonstrate the use of a privacy impact assessment (PIA) to evaluate potential privacy risks to researchers using the ICPSR’s Open Badges Research Credential System as a case study. We present our workflow and discuss potential privacy risks and mitigations for those risks. [This paper is a conference pre-print presented at IDCC 2020 after lightweight peer review.] 