Search for: All records

Creators/Authors contains: "Schaub, Florian"


  1. Free, publicly accessible full text available April 29, 2023
  2. Technology plays an increasingly salient role in facilitating intimate partner violence (IPV). Customer support teams at computer security companies are receiving cases that involve tech-enabled IPV but might not be well equipped to handle them. To assess customer support’s existing practices and identify areas for improvement, we conducted five focus groups with professionals who work with IPV survivors (n=17). IPV professionals made numerous suggestions, such as using trauma-informed language, avoiding promises to solve problems, and making referrals to resources and support organizations. To evaluate the practicality of these suggestions, we conducted four focus groups with customer support practitioners (n=11). Support practitioners expressed interest in training agents for IPV cases, but mentioned challenges in identifying potential survivors and frontline agents’ limited capacity to help. We conclude with recommendations for computer security companies to better address tech-enabled IPV by training support agents, tracking the prevalence of these cases, and establishing partnerships with IPV advocates.
  3. Increasingly, icons are being proposed to concisely convey privacy-related information and choices to users. However, complex privacy concepts can be difficult to communicate. We investigate which icons effectively signal the presence of privacy choices. In a series of user studies, we designed and evaluated icons and accompanying textual descriptions (link texts) conveying choice, opting-out, and sale of personal information — the latter an opt-out mandated by the California Consumer Privacy Act (CCPA). We identified icon-link text pairings that conveyed the presence of privacy choices without creating misconceptions, with a blue stylized toggle icon paired with “Privacy Options” performing best. The two CCPA-mandated link texts (“Do Not Sell My Personal Information” and “Do Not Sell My Info”) accurately communicated the presence of do-not-sell opt-outs with most icons. Our results provide insights for the design of privacy choice indicators and highlight the necessity of incorporating user testing into policy making.
  4. Website privacy policies sometimes provide users the option to opt out of certain collections and uses of their personal data. Unfortunately, many privacy policies bury these instructions deep in their text, and few web users have the time or skill necessary to discover them. We describe a method for the automated detection of opt-out choices in privacy policy text and their presentation to users through a web browser extension. We describe the creation of two corpora of opt-out choices, which enable the training of classifiers to identify opt-outs in privacy policies. Our overall approach for extracting and classifying opt-out choices combines heuristics to identify commonly found opt-out hyperlinks with supervised machine learning to automatically identify less conspicuous instances. Our approach achieves a precision of 0.93 and a recall of 0.9. We introduce Opt-Out Easy, a web browser extension designed to present available opt-out choices to users as they browse the web. We evaluate the usability of our browser extension with a user study. We also present results of a large-scale analysis of opt-outs found in the text of thousands of the most popular websites. (A brief illustrative sketch of this two-stage idea follows this list.)
  5. Trustworthy data repositories ensure the security of their collections. We argue they should also ensure the security of researcher and human subject data. Here we demonstrate the use of a privacy impact assessment (PIA) to evaluate potential privacy risks to researchers using the ICPSR’s Open Badges Research Credential System as a case study. We present our workflow and discuss potential privacy risks and mitigations for those risks. [This paper is a conference pre-print presented at IDCC 2020 after lightweight peer review.]
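Item 4 above describes combining hyperlink heuristics with supervised machine learning to surface opt-out choices in privacy policies. The Python sketch below is only a minimal illustration of that general two-stage idea, assuming toy regex patterns, placeholder training sentences, and an off-the-shelf scikit-learn pipeline; it is not the corpora, features, or models used in the paper or in Opt-Out Easy.

    # Illustrative sketch only: hypothetical patterns and toy training data,
    # showing the general idea of heuristics for conspicuous opt-out links
    # plus a supervised text classifier for less obvious instances.
    import re

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Stage 1: heuristics for commonly worded opt-out hyperlinks (assumed patterns).
    OPT_OUT_PATTERNS = re.compile(
        r"\b(opt[- ]?out|do not sell|unsubscribe|ad (choices|preferences))\b",
        re.IGNORECASE,
    )

    def heuristic_match(link_text: str) -> bool:
        """Return True if the hyperlink text matches an obvious opt-out phrase."""
        return bool(OPT_OUT_PATTERNS.search(link_text))

    # Stage 2: supervised classifier over policy sentences.
    # Placeholder examples; a real corpus would contain thousands of labeled sentences.
    train_sentences = [
        "You may opt out of interest-based advertising by visiting this page.",
        "To stop receiving marketing emails, adjust your communication settings.",
        "We use cookies to improve site performance.",
        "Our service is provided under the following terms.",
    ]
    train_labels = [1, 1, 0, 0]  # 1 = contains an opt-out choice, 0 = does not

    classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    classifier.fit(train_sentences, train_labels)

    def detect_opt_out(sentence: str, link_text: str = "") -> bool:
        """Flag a policy sentence as an opt-out choice via heuristics or the classifier."""
        if link_text and heuristic_match(link_text):
            return True
        return bool(classifier.predict([sentence])[0])

    if __name__ == "__main__":
        print(detect_opt_out(
            "Residents of California may direct us not to sell their personal information.",
            link_text="Do Not Sell My Personal Information",
        ))

In this toy setup, the heuristic stage catches conspicuous link texts such as the CCPA-mandated “Do Not Sell My Personal Information,” while the classifier handles sentences without obvious cues. The precision and recall figures reported in the abstract come from the authors’ own corpora and models, not from this sketch.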