Most police searches today are authorized by citizens' consent, rather than probable cause or reasonable suspicion. The main constitutional limitation on so‐called “consent searches” is the voluntariness test: whether a reasonable person would have felt free to refuse the officer's request to conduct the search. We investigate whether this legal inquiry is subject to a systematic bias whereby uninvolved decision‐makers overstate the voluntariness of consent and underestimate the psychological pressure individuals feel to comply. We find evidence for a robust bias extending to requests, tasks, and populations that have not been examined previously. Across three pre‐registered experiments, we approached participants (“Experiencers”) with intrusive search requests and measured their behavioral compliance and self‐reported feelings of psychological freedom. Another group of participants (“Forecasters”) reported whether they would comply if hypothetically placed in the same situation. Study 1 investigated participants' willingness to allow experimenters access to their unlocked personal smartphones in order to read through the search histories on their web browsers—a private sphere where many individuals feel they have something to hide. Results revealed that whereas 27% of Forecasters reported they would permit such a search, 92% of Experiencers complied when asked. Study 2 replicated this underestimation‐of‐compliance effect when individuals were asked to permit a search of their purses, backpacks, and other bags—traditional searches not eligible for the heightened legal protection extended to digital devices. Study 3 replicated the gap between Forecasters' projections and Experiencers' behavior in a more representative sample, and found it persists even when participants' predictions are incentivized monetarily.
- Award ID(s): 1823661
- NSF-PAR ID: 10098560
- Date Published:
- Journal Name: The Yale Law Journal
- Volume: 128
- Issue: 7
- ISSN: 1939-8611
- Page Range / eLocation ID: 1962-2033
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Abstract: As data privacy continues to be a crucial human-rights concern, as recognized by the UN, regulatory agencies have demanded that developers obtain user permission before accessing sensitive user data. Developers fulfill their legal obligation to keep users abreast of requests for their data mainly through privacy policy statements. In addition, platforms such as Android enforce explicit permission requests through the permission model. Nonetheless, recent research has shown that service providers rarely make full disclosure when requesting data in these statements, nor is the current permission model designed to provide adequate informed consent: users often have no clear understanding of the reason for, or scope of use of, a data request. This paper proposes an unambiguous informed-consent process that gives developers a standardized method for declaring intent. The proposed intent-aware permission architecture extends the current Android permission model with a precise mechanism for full disclosure of purpose and scope limitation; its design is based on an ontology study of the purposes of data requests. The overarching objective of this model is to ensure end users are adequately informed before making decisions about their data. Additionally, the model has the potential to improve trust between end users and developers.
-
Abstract: Having the means to share research data openly is essential to modern science. For human research, a key aspect of this endeavor is obtaining consent from participants, not just to take part in a study, which is a basic ethical principle, but also to share their data with the scientific community. To ensure that participants' privacy is respected, national and/or supranational regulations and laws are in place. It is, however, not always clear to researchers what their implications are, nor how to comply with them. The Open Brain Consent (https://open-brain-consent.readthedocs.io) is an international initiative that aims to provide researchers in the brain imaging community with information about data-sharing options and tools. We present here a short history of this project and its latest developments, and share pointers to consent forms, including a template consent form that is compliant with the EU General Data Protection Regulation. We also share pointers to an associated data user agreement that is useful not only in the EU context, but also for any researchers dealing with personal (clinical) data elsewhere.
-
Recent data protection regulations (notably, GDPR and CCPA) grant consumers various rights, including the right to access, modify, or delete any personal information collected about them (and retained) by a service provider. To exercise these rights, one must submit a verifiable consumer request proving that the collected data indeed pertains to them. This action is straightforward for consumers who had active accounts with a service provider at the time of data collection, since they can use standard (e.g., password-based) means of authentication to validate their requests. However, a major conundrum arises from the need to support consumers without accounts in exercising their rights. To this end, some service providers have begun requiring such accountless consumers to reveal and prove their identities (e.g., using government-issued documents, utility bills, or credit card numbers) as part of issuing a verifiable consumer request. While understandable as a short-term fix, this approach is cumbersome and expensive for service providers as well as privacy-invasive for consumers. Consequently, there is a strong need for better means of authenticating requests from accountless consumers. To achieve this, we propose VICEROY, a privacy-preserving and scalable framework for producing proofs of data ownership, which form a basis for verifiable consumer requests. Building upon existing web techniques and features, VICEROY allows accountless consumers to interact with service providers, and later prove that they are the same person in a privacy-preserving manner, while requiring minimal changes for both parties. We design and implement VICEROY with emphasis on security/privacy, deployability, and usability. We also assess its practicality via extensive experiments.
-
Browser users encounter a broad array of potentially intrusive practices: from behavioral profiling, to crypto-mining, fingerprinting, and more. We study people's perception, awareness, understanding, and preferences for opting out of those practices. We conducted a mixed-methods study that included qualitative (n=186) and quantitative (n=888) surveys covering 8 neutrally presented practices, equally highlighting both their benefits and risks. Consistent with prior research focusing on specific practices and mitigation techniques, we observe that most people are unaware of how to effectively identify or control the practices we surveyed. However, our user-centered approach reveals diverse views about the perceived risks and benefits, and shows that the majority of our participants wished both to restrict the surveyed practices and to be explicitly notified about them. Though prior research shows that meaningful controls are rarely available, we found that many participants mistakenly assume opt-out settings are common but simply too difficult to find. However, even if such settings were hypothetically available on every website, our findings suggest that settings which allow practices by default are more burdensome to users than alternatives contextualized to website categories. Our results argue for settings that can distinguish among website categories where certain practices are seen as permissible, proactively notify users about their presence, and otherwise deny intrusive practices by default. Standardizing these settings in the browser, rather than leaving them to individual websites, would provide a uniform interface for notification and control, and could help mitigate dark patterns. We also discuss the regulatory implications of our findings.