

Title: How Developers Talk About Personal Data and What It Means for User Privacy: A Case Study of a Developer Forum on Reddit
While online developer forums are major knowledge resources for application developers, their role in promoting better privacy practices remains underexplored. In this paper, we conducted a qualitative analysis of a sample of 207 threads (4,772 unique posts) mentioning different forms of personal data from the /r/androiddev forum on Reddit. We started with bottom-up open coding of the sampled posts to develop a typology of discussions about personal data use, then conducted follow-up analyses to understand what types of posts elicited in-depth discussion of privacy issues or mentioned risky data practices. Our results show that Android developers rarely discussed privacy concerns when talking about a specific app design or implementation problem, but often had active discussions around privacy when stimulated by certain external events representing new privacy-enhancing restrictions from the Android operating system, app store policies, or privacy laws. Developers often felt these restrictions imposed considerable costs on them while yielding no compelling benefit. Given these results, we present a set of suggestions for the Android OS and the app store to design more effective methods to enhance privacy, and for developer forums (e.g., /r/androiddev) to encourage more in-depth privacy discussions and nudge developers to think more about privacy.
Award ID(s): 1704087
NSF-PAR ID: 10285064
Author(s) / Creator(s): ; ; ;
Date Published:
Journal Name: Proceedings of the ACM on Human-Computer Interaction
Volume: 4
Issue: CSCW3
ISSN: 2573-0142
Page Range / eLocation ID: 1 to 28
Format(s): Medium: X
Sponsoring Org: National Science Foundation
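The abstract above describes the sampling step only briefly. As a rough illustration of how threads mentioning personal data might be pulled from /r/androiddev for qualitative coding, here is a minimal sketch using the PRAW library; the keyword list, credentials, and thread limit are assumptions for illustration, not the paper's actual collection protocol.

```python
# Hypothetical sketch of the sampling step: collect /r/androiddev threads
# whose text mentions a form of personal data. The keyword list and
# credentials below are illustrative assumptions, not the paper's protocol.
import praw

PERSONAL_DATA_KEYWORDS = [
    "location", "contacts", "email address", "advertising id",
    "imei", "phone number", "personal data", "user data",
]

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",          # placeholder credentials
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="personal-data-thread-sampler/0.1",
)

def mentions_personal_data(text: str) -> bool:
    """True if any personal-data keyword appears in the text."""
    lowered = text.lower()
    return any(kw in lowered for kw in PERSONAL_DATA_KEYWORDS)

sampled_threads = []
for submission in reddit.subreddit("androiddev").new(limit=1000):
    body = f"{submission.title}\n{submission.selftext or ''}"
    if mentions_personal_data(body):
        sampled_threads.append(submission)

print(f"Sampled {len(sampled_threads)} candidate threads for coding")
```

Keyword filtering like this would only produce candidates; a qualitative study such as the one described above would still screen and open-code the resulting threads by hand.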
More Like this
  1. The European General Data Protection Regulation (GDPR) mandates that a data controller (e.g., an app developer) provide all information specified in Articles (Arts.) 13 and 14 to data subjects (e.g., app users) regarding how their data are being processed and what their rights are. While some studies have started to detect the fulfillment of GDPR requirements in privacy policies, they have focused on only a subset of the mandatory GDPR requirements. In this paper, our goal is to explore the state of GDPR-completeness violations in mobile apps' privacy policies. To achieve this goal, we design the PolicyChecker framework, which takes a rule- and semantic-role-based approach. PolicyChecker automatically detects completeness violations in privacy policies based not only on all mandatory GDPR requirements but also on all if-applicable GDPR requirements, which become mandatory under specific conditions. Using PolicyChecker, we conduct the first large-scale GDPR-completeness violation study on 205,973 privacy policies of Android apps in the UK Google Play store. PolicyChecker identified 163,068 (79.2%) privacy policies containing data collection statements; such policies are therefore regulated by GDPR requirements. However, the majority (99.3%) of them failed to achieve GDPR-completeness, with at least one unsatisfied requirement; 98.1% had at least one unsatisfied mandatory requirement, and 73.0% had at least one unsatisfied if-applicable requirement logic chain. We conjecture that controllers' lack of understanding of some GDPR requirements and their poor practices in composing privacy policies are likely the major causes behind the GDPR-completeness violations. We further discuss recommendations for app developers to improve the completeness of their apps' privacy policies, providing a more transparent personal-data-processing environment to users.
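PolicyChecker's internals are not spelled out in this abstract, but the general shape of a rule-based completeness check can be sketched: each GDPR requirement is paired with a matching rule, and a policy fails if any mandatory rule, or any triggered if-applicable rule, goes unmatched. The requirement set and regex rules below are simplified assumptions, not PolicyChecker's actual rules.

```python
# Hypothetical rule-based completeness check in the spirit of PolicyChecker.
# The requirements and regexes are simplified illustrations; the real tool
# covers all Arts. 13/14 items and uses semantic-role analysis, not regexes.
import re
from dataclasses import dataclass

@dataclass
class Requirement:
    name: str
    pattern: re.Pattern                 # crude stand-in for a semantic-role rule
    mandatory: bool = True
    trigger: re.Pattern | None = None   # if-applicable: checked only when triggered

REQUIREMENTS = [
    Requirement("controller identity",
                re.compile(r"\b(we are|operated by|controller)\b", re.I)),
    Requirement("purposes of processing",
                re.compile(r"\b(purpose|use your (data|information) to)\b", re.I)),
    Requirement("data subject rights",
                re.compile(r"\bright to (access|erasure|rectification)\b", re.I)),
    # If-applicable: becomes mandatory once the policy mentions sharing.
    Requirement("recipients of data",
                re.compile(r"\b(recipients|shared with)\b", re.I),
                mandatory=False,
                trigger=re.compile(r"\b(third part(y|ies)|share)\b", re.I)),
]

def unsatisfied_requirements(policy_text: str) -> list[str]:
    """Return the names of requirements the policy fails to satisfy."""
    missing = []
    for req in REQUIREMENTS:
        applicable = req.mandatory or (req.trigger and req.trigger.search(policy_text))
        if applicable and not req.pattern.search(policy_text):
            missing.append(req.name)
    return missing

policy = "We share data with third parties to improve the service."
print(unsatisfied_requirements(policy))
# A non-empty list marks the policy as GDPR-incomplete under these toy rules.
```

The if-applicable case is modeled here as a trigger pattern: the requirement is enforced only once the policy's own text indicates the condition (third-party sharing, in this toy example) holds, mirroring the "mandatory under specific conditions" logic chains the abstract describes.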
  2. Health data is considered sensitive and personal; both governments and software platforms have enacted specific measures to protect it. Consumer apps that collect health data are becoming more popular but raise new privacy concerns, as they collect unnecessary data, share it with third parties, and track users. However, developers of these apps are not necessarily knowingly endangering users' privacy; some may simply face challenges working with health features. To scope these challenges, we qualitatively analyzed 269 privacy-related posts on Stack Overflow by developers of health apps for Android- and iOS-based systems. We found that health-specific access control structures (e.g., enhanced requirements for permissions and authentication) underlie several privacy-related challenges developers face. The specific nature of the problems often differed between the platforms: for example, additional verification steps for Android developers, or confusing feedback about incorrectly formulated permission scopes for iOS. Developers also face problems introduced by third-party libraries. Official documentation plays a key part in understanding privacy requirements, but in some cases may itself cause confusion. We discuss the implications of our findings and propose ways to improve developers' experience of working with health-related features, and consequently the privacy of their apps' end users.
  3. The dominant privacy framework of the information age relies on notions of "notice and consent." That is, service providers disclose, often through privacy policies, their data collection practices, and users can then consent to their terms. However, it is unlikely that most users comprehend these disclosures, which is due in no small part to ambiguous, deceptive, and misleading statements. By comparing actual collection and sharing practices to disclosures in privacy policies, we demonstrate the scope of the problem. Through analysis of 68,051 apps from the Google Play Store, their corresponding privacy policies, and observed data transmissions, we investigated potential misrepresentations of apps in the Designed for Families (DFF) program, inconsistencies in disclosures regarding third-party data sharing, and contradictory disclosures about secure data transmissions. We find that of the 8,030 DFF apps (i.e., apps directed at children), 9.1% claim that their apps are not directed at children, while 30.6% claim to have no knowledge that the received data comes from children. In addition, we observe that 10.5% of the 68,051 apps share personal identifiers with third-party service providers yet do not declare any in their privacy policies, and only 22.2% of the apps explicitly name third parties. This ultimately makes it not only difficult, but in most cases impossible, for users to establish where their personal data is being processed. Furthermore, we find that 9,424 apps do not use TLS when transmitting personal identifiers, yet 28.4% of these apps claim to take measures to secure data transfer. Ultimately, these divergences between disclosures and actual app behaviors illustrate the ridiculousness of the notice and consent framework.
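One of the checks described above, flagging apps that transmit identifiers to third parties never named in their policies, reduces to a set comparison between observed recipients and declared recipients. Below is a hypothetical sketch of that comparison; the domain lists and the matching rule are placeholders for the paper's actual traffic analysis and policy parsing.

```python
# Hypothetical sketch of a policy-vs-behavior consistency check: flag
# third-party recipients observed in network traffic but never declared
# in the privacy policy. Inputs are placeholders for the paper's actual
# traffic captures and policy text extraction.
OBSERVED_RECIPIENTS = {          # domains seen receiving a persistent identifier
    "graph.facebook.com",
    "app-measurement.com",
    "api.example-analytics.net",
}

DECLARED_THIRD_PARTIES = {       # third parties explicitly named in the policy
    "facebook",
}

def undisclosed_recipients(observed: set[str], declared: set[str]) -> set[str]:
    """Recipients whose domain mentions none of the declared third-party names."""
    return {
        domain for domain in observed
        if not any(name in domain for name in declared)
    }

print(undisclosed_recipients(OBSERVED_RECIPIENTS, DECLARED_THIRD_PARTIES))
# {'app-measurement.com', 'api.example-analytics.net'} -> potential
# contradictions between disclosure and behavior.
```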
  4. Traditional parental control applications designed to protect children and teens from online risks do so through parental restrictions and privacy-invasive monitoring. We propose a new approach to adolescent online safety that aims to strike a balance between a teen's privacy and their online safety through active communication and fostering trust between parents and children. We designed and developed an Android "app" called Circle of Trust and conducted a mixed-methods user study of 17 parent-child pairs to understand their perceptions of the app. Using a within-subjects experimental design, we found that parents and children significantly preferred our new app design over existing parental control apps in terms of perceived usefulness, ease of use, and behavioral intent to use. By applying a lens of Value Sensitive Design to our interview data, we uncovered that parents and children who valued privacy, trust, freedom, and balance of power preferred our app over traditional apps. However, those who valued transparency and control preferred the status quo. Overall, we found that our app was better suited for teens than for younger children.
  5. In-app privacy notices can help smartphone users make informed privacy decisions. However, they are rarely used in real-world apps, since developers often lack the knowledge, time, and resources to design and implement them well. We present Honeysuckle, a programming tool that helps Android developers build in-app privacy notices using an annotation-based code generation approach, facilitated by an IDE plugin, a build system plugin, and a library. We conducted a within-subjects study with 12 Android developers to evaluate Honeysuckle. Each participant was asked to implement privacy notices for two popular open-source apps, using both the Honeysuckle library (as a baseline) and the annotation-based approach. Our results show that the annotation-based approach helps developers accomplish the task faster and with significantly lower cognitive load. Developers preferred the annotation-based approach over the library approach because it was much easier to learn and use, and because it allowed them to implement various types of privacy notices in a unified code format, which can enhance code readability and benefit team collaboration.
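Honeysuckle itself targets Android code (Java/Kotlin annotations plus IDE and build tooling). As a language-agnostic illustration of the annotation-driven idea only, the sketch below uses Python decorators to register data-access points and generate a notice from them; this is an analogue, not Honeysuckle's actual API, and every name in it is hypothetical.

```python
# Language-agnostic analogue of annotation-driven privacy notices:
# a decorator registers each data-access point, and a notice is
# generated from the registry. Honeysuckle's real interface is a set
# of Java/Kotlin annotations plus build tooling; this is not it.
from functools import wraps

_DATA_ACCESS_REGISTRY: list[dict] = []

def uses_personal_data(data_type: str, purpose: str):
    """Decorator standing in for a Honeysuckle-style source annotation."""
    def decorator(func):
        _DATA_ACCESS_REGISTRY.append(
            {"function": func.__name__, "data_type": data_type, "purpose": purpose}
        )
        @wraps(func)
        def wrapper(*args, **kwargs):
            return func(*args, **kwargs)
        return wrapper
    return decorator

@uses_personal_data(data_type="location", purpose="nearby search")
def fetch_location():
    ...

@uses_personal_data(data_type="contacts", purpose="friend suggestions")
def read_contacts():
    ...

def generate_privacy_notice() -> str:
    """Render an in-app notice from the annotated access points."""
    lines = ["This app accesses:"]
    for entry in _DATA_ACCESS_REGISTRY:
        lines.append(f"- {entry['data_type']} for {entry['purpose']}")
    return "\n".join(lines)

print(generate_privacy_notice())
```

The appeal of the annotation style, as the abstract reports, is that the notice stays next to the code that accesses the data, so the two are less likely to drift apart than with hand-written library calls.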