Previous studies have demonstrated that privacy issues in mobile apps often stem from the integration of third-party libraries (TPLs). To shed light on factors that contribute to these issues, we investigate the privacy-related configuration choices available to and made by Android app developers who incorporate the Facebook Android SDK and Facebook Audience Network SDK in their apps. We compile these Facebook SDKs' privacy-related settings and their defaults. Employing a multi-method approach that integrates static and dynamic analysis, we analyze more than 6,000 popular apps to determine whether the apps incorporate Facebook SDKs and, if so, whether and how developers modify settings. Finally, we assess how these settings align with the privacy practices that developers disclose in the apps’ privacy labels and policies. We observe widespread inconsistencies between practices and disclosures in popular apps. These inconsistencies often stem from privacy settings, including a substantial number of cases in which apps retain default settings over alternatives that offer greater privacy. We observe fewer possible compliance issues in potentially child-directed apps, but issues persist even in these apps. We discuss remediation strategies that SDK and TPL providers could employ to help developers, particularly developers with fewer resources who rely heavily on SDKs. Our recommendations include aligning default privacy settings with data minimization principles and other conservative practices and making privacy-related SDK information both easier to find and harder to miss.
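For concreteness, the following is a minimal sketch of what overriding such defaults can look like in app code, assuming the runtime setters exposed by recent facebook-android-sdk and Audience Network releases (the same opt-outs can typically also be declared as meta-data entries in AndroidManifest.xml, e.g., com.facebook.sdk.AutoLogAppEventsEnabled); exact flag names and defaults vary by SDK version, so this is illustrative rather than a definitive reference:

```kotlin
import android.app.Application
import com.facebook.FacebookSdk
import com.facebook.ads.AdSettings

// Sketch of a developer opting out of data-collection defaults at startup.
// Setter names reflect recent facebook-android-sdk / Audience Network
// releases; consult the SDK documentation for the version you ship.
class ExampleApp : Application() {
    override fun onCreate() {
        super.onCreate()
        // Automatic app-event logging is enabled by default; turn it off.
        FacebookSdk.setAutoLogAppEventsEnabled(false)
        // Advertising-ID collection is enabled by default; turn it off.
        FacebookSdk.setAdvertiserIDCollectionEnabled(false)
        // Audience Network: flag the app as (also) directed at children,
        // which restricts how its ad requests may be used.
        AdSettings.setMixedAudience(true)
    }
}
```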
The Effect of Platform Policies on App Privacy Compliance: A Study of Child-Directed Apps
Over the past few years, the two dominant app platforms made major improvements to their policies surrounding child-directed apps. While prior work repeatedly demonstrated that privacy issues were prevalent in child-directed apps, it is unclear whether platform policies can lead child-directed apps to comply with privacy requirements, when laws alone have not. To understand the effect of recent changes in platform policies (e.g., whether they result in greater levels of compliance with applicable privacy laws), we conducted a large-scale measurement study of the privacy behaviors of 7,377 child-directed Android apps, as well as a follow-up survey with some of their developers. We observed a drastic decrease in the number of apps that transmitted personal data without verifiable parental consent and an increase in the number of apps that encrypted their transmissions using TLS. However, improper use of third-party SDKs still led to privacy issues (e.g., inaccurate disclosures in apps’ privacy labels). Our analysis of apps’ privacy practices over a period of a few months in 2023 and a comparison of our results with those observed a few years ago demonstrate gradual improvements in apps’ privacy practices over time. We discuss how app platforms can further improve their policies and emphasize the role of enforcement in making such policies effective.
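To make one of these checks concrete, here is a minimal sketch of the kind of post-processing such a measurement applies to traffic captured during dynamic analysis; the record type, field names, and identifier list are hypothetical and do not reflect the authors' actual pipeline:

```kotlin
// Illustrative only: a simplified pass over captured network flows that
// flags personal data transmitted without TLS.
data class Flow(
    val packageName: String,      // app that produced the traffic
    val destinationHost: String,  // e.g., an ad or analytics server
    val usesTls: Boolean,         // whether the connection was encrypted
    val dataTypes: Set<String>,   // identifiers observed in the payload
)

// Hypothetical identifier labels assigned during payload inspection.
val personalData = setOf("aaid", "android_id", "geolocation", "email")

// Group plaintext transmissions of personal identifiers by app.
fun plaintextLeaks(flows: List<Flow>): Map<String, List<Flow>> =
    flows.filter { !it.usesTls && it.dataTypes.any(personalData::contains) }
        .groupBy(Flow::packageName)
```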
- Award ID(s): 2247951
- PAR ID: 10591795
- Publisher / Repository: Proceedings on Privacy Enhancing Technologies
- Journal Name: Proceedings on Privacy Enhancing Technologies
- Volume: 2025
- Issue: 3
- ISSN: 2299-0984
- Page Range / eLocation ID: 170–191
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- While the United States currently has no comprehensive privacy law, the Children’s Online Privacy Protection Act (“COPPA”) has been in effect for over twenty years. As a result, the study of compliance issues among child-directed online services can yield important lessons for future enforcement efforts and can be used to inform the design of future state and federal privacy laws designed to protect people of all ages. This Essay describes relevant research conducted to understand privacy compliance issues and how that has led the author to several recommendations for how privacy enforcement can be improved more generally. While these recommendations are informed by the study of child-directed services’ compliance with COPPA, they are applicable to future state and federal privacy laws aimed at protecting the general public (i.e., not just children). Despite evidence of thousands of COPPA violations (e.g., one study found evidence that a majority of child-directed mobile apps appeared to be violating COPPA in various ways), the Federal Trade Commission (“FTC”) and state attorneys general — the only entities with enforcement authority under the law — pursue few enforcement efforts each year. Despite having competent personnel, these organizations are heavily constrained and under-resourced — as a result, enforcement by regulators is simply not seen as a credible threat by software developers. Research has found that developers are much more concerned with apps being removed from app stores (i.e., due to enforcement of platforms’ terms of service) than with the largely theoretical threat of regulatory enforcement. Yet the burden of COPPA compliance largely rests on numerous individual app developers. Thus, shifting enforcement efforts to the far fewer platforms that distribute the apps (and make representations about their privacy and security properties) and data recipients (who ultimately receive consumers’ identifiable data) would likely yield better outcomes for consumers, while allowing the FTC to better focus its enforcement efforts and have greater impact. Based on these observations, this Essay proposes a new enforcement framework. In this framework, compliance burdens are shifted away from the numerous individual online services to the fewer, bigger players who are best positioned to comply: platforms and third-party data recipients. The FTC’s limited resources can then focus on those entities at the top of the data food chain. Enforcement targeting the other, more numerous, individual online services could be left to a novel mechanism that uses a private right of action to foster more robust industry self-regulation through FTC-approved certification programs.
- Many households include children who use voice personal assistants (VPAs) such as Amazon Alexa. Children benefit from the rich functionalities of VPAs and third-party apps but are also exposed to new risks in the VPA ecosystem. In this article, we first investigate “risky” child-directed voice apps that contain inappropriate content or ask for personal information through voice interactions. We build SkillBot—a natural language processing-based system to automatically interact with VPA apps and analyze the resulting conversations. We find 28 risky child-directed apps and maintain a growing dataset of 31,966 non-overlapping app behaviors collected from 3,434 Alexa apps. Our findings suggest that although child-directed VPA apps are subject to stricter policy requirements and more intensive vetting, children remain vulnerable to inappropriate content and privacy violations. We then conduct a user study showing that parents are concerned about the identified risky apps. Many parents do not believe that these apps are available and designed for families/kids, although these apps are actually published in Amazon’s “Kids” product category. We also find that parents often neglect basic precautions, such as enabling parental controls on Alexa devices. Finally, we identify a novel risk in the VPA ecosystem: confounding utterances or voice commands shared by multiple apps that may cause a user to interact with a different app than intended. We identify 4,487 confounding utterances, including 581 shared by child-directed and non-child-directed apps. We find that 27% of these confounding utterances prioritize invoking a non-child-directed app over a child-directed app. This indicates that children are at real risk of accidentally invoking non-child-directed apps due to confounding utterances.
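As an illustration of the last finding, here is a minimal sketch of how utterances shared across app categories could be detected from invocation-phrase data; the types and field names are hypothetical and do not reflect SkillBot's actual implementation:

```kotlin
// Hypothetical record for a voice app and the utterances that invoke it.
data class Skill(
    val id: String,
    val childDirected: Boolean,
    val utterances: Set<String>,
)

// Find utterances claimed by more than one app, where at least one of the
// apps is child-directed and at least one is not. Such phrases can route a
// child to an unintended, non-child-directed app.
fun confoundingUtterances(skills: List<Skill>): Map<String, List<Skill>> =
    skills.flatMap { skill -> skill.utterances.map { it to skill } }
        .groupBy({ it.first }, { it.second })
        .filterValues { apps ->
            apps.size > 1 &&
                apps.any { it.childDirected } &&
                apps.any { !it.childDirected }
        }
```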
- The dominant privacy framework of the information age relies on notions of “notice and consent.” That is, service providers will disclose, often through privacy policies, their data collection practices, and users can then consent to their terms. However, it is unlikely that most users comprehend these disclosures, which is due in no small part to ambiguous, deceptive, and misleading statements. By comparing actual collection and sharing practices to disclosures in privacy policies, we demonstrate the scope of the problem. Through analysis of 68,051 apps from the Google Play Store, their corresponding privacy policies, and observed data transmissions, we investigated the potential misrepresentations of apps in the Designed For Families (DFF) program, inconsistencies in disclosures regarding third-party data sharing, as well as contradictory disclosures about secure data transmissions. We find that of the 8,030 DFF apps (i.e., apps directed at children), 9.1% claim that their apps are not directed at children, while 30.6% claim to have no knowledge that the received data comes from children. In addition, we observe that 10.5% of 68,051 apps share personal identifiers with third-party service providers, yet do not declare any in their privacy policies, and only 22.2% of the apps explicitly name third parties. This ultimately makes it not only difficult, but in most cases impossible, for users to establish where their personal data is being processed. Furthermore, we find that 9,424 apps do not use TLS when transmitting personal identifiers, yet 28.4% of these apps claim to take measures to secure data transfer. Ultimately, these divergences between disclosures and actual app behaviors illustrate the ridiculousness of the notice and consent framework.
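One of the consistency checks described above can be sketched as a simple set difference: third-party recipients observed in traffic but never named in the app's policy. The types and fields below are hypothetical rather than the study's actual tooling:

```kotlin
// Hypothetical per-app comparison of observed versus disclosed recipients.
data class AppObservation(
    val packageName: String,
    val observedRecipients: Set<String>,  // third-party domains seen in traffic
    val declaredRecipients: Set<String>,  // third parties named in the policy
)

// Recipients that received data but were never disclosed to the user.
fun undisclosedSharing(obs: AppObservation): Set<String> =
    obs.observedRecipients - obs.declaredRecipients
```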
- The Android mobile platform supports billions of devices across more than 190 countries around the world. This popularity coupled with user data collection by Android apps has made privacy protection a well-known challenge in the Android ecosystem. In practice, app producers provide privacy policies disclosing what information is collected and processed by the app. However, it is difficult to trace such claims to the corresponding app code to verify whether the implementation is consistent with the policy. Existing approaches for privacy policy alignment focus on information directly accessed through the Android platform (e.g., location and device ID), but are unable to handle user input, a major source of private information. In this paper, we propose a novel approach that automatically detects privacy leaks of user-entered data for a given Android app and determines whether such leakage may violate the app's privacy policy claims. For evaluation, we applied our approach to 120 popular apps from three privacy-relevant app categories: finance, health, and dating. The results show that our approach was able to detect 21 strong violations and 18 weak violations from the studied apps.
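To give a flavor of this kind of analysis, here is a minimal sketch of mapping user-input fields to sensitive categories by their on-screen labels and checking whether values from those fields leave the device undisclosed; the keyword lists and function names are hypothetical and greatly simplify the paper's actual technique:

```kotlin
// Hypothetical keyword map from sensitive data categories to label terms.
val labelKeywords = mapOf(
    "health" to listOf("weight", "symptom", "medication"),
    "financial" to listOf("income", "account", "salary"),
)

// Assign a category to an input field based on its visible label, if any.
fun categorize(fieldLabel: String): String? =
    labelKeywords.entries.firstOrNull { (_, keywords) ->
        keywords.any { fieldLabel.lowercase().contains(it) }
    }?.key

// In this sketch, a violation occurs when a categorized input value appears
// in an outgoing payload but the policy never discloses that category.
fun violatesPolicy(
    fieldLabel: String,
    enteredValue: String,
    outgoingPayload: String,
    disclosedCategories: Set<String>,
): Boolean {
    val category = categorize(fieldLabel) ?: return false
    return outgoingPayload.contains(enteredValue) && category !in disclosedCategories
}
```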