-
Over the past few years, the two dominant app platforms have made major improvements to their policies surrounding child-directed apps. While prior work repeatedly demonstrated that privacy issues were prevalent in child-directed apps, it is unclear whether platform policies can lead child-directed apps to comply with privacy requirements when laws alone have not. To understand the effect of recent changes in platform policies (e.g., whether they result in greater levels of compliance with applicable privacy laws), we conducted a large-scale measurement study of the privacy behaviors of 7,377 child-directed Android apps, as well as a follow-up survey with some of their developers. We observed a drastic decrease in the number of apps that transmitted personal data without verifiable parental consent and an increase in the number of apps that encrypted their transmissions using TLS. However, improper use of third-party SDKs still led to privacy issues (e.g., inaccurate disclosures in apps’ privacy labels). Our analysis of apps’ privacy practices over a period of a few months in 2023, and a comparison of our results with those observed a few years ago, demonstrate gradual improvements in apps’ privacy practices over time. We discuss how app platforms can further improve their policies and emphasize the role of enforcement in making such policies effective.
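For context, the kind of third-party SDK configuration at issue can be seen in a minimal Android (Java) sketch. It shows one widely used SDK, Google Mobile Ads, being tagged for child-directed treatment before it is initialized; the class name and the choice of this particular SDK are illustrative assumptions, not code from or a recommendation of any app in the study.

```java
import android.content.Context;

import com.google.android.gms.ads.MobileAds;
import com.google.android.gms.ads.RequestConfiguration;

// Illustrative sketch only: configuring a common third-party SDK (Google Mobile Ads)
// for child-directed treatment *before* initializing it. The class name is hypothetical.
public final class ChildDirectedAdsSetup {

    public static void configure(Context context) {
        RequestConfiguration config = MobileAds.getRequestConfiguration()
                .toBuilder()
                // Tag every ad request as child-directed (COPPA treatment).
                .setTagForChildDirectedTreatment(
                        RequestConfiguration.TAG_FOR_CHILD_DIRECTED_TREATMENT_TRUE)
                // Restrict served ads to content rated for general audiences.
                .setMaxAdContentRating(RequestConfiguration.MAX_AD_CONTENT_RATING_G)
                .build();
        MobileAds.setRequestConfiguration(config);

        // Initialize the SDK only after the child-directed configuration is in place,
        // so no early requests go out untagged.
        MobileAds.initialize(context, initializationStatus -> { });
    }

    private ChildDirectedAdsSetup() { }
}
```

Reversing these two steps (initializing the SDK before tagging it) is an example of the kind of misconfiguration that can quietly contradict an app's stated privacy practices.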
-
While the United States currently has no comprehensive privacy law, the Children’s Online Privacy Protection Act (“COPPA”) has been in effect for over twenty years. As a result, the study of compliance issues among child-directed online services can yield important lessons for future enforcement efforts and can inform the design of future state and federal privacy laws designed to protect people of all ages. This Essay describes research conducted to understand privacy compliance issues and how that research has led the author to several recommendations for improving privacy enforcement more generally. While these recommendations are informed by the study of child-directed services’ compliance with COPPA, they are applicable to future state and federal privacy laws aimed at protecting the general public (i.e., not just children). Despite evidence of thousands of COPPA violations (e.g., one study found evidence that a majority of child-directed mobile apps appeared to be violating COPPA in various ways), the Federal Trade Commission (“FTC”) and state attorneys general, the only entities with enforcement authority under the law, pursue few enforcement actions each year. Despite having competent personnel, these organizations are heavily constrained and under-resourced; as a result, enforcement by regulators is simply not seen as a credible threat by software developers. Research has found that developers are much more concerned with apps being removed from app stores (i.e., due to enforcement of platforms’ terms of service) than with the largely theoretical threat of regulatory enforcement. Yet the burden of COPPA compliance largely rests on numerous individual app developers. Thus, shifting enforcement efforts to the far fewer platforms that distribute the apps (and make representations about their privacy and security properties) and to the data recipients (who ultimately receive consumers’ identifiable data) would likely yield better outcomes for consumers, while allowing the FTC to better focus its enforcement efforts and have greater impact. Based on these observations, this Essay proposes a new enforcement framework in which compliance burdens are shifted away from the numerous individual online services to the fewer, bigger players who are best positioned to comply: platforms and third-party data recipients. The FTC’s limited resources can then focus on those entities at the top of the data food chain. Enforcement targeting the other, more numerous, individual online services could be left to a novel mechanism that uses a private right of action to foster more robust industry self-regulation through FTC-approved certification programs.
-
This report documents the program and the outcomes of Dagstuhl Seminar "EU Cyber Resilience Act: Socio-Technical and Research Challenges" (24112). This timely seminar brought together experts in computer science, tech policy, and economics, as well as industry stakeholders, national agencies, and regulators, to identify new research challenges posed by the EU Cyber Resilience Act (CRA), a new EU regulation that aims to set essential cybersecurity requirements for digital products to be permissible in the EU market. The seminar focused on analyzing the proposed text and standards to identify obstacles in standardization, developer practices, user awareness, and software analysis methods for easing adoption, certification, and enforcement. Seminar participants noted the complexity of designing meaningful cybersecurity regulations and of aligning regulatory requirements with technological advancements, market trends, and vendor incentives, referencing past challenges with GDPR and COPPA adoption and compliance. The seminar also emphasized the importance of regulators, marketplaces, and both mobile and IoT platforms in eliminating malicious and deceptive actors from the market and in promoting transparent security practices from vendors and their software supply chains. The seminar underscored the need for multi-disciplinary and collaborative efforts to support the CRA’s successful implementation and enhance cybersecurity across the EU.
-
Like most modern software, secure messaging apps rely on third-party components to implement important app functionality. Although this practice reduces engineering costs, it also introduces the risk of inadvertent privacy breaches due to misconfiguration errors or incomplete documentation. Our research investigated secure messaging apps' usage of Google's Firebase Cloud Messaging (FCM) service to send push notifications to Android devices. We analyzed 21 popular secure messaging apps from the Google Play Store to determine what personal information these apps leak in the payload of push notifications sent via FCM. Of these apps, 11 leaked metadata, including user identifiers (10 apps), sender or recipient names (7 apps), and phone numbers (2 apps), while 4 apps leaked the actual message content. Furthermore, none of the data we observed being leaked to FCM was specifically disclosed in those apps' privacy disclosures. We also found several apps employing strategies to mitigate this privacy leakage to FCM, with varying levels of success. Of the strategies we identified, none appeared to be common, shared, or well-supported. We argue that this is fundamentally an economics problem: incentives need to be correctly aligned to motivate platforms and SDK providers to make their systems secure and private by default.
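To make the failure mode concrete, here is a minimal server-side sketch using the Firebase Admin SDK for Java (assuming a recent firebase-admin release and application-default credentials). The class name PushNotifier, the method names sendLeaky and sendOpaque, and the payload keys are hypothetical; the sketch contrasts a payload that exposes personal data to FCM with one that carries only an opaque identifier so the client can fetch the message over its own encrypted channel.

```java
import com.google.auth.oauth2.GoogleCredentials;
import com.google.firebase.FirebaseApp;
import com.google.firebase.FirebaseOptions;
import com.google.firebase.messaging.FirebaseMessaging;
import com.google.firebase.messaging.FirebaseMessagingException;
import com.google.firebase.messaging.Message;

import java.io.IOException;

public class PushNotifier {

    // Initialize the Firebase Admin SDK once, using application-default credentials.
    public static void init() throws IOException {
        FirebaseOptions options = FirebaseOptions.builder()
                .setCredentials(GoogleCredentials.getApplicationDefault())
                .build();
        FirebaseApp.initializeApp(options);
    }

    // Leaky pattern: sender name and message text travel through FCM,
    // visible to the push infrastructure, mirroring the metadata leaks described above.
    public static void sendLeaky(String deviceToken, String sender, String body)
            throws FirebaseMessagingException {
        Message msg = Message.builder()
                .setToken(deviceToken)
                .putData("sender", sender)   // personal data in the FCM payload
                .putData("body", body)       // message content in the FCM payload
                .build();
        FirebaseMessaging.getInstance().send(msg);
    }

    // Privacy-preserving pattern: the payload carries only an opaque identifier;
    // the app wakes up and fetches the actual message over its own end-to-end channel.
    public static void sendOpaque(String deviceToken, String opaqueMessageId)
            throws FirebaseMessagingException {
        Message msg = Message.builder()
                .setToken(deviceToken)
                .putData("msg_id", opaqueMessageId)  // no names, numbers, or content
                .build();
        FirebaseMessaging.getInstance().send(msg);
    }
}
```

The second pattern trades an extra fetch round-trip for keeping names, numbers, and message content out of the push infrastructure; it is in the spirit of the mitigation strategies the study observed, though not necessarily how any particular app implements them.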
-
Internet users often neglect important security actions (e.g., installing security updates or changing passwords) because they interrupt users’ main task at inopportune times. Commitment devices, such as reminders and promises, have been found to be effective at reducing procrastination in other domains. In a series of online experiments (n > 3,000), we explored the effects of reminders and promises on users’ willingness to change a compromised password. We find that adding an option to delay the task considerably increases the share of people willing to eventually change their password. Critically, the option to delay yields this overall increase without reducing the share of people choosing to change their password immediately. Additionally, most participants who promised to change their password later, or asked to be reminded to do so, indeed followed through on their commitment, leading to a net positive effect. Reminding participants of their previous commitment further increased this effect.
-
Intelligent voice assistants may soon become proactive, offering suggestions without being directly invoked. Such behavior increases privacy risks, since proactive operation requires continuous monitoring of conversations. To mitigate this problem, our study proposes and evaluates one potential privacy control, in which the assistant requests permission for the information it wishes to use immediately after hearing it. To find out how people would react to runtime permission requests, we recruited 23 pairs of participants to hold conversations while receiving ambient suggestions from a proactive assistant, which we simulated in real time using the Wizard of Oz technique. The interactive sessions featured different modes and designs of runtime permission requests and were followed by in-depth interviews about people's preferences and concerns. Most participants were excited about the devices despite their continuous listening, but wanted control over the assistant's actions and their own data. They generally prioritized an interruption-free experience above more fine-grained control over what the device would hear.
-
Two-Factor Authentication (2FA) hardens an organization against user account compromise, but adds an extra step to organizations’ mission-critical tasks. We investigate to what extent quantitative analysis of operational logs from 2FA systems both supports and challenges recent user-study and survey results identifying usability challenges in 2FA. Using tens of millions of logs and records kept at two public universities, we quantify the at-scale impact on organizations and their employees during a mandatory 2FA implementation. We show the multiplicative effects of device remembrance, fragmented login services, and authentication timeouts on user burden. We find that user burden does not deviate far from other compliance and risk management time requirements already common to large organizations. We investigate why more than one in twenty 2FA ceremonies is aborted or fails, and the variance in user experience across users. We hope our analysis will empower more organizations to protect themselves with 2FA.
-
Identifying privacy-sensitive data leaks by mobile applications has been a topic of great research interest for the past decade. Technically, such data flows are not “leaks” if they are disclosed in a privacy policy. To address this limitation in automated analysis, recent work has combined program analysis of applications with analysis of privacy policies to determine flow-to-policy consistency, and hence violations thereof. However, this prior work has a fundamental weakness: it does not differentiate the entity (e.g., first party vs. third party) receiving the privacy-sensitive data. In this paper, we propose POLICHECK, which formalizes and implements an entity-sensitive flow-to-policy consistency model. We use POLICHECK to study 13,796 applications and their privacy policies and find that up to 42.4% of applications either incorrectly disclose or omit disclosing their privacy-sensitive data flows. Our results also demonstrate the significance of considering entities: without considering the entity, prior approaches would falsely classify up to 38.4% of applications as having privacy-sensitive data flows consistent with their privacy policies. These false classifications include data flows to third parties that are omitted (e.g., the policy states only the first party collects the data type), incorrect (e.g., the policy states the third party does not collect the data type), and ambiguous (e.g., the policy has conflicting statements about the data type collection). By defining a novel automated, entity-sensitive flow-to-policy consistency analysis, POLICHECK provides the highest-precision method to date for determining whether applications properly disclose their privacy-sensitive behaviors.
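To illustrate the idea (not the actual POLICHECK implementation), here is a toy Java 17 sketch of an entity-sensitive consistency check. The types DataFlow, PolicyStatement, and Verdict, and the exact-match logic, are simplifying assumptions; the real system reasons over natural-language policy statements and data/entity ontologies rather than string equality.

```java
import java.util.List;

// Toy sketch of an entity-sensitive flow-to-policy consistency check.
// Names and matching logic are illustrative only.
public class FlowPolicyCheck {

    enum Verdict { CONSISTENT, OMITTED, INCORRECT, AMBIGUOUS }

    // An observed flow: which data type was sent, and to which entity.
    record DataFlow(String dataType, String entity) { }

    // A policy statement: entity X does (or does not) collect data type Y.
    record PolicyStatement(String dataType, String entity, boolean collects) { }

    static Verdict check(DataFlow flow, List<PolicyStatement> policy) {
        // Only statements about the same data type AND the same entity are relevant;
        // a first-party disclosure does not cover a flow to a third party.
        List<PolicyStatement> relevant = policy.stream()
                .filter(s -> s.dataType().equals(flow.dataType())
                          && s.entity().equals(flow.entity()))
                .toList();

        if (relevant.isEmpty()) {
            return Verdict.OMITTED;                     // the policy never mentions this flow
        }
        boolean discloses = relevant.stream().anyMatch(PolicyStatement::collects);
        boolean denies = relevant.stream().anyMatch(s -> !s.collects());
        if (discloses && denies) {
            return Verdict.AMBIGUOUS;                   // conflicting statements about the flow
        }
        return discloses ? Verdict.CONSISTENT : Verdict.INCORRECT; // a denial contradicts the flow
    }

    public static void main(String[] args) {
        // The policy only says the first party collects location; the observed flow
        // goes to a third-party ad network, so the check reports OMITTED.
        List<PolicyStatement> policy = List.of(
                new PolicyStatement("location", "first party", true));
        System.out.println(check(new DataFlow("location", "AdNetworkX"), policy));
    }
}
```

Even in this reduced form, the entity dimension is what separates an undisclosed third-party flow from one the policy genuinely covers.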