Title: The TikTok Tradeoff: Compelling Algorithmic Content at the Expense of Personal Privacy
This paper presents the results of an interview study with twelve TikTok users exploring user awareness, perception, and experience of the app’s algorithm in the context of privacy. The social media entertainment app TikTok collects user data to curate individualized video feeds based on users’ engagement with the presented content, a practice governed by a complex and overly long privacy policy. Our results show that participants generally have very little knowledge of the actual privacy terms, a gap they justify by the benefit of receiving free, entertaining content. However, participants experienced privacy-related downsides when algorithmically curated video content adapted ever more closely to their biography, interests, or location, making them realize how much personal detail TikTok had access to. This illustrates the tradeoff users must make between granting TikTok access to their personal data and enjoying a favorable video consumption experience on the app.
Award ID(s):
1852260
PAR ID:
10313202
Author(s) / Creator(s):
Date Published:
Journal Name:
20th International Conference on Mobile and Ubiquitous Multimedia (MUM 2021), December 5–8, 2021, Leuven, Belgium. ACM, New York, NY, USA
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Furnell, Steven (Ed.)
    A huge amount of personal and sensitive data is shared on Facebook, which makes it a prime target for attackers. Adversaries can exploit third-party applications connected to a user’s Facebook profile (i.e., Facebook apps) to gain access to this personal information. Users’ lack of knowledge and the varying privacy policies of these apps make them further vulnerable to information leakage. However, little has been done to identify mismatches between users’ perceptions and the privacy policies of Facebook apps. We address this challenge in our work. We conducted a lab study with 31 participants, collecting data on how they share information on Facebook, their Facebook-related security and privacy practices, and their perceptions of the privacy aspects of 65 frequently used Facebook apps in terms of data collection, sharing, and deletion. We then compared participants’ perceptions with the privacy policy of each reported app. Participants also reported their expectations about the types of information that should not be collected or shared by any Facebook app. Our analysis reveals significant mismatches between users’ privacy perceptions and reality (i.e., the privacy policies of Facebook apps): we identified over-optimism not only in users’ perceptions of information collection but also in their self-efficacy in protecting their information on Facebook, despite negative incidents they had experienced in the past. To the best of our knowledge, this is the first study of the gap between users’ privacy perceptions around Facebook apps and reality. The findings offer directions for future research to address that gap by designing usable, effective, and personalized privacy notices that help users make informed decisions about using Facebook apps.
  2. Mobile and web apps increasingly rely on data generated or provided by users, such as uploaded documents and images. Unfortunately, these apps may raise significant user privacy concerns. Specifically, to train or adapt their models for accurately processing the huge amounts of data continuously collected from millions of app users, app or service providers have widely adopted crowdsourcing, recruiting crowd workers to manually annotate or transcribe sampled, ever-changing user data. However, when users' data are uploaded through apps and then become widely accessible to hundreds of thousands of anonymous crowd workers, many human-in-the-loop privacy questions arise for both the app user community and the crowd worker community. In this paper, we propose to investigate the privacy risks brought by this significant trend of large-scale crowd-powered processing of app users' data generated in their daily activities. We consider the representative case of receipt scanning apps, which have millions of users, and focus on the corresponding receipt transcription tasks that commonly appear on crowdsourcing platforms. We design and conduct an app user survey study (n=108) to explore how app users perceive privacy in the context of using receipt scanning apps. We also design and conduct a crowd worker survey study (n=102) to explore crowd workers' experiences with receipt and other types of transcription tasks as well as their attitudes towards such tasks. Overall, we found that most app users and crowd workers expressed strong concerns about the potential privacy risks to receipt owners, and they also strongly agreed on the need to protect receipt owners' privacy. Our work provides insights into app users' potential privacy risks in crowdsourcing and highlights the need for, and challenges of, protecting third-party users' privacy on crowdsourcing platforms. We have responsibly disclosed our findings to the related crowdsourcing platform and app providers.
  3. As concern over data privacy and existing privacy regulations grows, legal scholars have proposed alternative models for data privacy. This work explores the impact of one such model---the data fiduciary model, which would stipulate that data processors must use personal information only in ways that reflect the best interest of the data subject---through a pair of user studies. We first conduct an interview study with nine mobile app developers in which we explore whether, how, and why these developers believe their current data practices are consistent with the best interest of their users. We then conduct an online study with 390 users in which we survey participants about whether they consider the same data practices to be in their own best interests. We also ask both developers and users about their attitudes towards and their predictions about the impact of a data fiduciary law, and we conclude with recommendations about such an approach to future privacy regulations. 
  4. Video conferencing apps (VCAs) have turned previously private spaces -- bedrooms, living rooms, and kitchens -- into semi-public extensions of the office. For the most part, users have accepted these apps into their personal space without much thought about the permission models that govern the use of their private data during meetings. While access to a device's video camera is carefully controlled, little has been done to ensure the same level of privacy for the microphone. In this work, we ask the question: what happens to the microphone data when a user clicks the mute button in a VCA? We first conduct a user study to analyze users' understanding of the permission model of the mute button. Then, using runtime binary analysis tools, we trace raw audio flow in many popular VCAs as it traverses the app from the audio driver to the network. We find fragmented policies for handling microphone data among VCAs -- some continuously monitor the microphone input during mute, and others do so periodically. One app transmits statistics of the audio to its telemetry servers while the app is muted. Using network traffic that we intercept en route to the telemetry server, we implement a proof-of-concept background activity classifier and demonstrate the feasibility of inferring the ongoing background activity during a meeting -- cooking, cleaning, typing, etc. We achieved 81.9% macro accuracy in identifying six common background activities from intercepted outgoing telemetry packets while a user is muted. A minimal sketch of this kind of macro-accuracy evaluation appears after this list.
  5. Mobile super apps are revolutionizing mobile computing by offering diverse services through integrated "miniapps", creating comprehensive ecosystems akin to app stores like Google Play and Apple's App Store. While these platforms, such as WeChat, Alipay, and TikTok, enhance user convenience and functionality, they also raise significant security and privacy concerns due to the vast amounts of user data they handle. In response, the Workshop on Secure and Trustworthy Superapps (SaTS 2024) aims to address these critical issues by fostering collaboration among researchers and practitioners to explore solutions that protect users and enhance security within the super app landscape.
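To make the 81.9% macro accuracy figure in item 4 concrete: macro accuracy averages per-class recall, so every background activity counts equally regardless of how often it occurs. The abstract does not describe the authors' features or model, so the following is only a minimal sketch on synthetic data; the nearest-centroid classifier, the eight summary statistics per sample, and the activity labels beyond cooking, cleaning, and typing are illustrative assumptions, not the study's method.

```python
# Minimal sketch (not the paper's code): evaluating a six-class
# "background activity" classifier with macro accuracy, i.e. per-class
# recall averaged over classes. Feature vectors are synthetic stand-ins
# for audio statistics intercepted from telemetry traffic.
import numpy as np

rng = np.random.default_rng(0)
# Only cooking, cleaning, and typing come from the abstract;
# the remaining labels are hypothetical placeholders.
ACTIVITIES = ["cooking", "cleaning", "typing", "music", "talking", "silence"]

def make_split(n_per_class):
    """Generate synthetic samples: 8 summary statistics per sample, one cluster per activity."""
    X, y = [], []
    for label in range(len(ACTIVITIES)):
        X.append(rng.normal(loc=label, scale=1.0, size=(n_per_class, 8)))
        y.append(np.full(n_per_class, label))
    return np.vstack(X), np.concatenate(y)

X_train, y_train = make_split(40)
X_test, y_test = make_split(20)

# Nearest-centroid classifier: one centroid of training features per activity.
centroids = np.stack([X_train[y_train == k].mean(axis=0) for k in range(len(ACTIVITIES))])
dists = ((X_test[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
pred = np.argmin(dists, axis=1)

# Macro accuracy: recall of each class, averaged with equal class weight.
per_class_recall = [np.mean(pred[y_test == k] == k) for k in range(len(ACTIVITIES))]
print("macro accuracy:", np.mean(per_class_recall))
```

Swapping in real intercepted telemetry features would only change make_split; the macro-accuracy computation at the end stays the same.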