Title: The TikTok Tradeoff: Compelling Algorithmic Content at the Expense of Personal Privacy
This paper presents the results of an interview study with twelve TikTok users exploring their awareness, perceptions, and experiences with the app's algorithm in the context of privacy. The social media entertainment app TikTok collects user data to curate individualized video feeds based on users' engagement with the presented content, a practice governed by a complex and overly long privacy policy. Our results demonstrate that participants generally have very little knowledge of the actual privacy regulations, a gap they justify by the benefit of receiving free, entertaining content. However, participants experienced privacy-related downsides when algorithmically curated video content increasingly adapted to their biography, interests, or location, and they in turn realized how much detailed personal data TikTok had access to. This illustrates the tradeoff users have to make between allowing TikTok to access their personal data and having favorable video consumption experiences on the app.
Award ID(s):
1852260
NSF-PAR ID:
10313202
Author(s) / Creator(s):
Date Published:
Journal Name:
20th International Conference on Mobile and Ubiquitous Multimedia (MUM 2021), December 5–8, 2021, Leuven, Belgium. ACM, New York, NY, USA
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Video conferencing apps (VCAs) turn previously private spaces -- bedrooms, living rooms, and kitchens -- into semi-public extensions of the office. For the most part, users have accepted these apps in their personal space without much thought about the permission models that govern the use of their private data during meetings. While access to a device's video camera is carefully controlled, little has been done to ensure the same level of privacy for access to the microphone. In this work, we ask the question: what happens to the microphone data when a user clicks the mute button in a VCA? We first conduct a user study to analyze users' understanding of the permission model of the mute button. Then, using runtime binary analysis tools, we trace raw audio flow in many popular VCAs as it traverses the app from the audio driver to the network. We find fragmented policies for dealing with microphone data among VCAs -- some continuously monitor the microphone input during mute, and others do so periodically. One app transmits statistics of the audio to its telemetry servers while the app is muted. Using network traffic that we intercept en route to the telemetry server, we implement a proof-of-concept background activity classifier and demonstrate the feasibility of inferring the ongoing background activity during a meeting -- cooking, cleaning, typing, etc. We achieved 81.9% macro accuracy in identifying six common background activities using intercepted outgoing telemetry packets when a user is muted.
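    A note on the metric above: macro accuracy is the per-class accuracy averaged over the six background-activity classes, so each class counts equally regardless of how often it occurs. A minimal sketch of that computation, with hypothetical activity labels (this is not the authors' classifier code):

```python
from collections import defaultdict

def macro_accuracy(y_true, y_pred):
    """Average of per-class accuracy (recall), so every class contributes
    equally regardless of how often it occurs in the test set."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for truth, pred in zip(y_true, y_pred):
        total[truth] += 1
        if truth == pred:
            correct[truth] += 1
    return sum(correct[c] / total[c] for c in total) / len(total)

# Hypothetical labels for six background activities inferred from muted-call telemetry.
y_true = ["cooking", "cleaning", "typing", "typing", "music", "silence", "dog", "dog"]
y_pred = ["cooking", "typing",   "typing", "typing", "music", "silence", "dog", "music"]
print(f"macro accuracy = {macro_accuracy(y_true, y_pred):.3f}")
```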
  2. Furnell, Steven (Ed.)
    A huge amount of personal and sensitive data is shared on Facebook, which makes it a prime target for attackers. Adversaries can exploit third-party applications connected to a user's Facebook profile (i.e., Facebook apps) to gain access to this personal information. Users' lack of knowledge and the varying privacy policies of these apps make them further vulnerable to information leakage. However, little has been done to identify mismatches between users' perceptions and the privacy policies of Facebook apps. We address this challenge in our work. We conducted a lab study with 31 participants, collecting data on how they share information on Facebook, their Facebook-related security and privacy practices, and their perceptions of the privacy aspects of 65 frequently used Facebook apps in terms of data collection, sharing, and deletion. We then compared participants' perceptions with the privacy policy of each reported app. Participants also reported their expectations about the types of information that should not be collected or shared by any Facebook app. Our analysis reveals significant mismatches between users' privacy perceptions and reality (i.e., the privacy policies of Facebook apps); we identified over-optimism not only in users' perceptions of information collection, but also in their self-efficacy in protecting their information on Facebook despite having experienced negative incidents in the past. To the best of our knowledge, this is the first study on the gap between users' privacy perceptions of Facebook apps and reality. The findings from this study offer directions for future research to address that gap by designing usable, effective, and personalized privacy notices that help users make informed decisions about using Facebook apps.
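    The perception-versus-policy comparison described above can be pictured as a simple tabulation: for each app and each data practice (collection, sharing, deletion), compare what a participant believes with what the privacy policy actually states and count the disagreements. A minimal sketch with made-up apps and values (not the study's instrument or data):

```python
# Hypothetical per-app records: what a participant believes vs. what the
# privacy policy actually states, for three practices.
perceived = {
    "QuizApp": {"collects_contacts": False, "shares_with_third_parties": False, "deletes_on_request": True},
    "GameApp": {"collects_contacts": True,  "shares_with_third_parties": False, "deletes_on_request": True},
}
policy = {
    "QuizApp": {"collects_contacts": True,  "shares_with_third_parties": True,  "deletes_on_request": False},
    "GameApp": {"collects_contacts": True,  "shares_with_third_parties": True,  "deletes_on_request": True},
}

# Count, per practice, how often the perception disagrees with the policy.
mismatches = {}
for app, beliefs in perceived.items():
    for practice, believed in beliefs.items():
        if believed != policy[app][practice]:
            mismatches[practice] = mismatches.get(practice, 0) + 1

print(mismatches)  # {'collects_contacts': 1, 'shares_with_third_parties': 2, 'deletes_on_request': 1}
```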
  3. Mobile Augmented Reality (MAR) is a portable and powerful technology that integrates digital content, e.g., 3D virtual objects, into the physical world. It has already been deployed for purposes such as shopping, entertainment, and gaming, and it is expected to grow at a tremendous rate in the coming years. Unfortunately, the applications that implement MAR, hereby referred to as MAR-Apps, suffer from security issues, which have surfaced in worldwide incidents such as robberies and have led authorities to ban MAR-Apps at specific locations. Existing problems with MAR-Apps can be classified into three categories: first, Space Invasion, the intrusive modification through MAR of sensitive spaces, e.g., hospitals or memorials; second, Space Affectation, the degradation of users' experience via interaction with undesirable MAR content or malicious entities; and finally, Privacy Leaks, which arise when MAR-Apps mishandle sensitive data. To alleviate these concerns, we present an approach for Policy-Governed MAR-Apps, which allows end-users to fully control under what circumstances, e.g., their presence inside a given sensitive space, digital content may be displayed by MAR-Apps. Through SpaceMediator, a proof-of-concept MAR-App that imitates the well-known and successful MAR-App Pokemon GO, we evaluated our approach in a user study with 40 participants, who recognized and prevented the issues just described with success rates as high as 92.50%. Furthermore, there is strong interest in Policy-Governed MAR-Apps: 87.50% of participants agreed with the concept, and 82.50% would use it to implement content-based restrictions in MAR-Apps. These promising results encourage the adoption of our solution in future MAR-Apps.
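    At runtime, the policy-governed idea above reduces to a check performed before rendering: is the user currently inside a sensitive space whose policy forbids AR content? A minimal geofence-style sketch of such a check; the space names, coordinates, and policy fields are illustrative assumptions, not SpaceMediator's actual design:

```python
import math

# Illustrative sensitive spaces: location, radius, and whether AR content is allowed inside.
SENSITIVE_SPACES = [
    {"name": "City Hospital", "lat": 40.7410, "lon": -73.9897, "radius_m": 150, "allow_ar_content": False},
    {"name": "War Memorial",  "lat": 40.7420, "lon": -73.9880, "radius_m": 80,  "allow_ar_content": False},
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in meters."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def may_render_ar_content(lat, lon):
    """Return (allowed, reason): content is blocked inside any space whose policy forbids it."""
    for space in SENSITIVE_SPACES:
        if haversine_m(lat, lon, space["lat"], space["lon"]) <= space["radius_m"]:
            if not space["allow_ar_content"]:
                return False, f"inside policy-protected space: {space['name']}"
    return True, "no restricting policy applies here"

print(may_render_ar_content(40.7411, -73.9895))  # near the hospital -> blocked
print(may_render_ar_content(40.7500, -73.9800))  # elsewhere -> allowed
```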
  4. The culture within engineering colleges and departments has historically been quiet on social justice issues. Often the faculty in those departments are less concerned with social issues and are primarily focused on their disciplines and the concrete ways that they can make impacts academically and professionally in their respective arenas. However, with the social climate of the United States shifting toward an ever more politically charged one, and with current events, particularly the protests against police brutality in recent years, faculty and students are constantly inundated with news of injustices happening in our society. The murder of George Floyd on May 25th, 2020 sent shockwaves across the United States and the world. The video of his death, shared across the globe, brought everyone's attention to the glaringly ugly problem of police brutality; paired with the COVID-19 pandemic and a US election year, the conditions were just right for a social activist movement to grow to a size that no one could ignore. Emmanuel Acho spoke out, motivated by the injustices seen in the George Floyd murder, initially with podcasts and then by writing his book "Uncomfortable Conversations with a Black Man" [1]. In his book he touched on various social justice issues such as: racial terminology (i.e., Black or African American), implicit biases, white privilege, cultural appropriation, stereotypes (e.g., the "angry black man"), racial slurs (particularly the n-word), systemic racism, the myth of reverse racism, the criminal justice system, the struggles faced by black families, interracial families, allyship, and anti-racism. Students and faculty at Anonymous University felt compelled to set aside time to meet and discuss this book in depth through the video conferencing client Zoom. In these meetings, diverse facilitators were tasked with bringing the topics discussed by Acho in his book into conversation and pushing attendees to consider those topics critically and personally. To avoid assigning attendees reading homework in order to participate in these discussions, the relevant chapter of the audiobook version of Acho's book was played at the beginning of each meeting. Each audiobook chapter lasted between fifteen and twenty minutes, after which forty to forty-five minutes were left in the hour-long meetings to discuss the content of the chapter in question. Students and faculty made efforts to examine how some of the teachings of the book could be implemented in their lives and at Anonymous University. For broader topics, they would relate the content back to their personal lives (e.g., raising their children to be anti-racist and their experiences with racism in American and international cultures). Each meeting was recorded for posterity in the event that those conversations would be used in a paper such as this. Each meeting had at least one facilitator whose main role was to provide discussion prompts based on the chapter and to ensure that the meeting environment was safe and inclusive. Naturally, some chapters address topics that are highly personal to some participants, so it was vital that all participants felt comfortable and supported in sharing their thoughts and experiences. The facilitator would intervene if the conversation veered in an aggressive direction.
For example, if a participant started an argument with another participant in a non-constructive manner, e.g., arguing over the definition of ethnicity, the facilitator would interrupt, clear the air to bring the group back to common ground, and then continue the discussion. Otherwise, participants were allowed to steer the direction of the conversation as new avenues of discussion opened up. These meetings were recorded with the goal of returning to these conversations and analyzing the discussions between attendees. Grounded theory will be used to first assess the most prominent themes of discussion between attendees for each meeting [2]. Attendees will be contacted to expressly ask their permission to have their words and thoughts used in this work, and upon agreement that data will begin to be processed. Select attendees will be asked to participate in focus group discussions, which will also be recorded via Zoom. These discussions will focus on the themes pulled from the general discussion and will aim to dive deeper into the impact that this experience has had on the participants as either students or faculty members. A set of questions will be developed as prompts, but conversation is expected to evolve organically as the focus groups interact. The sessions will be scheduled for an hour, and four focus groups with four participants each are expected, for a total of sixteen focus group participants. We hope to uncover how this experience changed the lives of the participants and to present a model of how conversations such as this can promote diversity, equity, inclusion, and access activities among faculty and students outside of the formal programs and strategic plans implemented at the university, college, or departmental levels.
  5. Cloud photo services are widely used for persistent, convenient, and often free photo storage, which is especially useful for mobile devices. As users store more and more photos in the cloud, significant privacy concerns arise because even a single compromise of a user's credentials gives attackers unfettered access to all of the user's photos. We have created Easy Secure Photos (ESP) to enable users to protect their photos on cloud photo services such as Google Photos. ESP introduces a new client-side encryption architecture that includes a novel format-preserving image encryption algorithm, an encrypted thumbnail display mechanism, and a usable key management system. ESP encrypts image data such that the result is still a standard-format image like JPEG that is compatible with cloud photo services. ESP efficiently generates and displays encrypted thumbnails for fast and easy browsing of photo galleries from trusted user devices. ESP's key management makes it simple to authorize multiple user devices to view encrypted image content via a process similar to device pairing, but using the cloud photo service as a QR code communication channel. We have implemented ESP in a popular Android photos app for use with Google Photos and demonstrate that it is easy to use, provides encryption functionality transparently to users, maintains good interactive performance and image quality while providing strong privacy guarantees, and retains the sharing and storage benefits of Google Photos without any changes to the cloud service.
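    The key property of ESP above is that image data is encrypted on the client before it ever reaches the cloud. ESP's actual contribution is a format-preserving scheme whose ciphertext remains a valid JPEG; the sketch below does not reproduce that and only illustrates the simpler client-side step of sealing image bytes with an authenticated cipher before upload, using the `cryptography` package and hypothetical key handling:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_image(key: bytes, image_bytes: bytes) -> bytes:
    """Seal raw image bytes client-side; only holders of `key` can recover the photo.

    Note: unlike ESP's format-preserving scheme, this ciphertext is no longer a
    valid JPEG, so a cloud photo service could not treat it as an image.
    """
    nonce = os.urandom(12)  # unique nonce per encryption
    return nonce + AESGCM(key).encrypt(nonce, image_bytes, None)

def decrypt_image(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

if __name__ == "__main__":
    key = AESGCM.generate_key(bit_length=256)  # in ESP, keys are shared across devices via QR-code pairing
    with open("photo.jpg", "rb") as f:         # hypothetical local JPEG photo
        sealed = encrypt_image(key, f.read())
    # Round trip recovers the original bytes (JPEG files start with the 0xFFD8 marker).
    assert decrypt_image(key, sealed)[:2] == b"\xff\xd8"
```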