To promote engagement, recommendation algorithms on platforms like YouTube increasingly personalize users’ feeds, limiting users’ exposure to diverse content and depriving them of opportunities to reflect on how their interests compare with others’. In this work, we investigate how exchanging recommendations with strangers can help users discover new content and reflect. We tested this idea by developing OtherTube, a browser extension for YouTube that displays strangers’ personalized YouTube recommendations. OtherTube allows users to (i) create an anonymized profile for social comparison, (ii) share their recommended videos with others, and (iii) browse strangers’ YouTube recommendations. We conducted a 10-day-long user study (n = 41) followed by a post-study interview (n = 11). Our results reveal that users discovered and developed new interests from seeing recommendations on OtherTube. We identified user and content characteristics that affect interaction and engagement with exchanged recommendations; for example, younger users interacted more with OtherTube, while the perceived irrelevance of some content discouraged users from watching certain videos. Users reflected on their interests as well as others’, recognizing similarities and differences. Our work shows promise for designs leveraging the exchange of personalized recommendations with strangers.
NoCDN: scalable content delivery without a middleman
Today's websites achieve scalability by either deploying their own platforms with sufficient spare capacity or signing up for services from a content delivery network (CDN). This paper investigates another alternative, where a website directly recruits Internet users to contribute their resources to help deliver the site's content. We show that this alternative, which we call NoCDN, can be implemented securely, transparently to the users accessing the site, and without changes to the content itself.
- Award ID(s): 1647145
- PAR ID: 10054246
- Date Published:
- Journal Name: 5th ACM/IEEE Workshop on Hot Topics in Web Systems and Technologies (HotWeb '17)
- Page Range / eLocation ID: 1 to 6
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Social media users create folk theories to help explain how elements of social media operate. Marginalized social media users face disproportionate content moderation and removal on social media platforms. We conducted a qualitative interview study (n = 24) to understand how marginalized social media users may create folk theories in response to content moderation and their perceptions of platforms’ spirit, and how these theories may relate to their marginalized identities. We found that marginalized social media users develop folk theories informed by their perceptions of platforms’ spirit to explain instances where their content was moderated in ways that violate their perceptions of how content moderation should work in practice. These folk theories typically address content being removed despite not violating community guidelines, along with bias against marginalized users embedded in guidelines. We provide implications for platforms, such as using marginalized users’ folk theories as tools to identify elements of platform moderation systems that function incorrectly and disproportionately impact marginalized users.
- Research suggests that marginalized social media users face disproportionate content moderation and removal. However, when content is removed or accounts are suspended, the processes governing content moderation are largely invisible, making content moderation bias difficult to assess. To study this bias, we conducted a digital ethnography of marginalized users on Reddit’s /r/FTM subreddit and Twitch’s “Just Chatting” and “Pools, Hot Tubs, and Beaches” categories, observing content moderation visibility in real time. We found that on Reddit, a text-based platform, platform tools make content moderation practices invisible to users, but moderators make their practices visible through communication with users. Yet on Twitch, a live chat and streaming platform, content moderation practices are visible in channel live chats, “unban appeal” streams, and “back from my ban” streams. Our ethnography shows how content moderation visibility differs in important ways between social media platforms, at times harming those who must see offensive content and at other times allowing for increased platform accountability.
- Netflix is the most popular video streaming site, contributing nearly a quarter of global video traffic. Given Netflix’s dominance of Internet traffic, understanding how individual users consume content on it is of interest not only to the research community but also to network operators, content creators and providers, users, and advertisers. In this context, we collect Netflix viewing activity from 1060 users spanning a one-year period and covering over 1.7 million episodes and movies. We group the users based on their activity level and provide key insights into users’ watch patterns, watch-session length, preferences, predictability, and watch-behavior continuation tendencies. We also implement and evaluate classifiers that predict a user’s engagement in a series based on their past behavioral patterns (a minimal sketch of this kind of classifier appears after this list).
- Mainstream platforms’ content moderation systems typically employ generalized “one-size-fits-all” approaches, intended to serve both general and marginalized users. Thus, transgender people must often create their own technologies and moderation systems to meet their specific needs. In our interview study of transgender technology creators (n = 115), we found that creators face issues of transphobic abuse and disproportionate content moderation. Trans tech creators address these issues by carefully moderating and vetting their userbases, centering trans contexts in content moderation systems, and employing collective governance and community models. Based on these findings, we argue that trans tech creators’ approaches to moderation offer important insights into how to better design for trans users and, ultimately, marginalized users in the larger platform ecology. We introduce the concept of trans-centered moderation: content moderation that reviews and successfully vets transphobic users, appoints trans moderators to effectively moderate trans contexts, considers the limitations and constraints of technology for addressing social challenges, and employs collective governance and community models. Trans-centered moderation can help to improve platform design for trans users while reducing the harm faced by trans people and marginalized users more broadly.
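As a concrete illustration of the classifiers described in the Netflix item above, here is a minimal sketch with invented feature names and synthetic data: it predicts whether a user continues a series from simple past-behavior features. It is not the authors’ actual model or feature set.

```python
# Minimal sketch (invented features, synthetic data): predict whether a user
# continues watching a series from simple behavioral features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical features per (user, series): episodes watched so far,
# mean session length in minutes, days since last watch session.
X = rng.random((500, 3)) * np.array([20.0, 120.0, 30.0])
# Toy label: users already deep into a series tend to continue it.
y = (X[:, 0] > 10).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```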