- Award ID(s): 2046590
- PAR ID: 10409140
- Date Published:
- Journal Name: Proceedings of the International AAAI Conference on Web and Social Media
- Volume: 16
- ISSN: 2162-3449
- Page Range / eLocation ID: 723 to 734
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
YouTube is the most popular video sharing platform, with more than 2 billion active users and 1 billion hours of video content watched daily. The dominance of YouTube has had a significant impact on the performance of Internet protocols, algorithms, and systems. Understanding how users interact with YouTube is thus of much interest to the research community. In this context, we collect YouTube watch history data from 243 users spanning a 1.5-year period. The dataset comprises a total of 1.8 million videos. We use the dataset to analyze and present key insights about user-level usage behavior. We also show that our analysis can be used by researchers to tackle a myriad of problems in the general domains of networking and communication. We present baseline characteristics as well as substantiated directions for solving a few representative problems related to local caching techniques, prefetching strategies, the performance of YouTube's recommendation engine, the variability of users' video preferences, and application-specific load provisioning.
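The caching direction mentioned in this abstract can be illustrated with a short, self-contained sketch. This is not the paper's code: the watch log is a hypothetical list of video IDs and the cache size is an arbitrary assumption; it only shows how a watch-history trace could be replayed against a simple LRU cache to estimate a local hit rate.

```python
# Illustrative sketch (not the authors' code): estimate how well a simple
# LRU cache would serve a user's watch history replayed in viewing order.
from collections import OrderedDict

def lru_hit_rate(watch_log, capacity=100):
    """watch_log: iterable of video_id strings in viewing order."""
    cache = OrderedDict()              # video_id -> None, ordered by recency
    hits = total = 0
    for video_id in watch_log:
        total += 1
        if video_id in cache:
            hits += 1
            cache.move_to_end(video_id)    # refresh recency on a hit
        else:
            cache[video_id] = None
            if len(cache) > capacity:
                cache.popitem(last=False)  # evict least recently watched
    return hits / total if total else 0.0

# Example: repeated views of the same videos raise the hit rate.
print(lru_hit_rate(["a", "b", "a", "c", "a", "b"], capacity=2))
```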
-
People are increasingly exposed to science and political information from social media. One consequence is that these sites play host to “alternative influencers,” who spread misinformation. However, content posted by alternative influencers on different social media platforms is unlikely to be homogeneous. Our study uses computational methods to investigate how two dimensions of social media platforms, which we refer to as audience and channel, influence the emotion and topics in content posted by “alternative influencers” on different platforms. Using COVID-19 as an example, we find that alternative influencers’ content contained more anger and fear words on Facebook and Twitter than on YouTube. We also found that these actors discussed substantively different topics in their COVID-19 content on YouTube compared to Twitter and Facebook. With these findings, we discuss how the audience and channel of different social media platforms affect alternative influencers’ ability to spread misinformation online.
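As a rough illustration of the dictionary-based emotion comparison this abstract describes, the sketch below counts anger and fear words per platform. The tiny word lists and sample posts are assumed placeholders for demonstration, not the study's lexicon or data.

```python
# Hedged sketch of per-platform emotion word rates using toy lexicons.
import re
from collections import defaultdict

ANGER = {"outrage", "furious", "angry", "hate"}       # placeholder lexicon
FEAR = {"afraid", "scared", "panic", "terrified"}     # placeholder lexicon

def emotion_rates(posts):
    """posts: list of (platform, text) pairs -> per-platform word rates."""
    totals = defaultdict(lambda: {"anger": 0, "fear": 0, "words": 0})
    for platform, text in posts:
        tokens = re.findall(r"[a-z']+", text.lower())
        stats = totals[platform]
        stats["words"] += len(tokens)
        stats["anger"] += sum(t in ANGER for t in tokens)
        stats["fear"] += sum(t in FEAR for t in tokens)
    return {p: {"anger": s["anger"] / s["words"], "fear": s["fear"] / s["words"]}
            for p, s in totals.items() if s["words"]}

posts = [("facebook", "This outrage makes me furious"),
         ("youtube", "Stay calm and wash your hands")]
print(emotion_rates(posts))
```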
-
In today’s digital world, understanding how YouTube’s recommendation systems guide what we watch is crucial. This study dives into these systems, revealing how they influence the content we see over time. We found that YouTube’s algorithms tend to push content in certain directions, affecting the variety and type of videos recommended to viewers. To uncover these patterns, we used a mixed-methods approach to analyze videos recommended by YouTube. We looked at the emotions conveyed in videos, the moral messages they might carry, and whether they contained harmful content. Our research also involved statistical analysis to detect biases in how these videos are recommended and network analysis to see how certain videos become more influential than others. Our findings show that YouTube’s algorithms can lead to a narrowing of the content landscape, limiting the diversity of what gets recommended. This has important implications for how information is spread and consumed online, suggesting a need for more transparency and fairness in how these algorithms work. In summary, this paper highlights the need for a more inclusive approach to how digital platforms recommend content. By better understanding the impact of YouTube’s algorithms, we can work towards creating a digital space that offers a wider range of perspectives and voices, affords fairness, and enriches everyone’s online experience.
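The network-analysis step described in this abstract can be sketched as follows: build a directed graph of "video A recommends video B" edges and rank videos by PageRank to see which ones recommendations keep funneling viewers toward. The edge list is invented for illustration and the example assumes the third-party networkx package; it is not the authors' pipeline.

```python
# Toy recommendation graph ranked by PageRank (requires networkx).
import networkx as nx

recommendations = [            # (source_video, recommended_video) pairs
    ("v1", "v2"), ("v1", "v3"),
    ("v2", "v3"), ("v3", "v4"),
    ("v4", "v3"),              # v3 keeps being recommended back
]

graph = nx.DiGraph(recommendations)
influence = nx.pagerank(graph, alpha=0.85)   # stationary visit probability
for video, score in sorted(influence.items(), key=lambda kv: -kv[1]):
    print(f"{video}: {score:.3f}")
```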
-
Online misinformation is believed to have contributed to vaccine hesitancy during the Covid-19 pandemic, highlighting concerns about social media’s destabilizing role in public life. Previous research identified a link between political conservatism and sharing misinformation; however, it is not clear how partisanship affects how much misinformation people see online. As a result, we do not know whether partisanship drives exposure to misinformation or people selectively share misinformation despite being exposed to factual content. To address this question, we study Twitter discussions about the Covid-19 pandemic, classifying users along the political and factual spectrum based on the information sources they share. In addition, we quantify exposure through retweet interactions. We uncover partisan asymmetries in the exposure to misinformation: conservatives are more likely to see and share misinformation, and while users’ connections expose them to ideologically congruent content, the interactions between political and factual dimensions create conditions for the highly polarized users (hardline conservatives and liberals) to amplify misinformation. Overall, however, misinformation receives less attention than factual content, and political moderates, the bulk of users in our sample, help filter out misinformation. Identifying the extent of polarization and how political ideology exacerbates misinformation can help public health experts and policy makers improve their messaging.
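A minimal sketch of the source-based classification idea described in this abstract: score each user by averaging the political-leaning and factualness ratings of the domains they share. The domain ratings below are invented placeholders, not the media-bias lists used in the study.

```python
# Hedged sketch: place a user on political and factual axes from shared domains.
from statistics import mean

DOMAIN_RATINGS = {   # domain -> (political: -1 left .. +1 right, factual: 0..1)
    "leftnews.example":  (-0.8, 0.7),
    "rightnews.example": (0.8, 0.6),
    "mislead.example":   (0.6, 0.1),
    "wire.example":      (0.0, 0.9),
}

def classify_user(shared_domains):
    rated = [DOMAIN_RATINGS[d] for d in shared_domains if d in DOMAIN_RATINGS]
    if not rated:
        return None                       # user shares no rated sources
    political = mean(r[0] for r in rated)
    factual = mean(r[1] for r in rated)
    return {"political": political, "factual": factual,
            "shares_low_factual": factual < 0.5}

print(classify_user(["rightnews.example", "mislead.example"]))
```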
-
To promote engagement, recommendation algorithms on platforms like YouTube increasingly personalize users’ feeds, limiting users’ exposure to diverse content and depriving them of opportunities to reflect on their interests compared to others’. In this work, we investigate how exchanging recommendations with strangers can help users discover new content and reflect. We tested this idea by developing OtherTube—a browser extension for YouTube that displays strangers’ personalized YouTube recommendations. OtherTube allows users to (i) create an anonymized profile for social comparison, (ii) share their recommended videos with others, and (iii) browse strangers’ YouTube recommendations. We conducted a 10-day-long user study (n = 41) followed by a post-study interview (n = 11). Our results reveal that users discovered and developed new interests from seeing OtherTube recommendations. We identified user and content characteristics that affect interaction and engagement with exchanged recommendations; for example, younger users interacted more with OtherTube, while the perceived irrelevance of some content discouraged users from watching certain videos. Users reflected on their interests as well as others’, recognizing similarities and differences. Our work shows promise for designs leveraging the exchange of personalized recommendations with strangers.
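A toy sketch of the recommendation-exchange idea behind OtherTube: pair anonymized profiles at random and hand each one a stranger's recommendation list. The data model and pairing logic are assumptions for illustration, not the extension's actual implementation.

```python
# Hedged sketch: swap recommendation lists between anonymized strangers.
import random
import uuid
from dataclasses import dataclass, field

@dataclass
class Profile:
    alias: str = field(default_factory=lambda: f"user-{uuid.uuid4().hex[:8]}")
    recommended_videos: list[str] = field(default_factory=list)

def exchange(profiles):
    """Pair profiles at random; return {alias: a stranger's recommendations}."""
    shuffled = profiles[:]
    random.shuffle(shuffled)
    swapped = {}
    for a, b in zip(shuffled, reversed(shuffled)):
        if a is not b:                    # skip self-pairing on odd counts
            swapped[a.alias] = b.recommended_videos
    return swapped

alice = Profile(recommended_videos=["cooking101", "jazz-live"])
bob = Profile(recommended_videos=["rock-climbing", "rust-tutorial"])
print(exchange([alice, bob]))
```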