<?xml-model href='http://www.tei-c.org/release/xml/tei/custom/schema/relaxng/tei_all.rng' schematypens='http://relaxng.org/ns/structure/1.0'?><TEI xmlns="http://www.tei-c.org/ns/1.0">
	<teiHeader>
		<fileDesc>
			<titleStmt><title level='a'>"Give Everybody [..] a Little Bit More Equity": Content Creator Perspectives and Responses to the Algorithmic Demonetization of Content Associated with Disadvantaged Groups</title></titleStmt>
			<publicationStmt>
				<publisher>Association for Computing Machinery (ACM)</publisher>
				<date>11/07/2022</date>
			</publicationStmt>
			<sourceDesc>
				<bibl> 
					<idno type="par_id">10391666</idno>
					<idno type="doi">10.1145/3555149</idno>
					<title level='j'>Proceedings of the ACM on Human-Computer Interaction</title>
<idno type="ISSN">2573-0142</idno>
<biblScope unit="volume">6</biblScope>
<biblScope unit="issue">CSCW2</biblScope>					

					<author>Sara Kingsley</author><author>Proteeti Sinha</author><author>Clara Wang</author><author>Motahhare Eslami</author><author>Jason I. Hong</author>
				</bibl>
			</sourceDesc>
		</fileDesc>
		<profileDesc>
			<abstract><ab><![CDATA[Algorithmic systems help manage the governance of digital platforms featuring user-generated content, including how money is distributed to creators from the profits a platform earns from advertising on this content. However, creators producing content about disadvantaged populations have reported that these kinds of systems are biased, having associated their content with prohibited or unsafe content, leading to what creators believed were error-prone decisions to demonetize their videos. Motivated by these reports, we present the results of 20 interviews with YouTube creators and a content analysis of videos, tweets, and news about demonetization cases to understand YouTubers' perceptions of demonetization affecting videos featuring disadvantaged or vulnerable populations, as well as creator responses to demonetization, and what kinds of tools and infrastructure support they desired. We found creators had concerns about YouTube's algorithmic system stereotyping content featuring vulnerable demographics in harmful ways, for example by labeling it "unsafe'' for children or families -- creators believed these demonetization errors led to a range of economic, social, and personal harms. To provide more context to these findings, we analyzed and report on the technique a few creators used to audit YouTube's algorithms to learn what could cause the demonetization of videos featuring LGBTQ people, culture and/or social issues. In response to the varying beliefs about the causes and harms of demonetization errors, we found our interviewees wanted more reliable information and statistics about demonetization cases and errors, more control over their content and advertising, and better economic security.]]></ab></abstract>
		</profileDesc>
	</teiHeader>
	<text><body xmlns="http://www.tei-c.org/ns/1.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xlink="http://www.w3.org/1999/xlink">
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1">INTRODUCTION</head><p>In online platforms, algorithms have a major role in managing economic opportunities, e.g. oering new ways of working online <ref type="bibr">[47,</ref><ref type="bibr">67]</ref>, providing goods and services in the sharing economy <ref type="bibr">[47]</ref>, and monetizing user-generated content in the creative economy <ref type="bibr">[15]</ref>. However, past work has discovered many examples of biases in these algorithms that can lead to economic harms, particularly towards demographics who have historically been marginalized and/or treated adversely in labor markets <ref type="bibr">[2,</ref><ref type="bibr">2,</ref><ref type="bibr">6,</ref><ref type="bibr">30,</ref><ref type="bibr">45,</ref><ref type="bibr">66]</ref>. For example, research has found digital platforms have shown job advertisements predominately to men <ref type="bibr">[45]</ref>, displayed ads for predatory housing loans more often to Black users <ref type="bibr">[6]</ref>, and monetized hate content <ref type="bibr">[55]</ref> while withholding income from creators who produced content about marginalized communities <ref type="bibr">[18]</ref>. In each of these cases, biases in algorithmic systems can aect the income earned by individuals and communities, exacerbating long-standing societal and economic harms.</p><p>On digital platforms, advertising on user-generated content generates billions of dollars for technology companies <ref type="bibr">[57,</ref><ref type="bibr">62]</ref>, and some platform companies pay a share of this income to users who are content creators <ref type="bibr">[37]</ref>. Many of these platforms use algorithmic systems to detect if content violates platform rules or community standards <ref type="bibr">[53]</ref> and then use the machine's predictions to determine whether to demonetize the content <ref type="bibr">[15]</ref>. 
However, news reports <ref type="bibr">[2,</ref><ref type="bibr">11,</ref><ref type="bibr">18,</ref><ref type="bibr">64]</ref> and lawsuits against technology companies <ref type="bibr">[19,</ref><ref type="bibr">40]</ref> have reported these algorithmic decisions can be error-prone, and disproportionately harm content creators who make videos featuring marginalized communities. These reports have particularly focused on cases involving creators who believed they were demonetized for producing LGBTQ content <ref type="foot">1</ref> and content about Black culture and communities <ref type="bibr">[2]</ref>.</p><p>Preceding research has looked at demonetization as a mechanism to punish creators for publishing prohibited content on YouTube, such as violent, hate, or pornographic content, and has explored YouTube creators' socioeconomic interactions with automated moderation systems <ref type="bibr">[53]</ref>. It has also been shown that marginalized groups might suffer from disproportionate removals and differing content moderation experiences on digital platforms <ref type="bibr">[30]</ref>. Disproportionately removing content featuring marginalized demographics puts an extra burden on users or content creators who make content associated with marginalized groups. However, little is known about how creators who earn money from making content deal with these demonetization practices, particularly when demonetization is considered discriminatory, or about potential ways to support creators who make content about marginalized communities.</p><p>In this paper, we investigate how content creators perceived and responded to what they believed were erroneous decisions to demonetize their videos due to biases in YouTube's algorithms, focusing on cases where content about demographics that have been historically economically disadvantaged had been demonetized. 
Toward this end, our paper contributes findings from interviews with 20 YouTube content creators and a content analysis of 30 videos and 580 tweets by content creators. Our investigation probed how creators perceived YouTube demonetization as disproportionately unfair, why they believed demonetization happened, how they responded, and what their needs were. In addition to their perspectives and responses to demonetization, we present the tools creators said would help them navigate and respond to wrongful determinations about their content. We examined one specific case of perceived demonetization bias in depth, namely the demonetization of YouTube videos featuring content associated with the LGBTQ community. Prior to our study, YouTube creators had sued YouTube for reported discrimination in the platform's automated decisions to demonetize LGBTQ content. In responding to reports of demonetization biases affecting LGBTQ content, several YouTube creators had also coordinated together and built auditing tools and workflows to systematically identify factors triggering demonetization of LGBTQ content. 
We conducted a walkthrough analysis of these creators' auditing materials and tools, and describe their audit workflow and findings in this paper, though we did not try to replicate the audit and did not verify the results.</p><p>In summary, this paper reports findings about creators' perspectives, responses, and needs with respect to YouTube's demonetization algorithm, specifically contributing the following:</p><p>&#8226; A description of harms stemming from perceived errors in automated decisions to demonetize content featuring or associated with historically disadvantaged communities, and how content creators coped and responded; &#8226; A description of tools and infrastructure creators reported would help them manage demonetization; &#8226; An elaboration on the implications of creators' perspectives, responses and needs for the creative economy and future of work, as well as for the future of crowd auditing.</p></div>
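The creator-led audit described above reportedly varied video metadata to learn which factors trigger demonetization. As a rough illustration only (this is not the creators' actual tooling, and every title, word, and outcome below is hypothetical), a single-variable keyword audit can be sketched as: generate titles differing by exactly one word, record each video's monetization status, and tally which words co-occur with demonetization.

```python
def build_audit_cases(base_title, candidate_words):
    """Generate single-variable test cases: one video title per candidate
    word, each differing from the neutral base title by exactly one word."""
    return {word: f"{base_title} {word}" for word in candidate_words}

def tally_outcomes(outcomes):
    """Given {word: 'monetized' | 'demonetized'} results recorded after
    uploading each test video, list the words that co-occurred with
    demonetization."""
    return sorted(w for w, status in outcomes.items() if status == "demonetized")

# Hypothetical run: words and outcomes are illustrative, not observed data.
cases = build_audit_cases("my daily vlog about being", ["happy", "gay", "tall"])
observed = {"happy": "monetized", "gay": "demonetized", "tall": "monetized"}
print(tally_outcomes(observed))  # -> ['gay']
```

Holding all metadata fixed except one word is what makes the audit's inference plausible: any change in monetization status can then be attributed to that word rather than to other video features.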
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2">RELATED WORKS</head><p>We have organized related work into (1) research on user content moderation regimes, particularly algorithmic system ones;</p><p>(2) research on algorithmic management of work versus content creator compensation; and, (3) research on algorithmic bias and economic opportunity, including policy considerations in addition to audits to detect disparate impacts and harms to marginalized communities.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.1">Content Moderation Regimes</head><p>While content moderation on some platforms is manual, other platforms make use of automated algorithms. The goal of these algorithms is to moderate everyday user content by removing unsafe, repugnant, fraudulent, and violent content from platforms, which in many ways is similar to YouTube's demonetization algorithm that we examine in this paper. These user content moderation algorithms are typically "black box" <ref type="bibr">[60]</ref> proprietary algorithms developed and maintained by technology companies, and the internals of their workings is not public knowledge. In the literature, there are at least three main automated content moderation regimes informing our work, namely mostly-algorithmic system <ref type="bibr">[61]</ref>, human-in-the-loop <ref type="bibr">[5,</ref><ref type="bibr">17]</ref>, and economicincentive-in-the-loop systems <ref type="bibr">[53]</ref>. A mostly-algorithmic system content moderation regime is a system where an algorithm detects, classies, and decides what to do with user content, typically with no human assistance <ref type="bibr">[61]</ref>, but sometimes with a human review built in to manage any conicts. A human-in-the-loop content moderation regime typically has human moderators who review, classify, and make content decisions, especially for hard to judge content requiring human subject matter expertise <ref type="bibr">[5,</ref><ref type="bibr">17]</ref>. Finally, an economic-incentive-in-the-loop uses an algorithm to manage compensation for content, thus subtly inuencing content creators to not publish prohibited content <ref type="bibr">[53]</ref>, with YouTube's algorithms as an example. Our work examines YouTube's system with a dierent lens, focusing on YouTube's demonetization algorithm and treating its mostlyalgorithmic system content moderation as a separate and distinct system. 
Our work also differs in examining the perceptions and harms of demonetization specifically on marginalized demographics, rather than creators in general.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.2">Algorithmic Management of Work and Compensation</head><p>Algorithms manage work and because of this they have the authority to grant or deny compensation to platform workers <ref type="bibr">[14,</ref><ref type="bibr">48]</ref>. For example, algorithms manage the schedule and routes of Uber drivers, as well as their pay rates, such as through the platform's surge pricing algorithm <ref type="bibr">[48]</ref>. Similarly, demonetization algorithms judge the nature of content and use these classications to decide whether to compensate content creators <ref type="bibr">[15,</ref><ref type="bibr">53]</ref>.</p><p>Focusing specically on content creators, some creators earn income for their work and even depend on this income for their livelihood <ref type="bibr">[15]</ref>. These creators are generally compensated through either the platform they are promoting content on or through third-party websites like Ko-or Patreon. One example of platform compensation is YouTube's Partner Program, which shares ad revenue with content creators <ref type="bibr">[15]</ref>. These platforms are "increasingly reliant on automated systems of demand prediction and evaluation" to quantify performance <ref type="bibr">[20]</ref>, and so creators are always needing to "game the system" <ref type="bibr">[20]</ref> to keep their content advertiser-friendly and gain ad revenue <ref type="bibr">[15]</ref>.</p><p>The dependence on algorithms for compensating user-generated content brings with it the potential for many kinds of economic harms. Firstly, creators' earnings are dependent on a constantly changing algorithm. Due to a lack of transparency about the algorithms governing content moderation and promotion, creators must spend extra time to keep up to date with the algorithm rules <ref type="bibr">[20]</ref>. 
Content creators have also expressed concerns about lost financial opportunities as they try to reverse content moderation decisions or punishments. For example, previous literature discussed creators' shared experiences of losing ad revenue on YouTube due to demonetization and the manual effort required to reverse these algorithmic punishments. Since demonetization and the appeals process generally occur during the first 24-48 hours after upload, when the most views are accumulated, the impact is especially severe <ref type="bibr">[15]</ref>. Previous work has concluded that the instability of platform features and the algorithm, coupled with the opaqueness of the algorithm, leads to precarious financial security for creators <ref type="bibr">[53]</ref>.</p><p>While all content creators might grapple with the consequences of algorithmic demonetization, YouTube's "tiered governance", where YouTube gives "different users different sets of rules, different material resources and opportunities, and different procedural protections when content is demonetized", can result in differences in treatment between content creators <ref type="bibr">[15]</ref>. Such differences can bring extra obstacles for creators who belong to and/or generate content relevant to marginalized communities. For example, past work has also shown that marginalized communities (political conservatives, transgender people, and Black people) experience content and account removals more often than others <ref type="bibr">[30]</ref>. Our work builds on these past findings, and includes descriptions of the problems that creators face (with a special focus on the LGBTQ and Black communities), as well as potential solutions or support suggested by creators.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.3">Algorithmic Bias and Equal Economic Opportunity</head><p>Researchers have audited a range of algorithms for bias <ref type="bibr">[1,</ref><ref type="bibr">4,</ref><ref type="bibr">6,</ref><ref type="bibr">13,</ref><ref type="bibr">45,</ref><ref type="bibr">66,</ref><ref type="bibr">69]</ref>, such as search engine advertising <ref type="bibr">[69]</ref>, targeted advertisements on digital platforms <ref type="bibr">[45]</ref>. Some of these audits have also examined how these biases could harm economic opportunities. In this paper, we do not conduct an algorithm audit, but instead present how content creators perceived algorithm bias in YouTube's demonetization algorithm and the economic harms<ref type="foot">foot_3</ref> those creators experienced. We also document creators' responses to demonetization, one of which is a user-driven audit <ref type="bibr">[68]</ref> of what keywords lead to demonetization.</p><p>"Give everybody a lile bit more equity" 424:5</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.3.1">Anti-Discrimination Policy for Content Biases Harming Workers and Economic Opportunities.</head><p>In the literature, biased decisions about economic opportunities (e.g. employment, housing, nancial credit, education) have been typically framed as an outcome resulting from stereotypes, representation issues or prejudice against members of legally protected or marginalized demographics in society <ref type="bibr">[41,</ref><ref type="bibr">45]</ref> -not as biased decisions against the demographic(s) featured in digital art, content, or media. In the creator economy, those who have made content featuring demographics covered by anti-discrimination protections have reported their content was demonetized because of the demographics of the content <ref type="bibr">[3,</ref><ref type="bibr">18]</ref>, not only because of the demographics of the creator. Our research contributes YouTube creator perspectives on the causes, their responses and ideas for resolutions to the harms these biases against demographic content inict on creators. In the discussion section, we recommend future work, especially policy scholarship, that could clarify what, if any, legal protections content makers have when their content is economically damaged (e.g. demonetized) allegedly because the content features a legally protected or marginalized demographic.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3">BACKGROUND INFORMATION ON YOUTUBE</head><p>YouTube is the world's largest video-sharing platform, where everyday producers of videos ("content creators") can upload and share videos <ref type="bibr">[8]</ref>. YouTube earns revenues from creators' labor by placing advertisements on their videos <ref type="bibr">[44]</ref>. In 2019, YouTube reportedly made $15.1 billion from video advertising <ref type="bibr">[44]</ref>. YouTube shares a portion of these revenues with qualied YouTube creators. When a creator's video earns income directly from YouTube advertising, it is a so-called monetized video <ref type="bibr">[15]</ref>. In August 2021, YouTube stated there were more than 2 million YouTube creators eligible to monetize their videos <ref type="bibr">[52]</ref>. Fortune Magazine reported YouTube has paid these creators more than $30 billion dollars over the past three years <ref type="bibr">[12]</ref>, though the distribution of this monetization income is reportedly skewed <ref type="bibr">[8]</ref>.</p><p>Monetization describes the mechanisms by which creators earn money from advertisements placed on their content, and from consumer spending on a digital platform where they have published their content <ref type="bibr">[8,</ref><ref type="bibr">15,</ref><ref type="bibr">44]</ref>. For the purposes of this paper, direct monetization refers to when YouTube pays creators some of the ad revenues generated from displaying ads on their video content <ref type="bibr">[8,</ref><ref type="bibr">15]</ref>. In contrast, 3rd party monetization refers to when content creators earn income from parties other than YouTube, such as from viewers (e.g. 
via Patreon), corporate brands or affiliates who pay creators to advertise their products <ref type="bibr">[25]</ref>, and multi-channel networks <ref type="bibr">[8]</ref>.</p><p>Demonetization refers to when YouTube's algorithm removes all or some monetization features from a creator's video, effectively minimizing or withholding the income they would otherwise earn from advertising <ref type="bibr">[15,</ref><ref type="bibr">53]</ref>. When YouTube's algorithm has demonetized content, creators are (1) not paid income for any ads displayed on their content, or (2) paid reduced income because YouTube restricted the number of ads displayed on their content <ref type="bibr">[15,</ref><ref type="bibr">53]</ref>.</p><p>Our work focuses on how creators perceived and responded to what they believed were biases in YouTube's algorithms, which impacted the direct monetization of their content. Many creators have reported YouTube's algorithm has demonetized content about the LGBTQ community, even though the content reportedly complied with YouTube's policies and community standards <ref type="bibr">[3,</ref><ref type="bibr">11,</ref><ref type="bibr">16,</ref><ref type="bibr">18,</ref><ref type="bibr">64]</ref>. YouTube creators have also alleged content produced by Black creators and/or about Black communities has been demonetized due to algorithmic bias <ref type="bibr">[2,</ref><ref type="bibr">58]</ref>. Our work contributes to this small yet growing body of literature, first, by documenting creator responses to demonetization of content associated with marginalized communities, and second, by enumerating the socioeconomic harms of biased (de)monetization practices.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4">METHODOLOGY</head><p>We describe our method for our content analysis of social media and news, as well as survey and interview data, which we collected and analyzed to understand the perceptions, responses, and needs of content creators who believed they had experienced or witnessed error-prone or biased demonetization practices on YouTube. Before collecting Twitter and YouTube data, we made developer accounts and were granted approval from the platforms to use the accounts for academic research or data collection generally. Our interview study and use of public social media data was reviewed and granted approval from our Institutional Review Board.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.1">Social Media and News Content Analysis of YouTube LGBTQ Content Demonetization Cases</head><p>Since we were interested to learn more about demonetization cases associated with marginalized communities, we collected data from Twitter, YouTube, and news outlets. We collected data from these sources because YouTube creators have used social media and news channels to communicate with each other and respond to decisions made by YouTube's (de)monetization algorithm <ref type="bibr">[15]</ref>. In addition, we collected and analyzed materials from a creator-led audit of YouTube, which sought to identify factors that triggered the platform's algorithm to demonetize LGBTQ content. In this section we describe our data collection and analysis approach for this research.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.1.1">Data Collection:</head><p>First, to retrieve the Twitter data about the LGBTQ YouTube content demonetization case, we made requests to the Twitter API using keywords (Appendix A.1 lists the keywords used for these requests). <ref type="foot">3</ref> We retrieved roughly 50,000 tweets, and then open coded and analyzed a subset of 580 tweets about demonetization to get a broader sense of user and creator responses and reports of the practice on YouTube (a more detailed description of the coding process is given in 4.1.2). These 580 tweets were liked and retweeted over 4,59,414 and 76,947 times respectively, and published to Twitter between September 2016 and November 2020.</p><p>The average number of likes on this set of tweets was 792, with a standard deviation of 8949.98, indicating that there were several tweets with a very high like count and also several with extremely low number of likes on it. The average number of retweets for each tweet was about 132, with a standard deviation of 1777.14, indicating that some tweets got retweeted much more than others. In total, these tweets were quoted 40,330 times, with each tweet being quote-tweeted an average of 69 times. The standard deviation for the number of quote tweets was 283, again indicating a wide variance in our selection of tweets. In total, there were 76,721 replies to the tweets in our data-set, with each tweet having an average of 132 replies , with a standard deviation of 502.65. To our knowledge, September 2016 was approximately when content creators began reporting to each other on social media and via news channels that YouTube's algorithm was demonetizing LGBTQ content <ref type="bibr">[11]</ref>. We selected these tweets randomly but from a chronologically ordered subset of tweets, in which the subset was partitioned from tweets published prior to September 2016. 
We found tweets posted before September 2016 were not germane to our research questions. Of the 580 tweets analyzed, only 176 discussed the LGBTQ content demonetization case specifically. The small number of LGBTQ demonetization tweets could reflect a limitation of our study, namely that the keywords we used to request tweets from Twitter's API did not cause the API to return every tweet relevant to our research (see below for more details about this and additional limitations).</p><p>Second, we collected and analyzed the transcripts of 29 YouTube videos about LGBTQ content demonetization, published between 2017 and 2020 by 25 unique content creators. Of these creators, 10 talked about economic harms (40%), 18 talked about user auditing (72%), 22 talked about making sense of the algorithms behind demonetization (88%), and 23 talked about collective support (92%). Following up on harms reported during interviews with creators, we also subsequently analyzed 4 videos about harms impacting Black creators on YouTube. These videos discussed creators' experiences and the impacts of bias, particularly against LGBTQ and Black creators, including disparities in demonetization, visibility <ref type="bibr">[26]</ref> and discoverability on the platform, as well as other metrics, such as subscriber growth. To collect the transcripts, we manually searched for YouTube videos about LGBTQ demonetization (Appendix A.1 lists the keywords used for these requests), and then copied the transcripts from the YouTube platform into a spreadsheet. We used the same method to collect the videos about demonetization of Black creators, using different keywords (please see A.1 for a list of these keywords). For the YouTube transcript analysis, we used the same annotation process as for the Twitter data, e.g. 
grounded theory, although the annotation codes used for the YouTube transcripts were produced separately from those for the Twitter data (a more detailed description of the coding process is given in 4.1.2 below).</p><p>Third, following a similar but independent procedure, we collected 10 news articles published between 2018 and 2020 by outlets such as The Washington Post<ref type="foot">foot_6</ref>, Vox<ref type="foot">foot_7</ref>, CNN <ref type="foot">6</ref>, Rolling Stone<ref type="foot">foot_9</ref> and Business Insider<ref type="foot">foot_10</ref>. The articles were gathered by conducting an Internet search query for "LGBTQ YouTube demonetization" and from related search queries on a proprietary news database, LexisNexis. <ref type="foot">9</ref> Next, we describe our procedure of open coding <ref type="bibr">[35]</ref> and analyzing this news and the previously described social media content.</p></div>
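The engagement statistics reported above, where standard deviations dwarf the means, indicate heavy-tailed engagement distributions: a few viral tweets sit among many low-engagement ones. As a minimal sketch (using toy values, not the study's data), such descriptive statistics can be computed with Python's standard library:

```python
import statistics

def engagement_summary(counts):
    """Total, mean, and sample standard deviation for a list of per-tweet
    engagement counts (likes, retweets, quote tweets, or replies)."""
    return {
        "total": sum(counts),
        "mean": statistics.mean(counts),
        "stdev": statistics.stdev(counts),  # sample (n-1) standard deviation
    }

# Toy data: one viral tweet among low-engagement tweets yields a standard
# deviation far larger than the mean, the pattern reported for our tweet set.
likes = [3, 1, 0, 5, 2, 4, 12000]
summary = engagement_summary(likes)
print(summary["stdev"] > summary["mean"])  # True
```

When the standard deviation exceeds the mean this strongly, the mean alone is a poor summary of a "typical" tweet, which is why the text reports both statistics together.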
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.1.2">alitative Analysis:</head><p>For our content analysis (e.g. Twitter, YouTube, News), we used Grounded Theory, open coding each data set in an iterative fashion, making and then rening the annotations we applied to the social media and news data <ref type="bibr">[35,</ref><ref type="bibr">59]</ref>. More specically, three authors annotated a set of tweets, YouTube transcripts, or news articles. The procedure entailed, rst, deciding on an initial set of codes to use, and then, second, coding each data set independently, marking salient themes to represent the categories of information observed in the data with our labels. In our rst round of coding, we annotated each data-set separately to produce distinct annotation codes for each data set. Once this round of coding was completed, we met as a team and discussed the data patterns and themes observed, and then decided on a rened set of codes to apply to the data. One author then returned to and annotated the data sets with these rened codes. Since one author independently labeled each data set with these codes, we did not measure inter-rater reliability because the statistic is used to measure agreement between multiple coders <ref type="bibr">[59]</ref>. Two authors reviewed and discussed these nal annotations to check if the labels had given common meaning. Once each data set was coded and reviewed, we met to discuss the annotations once more, then wrote memos to categorize the codes and their associated data under salient themes describing the content. In the Appendix of this paper, we included our nal code book for the Twitter, YouTube and news data-sets.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.1.3">Limitations:</head><p>One limitation of our methods was using keywords associated with LGBTQ demonetization to discover and retrieve data from platforms (e.g. Twitter, YouTube, LexisNexis). It is possible the keywords were insucient for collecting representative data samples. Another limitation was that we only analyzed a sample of the collected tweets. Our goal was not to generate a statistical representation of LGBTQ demonetization tweets but rather to learn more about the range of perceptions and responses to demonetization by creators associated with marginalized demographics.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.2">Interviews with YouTube Content Creators</head><p>We conducted semi-structured interviews with 20 YouTube content creators to learn about (de)monetization, particularly how they made sense of and responded to YouTube's automated content decisions. We sought to learn of the impact demonetization has for YouTube creators, as well as how to improve the ways in which creative labor is rewarded and what tools could help content producers address demonetization bias.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.2.1">Recruiting:</head><p>We recruited participants by posting information about our study to Twitter, Discord, Reddit, and a project website hosted on Github. <ref type="foot">10</ref> Our eligibility criteria required study participants to be 18 years or older, a current or former YouTube creator, and someone who has knowledge of content demonetization.</p><p>Between Aug 2, 2021 and Sep 6, 2021, a total of 66 persons completed the screening survey, with 60 reporting they were currently or formerly YouTube content creators. Nearly half (28) had created YouTube content for 3 to 5 years, while 16 had produced YouTube content for more than 5 years, 13 for 1 to 2 years, and 5 for less than 1 year. Of eligible screening survey respondents, all 37 were invited for 1-hour interview session, and 20 completed the recorded session. The interviews took place online via Zoom, and participants were compensated with a $35 USD gift card for their time.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.2.2">Data Protection Measures:</head><p>Zoom automatically produces text transcripts when meetings are recorded; we used the transcription feature to produce text data from the interviews. We also redacted personal data and personally identiable information (PII) from interview transcripts to protect participants' privacy. Interview recordings were stored in an encrypted, password and 2-factor authentication protected institutional cloud account.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>4.2.3</head><p>Interview estions and alitative Analysis: Our analysis followed a similar method as in section 4.1.2. After redacting personal data from the interview transcripts, we uploaded them to Taguette, an annotation software program for qualitative coding. <ref type="foot">11</ref> Three authors independently coded an initial subset of the transcripts, adding annotations to describe patterns that emerged, such as beliefs creators held about how (de)monetization works on YouTube. The authors then met to discuss and rene the codes. A nal set of codes were agreed upon, and two authors used these rened codes to annotate the remaining transcripts. With the annotation transcripts, one author wrote memos to characterize data patterns, organizing the codes and associated information together and describing their themes. As with the content analysis, we used these memos to inform our results about our main research questions. In the Appendix of this paper, we included our nal code book for the interview transcript data.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.3">Demographic, Socioeconomic and Monetization Survey</head><p>Each interview participant completed a brief survey asking about their demographic and socioeconomic information, in addition to questions related to monetization. There were a total of 21 survey respondents, including one respondent who completed the survey but not the interview; we included this person's survey responses.</p><p>4.3.1 Demographics, Socioeconomic and Geographic Information: Gender: Our survey respondents were mostly men (18), with 2 women and 1 non-binary person. Race: Survey respondents reported their race as: Black (7); White (10); American Indian or Alaskan Native (2); Asian (1); and White and Asian (1).</p><p>Age: Respondents were relatively young: 6 were ages 18 to 24 years; 12 were 25 to 34 years; and 2 were 45 to 54 years old. Sexual Orientation: Survey respondents reported their sexual orientation as: bisexual (5); heterosexual or straight (15); and 1 respondent self-described as "mostly heterosexual." Geographic location: Our survey respondents reported currently living in Belgium, Malaysia, Pakistan, the USA, and the UK. A number of our interview respondents also reported living outside of the US, Europe, and Asia. <ref type="foot">12</ref> Education and Occupation: Respondents were highly educated: 1 had a high school diploma or equivalent; 6 had some college; 2 had an Associate degree; 9 had a bachelor's degree; 1 had some graduate school; and 2 had Master's or professional degrees. Respondents held a number of occupations: eight (8) described their occupation as "Professional YouTuber", "YouTube content creator", "video producer", or "content creator" of some type. Three (3) held other tech worker occupations or work(ed) in the tech industry. 
The remainder held occupations in finance, marketing, sales, education, and community services. A few (2) respondents were Social Security Disability Insurance (SSDI) recipients. Two (2) were currently unemployed and either planning to enroll in university (1) or engage in a job search soon (1). One (1) respondent reported their occupation as freelancer. Our survey respondents reported annual household incomes (before taxes) ranging from under the poverty line to the upper-middle class or above for the United States. <ref type="foot">13</ref><ref type="foot">14</ref> 4.3.2 Content Production and Monetization History. Of the 21 survey respondents, five (5) had created YouTube content for more than 5 years; ten (10) for 3 to 5 years; five (5) for 1 to 2 years; and one (1) for less than a year. A total of 9 respondents reported they were YouTube Partner Program (YPP) members. <ref type="foot">15</ref> There was a gap between those saying they had earned money from making YouTube content and those saying they were YPP members, perhaps because 3 respondents said they were not directly compensated by YouTube at this time but were previously. In total, 17 respondents said they had earned money for creating YouTube content. Of these, 12 reported they were paid directly by YouTube; 8 were paid by a brand sponsor; 11 were paid by viewers or subscribers via 3rd parties (e.g. Patreon, CashApp, Buy Me a Coffee); and 2 preferred not to report. In an open-text response in the survey, one creator also reported they were paid from "streaming revenue from cross-posting the same videos to another steaming services (Nebula)" (Respondent 5). Another reported they earned income from "sales from merchandise" (Respondent 18). Respondent 21 reported they had, "collaborated with a friend online for their music video, and they paid me through PayPal. 
" Of those voluntarily responding to the question, 17 respondents in total reported how much income they earned from creating YouTube content in the preceding 12 months from any source, e.g. via YouTube advertising or 3rd parties, and 1 respondent said they preferred not to disclose. In the previous 12 months, of these 17 creators, five (5) made less than $5,000; four (4) between $5,000 and $14,999; two (2) between $15,000 and $29,999; two (2) between $30,000 and $49,999; two (2) between $50,000 and $74,999; and finally, two (2) made between $100,000 and $149,999.</p><note type="foot" n="12">Not all of our survey participants who reported living in the United States currently live in the United States. We have reported the locations participants gave in the survey, even for those who reported living elsewhere during the interviews. While we are not sure of the reason for the discrepancy, due to safety and privacy concerns, especially for those who belong to marginalized communities, we decided to report the survey locations.</note><note type="foot" n="13"><ref type="url">https://money.usnews.com/money/personal-finance/family-finance/articles/where-do-i-fall-in-the-american-economic-class-system</ref></note><note type="foot" n="14">We did not collect information about the number of persons living in each YouTube creator's household, and therefore we cannot offer an exact comparison of their reported annual income to established definitions of 'middle class income' in the United States, such as that defined by Pew Research Center: <ref type="url">https://www.pewresearch.org/fact-tank/2020/07/23/are-you-in-the-american-middle-class/</ref></note><note type="foot" n="15"><ref type="url">https://www.youtube.com/creators/how-things-work/video-monetization</ref></note></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.3.3">Interviewee Attributes.</head><p>To contextualize the results reported in section 6, Table <ref type="table">1</ref>, below, lists demographic and content-related attributes of the YouTube creators we interviewed. [*] In Table <ref type="table">1</ref>, the codes D, G, V, CD, A, MI, H, P, M and U indicate the reason why the creator believed their content was demonetized. D = demographics, G = gore, V = violence, CD = copyright dispute, A = adult content, MI = misinformation, P = profanity, M = mortality/death, and U = creator felt the reason was unknown. The symbol "!=$" means the creator was not yet eligible for direct monetization with YouTube and reported not yet being demonetized for this reason, even if they had made indirect money and/or had experienced social harms affecting their content (e.g. visibility, discoverability) or indirect earnings.</p><p>[**] In Table <ref type="table">1</ref>, the reported content income was for the prior 12 months, in U.S. dollars, and from any source, meaning direct monetization from YouTube advertising and/or third-party sources, such as viewer subscriptions on Patreon.com. [***] Participant 19 reported an LGBTQ Pride video they produced was not demonetized, but comments on the video containing LGBTQ keywords were automatically placed in their YouTube account's spam message folder; P19 emphasized they thought this was "weird" since comments on the same video without LGBTQ keywords were not automatically labeled spam.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5">YOUTUBE DEMONETIZATION AUDIT ORGANIZED BY CONTENT CREATORS</head><p>In this section, to provide background needed to contextualize our findings for readers, we describe a creator-led audit of YouTube's demonetization system, undertaken by content creators to learn why LGBTQ content was being demonetized on YouTube even though creators reported the demonetized content complied with YouTube's policies. We detail their auditing approach and observations, though we did not replicate the audit or verify its findings.</p><p>To begin, three YouTube creators designed A/B video tests to assess whether a popular demonetization theory was true, namely whether including LGBTQ keywords in video titles triggered demonetization. These creators used a publicly visible Google Spreadsheet to coordinate work, and documented their results in a public Google doc. <ref type="foot">16</ref> First, these creators extracted 100 video titles from the top search results for [the keywords] gay, lesbian, and LGBTQ. Second, they reviewed these videos to check if they complied with YouTube's content policies and standards. They then replaced the LGBTQ keywords in the titles of these videos, swapping them with alternative words such as "friend" and "happy." Third, they uploaded the original (i.e. control) set of 100 videos that had LGBTQ keywords in the titles to test if YouTube's algorithm would demonetize these videos; in the control set, 33 of the videos were demonetized. Finally, they uploaded the test set of videos, identical in every respect to the control videos except they did not have LGBTQ keywords in their titles. The creators found the 33 videos demonetized in the control run were not demonetized when the LGBTQ words were replaced in the test set.</p><p>This audit also examined the effect of individual words in the title of a video, and A/B testing resulted in a list of approximately 900 demonetizing words. 
They found words like "gay" or "LGBT" were demonetized, but if these keywords were switched to "friend" or "happy," all else equal, the video would no longer be demonetized. These results offered some evidence supporting creators' main hypothesis about LGBTQ demonetization, namely that LGBTQ keywords like "lesbian" and "gay" led YouTube's algorithm to demonetize videos. The creators wrote a report detailing their approach, as well as the outcomes of their investigation of YouTube's demonetization algorithm. <ref type="foot">17</ref> In this report, they suggested another, untested hypothesis: that YouTube's algorithm mistook LGBTQ content for unsafe content because of "noise in YouTube's data" from videos shared on the platform about mass shooting events targeting the LGBTQ community, such as the Pulse Night Club shooting. <ref type="foot">18</ref> The creators expressed the belief that content about hate crimes against the LGBTQ community would continue to harm monetization of safe or policy-compliant content about the LGBTQ community. In the next section of this paper, we report additional perspectives of creators on reasons why they believed content featuring disadvantaged or marginalized demographics was demonetized.</p><p>Ultimately, the creator-led audit of YouTube's algorithmic system for demonetization decisions suggested the system made errors when classifying whether LGBTQ content was safe for advertisers and viewers or eligible for monetization. While the audit was designed to identify the source of demonetization of LGBTQ content, among other content, it was not necessarily designed to unearth the views, responses or needs of content creators who had witnessed or experienced demonetization. In the remainder of our paper, we report on YouTube creator beliefs about and reactions to demonetization practices, and the tools and infrastructure they said would help content creators navigate YouTube's demonetization machine.</p><note type="foot" n="16">See the auditors' Demonetization Report at: <ref type="url">https://docs.google.com/document/d/18B-X77K72PUCNIV3tGonzeNKNkegFLWuLxQ_evhF3AY/edit</ref> The goal of the user-driven audit was to identify factors triggering demonetization <ref type="bibr">[11]</ref>.</note></div>
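To make the keyword-swap procedure concrete, the following Python sketch is our own illustration of the creators' A/B test logic, not their actual tooling. Everything here is a hypothetical stand-in: the keyword list, the swap table, and especially `toy_classifier`, since YouTube exposes monetization status only through its creator interfaces, not through any API used below.

```python
# Illustrative sketch of the keyword-swap A/B test described above:
# take policy-compliant titles containing LGBTQ keywords (control),
# produce copies with those keywords swapped for neutral words (test),
# and record which titles are demonetized in one run but not the other.

LGBTQ_KEYWORDS = {"gay", "lesbian", "lgbtq"}
NEUTRAL_SWAPS = {"gay": "happy", "lesbian": "friend", "lgbtq": "friend"}

def swap_keywords(title: str) -> str:
    """Replace LGBTQ keywords in a title with neutral alternatives."""
    words = []
    for word in title.split():
        words.append(NEUTRAL_SWAPS.get(word.lower(), word))
    return " ".join(words)

def ab_test(titles, is_demonetized):
    """Return titles demonetized with the original (control) title
    but not with the keyword-swapped (test) title."""
    flipped = []
    for title in titles:
        control = is_demonetized(title)              # original title
        test = is_demonetized(swap_keywords(title))  # keywords swapped out
        if control and not test:
            flipped.append(title)
    return flipped

# Toy stand-in classifier for demonstration only: it flags any title
# containing an LGBTQ keyword, mimicking the creators' hypothesis.
def toy_classifier(title):
    return any(w in LGBTQ_KEYWORDS for w in title.lower().split())

print(ab_test(["My gay wedding vlog", "Baking bread at home"], toy_classifier))
# → ['My gay wedding vlog']
```

Under these assumptions, a title that flips from demonetized to monetized when only its keywords change is evidence that the keywords, rather than the video content, triggered the decision, which is the inference the creators drew from their 33 flipped control videos.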
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6">FINDINGS</head><p>We have organized our findings into four parts: (1) creator perspectives about types of demonetization bias occurring on YouTube, (2) perspectives on the kinds of harms the practice has inflicted on creators, (3) creator responses to demonetization, and (4) tools and infrastructure creators reported would improve their ability to respond to demonetization and help meet their economic needs. Below, OC refers to "online content", with OCT# specifying the data source was Twitter, OCY# meaning the data was from a YouTube video, and P# indicating an interview with a content creator.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.1">Perspectives on the Types of Demonetization Bias</head><p>We found three common narratives people believed as to why content was demonetized by YouTube's algorithm: "stereotyping", "copyright disputes", and "devaluing viewer audiences." Since creators more often reported harms to marginalized communities from "stereotyping," we focused more on this narrative.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.1.1">Stereotyping:</head><p>Creators felt YouTube's algorithm was biased, often misclassifying content associated with marginalized demographics and labeling the content as unsafe or harmful to advertisers and children or families, which consequently demonetized those videos. We characterized this demonetization bias as "stereotyping" because, first, creators felt the algorithmic systems fed damaging stereotypes about marginalized demographics, and second, because associating the content of marginalized demographics with harmful content is a harmful representational bias <ref type="bibr">[74]</ref> (please see section 6.2 for more detailed information on demonetization harms).</p><p>To explain the algorithmic bias against these communities, creators proposed a number of "folk" theories <ref type="bibr">[48,</ref><ref type="bibr">73]</ref> -the theories that users of a system develop to make sense of it. First, for LGBTQ content demonetization cases, we found the most commonly held folk theory was that YouTube's algorithm demonetized videos containing LGBTQ keywords. "It may be because there was a video that I may have used a sensitive words like [...] gay." (P7). More specifically, creators believed YouTube's algorithm classified videos with LGBTQ keywords as if they were too sexualized or pornographic: "LGBTQ content was getting demonetized just because YouTube was viewing it as like sexually explicit content." (P2). Notably, from our interviews, we found creators who said they belonged to the LGBTQ community reported they were not demonetized because they did not make LGBTQ content (partly to avoid demonetization harms). 
Hence, some creators believed YouTube demonetized content if it was associated with the LGBTQ community, but not necessarily LGBTQ creators.</p><p>Second, in addition to LGBTQ content, we found creators thought content from Black communities was labeled too controversial for advertisers by YouTube's algorithm and then demonetized. We found some creators linked YouTube's content policies to how they believed YouTube's demonetization algorithm worked, suggesting not only was the technology biased but also the platform's policy. For example, in one video, a creator explained:</p><p>"YouTube [created] this new [monetization] policy because big advertisers that partner with YouTube [...] Big brands [...] [do] not want their ads associated with a video or placed on the video that is controversial, so controversial [to YouTube] could be like Black Lives Matter or anything about [the] LGBTQ community." (OCY24)</p><p>Our interviews and content analysis showed creators reporting that a variety of content associated with Black communities was demonetized by YouTube, from Black Lives Matter videos to videos commenting on Black culture. In contrast to the main creator theory about LGBTQ content demonetization, though, we found creators believed Black creators were also demonetized because of their identity, e.g. race, as well as because they believed YouTube's algorithm associated their content with marginalized demographics. One creator said of their experience creating content for YouTube: "As a Black [person] [...] and as a mini content creator I've faced a lot of discrimination" (P8).</p><p>Another common but more general sentiment as to why the YouTube algorithm flagged marginalized community content was that it learned from human biases, in particular those of YouTube viewers and advertisers. 
More specifically, creators believed advertisers had a preference for not displaying their ads on content related to marginalized communities, resulting in YouTube's algorithm adopting these preferences and associating marginalized demographic content with being "not advertiser-friendly" or "not safe." In sum, their theory suggested demonetization bias resulted from YouTube's algorithm learning discriminatory tastes. <ref type="foot">19</ref> In their interview, P17 said: "Maybe some of [the bias] is on viewers side, maybe some of it isn't [and] is [...] entirely on YouTube. But I do feel like there is a difference there [...] I do feel like if I was covering a romantic comedy [about a heterosexual couple] [...] I feel like maybe I would be a little bit less careful than I would be for covering an LGBT movie." (P17) 6.1.2 Copyright Disputes. Another common narrative related to copyright disputes. As this perceived algorithm error does not necessarily affect marginalized demographics specifically, we keep this subsection short and include it for the sake of completeness.</p><p>YouTube has an algorithm called "Content ID" <ref type="bibr">[38,</ref><ref type="bibr">71,</ref><ref type="bibr">77]</ref> to detect if videos improperly use copyrighted material. However, several creators felt that it often incorrectly flagged and demonetized what they considered fair use of copyrighted material in their videos, and that these errors facilitated an ecosystem of erroneous claims, with major studios and corporations being paid the income the creators would have otherwise made from their content. Creators also felt that Content ID <ref type="bibr">[77]</ref> only helped protect large corporations, and that they had no similar tool to protect their own original content and thus their monetization income ("YouTube provides no protections or no content ID for youtubers" [P14]). 
Further, creators felt that their own content was regularly ripped off: "In terms of just like other people ripping your content and uploading it separately that happens all the time to people with any kind of size and success on the platform" (P14).</p><p>Creators also felt that YouTube's Content ID system <ref type="bibr">[77]</ref> could be abused to retaliate against creators for reporting discrimination, hate, or bias targeting marginalized demographics. For example, one creator said: "[One] movie I reviewed, I did [state] in my <ref type="bibr">[video]</ref> commentaries [I felt the movie] was racist [...] and I think the content creator really did not like that [...] and I think that was a deliberate reason why they sent me [...] a DMCA [notice]" (P17). We elaborate on this issue of system abuse in the Discussion.</p><p>The third common narrative was that creators believed content by creators living in economically disadvantaged regions was demonetized unfairly. We used the term "devaluing viewers &amp; geography" to describe this type of demonetization bias, a practice that reduced the monetization income of creators because the algorithmic system set different prices for advertising sales. Creators believed YouTube charged advertisers a different price to display ads to consumers living in poorer regions, resulting in creators from those regions earning less income than creators in richer economies: "One of the common misconceptions about work on YouTube is you're not buying an ad on my channel necessarily, [...] you're buying an ad against a specific type of viewer and so that type of viewer may be monetized differently and, therefore, indirectly, certain kinds of creators are affected by that." (P9) Creators thought YouTube established lower prices for targeting viewers living in poorer areas because these consumers were presumed to have less purchasing power. 
We found creators thought YouTube's algorithm incorporated these estimates of consumer purchasing power into its advertising prices, and that this led to monetization pay disparities: "In the US, you could actually be getting as high as 50 cents per view you know in other or lower regions, you could actually getting way lesser amount of you know, this actually all this has to do location of people actually interacting with your videos, so that also has a lot and the higher impact on how much you get paid for video." (P6) Germane to this issue, in the United States geographic location is a proxy for race <ref type="bibr">[28]</ref>, and the long history of structural racism has meant Black communities have had less economic power in housing, financial and consumer markets <ref type="bibr">[28]</ref>. Consequently, using geographic attributes to determine the price of YouTube ads and to target YouTube audiences could inadvertently discriminate against consumers or creators on the basis of race.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.2">Perspectives on the Types of Demonetization Harms</head><p>The previous section discussed types of bias leading to demonetization. In this section, we look at the harms that creators felt were caused by these biases.</p><p>6.2.1 Economic Harms. First, creators reported demonetization led to economic harms from advertising placement restrictions, decreased content discoverability, conflicts over copyright, and viewer targeting practices. Creators believed these factors caused them to lose income directly from YouTube advertising placements and pricing, and indirectly from changes to traffic to their content. In turn, creators reported these impacts increased the amount of time they spent making content, which they believed reduced the value of the income they earned from making YouTube videos.</p><p>I. Losing Direct Monetization Income from Reduced Advertising and Viewer Traffic: Creators reported losing income when YouTube's algorithm flagged their content as prohibited, reducing both their earnings through demonetization and their content's discoverability on the platform. For example, when asked about the impact of demonetization on discoverability, P9 said: "Yeah of course I'm going to make less money. I'm also going to you know go through all of the stages of despair. I'm wondering what did I do wrong and why does nobody love me anymore so that's fun too" (P9). Another participant explained further, "I do remember a lot of creators [..] kind of panicking and being like I want to continue making this content but YouTube is severely under serving it, because I'm not making a lot of money, because [they] don't find it safe for advertisers, they don't find it safe for viewers, even though we're not talking about sexual things we're not talking about like a sexually explicit things we're just talking about life and in the queer space" (P2). 
From our interviews and review of legal documents, the total amount of income creators reported lost or reduced due to demonetization ranged from less than $1,000 (P7) to more than $10,200 <ref type="bibr">[19]</ref>.</p><p>II. Losing Indirect Monetization Income from 3rd Parties: Creators also believed that the discoverability of their content on YouTube impacted the income they earned from 3rd parties, such as brand sponsors and affiliates. For instance, in one lawsuit against YouTube, one LGBTQ creator reported having their brand sponsor income reduced from "$8,000 for each sponsored video" to $800 <ref type="bibr">[19,</ref><ref type="bibr">40]</ref>. In interviews, creators reported the demonetization impacts to the visibility and discoverability of their content <ref type="bibr">[26]</ref> affected the number of viewers who subscribed and watched their videos, which then affected brand sponsor deals. One creator, P1, also reported that, in the YouTube gaming community, creators typically required other creators to have a certain number of followers to collaborate with them; as such, reductions to discoverability could impact opportunities for creators to earn income from collaborations (please see 6.2.2 for more detailed reporting on these creator perspectives on the social harms of demonetization).</p><p>III. Wasted Time &amp; Reduced Value of Earnings from Extra Work: Demonetization increased the amount of time creators spent producing a video, to make their content suitable for YouTube's algorithm. However, this work reduced the income they earned, since their pay was not linked to the hours they worked but to advertising revenues. One creator said about the extra work: "It's very frustrating to upload a draft say okay fix this part, and you did [that], and then another section gets claimed, and then you upload it again and you might have fixed that part but then the old section apparently wasn't good enough again and that gets claimed again [...] that's kind of [...] 
what I meant by the lack of communication, at least through the YouTube system, is that it's very, very inconsistent it just feels like sometimes it's stringing you along" (P11). One interviewee reported creators spend time testing the YouTube algorithm to learn what content the algorithm would classify in a way leading to demonetization: "People play around with the algorithm, they try to figure out [...] 'if I use these words in my description, maybe, those are like you know bad words,' so I can't use those because I might get demonetized" (P2). Another creator, after learning about the LGBTQ content audit (discussed in 6.3.1), declared they would need to use the audit's "demonetizing keyword" spreadsheet tool every time they made a video to avoid demonetization. Other creators reported they were demoralized by the extra work they had to do to avoid demonetization, a personal harm (see 6.2.3 for more details on this type of harm). Finally, creators also reported that contesting demonetization errors was prohibitively time-consuming, and this time was also perceived as a tax reducing their earnings (see 6.3.2 for details).</p><p>IV. Impact of Income Reductions: During our interviews, we asked creators how losing income affected their lives. Creators said that while they themselves did not depend solely on their YouTube wages to earn a living, they believed the losses were a hardship for many creators who did. One creator, whose primary source of income was their YouTube content, said the impact of demonetization "was not good [...] because sometimes when I posted a video on YouTube, [there's a] denial of the [money] I'm supposed to have been paid" (P18). 
Creators reported YouTube has withheld their pay from ads the company placed on their videos after the videos were demonetized by YouTube's algorithm.</p><p>Creators believed these monetization damages were unfair and led to pay disparities among creators who made otherwise equal-quality content and whose channels were similar in size and reputation. One creator said: "There is definitely not an equal pay for an equal amount of work in the ecosphere of YouTube. I feel like the pay is very much dependent on the success that you achieve on the platform, and that obviously varies from person to person dramatically [...] [and] I would definitely say that [this] definitely is something to do with the algorithm and how much [the algorithm] recommend[s] you and how much viewers respond to your work" (P17). Others felt monetization pay on the platform was fair among channels of equal size and quality: "I think in terms of the amount of pay that they get per views, I think it should be fairly close cut" (P4).</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.2.2">Social Harms.</head><p>Creators also perceived several social harms resulting from demonetization on YouTube, involving (1) visibility, (2) discoverability, (3) influence, and (4) unevenness in what content is and isn't demonetized.</p><p>First, creators reported demonetization affected the visibility and views of their content, which in turn caused them to lose income as well as influence. More specifically, demonetization was said to reduce the number of ads placed on creators' videos, which in turn affected how many views videos got. We found creators believed most views occurred in the first few days. Thus, even if they successfully appealed demonetization of their content, they would still lose money because of the views lost during those days. Creators also believed reduced visibility of their content on YouTube affected monetization income from brand sponsors, e.g. their status as a consumer influencer.</p><p>Second, creators felt demonetization impacted the discoverability of their content, in terms of search results and recommendations, which further impacted views. Creators belonging to marginalized demographics also reported that platform visibility and discoverability affected brand sponsorship offers. As a matter of course, without sufficient visibility or discoverability, YouTube creators not yet eligible to earn monetization income would find it difficult to become eligible. We found creators who were not yet earning income from YouTube advertising but aspired to grow their YouTube channel and monetize their content in the future were concerned about whether new and small creators, as well as those who made content about marginalized demographics, have a fair chance on the platform.</p><p>Lastly, YouTube creators, especially those producing content about marginalized communities, believed YouTube's algorithm demonetized their content while failing to demonetize hate content targeting protected demographics. 
One person said: "People have posted sexual explicit videos and horrible videos on YouTube saying hateful things using slurs about people who are LGBTQ they don't demonetize that" (OCT356). Others made comparisons pointing to disparate treatment of marginalized groups: "@YouTube if you're gonna keep demonetizing LGBTQ+ videos can you also demonetize Heterosexual relationship. Either they're all okay or they're all not okay" (OCT357). Some creators contested this narrative though, suggesting instead YouTube has demonetized all types of content or creators who failed to comply with platform rules: "They demonetize almost all creators regardless of race, sexuality, gender, so stop acting like it's just an LGBTQ problem" (OCT394). Along these lines, creators expressed concerns over YouTube's economic power: "YouTube's monopoly of the medium is clearly dangerous to any and all marginalized groups." (OCT425).</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.2.3">Personal Harms:</head><p>In addition to economic and social harms, creators reported demonetization harmed them personally, specifically their mental health and well-being, as well as their morale and quality of work life. Creators believed their content's performance metrics were adversely impacted by demonetization, and that this impacted their mental health and well-being. Creators also said the analytics YouTube provided them about their content impacted their morale, especially when their content generated lower metrics than expected due to demonetization. More specifically, several creators said their personal well-being was impacted by demonetization and how the practice reduced how well their content performed on YouTube, e.g. the number of viewer watch hours, subscribers, and the ranking their content received. When asked what challenges they'd want people to know creators currently face on the platform, P16 named mental health: "I feel like mental health is a big one, because I feel like a lot of the time content creation comes at the expense of your mental health [....] you're always constantly looking at the numbers and whether it's going well and, the YouTube studio shows [...] basically how long your videos [has] done compared to other videos" (P16). We additionally asked creators what could help them with harms to their personal well-being stemming from demonetization; we detailed their perspectives in section 6.3.5 and refer readers there.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.3">Types of Responses to Demonetization Bias and Harms</head><p>In this section, we present how creators responded to demonetization of content about marginalized demographics.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.3.1">Investigating Harms Against Marginalized Communities.</head><p>One surprising response to demonetization was that several creators conducted a user-driven audit to probe why YouTube's algorithm demonetized LGBTQ content on the platform, which we described in section 5. This audit is an example of a growing trend of users of a platform organically coming together to audit the algorithmic system(s) underlying the platform <ref type="bibr">[68]</ref>. Along similar lines, second, we found creators posted videos disclosing their earnings from YouTube advertising. In the future, creators could use these videos to evaluate whether pay gaps exist between demographics. Here, we mention these pay disclosure videos for completeness, but note future work is needed to determine how to analyze these videos and validate the analysis, if possible. Next, we briefly report patterns and statistics observed from our interviews and content analysis; we keep this section short because the LGBTQ content audit was already described in section 5.</p><p>From our content analysis, we found Twitter users and YouTube creators, in tweets and videos, discussed creator audits of YouTube's demonetization decisions and/or their own or others' testing of content features thought to demonetize videos. In total, 14 tweets were about users auditing YouTube, but only 5 of these specifically mentioned the LGBTQ content audit by the YouTube creators. Altogether, the tweets about user auditing were retweeted 102 times and liked 964 times (a mean of 7 retweets and 69 likes per tweet, with standard deviations of 19 and 206, respectively), suggesting some tweets about user auditing received much more engagement than others. In total, 18 (72%) of the YouTube videos we analyzed discussed user auditing; most if not all of these were specifically about the LGBTQ content audit, reflecting our targeted search for conversations about the creator-led audit of YouTube LGBTQ content demonetization. 
These conversations, in addition to reports and lawsuits cited in the media, gave the impression that some awareness of the creator audit and of LGBTQ content demonetization could exist among creators, viewers, and fans. However, while many of our interview participants expressed awareness of YouTube demonetizing content featuring disadvantaged or legally protected demographics, not many were specifically aware or informed of the LGBTQ content audit organized by three YouTube creators.</p><p>In their interview, one creator, P6, suggested YouTube creators tended to have information about issues affecting their own community, meaning those who made the same genre of YouTube content, but not issues affecting other YouTube creator communities. To the extent content featuring protected or marginalized demographic groups is its own content genre, this could have design implications for future user audits of demonetization decisions or disparate demonetization experiences. In this paper, we detail creator beliefs about design needs (please see section 6.3.6 and the Discussion), including for tools to provide creators with information about demonetization cases.</p><p>Separately, but related to design implications for creator audits, in reviewing materials from the creator-led LGBTQ content audit (described in section 5), we observed the team of creator-auditors evaluated additional keywords, other than LGBTQ keywords, by placing the words in video titles and testing which would demonetize videos. Using their audit spreadsheet, we counted the number of words they marked as demonetized during the creators' A/B tests, and found demonetized keywords included words related to: gender (51), sexual orientation (17), race (16), national origin (13), ethnicity (4), religion (6), and disability (33), suggesting protected demographic content was perhaps generally at risk of being demonetized. 
Surprisingly, their spreadsheet keyword list also indicated that A/B testing words related to auditing itself led to demonetization, e.g. "audit", "bias", "youtube algorithm", "evidence", and "demonetized". We include these observations for completeness, but note we did not replicate or verify the creators' testing method or results, and recommend future work explore and validate methods to test for bias against content featuring or associated with legally protected demographics.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.3.2">Contesting Demonetization Harms via the Platform, Institutions, and Online Communities</head><p>We found creators, first, engaged in formal institutional procedures such as lawsuits <ref type="bibr">[19,</ref><ref type="bibr">40]</ref>, and, second, used their "community voice" <ref type="bibr">[10,</ref><ref type="bibr">43]</ref> on social media and in the news to draw attention to and contest the harms they believed stemmed from demonetization bias <ref type="bibr">[10]</ref>.</p><p>I. Platform and Institutional Conflict Resolution Mechanisms. First, we found creators utilized platform and legal mechanisms to appeal demonetization decisions. For one, they contested YouTube's demonetization errors directly via YouTube's official appeals system. <ref type="foot">20</ref> But, as noted earlier in 6.2.1, most interviewees thought the appeals system was ineffective because they believed it would cost them valuable time they could spend producing content and would not rectify their situation. This stance toward YouTube's appeals system matches past research finding that creators view the system negatively <ref type="bibr">[15]</ref>.</p><p>Creators belonging to marginalized demographics believed YouTube responded to their reports more slowly. For example, P8 said: "When it concerns the whites [...] the [issue] gets straightened [...] Because I'm Black, for that reason, they don't usually attend to me on time, but though it will be resolved, but it's my time" (P8). As described in section 6.2.2, creators believed delayed YouTube decisions affected their monetization income; if true, and YouTube typically responded to Black creators more slowly, then this could contribute to pay disparities for Black creators. <ref type="foot">21</ref> While YouTube has platform appeals systems for disputing demonetization decisions [76, 78, 79], we found there were impediments to using the system, namely, power, money, and time. 
For one, creators perceived a power difference between themselves and corporations, with corporations having more resources to make unjustified claims, having access to YouTube's Content ID <ref type="bibr">[77]</ref>, as well as to other special dispute resolution mechanisms. This meant the algorithmic bias in YouTube's copyright system disproportionately impacted less economically powerful creators. Furthermore, several creators said legal costs prevented them from contesting copyright claims issued by YouTube's Content ID algorithm <ref type="bibr">[77]</ref>, causing them to lose their YouTube advertising income:</p><p>"A lot [of] people will not even dispute their claims, they'll just be like 'that's fine you can take my $5 a month I would be making off this video and I won't have to pay for a lawyer', and effectively that's a lot of how fair use exists on the platform of YouTube" (P14)</p><p>In response to these costs, we found some creators decided not to appeal what they thought were erroneous copyright demonetization decisions: "I tend to not dispute any of these copyright claims, because it can take up to a month of time and sometimes it's like well I don't have time to get this video out to wait a month" (P11).</p><p>When platform mechanisms failed to address the harms of demonetization, some creators tried to enforce their civil rights through legal institutions <ref type="bibr">[19,</ref><ref type="bibr">40]</ref>. For example, in Divino vs. Google, plaintiffs were a class of LGBTQ and Black content creators who claimed Google applied "YouTube's content policies in ways that [...] disproportionately affect LGBTQ+ creators" <ref type="bibr">[19]</ref>. In Newman vs. 
Google, plaintiffs (Mexican, Puerto Rican, and Black YouTube creators) charged that Google and YouTube "profile[d], use[d], and consider[ed] [their] race, personal identity, or viewpoints, in order to interfere with, restrict, or block video uploading, viewing, promotion, advertising, engagement, and/or monetization services because [they] are African American and/or possess personal characteristics or viewpoints that Defendants dislike" <ref type="bibr">[40]</ref>. These lawsuits have faced roadblocks, however, leading some to question how creators could use legal mechanisms or institutions to redress the harms of demonetization bias.</p><p>II. Online Community Conflict Resolution Actions on Social Media: Second, we found creators used "community voice" <ref type="bibr">[10,</ref><ref type="bibr">43]</ref> on social media and news platforms to contest what they perceived as "unjustified demonetization" and YouTube's performative gestures toward marginalized communities, and to solicit a response from YouTube about these matters. The concept of "voice" in the literature typically references when workers use communication mechanisms to influence the terms and conditions of their work <ref type="bibr">[43]</ref>. In contrast to YouTube appeals, creators believed social media, specifically Twitter, was an effective way to elicit a response from YouTube about demonetization cases.</p><p>For example, P2 reported they resolved their demonetization case by tweeting at YouTube's account: "I posted a screenshot [...] an email from YouTube saying great news after manually reviewing your video we've determined that it is suitable for all advertisers. So I just I made the tiniest little stink about it on Twitter it didn't even get to too much traction I think [...] it was pretty much a cut and dry case of like 'hey, YouTube! I'm just using clips from this TV show, so if it's violent than [that's] the TV shows [...] that's not me"' (P2). 
Creators also responded to demonetization bias by calling out YouTube on Twitter, sometimes comparing YouTube's demonetization of content about marginalized demographics to its monetization of hate content to emphasize disparate treatment of marginalized groups on YouTube. One person tweeted: "@YouTube @[redacted] And yet you platform so much hate. I have had a homophobic Nazi video play when my nephew and I had been watching toy car restoration videos. Thank god I was there to turn Youtube off" (OTC377).</p><p>Furthermore, creators and their allies used community voice to call out the company's "performative allyship", which describes when companies or people do things to present themselves as supporting marginalized demographics without necessarily addressing the actions and stances, including their own, that have typically harmed these communities <ref type="bibr">[18,</ref><ref type="bibr">24,</ref><ref type="bibr">56]</ref>. For example, in response to YouTube changing its Twitter profile picture to a pride flag, one user tweeted at YouTube: "HAPPY PRIDE! THANK YOU @YouTube FOR NEVER SKIPPING A BEAT TO DEMONETIZE ANYTHING I POST WITH 'TRANS' IN THE TITLE. Change your fucking profile picture until you actually support your LGBTQ+ creators" (OCT385). Another user tweeted: "Do not even begin to pretend that you fight for the LGBTQ community. You routinely demonetize LGBTQ videos, while allowing anti-LGBTQ creators who break your TOS to continue operating without punishment" (OCT387).</p><p>While social media was often used by creators to report demonetization harms and garner broader public attention, creators were not always aware of similar harms experienced by creators in other YouTube communities. For example, when P3, who was aware of LGBTQ content demonetization, was asked if they had heard of similar biases against additional YouTube communities, they said: "I stay within the BookTube side, so [...] so I couldn't really say for sure that that's what's happening. 
So I wouldn't want to say yes definitely, [...] but I would assume that similar trends were happening in other communities, like the Makeup Community or elsewhere" (P3). This lack of awareness is a barrier to further collective action, and we elaborate on potential solutions later in the paper.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.3.3">Censorship of Content Perceived to Demonetize Videos</head><p>In contrast to using collective voice, a different way creators responded to YouTube's algorithm was by self-censoring what they put in their videos. P17, who made movie review video blogs for YouTube, told us: "If I think about what a specific film is covering, my decision to cover the film might be determined by whether or not I am perceiving the film is potentially being a risk to my own monetization. And sometimes I will deliberately take efforts to go around that or try to avoid it". P17 explained further: "I've definitely covered movies where I've been cautious about what I talked about [...] [for example] I reviewed a period romance movie that had a lesbian romance at the center of it and [...] I deliberately did not include the words lesbian or LGBT in the tags, because if that was there, I was worried that [would] trigger something" (P17). Creators self-censoring content calls into question whether YouTube's demonetization algorithm unintentionally creates financial incentives to minimize the depiction of marginalized demographics. This self-censorship could affect how marginalized communities are depicted in videos, including relationships, family, education, employment, and the range of everyday human activities people do regardless of demographic attributes. Moreover, creators excluding marginalized demographics could amplify the YouTube algorithm's stereotyping bias, especially if the algorithm more often encounters negative depictions of a demographic group. For instance, in 6.3.1, the organizers of the LGBTQ content demonetization audit believed YouTube's algorithm would continue to disparately demonetize LGBTQ content because of the types of videos (e.g. violent) posted to the platform after hate crime events against LGBTQ people.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.3.4">Collectively Supporting Marginalized Communities</head><p>Content creators participated in two major kinds of collective actions to support each other and to redress economic harms associated with YouTube's demonetization decisions. The first collective support action is "visibility hacking", to boost the presence of marginalized creators and their content. Creators formed online groups and channels on Facebook, Reddit, Twitter, and Discord to share tips and links to their content, requesting onlookers to click through and watch. When asked if there were any online communities they would recommend to new creators, P1 said: "Number one [would] definitely [be] Discord because this is where I've gotten most, if not all, of my organic views from. Discord is also where I also find most of my online friendships. From there also, [creators] come to support me as well". We found creators believed sharing links with other creators who then clicked on them affected the views their content gets, which could in turn impact their monetization. However, one creator believed social media algorithms minimized the content they shared if it was housed on a different platform: "I cross post my videos promote them on Twitter and Facebook [...] but I do feel like those kind of social media platforms deliberately try to avoid promoting videos with links in them, sometimes to make sure that users stay on their specific websites and that in turn harms creators that are trying to promote their own work" (P17). The creator elaborated: "Creators have entirely stopped using Facebook fan pages, because the reach of them has been so limited by Facebook. That it's virtually negligible. 
We could almost get about the same amount of people viewing your posts as probably shouting in a very large room" (P17).</p><p>The second collective support action is uplifting colleagues, providing encouragement and advice to other creators, as well as helping them report harms and find resources. For example, on YouTube, we found many creators made lists of other creators belonging to marginalized demographics, and issued calls for others to follow their work. One such collective call occurred after Black Lives Matter protesters decried the murder of George Floyd, with creators asking people to subscribe to Black creators. However, this kind of support was not always welcome. One creator commented:</p><p>"I remember talking with a few Black bookish creators and being like [...] I know these people are only here, they only even found out about my channel because of George Floyd's murder. And that bothers me. It bothers me that my content - something that I'm so proud of, [something] that I work so hard for - only gets big spikes in recognition when somebody who looks like me is killed. How do you think that makes us feel as Black creators?" (OCY27) Another creator reported that simply getting more followers backfired, because while people subscribed to their channels, they did not then watch the creators' content: "afterwards, a lot of Black creators several months later, were saying, hey, you know, back then everybody said that Black Lives Matter but do they because you just subscribed, and it seems very performative and nobody watches this stuff" (P11). This creator also believed that this kind of support ultimately penalized Black creators, because the ratio of their subscriber growth to videos watched was skewed and appeared suspect to YouTube's algorithm.</p><p>Along these lines, we also found YouTube itself published lists of marginalized content producers and issued follow requests. 
But creators questioned whether YouTube cared about the well-being of marginalized creators or if these efforts were another example of the company's performative allyship, e.g. taking actions to generate public relations attention that could suggest the company is at the forefront of diversity and inclusion when the opposite is sometimes true. For example, one person said: "I'm honestly super disgusted with how @YouTube is profiting off of pride month when they constantly suppress, age block, demonetize, and delete channels from LGBTQ+ folks. Don't fly the colors for profit when you don't actually support those people on your platform" (OCT359). Another tweeted at YouTube: "@TeamYouTube You disgust us YouTube. How dare you claim to be pro-LGBTQ+ while allowing blatant targeted attacks on an individual yet demonetize LGBTQ+ and sex education creators? If there were any other viable option your users would jump ship in a heartbeat" (OCT428).</p><p>In summary, community support was an important element of responding to demonetization harms, but Black and LGBTQ YouTube creators highlighted the importance of allies learning how marginalized communities preferred to receive support (see sections 5.3.5 and 6 for more discussion on performative allyship versus welcomed support for marginalized creators).</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.3.5">Diversifying Income Streams and Exiting the Platform</head><p>Some creators responded to demonetization by using third-party platforms (e.g. Patreon, PayPal, Venmo) to solicit payments directly from fans and viewers, sometimes in exchange for exclusive access to content. But as one creator said: "Creators are starting to add said multiple revenue streams, more out of desperation than as a smart business strategy" (OCT273). Creators who had successfully monetized their content via non-YouTube mechanisms reported a degree of freedom from the economic harms of algorithmic demonetization. For example, one creator said: "Obviously the benefit of Patreon is it's a much more stable paycheck than something like the algorithm and advertisers which fluctuate" (P2).</p><p>Creators reported additional benefits of using third-party payment platforms. On Discord servers for creators and in interviews, creators said Patreon was more lucrative than YouTube advertising. However, this view was not universal. For example, as a small content creator, P11 reported they only earned $14 per month from their Patreon, not enough to support themselves economically. Another said they found social media algorithms minimized their content less if they linked to their Patreon rather than their YouTube account when sharing their content on platforms: "I did notice that when I actually posted the YouTube link when the video was actually public that would get a lot less attention when I post it with the title card up promoting my patreon" (P17). 
Another creator, P9, said YouTube penalized them when users clicked on the link to their Patreon account, which they included in the description section of their YouTube videos: "If somebody clicks off of my video and they go to Patreon, Youtube's going to see that as a session close and penalize my video for that".</p><p>We found another way creators reacted to YouTube demonetization was to threaten to switch, or actually switch, to other video-sharing platforms. For example, one creator tweeted: "There's always Vimeo. It's not as big as YouTube, but at least there's no weird demonetization" (OCT86). Switching was not without risks, however. We found creators who chose to exit YouTube's platform would possibly lose a substantial number of viewers who might not switch with them.</p><p>"I make a lot more off of the same dollar if it goes through Patreon, I make a lot more off the same dollar if it goes through PayPal, or any of these other systems, I make more. But I have to convince that user now to leave the closed ecosystem of YouTube and go onto a website and create an account and enter the payment information and give me money" (P9)</p><p>There is also the risk that creators might face similar kinds of demonetization problems on a new platform. For instance, one creator questioned what (if any) platform was safe to switch to: "So what's a good video source other than @YouTube that doesn't support fascists and doesn't demonetize or silence LGBTQ and anti-violence voices that we can start using?" (OT456). Another creator reported they had not switched to TikTok because they found reports on the web of LGBTQ creators being demonetized there. But their band had plans to switch to the platform, and in responding to reports of bias, the creator said: "I personally identify as part of the LGBT community [...] Like that's really I guess hurtful for me [...] what if I do want to create content on TikTok? I would also want myself to not be like hidden just because of who I am [...] but now because I'm going to get into TikTok because of my bandmates and I also have to wonder whether or not I would have to hide that facet of myself to make sure my band doesn't start off at a disadvantage" (P1).</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.3.6">Perspectives on the Types of Creator Needs</head><p>As part of the interviews, we also asked creators about how to improve the ways that creative labor is rewarded, and what kinds of tools could help them address demonetization bias. Creators said that they wanted tools and infrastructure to provide them with: (1) reliable information about monetization and demonetization, (2) control over their content and advertising, (3) economic security, and (4) well-being. In this section, we report on what creators said would help meet their needs for transparency, autonomy, and predictability in earning income from content monetization.</p><p>Information: First, we found creators lacked clear information about what content was being demonetized and why, and believed tools providing aggregated information about the number of demonetized videos by content genre (e.g. BookTube essays, gaming, music, etc.) and platform metrics (e.g. #subscribers, #hours watched) would help them monitor trends and avoid harms. At the same time, creators said aggregated disclosures about how many videos of a particular genre were demonetized would not necessarily provide actionable information, such as why the content was demonetized (e.g. LGBTQ keywords). One creator reported wanting "a tool that would help a content creator realize that [they have] maybe used a sensitive word [...] because if we're uploading maybe the content creator might have a time to maybe go back and fix [this]" (P7).</p><p>In addition to being notified about demonetizing or demonetized content, creators said they would find helpful a tool providing them with aggregated information about how much income creators earned from YouTube advertising. Creators recommended the tool list earnings by creators producing the same genre of content who had similar channel metrics. However, creators also expressed privacy concerns about this kind of transparency. 
As noted earlier in this paper, some creators have already taken steps towards transparency by sharing videos disclosing their YouTube income as evidence of their monetization pay. <ref type="foot">22</ref> While measuring creator pay gaps is beyond the scope of this paper, we note future work could measure the extent of monetization disparities using these pay disclosure videos.</p><p>Control: Second, we found creators wanted to know what types of advertisements were displayed on their videos, and wanted control over which ads YouTube placed on their content. One creator said: "if I could choose what the Ad was, I would love that" (P19). Creators believed having this information would help them say what ads were allowed on their work and block unwanted ads from their audience, such as hate ads. During their interview, one creator said: "I think knowing what ads are being run on your videos is important, if nothing else, then for the fact that you can also say, well, actually I don't want those ads on my videos, you can do the exactly what advertisers say when they say I don't want my ads on that kind of video. You can say I don't want those kinds of ads on my videos" (P4). At the same time, we found creators thought controlling ad placements on their videos could impact monetization income from YouTube advertising. For example, another creator stated: "More control would be great, absolutely. Every creator should have a lot more control over [what's] being seen on their content. It does mean that they would probably be seeing less revenue, but I'm perfectly fine with a creator being able to choose" (P9).</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Economic Security:</head><p>"These are people's jobs, like this is how people put food on the table. And having that security blanket, because I think that's what is missing a sense of security, a sense of 'Okay, if one video flops it's not the end of the world,' I have, you know, this safety net. That's kind of what's missing, I think, even for larger channels and smaller channels, of course, but I would really love to see them offer things like health care benefits" (P11) Creators described several economic improvements to help them: (1) minimum payments to alleviate income insecurity, (2) benefits (e.g. health insurance and tax support), and (3) a labor union. These changes do not target creators associated with marginalized demographics specifically, though in the broader labor market these types of economic infrastructure are known to help marginalized communities <ref type="bibr">[70]</ref>, so we include these findings for completeness.</p><p>First, creators reported their income from YouTube advertising was variable or unpredictable, leading to uncertainty and hardship. We asked what they thought about YouTube paying creators a minimum payment for their work, finding most believed this would help creators producing content for the platform. For example, one creator said: "If there was like a minimum wage above subscribers that would be very to be helpful" (P13). Another creator who supported minimum payments suggested: "I think that when you reach a certain threshold or like you get a certain number of views, you should just have a base income. That would be a good idea, because then it would be a more reliable source of income" (P16).</p><p>Along similar lines and in response to reports about the harms to mental health from content work, we asked creators for their perspective on YouTube providing them with health insurance. We found creators believed this was a good idea. 
At the same time, they questioned if YouTube would be willing to provide them with health benefits. For example, one creator said: "It's hard for me to imagine YouTube giving creators the things that employers give to their employees" (P14). Others offered suggestions for how the platform could determine eligibility. Finally, one creator was less sure if YouTube health insurance was a good idea: "I don't know if actually having YouTube supply creators with health insurance is a good idea I just liked the idea of them doing that" (P2). Beyond this paper, future work could survey content creators about their perspectives on and need for platform-provided health insurance, among other benefits. <ref type="foot">23</ref> Finally, our interviewees believed a labor union for content creators would help them to address the challenges they experienced, such as with demonetization appeals made to YouTube, and with negotiating better terms and conditions for content work. For example, when asked if creators should unionize, one creator said: "Absolutely. [...] It's like any other job. People do have to band together and fight for their rights [...] it could sound silly to some, but I think that would be [...] phenomenal" (P11). But the creator also noted their concern about retaliation:</p><p>"The only issue is, are [creators] going to get cracked down on for doing so? [...] We should have the right to do that, and we should have the opportunity to say, 'hey, we deserve better' because frankly we do. And I would love to see that happen, and I would love to be a part of that, but even right now as I say that I say but 'is that going to cause consequences for you know my channel?'" (P11). Another creator, who believed a labor union could help creators, indicated some doubts, reporting one prior effort by a union was unsuccessful: "a [person] in Germany who tied up with one of the Labor associations to do something. [...] it didn't work out and it never would. 
You can't go to the German Government to force a YouTube to do anything" (P9). <ref type="foot">24</ref> They also reported two YouTubers, in what could have been a financial scam, had claimed to provide creators with union benefits in exchange for money, indicating efforts to unionize could be met with distrust if issues creators associated with unions were not addressed: "they created this creator Union thing that you had to pay, you know $60 a year to be a part of, and they would lobby YouTube and they made all these grand promises and then they did nothing" (P9). Along these lines, future work could survey creators to gain a broader view of their perspectives on and wishes for a creator labor union.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="7">DISCUSSION</head></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="7.1">The Future of Work in the Content Creator Economy</head><p>There is a growing sense that user-generated content production for digital platforms is work <ref type="bibr">[15,</ref><ref type="bibr">53]</ref>. But what economic benefits and collective rights should digital platforms and public policy accord to content creators? On YouTube, creators indicated content work was not compensated fairly, and that decisions to demonetize were sometimes rooted in perceived bias against marginalized demographics. Therefore, it is critical that compensation for content work, including demonetization practices, be explored in the context of the future of work as a plausibly regulated economic opportunity under existing frameworks for employment and anti-discrimination.</p><p>An example of a problem creators faced in this context is a lack of clarity about what mechanisms can be used to redress the economic harms of demonetization bias. For example, YouTube creators resorted to legal mechanisms and sued YouTube for discrimination and economic losses from what they perceived as demonetization biases against LGBTQ <ref type="bibr">[19]</ref> and Black creators <ref type="bibr">[40]</ref>, but these cases were dismissed by U.S. courts <ref type="bibr">[46]</ref>. Aside from Twitter and mutual aid from others, creators reported few other mechanisms to help them resolve disputes over cases and recoup economic damages where demonetization was viewed as unjustified. In some senses, then, content creators are akin to the millions of gig economy workers who lack adequate mechanisms to protect them against the economic harms of their work <ref type="bibr">[33,</ref><ref type="bibr">50]</ref>. How can platforms design their algorithms and verify they do not harm marginalized groups, including content creators? 
For example, some algorithms are probabilistic <ref type="bibr">[9,</ref><ref type="bibr">42]</ref>; in essence, they are designed to discriminate statistically by attaching labels to groups. What amount of error in associating marginalized groups with harmful labels like "unsafe" or "not family-friendly" or "not able to purchase as much as rich people" is justifiable? While beyond the scope of this work, we recommend lawmakers and regulators require platforms to consider whether their algorithms contribute to the range of harms "safety" regimes have historically caused marginalized groups <ref type="bibr">[29,</ref><ref type="bibr">34]</ref>.</p><p>Another source of challenges for the future of work comes from advertiser-based business models. YouTube has content policies and community standards guiding content creators about what content the platform deems not "safe" or "advertiser-friendly" <ref type="bibr">[75,</ref><ref type="bibr">80,</ref><ref type="bibr">81]</ref>. But creators believed these norms were influenced by advertisers who did not want their marketing displayed on certain content, particularly videos deemed inappropriate for children or considered controversial (including social activism). Creators speculated YouTube's demonetization algorithm had learned advertiser biases or definitions of what content was safe for advertisements. This raises the question of whether constructs<ref type="foot">25</ref>, such as "advertiser-friendly" and "safe", can be viably used without impacting marginalized communities. There is a long history of harming marginalized communities by associating them with what is not safe or with other harmful stereotypes, e.g. racial profiling <ref type="bibr">[7,</ref><ref type="bibr">31]</ref>. In a similar vein, creators believed YouTube's demonetization algorithm removed compensation from videos displaying content about marginalized demographics because YouTube's algorithm associated the content with unsafe and not advertiser-friendly content. 
We suggest these decisions may be a form of statistical discrimination by an algorithmic system, where harmful attributes are attached to content associated with an entire marginalized demographic, specifically LGBTQ and Black communities, impacting the content creator's compensation. Others have also found disparate content harms in related contexts. For example, Haimson et al. found content of LGBTQ and Black users was disparately removed from Twitter, Facebook, YouTube and other platforms, sometimes because their content was classified as violating safety norms or was associated with social activism <ref type="bibr">[30]</ref>.</p></div>
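The question raised above, of how much labeling error is justifiable, can be made concrete with a simple disparity check. The sketch below is a minimal illustration, not the paper's method or YouTube's: it assumes we somehow had ground-truth safety labels and an algorithm's predictions per demographic group, and compares how often genuinely safe content is mislabeled "unsafe" across groups.

```python
def false_positive_rate(truth, predicted):
    """Share of genuinely 'safe' items the classifier flagged 'unsafe'."""
    safe_preds = [p for t, p in zip(truth, predicted) if t == "safe"]
    if not safe_preds:
        return 0.0
    return sum(1 for p in safe_preds if p == "unsafe") / len(safe_preds)

def label_disparity(groups):
    """groups maps a demographic tag to (truth, predicted) label lists.
    Returns each group's false-positive rate; a large gap between groups
    is one concrete signal of the statistical discrimination discussed
    above. All data shapes here are illustrative assumptions."""
    return {name: false_positive_rate(t, p) for name, (t, p) in groups.items()}
```

A gap between groups' rates (e.g. LGBTQ-associated content mislabeled "unsafe" twice as often as other content) is exactly the kind of aggregate evidence regulators could ask platforms to report.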
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="7.2">Understanding Algorithmic System Power and Folk Theorizing Leading to Harmful Behaviors</head><p>Creators developed many folk theories to explain how YouTube's algorithm worked and why it sometimes made errors. While it is not fully clear how accurate these theories were, they still guided creators' behaviors and interactions, sometimes in ways that may be unproductive or even cause additional harms. For example, creators reported self-censoring and not including content associated with the LGBTQ community, potentially reducing the representation and compensation of creators making LGBTQ content. Other creators, while meaning to combat the economic damages of demonetization, shared links to their and other creators' videos in online forums, which could trigger YouTube's "click fraud" detection algorithm. Moreover, creators also said they avoided creating content likely to be viewed by audiences living in poorer regions of the world, fearing the YouTube algorithm would price advertising lower, reducing their monetization income.</p><p>Creators also spent a great deal of time and energy discussing, understanding, avoiding, auditing, and fighting YouTube's algorithm, which imposes a kind of tax on creators that reduces their income, echoing some of the findings by Duffy et al. <ref type="bibr">[20]</ref>. Altogether, demonetization errors, along with YouTube's responses and inadequate explanations for these errors, seemed to erode creators' beliefs about the demonetization algorithm's authority and power, leading to what <ref type="bibr">Eslami et al. (2018)</ref> have called "algorithmic disillusionment", or when users "realize [...] algorithms were not as perceptive and powerful as [they] thought" <ref type="bibr">[22]</ref>. 
Past work has found that a lack of transparency and agency can change how workers respond to algorithmic system management and authority <ref type="bibr">[22,</ref><ref type="bibr">48,</ref><ref type="bibr">51]</ref>. We found YouTubers lacked adequate explanations for why content was demonetized. Without more detailed justifications, creators felt their compensation was unpredictable, which further impacted their morale and motivation to produce content. Our paper offers some insight about the tools and infrastructure creators felt would be useful in offering an element of consistency to content work, though we note that much more work is needed to understand how to best serve creators in a way that also balances the needs of other stakeholders.</p><p>Lastly, some creators felt that allies trying to help marginalized communities sometimes inadvertently cause more harm through their actions. For example, after George Floyd's death, some Black content creators saw a rapid increase in subscribers due to call-to-action campaigns. However, some creators believed YouTube's algorithm viewed this growth with suspicion, and penalized their accounts by minimizing the visibility <ref type="bibr">[26]</ref> of their content on the platform or demonetizing them. Whether this and similar folk theories are accurate remains unclear. What is clear, though, is that it is also unclear how allies can best help, especially in a manner that is approved of by the marginalized communities themselves. In their work on "HCI research with marginalized communities", Liang et al. (2021) emphasized this point, highlighting Ahmed's contention that "treating [marginalized] groups as 'Other' without awareness of the powers that have done the actual suppressing is ethically irresponsible" <ref type="bibr">[49]</ref>. 
Along these lines, we recommend future work examine what kinds of collective actions marginalized content creators welcome allies to take, coupled with which actions are effective and helpful in practice.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="7.3">Designing Tools and Structures to Support Content Creators</head><p>In 6.3.6, we described some tools and economic structures creators suggested could improve what Belfield et al. (2020) called the "monopoly face" of labor, and also the second face, "collective voice/institutional response" <ref type="bibr">[10]</ref>. In this framework, Belfield et al. argued the objective, from the standpoint of a monopoly over labor, "is typically more on offsetting informational and power asymmetries between workers and management", whereas "collective voice" signifies "[workers'] ability [...] to voice complaints and see them addressed through collective bargaining". Along these lines, we found YouTube creators perceived power imbalances in their access to information and explanations about why content was demonetized, as well as in their power to negotiate demonetization disputes with YouTube and improve their economic, social and personal conditions. We suspect marginalized creators could have felt these power imbalances especially acutely, given the disparate impacts often experienced by members of marginalized demographics in society <ref type="bibr">[49]</ref>.</p><p>In relation to the power imbalances affecting their content work and income, creators believed tools providing them with information about what content was demonetized, as well as monetization pay, would help them become aware of trends, and especially of disparities in demonetization and monetization. For example, a tool, such as a browser extension, could display aggregate information to its users about demonetization patterns on a platform, permitting creators and researchers to track impacts from, among other things, the operationalization of content policies, which we've indicated can harm marginalized content creators. 
Browser extension tools for creators could also track and inform them of design changes to platforms, such as changes to the demonetization algorithm and advertiser interface affordances, for instance what keywords advertisers can ban and keyword and audience targeting pricing. We speculate such tools could reduce the amount of extra time creators spend laboring to understand how the demonetization algorithm works. In addition to information, creators wanted more control over decisions about their content, including what ads were placed on their videos, and we recommend future work explore how to design tools for this, noting we found creators have made prototypes, which were decommissioned due to platform changes.<ref type="foot">26</ref> Other online work communities have used tools to change power and information imbalances, particularly browser extensions that collect data from workers in the community and use the data to monitor harmful trends, such as managers/employers posting bad tasks to Amazon Mechanical Turk and not paying crowdworkers for their work <ref type="bibr">[36]</ref>. In other cases, scholars have documented online work communities addressing their need to have disputes and harms redressed by using software tools to organize letter-writing campaigns <ref type="bibr">[65]</ref>, including to Amazon's CEO Jeff Bezos <ref type="bibr">[32]</ref>; organizing boycotts of software applications, such as DoorDash <ref type="bibr">[23]</ref>, InstaCart <ref type="bibr">[39]</ref>, and TikTok <ref type="bibr">[63]</ref>; and, more generally, calling out harms on Twitter, leveraging public awareness, and demanding technology companies change whatever technology or factor harmed them.</p><p>Another kind of tool that could help users of platforms more generally is something to help facilitate user-driven audits <ref type="bibr">[68]</ref> of platforms, which we saw one example of in 6.3.1. 
Here, we speculate on possible lessons and insights for researchers investigating tools to facilitate such audits. In particular, the audit we examined seems to have several major components: a method for evaluating the algorithm (A/B testing of videos which differ only in title keywords), a way to coordinate the audit (the shared Google spreadsheet with tasks), a way to log and share results (the same spreadsheet), a summary of findings (the shared Google doc written up to share the results with an external audience), and enough labor to execute on all of this (the three creators conducting the audit). While the specific user-driven audit we saw seems to have worked well, it is currently unclear how well it would generalize to other platforms and other situations. For example, the A/B testing worked because the auditors could directly query the algorithm, but it would not work if auditors only had access to a fixed dataset of inputs and outputs. Similarly, the spreadsheet worked well for the scale of this audit (about 100 videos), but might not work as well at larger scales. Lastly, the three auditors were likely highly motivated and skilled. For situations with larger data, much more labor may be needed, and it is an open question how to recruit, motivate, and coordinate enough people to succeed in this task. We feel that these and other related questions are a compelling area for future research.</p></div>
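The A/B component of such an audit can be sketched in a few lines. This is a hypothetical illustration, not the creators' actual tooling: `check_monetized` stands in for however an auditor queries the platform's decision for a given title, and each keyword is tested against an otherwise-identical control title.

```python
def audit_title_keywords(base_title, keywords, check_monetized):
    """For each keyword, compare a control title against the same title
    plus that keyword, and record whether adding the keyword alone
    appears to flip the monetization decision. `check_monetized(title)`
    is a hypothetical stand-in for querying the platform."""
    results = {}
    for kw in keywords:
        control = check_monetized(base_title)
        variant = check_monetized(f"{base_title} {kw}")
        results[kw] = {"control": control, "variant": variant,
                       "suspect": control and not variant}
    return results

def suspect_keywords(results):
    """Keywords whose presence alone coincided with demonetization."""
    return sorted(kw for kw, r in results.items() if r["suspect"])
```

In the audit the creators ran, results of this kind were logged to a shared spreadsheet so multiple auditors could divide the keyword list and pool their findings.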
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="7.4">Limitations</head><p>First, our work was limited by our data collection methods, specifically in that we used keywords to retrieve relevant social media and news content from online platforms, which captured only some aspects of creator reports of demonetization biases and harms. Second, our work was limited by the demographics of the respondents who participated in our study; for example, most creators reported their gender was male, but we note research has reported this demographic skew likely reflects YouTube's demographics more generally <ref type="bibr">[21]</ref>. Third, our work was limited by our positionality as researchers; specifically, we reported on communities, such as Black communities, to which we do not belong. To the extent we do not belong to a marginalized community, our viewpoint could neglect, misrepresent or harm the voices of those who do. Fourth, our study was limited by our ability to recruit a more diverse sample. Specifically, we were able to recruit more white, heterosexual men who were YouTube creators and had made videos featuring LGBTQ content that was demonetized than LGBTQ creators making LGBTQ YouTube content.</p><p>For this reason, the experiences reported by our participants could differ from those of many LGBTQ creators who make LGBTQ content. To the extent the experiences of white, heterosexual men who created LGBTQ content differed from those of LGBTQ creators of LGBTQ content, our work could likewise misrepresent and harm the LGBTQ community. Along these lines, we reported design needs creators said they had and that would help them respond to demonetization harms regardless of demographics. Many of these reported needs aligned with those of workers doing precarious labor, which tends to be done by more members of marginalized demographics. 
When reporting these needs, including for access to more and better information, infrastructure, and economic security, we read and referenced literature suggesting that what creators recommended to us were design changes that have helped marginalized demographics working in economically insecure and precarious jobs. For this reason, we felt it sufficient to report these preliminary design ideas in the hope of encouraging future work to assess what tools and infrastructure could help creators deal with demonetization biases and harms from the viewpoint of creators from marginalized demographics. Fifth, our study was limited to reporting the experiences of the creators we interviewed, most of whom reported their content had been demonetized at some point, and the experiences of the Twitter users and the creators of the YouTube content we analyzed. To the extent our recruiting and data collection methods were more likely to capture the views of those who had been demonetized versus those who had not, our reporting could miss narratives from creators of demographic content that had not been demonetized. Finally, this paper is only about the perspectives of content creators, and may not necessarily reflect the way that YouTube actually works.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="8">CONCLUSION</head><p>In this paper, we presented the perspectives of YouTube creators on algorithmic demonetization, specifically creators associated with marginalized demographics. Using a combination of interviews and content analysis of videos, tweets, and news articles, we investigated creators' perceptions and beliefs about algorithmic demonetization, the harms it caused, how creators responded to demonetization, and the tools and infrastructure creators desired to mitigate harms.</p><p>We framed our work from a lens of economic interaction in social computing, and distinguished between algorithms managing compensation for content work and those managing the removal of content from platforms. In other words, our study focused on automated decisions not to pay income to creators of content associated with marginalized demographics, treating these as decisions about economic opportunity rather than content moderation alone. One reason for this distinction is to clarify the roles of algorithms making decisions about content, and specifically to draw out where algorithms make decisions about economic opportunity. In doing so, we presented the types of economic damages resulting from algorithm biases, focusing on those affecting decisions about when not to compensate YouTube creators for their work associated with LGBTQ and Black communities. Our findings indicated these algorithm biases worked to attach harmful attributes to content associated with marginalized demographics, and these attributes were connected to YouTube's policies and norms around what content is considered safe and advertiser-friendly.</p><p>This paper is part of a growing trend of research analyzing new kinds of work and the impact of algorithms on work. Our focus was on marginalized demographics, which a great deal of past work has shown face long-standing socioeconomic inequalities and harms. 
We hope our paper has helped shed some light on how these inequalities and harms are continuing in new kinds of work and the role that algorithms play.</p><p>Funding. Our study was funded by the National Science Foundation (award number: 2040942), Cisco, and Amazon.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Table B: Twitter Code Book</head></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Code Description</head><p>Performative Allyship</p><p>This code was used to describe when allies or entities (such as YouTube) took actions to appear supportive of a marginalized community (e.g. LGBTQ demographics) but their actions were more a public relations gesture because, for example, the allies or entities had previously acted, or would subsequently act, in other ways that harmed the community they were publicly signaling their support for.</p><p>User Auditing</p><p>This code was used to describe when a user or creator was talking about auditing YouTube or another platform to discover what was causing the demonetization of content, where the user believed the demonetization decision was erroneous or biased.</p><p>Sensemaking</p><p>This code was used to describe when a user or creator was theorizing why a phenomenon related to demonetization happened and/or was observed; this includes theorizing about the harms and needs of creators/users/others with respect to why they believed the demonetization was happening.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Collective Support</head><p>This code was used to describe when a user/creator was taking actions to support another user/creator with respect to demonetization or a related issue.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Collective Repair</head><p>This code was used to describe when a user/creator was taking actions to alleviate or mitigate the harms they believed a user/creator was experiencing because of demonetization errors/bias, e.g. the user clicked on links to YouTube videos to try manually getting YouTube's algorithm to rank the videos higher in search results.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Platform Response</head><p>This code was used to describe a response by a platform (e.g. YouTube) to user/creator reports, news reports, or lawsuits about or related to demonetization.</p><p>Harms</p><p>This code was used to describe the types of harms users/creators had experienced, or had observed others experiencing, because of demonetization.</p><p>LGBTQ Demonetization</p><p>This code was used to demarcate whether the tweet pertained to the demonetization of LGBTQ content in particular. </p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Sensemaking Harms</head><p>This code was used to describe when the YouTube video or creator talked about creator or viewer or stakeholder views about the harms of demonetization.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Censorship Harms</head><p>This code was used to describe when a YouTube video or creator talked about censoring their content to remove features they believed would demonetize their video(s). The code was also used to describe when a YouTube video or creator said they believed erroneous or wrongful demonetization was a form of censorship.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Economic Harms</head><p>This code was used to describe when a YouTube video or creator talked about the economic harms of demonetization practices.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Sensemaking Demonetization</head><p>This code was used to describe when a YouTube video or creator discussed their or others' theories about the causes of demonetization.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Sensemaking Needs</head><p>This code was used to describe when a YouTube video or creator shared their beliefs about what content creators needed to manage demonetization practices and harms.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Sensemaking Fairness</head><p>This code was used to describe when a YouTube video or creator shared their theory or beliefs about whether demonetization cases or practices were fair. The code was also used to describe when a video or creator shared their perspective on whether content creation work was fair and why.</p><p>User Auditing</p><p>This code was used to describe when a user or creator was talking about auditing YouTube or another platform to discover what was causing the demonetization of content.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Collective Support</head><p>This code was used to describe when a video or creator was talking about actions they or others had taken to support other creators with respect to demonetization or a related issue.</p><p>User Response</p><p>This code was used to describe when a video or creator was talking about a response to demonetization taken by a user, viewer or creator.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Collective Repair</head><p>This code was used to describe when a YouTube video or creator discussed taking actions to mitigate the harms they believed they or another creator was experiencing because of demonetization, e.g. the creator said they clicked on links to YouTube videos to manually get YouTube's algorithm to rank their videos higher in search results.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Continuation of Table C</head></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Platform Response</head><p>This code was used to describe when a YouTube video or creator discussed a response by a digital content distribution platform (e.g. YouTube) to user reports, creator reports, news reports, or lawsuits about demonetization.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Mocking Response</head><p>This code was used to describe when a YouTube video or creator responded or talked about a platform response to reports of demonetization in a sarcastic or teasing tone.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Performative Allyship</head><p>This code was used to describe when a YouTube video or creator talked about allies or entities (such as YouTube) taking actions to appear supportive of a marginalized community (e.g. LGBTQ demographics) where their actions were viewed by the video or creator as a public relations gesture because, for example, the allies or entities had previously acted, or would subsequently act, in other ways that harmed the community they were publicly signaling their support for.</p><p>LGBTQ Demonetization</p><p>This code was used to demarcate whether the video or annotated part of the video transcript text pertained to the demonetization of LGBTQ content in particular. </p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Sensemaking Harms</head><p>This code was used to describe when the text talked about creator or viewer or stakeholder views about the harms of demonetization.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Sensemaking Demonetization</head><p>This code was used to describe when a YouTube video or creator discussed their or others' theories about the causes of demonetization.</p><p>User Auditing</p><p>This code was used to describe when a user or creator was talking about auditing YouTube or another platform to discover what was causing the demonetization of content.</p><p>User Response</p><p>This code was used to describe when a video or creator was talking about a response to demonetization taken by a user, viewer or creator.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Platform Response</head><p>This code was used to describe when a YouTube video or creator discussed a response by a digital content distribution platform (e.g. YouTube) to user reports, creator reports, news reports, or lawsuits about demonetization.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Continuation of Table D</head></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Performative Allyship</head><p>This code was used to describe when a YouTube video or creator talked about allies or entities (such as YouTube) taking actions to appear supportive of a marginalized community (e.g. LGBTQ demographics) where their actions were viewed by the video or creator as a public relations gesture because, for example, the allies or entities had previously acted, or would subsequently act, in other ways that harmed the community they were publicly signaling their support for.</p><p>LGBTQ Demonetization</p><p>This code was used to demarcate whether the text pertained to the demonetization of LGBTQ content in particular. </p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Background Information</head><p>Used this code to describe when a participant talked about their YouTube channel, e.g. the content genre they produce.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Demonetization</head><p>Used this code to describe when a participant talked about demonetization, including: whether or not they or others they know had been demonetized; awareness of demographically biased demonetization (e.g. LGBTQA content); reasons for demonetization; and actions taken to respond to demonetization.</p><p>Algorithmic Bias Awareness</p><p>Used this code to describe when a participant talked about their awareness (or lack thereof) of algorithmic bias.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Equal Opportunity</head><p>Used this code to describe when participants talked about whether content creators have equal opportunities with respect to monetization and paid opportunities on YouTube, e.g. whether creators are paid equally for equal work, and related discussions.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Platform Reports &amp; Response</head><p>Used this code to describe when participants talked about YouTube's response to their communications, including: (i) appeals of demonetization and (ii) communications to YouTube via social media.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Sensemaking</head><p>Used this code to describe when participants talked about their theories or beliefs about how YouTube (or its algorithm) works.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Monetization</head><p>Used this code to describe when participants talked about how they or others are monetized on the platform, and any challenges or needs they have with respect to monetizing their content.</p></div><note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="1" xml:id="foot_0"><p>LGBTQ refers to the broad and diverse community of persons who are lesbian, gay, bisexual, transgender, and queer. Lesbian, gay, bisexual, and queer refer to a person's sexual orientation. Transgender is "an umbrella term for people whose gender identity and/or gender expression differs from what is typically associated with the sex they were assigned at birth"<ref type="bibr">[27]</ref>. Proc. ACM Hum.-Comput. Interact., Vol. 6, No. CSCW2, Article 424. Publication date: November 2022.</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="2" xml:id="foot_3"><p>In this paper we refer to the economic damages of demonetization as "economic harms" to reference how social interactions and reputation, health and work environments shape economic well-being for people.</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="3" xml:id="foot_4"><p>https://developer.twitter.com/en/docs</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="4" xml:id="foot_6"><p>https://www.washingtonpost.com/technology/2019/08/14/youtube-discriminates-against-lgbt-content-by-unfairly-culling-it-suit-alleges</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="5" xml:id="foot_7"><p>https://www.vox.com/culture/2019/10/10/20893258/youtube-lgbtq-censorship-demonetization-nerd-city-algorithm-report</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="6" xml:id="foot_8"><p>https://www.cnn.com/2019/06/10/tech/youtube-susan-wojcicki-code-con</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="7" xml:id="foot_9"><p>http://rollingstone.com</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="8" xml:id="foot_10"><p>https://www.insider.com/youtubers-identify-title-words-that-get-videos-demonetized-experiment-2019-10</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="9" xml:id="foot_11"><p>https://www.lexisnexis.com</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="10" xml:id="foot_12"><p>https://aiopportunitylab.github.io/youtube-study</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="11" xml:id="foot_13"><p>https://www.taguette.org</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="17" xml:id="foot_14"><p>Notably, the creators' audit report garnered the attention of news media and a major labor union, IG Metall, one of the largest unions in Germany. Please refer to: https://fairtube.info</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="18" xml:id="foot_15"><p>https://www.npr.org/2016/06/16/482322488/orlando-shooting-what-happened-update</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="19" xml:id="foot_16"><p>As perhaps a related matter to this learned algorithmic bias, in section 5.2, we discuss additional types of reported harms that could have resulted from discriminatory tastes, such as cases where YouTube has placed anti-LGBTQ ads on LGBTQ content.</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="20" xml:id="foot_18"><p>https://support.google.com/youtube/answer/9564590?hl=en</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="21" xml:id="foot_19"><p>While beyond the scope of this paper, the literature has discussed consumer discrimination<ref type="bibr">[72]</ref>, where in the United States, Black consumers have received slower or no service due to 'consumer profiling'; we referenced a few of these works here<ref type="bibr">[7,</ref><ref type="bibr">31]</ref>, and note future work could examine if a similar pattern of bias takes place on digital platforms.</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="22" xml:id="foot_20"><p>https://www.youtube.com/results?search_query=how+much+youtube+paid+me</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="23" xml:id="foot_21"><p>We note the importance for future work to account for differences in needs and access to health benefits, including for mental health care, among creators living within and outside the United States and Europe.</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="24" xml:id="foot_22"><p>Conflict of Interest Disclosure: The first author previously worked for IG Metall, the German labor union referenced by the content creator in the quote.</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="25" xml:id="foot_23"><p>In sociology, a "construct" describes a norm that defines attributes<ref type="bibr">[54]</ref> such as "gender", sometimes doing so in a way that makes the definition seem like an immutable trait of humans, though constructs typically are not. For example, gender has been defined in binary terms, man or woman, but now is not.</p></note>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="26" xml:id="foot_24"><p>https://drive.google.com/drive/folders/10gLKGH1fqCp39x_UcEb9f98xTufzfnuZ?usp=sharing</p></note>
		</body>
		</text>
</TEI>
