Title: Learning About Algorithm Auditing In Five Steps: Scaffolding How High School Youth Can Systematically and Critically Evaluate Machine Learning Applications.
While there is widespread interest in supporting young people to critically evaluate machine learning-powered systems, there is little research on how we can support them in inquiring into how these systems work and what their limitations and implications may be. Outside of K-12 education, an effective strategy for evaluating black-boxed systems is algorithm auditing—a method for understanding algorithmic systems’ opaque inner workings and external impacts from the outside in. In this paper, we review how expert researchers conduct algorithm audits and how end users engage in auditing practices to propose five steps that, when incorporated into learning activities, can support young people in auditing algorithms. We present a case study of a team of teenagers engaging with each step during an out-of-school workshop in which they audited peer-designed generative AI TikTok filters. We discuss the kinds of scaffolds we provided to support youth in algorithm auditing, as well as directions and challenges for integrating algorithm auditing into classroom activities. This paper contributes: (a) a conceptualization of five steps to scaffold algorithm auditing learning activities, and (b) examples of how youth engaged with each step during our pilot study.
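The "outside in" strategy the abstract describes amounts to querying a system you cannot inspect, varying one input attribute at a time, and tallying how the outputs differ. A minimal sketch of that loop, where `black_box` is a hypothetical stand-in for the opaque system under audit and the toy rule inside it exists only so the sketch runs:

```python
# Minimal black-box audit sketch. `black_box` is a hypothetical
# placeholder for an opaque system (e.g., a filter or classifier);
# the auditor can only query it and observe outputs.
from collections import Counter

def black_box(profile: dict) -> str:
    # Toy stand-in rule so the sketch is runnable; a real audit
    # would call the actual system here.
    return "approved" if profile["score"] >= 50 else "denied"

def audit(probes: list, attribute: str) -> Counter:
    """Tally outcomes grouped by one input attribute of interest."""
    tally = Counter()
    for probe in probes:
        outcome = black_box(probe)
        tally[(probe[attribute], outcome)] += 1
    return tally

# Probes differ only in the audited attribute, holding others fixed.
probes = [
    {"group": "A", "score": 60},
    {"group": "A", "score": 40},
    {"group": "B", "score": 60},
    {"group": "B", "score": 40},
]
print(audit(probes, "group"))
```

Comparing the tallies across attribute values (here, groups "A" and "B") is what lets an auditor infer differential behavior without ever seeing the system's internals.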
Award ID(s):
2333469
PAR ID:
10632785
Editor(s):
Hoadley, C; Wang, C
Publisher / Repository:
ISLS
Edition / Version:
1
Format(s):
Medium: X
Location:
https://drive.google.com/file/d/1uMcZSnsren2dbO4WE2Uselv5JnN9ZSzE/view
Sponsoring Org:
National Science Foundation
More Like this
  1. This study investigates how high school-aged youth engage in algorithm auditing to identify and understand biases in artificial intelligence and machine learning (AI/ML) tools they encounter daily. With AI/ML technologies increasingly integrated into young people’s lives, there is an urgent need to equip teenagers with AI literacies that build both technical knowledge and awareness of social impacts. Algorithm audits (also called AI audits) have traditionally been employed by experts to assess potentially harmful biases, but recent research suggests that non-expert users can also participate productively in auditing. We conducted a two-week participatory design workshop with 14 teenagers (ages 14–15), where they audited the generative AI model behind TikTok’s Effect House, a tool for creating interactive TikTok filters. We present a case study describing how teenagers approached the audit, from deciding what to audit to analyzing data using diverse strategies and communicating their results. Our findings show that participants were engaged and creative throughout the activities, independently raising and exploring new considerations, such as age-related biases, that are uncommon in professional audits. We drew on our expertise in algorithm auditing to triangulate their findings as a way to examine whether the workshop supported participants in reaching coherent conclusions in their audit. Although the number of changes in race, gender, and age representation uncovered by the teens was slightly different from ours, we reached similar conclusions. This study highlights the potential for auditing to inspire learning activities that foster AI literacies, empower teenagers to critically examine AI systems, and contribute fresh perspectives to the study of algorithmic harms.
  2. While there are many different frameworks seeking to identify what benefits young people might derive from participation in informal STEM (Science, Technology, Engineering and Mathematics) learning (ISL), this paper argues that the sector would benefit from an approach that foregrounds equity and social justice outcomes. We propose a new model for reflecting on equitable youth outcomes from ISL that identifies five key areas: (1) Grounded fun; (2) STEM capital; (3) STEM trajectories; (4) STEM identity work; and (5) Agency+. The model is applied to empirical data (interviews, observations, and youth portfolios) collected over one year in four UK-based ISL settings with 33 young people (aged 11–14), largely from communities that are traditionally under-represented in STEM. The analysis considers the extent to which participating youth experienced equitable outcomes, or not, in relation to the five areas. The paper concludes with a discussion of implications for ISL and how the model might support ongoing efforts to reimagine ISL as a vehicle for social justice.
  3. Young learners today are constantly influenced by AI recommendations, from media choices to social connections. The resulting "filter bubble" can limit their exposure to diverse perspectives, which is especially problematic when they are not aware that this manipulation is happening or why. To address the need to support youth AI literacy, we developed "BeeTrap", a mobile Augmented Reality (AR) learning game designed to enlighten young learners about the mechanisms and ethical issues of recommendation systems. The Transformative Experience model was integrated into the design of the learning activities, focusing on making AI concepts relevant to students’ daily experiences, facilitating a new understanding of their digital world, and modeling real-life applications. Our pilot study with middle schoolers in a community-based program primarily investigated how transformative, structured AI learning activities affected students’ understanding of recommendation systems and their overall conceptual, emotional, and behavioral changes toward AI.
  4. Seitamaa-Hakkarainen, P; Kangas, K (Ed.)
     Today’s youth have extensive experience interacting with artificial intelligence and machine learning applications on popular social media platforms, putting youth in a unique position to examine, evaluate, and even challenge these applications. Algorithm auditing is a promising candidate for connecting youth’s everyday practices in using AI applications with more formal scientific literacies (i.e., syncretic designs). In this paper, we analyze high school youth participants’ everyday algorithm auditing practices when interacting with generative AI filters on TikTok, revealing thorough and extensive examinations in which youth rapidly tested filters with sophisticated camera variations and facial manipulations to identify filter limitations. In the discussion, we address how these findings can provide a foundation for developing designs that bring together everyday and more formal algorithm auditing.
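The filter audits described above come down to counting how often a filter changes a perceived attribute (race, gender, age) of the person in the input image. A minimal sketch of that tally, assuming hypothetical before/after attribute labels rather than any real annotation pipeline:

```python
# Hedged sketch of the change-counting analysis implied by the filter
# audits: given (before, after) attribute labels for each test photo,
# count how many times the filter altered a given attribute.
# `count_changes` and all labels below are hypothetical illustrations.
def count_changes(pairs, attribute):
    """pairs: list of (before, after) dicts of perceived attributes."""
    return sum(1 for before, after in pairs if before[attribute] != after[attribute])

# Toy observations: each tuple is (attributes before filter, after filter).
observations = [
    ({"age": "teen", "gender": "f"}, {"age": "adult", "gender": "f"}),
    ({"age": "teen", "gender": "m"}, {"age": "teen", "gender": "m"}),
    ({"age": "adult", "gender": "f"}, {"age": "adult", "gender": "m"}),
]
print(count_changes(observations, "age"))     # → 1
print(count_changes(observations, "gender"))  # → 1
```

Comparing such counts across attributes (and across auditors, as the researchers did when triangulating the teens' findings) is how raw filter tests become evidence about representational bias.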