

Search for: All records

Creators/Authors contains: "Badillo-Urquiola, Karla"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).


  1. Artificial intelligence (AI) underpins virtually every experience that we have—from search and social media to generative AI and immersive social virtual reality (SVR). For Generation Z, there is no before AI. As adults, we must humble ourselves to the notion that AI is shaping youths’ world in ways that we don’t understand, and we need to listen to them about their lived experiences. We invite researchers from academia and industry to participate in a workshop with youth activists to set the agenda for research into how AI-driven emerging technologies affect youth and how to address these challenges. This reflective workshop will amplify youth voices and empower youth and researchers to set an agenda. As part of the workshop, youth activists will participate in a panel and steer the conversation around the agenda for future research. All will participate in group research agenda-setting activities to reflect on their experiences with AI technologies and consider ways to tackle these challenges.
    Free, publicly-accessible full text available October 14, 2024
  2. Adolescent online safety researchers have emphasized the importance of moving beyond restrictive and privacy invasive approaches to online safety, towards resilience-based approaches for empowering teens to deal with online risks independently. Unfortunately, many of the existing online safety interventions are focused on parental mediation and not contextualized to teens' personal experiences online; thus, they do not effectively cater to the unique needs of teens. To better understand how we might design online safety interventions that help teens deal with online risks, as well as when and how to intervene, we must include teens as partners in the design process and equip them with the skills needed to contribute equally to the design process. As such, we conducted User Experience (UX) bootcamps with 21 teens (ages 13-17) to first teach them important UX design skills using industry standard tools, so they could create storyboards for unsafe online interactions commonly experienced by teens and high-fidelity, interactive prototypes for dealing with these situations. Based on their storyboards, teens often encountered information breaches and sexual risks with strangers, as well as cyberbullying from acquaintances or friends. While teens often blocked or reported strangers, they struggled with responding to risks from friends or acquaintances, seeking advice from others on the best action to take. Importantly, teens did not find any of the existing ways for responding to these risks to be effective in keeping them safe. When asked to create their own design-based interventions, teens frequently envisioned nudges that occurred in real-time. Interestingly, teens more often designed for risk prevention (rather than risk coping) by focusing on nudging the risk perpetrator (rather than the victim) to rethink their actions, block harmful actions from occurring, or penalizing perpetrators for inappropriate behavior to prevent it from happening again in the future. 
Teens also designed personalized sensitivity filters to give teens the ability to manage the content they wanted to see online. Some teens also designed personalized nudges, so that teens could receive intelligent, guided advice from the platform that would help them know how to handle online risks themselves without intervention from their parents. Our findings highlight how teens want to address online risks at the root by putting the onus of risk prevention on those who perpetrate them, rather than on the victim. Our work is the first to leverage co-design with teens to develop novel online safety interventions that advocate for a paradigm shift from youth risk protection to promoting good digital citizenship.

  3. Social service providers play a vital role in the developmental outcomes of underprivileged youth as they transition into adulthood. Educators, mental health professionals, juvenile justice officers, and child welfare caseworkers often have first-hand knowledge of the trials uniquely faced by these vulnerable youth and are charged with mitigating harmful risks, such as mental health challenges, child abuse, drug use, and sex trafficking. Yet, less is known about whether or how social service providers assess and mitigate the online risk experiences of youth under their care. Therefore, as part of the National Science Foundation (NSF) I-Corps program, we conducted interviews with 37 social service providers (SSPs) who work with underprivileged youth to determine what (if any) online risks are most concerning to them given their role in youth protection, how they assess or become aware of these online risk experiences, and whether they see value in the possibility of using artificial intelligence (AI) as a potential solution for online risk detection. Overall, online sexual risks (e.g., sexual grooming and abuse) and cyberbullying were the most salient concerns across all social service domains, especially when these experiences crossed the boundary between the digital and the physical worlds. Yet, SSPs had to rely heavily on youth self-reports to know whether and when online risks occurred, which required building a trusting relationship with youth; otherwise, SSPs became aware only after a formal investigation had been launched. Therefore, most SSPs found value in the potential for using AI as an early detection system and to monitor youth, but they were concerned that such a solution would not be feasible due to a lack of resources to adequately respond to online incidents, access to the necessary digital trace data (e.g., social media), context, and concerns about violating the trust relationships they had built with youth.
Thus, such automated risk detection systems should be designed and deployed with caution, as their implementation could cause youth to mistrust adults, thereby limiting the receipt of necessary guidance and support. We add to the bodies of research on adolescent online safety and the benefits and challenges of leveraging algorithmic systems in the public sector. 
  4. Digital technologies shape how individuals, communities, and societies interact; yet they are far from equitable. This paper presents a framework that challenges the “one-view-fits-all” design approach to digital health tools. We explore systemic issues of power to evaluate the multidimensional indicators of Latino health outcomes and how technology can support well-being. Our proposed framework enables designers to gain a better understanding of how marginalized communities use digital technologies to navigate unique challenges. As an innovative and possibly controversial approach to assets-based design, we stress the importance of industry and academia self-reflecting on their organization’s role in the marginalization of communities, in addition to valuing the lived experiences of marginalized communities. Through this approach, designers may avoid amplifying structural and health inequities in marginalized communities.
  5. The methods by which we study the online experiences of adolescents should be evidence-based and informed by youth. This is especially true when studying sensitive topics, such as the online risk behaviors of minors. We directly engaged 20 adolescents (ages 12-18) in the co-design of two different research methodologies (i.e., diary studies and analyzing social media trace data) for conducting adolescent online safety research. We also interviewed 13 of their parents to understand their perspectives. Overall, teens wanted to share their personal experiences and benefit society, while parents wanted researchers to tackle a topic that they felt was a prevalent problem for teens. Yet, they both had significant concerns regarding data privacy of the sensitive disclosures made by teens during such studies. Teens feared getting in trouble. Participants emphasized the importance of developing a trusting relationship with the researcher to overcome these concerns. Participants also saw the potential for using the research study as a tool for risk-reporting and mitigation, where researchers could act as liaisons between the teens and other parties (e.g., counselors, law enforcement, parents) to share pertinent risk details and facilitate resources, or even help teens directly by giving them strategies for mitigating online risks they encountered during the study. Our research delves into important ethical considerations for conducting risk-focused research with adolescents and uncovers the critical need for designing risk-based research for youth protection. We provide researchers with heuristic guidelines for conducting ethical research with vulnerable populations (i.e., adolescents) and keeping participants safe while doing so.
  6. We conducted a thematic content analysis of 4,180 posts by adolescents (ages 12-17) on an online peer support mental health forum to understand what and how adolescents talk about their online sexual interactions. Youth used the platform to seek support (83%), connect with others (15%), and give advice (5%) about sexting, their sexual orientation, sexual abuse, and explicit content. Females often received unwanted nudes from strangers and struggled with how to turn down sexting requests from people they knew. Meanwhile, others who sought support complained that they received unwanted sexual solicitations while doing so—to the point that adolescents gave advice to one another on which users to stay away from. Our research provides insight into the online sexual experiences of adolescents and how they seek support around these issues. We discuss how to design peer-based social media platforms to support the well-being and safety of youth. 
  7. Parental control applications are designed to help parents monitor their teens and protect them from online risks. Generally, parents are considered the primary stakeholders for these apps; therefore, the apps often emphasize increased parental control through restriction and monitoring. By taking a developmental perspective and a Value Sensitive Design approach, we explore the possibility of designing more youth-centric online safety features. We asked 39 undergraduate students in the United States to create design charrettes of parental control apps that would better represent teens as stakeholders. As emerging adults, students discussed the value tensions between teens and parents and designed features to reduce and balance these tensions. While they emphasized safety, the students also designed to improve parent-teen communication, teen autonomy and privacy, and parental support. Our research contributes to the adolescent online safety literature by presenting design ideas from emerging adults that depart from the traditional paradigm of parental control. We also make a pedagogical contribution by leveraging design charrettes as a classroom tool for engaging college students in the design of youth-centered apps. We discuss why features that support parent-teen cooperation, teen privacy, and autonomy may be more developmentally appropriate for adolescents than existing parental control app designs. 
  8. Mobile social media applications ("apps"), such as TikTok (previously Musical.ly), have recently surfaced in news media due to harmful incidents involving young children engaging with strangers through these mobile apps. To better understand children's awareness of online stranger danger and explore their visions for technologies that can help them manage related online risks (e.g., sexual solicitations and cyberbullying), we held two participatory design sessions with 12 children (ages 8-11). We found that children desired varying levels of agency, depending on the severity of the risk. In most cases, they wanted help resolving the issue themselves instead of relying on their parents to do it for them. Children also believed that social media apps should take on more responsibility in promoting online safety for children. We discuss the children's desires for agency, privacy, and automated intelligent assistance and provide novel design recommendations inspired by children.