

Award ID contains: 1844881


  1. We licensed a dataset from a mental health peer support platform catering mainly to teens and young adults. We anonymized the name of this platform to protect the individuals in our dataset. On this platform, users can post content and comment on others’ posts. Interactions are semi-anonymous: users share a photo and screen name with others, and they have the option to post with their username visible or anonymously. The platform is moderated, but the ratio of moderators to posters is low (0.00007). The original dataset included over 5 million posts and 15 million comments from 2011-2017. We scaled it to a feasible size for qualitative analysis by running a query to identify posts by (a) adolescents aged 13-17 who were seeking support for (b) online sexual experiences (not offline) with people they know (not strangers).
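The filtering step described above could look roughly like the following minimal sketch. The column names (body, poster_age) and the keyword lists are assumptions made for illustration only, not the platform's actual schema or the study's actual query terms.

```python
# Minimal sketch of a filtering query like the one described above.
# Column names and keyword lists are illustrative assumptions, not the
# platform's real schema or the study's actual codebook.
import pandas as pd

ONLINE_TERMS = ["snapchat", "dm", "text", "online", "sext"]                     # online (not offline) context
KNOWN_PERSON_TERMS = ["boyfriend", "girlfriend", "friend", "ex", "classmate"]   # known person (not stranger)

def filter_posts(posts: pd.DataFrame) -> pd.DataFrame:
    """Narrow millions of posts to those by 13-17-year-olds describing
    online sexual experiences with people they know."""
    text = posts["body"].str.lower().fillna("")
    is_teen = posts["poster_age"].between(13, 17)
    mentions_online = text.str.contains("|".join(ONLINE_TERMS))
    mentions_known = text.str.contains("|".join(KNOWN_PERSON_TERMS))
    return posts[is_teen & mentions_online & mentions_known]
```

A keyword pass like this would only be a first cut to reduce the corpus; per the abstract, the resulting subset was then analyzed qualitatively.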
  2. The methods by which we study the online experiences of adolescents should be evidence-based and informed by youth. This is especially true when studying sensitive topics, such as the online risk behaviors of minors. We directly engaged 20 adolescents (ages 12-18) in the co-design of two different research methodologies (i.e., diary studies and analyzing social media trace data) for conducting adolescent online safety research. We also interviewed 13 of their parents to understand their perspectives. Overall, teens wanted to share their personal experiences and benefit society, while parents wanted researchers to tackle a topic that they felt was a prevalent problem for teens. Yet, both had significant concerns regarding the data privacy of the sensitive disclosures teens make during such studies. Teens feared getting in trouble. Participants emphasized the importance of developing a trusting relationship with the researcher to overcome these concerns. Participants also saw the potential for using the research study as a tool for risk reporting and mitigation, where researchers could act as liaisons between teens and other parties (e.g., counselors, law enforcement, parents) to share pertinent risk details and facilitate resources, or even help teens directly by giving them strategies for mitigating online risks they encountered during the study. Our research delves into important ethical considerations for conducting risk-focused research with adolescents and uncovers the critical need for designing risk-based research for youth protection. We provide researchers with heuristic guidelines for conducting ethical research with vulnerable populations (i.e., adolescents) and keeping participants safe while doing so.
  3. Parental mediation is a key factor that influences adolescents’ exposure to online risk. Yet, research on this topic has mostly been cross-sectional and correlational, not exploring whether the relationship between parental mediation and adolescent online risk exposure could be bi-directional, with teens’ risk exposure influencing parenting practices. To address this gap, we conducted an eight-week, repeated measures web-based diary study with 68 adolescents (aged 13–17) and their parents to examine the relationships between three parental mediation strategies (active mediation, monitoring, and restriction) and three adolescent online risk types (explicit content, sexual solicitations, and online harassment) that teens reported encountering online.
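As a rough illustration of how such a bi-directional relationship can be examined with weekly repeated measures, the sketch below fits two lagged regressions: this week's risk exposure predicting next week's mediation, and vice versa, each controlling for the outcome's prior level. The long-format column names and the simple OLS setup are assumptions for illustration, not the study's actual analysis.

```python
# Illustrative cross-lagged sketch, assuming a long-format weekly diary table
# with columns: family_id, week, parental_mediation, risk_exposure.
import pandas as pd
import statsmodels.formula.api as smf

def cross_lagged(diary: pd.DataFrame):
    d = diary.sort_values(["family_id", "week"]).copy()
    grouped = d.groupby("family_id")
    d["mediation_next"] = grouped["parental_mediation"].shift(-1)  # next week's mediation
    d["risk_next"] = grouped["risk_exposure"].shift(-1)            # next week's risk exposure
    d = d.dropna(subset=["mediation_next", "risk_next"])

    # Cross-lagged paths: does risk now predict mediation later, and does
    # mediation now predict risk later, controlling for each outcome's prior level?
    risk_to_mediation = smf.ols("mediation_next ~ risk_exposure + parental_mediation", data=d).fit()
    mediation_to_risk = smf.ols("risk_next ~ parental_mediation + risk_exposure", data=d).fit()
    return risk_to_mediation, mediation_to_risk
```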
  4. Traditional parental control applications designed to protect children and teens from online risks do so through parental restrictions and privacy-invasive monitoring. We propose a new approach to adolescent online safety that aims to strike a balance between a teen’s privacy and their online safety through active communication and fostering trust between parents and children. We designed and developed an Android “app” called Circle of Trust and conducted a mixed methods user study of 17 parent-child pairs to understand their perceptions about the app. Using a within-subjects experimental design, we found that parents and children significantly preferred our new app design over existing parental control apps in terms of perceived usefulness, ease of use, and behavioral intent to use. By applying a lens of Value Sensitive Design to our interview data, we uncovered that parents and children who valued privacy, trust, freedom, and balance of power preferred our app over traditional apps. However, those who valued transparency and control preferred the status quo. Overall, we found that our app was better suited for teens than for younger children.
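For a within-subjects comparison like the one described above, where each parent-child pair rates both the prototype and a traditional parental control app on perceived usefulness, ease of use, and behavioral intent to use, a paired non-parametric test is one common choice. The column names and the Wilcoxon signed-rank test below are assumptions for illustration, not necessarily the analysis the study used.

```python
# Illustrative paired comparison for a within-subjects design, assuming one row
# per participant with ratings of both apps on the same Likert-type scales.
import pandas as pd
from scipy.stats import wilcoxon

def compare_conditions(ratings: pd.DataFrame) -> dict:
    """Compare prototype vs. traditional app ratings on each measure."""
    results = {}
    for measure in ["usefulness", "ease_of_use", "intent_to_use"]:
        stat, p = wilcoxon(ratings[f"{measure}_prototype"],
                           ratings[f"{measure}_traditional"])
        results[measure] = {"W": stat, "p": p}
    return results
```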
  5. We conducted a thematic content analysis of 4,180 posts by adolescents (ages 12-17) on an online peer support mental health forum to understand what and how adolescents talk about their online sexual interactions. Youth used the platform to seek support (83%), connect with others (15%), and give advice (5%) about sexting, their sexual orientation, sexual abuse, and explicit content. Females often received unwanted nudes from strangers and struggled with how to turn down sexting requests from people they knew. Meanwhile, others who sought support complained that they received unwanted sexual solicitations while doing so—to the point that adolescents gave advice to one another on which users to stay away from. Our research provides insight into the online sexual experiences of adolescents and how they seek support around these issues. We discuss how to design peer-based social media platforms to support the well-being and safety of youth.
  6. Parental control applications are designed to help parents monitor their teens and protect them from online risks. Generally, parents are considered the primary stakeholders for these apps; therefore, the apps often emphasize increased parental control through restriction and monitoring. By taking a developmental perspective and a Value Sensitive Design approach, we explore the possibility of designing more youth-centric online safety features. We asked 39 undergraduate students in the United States to create design charrettes of parental control apps that would better represent teens as stakeholders. As emerging adults, students discussed the value tensions between teens and parents and designed features to reduce and balance these tensions. While they emphasized safety, the students also designed to improve parent-teen communication, teen autonomy and privacy, and parental support. Our research contributes to the adolescent online safety literature by presenting design ideas from emerging adults that depart from the traditional paradigm of parental control. We also make a pedagogical contribution by leveraging design charrettes as a classroom tool for engaging college students in the design of youth-centered apps. We discuss why features that support parent-teen cooperation, teen privacy, and autonomy may be more developmentally appropriate for adolescents than existing parental control app designs.
  7. Mobile social media applications ("apps"), such as TikTok (previously Musical.ly), have recently surfaced in news media due to harmful incidents involving young children engaging with strangers through these mobile apps. To better understand children's awareness of online stranger danger and explore their visions for technologies that can help them manage related online risks (e.g., sexual solicitations and cyberbullying), we held two participatory design sessions with 12 children (ages 8-11 years old). We found that children desired varying levels of agency, depending on the severity of the risk. In most cases, they wanted help resolving the issue themselves instead of relying on their parents to do it for them. Children also believed that social media apps should take on more responsibility in promoting online safety for children. We discuss the children's desires for agency, privacy, and automated intelligent assistance and provide novel design recommendations inspired by children.