Millions of people participate in online peer-to-peer support sessions, yet there has been little prior research on systematic, psychology-based evaluation of fine-grained peer-counselor behavior in relation to client satisfaction. This paper seeks to bridge this gap by mapping peer-counselor chat messages to motivational interviewing (MI) techniques. We annotate 14,797 utterances from 734 chat conversations using 17 MI techniques and introduce four new interviewing codes, such as "chit-chat" and "inappropriate", to account for the unique conversational patterns observed on online platforms. We automate the labeling of peer-counselor responses with MI techniques by fine-tuning large domain-specific language models, and then use these automated measures to investigate the behavior of peer counselors via correlational studies. Specifically, we study the impact of MI techniques on conversation ratings to identify the techniques that predict clients' satisfaction with their counseling sessions. When counselors use techniques such as reflection and affirmation, clients are more satisfied. Examining changes in volunteer counselors' usage of techniques suggests that counselors learn to use more introductions and open questions as they gain experience. This work provides a deeper understanding of the use of motivational interviewing techniques on peer-to-peer counseling platforms and sheds light on how to build better training programs for volunteer counselors on online platforms.
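The correlational analysis described above — relating how often a counselor uses an MI technique to the client's session rating — can be illustrated with a minimal sketch. The data below is invented for illustration and the `pearson` helper is a plain textbook formula, not the paper's statistical pipeline.

```python
# Toy illustration (not the paper's data or code): correlating per-conversation
# counts of an MI technique with the client's satisfaction rating.
from statistics import mean

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-conversation counts of "reflection" utterances and 1-5 ratings.
reflections = [0, 1, 2, 4, 5, 7]
ratings = [2, 2, 3, 4, 4, 5]

r = pearson(reflections, ratings)  # positive r: more reflections, higher ratings
```

In the paper the technique counts themselves come from a fine-tuned classifier over utterances; here they are simply given.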
Helping the Helper: Supporting Peer Counselors via AI-Empowered Practice and Feedback
Millions of users come to online peer counseling platforms to seek support on diverse topics ranging from relationship stress to anxiety. However, studies show that online peer support groups are not always as effective as expected, largely due to users' negative experiences with unhelpful counselors. Peer counselors are key to the success of online peer counseling platforms, but most of them lack systematic ways to receive guidance or supervision. In this work, we introduce CARE: an interactive AI-based tool to empower peer counselors through automatic suggestion generation. During the practical training stage, CARE helps diagnose which specific counseling strategies are most suitable in the given context and provides tailored example responses as suggestions. Counselors can choose to select, modify, or ignore any suggestion before replying to the support seeker. Building upon the Motivational Interviewing framework, CARE utilizes large-scale counseling conversation data together with advanced natural language generation techniques to achieve these functionalities. We demonstrate the efficacy of CARE by performing both quantitative evaluations and qualitative user studies through simulated chats and semi-structured interviews. We also find that CARE especially helps novice counselors respond better in challenging situations.
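The CARE workflow — diagnose a suitable counseling strategy for the context, then surface a tailored example response the counselor may select, modify, or ignore — can be sketched minimally. Everything below is a hypothetical stand-in: the keyword rule replaces the paper's strategy-diagnosis model, and the canned responses replace its generation component.

```python
# Minimal sketch of a CARE-style suggestion flow (assumed structure, not the
# released system): pick a strategy for the seeker's message, then look up an
# example response the counselor can edit or discard.

STRATEGY_EXAMPLES = {
    "reflection": "It sounds like you're feeling overwhelmed by everything at once.",
    "affirmation": "Reaching out to talk about this takes real courage.",
    "open_question": "What do you think has been weighing on you the most?",
}

def suggest_strategy(message: str) -> str:
    """Toy stand-in for the strategy-diagnosis model: a keyword rule."""
    text = message.lower()
    if "thank" in text:
        return "affirmation"
    if any(w in text for w in ("stressed", "anxious", "overwhelmed")):
        return "reflection"
    return "open_question"

def suggest_response(message: str) -> tuple[str, str]:
    """Return (strategy, example response) for a seeker message."""
    strategy = suggest_strategy(message)
    return strategy, STRATEGY_EXAMPLES[strategy]

strategy, example = suggest_response("I'm so stressed about my exams")
```

In the actual system both steps are learned from large-scale counseling conversation data; the point here is only the two-stage structure of the interaction.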
- Award ID(s): 2247357
- PAR ID: 10506667
- Publisher / Repository: arXiv preprint arXiv:2305.08982
- Date Published:
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Realistic practice and tailored feedback are key processes for training peer counselors with clinical skills. However, existing mechanisms for providing feedback largely rely on human supervision. Peer counselors often lack ways to receive detailed feedback from experienced mentors, making it difficult for them to support the large number of people with mental health issues who use peer counseling. Our work aims to leverage large language models to provide contextualized and multi-level feedback to empower peer counselors, especially novices, at scale. To achieve this, we co-design a multi-level feedback taxonomy with a group of senior psychotherapy supervisors, and then construct a publicly available dataset with comprehensive feedback annotations for 400 emotional support conversations. We further design a self-improvement method on top of large language models to enhance the automatic generation of feedback. Via qualitative and quantitative evaluation with domain experts, we demonstrate that our method minimizes the risk of potentially harmful and low-quality feedback generation, which is desirable in such high-stakes scenarios.
- Online mental health support communities, in which volunteer counselors provide accessible mental and emotional health support, have grown in recent years. Despite millions of people using these platforms, the clinical effectiveness of these communities on mental health symptoms remains unknown. Although volunteers receive some training on therapeutic skills proven effective in face-to-face settings, such as active listening and motivational interviewing, it is unclear how the usage of these skills in an online context affects people's mental health. In our work, we collaborate with one of the largest online peer support platforms and use both natural language processing and machine learning techniques to examine how one-on-one support chats on the platform affect clients' depression and anxiety symptoms. We measure how characteristics of support providers, such as their experience on the platform and use of therapeutic skills (e.g., affirmation, showing empathy), affect support seekers' mental health changes. Based on a propensity-score matching analysis that approximates a random-assignment experiment, results show that online peer support chats improve both depression and anxiety symptoms with a statistically significant but relatively small effect size. Additionally, support providers' techniques such as emphasizing the autonomy of the client lead to better mental health outcomes. However, we also find that certain behaviors, such as persuading and providing information, are associated with worsening mental health symptoms. Our work provides key understanding for mental health care in the online setting and for designing training systems for online support providers.
- Ensuring the effectiveness of text-based crisis counseling requires observing ongoing conversations and providing feedback, both labor-intensive tasks. Automatic analysis of conversations—at the full-chat and utterance levels—may help support counselors and provide better care. While some session-level training data (e.g., rating of patient risk) is often available from counselors, labeling utterances requires expensive post hoc annotation. Utterance-level labels, however, not only provide insight into conversation dynamics but also support quality-assurance efforts for counselors. In this paper, we examine whether inexpensive—and potentially noisy—session-level annotation can help improve utterance labeling. To this end, we propose a logic-based indirect supervision approach that exploits declaratively stated structural dependencies between both levels of annotation to improve utterance modeling. We show that adding these rules gives an improvement of 3.5% F-score over a strong multi-task baseline for utterance-level predictions. We demonstrate via ablation studies how indirect supervision via logic rules also improves the consistency and robustness of the system.
- Online peer-to-peer therapy sessions can be effective in improving people's mental well-being. However, online volunteer counselors may lack the expertise and necessary training to provide high-quality sessions, and these low-quality sessions may negatively impact volunteers' motivation as well as clients' well-being. This paper uses interviews with 20 senior online volunteer counselors to examine how they addressed challenges and acquired skills while volunteering in a large mental-health support community, 7Cups.com. Although volunteers in this community received some training based on principles of active listening and motivational interviewing, results indicate that the training was insufficient and that volunteer counselors had to independently develop strategies to deal with specific challenges they encountered in their volunteer work. Their strategies, however, might deviate from standard practice, since they generally lacked systematic feedback from mentors or clients and instead relied on their personal experiences. Additionally, volunteer counselors reported having difficulty maintaining professional boundaries with clients. Even though training and support resources were available, they were underutilized. The results of this study uncover new design spaces for HCI practitioners and researchers, including social computing and artificial intelligence approaches that may provide better support to volunteer counselors in online mental health communities.
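The propensity-score matching analysis mentioned in the related work above can be illustrated with a minimal sketch: pair each treated unit (a client who received a support chat) with the control whose propensity score is closest, then compare outcomes within matched pairs. The data, the greedy 1:1 matching rule, and the outcome scale below are all invented for illustration; the actual study's matching procedure is not specified here.

```python
# Toy sketch of propensity-score matching (not the study's pipeline): greedy
# 1:1 nearest-neighbor matching on propensity scores, then the average
# treated-minus-control outcome difference across matched pairs.

def greedy_match(treated, controls):
    """treated/controls: lists of (propensity_score, outcome) tuples.
    Greedily pairs each treated unit with the closest unused control."""
    pool = list(controls)
    pairs = []
    for score, outcome in sorted(treated):  # sorted by score
        best = min(pool, key=lambda c: abs(c[0] - score))
        pool.remove(best)  # each control is used at most once
        pairs.append((outcome, best[1]))
    return pairs

# Hypothetical (score, symptom-change) data; negative change = improvement.
treated = [(0.8, -3.0), (0.6, -2.0), (0.4, -1.0)]
controls = [(0.81, -1.0), (0.55, -1.0), (0.42, 0.0), (0.1, 2.0)]

pairs = greedy_match(treated, controls)
# Average within-pair difference; negative means treated improved more.
att = sum(t - c for t, c in pairs) / len(pairs)
```

Real analyses add caliper thresholds and covariate-balance checks on top of this basic pairing step.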
 An official website of the United States government