Autism is a neurodevelopmental disability that affects social communication and interaction. Without adequate support, this can increase the loneliness experienced by autistic people. Communication technology, such as augmentative and alternative communication (AAC) tools, can help support social communication, especially when co-designed with autistic people. We conducted a series of design workshops to co-design a new AAC system specifically for supporting social communication. In this paper, we focus on the accessibility issues identified while running our workshops and provide recommendations for improving the process. We found that, to make design workshops more accessible to autistic adults, it is critical to build information-processing time into the workshops, include a variety of AAC stakeholders, and create a shared vocabulary among the workshop participants.
Designing for Common Ground: Visually Representing Conversation Dynamics of Neurodiverse Dyads
During interpersonal interactions, conversational moves can help people establish common ground: a shared frame of reference to support communication. Neurodiverse conversation dyads that include autistic and non-autistic members can experience challenges in creating and maintaining such a shared frame of reference due to differing communication and cognitive styles. We conducted a design study to understand conversational patterns among neurodiverse dyads and then used those patterns to co-design concepts for supporting the creation and maintenance of common ground by those conversation pairs. Our study involved two activities with participants: (1) a paired interview with autistic adults and a trusted conversation partner that used a novel swimlane visual elicitation activity, and (2) a remote design study during which the autistic participants designed a game intended to visualize and support neurodiverse conversation dynamics. We found that communication technology can scaffold neurodiverse dyads in locating common ground by supporting crucial individual and joint decision-making; clarification of language and emotions; and embodied sense-making of identity, relationships, and shared information. This project generated insights related to two distinct aspects of designing assistive Information and Communication Technology (ICT) to support autistic individuals: (1) the ability for visual elicitation activities to help autistic individuals recognize interaction patterns, gain a deeper understanding of others' perspectives, and imagine more desirable alternatives, and (2) the importance of recognizing and supporting multi-dimensional aspects of communication practices (i.e., social, emotional, sensory) in establishing and maintaining shared points of reference for neurodiverse conversation dyads.
- Award ID(s):
- 1845023
- PAR ID:
- 10603353
- Publisher / Repository:
- Association for Computing Machinery (ACM)
- Date Published:
- Journal Name:
- Proceedings of the ACM on Human-Computer Interaction
- Volume:
- 7
- Issue:
- CSCW2
- ISSN:
- 2573-0142
- Format(s):
- Medium: X
- Size(s):
- p. 1-33
- Sponsoring Org:
- National Science Foundation
More Like this
-
-
For autistic individuals, navigating social and emotional interactions can be complex, often involving disproportionately high cognitive labor in contrast to neurotypical conversation partners. Through a novel approach to speculative co-design, autistic adults explored affective imaginaries — imagined futuristic technology interventions — to probe a provocative question: What if technology could translate emotions like it can translate spoken language? The resulting speculative prototype for an image-enabled emotion translator chat application included: (1) a visual system for representing personalized emotion taxonomies, and (2) a Wizard of Oz implementation of these taxonomies in a low-fidelity chat application. Although wary of technology that purports to understand emotions, autistic participants saw value in being able to deploy visual emotion taxonomies during chats with neurotypical conversation partners. This work shows that affective technology should enable users to: (1) curate encodings of emotions used in system artifacts, (2) enhance interactive emotional understanding, and (3) have agency over how and when to use emotion features.
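The abstract's first design goal — letting users curate their own encodings of emotions — can be sketched as a simple data structure. This is an illustrative schema only; the field names and the `EmotionTaxonomy` API are hypothetical, not the paper's implementation:

```python
from dataclasses import dataclass, field

@dataclass
class EmotionEntry:
    """One user-curated emotion encoding (hypothetical schema)."""
    label: str        # the user's own word for the feeling
    image: str        # path or URL of the visual chosen to represent it
    notes: str = ""   # how the user wants conversation partners to read it

@dataclass
class EmotionTaxonomy:
    """A personalized emotion taxonomy the user controls, per goal (1)."""
    owner: str
    entries: dict = field(default_factory=dict)

    def curate(self, entry: EmotionEntry) -> None:
        # The user, not the system, decides what each emotion means.
        self.entries[entry.label] = entry

    def lookup(self, label: str):
        return self.entries.get(label)

taxonomy = EmotionTaxonomy(owner="P1")
taxonomy.curate(EmotionEntry(label="buzzing",
                             image="img/buzzing.png",
                             notes="overstimulated; need quiet"))
print(taxonomy.lookup("buzzing").notes)  # overstimulated; need quiet
```

Keeping the taxonomy per-user rather than system-defined reflects the paper's finding that participants were wary of technology that purports to understand emotions for them.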
-
Computer-based job interview training, including virtual reality (VR) simulations, has gained popularity in recent years to support and aid autistic individuals, who face significant challenges and barriers in finding and maintaining employment. Although popular, these training systems often fail to resemble the complexity and dynamism of the employment interview, as the dialogue management for the virtual conversation agent either relies on choosing from a menu of prespecified answers, or dialogue processing is based on keyword extraction from the transcribed speech of the interviewee, which depends on the interview script. We address this limitation through automated dialogue act classification via transfer learning. This allows for recognizing intent from user speech, independent of the domain of the interview. We also redress the lack of training data for a domain-general job interview dialogue act classifier by providing an original dataset with responses to interview questions within a virtual job interview platform from 22 autistic participants. Participants' responses to a customized interview script were transcribed to text and annotated according to a custom 13-class dialogue act scheme. The best classifier was a fine-tuned bidirectional encoder representations from transformers (BERT) model, with an F1-score of 87%.
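The reported F1-score aggregates per-class precision and recall; for a multi-class scheme like the 13 dialogue acts above, macro-averaging (the unweighted mean of per-class F1) is one common choice. A minimal stdlib sketch, with hypothetical labels and predictions rather than the paper's data:

```python
from collections import Counter

def macro_f1(y_true, y_pred):
    """Macro-averaged F1: unweighted mean of per-class F1 scores."""
    labels = set(y_true) | set(y_pred)
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1          # correct prediction for class t
        else:
            fp[p] += 1          # p predicted but wrong
            fn[t] += 1          # t missed
    f1s = []
    for lab in labels:
        prec = tp[lab] / (tp[lab] + fp[lab]) if (tp[lab] + fp[lab]) else 0.0
        rec = tp[lab] / (tp[lab] + fn[lab]) if (tp[lab] + fn[lab]) else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if (prec + rec) else 0.0)
    return sum(f1s) / len(f1s)

# Toy dialogue act labels (illustrative, not the paper's 13-class scheme)
true = ["answer", "answer", "question", "greeting", "answer"]
pred = ["answer", "question", "question", "greeting", "answer"]
print(round(macro_f1(true, pred), 3))  # 0.822
```

The abstract does not state which averaging the 87% uses, so treat this as a sketch of the metric family rather than a reproduction of the paper's evaluation.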
-
Objectives: The main aim of this study was to demonstrate how ordered network analysis of video-recorded interactions combined with verbal response mode (VRM) coding (eg, edification, disclosure, reflection and interpretation) can uncover specific communication patterns that contribute to the development of shared understanding between physicians and nurses. The major hypothesis was that dyads that reached shared understanding would exhibit different sequential relationships between VRM codes compared with dyads that did not reach shared understanding. Design: Observational study design with the secondary analysis of video-recorded interactions. Setting: The study was conducted on two oncology units at a large Midwestern academic health care system in the USA. Participants: A total of 33 unique physician–nurse dyadic interactions were included in the analysis. Participants were the physicians and nurses involved in these interactions during patient care rounds. Primary and secondary outcome measures: The primary outcome measure was the development of shared understanding between physicians and nurses, as determined by prior qualitative analysis. Secondary measures included the frequencies, orders and co-occurrences of VRM codes in the interactions. Results: A Mann-Whitney U test showed that dyads that reached shared understanding (N=6) were statistically significantly different (U=148, p=0.00, r=0.93) from dyads that did not reach shared understanding (N=25) in terms of the sequential relationships between edification and disclosure, edification and advisement, as well as edification and questioning. Dyads that reached shared understanding engaged in more edification followed by disclosure, suggesting the importance of this communication pattern for reaching shared understanding. Conclusions: This novel methodology demonstrates a robust approach to inform interventions that enhance physician–nurse communication. Further research could explore applying this approach in other healthcare settings and contexts.
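The Mann-Whitney U statistic reported above can be computed directly from pairwise comparisons between the two groups. A minimal stdlib sketch with illustrative data (not the study's), pairing U with one common effect-size formulation, the rank-biserial correlation r = 2U/(n1·n2) − 1 (the abstract does not say which r it reports, so this is an assumption):

```python
def mann_whitney_u(a, b):
    """U statistic for sample `a`, counted pairwise:
    1 per pair with a > b, 0.5 per tied pair."""
    return sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x in a for y in b)

def rank_biserial(a, b):
    """Rank-biserial effect size, r = 2U/(n1*n2) - 1, in [-1, 1]."""
    return 2 * mann_whitney_u(a, b) / (len(a) * len(b)) - 1

# Illustrative scores: group A uniformly outranks group B
a = [9, 8, 7, 6]
b = [1, 2, 3]
print(mann_whitney_u(a, b), rank_biserial(a, b))  # 12.0 1.0
```

For real analyses, `scipy.stats.mannwhitneyu` computes the same statistic (via ranks, with tie and continuity corrections) along with the p-value.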
-
In recent years, the popularity of AI-enabled conversational agents or chatbots has risen as an alternative to traditional online surveys to elicit information from people. However, there is a gap in using single-agent chatbots to converse and gather multi-faceted information across a wide variety of topics. Prior works suggest that single-agent chatbots struggle to understand user intentions and interpret human language during a multi-faceted conversation. In this work, we investigated how multi-agent chatbot systems can be utilized to conduct a multi-faceted conversation across multiple domains. To that end, we conducted a Wizard of Oz study to investigate the design of a multi-agent chatbot for gathering public input across multiple high-level domains and their associated topics. Next, we designed, developed, and evaluated CommunityBots - a multi-agent chatbot platform where each chatbot handles a different domain individually. To manage conversation across multiple topics and chatbots, we proposed a novel Conversation and Topic Management (CTM) mechanism that handles topic-switching and chatbot-switching based on user responses and intentions. We conducted a between-subject study comparing CommunityBots to a single-agent chatbot baseline with 96 crowd workers. The results from our evaluation demonstrate that CommunityBots participants were significantly more engaged, provided higher quality responses, and experienced fewer conversation interruptions while conversing with multiple different chatbots in the same session. We also found that the visual cues integrated with the interface helped the participants better understand the functionalities of the CTM mechanism, which enabled them to perceive changes in textual conversation, leading to better user satisfaction. Based on the empirical insights from our study, we discuss future research avenues for multi-agent chatbot design and its application for rich information elicitation.
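The core idea of the CTM mechanism — switching the active chatbot when a user's utterance signals a different domain — can be illustrated with a toy keyword-cue dispatcher. The class name, domains, cues, and switching rule below are all hypothetical stand-ins; the paper's mechanism works on recognized user intentions, not bare keywords:

```python
class CommunityRouter:
    """Illustrative topic/chatbot-switching dispatcher in the spirit of
    the CTM mechanism (names and rules hypothetical)."""

    def __init__(self, agents):
        # agents: domain name -> list of keyword cues signalling that domain
        self.agents = agents
        self.active = next(iter(agents))  # start with the first domain

    def route(self, utterance):
        text = utterance.lower()
        for domain, cues in self.agents.items():
            if domain != self.active and any(cue in text for cue in cues):
                self.active = domain   # chatbot switch on detected topic shift
                break
        return self.active             # domain whose chatbot handles the turn

router = CommunityRouter({
    "transportation": ["bus", "bike", "traffic"],
    "housing": ["rent", "zoning", "apartment"],
})
print(router.route("the rent downtown keeps rising"))  # housing
print(router.route("and add more bus lines please"))   # transportation
```

A production system would also need the visual cues the study found important, so users can perceive when the active chatbot changes mid-conversation.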
