

Title: Agent-Based Dynamic Collaboration Support in a Smart Office Space
For the past 15 years, conversational agents have been used in computer-supported collaborative learning applications to structure group interactions in online chat-based environments. A series of experimental studies has provided an empirical foundation for the design of chat-based conversational agents that significantly improve learning over both no-support and static-support control conditions. In this demo, we expand upon this foundation, bringing conversational agents that structure group interaction into physical spaces, with the specific goal of facilitating collaboration and learning in workplace scenarios.
Award ID(s):
1917955 1822831
NSF-PAR ID:
10174662
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of the SIGdial 2020 Conference
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. For nearly two decades, conversational agents have been used to structure group interactions in online chat-based environments. More recently, this form of dynamic support for collaborative learning has been extended to physical spaces using a combination of multimodal sensing technologies and instrumentation installed within the space. This demo extends the reach of dynamic support for collaboration still further through an application of what has recently been termed on-device machine learning, which enables a portable form of multimodal detection to trigger real-time responses. An illustrative sketch of such a trigger loop appears after this list.
  2. Negotiation is a complex social interaction that encapsulates emotional encounters in human decision-making. Virtual agents that can negotiate with humans are useful in pedagogy and conversational AI. To advance the development of such agents, we explore the prediction of two important subjective goals in a negotiation – outcome satisfaction and partner perception. Specifically, we analyze the extent to which emotion attributes extracted from the negotiation help in the prediction, above and beyond the individual difference variables. We focus on a recent dataset of chat-based negotiations, grounded in a realistic camping scenario. We study emotion at three levels of representation – emoticons, lexical, and contextual – by leveraging affective lexicons and a state-of-the-art deep learning architecture. Our insights will be helpful in designing adaptive negotiation agents that interact through realistic communication interfaces. An illustrative sketch of this kind of feature extraction appears after this list.
  3. Conversational systems typically focus on functional tasks such as scheduling appointments or creating to-do lists. Instead, we design and evaluate SlugBot (SB), one of 8 semifinalists in the 2018 Alexa Prize, whose goal is to support casual open-domain social interaction. This novel application requires both broad topic coverage and engaging interactive skills. We developed a new technical approach to meet this demanding situation by crowd-sourcing novel content and introducing playful conversational strategies based on storytelling and games. We collected over 10,000 conversations during August 2018 as part of the Alexa Prize competition. We also conducted an in-lab follow-up qualitative evaluation. Overall, users found SB moderately engaging; conversations averaged 3.6 minutes and involved 26 user turns. However, users reacted very differently to different conversation subtypes. Storytelling and games were evaluated positively; these were seen as entertaining with predictable interactive structure. They also led users to impute personality and intelligence to SB. In contrast, search and general chit-chat induced coverage problems; here users found it hard to infer what topics SB could understand, and these conversations were seen as too system-driven. Theoretical and design implications suggest a move away from conversational systems that simply provide factual information. Future systems should be designed to have their own opinions and personal stories to share, and SB provides an example of how we might achieve this.
  4. Virtual conversational assistants designed specifically for software engineers could have a huge impact on the time it takes software engineers to get help. Research efforts are focusing on virtual assistants that support specific software development tasks such as bug repair and pair programming. In this paper, we study the use of online chat platforms as a resource for collecting developer opinions that could help in building opinion Q&A systems, a specialized instance of virtual assistants and chatbots for software engineers. Opinion Q&A has a stronger presence in chats than in other developer communications, so mining chats can provide a valuable resource for developers to quickly get insight into a specific development topic (e.g., What is the best Java library for parsing JSON?). We address the problem of opinion Q&A extraction by developing automatic identification of opinion-asking questions and extraction of participants’ answers from public online developer chats. We evaluate our automatic approaches on chats spanning six programming communities and two platforms. Our results show that a heuristic approach to identifying opinion-asking questions works well (0.87 precision), and that a deep learning approach customized to the software domain outperforms heuristic-based, machine-learning-based, and deep-learning approaches for answer extraction in community question answering. An illustrative sketch of such a question heuristic appears after this list.
  5. Abstract

    The EngageAI Institute focuses on AI‐driven narrative‐centered learning environments that create engaging story‐based problem‐solving experiences to support collaborative learning. The institute's research has three complementary strands. First, the institute creates narrative‐centered learning environments that generate interactive story‐based problem scenarios to elicit rich communication, encourage coordination, and spark collaborative creativity. Second, the institute creates virtual embodied conversational agent technologies with multiple modalities for communication (speech, facial expression, gesture, gaze, and posture) to support student learning. Embodied conversational agents are driven by advances in natural language understanding, natural language generation, and computer vision. Third, the institute is creating an innovative multimodal learning analytics framework that analyzes parallel streams of multimodal data derived from students’ conversations, gaze, facial expressions, gesture, and posture as they interact with each other, with teachers, and with embodied conversational agents. Woven throughout the institute's activities is a strong focus on ethics, with an emphasis on creating AI‐augmented learning that is deeply informed by considerations of fairness, accountability, transparency, trust, and privacy. The institute emphasizes broad participation and diverse perspectives to ensure that advances in AI‐augmented learning address inequities in STEM. The institute brings together a multistate network of universities, diverse K‐12 school systems, science museums, and nonprofit partners. Key to all of these endeavors is an emphasis on diversity, equity, and inclusion.

     
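As referenced in item 1, the sketch below is a minimal illustration of the kind of real-time trigger loop that on-device multimodal detection enables. Everything here is assumed for illustration: the sensor readers, the fused score, and the threshold are placeholder stand-ins rather than details of the demo's actual implementation.

```python
# Minimal sketch of an on-device trigger loop for dynamic collaboration support.
# The sensor readers, fusion weights, and threshold are hypothetical stand-ins.

import random
import time


def read_speech_activity() -> float:
    """Stand-in for an on-device voice-activity estimate in [0, 1]."""
    return random.random()


def read_motion_level() -> float:
    """Stand-in for an on-device motion/occupancy estimate in [0, 1]."""
    return random.random()


def collaboration_score(speech: float, motion: float) -> float:
    """Toy fusion of two modalities; a real system would use a learned model."""
    return 0.7 * speech + 0.3 * motion


def run_agent_loop(threshold: float = 0.8, poll_seconds: float = 1.0, steps: int = 10) -> None:
    """Poll lightweight multimodal signals and trigger an agent prompt in real time."""
    for _ in range(steps):
        score = collaboration_score(read_speech_activity(), read_motion_level())
        if score > threshold:
            # In a deployed system this would route to the conversational agent.
            print(f"[agent] Prompting the group (score={score:.2f})")
        time.sleep(poll_seconds)


if __name__ == "__main__":
    run_agent_loop()
```

In a real deployment the placeholder readers would be replaced by lightweight on-device models consuming audio and vision streams, and the print statement by a call into the conversational agent.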
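Item 2's three levels of emotion representation can be pictured with a toy pipeline: count emoticons, average lexical valence from an affective lexicon, and feed the features to a simple classifier. The emoticon inventory, lexicon entries, and training examples below are invented for illustration; the study itself uses established affective lexicons and a deep learning architecture.

```python
# Rough sketch of emotion-feature extraction for predicting a subjective
# negotiation outcome. The emoticon list, toy lexicon, and example data are
# hypothetical placeholders.

import numpy as np
from sklearn.linear_model import LogisticRegression

EMOTICONS = {":)", ":(", ":D", ";)"}            # toy emoticon inventory
AFFECT_LEXICON = {"great": 1.0, "happy": 0.8,   # toy valence scores
                  "unfair": -0.7, "angry": -1.0}


def emotion_features(utterances: list[str]) -> np.ndarray:
    """Count emoticons and average lexical valence over a dialogue."""
    tokens = [tok for u in utterances for tok in u.lower().split()]
    emoticon_count = sum(tok in EMOTICONS for tok in tokens)
    valences = [AFFECT_LEXICON[tok] for tok in tokens if tok in AFFECT_LEXICON]
    mean_valence = float(np.mean(valences)) if valences else 0.0
    return np.array([emoticon_count, mean_valence])


# Hypothetical training data: one dialogue per row, binary "satisfied" labels.
dialogues = [["that deal sounds great :)"], ["this split is unfair :("]]
X = np.stack([emotion_features(d) for d in dialogues])
y = np.array([1, 0])

clf = LogisticRegression().fit(X, y)
print(clf.predict(X))
```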
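For item 4, a heuristic detector of opinion-asking questions might look roughly like the pattern matcher below. The patterns are illustrative guesses, not the rule set the paper evaluates (which reports 0.87 precision).

```python
# Illustrative heuristic for flagging opinion-asking questions in developer chat.
# The patterns below are hypothetical examples, not the paper's rule set.

import re

OPINION_PATTERNS = [
    r"\bwhat(?:'s| is) the best\b",
    r"\bwhich .* (?:do you|would you) (?:prefer|recommend|use)\b",
    r"\bany recommendations?\b",
    r"\bis it better to\b",
]


def is_opinion_question(message: str) -> bool:
    """Return True if a chat message looks like it is asking for an opinion."""
    text = message.lower()
    return text.endswith("?") and any(re.search(p, text) for p in OPINION_PATTERNS)


examples = [
    "What is the best Java library for parsing JSON?",
    "How do I open a file in Python?",
]
for msg in examples:
    print(msg, "->", is_opinion_question(msg))
```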