Abstract Background: Facilitating effective teamwork experiences is difficult in large courses. This study addresses that challenge by integrating teamwork pedagogy into a semester-long project in a large class of 118 students organized into 26 teams. Data were collected from two online teamwork sessions that teams self-recorded while collaborating during in-class time. The video recordings were qualitatively analyzed and visualized to identify patterns in team dynamics processes, with the aim of understanding how team members engaged in these processes during different phases of the semester. Results: Findings suggest that team members were mostly active or passive during meetings and less constructive or interactive in their engagement. They mainly engaged in communication, team orientation, and feedback behaviors. Over time, interactions among team members remained largely stable: feedback behaviors tended to diminish, while coordination behaviors stayed about the same or increased slightly. Conclusion: The implications of this study extend to both practice and theory. Practically, combining cooperative learning and Scrum practices enabled a blend of collaborative and cooperative work, suggesting that teams should be given tools and structures to coordinate teamwork processes and promote interaction among members. Theoretically, the study contributes to understanding temporal aspects of teamwork dynamics by examining how team interactions evolve across working sessions at different points in time. Overall, this research offers insights for educators, practitioners, and researchers aiming to enhance teamwork experiences in large courses, particularly in software development disciplines.
(Mis)align: a simple dynamic framework for modeling interpersonal coordination
Abstract As people coordinate in daily interactions, they engage in different patterns of behavior to achieve successful outcomes. This includes both synchrony—the temporal coordination of the same behaviors at the same time—and complementarity—the coordination of the same or different behaviors that may occur at different relative times. Using computational methods, we develop a simple framework to describe the interpersonal dynamics of behavioral synchrony and complementarity over time, and explore their task-dependence. A key feature of this framework is the inclusion of a task context that mediates interactions, and consists of active, inactive, and inhibitory constraints on communication. Initial simulation results show that these task constraints can be a robust predictor of simulated agents’ behaviors over time. We also show that the framework can reproduce some general patterns observed in human interaction data. We describe preliminary theoretical implications from these results, and relate them to broader proposals of synergistic self-organization in communication.
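To make the framework's core idea concrete, the following is a minimal, hypothetical sketch rather than the authors' actual model: two simulated agents repeatedly choose behaviors, a task context of active, inactive, and inhibitory constraints weights those choices, and synchrony is counted whenever both agents produce the same behavior at the same step. The behavior set, constraint values, and matching bias are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
BEHAVIORS = ["gesture", "speak", "pause"]          # hypothetical behavior set
# Hypothetical task context: +1 actively encourages a behavior, 0 leaves it
# inactive, -1 inhibits it (loosely mirroring the framework's constraint types)
TASK = {"gesture": 1.0, "speak": 0.0, "pause": -1.0}

def choose(partner_prev):
    """Pick a behavior, weighting options by the task constraints plus a
    small bias toward matching the partner's previous behavior."""
    weights = np.array([np.exp(TASK[b] + (b == partner_prev)) for b in BEHAVIORS])
    return rng.choice(BEHAVIORS, p=weights / weights.sum())

a_prev, b_prev = "pause", "pause"
sync, T = 0, 200
for _ in range(T):
    a, b = choose(b_prev), choose(a_prev)
    sync += (a == b)                               # same behavior at the same step
    a_prev, b_prev = a, b
print(f"proportion of synchronous steps: {sync / T:.2f}")
```

Raising or lowering the constraint values changes how often the agents' behaviors align, which is the kind of task-dependence the abstract describes.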
- Award ID(s): 2120932
- PAR ID: 10470983
- Publisher / Repository: Nature Publishing Group
- Date Published:
- Journal Name: Scientific Reports
- Volume: 13
- Issue: 1
- ISSN: 2045-2322
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Observing how infants and mothers coordinate their behaviors can highlight meaningful patterns in early communication and infant development. While dyads often differ in the modalities they use to communicate, especially in the first year of life, it remains unclear how to capture coordination across multiple types of behaviors using existing computational models of interpersonal synchrony. This paper explores Dynamic Mode Decomposition with control (DMDc) as a method of integrating multiple signals from each communicating partner into a model of multimodal behavioral coordination. We used an existing video dataset to track the head pose, arm pose, and vocal fundamental frequency of infants and mothers during the Face-to-Face Still-Face (FFSF) procedure, a validated 3-stage interaction paradigm. For each recorded interaction, we fit both unimodal and multimodal DMDc models to the extracted pose data. The resulting dynamic characteristics of the models were analyzed to evaluate trends in individual behaviors and dyadic processes across infant age and stages of the interactions. Results demonstrate that observed trends in interaction dynamics across stages of the FFSF protocol were stronger and more significant when models incorporated both head and arm pose data, rather than a single behavior modality. Model output showed significant trends across age, identifying changes in infant movement and in the relationship between infant and mother behaviors. Models that included mothers' audio data demonstrated similar results to those evaluated with pose data, confirming that DMDc can leverage different sets of behavioral signals from each interacting partner. Taken together, our results demonstrate the potential of DMDc toward integrating multiple behavioral signals into the measurement of multimodal interpersonal coordination.
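For readers unfamiliar with DMDc, the sketch below shows the basic least-squares fit at the heart of the method, applied to hypothetical infant and mother feature matrices. It is an illustrative NumPy reconstruction, not the paper's implementation, and it omits rank truncation and other refinements a practical pipeline would include.

```python
import numpy as np

def fit_dmdc(X, U):
    """Fit a DMDc model x_{k+1} ~= A x_k + B u_k.
    X: state snapshots (n_states x T), e.g. one partner's pose features.
    U: control inputs (n_inputs x T-1), e.g. the other partner's features.
    Returns the state matrix A and control matrix B."""
    X1, X2 = X[:, :-1], X[:, 1:]          # consecutive snapshot pairs
    Omega = np.vstack([X1, U])            # stacked state and control data
    G = X2 @ np.linalg.pinv(Omega)        # least-squares solve X2 ~= [A B] Omega
    n = X1.shape[0]
    return G[:, :n], G[:, n:]

# Hypothetical usage: 6 infant pose features, 4 mother features, 300 frames
X = np.random.randn(6, 300)
U = np.random.randn(4, 299)
A, B = fit_dmdc(X, U)
print(A.shape, B.shape)                   # (6, 6) (6, 4)
```

The eigenvalues of A and the structure of B are the kinds of dynamic characteristics that can then be compared across ages and FFSF stages.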
Abstract The extent to which hand dominance may influence how each agent contributes to interpersonal coordination remains unknown. In the present study, right-handed human participants performed object balancing tasks either in dyadic conditions with each agent using one hand (left or right), or in bimanual conditions where each agent performed the task individually with both hands. We found that object load was shared between two hands more asymmetrically in dyadic than single-agent conditions. However, hand dominance did not influence how two hands shared the object load. In contrast, hand dominance was a major factor in modulating hand vertical movement speed. Furthermore, the magnitude of internal force produced by two hands against each other correlated with the synchrony between the two hands' movement in dyads. This finding supports the important role of internal force in haptic communication. Importantly, both internal force and movement synchrony were affected by hand dominance of the paired participants. Overall, these results demonstrate, for the first time, that pairing of one dominant and one non-dominant hand may promote asymmetrical roles within a dyad during joint physical interactions. This appears to enable the agent using the dominant hand to actively maintain effective haptic communication and task performance.
Abstract Human–exoskeleton interactions have the potential to bring about changes in human behavior for physical rehabilitation or skill augmentation. Despite significant advances in the design and control of these robots, their application to human training remains limited. The key obstacles to the design of such training paradigms are the prediction of human–exoskeleton interaction effects and the selection of interaction control to affect human behavior. In this article, we present a method to elucidate behavioral changes in the human–exoskeleton system and identify expert behaviors correlated with a task goal. Specifically, we observe the joint coordinations of the robot, also referred to as kinematic coordination behaviors, that emerge from human–exoskeleton interaction during learning. We demonstrate the use of kinematic coordination behaviors with two task domains through a set of three human-subject studies. We find that participants (1) learn novel tasks within the exoskeleton environment, (2) demonstrate similarity of coordination during successful movements within participants, (3) learn to leverage these coordination behaviors to maximize success within participants, and (4) tend to converge to similar coordinations for a given task strategy across participants. At a high level, we identify task-specific joint coordinations that are used by different experts for a given task goal. These coordinations can be quantified by observing experts and the similarity to these coordinations can act as a measure of learning over the course of training for novices. The observed expert coordinations may further be used in the design of adaptive robot interactions aimed at teaching a participant the expert behaviors.
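One simple way to quantify joint coordination behaviors, shown here as an illustrative sketch rather than the study's exact pipeline, is to take the leading principal components of joint-angle trajectories and compare a learner's components against an expert reference. The component count, similarity measure, and data shapes below are assumptions made for the example.

```python
import numpy as np
from sklearn.decomposition import PCA

def coordination_basis(joint_angles, n_components=2):
    """Summarize joint coordination as the leading principal components of
    joint-angle trajectories (array of shape: time samples x joints)."""
    return PCA(n_components=n_components).fit(joint_angles).components_

def coordination_similarity(basis_a, basis_b):
    """Mean absolute cosine similarity between corresponding components,
    e.g. a novice's coordination against an expert reference."""
    sims = [abs(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
            for a, b in zip(basis_a, basis_b)]
    return float(np.mean(sims))

# Hypothetical usage: 500 time samples of 4 exoskeleton joint angles
expert = coordination_basis(np.random.randn(500, 4))
novice = coordination_basis(np.random.randn(500, 4))
print(coordination_similarity(expert, novice))
```

Tracking this similarity over training sessions gives a rough learning curve of the kind the abstract proposes for novices.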
With few exceptions, research in automated assessment of depression has considered only the patient's behavior to the exclusion of the therapist's behavior. We investigated the interpersonal coordination (synchrony) of head movement during patient-therapist clinical interviews. Participants with major depressive disorder were recorded in clinical interviews (Hamilton Rating Scale for Depression, HRSD) at 7-week intervals over a period of 21 weeks. For each session, patient and therapist 3D head movement was tracked from 2D videos. Head angles in the horizontal (pitch) and vertical (yaw) axes were used to measure head movement. Interpersonal coordination of head movement between patients and therapists was measured using windowed cross-correlation. Patterns of coordination in head movement were investigated using the peak picking algorithm. Changes in head movement coordination over the course of treatment were measured using a hierarchical linear model (HLM). The results indicated a strong effect for patient-therapist head movement synchrony. Within-dyad variability in head movement coordination was higher than between-dyad variability, meaning that differences over time within a dyad were greater than differences between dyads. Head movement synchrony did not change over the course of treatment. To the best of our knowledge, this study is the first attempt to analyze the mutual influence of patient-therapist head movement in relation to depression severity.
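A minimal sketch of windowed cross-correlation follows, assuming two aligned 1-D head-angle time series (e.g. patient and therapist pitch) sampled at a common frame rate. The window length, step, and maximum lag are illustrative parameters, and the peak-picking step used in the study is omitted.

```python
import numpy as np

def windowed_xcorr(x, y, win=120, step=30, max_lag=30):
    """Windowed cross-correlation between two 1-D signals.
    Returns an array of shape (n_windows, 2*max_lag + 1) holding the
    Pearson correlation at each lag within each sliding window."""
    results = []
    for start in range(0, len(x) - win - 2 * max_lag, step):
        row = []
        for lag in range(-max_lag, max_lag + 1):
            a = x[start + max_lag : start + max_lag + win]
            b = y[start + max_lag + lag : start + max_lag + lag + win]
            row.append(np.corrcoef(a, b)[0, 1])
        results.append(row)
    return np.array(results)

# Hypothetical usage: 30 s of head pitch at 30 fps for patient and therapist
patient = np.random.randn(900)
therapist = np.random.randn(900)
r = windowed_xcorr(patient, therapist)
print(r.shape)   # (n_windows, 2*max_lag + 1)
```

Per-window peaks of these correlation profiles (and the lags at which they occur) are what a peak-picking step would then summarize as synchrony over the interview.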