Title: LIFT: Integrating Stakeholder Voices into Algorithmic Team Formation
Team formation tools assume instructors should configure the criteria for creating teams, precluding students from participating in a process affecting their learning experience. We propose LIFT, a novel learner-centered workflow where students propose, vote for, and weigh the criteria used as inputs to the team formation algorithm. We conducted an experiment (N=289) comparing LIFT to the usual instructor-led process, and interviewed participants to evaluate their perceptions of LIFT and its outcomes. Learners proposed novel criteria not included in existing algorithmic tools, such as organizational style. They avoided criteria like gender and GPA that instructors frequently select, and preferred those promoting efficient collaboration. LIFT led to team outcomes comparable to those achieved by the instructor-led approach, and teams valued having control of the team formation process. We provide instructors and designers with a workflow and evidence supporting giving learners control of the algorithmic process used for grouping them into teams.
Award ID(s):
2016908
PAR ID:
10460884
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of the CHI Conference on Human Factors in Computing Systems
Page Range / eLocation ID:
1 to 13
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
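As a rough illustration of the kind of weight aggregation a LIFT-style workflow implies (students each submit weights for proposed criteria, which are combined into one configuration for the algorithm), the sketch below averages per-student criterion weights. The function and criterion names are illustrative assumptions, not the paper's actual implementation.

```python
from collections import defaultdict

def aggregate_weights(ballots):
    """Average each criterion's weight across all student ballots.

    `ballots` is a list of dicts mapping criterion name -> weight,
    one dict per student. Returns a single averaged weight dict.
    """
    totals = defaultdict(float)
    for ballot in ballots:
        for criterion, weight in ballot.items():
            totals[criterion] += weight
    n = len(ballots)
    return {criterion: total / n for criterion, total in totals.items()}

# Two hypothetical student ballots (criteria echo those in the abstract).
ballots = [
    {"schedule": 0.5, "organizational_style": 0.3, "skills": 0.2},
    {"schedule": 0.4, "organizational_style": 0.4, "skills": 0.2},
]
weights = aggregate_weights(ballots)
# weights["schedule"] == 0.45
```

The averaged weights could then be entered into the team formation tool in place of an instructor-chosen configuration.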
More Like this
  1. The configuration that an instructor enters into an algorithmic team formation tool determines how students are grouped into teams, impacting their learning experiences. One way to decide the configuration is to solicit input from the students. Prior work has investigated the criteria students prefer for team formation, but has not studied how students prioritize the criteria or to what degree students agree with each other. This paper describes a workflow for gathering student preferences for how to weight the criteria entered into a team formation tool, and presents the results of a study in which the workflow was implemented in four semesters of the same project-based design course. In the most recent semester, the workflow was supplemented with an online peer discussion to learn about students' rationale for their selections. Our results show that students most want to be grouped with other students who share the same course commitment and have compatible schedules. Students prioritize demographic attributes next, and then task skills such as programming needed for the project work. We found these outcomes to be consistent in each instance of the course. Instructors can use our results to guide team formation in their own project-based design courses and replicate our workflow to gather student preferences for team formation in any course.
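One way such aggregated preferences might feed a team formation tool is as a weighted compatibility score over candidate teams. The sketch below is a toy illustration under assumed criterion names and a simple match/no-match similarity rule; it is not the configuration used in the study.

```python
from itertools import combinations

def pair_similarity(a, b, criterion):
    """Toy rule: 1.0 if two students match on a criterion, else 0.0."""
    return 1.0 if a[criterion] == b[criterion] else 0.0

def team_score(team, weights):
    """Score a candidate team as the weighted average pairwise
    similarity across all criteria (higher = more compatible)."""
    pairs = list(combinations(team, 2))
    score = 0.0
    for criterion, weight in weights.items():
        sim = sum(pair_similarity(a, b, criterion) for a, b in pairs) / len(pairs)
        score += weight * sim
    return score

# Hypothetical weights echoing the preferences reported above.
weights = {"commitment": 0.5, "schedule": 0.3, "skills": 0.2}
team = [
    {"commitment": "high", "schedule": "evening", "skills": "python"},
    {"commitment": "high", "schedule": "evening", "skills": "design"},
]
# One pair: commitment and schedule match, skills differ,
# so score = 0.5*1 + 0.3*1 + 0.2*0 = 0.8
```

A formation algorithm could then search over partitions of the class for assignments that maximize the total of these team scores.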
  2. Jeff Nichols (Ed.)
    Instructors using algorithmic team formation tools must decide which criteria (e.g., skills, demographics, etc.) to use to group students into teams based on their teamwork goals, and have many possible sources from which to draw these configurations (e.g., the literature, other faculty, their students, etc.). However, tools offer considerable flexibility and selecting ineffective configurations can lead to teams that do not collaborate successfully. Due to such tools’ relative novelty, there is currently little knowledge of how instructors choose which of these sources to utilize, how they relate different criteria to their goals for the planned teamwork, or how they determine if their configuration or the generated teams are successful. To close this gap, we conducted a survey (N=77) and interview (N=21) study of instructors using CATME Team-Maker and other criteria-based processes to investigate instructors’ goals and decisions when using team formation tools. The results showed that instructors prioritized students learning to work with diverse teammates and performed “sanity checks” on their formation approach’s output to ensure that the generated teams would support this goal, especially focusing on criteria like gender and race. However, they sometimes struggled to relate their educational goals to specific settings in the tool. In general, they also did not solicit any input from students when configuring the tool, despite acknowledging that this information might be useful. By opening the “black box” of the algorithm to students, more learner-centered approaches to forming teams could therefore be a promising way to provide more support to instructors configuring algorithmic tools while at the same time supporting student agency and learning about teamwork. 
  3. Peer evaluations are critical for assessing teams, but are susceptible to bias and other factors that undermine their reliability. At the same time, collaborative tools that teams commonly use to perform their work are increasingly capable of logging activity that can signal useful information about individual contributions and teamwork. To investigate current and potential uses for activity traces in peer evaluation tools, we interviewed (N=11) and surveyed (N=242) students and interviewed (N=10) instructors at a single university. We found that nearly all of the students surveyed considered specific contributions to the team outcomes when evaluating their teammates, but also reported relying on memory and subjective experiences to make the assessment. Instructors desired objective sources of data to address challenges with administering and interpreting peer evaluations, and have already begun incorporating activity traces from collaborative tools into their evaluations of teams. However, both students and instructors expressed concern about using activity traces due to the diverse ecosystem of tools and platforms used by teams and the limited view into the context of the contributions. Based on our findings, we contribute recommendations and a speculative design for a data-centric peer evaluation tool. 
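A data-centric peer evaluation tool of the kind recommended above could, for example, blend subjective peer ratings with normalized activity-trace counts. The sketch below is speculative: the 0-5 rating scale, the blending weight, and all names are assumptions rather than a design from the paper.

```python
def blended_score(peer_ratings, trace_count, max_traces, alpha=0.7):
    """Blend the mean peer rating (0-5 scale) with an activity-trace
    signal normalized to the same scale. `alpha` sets how much weight
    the subjective ratings get versus the logged activity.
    """
    peer_mean = sum(peer_ratings) / len(peer_ratings)
    trace_norm = 5.0 * trace_count / max_traces  # scale traces to 0-5
    return alpha * peer_mean + (1 - alpha) * trace_norm

# e.g. peer ratings [4, 5, 3] average 4.0; 30 of the team's 60 logged
# contributions -> 2.5 on the trace scale; blended: 0.7*4.0 + 0.3*2.5
score = blended_score([4, 5, 3], trace_count=30, max_traces=60)
```

As the interviewees cautioned, such a signal would still need context (which tools a team actually used, what each logged event represents) before being treated as a measure of contribution.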
  4. In engineering education, laboratory learning that is well aligned with core content knowledge is instrumental, as it plays a significant role in students' knowledge construction, application, and distribution. Learning in laboratories is interactive in nature, and therefore students who learn engineering through online platforms can face many challenges with labs, which were frequently documented during the recent pandemic. To address those reported challenges, innovative online lab learning modules were developed and learning strategies were implemented in five courses in electrical engineering, Circuits I, Electronics I, Electronics II, Signals and Systems, and Microcomputers I, through which students gain a solid foundation before taking on senior design projects. Lab modules offering an open-ended design learning experience through a lab-in-a-box approach were developed, allowing students to solve lab problems with multiple approaches, both independently and collaboratively. Because this innovative lab design allows problem solving at various cognitive levels, it is better suited to concept exploration and collaborative lab learning environments than traditional lab work with a "cookbook" approach, which tends to lead students to follow set procedures toward expected solutions without a problem exploration stage. In addition to the open-ended lab modules, course instructors formed online lab groups through which students shared the entire problem-solving process, from idea formation to solutions through trial and error. To investigate the effectiveness of the open-ended online lab learning experiences, students in all courses were randomly divided into experimental and control groups.
Students in the control group learned in labs through materials aligned with core concepts, following a complete set of given procedures, whereas students in the experimental group learned through inquiry-based lab materials that required them to work in teams, integrating core concepts to find solutions with multiple approaches. To maximize the online lab learning effect and to replicate the way industry, commerce, and research operate, instructor-structured cooperative learning strategies were applied along with pre-lab simulations and videos. The research results showed that students in the experimental group generally outperformed their counterparts in labs, especially in more advanced concept understanding and applications, but showed mixed results for overall class performance based on course learning outcomes such as quizzes, lab reports, and tests. Further, survey results showed that 72% of students reported that open-ended lab learning helped them learn better. According to interviews, the initial stage of working with team members was somewhat challenging due to difficulties in finding time to work together for discussion and problem solving. Yet, through communication tools such as the course LMS and mobile apps, students were able to collaborate on lab problems, which also led them to build learning communities that went beyond the courses.
  5. Teaching engineering students how to work in teams is necessary, important, and hard to do well. Minoritized students routinely experience forms of marginalization from their teammates, which affects their access to safe learning environments. Team evaluation tools like CATME can help instructors see where teaming problems are, but are often normed in ways that obscure the subtle if pervasive harassment of minoritized teammates. Instructors, particularly of large courses, need better ways to identify teams that are marginalizing minoritized team members. This paper introduces theory on microaggressions, selective incivility theory, and coded language to interpret data collected from a complex study site during the COVID-19 pandemic. The team collected data from classroom observations (moved virtual during COVID), interviews with instructors, interviews with students, interpretations of historical data collected through an online team evaluation tool called CATME, and a diary study where students documented their reflections on their marginalization by teammates. While data collection and analysis did not, of course, go as the research team had planned, it yielded insights into how frequently minoritized teammates experience marginalization, instructors' sense of their responsibility and skill for addressing such marginalization, and students' sense of defeat in hoping for more equitable and supportive learning environments. The paper describes our data collection processes, analysis, and some choice insights drawn from this multi-year study at a large, research-extensive white university.