Title: A qualitative analysis of concept maps through the Research Experiences for Undergraduates (REU) programs.
Learning physics in any context, including undergraduate research experiences (UREs), requires learning its concepts and the relational structure that connects those new concepts to what students already know. We use concept maps, a knowledge elicitation method, to assess mentees' and mentors' knowledge structures during Research Experiences for Undergraduates (REU) programs. The study examined maps from seven mentor-mentee pairs to understand how mentors and mentees use specific knowledge and strategies while developing their concept maps. A qualitative analysis of the maps showed that mentors and mentees differed in how they organized and displayed their knowledge in terms of structure, scale, language, and use of conceptual and procedural knowledge. For instance, mentees used more procedural knowledge, perhaps because they were focused on finishing their REU projects and may have had only limited, superficial knowledge of specific topics. Mentors' maps, by contrast, were smaller but drew on more comprehensive conceptual knowledge and connected to the broader scientific context.
Award ID(s):
1846321
NSF-PAR ID:
10409050
Author(s) / Creator(s):
Editor(s):
Frank, Brian W.; Jones, Dyan L.; Ryan, Qing X.
Date Published:
Journal Name:
Physics Education Research Conference 2022
Page Range / eLocation ID:
525 to 530
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Concept maps have emerged as a valid and reliable method for assessing deep conceptual understanding in engineering education within disciplines as well as interdisciplinary knowledge integration across disciplines. Most work on concept maps, however, focuses on undergraduates. In this paper, we use concept maps to examine changes in graduate students’ conceptual understanding and knowledge integration resulting from an interdisciplinary graduate program. Our study context is a pair of foundational, team-taught courses in an interdisciplinary Disaster Resilience and Risk Management (DRRM) graduate program. The courses include a 3-hour research course and a 1-hour seminar that aim to build student understanding within and across Urban Affairs and Planning, Civil and Environmental Engineering, Geosciences, and Business Information Technology. The courses introduce core principles of DRRM and relevant research methods in these disciplines, and drive students to understand the intersections of these disciplines in the context of planning for and responding to natural and human-made disasters. To understand graduate student growth from disciplinary-based to interdisciplinary scholars, we pose two research questions: 1) In what ways do graduate students’ understandings of DRRM change as a result of their introduction to an interdisciplinary graduate research program? and 2) To what extent and in what ways do concept maps serve as a tool to capture interdisciplinary learning in this context? Data include pre/post concept maps centered on disaster resilience and risk management, a one-page explanation of the post-concept map, and ethnographic field notes gathered from class and faculty meetings. Pre-concept maps were collected on the first day of class; post-concept maps will be collected as part of the final course assignment. 
We assess the students’ concept maps for depth of conceptual understanding within disciplines and interdisciplinary competency across disciplines, using the field notes to provide explanatory context. The results presented in this paper support the inclusion of an explanation component with concept maps, and also suggest that concept maps alone may not be the best measure of student understanding of concepts within and across disciplines in this specific context. If similar programs wish to use concept maps as an assessment method, we suggest including an explanation component and providing explicit instructions that specify the intended audience. We also suggest using a holistic scoring method, as it is more likely to capture nuances in the concept maps than traditional scoring methods, which focus solely on counting factors like hierarchies and number of cross-links. 
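The "traditional scoring" contrasted with holistic rubrics above typically tallies structural features of a map. A minimal sketch of such count-based scoring, assuming a Novak-style weighting — the weights and the map encoding here are illustrative assumptions, not the scheme used in the study:

```python
# Illustrative count-based concept map score: valid propositions,
# hierarchy levels, and cross-links each contribute with a fixed weight.
# Weights (1, 5, 10) follow a common Novak-style convention and are
# assumptions for illustration only.

def count_based_score(propositions, levels, cross_links,
                      w_prop=1, w_level=5, w_cross=10):
    """Return w_prop * #propositions + w_level * levels + w_cross * #cross-links."""
    return (w_prop * len(propositions)
            + w_level * levels
            + w_cross * len(cross_links))

# Hypothetical map: (concept, linking phrase, concept) triples.
propositions = [("disaster", "causes", "damage"),
                ("planning", "reduces", "risk"),
                ("risk", "depends on", "exposure")]
cross_links = [("planning", "informs", "damage")]  # link across branches

score = count_based_score(propositions, levels=2, cross_links=cross_links)
print(score)
```

A holistic rubric would instead rate the whole map on qualitative dimensions (accuracy, organization, integration), which is why the counts above can miss nuance.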
  2. In teaching mechanics, we use multiple representations of vectors to develop concepts and analysis techniques. These representations include pictorials, diagrams, symbols, numbers and narrative language. Through years of study as students, researchers, and teachers, we develop a fluency rooted in a deep conceptual understanding of what each representation communicates. Many novice learners, however, struggle to gain such understanding and rely on superficial mimicry of the problem solving procedures we demonstrate in examples. The term representational competence refers to the ability to interpret, switch between, and use multiple representations of a concept as appropriate for learning, communication and analysis. In engineering statics, an understanding of what each vector representation communicates and how to use different representations in problem solving is important to the development of both conceptual and procedural knowledge. Science education literature identifies representational competence as a marker of true conceptual understanding. This paper presents development work for a new assessment instrument designed to measure representational competence with vectors in an engineering mechanics context. We developed the assessment over two successive terms in statics courses at a community college, a medium-sized regional university, and a large state university. We started with twelve multiple-choice questions that survey the vector representations commonly employed in statics. Each question requires the student to interpret and/or use two or more different representations of vectors and requires no calculation beyond single digit integer arithmetic. Distractor answer choices include common student mistakes and misconceptions drawn from the literature and from our teaching experience. 
We piloted these twelve questions as a timed section of the first exam in fall 2018 statics courses at both Whatcom Community College (WCC) and Western Washington University. Analysis of students’ unprompted use of vector representations on the open-ended problem-solving section of the same exam provides evidence of the assessment’s validity as a measurement instrument for representational competence. We found a positive correlation between students’ accurate and effective use of representations and their score on the multiple choice test. We gathered additional validity evidence by reviewing student responses on an exam wrapper reflection. We used item difficulty and item discrimination scores (point-biserial correlation) to eliminate two questions and revised the remaining questions to improve clarity and discriminatory power. We administered the revised version in two contexts: (1) again as part of the first exam in the winter 2019 Statics course at WCC, and (2) as an extra credit opportunity for statics students at Utah State University. This paper includes sample questions from the assessment to illustrate the approach. The full assessment is available to interested instructors and researchers through an online tool. 
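The item statistics named above — item difficulty and item discrimination via the point-biserial correlation with the total score — can be sketched as follows. The response matrix is fabricated for illustration; this is not the study's data or code.

```python
# Item difficulty = proportion of students answering an item correctly.
# Item discrimination = point-biserial correlation between the 0/1 item
# score and the rest of the test score (item removed to avoid inflation).
import numpy as np

def item_stats(responses):
    """responses: (n_students, n_items) array of 0/1 scores."""
    responses = np.asarray(responses, dtype=float)
    totals = responses.sum(axis=1)
    difficulty = responses.mean(axis=0)  # per-item proportion correct
    discrimination = []
    for j in range(responses.shape[1]):
        rest = totals - responses[:, j]  # total score excluding item j
        discrimination.append(np.corrcoef(responses[:, j], rest)[0, 1])
    return difficulty, np.array(discrimination)

# Fabricated responses: 5 students x 3 items.
scores = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 1, 0], [1, 1, 1]]
diff, disc = item_stats(scores)
```

Items with very high or very low difficulty, or near-zero discrimination, are the usual candidates for elimination, consistent with the two questions dropped above.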
  3. null (Ed.)
    Engineering graduates need a deep understanding of key concepts in addition to technical skills to be successful in the workforce. However, traditional methods of instruction (e.g., lecture) do not foster deep conceptual understanding and make it challenging for students to learn the technical skills (e.g., professional modeling software) that they need to know. This study builds on prior work to assess engineering students’ conceptual and procedural knowledge. The results provide insight into how the use of authentic online learning modules influences engineering students’ conceptual knowledge and procedural skills. We designed online active learning modules to support and deepen undergraduate students’ understanding of key concepts in hydrology and water resources engineering (e.g., watershed delineation, rainfall-runoff processes, design storms), as well as their technical skills (e.g., obtaining and interpreting relevant information for a watershed, proficiency using HEC-HMS and HEC-RAS modeling tools). These modules integrated instructional content, real data, and modeling resources to support students’ solving of complex, authentic problems. The purpose of our study was to examine changes in students’ self-reported understanding of concepts and skills after completing these modules. The participants in this study were 32 undergraduate students at a southern U.S. university in a civil engineering senior design course who were assigned four of these active learning modules over the course of one semester to be completed outside of class time. Participants completed the Student Assessment of Learning Gains (SALG) survey immediately before starting the first module (time 1) and after completing the last module (time 2). The SALG is a modifiable survey meant to be specific to the learning tasks that are the focus of instruction. 
We created versions of the SALG for each module, which asked students to self-report their understanding of concepts and ability to implement skills that are the focus of each module. We calculated learning gains by examining differences in students’ self-reported understanding of concepts and skills from time 1 to time 2. Responses were analyzed using eight paired-samples t-tests (two for each module: one for concepts, one for skills). The analyses suggested that students reported gains in both conceptual knowledge and procedural skills. The data also indicated that the students’ self-reported gain in skills was greater than their gain in concepts. This study provides support for enhancing student learning in undergraduate hydrology and water resources engineering courses by connecting conceptual knowledge and procedural skills to complex, real-world problems. 
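The pre/post analysis described above can be sketched as a paired-samples t-test on time-1 versus time-2 self-ratings. The ratings below are simulated for illustration; this is not the study's data, and the scale values are assumptions.

```python
# Hedged sketch: compute learning gains as time-2 minus time-1 ratings
# and test them with a paired-samples t-test (scipy.stats.ttest_rel).
# Simulated ratings on a 1-5 scale for 32 students, matching the study's n.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre = rng.integers(2, 4, size=32).astype(float)   # time-1 self-ratings
post = pre + rng.integers(0, 3, size=32)          # simulated time-2 gains

gain = post - pre                                 # per-student learning gain
t, p = stats.ttest_rel(post, pre)                 # paired test, same students
print(f"mean gain = {gain.mean():.2f}, t = {t:.2f}, p = {p:.4g}")
```

In the study this test was run separately for the concepts items and the skills items of each module, giving the eight tests reported.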