Self-report assessments are frequently used in higher education to assess a variety of constructs, including attitudes, opinions, knowledge, and competence. Systems thinking is one competence often measured using self-report assessments, in which individuals answer several questions about their perceptions of their own skills, habits, or daily decisions. In this study, we define systems thinking as the ability to see the world as a complex, interconnected system in which different parts influence each other and the interrelationships determine system outcomes. An alternative, less common approach is to measure skills directly by providing a scenario about an unstructured problem and evaluating respondents’ judgment or analysis of that scenario (scenario-based assessment). This study explored the relationships between engineering students’ performance on self-report assessments and scenario-based assessments of systems thinking and found no significant relationships between the two assessment techniques. These results suggest that self-report assessments may have limitations as a method for assessing systems thinking and other competencies in educational research and evaluation, limitations that could be addressed by incorporating alternative formats for assessing competence. Future work should explore these findings further and support the development of alternative assessment approaches.
Abstract
Background Engineers operate in an increasingly global environment, making it important that engineering students develop global engineering competency to prepare them for success in the workplace. To understand this learning, we need assessment approaches that go beyond traditional self‐report surveys. A previous study (Jesiek et al., Journal of Engineering Education, 2020, 109(3), 1–21) began this process by developing a situational judgment test (SJT) to assess global engineering competency based in the Chinese context and administering it to practicing engineers.
Purpose We built on this previous study by administering the SJT to engineering students to explore what prior experiences related to their SJT performance and how their SJT performance compared with practicing engineers' performance on the SJT.
Method Engineering students completed a survey including the SJT and related self‐report survey instruments. We collected data from three groups of students: those who had studied abroad in China; those who had studied abroad elsewhere; and those who had not studied abroad.
Results We found that students' SJT performance did not relate to their scores on the self‐report instruments but did relate to their participation in study abroad programs. The students also performed better on the SJT than the practicing engineers did.
Conclusions Our results highlight the need to use multiple forms of assessment for global engineering competence. Although building evidence for the validity of the Global Engineering Competency China SJT is an ongoing process, this data collection technique may provide new insights into global engineering competency compared with traditionally used assessments.
Abstract
There is an increasing emphasis on assessing student learning outcomes from study abroad experiences, but this assessment often focuses on a limited range of outcomes and assessment methods. We argue for shifting to assessing student learning processes in study abroad and present the critical incident technique as one approach to achieve this goal. We demonstrate this approach through interviews with 79 students across a range of global engineering programs, in which we identified 173 incidents that we analyzed for common themes. This analysis revealed that students described a wide range of experiences and outcomes from their time abroad. Students’ experiences were messy and complex, making them challenging to understand through typical assessment approaches. Our findings emphasize the importance of using a range of assessment approaches and suggest that exploring students’ learning processes in addition to learning outcomes could provide new insights to inform the design of study abroad programs.
International research programs for students offer an important opportunity to support students in developing skills in both research and intercultural competence. During the COVID-19 pandemic, many of these programs made the shift to operating virtually, with likely impacts on program outcomes. The purpose of this study was to identify the approaches that program leaders used in adapting international research programs to the virtual environment and explore how these innovations could inform the design of these programs going forward. We conducted eight focus groups with over 40 U.S.-based faculty who had experience running these programs to understand the benefits, challenges, and future potential of incorporating virtual elements into international research programs for students. This paper reports the results of these focus groups and provides suggestions for future program design based on best practices and innovations identified through the development of virtual programs.