Free, publicly-accessible full text available October 22, 2026
Abstract: Self-report assessments are used frequently in higher education to assess a variety of constructs, including attitudes, opinions, knowledge, and competence. Systems thinking is an example of one competence often measured using self-report assessments, where individuals answer several questions about their perceptions of their own skills, habits, or daily decisions. In this study, we define systems thinking as the ability to see the world as a complex interconnected system in which different parts can influence each other and the interrelationships determine system outcomes. An alternative, less common assessment approach is to measure skills directly by providing a scenario about an unstructured problem and evaluating respondents' judgment or analysis of the scenario (scenario-based assessment). This study explored the relationships between engineering students' performance on self-report assessments and scenario-based assessments of systems thinking, finding no significant relationships between the two assessment techniques. These results suggest that there may be limitations to using self-report assessments as a method to assess systems thinking and other competencies in educational research and evaluation, which could be addressed by incorporating alternative formats for assessing competence. Future work should explore these findings further and support the development of alternative assessment approaches.
Abstract: There is an increasing emphasis on assessing student learning outcomes from study abroad experiences, but this assessment often focuses on a limited range of outcomes and assessment methods. We argue for shifting to assessing student learning processes in study abroad and present the critical incident technique as one approach to achieve this goal. We demonstrate this approach in interviews with 79 students across a range of global engineering programs, through which we identified 173 incidents that were analyzed to identify common themes. This analysis revealed that students described a wide range of experiences and outcomes from their time abroad. Students' experiences were messy and complex, making them challenging to understand through typical assessment approaches. Our findings emphasize the importance of using a range of assessment approaches and suggest that exploring students' learning processes, in addition to learning outcomes, could provide new insights to inform the design of study abroad programs.