Research-based assessments have a productive and storied history in physics education research (PER). While useful for conducting research on student learning, their utility is limited for instructors interested in improving their own courses. We have developed a new assessment design process that leverages three-dimensional learning, evidence-centered design, and self-regulated learning to deliver actionable feedback to instructors about supporting their students' learning. We are using this approach to design the Thermal and Statistical Physics Assessment (TaSPA), which also allows instructors to choose learning goals that align with their teaching. Perhaps more importantly, this system will be fully automated once complete, making the assessment scalable with minimal burden on instructors and researchers. This work represents an advancement in how we assess physics learning at a large scale and how the PER community can better support physics instructors and students.
gPortfolios: A pragmatic approach to online asynchronous assignments
Purpose: We gathered examples from our extended collaboration helping educators move online while avoiding synchronous meetings. “gPortfolios” are public (to the class) pages where students write responses to carefully constructed engagement routines and then discuss their work with instructors and peers in threaded comments. gPortfolios usually include engagement reflections, formative self-assessments, and automated quizzes. These assessments support and document learning while avoiding instructor “burnout” from grading. gPortfolios can be implemented using Google Docs and Forms or any learning management system.
Methodology: We report practical insights gained from design-based implementation research. This research explored the late Randi Engle’s principles for productive disciplinary engagement and expansive framing. Engle used current theories of learning to foster student discussions that were both authentic to the academic discipline at hand and productive for learning. This research also used new approaches to assessment to support Engle’s principles, resulting in a comprehensive approach to online instruction and assessment that is effective and efficient for both students and teachers.
Findings: Our approach “frames” (i.e., contextualizes) online engagement using each learner’s own experiences, perspectives, and goals. Writing this article revealed how the framing differed across courses. Secondary biology students framed each assignment independently. Secondary English and history students framed assignments as elements of a personalized capstone presentation; the history students further used a self-selected “historical theme.” Graduate students in an educational assessment course framed each assignment using a real or imagined curricular aim and context.
Originality: Engle’s ideas have yet to be widely taken up in online education.
- Award ID(s): 1915498
- PAR ID: 10299958
- Date Published:
- Journal Name: Information and Learning Sciences
- Volume: 121
- Issue: 5/6
- ISSN: 2398-5348
- Page Range / eLocation ID: 273-283
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Harms, Kyle; Cunha, Jácome; Oney, Steve; Kelleher, Caitlin (Ed.) Analytics about how students navigate online learning tools throughout the duration of an assignment are scarce. Knowledge about how students use online tools before a course’s end could positively impact students’ learning outcomes. We introduce PEDI (Piazza Explorer Dashboard for Intervention), a tool that analyzes forum activity on Piazza, a question-and-answer forum, and presents visualizations of it to instructors. We outline the design principles and data-informed recommendations used to design PEDI. Our prior research revealed two critical periods in students’ forum engagement over the duration of an assignment: early engagement in the first half of an assignment’s duration positively correlates with class-average performance, whereas extremely high engagement toward the deadline predicts lower class-average performance. PEDI uses these findings to detect and flag troubling engagement levels and informs instructors through clear visualizations to promote data-informed interventions (a hypothetical sketch of such a flagging rule appears after this list). By providing insights to instructors, PEDI may improve class performance and pave the way for a new generation of online tools.
-
Preservice teachers (PSTs) need to be able to use ambitious teaching practices to help support their students’ productive engagement in scientific practices such as analyzing and interpreting data or using evidence-based reasoning to support their claims. Approximations of practice are one way in which teacher educators can support their PSTs to develop their skills in enacting ambitious teaching practices. In this study, we report on the use of a suite of three online, simulated approximations of practice where secondary PSTs practiced facilitating discussions focused on engaging students in argumentation. Using information from both PSTs’ and teacher educators’ perspectives, we examined their main takeaways from each simulation experience, how learning from one simulation was used to prepare for the next simulation, PSTs’ perception of the simulations’ authenticity, and their views about whether they would recommend using this online suite of simulations in future teacher preparation courses. Findings suggested that teacher educators and PSTs alike noted a variety of main takeaways, including understanding the importance of planning and asking good questions. Furthermore, they recommended the suite for use in future teacher education courses. Implications of the work for productively integrating online simulations into teacher education settings are discussed.
-
Self-report assessments are used frequently in higher education to assess a variety of constructs, including attitudes, opinions, knowledge, and competence. Systems thinking is an example of one competence often measured using self-report assessments where individuals answer several questions about their perceptions of their own skills, habits, or daily decisions. In this study, we define systems thinking as the ability to see the world as a complex interconnected system where different parts can influence each other, and the interrelationships determine system outcomes. An alternative, less-common, assessment approach is to measure skills directly by providing a scenario about an unstructured problem and evaluating respondents’ judgment or analysis of the scenario (scenario-based assessment). This study explored the relationships between engineering students’ performance on self-report assessments and scenario-based assessments of systems thinking, finding that there were no significant relationships between the two assessment techniques. These results suggest that there may be limitations to using self-report assessments as a method to assess systems thinking and other competencies in educational research and evaluation, which could be addressed by incorporating alternative formats for assessing competence. Future work should explore these findings further and support the development of alternative assessment approaches.
-
In the United States, the onset of COVID-19 triggered a nationwide lockdown, which forced many universities to move their primary assessments from invigilated in-person exams to unproctored online exams. This abrupt change occurred midway through the Spring 2020 semester, providing an unprecedented opportunity to investigate whether online exams can provide meaningful assessments of learning relative to in-person exams on a per-student basis. Here, we present data from nearly 2,000 students across 18 courses at a large Midwestern University. Using a meta-analytic approach in which we treated each course as a separate study, we showed that online exam scores highly resembled in-person exam scores at an individual level despite the online exams being unproctored, as demonstrated by a robust correlation between online and in-person exam scores (a generic sketch of this style of pooled-correlation analysis appears after this list). Moreover, our data showed that cheating was either not widespread or ineffective at boosting scores, and the strong assessment value of online exams was observed regardless of the type of questions asked on the exam, the course level, academic discipline, or class size. We conclude that online exams, even when unproctored, are a viable assessment tool.
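The PEDI abstract above describes its engagement-timing findings only at a high level and does not show the tool's implementation. As a minimal sketch, assuming per-student timestamps of Piazza posts plus the assignment's release and due dates are available, a flagging rule of the kind described might look like the following; the function name and the threshold values are hypothetical, not taken from the published work.

```python
from datetime import datetime
from typing import List


def flag_engagement(post_times: List[datetime],
                    released: datetime,
                    due: datetime,
                    early_min: float = 0.25,
                    late_max: float = 0.60) -> List[str]:
    """Flag troubling forum-engagement patterns for one assignment.

    Hypothetical thresholds: warn if fewer than `early_min` of posts fall in
    the first half of the assignment window (little early engagement), or if
    more than `late_max` fall in the final quarter (a deadline-driven spike).
    """
    flags = []
    total = len(post_times)
    if total == 0:
        return ["no forum activity recorded"]

    midpoint = released + (due - released) / 2
    last_quarter_start = due - (due - released) / 4

    early_share = sum(t <= midpoint for t in post_times) / total
    late_share = sum(t >= last_quarter_start for t in post_times) / total

    if early_share < early_min:
        flags.append(f"low early engagement ({early_share:.0%} of posts in first half)")
    if late_share > late_max:
        flags.append(f"deadline spike ({late_share:.0%} of posts in final quarter)")
    return flags
```

A dashboard could run a rule like this per class or per student and surface only the flagged cases, which fits the abstract's emphasis on clear, actionable visualizations rather than raw activity logs.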
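Similarly, the meta-analytic comparison of online and in-person exam scores is summarized above without implementation detail. A rough sketch of the general technique, not the authors' actual analysis, is to compute a Pearson correlation per course and pool the correlations with a Fisher z-transform weighted by course size:

```python
import math
from typing import List, Tuple


def pooled_correlation(courses: List[Tuple[List[float], List[float]]]) -> float:
    """Pool per-course correlations between two sets of exam scores.

    Each element of `courses` is a pair (online_scores, in_person_scores)
    for the students in one course. Per-course Pearson correlations are
    Fisher z-transformed, averaged with weights n - 3, and back-transformed
    (a standard fixed-effect pooling; the published study may have differed).
    """
    weighted_z_sum, weight_sum = 0.0, 0.0
    for online, in_person in courses:
        n = len(online)
        if n < 4 or n != len(in_person):
            continue  # too small (or mismatched) to contribute a stable estimate
        mean_x = sum(online) / n
        mean_y = sum(in_person) / n
        sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(online, in_person))
        sxx = sum((x - mean_x) ** 2 for x in online)
        syy = sum((y - mean_y) ** 2 for y in in_person)
        if sxx == 0 or syy == 0:
            continue  # no variance in one score set; correlation undefined
        r = sxy / math.sqrt(sxx * syy)
        r = max(min(r, 0.999999), -0.999999)  # keep atanh finite at r = +/-1
        weighted_z_sum += (n - 3) * math.atanh(r)  # Fisher z-transform
        weight_sum += n - 3
    return math.tanh(weighted_z_sum / weight_sum)  # back-transform weighted mean z
```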