

Search for: All records

Creators/Authors contains: "Panter, Abigail"



  1. Undergraduate science, technology, engineering, and mathematics (STEM) students’ motivations have a strong influence on whether and how they will persist through challenging coursework and into STEM careers. Proper conceptualization and measurement of motivation constructs, such as students’ expectancies and perceptions of value and cost (i.e., expectancy value theory [EVT]) and their goals (i.e., achievement goal theory [AGT]), are necessary to understand and enhance STEM persistence and success. Research findings suggest the importance of exploring multiple measurement models for motivation constructs, including traditional confirmatory factor analysis, exploratory structural equation models (ESEM), and bifactor models, but more research is needed to determine whether the same model fits best across time and context. As such, we measured undergraduate biology students’ EVT and AGT motivations and investigated which measurement model best fit the data, and whether measurement invariance held, across three semesters. Having determined the best-fitting measurement model and type of invariance, we used scores from the best-performing model to predict biology achievement. Measurement results indicated a bifactor-ESEM model had the best data-model fit for EVT and an ESEM model had the best data-model fit for AGT, with evidence of measurement invariance across semesters. Motivation factors, in particular attainment value and subjective task value, predicted small yet statistically significant amounts of variance in biology course outcomes each semester. Our findings provide support for using modern measurement models to capture students’ STEM motivations and potentially refine conceptualizations of them. Such future research will enhance educators’ ability to benevolently monitor and support students’ motivation, and enhance STEM performance and career success.
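To make the model-comparison step described in the first abstract concrete, the following is a minimal sketch, in Python with the semopy package, of how a traditional correlated-factors CFA and a bifactor CFA for expectancy-value items might be specified and compared on fit. All item names (ev1 through ev9), factor names, and the data file are hypothetical placeholders rather than the study's instruments, and the study's preferred bifactor-ESEM and ESEM variants would additionally require exploratory (target-rotated) loadings that this sketch omits.

```python
# Minimal, hypothetical sketch: comparing a correlated-factors CFA with a
# bifactor CFA for expectancy-value items using semopy (lavaan-like syntax).
# Item names (ev1..ev9), factor names, and the CSV file are placeholders.
import pandas as pd
import semopy

data = pd.read_csv("evt_items.csv")  # one row per student, one column per Likert item

# Traditional CFA: three correlated specific factors.
cfa_desc = """
Expectancy =~ ev1 + ev2 + ev3
Value      =~ ev4 + ev5 + ev6
Cost       =~ ev7 + ev8 + ev9
"""

# Bifactor CFA: a general factor over all items plus specific factors,
# with all factor covariances fixed to zero (bifactor orthogonality).
bifactor_desc = """
General    =~ ev1 + ev2 + ev3 + ev4 + ev5 + ev6 + ev7 + ev8 + ev9
Expectancy =~ ev1 + ev2 + ev3
Value      =~ ev4 + ev5 + ev6
Cost       =~ ev7 + ev8 + ev9
General ~~ 0*Expectancy
General ~~ 0*Value
General ~~ 0*Cost
Expectancy ~~ 0*Value
Expectancy ~~ 0*Cost
Value ~~ 0*Cost
"""

for name, desc in [("Correlated-factors CFA", cfa_desc), ("Bifactor CFA", bifactor_desc)]:
    model = semopy.Model(desc)
    model.fit(data)
    print(name)
    print(semopy.calc_stats(model).T)  # chi-square, CFI, TLI, RMSEA, AIC, BIC, ...
```

Better comparative fit for one specification, replicated across semesters under invariance constraints, is the kind of evidence the abstract reports when favouring the bifactor-ESEM model for EVT and the ESEM model for AGT.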
  2. Well-designed instructional videos are powerful tools for helping students learn, and prompting students to use generative strategies while learning from videos further bolsters their effectiveness. However, little is known about how individual differences in motivational factors, such as achievement goals, relate to how students learn within multimedia environments that include instructional videos and generative strategies. Therefore, in this study, we explored how achievement goals predicted undergraduate students’ behaviors when learning with instructional videos that required students to answer practice questions between videos, as well as how those activities predicted unit exam performance one week later. Additionally, we compared two measurement models for achievement goals: traditional confirmatory factor analysis and bifactor confirmatory factor analysis. The bifactor model fit our data best and was used for all subsequent analyses. Results indicated that stronger mastery goal endorsement predicted performance on the practice questions in the multimedia learning environment, which in turn positively predicted unit exam performance. In addition, students’ time spent watching videos positively predicted practice question performance. Taken together, this research emphasizes the availing role of adaptive motivations, like mastery goals, in learning from instructional videos that prompt the use of generative learning strategies.
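The indirect path described in the second abstract (mastery goals and video watch time predicting practice-question performance, which in turn predicts unit exam scores) can be illustrated with two ordinary regressions. The sketch below uses Python with statsmodels; the variable names and data file are hypothetical, and the study itself modelled goals as latent bifactor scores and would estimate the indirect effect with a proper mediation or SEM analysis rather than this simple product of coefficients.

```python
# Hypothetical sketch of the indirect path described above:
# mastery goals and video watch time -> practice performance -> exam performance.
# Variable names and the CSV file are placeholders, not the study's data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("video_learning.csv")
# expected columns: mastery_goal, video_minutes, practice_score, exam_score

# Step 1: do mastery goals and time on videos predict practice-question performance?
m1 = smf.ols("practice_score ~ mastery_goal + video_minutes", data=df).fit()

# Step 2: does practice performance carry that effect forward to the unit exam?
m2 = smf.ols("exam_score ~ practice_score + mastery_goal + video_minutes", data=df).fit()

print(m1.summary())
print(m2.summary())

# A rough indirect-effect estimate (mastery -> practice -> exam); a full analysis
# would bootstrap this quantity or estimate it within an SEM.
indirect = m1.params["mastery_goal"] * m2.params["practice_score"]
print(f"Indirect effect of mastery goals via practice performance: {indirect:.3f}")
```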
  3. Abstract

    Even highly motivated undergraduates drift off their STEM career pathways. In large introductory STEM classes, instructors struggle to identify and support these students. To address these issues, we developed co‐redesign methods in partnership with disciplinary experts to create high‐structure STEM courses that better support students and produce informative digital event data. To those data, we applied theory‐ and context‐relevant labels to reflect active and self‐regulated learning processes involving LMS‐hosted course materials, formative assessments, and help‐seeking tools. We illustrate the predictive benefits of this process across two cycles of model creation and reapplication. In cycle 1, we used theory‐relevant features from 3 weeks of data to inform a prediction model that accurately identified struggling students and sustained its accuracy when reapplied in future semesters. In cycle 2, we refit a model with temporally contextualized features that achieved superior accuracy using data from just two class meetings. This modelling approach can produce durable learning analytics solutions that afford scaled and sustained prediction and intervention opportunities that involve explainable artificial intelligence products. Those same products that inform prediction can also guide intervention approaches and inform future instructional design and delivery.

    Practitioner notes

    What is already known about this topic

    Learning analytics includes an evolving collection of methods for tracing and understanding student learning through their engagements with learning technologies.

    Prediction models based on demographic data can perpetuate systemic biases.

    Prediction models based on behavioural event data can produce accurate predictions of academic success, and validation efforts can enrich those data to reflect students' self‐regulated learning processes within learning tasks.

    What this paper adds

    Learning analytics can be successfully applied to predict performance in an authentic postsecondary STEM context, and the use of context and theory as guides for feature engineering can ensure sustained predictive accuracy upon reapplication.

    The consistent types of learning resources and cyclical nature of their provisioning from lesson to lesson are hallmarks of high‐structure active learning designs that are known to benefit learners. These designs also provide opportunities for observing and modelling contextually grounded, theory‐aligned and temporally positioned learning events that informed prediction models that accurately classified students upon initial and later reapplications in subsequent semesters.

    Co‐design relationships where researchers and instructors work together toward pedagogical implementation and course instrumentation are essential to developing unique insights for feature engineering and producing explainable artificial intelligence approaches to predictive modelling.

    Implications for practice and/or policy

    High‐structure course designs can scaffold student engagement with course materials to make learning more effective and products of feature engineering more explainable.

    Learning analytics initiatives can avoid perpetuation of systemic biases when methods prioritize theory‐informed behavioural data that reflect learning processes, sensitivity to instructional context and development of explainable predictors of success rather than relying on students' demographic characteristics as predictors.

    Prioritizing behaviours as predictors improves explainability in ways that can inform the redesign of courses and design of learning supports, which further informs the refinement of learning theories and their applications.

     
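As an illustration of the prediction workflow the third abstract describes, here is a minimal sketch of fitting an interpretable classifier to theory-informed behavioural features aggregated from early-semester LMS event data, in Python with scikit-learn. The feature names, outcome label, and data file are hypothetical placeholders; the published models were built from the authors' co-designed course instrumentation and engineered event labels, not from this toy pipeline.

```python
# Hypothetical sketch: predicting struggling students from early-semester
# behavioural (LMS event) features using an interpretable linear classifier.
# Feature names, the outcome label, and the CSV file are placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

events = pd.read_csv("week1_to_3_features.csv")
# Example theory-informed features aggregated over the first three weeks:
features = ["guided_notes_opens", "formative_quiz_attempts",
            "formative_quiz_score", "help_tool_uses", "days_active"]
X = events[features]
y = events["struggling"]  # 1 = below the course's early-performance threshold

# A standardized logistic regression keeps coefficients comparable and explainable.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Estimate how well the early-semester model identifies struggling students.
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"Cross-validated AUC: {auc.mean():.2f} (+/- {auc.std():.2f})")

# Fit on all data and inspect standardized coefficients as a simple
# explainability product that can also point to intervention targets.
clf.fit(X, y)
coefs = pd.Series(clf.named_steps["logisticregression"].coef_[0], index=features)
print(coefs.sort_values())
```

Because the predictors are behavioural rather than demographic, the standardized coefficients double as an explainability product: they indicate which engagement behaviours most distinguish struggling students and therefore where instructors might intervene or redesign course structure.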