Title: Interactive Demo of the Modeling and Evidence Mapping Environment (MEME) for Supporting Both Elementary and Graduate Students.
The Modeling and Evidence Mapping Environment (MEME) was designed to support elementary students in using evidence to create a model of an ecosystem. While drawing inspiration from prior modeling environments, MEME is unique in combining the following: 1) MEME incorporates explicit systems scaffolds based on the Phenomena, Mechanism, Component (PMC) framework; 2) MEME supports collaborative, qualitative model building; 3) MEME directly incorporates evidence within the model and modeling environment; and 4) students and teachers can provide and reply to comments directly on the model itself. We will give participants an opportunity to use MEME and share models produced by both 5th grade students learning about ecosystems and graduate students exploring cultural historical activity theory (CHAT).
Award ID(s):
1761019
PAR ID:
10249151
Author(s) / Creator(s):
Date Published:
Journal Name:
Annual Meeting of the International Society of the Learning Sciences (ISLS).
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Students in science education struggle with creating and iteratively revising models based on evidence. We report on an implementation of a “gallery walk” activity where 5th grade students used the Model and Evidence Mapping Environment (MEME) software tool to develop and then critique each other’s models of an algal bloom. MEME was designed to support students in creating visual models organized around the components and mechanisms of the target phenomena, linking evidence to those models, and then providing and responding to comments on the specific features of the model. Findings illustrate how this was a productive environment for students to make their ideas about modeling criteria visible, and how their ideas cut across normative dimensions of modeling expertise. 
  2.
    This study examines how 5th grade students represent the mechanisms of a complex aquatic ecosystem in the Modeling and Evidence Mapping Environment (MEME), a software tool designed to support students in iteratively modeling the elements within a complex system, and their relationships to each other. We explore the various ways students represented mechanisms of an aquatic ecosystem through their models and present our findings on the patterns that emerged and the unexpected ways that mechanisms were utilized within student models. 
  3. A. Weinberger; W. Chen; D. Hernández-Leo; B. Chen (Eds.)
    Scientific argumentation and modeling are both core practices in learning and doing science. However, they are challenging for students. Although there is considerable literature on scientific argumentation and modeling practice in K-12 science, there are few studies of how engaging students in modeling and scientific argumentation might be mutually supportive. This study aims to explore how 5th graders can be supported by our designed mediators as they engage in argumentation and modeling, in particular model revision. We implemented a virtual afterschool science club to examine how our modeling tool MEME (Model and Evidence Mapping Environment), the provided evidence, peer comments, and other mediators influenced students in learning about aquatic ecosystems through developing a model. While both groups that we examined constructed strong arguments and developed good models, we show how the mediators played different roles in helping them be successful.
    Interpreting and creating computational systems models is an important goal of science education. One aspect of computational systems modeling that is supported by the modeling, systems thinking, and computational thinking literature is "testing, evaluating, and debugging models." Through testing and debugging, students can identify aspects of their models that either do not match external data or conflict with their conceptual understandings of a phenomenon. This disconnect encourages students to make model revisions, which in turn deepens their conceptual understanding of the phenomenon. Given that many students find testing and debugging challenging, we set out to investigate the testing and debugging behaviors and behavioral patterns that students use when building and revising computational systems models in a supportive learning environment. We designed and implemented a 6-week unit in which students constructed and revised a computational systems model of evaporative cooling using SageModeler software. Our results suggest that despite being in a common classroom, the three groups of students in this study all utilized different testing and debugging behavioral patterns. Group 1 focused on using external peer feedback to identify flaws in their model, group 2 used verbal and written discourse to critique their model's structure and suggest structural changes, and group 3 relied on systemic analysis of model output to drive model revisions. These results suggest that multiple aspects of the learning environment are necessary to enable students to take these different approaches to testing and debugging.
    Background: This study posits that scaffolded team-based computational modeling and simulation projects can support model-based learning that results in evidence of representational competence and regulatory skills. The study involved 116 students from a second-year undergraduate thermodynamics course, organized into 24 teams, who worked on three two-week-long team-based computational modeling and simulation projects and reflected upon their experience. Results: Results characterized different levels of engagement with computational model-based learning in the form of problem formulation and model planning; implementation and use of the computational model; evaluation and interpretation of the outputs of the model; and reflection on the process. Results report on students' levels of representational competence as related to the computational model: meaning-making of the underlying code of the computational model, graphical representations generated by the model, and explanations and interpretations of the output representations. Results also described regulatory skills as challenges and strategies related to programming skills; challenges and strategies related to meaning-making skills for understanding and connecting the science to the code and the results; and challenges and strategies related to process management, mainly focused on project management skills. Conclusion: Characterizing dimensions of computational model-based reasoning provides insights that showcase students' learning, benefits, and challenges when engaging in team-based computational modeling and simulation projects. This study also contributes to evidence-based scaffolding strategies that can support undergraduate students' engagement in the context of computational modeling and simulation.