

Title: Assessment in Chemistry Education
Abstract

Knowing what students know and can do is a challenging business, as we cannot directly examine what any given person is thinking. Instead, we must construct opportunities for students to make their thinking visible and interpret the evidence elicited from these opportunities to infer progress toward desired outcomes. Here we describe the approach of “assessments as evidentiary arguments” and examine several types of assessments that are used to evaluate the learning of college chemistry students. Throughout this discussion, we pay special attention to the assumptions (or lack thereof) that particular assessment strategies make about how students learn. We have limited the discussion to what students know and can do with regard to chemistry and chemical phenomena. That is, we are not concerned here with the assessment of affective constructs (e.g., motivation, identity).

 
NSF-PAR ID:
10090690
Publisher / Repository:
Wiley Blackwell (John Wiley & Sons)
Journal Name:
Israel Journal of Chemistry
Volume:
59
Issue:
6-7
ISSN:
0021-2148
Page Range / eLocation ID:
p. 598-607
Sponsoring Org:
National Science Foundation
More Like this
  1.
    While systems engineers rely on systems thinking skills in their work, the increasing complexity of modern engineering problems means that engineers across disciplines need to be able to engage in systems thinking, including what we term comprehensive systems thinking. Because systems thinking, and comprehensive systems thinking in particular, is inherently complex, it is not easy to know how well students (and practitioners) are learning and leveraging systems thinking approaches. Engineering managers and educators can therefore benefit from systems thinking assessments. A variety of systems thinking assessments relevant to engineers exist, including some focused on the demonstration of systems thinking knowledge or skills and others measuring attitudes, interests, or values related to systems thinking. Starting with a collection of systems thinking assessments from a systematic literature review conducted by our team, we analyzed in depth those behavior-based assessments that involved creating a visual representation and were open-ended, i.e., did not presuppose or provide answers. This in-depth analysis of behavior-based systems thinking assessments identified 1) six visualization types that were leveraged, 2) dimensions of systems thinking that were assessed, and 3) tensions between the affordances of different assessments. In addition, we consider the ways assessments can be used, for example, to provide feedback to students or to determine which students are meeting defined learning goals. We draw on our findings to highlight opportunities for developing future comprehensive systems thinking behavior-based assessments.
  2. Pamucar, Dragan (Ed.)
    Critical thinking is the process by which people make decisions about what to trust and what to do. Many undergraduate courses, such as those in biology and physics, include critical thinking as an important learning goal. Assessing critical thinking, however, is non-trivial, with mixed recommendations for how to assess critical thinking as part of instruction. Here we evaluate the efficacy of assessment questions that probe students' critical thinking skills in the context of biology and physics. We use two research-based standardized critical thinking instruments, the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and the Physics Lab Inventory of Critical Thinking (PLIC). These instruments present experimental scenarios and ask students to evaluate what to trust and what to do regarding the quality of experimental designs and data. Using more than 3000 student responses from over 20 institutions, we sought to understand which features of the assessment questions elicit student critical thinking. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when evaluating one study at a time versus comparing and contrasting two, and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting. We found that students are more critical when making comparisons between two studies than when evaluating each study individually. Moreover, compare-and-contrast questions are sufficient for eliciting critical thinking: students provide similar answers regardless of whether the individual evaluation questions are included. This research offers new insight into the types of assessment questions that elicit critical thinking at the introductory undergraduate level; specifically, we recommend that instructors incorporate more compare-and-contrast questions related to experimental design in their courses and assessments.
  3. Computational thinking has become the calling card for re-introducing coding into schools. While much attention has focused on how students design systems, applications, and other computational artifacts as a measure of success for computational thinking, far fewer efforts have focused on how learners remediate problems in the systems and interactions they design, because learners invariably make mistakes that need fixing, or debugging. In this panel, we examine the often overlooked practice of debugging, which presents significant learning challenges (and opportunities) to students completing assignments and instructional challenges to teachers helping students succeed in their classrooms. The panel participants will review what we know and don't know about debugging, discuss ways to conceptualize and study debugging, and present instructional approaches for helping teachers and students engage productively in debugging situations.
  4. This contribution reports how the investigators are bridging across chemistry, philosophy, and other disciplines to study the landscape of ethics and responsible conduct (ERC) of research at the University of Central Florida (UCF) and to develop ongoing initiatives that cultivate a campus-wide culture of ERC in science. A multi-modal approach is employed to assess the ethics landscape at UCF, which is one of the largest, most rapidly emerging, minority-serving metropolitan universities in the United States. Stakeholders are consulted to develop new initiatives. In one example, the team created case-study-driven workshops that help students discover through discussion how decision making and the sense of what is right can be affected by culture, discipline, past experience, and the availability or lack of information. Participants discuss topics closely related to chemistry -- including CRISPR, climate science, putative links between autism and vaccination, recalls related to vehicle emissions systems, and other examples from science, technology, and industry -- that help them understand how ERC impacts society at all levels and why it must be central to their professional practice. Philosophical tools, such as the Trolley Problem and normative theory, are used to focus students' thinking on the key value judgments that define the moral landscape and lead to ethical or unethical outcomes. The investigators are exploring means for bridging across hierarchies that are inherent in higher education -- and which create natural but often unhelpful divisions between students, faculty, staff, administrators, and alumni -- so that all stakeholders develop and contribute to a shared sense of ERC. The investigators examine how chemistry students engage with interdisciplinary colleagues and how faculty in chemistry and closely related disciplines are engaging with the initiatives. Advances in the assessment of ERC and the development of vehicles for promoting a culture of ERC are described.
  5.
    This study explores the role of unconventional forms of classroom assessment in expanding minoritized students' opportunities to learn (OTL) in high school physics classrooms. In this research + practice partnership project, high school physics teachers and researchers co-designed a unit about momentum to expand minoritized students' meaningful OTL. Specifically, the unit was designed to (a) expand what it means to learn and be good at science using unconventional forms of assessment, (b) facilitate students' leveraging of everyday experiences, concerns, and home languages to do science, and (c) support teachers in facilitating meaningful dialogical interactions. The analysis focused on examining minoritized students' OTL as mediated by intentionally designed, curriculum-embedded, unconventional forms of assessment. The participants were 76 students in 11th or 12th grade. Data were gathered in the form of student assessment tasks, a science identity survey, and interviews. Data analysis entailed (a) statistical analysis of student performance measured by conventional and unconventional assessments and (b) qualitative analysis of two Latinx students' experiences with the co-designed curriculum and assessments. The findings suggest that the use of unconventional forms of curriculum-embedded assessment can increase minoritized students' OTL if the assessment helps minoritized students relate personally and deeply to academic tasks.
