This second edition of Charles Camp and John Clement's book contains a set of 24 innovative lessons and laboratories in mechanics for high school physics classrooms, developed by a team of teachers and science education researchers. Research has shown that certain student preconceptions conflict with current physical theories and resist change under traditional instructional techniques. This book provides a set of lessons aimed specifically at these particularly troublesome areas: Normal Forces, Friction, Newton's Third Law, Relative Motion, Gravity, Inertia, and Tension. The lessons can be used to supplement any course that includes mechanics. Each unit contains detailed step-by-step lesson plans, homework and test problems, as well as background information on common student misconceptions, an overall integrated teaching strategy, and key aspects of the targeted core concepts. This edition has a number of substantial changes based on teacher input. A number of the lessons are adaptable for college-level courses as well. Evaluations using pre- and post-tests have shown large gain differences over control groups.
PRECONCEPTIONS IN MECHANICS: LESSONS DEALING WITH STUDENTS' CONCEPTUAL DIFFICULTIES
The nine units in this high school physics curriculum guide focus on areas where students have exhibited qualitative preconceptions -- ideas that they bring to class with them prior to instruction in physics -- that conflict with the physicist's conceptions. Research has also shown that some of these conflicting preconceptions are quite persistent and resist change in the face of normal instructional techniques. The motivating idea for this book is to provide a set of lessons aimed specifically at these particularly troublesome areas, using special techniques for dealing with them. Other preconceptions contain important, useful intuitions that lessons can build on to foster sensemaking. Ideas in the lessons can be used to supplement any course that includes mechanics.
- Award ID(s):
- 0723709
- PAR ID:
- 10585743
- Publisher / Repository:
- U. of Massachusetts, Amherst
- Date Published:
- ISSN:
- 0000-000
- ISBN:
- 000-00-00000-00-0
- Subject(s) / Keyword(s):
- Preconceptions; Physics Teaching; Analogies
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
There is a growing consensus that solutions to complex science and engineering problems require novel methodologies that are able to integrate traditional physics-based modeling approaches with state-of-the-art machine learning (ML) techniques. This paper provides a structured overview of such techniques. Application-centric objective areas for which these approaches have been applied are summarized, and then classes of methodologies used to construct physics-guided ML models and hybrid physics-ML frameworks are described. We then provide a taxonomy of these existing techniques, which uncovers knowledge gaps and potential crossovers of methods between disciplines that can serve as ideas for future research.
-
Doglioni, C.; Kim, D.; Stewart, G.A.; Silvestris, L.; Jackson, P.; Kamleh, W. (Eds.)
An important part of the Large Hadron Collider (LHC) legacy will be precise limits on indirect effects of new physics, framed for instance in terms of an effective field theory. These measurements often involve many theory parameters and observables, which makes them challenging for traditional analysis methods. We discuss the underlying problem of “likelihood-free” inference and present powerful new analysis techniques that combine physics insights, statistical methods, and the power of machine learning. We have developed MadMiner, a new Python package that makes it straightforward to apply these techniques. In example LHC problems we show that the new approach lets us put stronger constraints on theory parameters than established methods, demonstrating its potential to improve the new physics reach of the LHC legacy measurements. While we present techniques optimized for particle physics, the likelihood-free inference formulation is much more general, and these ideas are part of a broader movement that is changing scientific inference in fields as diverse as cosmology, genetics, and epidemiology.
-
When designing learning environments and curricula for diverse populations, it is beneficial to connect with learners’ cultural knowledge, and the related interests, they bring to the learning context. To aid in the design and development of a computing curriculum and identify these areas of personal and cultural connection, we conducted a series of participatory design sessions. The goal of these sessions was to collect ideas around ways to make the instructional materials reflect the interests and voices of the learners. In this paper, we examine how the use of participatory design techniques can advance our understanding of the domains influencing today’s youth. Specifically, we examine the ideas generated by youth during these sessions as a means to understand what influences them and their ideas of cultural relevancy. In this work, we identify the resources children draw on across design activities and organize them to extend the Spheres of Influence framework (L. Archer et al., 2014). We identify seven spheres to attend to when designing for learning: Home and Family, School and Work, Hobbies and Leisure, Media, Interests, Peers, and Identity.
-
A pervasive approach in scientific computing is to express the solution to a given problem as the limit of a sequence of vectors or other mathematical objects. In many situations these sequences are generated by slowly converging iterative procedures, and this led practitioners to seek faster alternatives to reach the limit. ‘Acceleration techniques’ comprise a broad array of methods specifically designed with this goal in mind. They started as a means of improving the convergence of general scalar sequences by various forms of ‘extrapolation to the limit’, i.e. by extrapolating the most recent iterates to the limit via linear combinations. Extrapolation methods of this type, the best-known of which is Aitken’s delta-squared process, require only the sequence of vectors as input. However, limiting methods to use only the iterates is too restrictive. Accelerating sequences generated by fixed-point iterations by utilizing both the iterates and the fixed-point mapping itself has proved highly successful across various areas of physics. A notable example of these fixed-point accelerators (FP-accelerators) is a method developed by Donald Anderson in 1965 and now widely known as Anderson acceleration (AA). Furthermore, quasi-Newton and inexact Newton methods can also be placed in this category since they can be invoked to find limits of fixed-point iteration sequences by employing exactly the same ingredients as those of the FP-accelerators. This paper presents an overview of these methods – with an emphasis on those, such as AA, that are geared toward accelerating fixed-point iterations. We will navigate through existing variants of accelerators, their implementations and their applications, to unravel the close connections between them. These connections were often not recognized by the originators of certain methods, who sometimes stumbled on slight variations of already established ideas. 
Furthermore, even though new accelerators were invented in different corners of science, the underlying principles behind them are strikingly similar or identical. The plan of this article will approximately follow the historical trajectory of extrapolation and acceleration methods, beginning with a brief description of extrapolation ideas, followed by the special case of linear systems, the application to self-consistent field (SCF) iterations, and a detailed view of Anderson acceleration. The last part of the paper is concerned with more recent developments, including theoretical aspects, and a few thoughts on accelerating machine learning algorithms.
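The simplest member of this family, Aitken's delta-squared process, is easy to sketch. The snippet below is a minimal illustration (not taken from the paper itself; the function and sequence names are ours): it applies the Aitken formula x_n - (x_{n+1} - x_n)^2 / (x_{n+2} - 2x_{n+1} + x_n) to the iterates of the classic fixed-point iteration x_{k+1} = cos(x_k).

```python
import math

def aitken(seq):
    """Aitken's delta-squared extrapolation of a scalar sequence.

    Each output term combines three consecutive iterates; the
    accelerated sequence typically converges faster than the input.
    """
    out = []
    for x0, x1, x2 in zip(seq, seq[1:], seq[2:]):
        denom = x2 - 2.0 * x1 + x0          # second difference
        if denom == 0.0:                     # sequence already converged
            out.append(x2)
        else:
            out.append(x0 - (x1 - x0) ** 2 / denom)
    return out

# Fixed-point iteration x_{k+1} = cos(x_k); the limit is ~0.7390851.
xs = [0.5]
for _ in range(6):
    xs.append(math.cos(xs[-1]))

acc = aitken(xs)
# acc[-1] is noticeably closer to the limit than the raw iterate xs[-1].
```

Note that this plain form uses only the iterates, as the abstract describes; the fixed-point accelerators discussed there (e.g. Anderson acceleration) additionally exploit the mapping itself.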