Title: Refining a Taxonomy for Categorizing the Quality of Engineering Student Questions
The ability to identify one's own confusion and to ask a question that resolves it is an essential metacognitive skill that supports self-regulation (Winne, 2005). Yet, while students receive substantial training in how to answer questions, little classroom time is spent training students how to ask good questions. Past research has shown that students are able to pose more high-quality questions after being instructed in a taxonomy for classifying the quality of their questions (Marbach-Ad & Sokolove, 2000). As pilot data collection in preparation for a larger study funded through NSF-DUE, we provided engineering statics students training in writing high-quality questions to address their own confusions. The training emphasized the value of question-asking in learning and how to categorize questions using a simple taxonomy based on prior work (Harper et al., 2003). The taxonomy specifies five question levels: 1) an unspecific question, 2) a definition question, 3) a question about how to do something, 4) a why question, and 5) a question that extends knowledge to a new circumstance.

At the end of each class period during a semester-long statics course, students were prompted to write and categorize a question that they believed would help them clarify their current point of greatest confusion. Through regular practice writing and categorizing such questions, we hoped to improve students' abilities to ask questions that require higher-level thinking. We collected data from 35 students in courses at two institutions. Over the course of the semester, students had the opportunity to write and categorize twenty of their own questions. After the semester, the faculty member categorized student questions using the taxonomy to assess the appropriateness of the taxonomy and whether students used it accurately.

Analysis of the pilot data indicates three issues to be addressed: 1) student compliance in writing and categorizing their questions varied; 2) some students had difficulty correctly coding their questions using the taxonomy; and 3) some student questions could not be clearly characterized using the taxonomy, even for faculty raters. We will address each of these issues with appropriate refinements in our next round of data collection: 1) students may have been overwhelmed by the request to write a question after each class period, so in the future we will require students to write and categorize at least one question per week, with more frequent questions encouraged; 2) to improve student use of the taxonomy, students will receive more practice with the taxonomy when it is introduced and more feedback on their categorization of questions during the semester; and 3) we are reformulating our taxonomy to accommodate questions that straddle more than one category, such as a question about how to extend a mathematical operation to a new situation (which could be categorized as either level 3 or level 5). We are hopeful that these changes will improve accuracy and compliance, enabling us to use the intervention as a means to promote metacognitive regulation and to measure the resulting changes, which is the intent of the larger project.
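To make the five-level scheme concrete, here is a minimal Python sketch that encodes the taxonomy as an enumeration, with one illustrative statics question per level. The level names and example questions are our own paraphrases for illustration; they are not drawn from the study's instruments or data.

```python
from enum import IntEnum

class QuestionLevel(IntEnum):
    """Five-level question-quality taxonomy (after Harper et al., 2003)."""
    UNSPECIFIC = 1  # vague, unfocused question
    DEFINITION = 2  # asks what a term means
    HOW = 3         # asks how to carry out a procedure
    WHY = 4         # asks for the reason behind a rule or result
    EXTENSION = 5   # extends a concept to a new circumstance

# Illustrative statics questions for each level (hypothetical examples).
EXAMPLES = {
    QuestionLevel.UNSPECIFIC: "I don't get moments. Can you explain?",
    QuestionLevel.DEFINITION: "What is a couple moment?",
    QuestionLevel.HOW: "How do I sum moments about a point not on the body?",
    QuestionLevel.WHY: "Why can a couple moment be applied anywhere on a rigid body?",
    QuestionLevel.EXTENSION: "Would the equilibrium equations still apply if the pin support were replaced by a cable?",
}

for level, question in EXAMPLES.items():
    print(f"Level {level.value} ({level.name}): {question}")
```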
Award ID(s):
1945961
PAR ID:
10422073
Author(s) / Creator(s):
Date Published:
Journal Name:
2021 ASEE Annual Conference and Exposition
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. John E. Mitchell (Ed.)
    Contribution: This article describes the implementation, assessment, and evaluation of conceptual-based writing exercises in an introductory course on electric circuit analysis. Background: Students' struggles in gateway courses such as circuit analysis are often traced to inadequate metacognitive skills as well as misconceptions regarding fundamental phenomena related to the course. Writing is known to be a powerful tool for gaining insight into a student's thought process and for fostering metacognitive activity. Research Questions: What effect does the use of short writing exercises have on students' understanding of fundamental concepts related to the behavior of electric circuits operating at dc? What effect does the use of conceptually based writing exercises have on students' ability to justify their responses when answering conceptual questions related to basic electric circuit concepts? Methodology: A single writing exercise was given in the first semester of the study, and a total of five such exercises were administered in the second semester. In each semester, students were separated into "at-risk" and "not at-risk" groups based on their responses to the first writing exercise. A 2 x 2 x (2) mixed analysis of variance (ANOVA) was conducted, with risk group (at-risk/not at-risk) and semester as between-subjects factors and pre-test/post-test scores on a multiple-choice conceptual exam as the within-subjects factor. Findings: Results suggest that only the at-risk group may have benefited, in terms of deepened conceptual understanding and the ability to justify their responses, from the use of multiple conceptual-based writing exercises.
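For readers unfamiliar with the design, here is a minimal sketch of the repeated-measures part of such an analysis, using the pingouin library on invented long-format data. The column names and scores are hypothetical; pingouin's mixed_anova supports one between-subjects factor, so this shows a 2 x (2) slice (risk group by pre/post), while the full 2 x 2 x (2) design with semester as a second between-subjects factor would require a more general linear model.

```python
import pandas as pd
import pingouin as pg  # pip install pingouin

# Hypothetical long-format scores: six students, three per risk group,
# each measured at pre- and post-test.
df = pd.DataFrame({
    "student": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "risk": ["at-risk"] * 6 + ["not-at-risk"] * 6,
    "time": ["pre", "post"] * 6,
    "score": [40, 62, 35, 58, 45, 60, 70, 74, 80, 79, 75, 81],
})

# Mixed ANOVA: 'risk' between subjects, 'time' within subjects.
aov = pg.mixed_anova(data=df, dv="score", within="time",
                     subject="student", between="risk")
print(aov[["Source", "F", "p-unc"]])
```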
    To enhance conceptual understanding of mathematical models for inventory management, we developed poetry-writing assignments for a required, upper-level undergraduate course in an industrial and systems engineering program. Specifically, two poetry-writing assignments were incorporated into an inventory and supply chain system design and control course. The first assignment, due one week before the first term exam, asked students to write a poem about a concept, model, or topic related to deterministic inventory modeling. The second assignment, due one week before the second term exam, asked students to write a poem about a concept, model, or topic related to stochastic inventory modeling. The students were also asked to respond to several open-ended questions about their approach to writing the poems and their assessment of the impact of the poetry writing on their conceptual understanding of the underlying mathematical models. Data were collected in the Spring 2022 semester. The student-written poetry will be analyzed for correctness and to identify misunderstandings or gaps in understanding. In this paper, we present our findings from the content analysis of student-written poetry and our preliminary findings on the effectiveness of poetry-writing assignments for enhancing conceptual understanding of mathematical models for inventory management.
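As one concrete instance of the deterministic models such a course typically covers, the sketch below computes the classic economic order quantity (EOQ). The abstract does not say which specific models the assignments targeted, and the parameter values here are hypothetical; only the formula itself is standard.

```python
import math

def eoq(demand_rate: float, order_cost: float, holding_cost: float) -> float:
    """Economic order quantity: Q* = sqrt(2 * D * K / h).

    demand_rate  D: units demanded per year
    order_cost   K: fixed cost per order
    holding_cost h: cost to hold one unit for one year
    """
    return math.sqrt(2 * demand_rate * order_cost / holding_cost)

# Hypothetical parameters: 1200 units/year, $50 per order, $2/unit-year.
print(f"Q* = {eoq(1200, 50.0, 2.0):.1f} units")
```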
  3.
    In mechanics, the standard 3-credit, 45-hour course is sufficient to deliver standard lectures with prepared examples and questions. Moreover, it is not only feasible but preferable to employ any of a variety of active learning and teaching techniques. Nevertheless, even when active learning is strategically used, students and instructors alike experience pressure to accomplish their respective learning and teaching goals under the constraints of the academic calendar, raising questions as to whether the allocated time is sufficient to enable authentic learning. One way to assess learning progress is to examine the learning cycles through which students attempt, re-think, and re-attempt their work. This article provides data to benchmark the time required to learn key Statics concepts, based on results of instruction of approximately 50 students in a Statics class at a public research university during the Fall 2020 semester. Two parallel techniques are employed to foster and understand student learning cycles.

    • Through a Mastery-Based Learning model, 15 weekly pass/fail "Mastery Tests" are given. Students who do not pass may re-test with a different but similar test on the same topic each week until the semester's conclusion. The tests are highly structured in that they are well posed and highly focused. For example, some tests focus only on drawing Free Body Diagrams, with no equations or calculations; other tests focus on writing equilibrium equations from a given Free Body Diagram. Passing the first six tests is required to earn the grade of D; passing the next three earns a C; the next three, a B; and the final three, an A. Evaluations include coding of student responses to infer student reasoning. Learning cycles occur as students repeat the same topics, and their progress is assessed by passing rates and by comparing evolving responses to the same test topics.

    • Concept Questions that elicit qualitative responses and written explanations are deployed at least weekly. The learning cycle here consists of students answering a question, seeing the overall class results (but without the correct answer), having a chance to explore the question with other students and the instructor, and finally having an opportunity to re-answer the same question, perhaps a few minutes or up to a couple of days later. Sometimes the same question is given a third time to encourage further effort or progress.

    To date, results from both cycles appear to agree on one important conclusion: the rate of demonstrated learning is quite low. For example, each Mastery Test has a passing rate of 20%-30%, including for students with several repeats. With the Concept Questions, typically no more than half of the students who answered incorrectly change to the correct answer by the time of the final poll. The final article will provide quantitative and qualitative results from each type of cycle, including tracking of coded responses on Mastery Tests, written responses on Concept Questions, and cross-comparisons thereof. Additional results will be presented from student surveys. Since the Mastery Tests and Concept Questions follow typical Statics topics, this work has potential to lead to a standardized set of benchmarks and standards for measuring student learning, and its rate, in Statics.
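As a rough illustration of how passing rates across repeated attempts could be tabulated from such data, here is a minimal pandas sketch. The column names and records are invented for illustration, not the study's data or the authors' analysis code.

```python
import pandas as pd

# Hypothetical attempt log: one row per student per mastery-test attempt.
attempts = pd.DataFrame({
    "student": [1, 1, 2, 3, 3, 3, 4, 4],
    "topic":   ["FBD", "FBD", "FBD", "FBD", "FBD", "FBD",
                "equilibrium", "equilibrium"],
    "attempt": [1, 2, 1, 1, 2, 3, 1, 2],
    "passed":  [False, True, True, False, False, True, False, False],
})

# Fraction of attempts that pass, broken out by topic and attempt number,
# to expose how pass rates evolve over repeated learning cycles.
rates = attempts.groupby(["topic", "attempt"])["passed"].mean()
print(rates)
```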
    Question-asking is a crucial learning and teaching approach. It reveals different levels of students' understanding, application, and potential misconceptions. Previous studies have categorized question types into higher and lower orders, finding positive and significant associations between higher-order questions and students' critical thinking ability and learning outcomes in different learning contexts. However, the diversity of higher-order questions, especially in collaborative learning environments, has left open the question of how they may differ from other types of dialogue that emerge from students' conversations. To address these questions, our study utilized natural language processing techniques to build a model and investigate the characteristics of students' higher-order questions. We interpreted these questions using Bloom's taxonomy, and our results reveal three types of higher-order questions during collaborative problem-solving. Students often use "Why", "How", and "What if" questions to 1) understand the reason and thought process behind their partners' actions; 2) explore and analyze the project by pinpointing the problem; and 3) propose and evaluate ideas or alternative solutions. In addition, we found that dialogue labeled "Social", "Question - other", "Directed at Agent", and "Confusion/Help Seeking" shows similar underlying patterns to higher-order questions. Our findings provide insight into the different scenarios driving students' higher-order questions and inform the design of adaptive systems to deliver personalized feedback based on students' questions.
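The abstract does not specify the model, so the sketch below is a generic stand-in for this kind of question classifier: a TF-IDF bag-of-words pipeline with logistic regression from scikit-learn, trained to separate higher-order from lower-order questions. The example questions and labels are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training examples: Why/How/What-if questions vs. lower-order ones.
questions = [
    "Why does the loop stop early when the list is empty?",
    "How would this approach change if the input were streamed?",
    "What if we used a stack instead of recursion here?",
    "Which button do I click to run it?",
    "Is the answer 4?",
    "Where do I submit the project?",
]
labels = ["higher", "higher", "higher", "lower", "lower", "lower"]

# TF-IDF features over unigrams and bigrams, then a linear classifier.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(questions, labels)

print(clf.predict(["Why might the agent suggest a different solution here?"]))
```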
    Addressing common student questions in introductory STEM courses early in the term is one way that instructors can ensure that their students have all been presented with information about how to succeed in their courses. However, categorizing student questions and identifying evidence-based resources to address them take time, and instructors may not be able to easily collect and respond to student questions at the beginning of every course. To help faculty effectively anticipate and respond to student questions, we 1) administered surveys in multiple STEM courses to identify common student questions, 2) conducted a qualitative analysis to determine categories of student questions (e.g., what are best practices for studying, how can in- and out-of-course time be used effectively), and 3) collaboratively identified advice on how course instructors can answer these questions. Here, we share tips, evidence-based strategies, and resources from faculty that instructors can use to develop their own responses for students. We hope that educators can use these common student questions as a starting point to proactively address questions throughout the course, and that the compiled resources will allow instructors to easily find materials to consider for their own courses.