Despite the documented need to train and educate more cybersecurity professionals, we have little rigorous evidence to inform educators on effective ways to engage, educate, or retain cybersecurity students. To begin addressing this gap in our knowledge, we are conducting a series of think-aloud interviews with cybersecurity students to study how students reason about core cybersecurity concepts. We have recruited these students from three diverse institutions: University of Maryland, Baltimore County; Prince George’s Community College; and Bowie State University. During these interviews, students grapple with security scenarios designed to probe student understanding of cybersecurity, especially adversarial thinking. We are analyzing student statements using a structured qualitative method, novice-led paired thematic analysis, to document student misconceptions and problematic reasoning. We intend to use these findings to develop Cybersecurity Assessment Tools that can help us assess the effectiveness of pedagogies. These findings can also inform the development of curricula, learning exercises, and other educational materials and policies.
Student misconceptions about cybersecurity concepts: Analysis of student think-aloud interviews (Journal of Cybersecurity Education, Research & Practice)
We conducted an observational study to document student misconceptions about cybersecurity using thematic analysis of 25 think-aloud interviews. By understanding patterns in student misconceptions, we provide a basis for developing rigorous evidence-based recommendations for improving teaching and assessment methods in cybersecurity and inform future research. This study is the first to explore student cognition and reasoning about cybersecurity. We interviewed students from three diverse institutions. During these interviews, students grappled with security scenarios designed to probe their understanding of cybersecurity, especially adversarial thinking. We analyzed student statements using a structured qualitative method, novice-led paired thematic analysis, to document patterns in student misconceptions and problematic reasoning that transcend institutions, scenarios, or demographics. Themes generated from this analysis describe a taxonomy of misconceptions but not their causes or remedies. Four themes emerged: overgeneralizations, conflated concepts, biases, and incorrect assumptions. Together, these themes reveal that students generally failed to grasp the complexity and subtlety of possible vulnerabilities, threats, risks, and mitigations, suggesting a need for instructional methods that engage students in reasoning about complex scenarios with an adversarial mindset. These findings can guide teachers’ attention during instruction and inform the development of cybersecurity assessment tools that enable cross-institutional assessments that measure the effectiveness of pedagogies.
- Award ID(s): 1819521
- NSF-PAR ID: 10110287
- Journal Name: Journal of Cyber Security
- Volume: 1
- Issue: 5
- ISSN: 2579-0064
- Sponsoring Org: National Science Foundation
More Like this
-
We reflect on our ongoing journey in the educational Cybersecurity Assessment Tools (CATS) Project to create two concept inventories for cybersecurity. We identify key steps in this journey and important questions we faced. We explain the decisions we made and discuss the consequences of those decisions, highlighting what worked well and what might have gone better. The CATS Project is creating and validating two concept inventories—conceptual tests of understanding—that can be used to measure the effectiveness of various approaches to teaching and learning cybersecurity. The Cybersecurity Concept Inventory (CCI) is for students who have recently completed any first course in cybersecurity; the Cybersecurity Curriculum Assessment (CCA) is for students who have recently completed an undergraduate major or track in cybersecurity. Each assessment tool comprises 25 multiple-choice questions (MCQs) of various difficulties that target the same five core concepts, but the CCA assumes greater technical background. Key steps include defining project scope, identifying the core concepts, uncovering student misconceptions, creating scenarios, drafting question stems, developing distractor answer choices, generating educational materials, performing expert reviews, recruiting student subjects, organizing workshops, building community acceptance, forming a team and nurturing collaboration, adopting tools, and obtaining and using funding. Creating effective MCQs is difficult and time-consuming, and cybersecurity presents special challenges. Because cybersecurity issues are often subtle, where the adversarial model and details matter greatly, it is challenging to construct MCQs for which there is exactly one best but non-obvious answer. We hope that our experiences and lessons learned may help others create more effective concept inventories and assessments in STEM.
-
Artificial intelligence (AI) and cybersecurity are in-demand skills, but little is known about what factors influence computer science (CS) undergraduate students' decisions on whether to specialize in AI or cybersecurity and how these factors may differ between populations. In this study, we interviewed undergraduate CS majors about their perceptions of AI and cybersecurity. Qualitative analyses of these interviews show that students have narrow beliefs about what kind of work AI and cybersecurity entail, the kinds of people who work in these fields, and the potential societal impact AI and cybersecurity may have. Specifically, students tended to believe that all work in AI requires math and training models, while cybersecurity consists of low-level programming; that innately smart people work in both fields; that working in AI comes with ethical concerns; and that cybersecurity skills are important in contemporary society. Some of these perceptions reinforce existing stereotypes about computing and may disproportionately affect the participation of students from groups historically underrepresented in computing. Our key contribution is identifying beliefs that students expressed about AI and cybersecurity that may affect their interest in pursuing the two fields and may, therefore, inform efforts to expand students' views of AI and cybersecurity. Expanding student perceptions of AI and cybersecurity may help correct misconceptions and challenge narrow definitions, which in turn can encourage participation in these fields from all students.
-
Students possess informal, intuitive ways of reasoning about the world, including biological phenomena. Although useful in some cases, intuitive reasoning can also lead to the development of scientifically inaccurate ideas that conflict with central concepts taught in formal biology education settings, including evolution. Using antibiotic resistance as an example of evolution, we developed a set of reading interventions and an assessment tool to examine the extent to which differences in instructional language affect undergraduate student misconceptions and intuitive reasoning. We find that readings that confront intuitive misconceptions can be more effective in reducing those misconceptions than factual explanations of antibiotic resistance that fail to confront misconceptions. Overall, our findings build upon investigations of intuitive reasoning in biology, examine possible instructional interventions, and raise questions about effective implementation of reading interventions in addressing persistent misconceptions about biology.