Title: Experiences and Lessons Learned Creating and Validating Concept Inventories for Cybersecurity (invited paper)
We reflect on our ongoing journey in the educational Cybersecurity Assessment Tools (CATS) Project to create two concept inventories for cybersecurity. We identify key steps in this journey and important questions we faced. We explain the decisions we made and discuss the consequences of those decisions, highlighting what worked well and what might have gone better.
The CATS Project is creating and validating two concept inventories—conceptual tests of understanding—that can be used to measure the effectiveness of various approaches to teaching and learning cybersecurity. The Cybersecurity Concept Inventory (CCI) is for students who have recently completed any first course in cybersecurity; the Cybersecurity Curriculum Assessment (CCA) is for students who have recently completed an undergraduate major or track in cybersecurity. Each assessment tool comprises 25 multiple-choice questions (MCQs) of various difficulties that target the same five core concepts, but the CCA assumes greater technical background.
Key steps include defining project scope, identifying the core concepts, uncovering student misconceptions, creating scenarios, drafting question stems, developing distractor answer choices, generating educational materials, performing expert reviews, recruiting student subjects, organizing workshops, building community acceptance, forming a team and nurturing collaboration, adopting tools, and obtaining and using funding.
Creating effective MCQs is difficult and time-consuming, and cybersecurity presents special challenges. Because cybersecurity issues are often subtle, where the adversarial model and details matter greatly, it is challenging to construct MCQs for which there is exactly one best but non-obvious answer. We hope that our experiences and lessons learned may help others create more effective concept inventories and assessments in STEM.
Sherman, Alan T.; Herman, Geoffrey L.; Oliva, Linda; Peterson, Peter A.; Golaszewski, Enis; Poulsen, Seth; Scheponik, Travis; Gorti, Akshita (National Cyber Summit (NCS) Research Track 2020)
Sherman, A. T.; Oliva, L.; Golaszewski, E.; Phatak, D.; Scheponik, T.; Herman, G.; Choi, D. S.; Offenberger, S.; Peterson, P. A.; Dykstra, J.; et al. (IEEE Security & Privacy)
For two days in February 2018, 17 cybersecurity educators and professionals from government and industry met in a "hackathon" to refine existing draft multiple-choice test items, and to create new ones, for a Cybersecurity Concept Inventory (CCI) and Cybersecurity Curriculum Assessment (CCA) being developed as part of the Cybersecurity Assessment Tools (CATS) Project. We report on the results of the CATS Hackathon, discussing the methods we used to develop test items, highlighting the evolution of a sample test item through this process, and offering suggestions to others who may wish to organize similar hackathons.
Each test item embodies a scenario, question stem, and five answer choices. During the Hackathon, participants organized into teams to (1) generate new scenarios and question stems, (2) extend CCI items into CCA items, and generate new answer choices for new scenarios and stems, and (3) review and refine draft CCA test items.

The CATS Project provides rigorous evidence-based instruments for assessing and evaluating educational practices; these instruments can help identify pedagogies and content that are effective in teaching cybersecurity. The CCI measures how well students understand basic concepts in cybersecurity—especially adversarial thinking—after a first course in the field. The CCA measures how well students understand core concepts after completing a full cybersecurity curriculum.
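The item anatomy described above (a scenario, a question stem, and five answer choices, with exactly one best answer) can be sketched as a small data structure with a well-formedness check. This is an illustrative sketch only; the field names and example item are hypothetical, not taken from the CATS Project.

```python
from dataclasses import dataclass, field

# Hypothetical representation of one CATS-style test item.
# Names and the sample item below are illustrative assumptions.
@dataclass
class TestItem:
    concept: str    # one of the five core concepts targeted
    scenario: str   # situational context for the question
    stem: str       # the question itself
    choices: list = field(default_factory=list)  # (text, is_correct) pairs

    def is_well_formed(self) -> bool:
        """An item needs five choices with exactly one best answer."""
        return (len(self.choices) == 5
                and sum(1 for _, correct in self.choices if correct) == 1)

item = TestItem(
    concept="adversarial thinking",
    scenario="An attacker can observe, but not modify, traffic on a LAN.",
    stem="Which defense best mitigates this threat?",
    choices=[("Encrypt traffic end-to-end", True),
             ("Use longer passwords", False),
             ("Install antivirus software", False),
             ("Enable verbose logging", False),
             ("Rotate passwords monthly", False)],
)
print(item.is_well_formed())  # → True
```

A check like this captures the constraint the abstracts emphasize: constructing items with exactly one best but non-obvious answer.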
We report on the status of our Cybersecurity Assessment Tools (CATS) project that is creating and validating a concept inventory for cybersecurity, which assesses the quality of instruction of any first course in cybersecurity. In fall 2014, we carried out a Delphi process that identified core concepts of cybersecurity. In spring 2016, we interviewed twenty-six students to uncover their understandings and misconceptions about these concepts. In fall 2016, we generated our first assessment tool, a draft Cybersecurity Concept Inventory (CCI), comprising approximately thirty multiple-choice questions. Each question targets a concept; incorrect answers are based on observed misconceptions from the interviews. This year we are validating the draft CCI using cognitive interviews, expert reviews, and psychometric testing. In this paper, we highlight our progress to date in developing the CCI.

The CATS project provides infrastructure for a rigorous evidence-based improvement of cybersecurity education. The CCI permits comparisons of different instructional methods by assessing how well students learned the core concepts of the field (especially adversarial thinking), where instructional methods refer to how material is taught (e.g., lab-based, case studies, collaborative, competitions, gaming). Specifically, the CCI is a tool that will enable researchers to scientifically quantify and measure the effect of their approaches to, and interventions in, cybersecurity education.
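The abstract above mentions psychometric testing without specifying the analyses. One common classical-test-theory check (offered here as an illustration, not as the CATS team's actual method) computes an item's difficulty as the proportion of students answering correctly, and its discrimination as the point-biserial correlation between item score and total score.

```python
import statistics

# Illustrative classical item analysis; not necessarily the analyses
# the CATS Project used. responses[s][i] is 1 if student s answered
# item i correctly, else 0.
def item_stats(responses, i):
    totals = [sum(row) for row in responses]  # each student's total score
    item = [row[i] for row in responses]      # correctness on item i
    difficulty = sum(item) / len(item)        # proportion correct
    # Point-biserial: Pearson correlation of item score with total score.
    mt, st = statistics.mean(totals), statistics.pstdev(totals)
    mi, si = statistics.mean(item), statistics.pstdev(item)
    if st == 0 or si == 0:
        return difficulty, 0.0
    cov = sum((t - mt) * (x - mi) for t, x in zip(totals, item)) / len(item)
    return difficulty, cov / (st * si)

# Toy data: five students, three items.
responses = [
    [1, 1, 0], [1, 0, 1], [0, 0, 0], [1, 1, 1], [0, 1, 0],
]
diff, disc = item_stats(responses, 0)
print(round(diff, 2))  # → 0.6
```

High-discrimination items (those answered correctly mainly by high scorers) are the ones most useful for distinguishing levels of conceptual understanding.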
Frady, K.; Brown, C.; High, K.; Hughes, C.; O'Hara, R.; & Huang, S. (Annual American Society for Engineering Education Conference)
There is little research or understanding of curricular differences between two- and four-year programs, career development of engineering technology (ET) students, and professional preparation for ET early career professionals [1]. Yet, ET credentials (including certificates, two-, and four-year degrees) represent over half of all engineering credentials awarded in the U.S [2]. ET professionals are important hands-on members of engineering teams who have specialized knowledge of components and engineering systems. This research study focuses on how career orientations affect engineering formation of ET students educated at two-year colleges. The theoretical framework guiding this study is Social Cognitive Career Theory (SCCT). SCCT is a theory which situates attitudes, interests, and experiences and links self-efficacy beliefs, outcome expectations, and personal goals to educational and career decisions and outcomes [3]. Student knowledge of attitudes toward and motivation to pursue STEM and engineering education can impact academic performance and indicate future career interest and participation in the STEM workforce [4]. This knowledge may be measured through career orientations or career anchors. A career anchor is a combination of self-concept characteristics which includes talents, skills, abilities, motives, needs, attitudes, and values. Career anchors can develop over time and aid in shaping personal and career identity [6]. The purpose of this quantitative research study is to identify dimensions of career orientations and anchors at various educational stages to map to ET career pathways. The research question this study aims to answer is: For students educated in two-year college ET programs, how do the different dimensions of career orientations, at various phases of professional preparation, impact experiences and development of professional profiles and pathways? 
The participants (n=308) in this study represent three different groups: (1) students in engineering technology related programs from a medium rural-serving technical college (n=136), (2) students in engineering technology related programs from a large urban-serving technical college (n=52), and (3) engineering students at a medium Research 1 university who have transferred from a two-year college (n=120). All participants completed Schein’s Career Anchor Inventory [5]. This instrument contains 40 six-point Likert-scale items with eight subscales which correlate to the eight different career anchors. Additional demographic questions were also included. The data analysis includes graphical displays for data visualization and exploration, descriptive statistics for summarizing trends in the sample data, and then inferential statistics for determining statistical significance. This analysis examines career anchor results across groups by institution, major, demographics, types of educational experiences, types of work experiences, and career influences. This cross-group analysis aids in the development of profiles of values, talents, abilities, and motives to support customized career development tailored specifically for ET students. These findings contribute research to a gap in ET and two-year college engineering education research. Practical implications include use of findings to create career pathways mapped to career anchors, integration of career development tools into two-year college curricula and programs, greater support for career counselors, and creation of alternate and more diverse pathways into engineering.
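A minimal scoring sketch for a Schein-style career anchor inventory might look as follows. It assumes 40 six-point Likert items grouped into eight subscales of five items each; the actual instrument's item-to-anchor mapping and anchor labels are not given in the abstract, so both are assumptions here.

```python
# Assumed anchor labels; the published instrument may name them differently.
ANCHORS = ["technical", "managerial", "autonomy", "security",
           "entrepreneurial", "service", "challenge", "lifestyle"]

def anchor_scores(ratings):
    """Mean rating per anchor, assuming ratings are ordered by subscale,
    five items per anchor."""
    assert len(ratings) == 40 and all(1 <= r <= 6 for r in ratings)
    return {a: sum(ratings[i * 5:(i + 1) * 5]) / 5
            for i, a in enumerate(ANCHORS)}

scores = anchor_scores([4] * 40)
print(scores["autonomy"])  # → 4.0
```

Subscale means like these are the kind of per-anchor summary that the study's cross-group comparisons (by institution, major, demographics) would operate on.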
References: [1] National Academy of Engineering. (2016). Engineering technology education in the United States. Washington, DC: The National Academies Press. [2] The Integrated Postsecondary Education Data System (IPEDS). (2014). Data on engineering technology degrees. [3] Lent, R.W., & Brown, S.B. (1996). Social cognitive approach to career development: An overview. Career Development Quarterly, 44, 310-321. [4] Unfried, A., Faber, M., Stanhope, D.S., & Wiebe, E. (2015). The development and validation of a measure of student attitudes toward science, technology, engineering, and math (S-STEM). Journal of Psychoeducational Assessment, 33(7), 622-639. [5] Schein, E. (1996). Career anchors revisited: Implications for career development in the 21st century. Academy of Management Executive, 10(4), 80-88. [6] Schein, E.H., & Van Maanen, J. (2013). Career Anchors, 4th ed. San Francisco: Wiley.
Sherman, Alan T., Herman, Geoffrey L., Oliva, Linda, Peterson, Peter A., Golaszewski, Enis, Poulsen, Seth, Scheponik, Travis, and Gorti, Akshita. Experiences and Lessons Learned Creating and Validating Concept Inventories for Cybersecurity (invited paper). Retrieved from https://par.nsf.gov/biblio/10161696. Proceedings of the 2020 National Cyber Summit.