Search for: All records

Creators/Authors contains: "Phatak, Dhananjay"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).
Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. We report on the status of our Cybersecurity Assessment Tools (CATS) project, which is creating and validating a concept inventory for cybersecurity that assesses the quality of instruction of any first course in cybersecurity. In fall 2014, we carried out a Delphi process that identified core concepts of cybersecurity. In spring 2016, we interviewed twenty-six students to uncover their understandings and misconceptions about these concepts. In fall 2016, we generated our first assessment tool: a draft Cybersecurity Concept Inventory (CCI), comprising approximately thirty multiple-choice questions. Each question targets a concept; incorrect answers are based on observed misconceptions from the interviews. This year we are validating the draft CCI using cognitive interviews, expert reviews, and psychometric testing. In this paper, we highlight our progress to date in developing the CCI. The CATS project provides infrastructure for a rigorous, evidence-based improvement of cybersecurity education. The CCI permits comparisons of different instructional methods by assessing how well students learned the core concepts of the field (especially adversarial thinking), where instructional methods refer to how material is taught (e.g., lab-based, case studies, collaborative, competitions, gaming). Specifically, the CCI is a tool that will enable researchers to scientifically quantify and measure the effect of their approaches to, and interventions in, cybersecurity education.
  2. Despite the documented need to train and educate more cybersecurity professionals, we have little rigorous evidence to inform educators on effective ways to engage, educate, or retain cybersecurity students. To begin addressing this gap in our knowledge, we are conducting a series of think-aloud interviews with cybersecurity students to study how students reason about core cybersecurity concepts. We have recruited these students from three diverse institutions: University of Maryland, Baltimore County; Prince George's Community College; and Bowie State University. During these interviews, students grapple with security scenarios designed to probe student understanding of cybersecurity, especially adversarial thinking. We are analyzing student statements using a structured qualitative method, novice-led paired thematic analysis, to document student misconceptions and problematic reasoning. We intend to use these findings to develop Cybersecurity Assessment Tools that can help us assess the effectiveness of pedagogies. These findings can also inform the development of curricula, learning exercises, and other educational materials and policies.