Search for: All records

Creators/Authors contains: "Armstrong, Miriam"

  1. Purpose This study aimed to investigate how honest participants perceived an attacker to be during shoulder surfing scenarios that varied in which Principle of Persuasion in Social Engineering (PPSE) was used, whether perceived honesty changed as scenarios progressed, and whether any changes were greater in some scenarios than others. Design/methodology/approach Participants read one of six shoulder surfing scenarios. Five depicted an attacker using one of the PPSEs; the sixth depicted an attacker using as few PPSEs as possible and served as a control condition. Participants then rated perceived attacker honesty. Findings Honesty ratings were equal across conditions at the beginning of the conversation; participants in every condition initially perceived the attacker to be honest; perceived honesty declined when the attacker asked the target to perform an action that would afford shoulder surfing; that decline was greater when the Distraction and Social Proof PPSEs were used; participants perceived the attacker to be dishonest when such requests used the Distraction and Social Proof PPSEs; and perceived honesty did not change while the attacker used the target's computer. Originality/value To the best of the authors' knowledge, this experiment is the first to investigate how persuasion tactics affect perceptions of attackers during shoulder surfing attacks. These results have important implications for shoulder surfing prevention training programs and penetration tests.
  2. Auditory icons are naturally occurring sounds that systems play to convey information. Because systems must often convey complex messages, a system can play either (1) a single sound that represents the entire message or (2) a sequence of sounds in which the first sound represents the first part of the message, the next sound represents the next part, and so on; the latter are known as concatenated auditory icons. To evaluate these approaches, participants interpreted single and concatenated auditory icons designed to convey their messages either well or poorly. Single auditory icons designed to convey their message well were correctly interpreted more often than those designed to convey it poorly; the same was not true for concatenated auditory icons. In other words, composing a concatenated auditory icon from sounds that each represent their piece of the message well does not guarantee the whole message will be understood: the whole of a concatenated auditory icon is not the sum of its parts.
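
    Below is a minimal sketch of the concatenation idea the entry above describes, assuming each message part exists as a short WAV file in the same format; the file names and helper function are hypothetical illustrations, not materials from the study.

        # Sketch: build a concatenated auditory icon by joining per-part
        # sounds into one playable message (hypothetical file names).
        import wave

        def concatenate_icons(part_paths, out_path):
            """Join component auditory icons (WAV files) into one message."""
            frames, params = [], None
            for path in part_paths:
                with wave.open(path, "rb") as part:
                    if params is None:
                        params = part.getparams()  # channels, rate, sample width
                    frames.append(part.readframes(part.getnframes()))
            with wave.open(out_path, "wb") as out:
                out.setparams(params)
                for chunk in frames:
                    out.writeframes(chunk)

        # e.g. a two-part message: an "email" sound followed by an "arrived" sound
        concatenate_icons(["email.wav", "arrived.wav"], "email_arrived.wav")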
  3. Purpose Nonexperts do not always follow the advice in cybersecurity warning messages. To increase compliance, it is recommended that warning messages use nontechnical language, describe how the cyberattack will affect the user personally and do so in a way that aligns with how the user thinks about cyberattacks. Implementing those recommendations requires an understanding of how nonexperts think about cyberattack consequences. Unfortunately, research has yet to reveal nonexperts’ thinking about cyberattack consequences. Toward that end, the purpose of this study was to examine how nonexperts think about cyberattack consequences. Design/methodology/approach Nonexperts sorted cyberattack consequences based on perceived similarity and labeled each group based on the reason those grouped consequences were perceived to be similar. Participants’ labels were analyzed to understand the general themes and the specific features that are present in nonexperts’ thinking. Findings The results suggested participants mainly thought about cyberattack consequences in terms of what the attacker is doing and what will be affected. Further, the results suggested participants thought about certain aspects of the consequences in concrete terms and other aspects of the consequences in general terms. Originality/value This research illuminates how nonexperts think about cyberattack consequences. This paper also reveals what aspects of nonexperts’ thinking are more or less concrete and identifies specific terminology that can be used to describe aspects that fall into each case. Such information allows one to align warning messages to nonexperts’ thinking in more nuanced ways than would otherwise be possible. 
  4. Phishing emails have certain characteristics, including wording related to urgency and unrealistic promises (i.e., “too good to be true”), that attempt to lure victims. To test whether these characteristics affected users’ suspiciousness of emails, users completed a phishing judgment task in which we manipulated (1) email type (legitimate, phishing), (2) consequence amount (small, medium, large), (3) consequence type (gain, loss) and (4) urgency (present, absent). We predicted users would be most suspicious of phishing emails that were urgent and offered large gains. The results partially supported that prediction: users were more suspicious of phishing emails with a gain consequence type or a large consequence amount. However, urgency was not a significant predictor of suspiciousness for phishing emails, though it was for legitimate emails. These results have important cybersecurity-related implications for penetration testing and user training.
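
    As a minimal sketch of the crossed design the entry above describes (2 email types x 3 consequence amounts x 2 consequence types x 2 urgency levels), the snippet below enumerates the resulting conditions; it illustrates the factor structure only and is not the authors' materials.

        # Sketch: enumerate the 2 x 3 x 2 x 2 factorial conditions.
        from itertools import product

        email_types = ["legitimate", "phishing"]
        consequence_amounts = ["small", "medium", "large"]
        consequence_types = ["gain", "loss"]
        urgency_levels = ["present", "absent"]

        conditions = list(product(email_types, consequence_amounts,
                                  consequence_types, urgency_levels))
        print(len(conditions))  # 24 distinct email conditions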
  5. Objective To understand how aspects of vishing calls (phishing phone calls) influence perceived visher honesty. Background Little is understood about how targeted individuals behave during vishing attacks. According to truth-default theory, people assume others are being honest until something triggers their suspicion. We investigated whether that was true during vishing attacks. Methods Twenty-four participants read written descriptions of eight real-world vishing calls. Half included highly sensitive requests; the remainder included seemingly innocuous requests. Participants rated visher honesty at multiple points during conversations. Results Participants initially perceived vishers to be honest. Honesty ratings decreased before requests occurred. Honesty ratings decreased further in response to highly sensitive requests, but not seemingly innocuous requests. Honesty ratings recovered somewhat, but only after highly sensitive requests. Conclusions The present results revealed five important insights: (1) people begin vishing conversations in the truth-default state, (2) certain aspects of vishing conversations serve as triggers, (3) other aspects of vishing conversations do not serve as triggers, (4) in certain situations, people’s perceptions of visher honesty improve, and, more generally, (5) truth-default theory may be a useful tool for understanding how targeted individuals behave during vishing attacks. Application Those developing systems that help users deal with suspected vishing attacks or penetration testing plans should consider (1) targeted individuals’ truth-bias, (2) the influence of visher demeanor on the likelihood of deception detection, (3) the influence of fabricated situations surrounding vishing requests on the likelihood of deception detection, and (4) targeted individuals’ lack of concern about seemingly innocuous requests.
  6. Purpose This study aims to examine how social engineers use persuasion principles during vishing attacks. Design/methodology/approach In total, 86 examples of real-world vishing attacks were found in articles and videos. Each example was coded to determine which persuasion principles were present in that attack and how they were implemented, i.e. which specific elements of the attack contributed to the presence of each persuasion principle. Findings Authority (A), social proof (S) and distraction (D) were the most widely used persuasion principles in vishing attacks, followed by liking, similarity and deception (L). These four persuasion principles occurred in a majority of vishing attacks, while commitment, reciprocation and consistency (C) did not. Further, certain sets of persuasion principles (i.e. A, D, L and S; A, C, D, L and S; and A, D and S) were used more often than others. It was noteworthy that despite their similarities, those sets of persuasion principles were implemented in different ways, and certain specific ways of implementing certain persuasion principles (e.g. vishers claiming to have authority over the victim) were quite rare. Originality/value To the best of the authors’ knowledge, this study is the first to investigate how social engineers use persuasion principles during vishing attacks. As such, it provides important insight into how social engineers implement vishing attacks and lays a critical foundation for future research investigating the psychological aspects of vishing attacks. The present results have important implications for vishing countermeasures and education.
  7. More specialized cybersecurity education programs are needed to address workforce needs, but it is unclear which knowledge, skills, and abilities (KSAs) fulfill industry needs. We interviewed 48 professionals within four cyber defense specialty areas: (1) Cyber Network Defense Analysis, (2) Cyber Network Defense Infrastructure Support, (3) Incident Response, and (4) Vulnerability Assessment and Management. The professionals rated a number of specialized KSAs along two dimensions: how important the KSA was to their job and how difficult the KSA was to learn. Overall, communication and other non-technical skills were rated as very important for all cyber defense jobs. Findings indicated that, for some specialty areas, technical knowledge and skills vary considerably between jobs, so the ability to teach oneself is more valuable than proficiency in any one KSA. Findings may be used to inform the development of general cybersecurity curricula, as well as curricula that focus on Cyber Network Defense Analysis, Cyber Network Defense Infrastructure Support, or Vulnerability Assessment and Management.