
Award ID contains: 1755625


  1. Makerspaces have complex access control requirements and are increasingly protected through digital access control mechanisms (e.g., keycards, transponders). However, it remains unclear how space administrators craft access control policies, how existing technical infrastructures support or fall short of access needs, and how these policies affect end-users in a makerspace. We bridge this gap through a mixed-methods, multi-stakeholder study. Specifically, we conducted 16 semi-structured interviews with makerspace administrators across the U.S., along with a survey of 48 makerspace end-users. We found that four factors influenced administrators' construction of access control policies: balancing safety versus access; logistics; prior experience; and the politics of funding. Moreover, administrators often made situational exceptions to their policies: for example, during demand spikes, to maintain a good relationship with their staff, and when they trusted the user(s) requesting an exception. Conversely, users expressed frustration with the static nature of access control policies, wishing for negotiability and for social nuance to be factored into access decisions. The upshot is that existing access control mechanisms in makerspaces are often inappropriately static and socially unaware.
  2. Digital resources are often collectively owned and shared by small social groups (e.g., friends sharing Netflix accounts, roommates sharing game consoles, families sharing WhatsApp groups). Yet, little is known about (i) how these groups jointly navigate cybersecurity and privacy (S&P) decisions for shared resources, (ii) how shared experiences influence individual S&P attitudes and behaviors, and (iii) how well existing S&P controls map onto group needs. We conducted group interviews and a supplemental diary study with nine social groups (n=34) of varying relationship types. We identified why, how, and what resources groups shared; their jointly construed threat models; and how these factors influenced group strategies for securing shared resources. We also identified missed opportunities for cooperation and stewardship among group members that could have led to improved S&P behaviors, and found that existing S&P controls often fail to meet the needs of these small social groups.
  3. What triggers end-user security and privacy (S&P) behaviors? How do those triggers vary across individuals? When and how do people share their S&P behavior changes? Prior work in usable security and persuasive design suggests that answering these questions is critical if we are to design systems that encourage pro-S&P behaviors. Accordingly, we asked 852 online survey respondents about their most recent S&P behaviors (n = 1947), what led up to those behaviors, and whether they shared those behaviors. We found that social "triggers", where people interacted with or observed others, were most common, followed by proactive triggers, where people acted absent any external stimulus, and lastly by forced triggers, where people were compelled to act. People from different age groups, nationalities, and levels of security behavioral intention (SBI) all varied in which triggers were dominant. Most importantly, people with low-to-medium SBI most commonly reported social triggers. Furthermore, participants were four times more likely to share their behavior changes with others when they, themselves, reported a social trigger.