Search results: All records where Creators/Authors contains "Das, Sauvik"


  1. User adoption of security and privacy (S&P) best practices remains low, despite sustained efforts by researchers and practitioners. Social influence is a proven method for guiding user S&P behavior, though most prior work has focused on peer influence, which is only possible with a known social graph. In a study of 104 Facebook users, we instead demonstrate that crowdsourced S&P suggestions are significantly influential. We also tested how reflective writing affected participants' S&P decisions, with and without suggestions. Participants who engaged in reflective writing were less likely to accept suggestions, whether crowdsourced or Facebook defaults. Of particular note, when reflective-writing participants were shown the Facebook default suggestion, they not only rejected it but also (unknowingly) configured their settings in accordance with expert recommendations. Our work suggests that both non-personal social influence and reflective writing can positively influence users' S&P decisions, but that the two interact negatively.
  2. Laptop webcams can be covertly activated by malware and law enforcement agencies. Consequently, 59% of Americans manually cover their webcams to avoid being surveilled. However, manual covers are prone to human error: in a survey of 200 users, we found that 61.5% occasionally forget to re-attach their cover after using their webcam. To address this problem, we developed Smart Webcam Cover (SWC): a thin PDLC film overlay that covers the webcam by default, stays open only while the user has manually uncovered it, and automatically re-covers the webcam when it is not in use (a rough sketch of this covered-by-default control logic appears after this list). Through a two-phase design iteration process, we evaluated SWC with 20 webcam cover users in a remote study using a video prototype of SWC, compared it against manual cover operation, and discuss factors that influence users' trust in SWC's effectiveness and their perceptions of its utility.
  3. Bluetooth requires device pairing to ensure security in data transmission, encumbering a number of ad-hoc, transactional interactions that require both ease-of-use and "good enough" security: e.g., sharing contact information or secure links with people nearby. We introduce Bit Whisperer, an ad-hoc, short-range wireless communication system that enables "walk up and share" data transmissions with "good enough" security. Bit Whisperer transmits data to proximate devices co-located on a solid surface through high-frequency, inaudible acoustic signals (a sketch of one possible acoustic encoding appears after this list). The physical surface has two benefits: it limits communication range, since sound propagates more robustly along a flat solid surface than through air; and it makes the domain of communication visible, helping users identify exactly with whom they are sharing data without prior pairing. Through a series of technical evaluations, we demonstrate that Bit Whisperer is robust for common use cases and secure against likely threats. We also implement three example applications to demonstrate the utility of Bit Whisperer: 1-to-1 local contact sharing, 1-to-N private link sharing to open a secure group chat, and 1-to-N local device authentication.
  4. Improving end-users' awareness of cybersecurity warnings (e.g., phishing and malware alerts) remains a longstanding problem in usable security. Prior work suggests two key weaknesses with existing warnings: they are primarily communicated via saturated communication channels (e.g., visual, auditory, and vibrotactile); and they are communicated rationally, not viscerally. We hypothesized that wrist-based affective haptics should address both of these weaknesses in a practically deployable form factor: i.e., a replaceable wristband compatible with modern smartwatches like the Apple Watch. To that end, we designed and implemented Spidey Sense, a wristband that produces customizable squeezing sensations to alert users to urgent cybersecurity warnings (a hypothetical squeeze-pattern sketch appears after this list). To evaluate Spidey Sense, we applied a three-phased 'Gen-Rank-Verify' study methodology with 48 participants. We found evidence that, relative to vibrotactile alerts, Spidey Sense was considered more appropriate for the task of alerting people to cybersecurity warnings.
  5. Makerspaces have complex access control requirements and are increasingly protected through digital access control mechanisms (e.g., keycards, transponders). However, it remains unclear how space administrators craft access control policies, how existing technical infrastructures support and fall short of access needs, and how these access control policies impact end-users in a makerspace. We bridge this gap through a mixed-methods, multi-stakeholder study. Specifically, we conducted 16 semi-structured interviews with makerspace administrators across the U.S. along with a survey of 48 makerspace end-users. We found four factors influenced administrators' construction of access control policies: balancing safety versus access; logistics; prior experience; and, the politics of funding. Moreover, administrators often made situational exceptions to their policies: e.g., during demand spikes, to maintain a good relationship with their staff, and if they trusted the user(s) requesting an exception. Conversely, users expressed frustration with the static nature of access control policies, wishing for negotiability and for social nuance to be factored into access decisions. The upshot is that existing mechanisms for access control in makerspaces are often inappropriately static and socially unaware.
  6. Digital resources are often collectively owned and shared by small social groups (e.g., friends sharing Netflix accounts, roommates sharing game consoles, families sharing WhatsApp groups). Yet, little is known about (i) how these groups jointly navigate cybersecurity and privacy (S&P) decisions for shared resources, (ii) how shared experiences influence individual S&P attitudes and behaviors, and (iii) how well existing S&P controls map onto group needs. We conducted group interviews and a supplemental diary study with nine social groups (n=34) of varying relationship types. We identified why, how and what resources groups shared, their jointly construed threat models, and how these factors influenced group strategies for securing shared resources. We also identified missed opportunities for cooperation and stewardship among group members that could have led to improved S&P behaviors, and found that existing S&P controls often fail to meet the needs of these small social groups.
  7. What triggers end-user security and privacy (S&P) behaviors? How do those triggers vary across individuals? When and how do people share their S&P behavior changes? Prior work in usable security and persuasive design suggests that answering these questions is critical if we are to design systems that encourage pro-S&P behaviors. Accordingly, we asked 852 online survey respondents about their most recent S&P behaviors (n = 1947), what led up to those behaviors, and whether they shared those behaviors. We found that social “triggers”, where people interacted with or observed others, were most common, followed by proactive triggers, where people acted in the absence of an external stimulus, and lastly by forced triggers, where people were forced to act. People from different age groups, nationalities, and levels of security behavioral intention (SBI) all varied in which triggers were dominant. Most importantly, people with low-to-medium SBI most commonly reported social triggers. Furthermore, participants were four times more likely to share their behavior changes with others when they, themselves, reported a social trigger.
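
The SWC abstract (item 2) describes the covered-by-default behavior only at a high level; the authors' actual firmware is not reproduced here. As a rough, hypothetical sketch only, the Python below models one way such cover logic could work, assuming a stand-in PDLC driver and stubbed is_webcam_in_use() / user_requested_uncover() signals; all names and behavior are assumptions, not the SWC implementation.

    # Hypothetical sketch of covered-by-default webcam cover logic (not SWC's firmware).
    import time

    class PdlcCover:
        """Stand-in for a PDLC film driver: opaque (covered) vs. transparent (uncovered)."""
        def __init__(self):
            self.covered = True  # covered by default

        def cover(self):
            self.covered = True

        def uncover(self):
            self.covered = False

    def is_webcam_in_use() -> bool:
        """Assumed in-use signal (e.g., from the OS or a hardware indicator); stubbed here."""
        return False

    def user_requested_uncover() -> bool:
        """Assumed manual uncover action (e.g., a physical switch); stubbed here."""
        return False

    def control_loop(cover: PdlcCover, poll_interval_s: float = 1.0) -> None:
        """Keep the webcam covered unless the user uncovers it; re-cover once it is idle."""
        while True:
            if cover.covered and user_requested_uncover():
                cover.uncover()
            elif not cover.covered and not is_webcam_in_use():
                cover.cover()  # automatically re-cover when the webcam is no longer in use
            time.sleep(poll_interval_s)

The property mirrored from the abstract is that re-covering requires no user action, while uncovering always does.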
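
The Bit Whisperer abstract (item 3) states only that data travels as high-frequency, inaudible acoustic signals through a shared surface; the paper's actual modulation and framing are not given here. Purely as an illustration of the general idea, this sketch generates a near-ultrasonic binary FSK waveform with NumPy; the carrier frequencies, bit rate, and lack of framing or error correction are assumptions, not Bit Whisperer's protocol.

    # Illustrative near-ultrasonic FSK encoder (assumed parameters; not Bit Whisperer's scheme).
    import numpy as np

    SAMPLE_RATE = 48_000             # Hz; a common audio hardware rate
    F_ZERO, F_ONE = 18_500, 19_500   # assumed near-ultrasonic carrier frequencies (Hz)
    BIT_DURATION = 0.02              # seconds per bit (assumed)

    def bytes_to_bits(payload: bytes) -> list:
        return [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]

    def encode_fsk(payload: bytes) -> np.ndarray:
        """Map each bit to a short tone burst at one of two (nominally inaudible) frequencies."""
        samples_per_bit = int(SAMPLE_RATE * BIT_DURATION)
        t = np.arange(samples_per_bit) / SAMPLE_RATE
        ramp = int(0.1 * samples_per_bit)
        envelope = np.ones(samples_per_bit)
        envelope[:ramp] = np.linspace(0.0, 1.0, ramp)
        envelope[-ramp:] = np.linspace(1.0, 0.0, ramp)  # fade bursts in/out to limit audible clicks
        chunks = []
        for bit in bytes_to_bits(payload):
            freq = F_ONE if bit else F_ZERO
            chunks.append(np.sin(2 * np.pi * freq * t) * envelope)
        return np.concatenate(chunks)

    # Example: waveform for a short contact-sharing payload; playing it through a speaker
    # coupled to the shared surface (and decoding on the receiver) is omitted.
    waveform = encode_fsk(b"alice@example.com")

The surface-bound range limiting described in the abstract would come from the physical coupling, not from anything in this encoder.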
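
The Spidey Sense abstract (item 4) mentions customizable squeezing sensations but not how patterns are specified. As a hypothetical illustration only, the sketch below represents a squeeze alert as a timed sequence of tightness levels that an actuator driver could play back; the data structure, scale, and example pattern are assumptions, not the Spidey Sense design.

    # Hypothetical wrist-squeeze alert pattern representation (not Spidey Sense's actual API).
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class SqueezeStep:
        tightness: float   # 0.0 = fully relaxed, 1.0 = maximum squeeze (assumed scale)
        duration_s: float  # how long to hold this tightness

    # An assumed "urgent warning" pattern: two firm pulses, then a longer sustained squeeze.
    URGENT_WARNING: List[SqueezeStep] = [
        SqueezeStep(0.8, 0.3), SqueezeStep(0.0, 0.2),
        SqueezeStep(0.8, 0.3), SqueezeStep(0.0, 0.2),
        SqueezeStep(1.0, 1.0), SqueezeStep(0.0, 0.0),
    ]

    def play_pattern(pattern: List[SqueezeStep],
                     set_tightness: Callable[[float], None],
                     sleep: Callable[[float], None]) -> None:
        """Drive an actuator through a pattern; hardware hooks are injected, not implemented here."""
        for step in pattern:
            set_tightness(step.tightness)
            sleep(step.duration_s)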