
Title: Tracing One’s Touches: Continuous mobile user authentication based on touch dynamics
Despite tremendous progress in mobile user authentication (MUA) in recent years, continuous mobile user authentication (CMUA), in which authentication is performed continuously after initial login, remains understudied. In addition, although one-handed interaction with mobile devices has become increasingly common, one-handed CMUA has never been investigated in the literature, and CMUA performance has not been compared between one-handed and two-handed interactions. To fill this gap, we developed a new CMUA method based on the touch dynamics of thumb scrolling on the touchscreen of a mobile device. We implemented the proposed CMUA method in a mobile app and evaluated its effectiveness with data collected from a user study. The findings have implications for the design of effective CMUA using touch dynamics and for improving the accessibility and usability of MUA mechanisms.
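The abstract describes building a CMUA model from the touch dynamics of thumb scrolling. As a rough illustration of what touch-dynamics features can look like, here is a minimal sketch; the `TouchPoint` fields and the particular features (duration, path length, speed, pressure) are illustrative assumptions, not the paper's actual feature set.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class TouchPoint:
    t: float         # timestamp in seconds
    x: float         # screen coordinate in pixels
    y: float
    pressure: float  # normalized touch pressure

def scroll_features(points: list[TouchPoint]) -> dict[str, float]:
    """Summarize one scroll gesture as a small feature vector."""
    duration = points[-1].t - points[0].t
    # Total path length: sum of distances between consecutive touch points.
    length = sum(hypot(b.x - a.x, b.y - a.y) for a, b in zip(points, points[1:]))
    pressures = [p.pressure for p in points]
    return {
        "duration": duration,
        "path_length": length,
        "mean_speed": length / duration if duration > 0 else 0.0,
        "mean_pressure": sum(pressures) / len(pressures),
        "start_y": points[0].y,
        "end_y": points[-1].y,
    }
```

Per-gesture vectors like this would then be compared against a user's enrolled template to produce a continuous stream of authentication decisions.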
Award ID(s):
1917537
NSF-PAR ID:
10095425
Journal Name:
Twenty-fifth Americas Conference on Information Systems
Sponsoring Org:
National Science Foundation
More Like this
  1. Mobile devices typically rely on entry-point and other one-time authentication mechanisms such as a password, PIN, fingerprint, iris, or face. But these authentication types are prone to a wide attack vector and, worse still, once compromised, fail to protect the user's account and data: a patterned password is prone to smudge attacks, fingerprint scanning is prone to spoof attacks, and other forms of attack include video capture and shoulder surfing. Given the increasingly important roles smartphones play in e-commerce and other operations where security is crucial, there is a strong need for continuous authentication mechanisms that complement and enhance one-time authentication, so that even if authentication at the point of login is compromised, the device remains unobtrusively protected by additional security measures in a continuous fashion. In contrast to one-time authentication, continuous authentication based on traits of human behavior can offer additional security measures to guard against unauthorized users even after entry-point authentication has been compromised. To this end, we collected a new data set of multiple behavioral biometric modalities from 49 users filling out an account recovery form while seated, using an Android app. The modalities include motion events (acceleration and angular velocity), touch and swipe events, keystrokes, and pattern tracing. In this paper, we focus on authentication based on motion events by evaluating a set of score-level fusion techniques to authenticate users from the acceleration and angular velocity data. The best EERs of 2.4% and 6.9% for intra- and inter-session, respectively, are achieved by fusing acceleration and angular velocity using Nandakumar et al.'s likelihood ratio (LR) based score fusion. (Ray, A., Hou, D., Schuckers, S. and Barbir, A. Continuous Authentication based on Hand Micro-movement during Smartphone Form Filling by Seated Human Subjects. In Proceedings of the 7th International Conference on Information Systems Security and Privacy (ICISSP 2021), pages 424-431. DOI: 10.5220/0010225804240431.)
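Item 1 reports equal error rates (EER) from score-level fusion of acceleration and angular-velocity match scores. Below is a minimal sketch of how an EER is typically computed from lists of genuine and impostor scores, with a simple weighted-sum fusion as a stand-in for the likelihood-ratio fusion the paper actually uses; the function names and the higher-score-means-more-genuine convention are assumptions for illustration.

```python
def eer(genuine: list[float], impostor: list[float]) -> float:
    """Equal error rate: sweep thresholds and return the operating point
    where the false accept rate (FAR) and false reject rate (FRR) meet."""
    best_gap, best_rate = float("inf"), 1.0
    for thr in sorted(set(genuine) | set(impostor)):
        far = sum(s >= thr for s in impostor) / len(impostor)  # impostors accepted
        frr = sum(s < thr for s in genuine) / len(genuine)     # genuine users rejected
        if abs(far - frr) < best_gap:
            best_gap, best_rate = abs(far - frr), (far + frr) / 2
    return best_rate

def fuse(accel_score: float, gyro_score: float, w: float = 0.5) -> float:
    """Weighted-sum score fusion (a simplified stand-in; LR fusion would
    instead weight scores by estimated genuine/impostor score densities)."""
    return w * accel_score + (1 - w) * gyro_score
```

At an EER of 2.4%, the system falsely accepts impostors and falsely rejects genuine users at the same 2.4% rate; lower is better.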
  2. As it becomes commonplace to use mobile devices to store personal and sensitive data, mobile user authentication (MUA) methods have witnessed significant advancement to improve data and device security. On the other hand, traditional MUA methods such as passwords (or passcodes) are still widely deployed. Despite the growing body of knowledge on the technical strengths and security vulnerabilities of various MUA methods, the perception of mobile users may be different, and it can play a decisive role in MUA adoption. Additionally, user preferences for MUA methods may be subject to the influence of demographic factors and device types. Furthermore, the pervasive use of mobile devices has generated many situations that create new usability and security needs for MUA methods, such as support for one-handed and/or sight-free interaction. This study investigates user perception and situational needs of MUA methods using a survey questionnaire. The research findings can guide the design and selection of MUA methods.
  3. Mobile user authentication (MUA) has become a gatekeeper for securing the wealth of personal and sensitive information residing on mobile devices. Keystrokes and touch gestures are two types of touch behaviors. It is not uncommon for a mobile user to make multiple MUA attempts, yet there is a lack of empirical comparison of different types of touch dynamics-based MUA methods across attempts. In view of the richness of touch dynamics, a large number of features have been extracted from it to build MUA models. However, there is little understanding of which features are important for the performance of such MUA models. Further, the training sample size for template generation is critical for real-world application of MUA models, but such information is lacking for touch gesture-based methods. This study aims to address the above research limitations by conducting experiments using two MUA prototypes. The empirical results can not only serve as a guide for the design of touch dynamics-based MUA methods but also offer suggestions for improving the performance of MUA models.
  4. Wearable computing devices have become increasingly popular, and while these devices promise to improve our lives, they come with new challenges. One such device is the Google Glass, from which data can be stolen easily because touch gestures can be intercepted from the head-mounted device. This paper focuses on analyzing and combining two behavioral metrics, namely head movement (captured through the Glass) and torso movement (captured through a smartphone), to build a continuous authentication system that can be used on Google Glass alone or by pairing it with a smartphone. We performed a correlation analysis among the features of these two metrics and found that very little correlation exists between the features extracted from head and torso movements in most scenarios (sets of activities). This led us to combine the two metrics to perform authentication. We built an authentication system using these metrics and compared performance across scenarios. We obtained an EER of less than 6% when authenticating a user with head movements alone in one scenario, and an EER of less than 5% when authenticating with both head and torso movements in general.
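Item 4's decision to fuse head and torso movement rests on their features being weakly correlated (correlated modalities add little new discriminative information). A minimal sketch of that check using the Pearson correlation coefficient; the feature series in the usage assertions are made-up illustrative values, not data from the paper.

```python
from math import sqrt

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient between two equal-length feature series.
    Returns a value in [-1, 1]; values near 0 indicate little linear correlation."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A near-zero coefficient between a head-movement feature and a torso-movement feature is the kind of evidence that motivates combining the two modalities.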
  5. The privacy of users and information is becoming increasingly important with the growth and pervasive use of mobile devices such as wearables, mobile phones, drones, and Internet of Things (IoT) devices. Today many of these mobile devices are equipped with cameras that enable users to take pictures and record videos whenever they need to. In many such cases, bystanders' privacy is not a concern, and as a result, audio and video of bystanders are often captured without their consent. We present results from a user study in which 21 participants were asked to use a wearable system called FacePET, developed to enhance bystanders' facial privacy by providing a way for bystanders to protect their own privacy rather than relying on external systems for protection. While past works in the literature have focused on privacy perceptions of bystanders when photographed in public or shared spaces, there has been no research focused on user perceptions of bystander-based wearable devices to enhance privacy. Thus, in this work, we focus on user perceptions of the FacePET device and similar wearables for enhancing bystanders' facial privacy. In our study, we found that 16 participants would use FacePET or similar devices to enhance their facial privacy, and 17 participants agreed that if smart glasses had features to conceal users' identities, those glasses would become more popular.