- Award ID(s):
- 2047189
- PAR ID:
- 10390283
- Date Published:
- Journal Name:
- ACM Transactions on Computer-Human Interaction
- Volume:
- 29
- Issue:
- 3
- ISSN:
- 1073-0516
- Page Range / eLocation ID:
- 1 to 27
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Face touch is an unconscious human habit. Frequent touching of sensitive/mucosal facial zones (eyes, nose, and mouth) increases health risks by passing pathogens into the body and spreading diseases. Accurate monitoring of face touch is also critical for behavioral intervention. Existing monitoring systems only capture objects approaching the face rather than detecting actual touches, so they are prone to false positives when a hand or object moves near the face (e.g., picking up a phone). We present FaceSense, an ear-worn system that identifies actual touches and distinguishes touches to sensitive/mucosal areas from touches to other facial areas. Following a multimodal approach, FaceSense integrates low-resolution thermal images and physiological signals: thermal sensors sense the thermal infrared signal emitted by an approaching hand, while physiological sensors monitor impedance changes caused by skin deformation during a touch. Processed thermal and physiological signals are fed into a deep learning model (TouchNet) to detect touches and identify the facial zone of each touch. We fabricated prototypes using off-the-shelf hardware and conducted experiments with 14 participants while they performed various daily activities (e.g., drinking, talking). Results show a macro-F1-score of 83.4% for touch detection with leave-one-user-out cross-validation and a macro-F1-score of 90.1% for touch zone identification with a personalized model.
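The key idea in the abstract above is that fusing two modalities suppresses proximity false positives: the thermal channel fires for any approaching object, while the impedance channel fires only on actual skin deformation. A minimal toy sketch of that fusion logic (thresholds and the `detect_touch` helper are invented for illustration, not from the paper):

```python
# Hypothetical sketch of FaceSense-style multimodal fusion: a touch is
# declared only when both the thermal channel (approaching hand) and the
# physiological channel (skin deformation) agree. Thresholds are invented.

THERMAL_THRESHOLD = 0.6      # normalized thermal-IR intensity near the face
IMPEDANCE_THRESHOLD = 0.3    # normalized impedance change from skin deformation

def detect_touch(thermal: float, impedance_delta: float) -> bool:
    """Return True only when both modalities indicate an actual touch."""
    approaching = thermal > THERMAL_THRESHOLD
    deformed = impedance_delta > IMPEDANCE_THRESHOLD
    return approaching and deformed

# A phone lifted near the face raises the thermal signal but not the
# impedance signal, so it is (correctly) not counted as a touch.
print(detect_touch(0.8, 0.05))  # proximity only -> False
print(detect_touch(0.8, 0.45))  # actual touch   -> True
```

The actual system feeds both processed signal streams into TouchNet rather than thresholding them, but the AND-style complementarity of the two sensors is what removes the false positives described above.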
-
Mobile user authentication (MUA) has become a gatekeeper for securing the wealth of personal and sensitive information residing on mobile devices. Keystrokes and touch gestures are two types of touch behavior. It is not uncommon for a mobile user to make multiple MUA attempts, yet there has been no empirical comparison of different touch-dynamics-based MUA methods across attempts. Given the richness of touch dynamics, a large number of features have been extracted from it to build MUA models, but there is little understanding of which features matter for the performance of such models. Further, the training sample size for template generation is critical for real-world application of MUA models, yet such information is lacking for touch-gesture-based methods. This study addresses these limitations through experiments with two MUA prototypes. The empirical results can serve as a guide for the design of touch-dynamics-based MUA methods and offer suggestions for improving the performance of MUA models.
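As an illustration of the "richness of touch dynamics" mentioned above, two features commonly extracted from keystroke timing are dwell time (how long a key is held) and flight time (the gap between releasing one key and pressing the next). The feature names and event format below are a generic sketch, not the paper's actual feature set:

```python
# Illustrative keystroke-dynamics feature extraction (generic, not the
# paper's feature set). Each event is a (key_down_time, key_up_time)
# pair in milliseconds.

def keystroke_features(events):
    """Return (dwell_times, flight_times) for a sequence of keystrokes."""
    # Dwell: how long each key was held down.
    dwell = [up - down for down, up in events]
    # Flight: gap between releasing one key and pressing the next.
    flight = [events[i + 1][0] - events[i][1] for i in range(len(events) - 1)]
    return dwell, flight

events = [(0, 90), (150, 230), (310, 395)]
dwell, flight = keystroke_features(events)
print(dwell)   # [90, 80, 85]
print(flight)  # [60, 80]
```

Vectors like these, accumulated over many samples, are what template-generation sample-size questions in the study refer to: how many such samples must be enrolled before a model discriminates reliably.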
-
Communication during touch provides a seamless and natural way of interaction between humans and ambient intelligence. Current techniques that couple wireless transmission with touch detection suffer from problems of selectivity and security: they cannot ensure communication only through direct touch rather than through close proximity. We present BodyWire-HCI, which utilizes the human body as a wire-like communication channel to enable human-computer interaction and, for the first time, demonstrates selective and physically secure communication strictly during touch. Signal leakage out of the body is minimized by a novel, low-frequency Electro-QuasiStatic Human Body Communication (EQS-HBC) technique that enables interaction only when there is a conductive communication path between the transmitter and receiver through the human body. Design techniques such as capacitive termination and voltage-mode operation minimize the human body channel loss, allowing operation at low frequencies and enabling EQS-HBC. The demonstrations highlight the impact of BodyWire-HCI in enabling new human-machine interaction modalities for a variety of application scenarios, such as secure authentication (e.g., opening a door and pairing a smart device) and information exchange (e.g., payment, image, medical data, and personal profile transfer) through touch (https://www.youtube.com/watch?v=Uwrig2XQIH8).
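The selectivity property described above can be caricatured with a toy channel model: at low (electro-quasistatic) frequencies the conductive body path carries the signal, while air-gap coupling is so heavily attenuated that nothing decodes without direct touch. All gains and the threshold below are invented numbers for illustration only, not measurements from the paper:

```python
# Toy model of EQS-HBC selectivity: bits decode only when the channel
# delivers them above a receiver threshold. Gains are illustrative.

TOUCH_GAIN = 0.5       # assumed body-channel gain during direct touch
AIRGAP_GAIN = 0.001    # assumed leakage gain through air at low frequency
DECODE_THRESHOLD = 0.1

def received(bits, gain):
    """Attenuate a bit stream by the channel gain."""
    return [b * gain for b in bits]

def decode(samples):
    """Recover bits only where the sample clears the receiver threshold."""
    return [1 if s > DECODE_THRESHOLD else 0 for s in samples]

bits = [1, 0, 1, 1]
print(decode(received(bits, TOUCH_GAIN)))   # [1, 0, 1, 1] -> link works on touch
print(decode(received(bits, AIRGAP_GAIN)))  # [0, 0, 0, 0] -> proximity decodes nothing
```

The real mechanism is analog circuit design (capacitive termination, voltage-mode operation), not software thresholding; the sketch only shows why a huge touch-versus-air gain gap yields touch-only communication.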
-
With the growing popularity of smartphones, continuous and implicit authentication of such devices via behavioral biometrics such as touch dynamics becomes an attractive option, especially when physical biometrics are challenging to utilize or their frequent, continuous use annoys the user. This paper presents a touchstroke authentication model based on several classification algorithms and compares their performance in authenticating legitimate smartphone users. The evaluation results suggest that it is possible to achieve comparable authentication accuracies, with an average accuracy of 91% for the best-performing model. This research is supervised by Dr. Debzani Deb (debd@wssu.edu), Department of Computer Science at Winston-Salem State University, NC.
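At its simplest, a touchstroke authenticator of the kind described above enrolls a per-user template from training samples and accepts a new sample when it is close enough to that template. The distance-based scheme, feature values, and threshold below are a minimal sketch under invented assumptions, not any of the paper's actual classifiers:

```python
# Minimal distance-based touchstroke authenticator sketch (illustrative,
# not the paper's models): enroll a template as the mean feature vector,
# then accept a new sample if its Euclidean distance to the template is small.
import math

def enroll(samples):
    """Build a template as the per-feature mean of enrollment samples."""
    n = len(samples)
    return [sum(col) / n for col in zip(*samples)]

def authenticate(template, sample, threshold=0.5):
    """Accept when the sample lies within `threshold` of the template."""
    return math.dist(template, sample) < threshold

# Hypothetical 2-feature touchstroke vectors (e.g., swipe speed, pressure).
legit = [[0.20, 1.10], [0.25, 1.00], [0.22, 1.05]]
template = enroll(legit)
print(authenticate(template, [0.23, 1.02]))  # similar behavior   -> True
print(authenticate(template, [0.90, 0.10]))  # impostor-looking   -> False
```

Real systems replace the fixed threshold and mean template with trained classifiers, which is exactly the comparison the paper carries out.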
-
We show a new type of side-channel leakage in which the built-in magnetometer sensor in Apple's mobile devices captures touch events of users. When a conductive material such as the human body touches the mobile device screen, electric current passes through the screen capacitors, generating an electromagnetic field around the touch point. This field causes a sharp fluctuation in the magnetometer signals when a touch occurs, both when the mobile device is stationary and when it is held naturally in hand. These signals can be accessed by mobile applications running in the background without requiring any permissions. We develop iSTELAN, a three-stage attack that exploits this side channel to infer users' application and touch data. iSTELAN translates the magnetometer signals into a binary sequence to reveal users' touch events, exploits touch event patterns to fingerprint the type of application in use, and models touch events to identify the touch event types performed in different applications. We demonstrate the iSTELAN attack on 22 users using 7 popular app types and show that it achieves an average accuracy of 90% for disclosing touch events, 74% for classifying application type, and 73% for detecting touch event types.
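The first iSTELAN stage described above, turning magnetometer readings into a binary touch sequence, can be sketched as flagging sharp sample-to-sample fluctuations. The threshold value and `touch_sequence` helper are invented for illustration; the paper's actual signal processing is more involved:

```python
# Hedged sketch of iSTELAN's first stage as described: translate
# magnetometer readings into a binary sequence by flagging sharp
# fluctuations caused by touches. The threshold is an invented value.

FLUCTUATION_THRESHOLD = 5.0  # assumed microtesla jump treated as a touch

def touch_sequence(readings):
    """Mark 1 wherever consecutive samples jump sharply, else 0."""
    return [1 if abs(b - a) > FLUCTUATION_THRESHOLD else 0
            for a, b in zip(readings, readings[1:])]

# Steady field, a spike while the finger touches, then steady again.
readings = [30.0, 30.2, 30.1, 41.5, 30.3, 30.2]
print(touch_sequence(readings))  # [0, 0, 1, 1, 0]
```

The later stages then mine this binary sequence for per-app timing patterns (application fingerprinting) and per-touch signatures (touch-event-type classification).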