Title: Effects of Sensor Setup Time and Comfort on User Experience in Physiological Computing
Physiological sensors are commonly used for user state monitoring and consequent machine behavior adaptation in applications such as rehabilitation and intelligent cars. While more accurate user state monitoring is known to lead to better user experience, increased accuracy often requires more sensors or more complex sensors. The increased setup time and discomfort involved in using such sensors may itself worsen user experience. To examine this effect, we conducted a study where 72 participants interacted with a computer-based multitasking scenario whose difficulty was periodically adapted, ostensibly based on data from either a remote eye tracker or a lab-grade “wet” electroencephalography sensor. Deception was used to ensure consistent difficulty adaptation accuracies, and user experience was measured with the Intrinsic Motivation Inventory, the NASA Task Load Index, and an ad-hoc scale. We found few user experience differences between the eye tracker and the electroencephalography sensor: while one interaction effect was noted, it was small, and there were no other differences. This result is at first surprising and seems to indicate that comfort and setup time are not major factors in laboratory-based user experience evaluations of such technologies. However, the result is likely due to a suboptimal study protocol in which each participant interacted with only one sensor. In future work, we will use an alternate protocol to further explore the effects of user comfort and setup time on user experience.
Award ID(s):
2151464
PAR ID:
10579588
Publisher / Repository:
IEEE
ISBN:
978-1-6654-1020-5
Page Range / eLocation ID:
3776 to 3780
Format(s):
Medium: X
Location:
Kuching, Malaysia
Sponsoring Org:
National Science Foundation
More Like this
  1. In affective computing, classification algorithms are used to recognize users’ psychological states and adapt tasks to optimize user experience. However, classification is never perfect, and the relationship between adaptation accuracy and user experience remains understudied. It is also unclear whether the adaptation magnitude (the ‘size’ of the action taken to influence user states) influences the effects of adaptation accuracy. To evaluate the impacts of adaptation accuracy (appropriate vs. inappropriate actions) and magnitude on user experience, we conducted a ‘Wizard of Oz’ study where 112 participants interacted with the Multi-Attribute Task Battery over three 11-minute intervals. An adaptation accuracy (50% to 80%) was preassigned for the first 11-minute interval, and accuracy increased by 10% in each subsequent interval. Task difficulty changed every minute, and participant preferences for difficulty changes were assessed at the same time. Adaptation accuracy was artificially induced by fixing the percentage of times the difficulty changes matched participant preferences. Participants were also randomized to two magnitude conditions, with difficulty modified by 1 (low) or 3 (high) levels each minute. User experience metrics were assessed after each interval. Analysis with latent growth models supported linear increases in user experience across increasing levels of adaptation accuracy. For each 10% gain in accuracy, results indicate a 1.3 (95% CI [0.35, 2.20]) point increase in NASA Task Load Index scores (range 6–60), a 0.40 (95% CI [0.18, 0.57]) increase in effort/importance (range 2–14), and a 0.48 (95% CI [0.24, 0.72]) increase in perceived competence (range 2–14). Furthermore, the effect of accuracy on Task Load Index scores was modulated by adaptation magnitude. No effects were observed for interest/enjoyment or pressure/tension.
By providing quantitative estimates of effects of adaptation accuracy on user experience, the study provides guidelines for researchers and developers of affect-aware technologies. Furthermore, our methods could be adapted for use in other affective computing scenarios. 
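The accuracy-induction mechanism described above (fixing the percentage of difficulty changes that match participant preferences) can be sketched in a few lines. This is a minimal illustration, not the study's actual code; the function name, the "up"/"down" encoding, and the per-minute framing are assumptions:

```python
import random

def induced_adaptations(preferences, target_accuracy, seed=0):
    """Wizard-of-Oz accuracy induction: return one difficulty-change
    decision per minute such that exactly round(target_accuracy * n)
    decisions match the participant's stated preference.

    preferences: e.g. ["up", "down", ...], one entry per minute
    target_accuracy: fraction of appropriate actions (0.5 to 0.8 here)
    """
    rng = random.Random(seed)
    n = len(preferences)
    # Pick which minutes receive a preference-matching adaptation.
    match_minutes = set(rng.sample(range(n), round(target_accuracy * n)))
    return [pref if i in match_minutes                     # appropriate action
            else ("down" if pref == "up" else "up")        # inappropriate action
            for i, pref in enumerate(preferences)]
```

Fixing the match count directly, rather than sampling matches probabilistically, guarantees every participant in a condition experiences the same realized accuracy.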
  2. Blascheck, Tanja; Bradshaw, Jessica; Vrzakova, Hana (Ed.)
    Virtual Reality (VR) technology has advanced to include eye-tracking, allowing novel research, such as investigating how our visual system coordinates eye movements with changes in perceptual depth. The purpose of this study was to examine whether eye tracking could track perceptual depth changes during a visual discrimination task. We derived two depth-dependent variables from eye tracker data: eye vergence angle (EVA) and interpupillary distance (IPD). As hypothesized, our results revealed that shifting gaze from near-to-far depth significantly decreased EVA and increased IPD, while the opposite pattern was observed while shifting from far-to-near. Importantly, the amount of change in these variables tracked closely with relative changes in perceptual depth, and supported the hypothesis that eye tracker data may be used to infer real-time changes in perceptual depth in VR. Our method could be used as a new tool to adaptively render information based on depth and improve the VR user experience. 
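The depth dependence of eye vergence angle can be illustrated with a simple geometric toy model: the angle between the two eyes' lines of sight to a fixated 3D point grows as the point moves nearer. This is only a sketch of the geometry, not the paper's analysis pipeline:

```python
import numpy as np

def eye_vergence_angle(left_eye, right_eye, target):
    """Angle (radians) between the two eyes' gaze vectors toward a
    fixated 3D target point. A nearer target yields a larger angle."""
    l = np.asarray(target, float) - np.asarray(left_eye, float)
    r = np.asarray(target, float) - np.asarray(right_eye, float)
    cos_a = np.dot(l, r) / (np.linalg.norm(l) * np.linalg.norm(r))
    # Clip guards against floating-point values slightly outside [-1, 1].
    return float(np.arccos(np.clip(cos_a, -1.0, 1.0)))
```

With eyes 64 mm apart, a target at 0.5 m produces a vergence angle roughly ten times larger than one at 5 m, which is the direction of effect the study reports for near-to-far gaze shifts.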
  3.
    Perspiration level monitoring enables numerous applications such as physical condition estimation, personal comfort monitoring, health/exercise monitoring, and inference of the user's environmental conditions. Prior works on perspiration (sweat) sensing require users to manually hold a device or attach adhesive sensors directly onto their skin, limiting user mobility and comfort. In this paper, we present a low-cost and novel wearable sensor system that can accurately estimate an individual's sweat level by measuring moisture. The sensor is designed in a threadlike form factor, allowing it to be sewn into the seams of clothing rather than acting as a standalone sensor that the user must attach to their body. The system comprises multiple cotton-covered conductive threads that are braided into one sensor. When a person sweats, the resistance between the braided conductive threads changes as moisture becomes trapped in the cotton covering of the threads. The braided three-dimensional structure allows for robust estimation of perspiration level in the presence of external forces that may cause sensor distortion, such as motion. We characterize the relationship between the volume of sweat and the measured resistance between the braided threads. Finally, we weave our sensors into the fabric of a shirt and conduct on-body experiments to study users' sweating levels through various activities.
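Once the volume-to-resistance relationship is characterized, sweat level can be estimated by inverting the calibration curve. The sketch below uses entirely hypothetical calibration numbers (the paper's actual curve is not reproduced here); the only property taken from the abstract is that trapped moisture lowers resistance, so the curve decreases monotonically:

```python
import numpy as np

# Hypothetical calibration points: applied sweat volume (uL) vs. measured
# resistance (kOhm) between the braided threads. More trapped moisture
# means lower resistance, so resistance decreases as volume increases.
CAL_VOLUME_UL = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
CAL_RESISTANCE_KOHM = np.array([900.0, 400.0, 180.0, 70.0, 25.0])

def estimate_sweat_volume(resistance_kohm):
    """Invert the calibration curve by linear interpolation.
    np.interp requires increasing x values, so both arrays are reversed."""
    return float(np.interp(resistance_kohm,
                           CAL_RESISTANCE_KOHM[::-1],
                           CAL_VOLUME_UL[::-1]))
```

In practice, per-garment calibration would be needed, since thread length and cotton coverage shift the curve.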
  4. Understanding how individuals focus and perform visual searches during collaborative tasks can help improve user engagement. Eye tracking measures provide informative cues for such understanding. This article presents A-DisETrac, an advanced analytic dashboard for distributed eye tracking. It uses off-the-shelf eye trackers to monitor multiple users in parallel, compute both traditional and advanced gaze measures in real time, and display them on an interactive dashboard. In two pilot studies, the system was evaluated in terms of user experience and utility and compared with existing work. The system was also used to study how advanced gaze measures such as the ambient-focal coefficient K and the real-time index of pupillary activity relate to collaborative behavior. We observed that the time a group took to complete a puzzle was related to its ambient visual scanning behavior: groups that spent more time exhibited more scanning. User experience questionnaire results suggest that the dashboard provides a comparatively good user experience.
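The ambient-focal coefficient K mentioned above contrasts each fixation's standardized duration with the standardized amplitude of the saccade that follows it: positive values indicate focal viewing (long fixations, short saccades), negative values ambient scanning. A minimal sketch, with the simplifying assumption that z-scores are computed over the supplied samples rather than a longer baseline:

```python
import numpy as np

def coefficient_k_series(fixation_durations, saccade_amplitudes):
    """Per-fixation ambient-focal coefficient:
    K_i = z(duration_i) - z(amplitude of the saccade after fixation i).
    Expects n fixation durations and n-1 saccade amplitudes (saccade i
    follows fixation i); returns the n-1 values of K_i."""
    d = np.asarray(fixation_durations, float)
    a = np.asarray(saccade_amplitudes, float)
    zd = (d - d.mean()) / d.std()   # standardized durations
    za = (a - a.mean()) / a.std()   # standardized amplitudes
    return zd[:-1] - za             # K_i for i = 1..n-1
```

Averaging K_i over a sliding window gives the kind of real-time ambient/focal signal the dashboard displays; a trial with long fixations and small saccades early on, then short fixations and large saccades, yields positive early K values and negative late ones.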
  5. We introduce WebGazer, an online eye tracker that uses common webcams already present in laptops and mobile devices to infer the eye-gaze locations of web visitors on a page in real time. The eye tracking model self-calibrates by watching web visitors interact with the web page and trains a mapping between features of the eye and positions on the screen. This approach aims to provide a natural experience to everyday users that is not restricted to laboratories and highly controlled user studies. WebGazer has two key components: a pupil detector that can be combined with any eye detection library, and a gaze estimator using regression analysis informed by user interactions. We perform a large remote online study and a small in-person study to evaluate WebGazer. The findings show that WebGazer can learn from user interactions and that its accuracy is sufficient for approximating the user's gaze. As part of this paper, we release the first eye tracking library that can be easily integrated in any website for real-time gaze interactions, usability studies, or web research. 
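WebGazer itself is a JavaScript library; purely to illustrate the self-calibration idea (pair each interaction point, such as a click, with the eye features observed at that moment, then regress screen position on features), here is a hypothetical Python sketch. The class name, feature representation, and ridge parameter are all assumptions:

```python
import numpy as np

class GazeRegressor:
    """Self-calibrating gaze mapping in the spirit of WebGazer: user
    interactions pair an eye-feature vector with a known screen (x, y),
    and ridge regression maps features to gaze position. Pupil/eye
    feature extraction is assumed to happen elsewhere."""
    def __init__(self, ridge=1e-3):
        self.ridge = ridge
        self.features, self.targets = [], []

    def add_interaction(self, eye_features, screen_xy):
        # Append a bias term so the fit includes an offset.
        self.features.append(np.append(eye_features, 1.0))
        self.targets.append(screen_xy)

    def fit(self):
        X = np.array(self.features)
        Y = np.array(self.targets)
        # Ridge solution: W = (X^T X + lambda*I)^-1 X^T Y
        A = X.T @ X + self.ridge * np.eye(X.shape[1])
        self.W = np.linalg.solve(A, X.T @ Y)

    def predict(self, eye_features):
        return np.append(eye_features, 1.0) @ self.W
```

Because calibration data arrives continuously as the visitor clicks and types, the model can be refit incrementally, which is what lets such a tracker improve without an explicit calibration routine.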