
Search for: All records

Award ID contains: 1831969

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. BACKGROUND: Today, various assistive applications (apps) running on smartphones, such as Seeing AI, TapTapSee, and BeMyEyes, have been introduced. These apps are designed to assist people with visual impairment in navigating unfamiliar environments, reading text, and identifying objects and persons. Yet, little is known about how those with visual impairment perceive these apps. OBJECTIVE: This study aims to advance knowledge of user experience with these assistive apps. METHODS: To address the knowledge gap, this study conducted phone interviews with a convenience sample of 30 individuals with visual impairment. RESULTS: The results indicated that those with visual impairment had a range of preferences, needs, and concerns about the user interfaces of and interactions with the assistive apps. DISCUSSION: Given these needs and concerns, this study offered a set of facilitators to promote user adoption of the assistive apps, which should provide valuable guidance to user interface and interaction designers in the field.
    Free, publicly-accessible full text available August 30, 2023
  2. Today, various sensor technologies have been introduced to help people keep track of their activities of daily living. For example, a wide range of sensors has been integrated into applications such as smart homes, mobile emergency response systems, and fall detection systems. Sensor technologies have also been employed in clinical settings to monitor early signs or the onset of Alzheimer’s disease, dementia, sleep disorders, and heart rate problems. However, comprehensive reviews of these technologies, which would be especially valuable for early-career scholars developing research interests in this area, have received little attention. This paper reviewed existing sensor technologies across several dimensions, including sensor features, data of interest, sensor locations, and the number of sensors. For instance, sensor technologies provided features that enabled people to monitor the biomechanics of human movement (e.g., walking speed), use of household goods (e.g., switching home appliances on and off), sounds (e.g., sounds in a particular room), and surrounding environments (e.g., temperature and humidity). Sensor technologies were widely used to examine various data, such as biomarkers for health, dietary habits, leisure activities, and hygiene status. Sensors were installed in various locations to cover wide-open areas (e.g., ceilings, walls, and hallways), specific areas (e.g., a bedroom or a dining room), and specific objects (e.g., mattresses and windows). Different sets of sensors were employed to keep track of activities of daily living, ranging from a single sensor to multiple sensors placed throughout the home. This comprehensive review of sensor technology implementations is anticipated to help researchers and professionals design, develop, and use sensor technology applications appropriately in target users’ contexts by promoting safety, usability, and accessibility.

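    The review above organizes prior sensor work along four dimensions: sensor features, data of interest, sensor locations, and the number of sensors. As a minimal sketch of that categorization only, the hypothetical Python data model below shows how a single deployment could be described along those dimensions; the names and example values are illustrative and are not drawn from any reviewed system.

    # Illustrative only: a hypothetical data model for the review's four
    # categorization dimensions (feature, data of interest, location, count).
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class SensorDeployment:
        feature: str           # e.g., "movement", "appliance use", "sound", "environment"
        data_of_interest: str  # e.g., "walking speed", "temperature and humidity"
        location: str          # e.g., "hallway ceiling", "bedroom", "mattress"
        sensor_count: int      # a single sensor vs. multiple sensors in the home

    deployments: List[SensorDeployment] = [
        SensorDeployment("movement", "walking speed", "hallway ceiling", 3),
        SensorDeployment("environment", "temperature and humidity", "bedroom", 1),
    ]

    for d in deployments:
        print(f"{d.feature}: {d.data_of_interest} ({d.location}, {d.sensor_count} sensor(s))")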
  3. People can convey their spontaneous and voluntary emotions via facial expressions, which play a critical role in social interactions. However, less is known about the mechanisms of spontaneous emotion expression, especially in adults with visual impairment and blindness. Nineteen adults with visual impairment and blindness participated in interviews in which their spontaneous facial expressions were observed and analyzed via the Facial Action Coding System (FACS). We found a set of Action Units primarily engaged in expressing spontaneous emotions, which were likely affected by participants’ individual characteristics. The results of this study could serve as evidence that adults with visual impairment and blindness show individual differences in spontaneous facial expressions of emotions.
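    For context on the Facial Action Coding System mentioned above: FACS describes facial behavior as combinations of numbered Action Units (AUs). The snippet below lists two commonly cited AU combinations as a minimal illustration of what coded output can look like; it is not the coding protocol or a finding of the study.

    # Illustrative only: commonly cited Action Unit (AU) combinations.
    EMOTION_AU_EXAMPLES = {
        "happiness (Duchenne smile)": ["AU6 cheek raiser", "AU12 lip corner puller"],
        "sadness": ["AU1 inner brow raiser", "AU4 brow lowerer", "AU15 lip corner depressor"],
    }

    for emotion, aus in EMOTION_AU_EXAMPLES.items():
        print(f"{emotion}: {', '.join(aus)}")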
  4. Despite significant vision loss, humans can still recognize various emotional stimuli through the sense of hearing and express diverse emotional responses, which can be sorted into two dimensions, arousal and valence. Yet, many research studies have focused on sighted people, leading to a lack of knowledge about the emotion perception mechanisms of people with visual impairment. This study aims to advance knowledge of the degree to which people with visual impairment perceive various emotions, spanning high/low arousal and positive/negative valence. A total of 30 individuals with visual impairment participated in interviews in which they listened to stories of people who became visually impaired and encountered and overcame various challenges, and they were then asked to share their emotions. Participants perceived different kinds and intensities of emotions depending on demographic variables such as living alone, loneliness, onset of visual impairment, visual acuity, race/ethnicity, and employment status. This advanced knowledge of emotion perception in people with visual impairment is anticipated to contribute to the better design of social supports that can adequately accommodate those with visual impairment.
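    The two dimensions named above, arousal and valence, are the axes of a common two-dimensional emotion model. The small sketch below only illustrates how a rating could be placed into one of the four arousal/valence quadrants; the values are hypothetical and not data from the study.

    # Illustrative only: classifying a hypothetical rating by valence and arousal.
    def quadrant(valence: float, arousal: float) -> str:
        v = "positive" if valence >= 0 else "negative"
        a = "high" if arousal >= 0 else "low"
        return f"{v} valence, {a} arousal"

    print(quadrant(0.7, 0.6))    # e.g., an excitement-like rating
    print(quadrant(-0.6, -0.4))  # e.g., a sadness-like rating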
  5. In response to the novel coronavirus (COVID-19) pandemic, public health interventions such as social distancing and stay-at-home orders have been widely implemented and are anticipated to help reduce the spread of COVID-19. At the same time, there is a concern that these public health interventions may increase levels of loneliness. Loneliness and social isolation are public health risks that are closely associated with serious medical conditions. Because COVID-19 is still new, little is known about emotional well-being among people with visual impairment during the pandemic. To address this knowledge gap, this study conducted phone interviews with a convenience sample of 31 people with visual impairment. The interviews incorporated the University of California, Los Angeles (UCLA) Loneliness Scale (version 3) and the Trait Meta-Mood Scale (TMMS) to measure loneliness and emotional intelligence skills, respectively. This study found that people with visual impairment were vulnerable to loneliness during the COVID-19 pandemic and showed individual differences in emotional intelligence skills across different degrees of loneliness. Researchers and health professionals should consider offering adequate coping strategies to those with visual impairment amid the COVID-19 pandemic.
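    For context on the first instrument named above: the UCLA Loneliness Scale (version 3) is a 20-item questionnaire in which each item is rated on a 4-point frequency scale (1 to 4), some items are reverse-scored, and the ratings are summed, giving totals from 20 to 80. The sketch below only illustrates that scoring pattern; the set of reverse-scored items is left as a placeholder to be filled in from the published scoring key, and nothing here reflects the study's data.

    # Illustrative only: a scoring sketch for a 20-item, 1-4 scale questionnaire
    # with reverse-scored items, in the style of the UCLA Loneliness Scale v3.
    from typing import Dict, Set

    REVERSED: Set[int] = set()  # placeholder: fill in from the official scoring key

    def loneliness_score(responses: Dict[int, int]) -> int:
        """responses maps item number (1-20) to a rating from 1 to 4."""
        assert len(responses) == 20 and all(1 <= r <= 4 for r in responses.values())
        return sum(5 - r if item in REVERSED else r for item, r in responses.items())

    print(loneliness_score({i: 2 for i in range(1, 21)}))  # totals range from 20 to 80; this prints 40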