
Search results for Creators/Authors containing: "Atashzar, S. Farokh"

Note: Clicking a Digital Object Identifier (DOI) takes you to an external site maintained by the publisher. Some full-text articles may not be available free of charge during the embargo period.


  1. Free, publicly-accessible full text available April 1, 2024
  2. Free, publicly-accessible full text available October 23, 2023
  3. Free, publicly-accessible full text available November 1, 2023
  4. While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR.
    Free, publicly-accessible full text available December 1, 2023
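    The decoding pipeline in the abstract above can be caricatured with a toy example. This is a minimal sketch, not the paper's method: it assumes windowed RMS-style features from 8 forearm sEMG channels and substitutes a linear ridge-regression readout for the paper's neural network; all data, dimensions, and the regularizer `lam` are synthetic and hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for windowed sEMG features: 8 forearm channels,
    # one feature per channel per time window (all values hypothetical).
    n_windows, n_channels, n_fingers = 500, 8, 5
    X = rng.standard_normal((n_windows, n_channels))

    # Hypothetical linear mixing from muscle activity to finger-wise
    # forces, plus measurement noise (a real decoder would be nonlinear).
    W_true = rng.standard_normal((n_channels, n_fingers))
    y = X @ W_true + 0.05 * rng.standard_normal((n_windows, n_fingers))

    # Ridge regression as a minimal learned decoder: closed-form solve of
    # (X^T X + lam*I) W = X^T y.
    lam = 1e-2
    W_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_channels), X.T @ y)

    # Mean absolute decoding error over all windows and fingers.
    mean_abs_err = float(np.mean(np.abs(X @ W_hat - y)))
    print(f"mean abs error: {mean_abs_err:.3f}")
    ```

    On this synthetic data the residual error is on the order of the injected noise; the 3.3% figure reported in the abstract comes from the authors' real neural-network model and human data, not from this sketch.
    
    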
  5. Free, publicly-accessible full text available August 31, 2023
  6. Free, publicly-accessible full text available October 1, 2023
  7. Free, publicly-accessible full text available October 1, 2023
  8. Free, publicly-accessible full text available July 1, 2023
  9. Abstract

    Sensory information is critical for motor coordination. However, understanding sensorimotor integration is complicated, especially in individuals with impairment due to injury to the central nervous system. This research presents a novel functional biomarker, based on a nonlinear network graph of muscle connectivity, called InfoMuNet, to quantify the role of sensory information in motor performance. Thirty-two individuals with post-stroke hemiparesis performed a grasp-and-lift task, while their muscle activity from 8 muscles in each arm was measured using surface electromyography. Subjects performed the task with their affected hand before and after sensory exposure to the task performed with the less-affected hand. For the first time, this work shows that InfoMuNet robustly quantifies changes in functional muscle connectivity in the affected hand after exposure to sensory information from the less-affected side. More than 90% of the subjects showed the improvement resulting from this sensory exposure. InfoMuNet also shows high sensitivity to tactile, kinesthetic, and visual input alterations at the subject level, highlighting its potential use in precision rehabilitation interventions.
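A nonlinear network graph of muscle connectivity, as described in the abstract above, can be illustrated with a toy pairwise mutual-information graph. This is a hedged sketch under stated assumptions, not the InfoMuNet algorithm: the EMG envelopes are synthetic with a shared common-drive component, and the histogram MI estimator with 16 bins is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(1)

def mutual_information(x, y, bins=16):
    """Histogram estimate of mutual information (in nats) between two signals."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Synthetic stand-in for rectified EMG envelopes from 8 muscles, each a mix
# of a shared "common drive" and independent noise (weights hypothetical).
n_muscles, n_samples = 8, 2000
shared = rng.standard_normal(n_samples)
emg = np.array([0.6 * shared + 0.4 * rng.standard_normal(n_samples)
                for _ in range(n_muscles)])

# Symmetric connectivity matrix: one MI value per muscle pair; the diagonal
# is left at zero since self-connectivity is not informative here.
conn = np.zeros((n_muscles, n_muscles))
for i in range(n_muscles):
    for j in range(i + 1, n_muscles):
        conn[i, j] = conn[j, i] = mutual_information(emg[i], emg[j])

print(f"mean pairwise MI: {conn[np.triu_indices(n_muscles, k=1)].mean():.3f}")
```

Comparing such a matrix before and after sensory exposure, as the study does across conditions, is what turns the graph into a functional biomarker; the specific network measures used by InfoMuNet are described in the paper itself.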