This content will become publicly available on January 1, 2026
Hands-Free VR
- Award ID(s):
- 2309564
- PAR ID:
- 10582568
- Publisher / Repository:
- SCITEPRESS - Science and Technology Publications
- Date Published:
- ISSN:
- ISBN:
- 978-989-758-728-3
- Page Range / eLocation ID:
- 533 to 542
- Format(s):
- Medium: X
- Location:
- Porto, Portugal
- Sponsoring Org:
- National Science Foundation
More Like this
-
A novel wheelchair called PURE (Personalized Unique Rolling Experience) that uses hands-free (HF) torso lean-to-steer control has been developed for manual wheelchair users (mWCUs). PURE addresses limitations of current wheelchairs, such as the inability to use both hands for life experiences instead of propulsion. PURE uses a ball-based robot drivetrain to offer a compact, self-balancing, omnidirectional mobile device. A custom sensor system converts rider torso motions into direction and speed commands to control PURE, which is especially useful if a rider has minimal torso range of motion. We explored whether PURE's HF control performed as well as a traditional joystick (JS) human-robot interface, and whether mWCUs performed as well as able-bodied users (ABUs). Ten mWCUs and ten ABUs were trained and tested to drive PURE through courses replicating indoor settings. Each participant adjusted ride sensitivity settings for both HF and JS control. Repeated-measures MANOVA tests suggested that the number of collisions, completion time, NASA TLX scores (except physical demand), and index of performance were similar for HF and JS control and between mWCUs and ABUs for all sections. This suggests that PURE can be controlled effectively using only torso motion, leaving both hands free for other tasks during propulsion.
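The abstract does not specify PURE's control law, so the following is a minimal hypothetical sketch of a lean-to-steer mapping. It assumes torso pitch and roll angles from a sensor system, a dead zone so a neutral posture commands zero velocity, and a rider-adjustable sensitivity gain like the one participants tuned in the study; the angle limits and gains are illustrative, not the paper's values.

```python
import math

# Hypothetical lean-to-steer mapping; the actual PURE control law is not
# given in the abstract. Torso pitch drives forward/backward speed and
# torso roll drives lateral speed, with a dead zone around neutral posture.

DEAD_ZONE_RAD = math.radians(3.0)   # assumed neutral-posture tolerance
MAX_LEAN_RAD = math.radians(20.0)   # assumed lean angle for full speed

def lean_to_velocity(pitch_rad, roll_rad, sensitivity=1.0, v_max=1.0):
    """Map torso lean angles to omnidirectional velocity commands (m/s)."""
    def axis(angle):
        if abs(angle) < DEAD_ZONE_RAD:
            return 0.0
        # Scale the lean beyond the dead zone into [0, 1], then clamp.
        span = (abs(angle) - DEAD_ZONE_RAD) / (MAX_LEAN_RAD - DEAD_ZONE_RAD)
        return math.copysign(min(span, 1.0), angle)
    vx = sensitivity * v_max * axis(pitch_rad)  # forward/backward from pitch
    vy = sensitivity * v_max * axis(roll_rad)   # left/right from roll
    return vx, vy

# Example: a modest forward-right lean at 80% sensitivity.
print(lean_to_velocity(math.radians(10), math.radians(5), sensitivity=0.8))
```

A lower full-speed lean angle and a higher gain would serve riders with minimal torso range of motion, which is consistent with the per-rider sensitivity adjustment the abstract describes.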
-
Text correction on mobile devices usually requires precise and repetitive manual control. In this paper, we present EyeSayCorrect, an eye-gaze- and voice-based hands-free text correction method for mobile devices. To correct text with EyeSayCorrect, the user first uses their gaze location on the screen to select a word, then speaks the new phrase. EyeSayCorrect then infers the user's correction intention from these inputs and the text context. We used a Bayesian approach to determine the selected word given an eye-gaze trajectory: for each sampling point in the trajectory, the posterior probability of selecting each word is calculated and accumulated, and the target word is selected when its accumulated interest exceeds a threshold. Misspelt words are given higher priors. Our user studies showed that using priors for misspelt words reduced the task completion time by up to 23.79% and the text selection time by up to 40.35%, and that EyeSayCorrect is a feasible hands-free text correction method on mobile devices.
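The accumulated-posterior selection rule described above can be made concrete with a short sketch. This is a hypothetical illustration, not the paper's implementation: the Gaussian gaze likelihood, the word layout, the prior boost for misspelt words, and the threshold are all assumptions.

```python
import math

# Hypothetical sketch of accumulated-posterior word selection as described
# in the abstract. Layout, likelihood model, prior boost, and threshold
# are illustrative assumptions.

WORDS = {
    # word: (x_center_px, y_center_px, is_misspelt)
    "quick": (120, 40, False),
    "brwon": (210, 40, True),   # misspelt
    "fox":   (280, 40, False),
}
GAZE_SIGMA_PX = 35.0        # assumed gaze-noise standard deviation
MISSPELT_PRIOR_BOOST = 3.0  # assumed prior ratio favoring misspelt words
THRESHOLD = 2.5             # assumed accumulated-evidence threshold

def likelihood(gx, gy, wx, wy):
    """Isotropic Gaussian likelihood of a gaze sample given a word center."""
    d2 = (gx - wx) ** 2 + (gy - wy) ** 2
    return math.exp(-d2 / (2 * GAZE_SIGMA_PX ** 2))

def select_word(gaze_trajectory):
    """Accumulate per-sample posteriors over words; return the first word
    whose accumulated value crosses the threshold, or None."""
    priors = {w: (MISSPELT_PRIOR_BOOST if m else 1.0)
              for w, (_, _, m) in WORDS.items()}
    accumulated = {w: 0.0 for w in WORDS}
    for gx, gy in gaze_trajectory:
        scores = {w: priors[w] * likelihood(gx, gy, x, y)
                  for w, (x, y, _) in WORDS.items()}
        total = sum(scores.values()) or 1.0
        for w in WORDS:
            accumulated[w] += scores[w] / total  # normalized posterior
            if accumulated[w] > THRESHOLD:
                return w
    return None

# Example: gaze samples hovering near the misspelt word "brwon".
samples = [(205, 42), (212, 38), (208, 45), (215, 41), (209, 39)]
print(select_word(samples))
```

Because the misspelt word's prior is boosted, fewer gaze samples are needed to push its accumulated posterior past the threshold, which matches the reported reduction in selection time.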
-
A hands-free (HF) lean-to-steer control concept that uses torso motions is demonstrated by navigating a virtual robotic mobility device based on a ball-based robotic (ballbot) wheelchair. A custom sensor system, the Torso-dynamics Estimation System (TES), was used to measure the dynamics of the rider's torso motions and convert them into commands for HF control of the robot. A simulation study explored the efficacy of the HF controller compared to a traditional joystick (JS) controller, and whether manual wheelchair users (mWCUs), who may have reduced torso function, performed differently from able-bodied users (ABUs). Twenty test subjects (10 mWCUs + 10 ABUs) used the subject-specific adjusted TES while wearing a virtual reality headset and were asked to navigate a virtual human rider on the ballbot through obstacle courses replicating seven indoor environment zones. Repeated-measures MANOVA tests assessed performance metrics representing efficiency (i.e., number of collisions), effectiveness (i.e., completion time), comfort (i.e., NASA TLX scores), and robustness (i.e., index of performance). As expected, more challenging zones took longer to complete and resulted in more collisions. An interaction effect was observed: ABUs had significantly more collisions with JS than with HF control, while mWCUs showed little difference between the interfaces. All subjects reported that HF control demanded more physical effort than JS control, although no users visibly showed or expressed fatigue or exhaustion when using HF control. In general, HF control performed as well as JS control, and mWCUs performed similarly to ABUs.
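The abstract names the four metric categories but not their exact definitions, so the sketch below is a hypothetical per-trial computation. The index of performance is assumed to be a Fitts'-style throughput (index of difficulty over completion time), and the course length, clearance width, and TLX handling are illustrative.

```python
import math

# Hypothetical per-trial metric computation mirroring the four categories
# named in the abstract. The exact definitions (especially the index of
# performance) are not given there; a Fitts'-style throughput is assumed.

def trial_metrics(collisions, t_start_s, t_end_s, tlx_scores,
                  course_length_m, course_width_m):
    """Return the four metric categories for one obstacle-course run."""
    completion_time = t_end_s - t_start_s                 # effectiveness
    workload = sum(tlx_scores) / len(tlx_scores)          # comfort (raw TLX mean)
    # Assumed Fitts'-style index of performance, treating the course as a
    # single steering task with the given clearance width.
    index_of_difficulty = math.log2(course_length_m / course_width_m + 1)
    index_of_performance = index_of_difficulty / completion_time  # robustness
    return {
        "collisions": collisions,                         # efficiency
        "completion_time_s": completion_time,
        "nasa_tlx": workload,
        "index_of_performance": index_of_performance,
    }

# Example: one hypothetical run through a 25 m course with 0.9 m clearance.
print(trial_metrics(collisions=2, t_start_s=0.0, t_end_s=48.5,
                    tlx_scores=[40, 55, 30, 35, 45, 50],
                    course_length_m=25.0, course_width_m=0.9))
```

Computing one such record per subject, interface, and zone yields exactly the dependent-variable matrix a repeated-measures MANOVA over interface (HF vs. JS) and group (mWCU vs. ABU) would consume.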