Design and Evaluation of a Silent Speech-Based Selection Method for Eye-Gaze Pointing
We investigate silent speech as a hands-free selection method for eye-gaze pointing. We first propose a stripped-down image-based model that can recognize a small number of silent commands almost as fast as state-of-the-art speech recognition models. We then compare it with other hands-free selection methods (dwell, speech) in a Fitts' law study. Results revealed that speech and silent speech are comparable in throughput and selection time, but silent speech is significantly more accurate than the other methods. A follow-up study revealed that target selection around the center of the display is significantly faster and more accurate, while selection near the top corners and the bottom of the display is slower and more error-prone. We then present a method for selecting menu items with eye-gaze and silent speech. A study revealed that it significantly reduces task completion time and error rate.
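For readers unfamiliar with the throughput measure used in such Fitts' law comparisons, the sketch below shows how it is commonly computed. The abstract does not state the paper's exact formulation, so this follows the widely used ISO 9241-9 effective-width convention; the function name and all data values are hypothetical.

    import math
    import statistics

    def fitts_throughput(amplitude, endpoint_deviations, movement_times):
        """Throughput (bits/s) for one amplitude/width condition,
        assuming the ISO 9241-9 effective-width convention."""
        # Effective width from the spread of selection endpoints
        # along the task axis: W_e = 4.133 * SD.
        w_e = 4.133 * statistics.stdev(endpoint_deviations)
        # Effective index of difficulty, Shannon formulation:
        # ID_e = log2(A / W_e + 1).
        id_e = math.log2(amplitude / w_e + 1)
        # Throughput = ID_e divided by mean movement time in seconds.
        return id_e / statistics.mean(movement_times)

    # Hypothetical trials: 400 px target amplitude, endpoint
    # deviations in px, movement times in seconds.
    tp = fitts_throughput(400, [-12, 5, 9, -3, 7, -8],
                          [0.61, 0.55, 0.58, 0.63, 0.52, 0.57])
    print(f"throughput = {tp:.2f} bits/s")

Under this convention, a more accurate method (tighter endpoint spread) yields a larger effective index of difficulty for the same movement time, which is how accuracy and speed trade off in a single throughput figure.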
Mon 21 Nov (displayed time zone: Auckland, Wellington)
15:30 - 16:30 | Session 3: Gaze (Papers) | Rutherford House Lecture Theatre 2 | Chair(s): Aluna Everitt (University of Oxford)

15:30 | 20m Talk | Design and Evaluation of a Silent Speech-Based Selection Method for Eye-Gaze Pointing (Papers) | DOI, Media Attached

15:50 | 20m Talk | HandyGaze: A Gaze Tracking Technique for Room-Scale Environments using a Single Smartphone (Papers) | Takahiro Nagai, Kazuyuki Fujita, Kazuki Takashima, Yoshifumi Kitamura (Tohoku University) | DOI, Media Attached

16:10 | 20m Talk | Effects of Display Layout on Spatial Memory for Immersive Environments (Papers) | Jiazhou Liu (Monash University), Arnaud Prouzeau (Inria & LaBRI: University of Bordeaux, CNRS, Bordeaux-INP), Barrett Ens (Monash University), Tim Dwyer (Monash University) | DOI, Media Attached