Mid-air haptics allows bare-hand tactile stimulation; however, its constrained workspace makes it unsuitable for room-scale haptics. We present a novel approach to rendering mid-air haptic sensations in a large rendering volume by turning a static array into a dynamic array that follows the user's hand. We used a 6DOF robot to drive a haptic ultrasound array over a large 3D space. Our system enables rendering room-scale mid-air experiences while preserving bare-hand interaction, thus providing tangibility for virtual environments. To evaluate our approach, we performed three evaluations. First, we conducted a technical system evaluation, showcasing the feasibility of such a system. Next, we conducted three psychophysical experiments, showing that, with high likelihood, the array's motion does not affect the user's perception. Lastly, we explored seven use cases that showcase our system's potential in a user study. We discuss challenges and opportunities in how large-scale mid-air haptics can contribute toward room-scale haptic feedback. Thus, with our system, we contribute to general mid-air haptic feedback on a large scale.
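The core idea of driving the array to follow the tracked hand can be illustrated with a minimal control-loop sketch. This is purely hypothetical and not the paper's implementation: the standoff distance, loop rate, and speed limit are assumed parameters, and a real system would command a 6DOF pose through the robot's API rather than a bare position.

```python
import math

# Assumed parameters for illustration only -- not taken from the paper.
STANDOFF_M = 0.20      # keep the array 20 cm below the palm (within focal range)
MAX_SPEED_M_S = 0.5    # clamp the commanded end-effector speed for safety
DT = 1.0 / 60.0        # control-loop period, assuming a 60 Hz hand tracker

def follow_hand(array_pos, hand_pos):
    """Return the next end-effector position: step the array toward a point
    STANDOFF_M below the tracked palm, limited to MAX_SPEED_M_S per tick."""
    target = (hand_pos[0], hand_pos[1], hand_pos[2] - STANDOFF_M)
    delta = [t - a for t, a in zip(target, array_pos)]
    dist = math.sqrt(sum(d * d for d in delta))
    if dist == 0.0:
        return array_pos
    step = min(dist, MAX_SPEED_M_S * DT)  # never overshoot the target
    return tuple(a + d / dist * step for a, d in zip(array_pos, delta))

# Example: the array converges beneath a stationary hand over repeated ticks.
pos = (0.0, 0.0, 0.0)
for _ in range(1000):
    pos = follow_hand(pos, (1.0, 1.0, 1.0))
```

After enough ticks, `pos` settles at `(1.0, 1.0, 0.8)`, i.e. directly below the palm at the standoff distance; the per-tick speed clamp is what keeps the robot's motion smooth as the hand moves.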
Futian Zhang University of Waterloo, Keiko Katsuragawa National Research Council; University of Waterloo, Edward Lank University of Waterloo; Inria; University of Lille
Can Liu City University of Hong Kong, Chenyue Dai City University of Hong Kong; Massachusetts Institute of Technology, Qingzhou Ma City University of Hong Kong; University of Michigan, Ann Arbor, Brinda Mehra City University of Hong Kong; University of Michigan, Alvaro Cassinelli City University of Hong Kong