TY - JOUR
T1 - AI-Enabled Soft Sensing Array for Simultaneous Detection of Muscle Deformation and Mechanomyography for Metaverse Somatosensory Interaction
AU - Suo, Jiao
AU - Liu, Yifan
AU - Wang, Jianfei
AU - Chen, Meng
AU - Wang, Keer
AU - Yang, Xiaomeng
AU - Yao, Kuanming
AU - Roy, Vellaisamy A.L.
AU - Yu, Xinge
AU - Daoud, Walid A.
AU - Liu, Na
AU - Wang, Jianping
AU - Wang, Zuobin
AU - Li, Wen Jung
N1 - Publisher Copyright:
© 2023 The Authors. Advanced Science published by Wiley-VCH GmbH.
PY - 2024/4/24
Y1 - 2024/4/24
N2 - Motion recognition (MR)-based somatosensory interaction technology, which interprets user movements as input instructions, presents a natural approach for promoting human-computer interaction, a critical element for advancing metaverse applications. Herein, this work introduces a non-intrusive muscle-sensing wearable device that, in conjunction with machine learning, enables motion-control-based somatosensory interaction with metaverse avatars. To facilitate MR, the proposed device simultaneously detects muscle mechanical activities, including dynamic muscle shape changes and vibrational mechanomyogram signals, using a flexible 16-channel pressure sensor array (weighing ≈0.38 g). Leveraging the rich information from multiple channels, a recognition accuracy of ≈96.06% is achieved in classifying ten lower-limb motions executed by ten human subjects. In addition, this work demonstrates a practical application of muscle-sensing-based somatosensory interaction by using the proposed wearable device to control avatars in a virtual space in real time. This study provides an alternative to traditional rigid inertial measurement units and electromyography-based methods for accurate human motion capture, which can further broaden the applications of motion-interactive wearable devices for the coming metaverse age.
AB - Motion recognition (MR)-based somatosensory interaction technology, which interprets user movements as input instructions, presents a natural approach for promoting human-computer interaction, a critical element for advancing metaverse applications. Herein, this work introduces a non-intrusive muscle-sensing wearable device that, in conjunction with machine learning, enables motion-control-based somatosensory interaction with metaverse avatars. To facilitate MR, the proposed device simultaneously detects muscle mechanical activities, including dynamic muscle shape changes and vibrational mechanomyogram signals, using a flexible 16-channel pressure sensor array (weighing ≈0.38 g). Leveraging the rich information from multiple channels, a recognition accuracy of ≈96.06% is achieved in classifying ten lower-limb motions executed by ten human subjects. In addition, this work demonstrates a practical application of muscle-sensing-based somatosensory interaction by using the proposed wearable device to control avatars in a virtual space in real time. This study provides an alternative to traditional rigid inertial measurement units and electromyography-based methods for accurate human motion capture, which can further broaden the applications of motion-interactive wearable devices for the coming metaverse age.
KW - human motion recognition
KW - mechanomyography
KW - natural human–machine interaction
KW - non-intrusive muscle activity sensing
KW - wearable devices
UR - http://www.scopus.com/inward/record.url?scp=85185264513&partnerID=8YFLogxK
U2 - 10.1002/advs.202305025
DO - 10.1002/advs.202305025
M3 - Article
C2 - 38376001
AN - SCOPUS:85185264513
VL - 11
JO - Advanced Science
JF - Advanced Science
IS - 16
M1 - 2305025
ER -