Control interface for hands-free navigation of personal mobility vehicles
The user controls the device hands-free while holding an object in hand.
I developed a torso control interface that enables hands-free control of personal mobility devices through the user's natural upper-body movement. I built a support bar installed in front of the user at waist level; the bar consists of a solid segment and a soft segment. An array of force sensors attached to the inner surface of the bar detects the pressure distribution of the user's body, and an IMU sensor attached to the bar measures the user's bending angle. I developed an algorithm that estimates the driving intention from the pressure distribution and bending angle, together with a calibration algorithm for different users. Several experiments were conducted to verify the effectiveness of the proposed torso control system. The basic methodology and validation of the proposed system were presented in (Chen et al., 2019; Chen et al., 2020), a more recent version in (Chen et al., 2022), and a complete description of the methodology and validation of the improved torso control system was presented at ICRA 2024 and accepted by IEEE/ASME Transactions on Mechatronics (Chen et al., 2024; Chen et al., 2024).
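To illustrate the idea of mapping the two sensing modalities to a driving command, the sketch below shows one plausible scheme in Python: a per-user calibration step that records a neutral-pose pressure baseline, and an intention estimator that derives forward speed from the bending angle and turning from the left/right pressure imbalance. All thresholds, gains, and the sensor ordering are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def calibrate(neutral_samples):
    """Per-user calibration (assumed scheme): average pressure readings
    recorded in a relaxed neutral pose to get a baseline to subtract."""
    return np.mean(np.asarray(neutral_samples, dtype=float), axis=0)

def estimate_intention(pressures, bend_angle_deg,
                       pressure_threshold=0.5, angle_threshold=5.0):
    """Estimate a (linear, angular) velocity command from the force-sensor
    array on the bar and the IMU bending angle (illustrative mapping)."""
    pressures = np.asarray(pressures, dtype=float)
    total = pressures.sum()

    # Below both thresholds: no clear driving intention, command a stop.
    if total < pressure_threshold and abs(bend_angle_deg) < angle_threshold:
        return 0.0, 0.0

    # Forward speed grows with how far the torso leans into the bar.
    linear = 0.02 * bend_angle_deg

    # Turning: compare left-half vs right-half pressure; sensors are
    # assumed to be ordered left to right along the bar.
    half = len(pressures) // 2
    left, right = pressures[:half].sum(), pressures[half:].sum()
    angular = 0.5 * (left - right) / max(total, 1e-6)
    return linear, angular
```

In practice the raw readings would be filtered (e.g. with the Kalman filter noted in the Skills line) before this mapping, and the baseline from `calibrate` would be subtracted from `pressures`.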
Keywords: Hands-free control, intention estimation, human-robot interface. Skills: Python, ROS, 3D modeling (SolidWorks), dynamics, Kalman filter, Eagle, user study
Reference
Yang Chen, Diego Paez-Granados, and Kenji Suzuki, "Torso Control System with a Sensory Safety Bar for a Standing Mobility Device," 2019 International Symposium on Micro-NanoMechatronics and Human Science (MHS), 2019.