In this dissertation, we propose a universal motion-based control framework that supports general functionality on 2D and 3D user interfaces with a single integrated design. We develop a hybrid framework of optical and inertial sensing technologies to track the 6-DOF (degrees of freedom) motion of a handheld device, covering both the explicit 6-DOF (position and orientation in global coordinates) and the implicit 6-DOF (acceleration and angular speed in device-wise coordinates). Motion recognition is another key function of universal motion-based control and comprises two parts: motion gesture recognition and air-handwriting recognition. The interaction technique for each task is carefully designed to follow a consistent mental model and ensure usability. The universal motion-based control achieves seamless integration of 2D and 3D interactions, motion gestures, and air-handwriting.

Motion recognition by itself is a challenging problem. For motion gesture recognition, we propose a normalization procedure to effectively address the large in-class motion variations among users. The main contribution is an investigation of the relative effectiveness of various feature dimensions (of the tracking signals) for motion gesture recognition in both user-dependent and user-independent cases. For air-handwriting recognition, we first develop a strategy to model air-handwriting with basic elements of characters and ligatures. We then build word-based and letter-based decoding networks for air-handwriting recognition. Moreover, we investigate the detection and recognition of air-fingerwriting as an extension of air-handwriting. To complete the evaluation of air-handwriting, we conduct a usability study showing that air-handwriting is suitable for text input on a motion-based user interface.
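The hybrid tracking described above combines an explicit 6-DOF pose in the global frame with an implicit 6-DOF of inertial readings in the device frame. A minimal sketch of such a combined state, assuming a simple container type (the `MotionState` name and field layout are illustrative, not from the dissertation):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class MotionState:
    """Hypothetical 12-dimensional state of the handheld device.

    Explicit 6-DOF: pose in the global coordinate frame.
    Implicit 6-DOF: inertial readings in the device-wise frame.
    """
    position: np.ndarray       # (3,) global x, y, z (e.g., from optical tracking)
    orientation: np.ndarray    # (3,) global orientation angles
    acceleration: np.ndarray   # (3,) device-frame accelerometer reading
    angular_speed: np.ndarray  # (3,) device-frame gyroscope reading

    def as_vector(self):
        # Concatenate explicit and implicit components into one feature vector,
        # a convenient form for downstream motion recognition.
        return np.concatenate(
            [self.position, self.orientation, self.acceleration, self.angular_speed]
        )
```

A recognizer could then consume the 12-dimensional `as_vector()` output per time step, or select subsets of its dimensions when comparing the effectiveness of different feature dimensions.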
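The normalization procedure for motion gestures is meant to suppress in-class variation among users (different speeds, sizes, and starting points). As a minimal sketch, assuming a simple resample-center-scale scheme rather than the dissertation's actual procedure:

```python
import numpy as np

def normalize_gesture(samples, target_len=64):
    """Hypothetical normalization for one motion-gesture trajectory.

    samples: (N, D) array of tracked signal values over time.
    Resamples to a fixed temporal length, then removes translation
    and scale, reducing user-dependent variation within a gesture class.
    """
    samples = np.asarray(samples, dtype=float)
    n, d = samples.shape
    # Temporal normalization: resample each dimension to target_len
    # samples via linear interpolation, removing speed differences.
    old_t = np.linspace(0.0, 1.0, n)
    new_t = np.linspace(0.0, 1.0, target_len)
    resampled = np.column_stack(
        [np.interp(new_t, old_t, samples[:, k]) for k in range(d)]
    )
    # Spatial normalization: center at the centroid ...
    centered = resampled - resampled.mean(axis=0)
    # ... and scale so the largest extent is 1, removing size differences.
    scale = np.abs(centered).max()
    if scale > 0:
        centered /= scale
    return centered
```

With every trajectory mapped to the same length and range, gestures from different users become directly comparable by a classifier, whether it operates on position, orientation, or inertial feature dimensions.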
Universal motion-based control and motion recognition