BioMedical Engineering OnLine

EMG-based facial gesture recognition through versatile elliptic basis function neural network

Mahyar Hamedi [2], Sh-Hussain Salleh [3], Mehdi Astaraki [1], Alias Mohd Noor [3]

[1] Department of Biomedical Engineering, Science and Research Branch, Islamic Azad University Tehran, Tehran, Iran
[2] Faculty of Bioscience and Medical Engineering, Universiti Teknologi Malaysia, Skudai, Johor 81310, Malaysia
[3] Centre for Biomedical Engineering, Transportation Research Alliance, Universiti Teknologi Malaysia, Skudai, Johor 81310, Malaysia

Keywords: Human machine interface; Versatile elliptic basis function neural network; Feature extraction; Facial gesture recognition; Electromyogram; Facial neural activity

DOI: 10.1186/1475-925X-12-73

Received: 2013-04-25; Accepted: 2013-07-09; Published: 2013
Abstract
Background
Recently, the recognition of different facial gestures from facial neuromuscular activity has been proposed for human-machine interfacing applications. Facial electromyogram (EMG) analysis is a challenging area of biomedical signal processing in which accuracy and low computational cost are significant concerns. In this paper, a very fast versatile elliptic basis function neural network (VEBFNN) was proposed to classify different facial gestures. The effectiveness of different facial EMG time-domain features was also explored to identify the most discriminating one.
Methods
In this study, EMGs of ten facial gestures were recorded from ten subjects using three pairs of surface electrodes in a bipolar configuration. The signals were filtered and segmented into distinct portions prior to feature extraction. Ten time-domain features, namely Integrated EMG, Mean Absolute Value, Mean Absolute Value Slope, Maximum Peak Value, Root Mean Square, Simple Square Integral, Variance, Mean Value, Wave Length, and Slope Sign Changes, were extracted from the EMGs. The statistical relationships between these features were investigated using the Mutual Information measure. Feature combinations of two to ten single features were then formed according to the feature rankings assigned by the Minimum-Redundancy-Maximum-Relevance (MRMR) and Recognition Accuracy (RA) criteria. In the last step, VEBFNN was employed to classify the facial gestures. The effect of single features as well as feature sets on system performance was examined with respect to two major metrics, recognition accuracy and training time. Finally, the proposed classifier was assessed and compared with two conventional methods, support vector machines and multilayer perceptron neural networks.
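To make the feature-extraction step concrete, the minimal sketch below (Python/NumPy, not taken from the paper) computes most of the listed time-domain features for a single segmented EMG window using their standard definitions from the EMG literature. Mean Absolute Value Slope is omitted because it requires consecutive windows, and the noise threshold used for Slope Sign Changes as well as the synthetic 256-sample test window are assumptions chosen for illustration only.

```python
# Illustrative sketch: common EMG time-domain features for one segmented window.
# Definitions follow the standard EMG literature; parameter values are assumed.
import numpy as np

def time_domain_features(x, ssc_threshold=1e-5):
    """Return a dict of time-domain features for a 1-D EMG window x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    dx = np.diff(x)                               # first difference of the window

    iemg = np.sum(np.abs(x))                      # Integrated EMG
    mav = iemg / n                                # Mean Absolute Value
    mpv = np.max(np.abs(x))                       # Maximum Peak Value
    rms = np.sqrt(np.mean(x ** 2))                # Root Mean Square
    ssi = np.sum(x ** 2)                          # Simple Square Integral
    var = np.var(x, ddof=1)                       # Variance
    mv = np.mean(x)                               # Mean Value
    wl = np.sum(np.abs(dx))                       # Wave Length
    # Slope Sign Changes: count sign reversals of the first difference whose
    # magnitude exceeds a small threshold (to reject measurement noise).
    ssc = np.sum((dx[:-1] * dx[1:] < 0) &
                 (np.maximum(np.abs(dx[:-1]), np.abs(dx[1:])) > ssc_threshold))

    return {"IEMG": iemg, "MAV": mav, "MPV": mpv, "RMS": rms,
            "SSI": ssi, "VAR": var, "MV": mv, "WL": wl, "SSC": int(ssc)}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    window = rng.normal(scale=0.1, size=256)      # stand-in for one EMG segment
    print(time_domain_features(window))
```

In a full pipeline of the kind described above, such a feature vector would be computed per window and per channel, then ranked (e.g., by MRMR or RA) before being passed to the classifier; the exact windowing and ranking settings used by the authors are not reproduced here.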
Results
The average classification results showed that, among all single- and multi-feature sets, the best performance in recognizing facial gestures was achieved by Maximum Peak Value, with 87.1% accuracy. Moreover, the procedure was very fast: training the VEBFNN classifier took only 0.105 seconds. The results also indicated that MRMR was a less suitable criterion than RA for constructing effective feature sets.
Conclusions
This work identified the most discriminating facial EMG time-domain feature for the recognition of different facial gestures and showed VEBFNN to be a promising method for EMG-based facial gesture classification, suitable for designing interfaces in human-machine interaction systems.
License
© 2013 Hamedi et al.; licensee BioMed Central Ltd.
Preview

| File | Size | Format |
|---|---|---|
| 20140706055211570.pdf | 1032KB | PDF |
| Figure 10. | 54KB | Image |
| Figure 9. | 59KB | Image |
| Figure 8. | 20KB | Image |
| Figure 7. | 41KB | Image |
| Figure 6. | 54KB | Image |
| Figure 5. | 59KB | Image |
| Figure 4. | 18KB | Image |
| Figure 3. | 66KB | Image |
| Figure 2. | 63KB | Image |
| Figure 1. | 32KB | Image |