Processes | Volume: 10
Face Verification Based on Deep Learning for Person Tracking in Hazardous Goods Factories
Yu Lu 1, Xi Huang 2, Hua Zheng 2, Xiongjun Zeng 2, Qingxiang Wu 2, Xixian Huang 2
[1] Concord University College, Fujian Normal University, Fuzhou 350117, China
[2] Key Laboratory of OptoElectronic Science and Technology for Medicine of Ministry of Education, College of Photonic and Electronic Engineering, Fujian Normal University, Fuzhou 350108, China
Keywords: hazardous goods factory; face verification; person tracking; deep learning
DOI: 10.3390/pr10020380
Source: DOAJ
[Abstract]
Person tracking in hazardous goods factories can significantly improve security and safety. This article proposes a face verification model that can be used to record the travel paths of staff and related persons in the factory. Because face images are captured from dynamic crowds at the entrance–exit gates of workshops, face verification is challenged by polymorphic faces, poor illumination, and changes in a person's pose. To adapt to this situation, a new face verification model is proposed, composed of two advanced deep learning neural network models. First, MTCNN (Multi-Task Cascaded Convolutional Neural Network) is used to construct a face detector. Second, based on the SphereFace-20 network model, we reconstruct a convolutional network architecture with embedded Batch Normalization layers and optimized network parameters; the new model, called MDCNN, is used to extract effective face features. A set of specific processing algorithms in the model handles polymorphic face images, and multi-view faces and various other types of face images are used to train the models. The experimental results demonstrate that the proposed model outperforms most existing methods on benchmark datasets, both without multi-view faces on the Labeled Faces in the Wild (LFW) and YouTube Faces (YTF) datasets (accuracy of 99.38% and 94.30%, respectively) and with multi-view faces on the CNBC/FERET datasets (accuracy of 94.69%).
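The pipeline summarized above, MTCNN face detection followed by MDCNN feature extraction and a similarity comparison, can be sketched roughly as follows. This is a minimal illustration assuming PyTorch; the conv_bn_prelu block, the detector/embedder callables, and the 0.5 similarity threshold are illustrative assumptions rather than the authors' released MDCNN code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_bn_prelu(in_ch: int, out_ch: int, stride: int = 1) -> nn.Sequential:
    """One SphereFace-style convolutional unit with an embedded Batch Normalization layer."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=stride, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.PReLU(out_ch),
    )

@torch.no_grad()
def verify(img_a: torch.Tensor, img_b: torch.Tensor,
           detector, embedder: nn.Module, threshold: float = 0.5) -> bool:
    """Same-person decision for two gate-camera frames.

    detector  -- an MTCNN-style callable returning an aligned face crop (C, H, W)
    embedder  -- an MDCNN-style network mapping a crop to a feature vector
    threshold -- cosine-similarity cut-off (an illustrative value, not from the paper)
    """
    feat_a = embedder(detector(img_a).unsqueeze(0))   # 1) detect, align, embed
    feat_b = embedder(detector(img_b).unsqueeze(0))
    sim = F.cosine_similarity(feat_a, feat_b).item()  # 2) compare embeddings
    return sim >= threshold                           # 3) same-person decision
```

An MDCNN-style embedder would stack such Batch-Normalized units, following the residual structure of SphereFace-20, and end with a fully connected layer that produces the feature vector compared at verification time.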
[License]
Unknown