Journal Article Details
Fire Ecology
Federated recognition mechanism based on enhanced temporal-spatial learning using mobile edge sensors for firefighters
Original Research
Do-Hyeun Kim1  Harun Jamil2  Khan Murad Ali3 
[1] Department of Computer Engineering, Jeju National University, Jeju-si, Republic of Korea; Department of Electronic Engineering, Jeju National University, Jeju-si, Republic of Korea
Keywords: Weighted attention mechanism; BILSTM; CNN; Federated recognition; Human activity recognition; Smartphone sensors; Sequential data
DOI  :  10.1186/s42408-023-00203-5
Received 2023-02-25, accepted 2023-06-16, published 2023
Source: Springer
PDF
【 Abstract 】

Background
Interest in Human Action Recognition (HAR), which spans both household and industrial settings, is growing. HAR describes a computer system's capacity to accurately recognize and evaluate human activities and behaviors, akin to what humans call perception. A real-time federated activity-identification architecture is proposed in this work to monitor smartphone user behavior. The main aim is to reduce accidents in indoor environments and to ensure the safety of older individuals in such settings. The idea lends itself to a multitude of uses, including elderly monitoring, entertainment, and surveillance.

Results
In this paper, we present a new smartphone sensor-based human motion awareness federated recognition scheme using a temporal-spatial weighted BILSTM-CNN framework. We verify that the new federated recognition scheme based on temporal-spatial data outperforms existing machine learning schemes in activity recognition accuracy. Several methods and strategies in the literature have been used to attain higher HAR accuracy. In particular, six categories of typical everyday human activities are highlighted, including walking, jumping, standing, moving from one level to another, and picking up items.

Conclusion
Smartphone-based sensors are utilized to detect the motion activities carried out by elderly people based on raw inertial measurement unit (IMU) data. Weighted bidirectional long short-term memory (BILSTM) networks are used to learn temporal motion features, followed by one-dimensional convolutional neural networks (CNNs) that reason about spatial structure features. Additionally, an attention mechanism highlights data segments to select discriminative contextual information. Finally, a sizeable HDL activity dataset is gathered for model training and validation.
The results confirm that the proposed ML framework performs 18.7% better in accuracy, 27.9% better in precision, and 24.1% better in F1-score for client 1. Similarly, for clients 2 and 3, the accuracy improvement is 18.4% and 10.1%, respectively.
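The pipeline the abstract describes (windowed IMU data → weighted BiLSTM for temporal features → 1-D CNN for spatial features → attention over segments → classifier) can be sketched roughly as below. This is a minimal PyTorch illustration, not the authors' published configuration: layer widths, window length, and the exact form of the weighted attention are assumptions.

```python
import torch
import torch.nn as nn

class BiLSTMCNNAttention(nn.Module):
    """Temporal-spatial HAR sketch: BiLSTM learns temporal motion
    features, a 1-D CNN reasons about spatial structure, and an
    attention layer weights discriminative time segments."""

    def __init__(self, n_features=6, hidden=64, n_classes=6):
        super().__init__()
        self.bilstm = nn.LSTM(n_features, hidden,
                              batch_first=True, bidirectional=True)
        self.conv = nn.Conv1d(2 * hidden, 128, kernel_size=3, padding=1)
        self.attn = nn.Linear(128, 1)   # one attention score per time step
        self.fc = nn.Linear(128, n_classes)

    def forward(self, x):               # x: (batch, time, n_features)
        h, _ = self.bilstm(x)           # (batch, time, 2*hidden)
        c = torch.relu(self.conv(h.transpose(1, 2)))  # (batch, 128, time)
        c = c.transpose(1, 2)           # (batch, time, 128)
        w = torch.softmax(self.attn(c), dim=1)        # segment weights
        ctx = (w * c).sum(dim=1)        # attention-weighted context vector
        return self.fc(ctx)             # class logits

model = BiLSTMCNNAttention()
imu = torch.randn(4, 100, 6)  # 4 windows, 100 steps, 6 IMU channels
logits = model(imu)
print(logits.shape)           # torch.Size([4, 6])
```

In a federated setting, each client would train such a model locally on its own sensor windows and share only parameter updates with a central aggregator, which matches the per-client evaluation reported above.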

【 License 】

CC BY   
© Association for Fire Ecology 2023. Corrected publication 2023.

【 Preview 】
Attachment List
Files Size Format View
RO202309155859489ZK.pdf 3981KB PDF download
Fig. 1 363KB Image download
Fig. 2 281KB Image download
Fig. 8 2696KB Image download
Fig. 4 1725KB Image download
Fig. 1 490KB Image download
Fig. 4 2387KB Image download
Fig. 1 112KB Image download
Fig. 1 227KB Image download
Fig. 11 181KB Image download

Document metrics
Downloads: 12  Views: 3