Journal Article Details
Brain Informatics
Enhancing biofeedback-driven self-guided virtual reality exposure therapy through arousal detection from multimodal data using machine learning
Research
Andrew Burton1  Sean Haddick1  Bradley Standen1  Muhammad Arifur Rahman1  Matthew Harris1  Nicholas Shopland1  James Lewis1  David J. Brown1  Mufti Mahmud2  Preethi Premkumar3  Simona Nastase4  David Downes5  Carolyn Thomas5  Yangang Xing6  Nadja Heym7  Zakia Batool Turabee7  Alexander Sumich7 
[1] Department of Computer Science, Nottingham Trent University, Clifton Lane, NG11 8NS, Nottingham, UK; Medical Technologies Innovation Facility, Nottingham Trent University, Clifton Lane, NG11 8NS, Nottingham, UK; Computing and Informatics Research Centre, Nottingham Trent University, Clifton Lane, NG11 8NS, Nottingham, UK; Division of Psychology, London South Bank University, SE1 0AA, London, UK; Independent Clinical Psychologist, London, UK; Nottingham School of Art & Design, Nottingham Trent University, Shakespeare St, NG1 4FQ, Nottingham, UK; School of ADBE, Nottingham Trent University, Shakespeare St, NG1 4FQ, Nottingham, UK; School of Social Sciences, Nottingham Trent University, Shakespeare St, NG1 4FQ, Nottingham, UK
Keywords: Biofeedback; Arousal; EEG; HRV; Glossophobia; Stress; VRET
DOI: 10.1186/s40708-023-00193-9
Received: 2022-10-11; Accepted: 2023-05-15; Published: 2023
Source: Springer
【 Abstract 】

Virtual reality exposure therapy (VRET) is a novel intervention technique that allows individuals to experience anxiety-evoking stimuli in a safe environment, recognise specific triggers and gradually increase their exposure to perceived threats. Public-speaking anxiety (PSA) is a prevalent form of social anxiety, characterised by stressful arousal and anxiety generated when presenting to an audience. In self-guided VRET, participants can gradually increase their tolerance to exposure and reduce anxiety-induced arousal and PSA over time. However, creating such a VR environment and determining physiological indices of anxiety-induced arousal or distress is an open challenge. Environment modelling, character creation and animation, psychological state determination and the use of machine learning (ML) models for anxiety or stress detection are equally important, and multi-disciplinary expertise is required. In this work, we have explored a series of ML models with publicly available data sets (using electroencephalogram and heart rate variability) to predict arousal states. If we can detect anxiety-induced arousal, we can trigger calming activities to allow individuals to cope with and overcome distress. Here, we discuss the means of effective selection of ML models and parameters in arousal detection. We propose a pipeline to overcome the model selection problem with different parameter settings in the context of virtual reality exposure therapy. This pipeline can be extended to other domains of interest where arousal detection is crucial. Finally, we have implemented a biofeedback framework for VRET where we successfully provided feedback as a form of heart rate and brain laterality index from our acquired multimodal data for psychological intervention to overcome anxiety.
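The biofeedback signals mentioned above (heart rate and a brain laterality index from multimodal data) can be made concrete with a small sketch. This is not the authors' implementation: the formulas below are common textbook definitions (RMSSD for time-domain HRV; a normalised (R - L)/(R + L) asymmetry over alpha-band power), and the threshold rule and all parameter values are hypothetical, chosen only for illustration.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences, a standard
    time-domain HRV measure; lower values typically accompany stress arousal."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def laterality_index(right_alpha_power, left_alpha_power):
    """Normalised laterality index in [-1, 1] over alpha-band power of a
    homologous electrode pair: (R - L) / (R + L)."""
    return (right_alpha_power - left_alpha_power) / (
        right_alpha_power + left_alpha_power
    )

def arousal_flag(rmssd_ms, li, rmssd_threshold=25.0, li_threshold=0.2):
    """Toy decision rule: flag arousal when HRV is low or asymmetry is large.
    Real thresholds would be calibrated per participant (or learned by an
    ML model, as in the paper)."""
    return rmssd_ms < rmssd_threshold or abs(li) > li_threshold
```

In a biofeedback loop, a flag like this could trigger the calming activities described in the abstract; the paper's approach replaces the hand-set thresholds with trained ML models over EEG and HRV features.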

【 License 】

CC BY   
© The Author(s) 2023

【 Preview 】
Attachments
Files Size Format View
RO202309076832283ZK.pdf 3798KB PDF download
Metrics
Downloads: 1; Views: 0