Journal Article Details
Volume: 138
Dimensionality reduction and ensemble of LSTMs for antimicrobial resistance prediction
Article
Keywords: INTENSIVE-CARE-UNIT;    TIME-SERIES;    FEATURE-SELECTION;    CLASS SEPARABILITY;    CONCEPT DRIFT;    MODELS;
DOI  :  10.1016/j.artmed.2023.102508
Source: SCIE
【 Abstract 】

Bacterial resistance to antibiotics has been rapidly increasing, resulting in low antibiotic effectiveness even when treating common infections. The presence of resistant pathogens in environments such as a hospital Intensive Care Unit (ICU) exacerbates the problem of critical hospital-acquired infections. This work focuses on the prediction of antibiotic resistance in Pseudomonas aeruginosa nosocomial infections at the ICU, using Long Short-Term Memory (LSTM) artificial neural networks as the predictive method. The analyzed data were extracted from the Electronic Health Records (EHR) of patients admitted to the University Hospital of Fuenlabrada from 2004 to 2019 and were modeled as Multivariate Time Series. A data-driven dimensionality reduction method is built by adapting three feature importance techniques from the literature to the considered data and proposing an algorithm for selecting the most appropriate number of features. This is done using LSTM sequential capabilities so that the temporal aspect of the features is taken into account. Furthermore, an ensemble of LSTMs is used to reduce the variance in performance. Our results indicate that the patient's admission information, the antibiotics administered during the ICU stay, and the previous antimicrobial resistance are the most important risk factors. Compared to other conventional dimensionality reduction schemes, our approach improves performance while reducing the number of features in most of the experiments. In essence, the proposed framework achieves, in a computationally cost-efficient manner, promising results for supporting decisions in this clinical task, characterized by high dimensionality, data scarcity, and concept drift.
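The ensemble step described above averages the outputs of several independently trained LSTMs to reduce variance in predictive performance. A minimal NumPy sketch of this averaging effect, using synthetic per-model resistance probabilities as stand-ins for trained LSTM outputs (the noise level, number of models, and cohort size are illustrative assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for K independently trained LSTMs: each "model"
# outputs a probability of antimicrobial resistance per ICU stay.
K, n_patients = 5, 200
true_p = rng.uniform(0.1, 0.9, size=n_patients)  # latent ground-truth probabilities

# Each model's prediction = truth + independent noise, clipped to [0, 1].
preds = np.clip(true_p + rng.normal(0.0, 0.15, size=(K, n_patients)), 0.0, 1.0)

# Ensemble prediction: simple average over the K models.
ensemble = preds.mean(axis=0)

# Averaging independent errors shrinks the deviation from the truth,
# so the ensemble MSE is below the mean single-model MSE.
single_mse = ((preds - true_p) ** 2).mean(axis=1)  # per-model MSE
ensemble_mse = ((ensemble - true_p) ** 2).mean()
print(f"mean single-model MSE: {single_mse.mean():.4f}")
print(f"ensemble MSE:          {ensemble_mse:.4f}")
```

In practice each ensemble member would be an LSTM trained on a different initialization or data resample; the averaging step itself is the same as in this sketch.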

【 License 】

Free   
