Journal article details
Journal of Translational Medicine
DeepMPF: deep learning framework for predicting drug–target interactions based on multi-modal representation with meta-path semantic analysis
Research
Yan-Fang Ma1  Quan Zou2  Hai-Ru You3  Zhu-Hong You3  Chang-Qing Yu4  Xin-Fei Wang4  Jie Pan4  Yong-Jian Guan4  Zhong-Hao Ren4 
[1] Department of Galactophore, The Third People's Hospital of Gansu Province, Lanzhou 730020, China
[2] Institute of Fundamental and Frontier Sciences, University of Electronic Science and Technology of China, Chengdu 610054, China
[3] School of Computer Science, Northwestern Polytechnical University, Xi'an 710129, China
[4] School of Information Engineering, Xijing University, Xi'an 710100, China
Keywords: Drug–protein interactions; Multi-modal; Meta-path; Sequence analysis; Joint learning; Natural language processing
DOI: 10.1186/s12967-023-03876-3
Received: 2022-08-16; Accepted: 2023-01-05; Published: 2023
Source: Springer
【 Abstract 】

Background: Drug–target interaction (DTI) prediction has become a crucial prerequisite in drug design and drug discovery. However, traditional biological experiments are time-consuming and expensive, as the genomic and chemical spaces are vast and contain abundant complex interactions. To alleviate this problem, many computational methods have been developed to complement biological experiments and narrow the search space to a preferred candidate domain. However, most previous approaches cannot fully exploit the semantic information of association behavior under multiple schemas to represent the complex structure of heterogeneous biological networks. Additionally, DTI prediction based on a single modality cannot satisfy the demand for prediction accuracy.

Methods: We propose DeepMPF, a multi-modal representation framework based on meta-path semantic analysis, which effectively exploits heterogeneous information to predict DTIs. Specifically, we first construct a protein–drug–disease heterogeneous network composed of these three entity types. Feature information is then obtained from three views: the sequence modality, the heterogeneous structure modality and the similarity modality. We propose six representative meta-path schemas to preserve the high-order nonlinear structure and capture hidden structural information of the heterogeneous network. Finally, DeepMPF generates highly representative comprehensive feature descriptors and computes the probability of interaction through joint learning.

Results: To evaluate the predictive performance of DeepMPF, comparison experiments were conducted on four gold-standard datasets, on all of which our method achieves competitive performance. We also explore the influence of different feature embedding dimensions, learning strategies and classification methods. Meaningfully, drug-repositioning experiments on COVID-19 and HIV demonstrate that DeepMPF can be applied to real-world problems and assist drug discovery. Further molecular docking analysis strengthens the credibility of the drug candidates predicted by DeepMPF.

Conclusions: All results demonstrate the effective predictive capability of DeepMPF for drug–target interactions. It can be used as a tool to prescreen the most promising drug candidates for a given protein. The DeepMPF web server is freely available at http://120.77.11.78/DeepMPF/ to support further research.
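The Methods summary describes enumerating meta-path schemas over a protein–drug–disease heterogeneous network to capture hidden structural information. As an illustrative sketch only — the node names, edge lists, and the drug→protein→disease→protein schema below are invented for demonstration and are not taken from the paper — meta-path instance enumeration over typed adjacency lists could look like this:

```python
# Toy sketch (not the authors' code): enumerate concrete instances of a
# meta-path schema over a small protein-drug-disease heterogeneous network.
from collections import defaultdict

# Typed edges of a hypothetical heterogeneous network, keyed by
# (source type, target type); all identifiers are invented.
edges = {
    ("drug", "protein"): [("D1", "P1"), ("D1", "P2"), ("D2", "P2")],
    ("protein", "disease"): [("P1", "X1"), ("P2", "X1")],
    ("disease", "protein"): [("X1", "P1"), ("X1", "P2")],
}

# Build adjacency lists per typed relation for fast walks.
adj = defaultdict(lambda: defaultdict(list))
for (src_t, dst_t), pairs in edges.items():
    for u, v in pairs:
        adj[(src_t, dst_t)][u].append(v)

def meta_path_instances(schema, start):
    """Walk the typed adjacency lists along `schema` (a list of node types),
    returning every concrete path that starts at node `start`."""
    paths = [[start]]
    for src_t, dst_t in zip(schema, schema[1:]):
        paths = [p + [v] for p in paths for v in adj[(src_t, dst_t)][p[-1]]]
    return paths

# All drug -> protein -> disease -> protein walks starting from drug D1
# (4 walks in this toy network).
walks = meta_path_instances(["drug", "protein", "disease", "protein"], "D1")
print(walks)
```

In a framework like DeepMPF, such walks would then be fed to a sequence-embedding model so that nodes co-occurring on semantically meaningful paths receive similar representations; the sketch above covers only the enumeration step.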

【 License 】

CC BY   
© The Author(s) 2023

【 Preview 】

File list
Files Size Format View
RO202305117665988ZK.pdf 5138KB PDF download
Fig. 1 64KB Image download
Fig. 5 84KB Image download
Fig. 6 102KB Image download
Fig. 7 106KB Image download
Fig. 1 459KB Image download
【 Figures 】

Fig. 1, Fig. 5, Fig. 6, Fig. 7
Metrics
Downloads: 7; Views: 3