BMC Bioinformatics
Spectral consensus strategy for accurate reconstruction of large biological networks
Research
Edi Prifti¹, Séverine Affeldt¹, Jean-Daniel Zucker², Nataliya Sokolovska³
Affiliations: Integromics, Institute of Cardiometabolism and Nutrition, ICAN, Assistance Publique Hôpitaux de Paris, Pitié-Salpêtrière Hospital, 75013, Paris, France; Sorbonne Universités, UPMC University Paris 6, UMR S U1166 NutriOmics Team, 75013, Paris, France; IRD, UMI 209, UMMISCO, IRD France Nord, F-93143, Bondy, France; UMR S U1166 NutriOmics Team, INSERM, 75013, Paris, France
Keywords: Network reconstruction; Community-based method; Spectral theory; High-dimensional data; Microbiota
DOI: 10.1186/s12859-016-1308-y
Source: Springer
【 Abstract 】
Background: The last decades have witnessed an explosion of large-scale biological datasets whose analyses require the continuous development of innovative algorithms. Many of these high-dimensional datasets are related to large biological networks with few or no experimentally proven interactions. A striking example lies in recent gut bacterial studies, which have provided researchers with a plethora of information sources. Despite a deeper knowledge of microbiome composition, inferring bacterial interactions remains a critical step that faces significant issues, due in particular to high-dimensional settings, unknown gut bacterial taxa and unavoidable noise in sparse datasets. Such data types make any a priori choice of a learning method particularly difficult and underscore the need for the development of new scalable approaches.

Results: We propose a consensus method based on spectral decomposition, named the Spectral Consensus Strategy, to reconstruct large networks from high-dimensional datasets. This novel unsupervised approach can be applied to a broad range of biological networks, and the associated spectral framework provides scalability to diverse reconstruction methods. The results obtained on benchmark datasets demonstrate the value of our approach for high-dimensional cases. As a suitable example, we considered the human gut microbiome co-presence network. For this application, our method successfully retrieves biologically relevant relationships and gives new insights into the topology of this complex ecosystem.

Conclusions: The Spectral Consensus Strategy improves prediction precision and allows various reconstruction methods to scale to large networks. The integration of multiple reconstruction algorithms turns our approach into a robust learning method. Altogether, this strategy increases the confidence of interactions predicted from high-dimensional datasets without demanding computations.
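The abstract describes the Spectral Consensus Strategy only at a high level. As a rough illustration of the general idea it names (aggregating edge scores from several reconstruction methods and applying a spectral decomposition to the consensus), here is a minimal Python sketch; the function `spectral_consensus`, its rank-normalisation step and the `rank` parameter are hypothetical choices made for illustration, not the authors' implementation.

```python
import numpy as np

def spectral_consensus(score_matrices, rank=10):
    """Hypothetical sketch: average rank-normalised edge-score matrices
    from several reconstruction methods, then denoise the consensus with
    a truncated eigendecomposition (low-rank spectral approximation)."""
    normalised = []
    for s in score_matrices:
        # Rank-normalise each method's scores so they are comparable
        ranks = s.ravel().argsort().argsort().reshape(s.shape)
        normalised.append(ranks / ranks.max())
    consensus = np.mean(normalised, axis=0)        # simple averaging consensus
    consensus = (consensus + consensus.T) / 2      # enforce symmetry

    # Keep only the leading spectral components by eigenvalue magnitude
    vals, vecs = np.linalg.eigh(consensus)
    top = np.argsort(np.abs(vals))[::-1][:rank]
    return vecs[:, top] @ np.diag(vals[top]) @ vecs[:, top].T

# Toy usage with random score matrices standing in for three methods
rng = np.random.default_rng(0)
mats = [rng.random((50, 50)) for _ in range(3)]
adjacency_scores = spectral_consensus(mats, rank=5)
```

In this sketch the rank normalisation simply makes heterogeneous scoring scales comparable before averaging, and the truncated eigendecomposition acts as a denoising step on the consensus matrix; the actual method in the paper should be consulted for its precise consensus and spectral steps.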
【 License 】
CC BY
© The Author(s) 2016
【 Preview 】
| Files | Size | Format | View |
|---|---|---|---|
| RO202311091575950ZK.pdf | 6861KB | PDF | download |