BMC Bioinformatics
Automatic learning of pre-miRNAs from different species
Research Article
Ivani de O. N. Lopes1, Alexander Schliep2, André C. P. de L. F. de Carvalho3
[1] Empresa Brasileira de Pesquisa Agropecuária, Embrapa Soja, Caixa Postal 231, CEP 86001-970, Londrina, PR, Brasil
[2] Department of Computer Science, Rutgers University, 110 Frelinghuysen Road, Piscataway, NJ 08854, USA
[3] Instituto de Ciências Matemáticas e de Computação, Avenida Trabalhador São-carlense, 400 - Centro, São Carlos, SP, Brasil
Keywords: Random Forest; Predictive Accuracy; Minimum Free Energy; Test Species; Negative Sequence
DOI: 10.1186/s12859-016-1036-3
Received: 2015-07-30; Accepted: 2016-04-12; Published: 2016
Source: Springer
Abstract

Background: Discovery of microRNAs (miRNAs) relies on predictive models for characteristic features of miRNA precursors (pre-miRNAs). The short length of miRNA genes and the lack of pronounced sequence features complicate this task. To accommodate the peculiarities of plant and animal miRNA systems, tools for the two kingdoms have evolved differently. However, these tools are biased towards the species for which they were primarily developed and, consequently, their predictive performance on data sets from other species of the same kingdom may be lower. While these biases are intrinsic to the species, characterizing them can lead to computational approaches that diminish their negative effect on the accuracy of pre-miRNA predictive models. In this study, we investigate how 45 predictive models, induced for data sets from 45 species distributed across eight subphyla/classes, perform when applied to species other than the one used in their induction.

Results: Our computational experiments show that the separability of pre-miRNA and pseudo pre-miRNA instances is species-dependent and that no feature set performs well for all species, even within the same subphylum/class. To mitigate this species dependency, we show that an ensemble of classifiers reduced the classification errors for all 45 species. As the ensemble members were obtained using meaningful, yet computationally viable, feature sets, the ensembles also have a lower computational cost than individual classifiers that rely on energy stability parameters, which are prohibitively expensive to compute in large-scale applications.

Conclusion: In this study, combining multiple pre-miRNA feature sets and multiple learning biases enhanced the predictive accuracy of pre-miRNA classifiers for 45 species. This is a promising approach to incorporate into miRNA discovery tools, making them more accurate and less species-dependent. The material to reproduce the results of this paper can be downloaded from http://dx.doi.org/10.5281/zenodo.49754.
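The Results describe an ensemble that combines classifiers trained on different pre-miRNA feature sets. Below is a minimal sketch of that idea, assuming scikit-learn and Random Forest base learners (Random Forest appears in the keywords); the feature-set names, feature matrices, and vote-combination rule are hypothetical placeholders, not the authors' actual pipeline or feature definitions.

```python
# Minimal sketch: one Random Forest per pre-miRNA feature set,
# combined by unweighted majority vote. Feature extraction and data
# loading are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def majority_vote(predictions):
    """Combine per-classifier 0/1 label vectors by majority vote.

    Ties (possible with an even number of members) fall to the
    negative class, i.e. pseudo pre-miRNA.
    """
    votes = np.sum(predictions, axis=0)
    return (votes > len(predictions) / 2).astype(int)

def train_ensemble(feature_sets, y_train):
    """feature_sets: hypothetical dict mapping a feature-set name to an
    (n_sequences, n_features) matrix computed from candidate hairpins,
    e.g. sequence-composition vs. structural features. Energy-based
    features could be omitted to keep the ensemble cheap, as the
    abstract argues.
    """
    models = {}
    for name, X in feature_sets.items():
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X, y_train)
        models[name] = clf
    return models

def predict_ensemble(models, test_feature_sets):
    """Apply each member to its own feature matrix, then vote."""
    preds = [models[name].predict(X) for name, X in test_feature_sets.items()]
    return majority_vote(preds)
```

With, say, three feature sets, the majority vote needs at least two members to agree before a candidate is labeled a true pre-miRNA; the paper's actual combination scheme and feature sets may differ from this illustration.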
License
CC BY
© Lopes et al. 2016
Preview

Files | Size | Format | View |
---|---|---|---|
RO202311105555580ZK.pdf | 2262 KB | PDF | download |