Journal Article Details
eLife
Fast deep neural correspondence for tracking and identifying neurons in C. elegans using semi-synthetic training
Francesco Randi1  Xinwei Yu1  Anuj K Sharma1  Andrew M Leifer1,2  Scott W Linderman3,4  Matthew S Creamer2
[1]Department of Physics, Princeton University, Princeton, United States
[2]Princeton Neuroscience Institute, Princeton University, Princeton, United States
[3]Department of Statistics, Stanford University, Stanford, United States
[4]Wu Tsai Neurosciences Institute, Stanford University, Stanford, United States
Keywords: computer vision; deep learning; artificial neural network; tracking; registration; transformer; C. elegans
DOI: 10.7554/eLife.66410
Source: eLife Sciences Publications, Ltd
【 Abstract 】
We present an automated method to track and identify neurons in C. elegans, called ‘fast Deep Neural Correspondence’ or fDNC, based on the transformer network architecture. The model is trained once on empirically derived semi-synthetic data and then predicts neural correspondence across held-out real animals. The same pre-trained model both tracks neurons across time and identifies corresponding neurons across individuals. Performance is evaluated against hand-annotated datasets, including NeuroPAL (Yemini et al., 2021). Using only position information, the method achieves 79.1% accuracy at tracking neurons within an individual and 64.1% accuracy at identifying neurons across individuals. Accuracy at identifying neurons across individuals is even higher (78.2%) when the model is applied to a dataset published by another group (Chaudhary et al., 2021). Accuracy reaches 74.7% on our dataset when using color information from NeuroPAL. Unlike previous methods, fDNC does not require straightening or transforming the animal into a canonical coordinate system. The method is fast and predicts correspondence in 10 ms, making it suitable for future real-time applications.
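To make the general idea concrete, below is a minimal, hypothetical PyTorch sketch of a transformer-based point-set correspondence model in the spirit of the abstract: neuron positions from a labeled template worm and from a test frame (or a different individual) are embedded, jointly encoded by a transformer, and each test neuron is assigned the template label with the highest similarity. The class name, layer sizes, and matching head are illustrative assumptions, not the authors' fDNC implementation, and only position information is used here (the paper additionally supports NeuroPAL color).

```python
# Hedged sketch: a minimal transformer-based point-set correspondence model.
# All names and hyperparameters are illustrative assumptions, not the fDNC code.
import torch
import torch.nn as nn

class PointCorrespondence(nn.Module):
    def __init__(self, d_model=128, nhead=4, num_layers=4):
        super().__init__()
        self.embed = nn.Linear(3, d_model)        # embed (x, y, z) neuron positions
        self.segment = nn.Embedding(2, d_model)   # distinguish template vs. test worm
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)

    def forward(self, template_xyz, test_xyz):
        # template_xyz: (B, N, 3), test_xyz: (B, M, 3)
        B, N, _ = template_xyz.shape
        M = test_xyz.shape[1]
        tokens = torch.cat([self.embed(template_xyz), self.embed(test_xyz)], dim=1)
        seg_ids = torch.cat([torch.zeros(B, N, dtype=torch.long),
                             torch.ones(B, M, dtype=torch.long)], dim=1)
        h = self.encoder(tokens + self.segment(seg_ids))
        h_template, h_test = h[:, :N], h[:, N:]
        # dot-product similarity between every test neuron and every template neuron
        logits = h_test @ h_template.transpose(1, 2)   # (B, M, N)
        return logits.softmax(dim=-1)                  # correspondence probabilities

# Usage: assign each neuron in a new frame/worm to its most likely template label.
model = PointCorrespondence()
template = torch.randn(1, 100, 3)   # a labeled reference worm (random stand-in data)
test = torch.randn(1, 95, 3)        # a held-out frame or individual
labels = model(template, test).argmax(dim=-1)   # (1, 95) indices into the template
```

In this kind of setup, training on semi-synthetic pairs (a real worm plus a deformed, relabeled copy of itself) would supervise the softmax output with the known correspondences; at test time a single forward pass yields an assignment for every detected neuron.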
【 License 】

CC BY   

【 Preview 】
Attachment list
File                      Size      Format
RO202110266663430ZK.pdf   2265 KB   PDF