Fast deep neural correspondence for tracking and identifying neurons in C. elegans using semi-synthetic training
We present an automated method to track and identify neurons in C. elegans, called ‘fast Deep Neural Correspondence’ or fDNC, based on the transformer network architecture. The model is trained once on empirically derived semi-synthetic data and then predicts neural correspondence across held-out real animals. The same pre-trained model both tracks neurons across time and identifies corresponding neurons across individuals. Performance is evaluated against hand-annotated datasets, including NeuroPAL (Yemini et al., 2021). Using only position information, the method achieves 79.1% accuracy at tracking neurons within an individual and 64.1% accuracy at identifying neurons across individuals. Accuracy at identifying neurons across individuals is even higher (78.2%) when the model is applied to a dataset published by another group (Chaudhary et al., 2021). Accuracy reaches 74.7% on our dataset when using color information from NeuroPAL. Unlike previous methods, fDNC does not require straightening or transforming the animal into a canonical coordinate system. The method is fast, predicting correspondence in 10 ms, making it suitable for future real-time applications.
Main Authors: | Xinwei Yu, Matthew S Creamer, Francesco Randi, Anuj K Sharma, Scott W Linderman, Andrew M Leifer |
---|---|
Format: | Article |
Language: | English |
Published: | eLife Sciences Publications Ltd, 2021-07-01 |
Series: | eLife |
ISSN: | 2050-084X |
Subjects: | computer vision, deep learning, artificial neural network, tracking, registration, transformer |
Online Access: | https://elifesciences.org/articles/66410 |
id: | doaj-2c551f23d16046408cff743aaf6c5dc6 |
DOI: | 10.7554/eLife.66410 |
Author details: |
- Xinwei Yu (https://orcid.org/0000-0002-8699-3546): Department of Physics, Princeton University, Princeton, United States
- Matthew S Creamer (https://orcid.org/0000-0002-9458-0629): Princeton Neuroscience Institute, Princeton University, Princeton, United States
- Francesco Randi (https://orcid.org/0000-0002-6200-7254): Department of Physics, Princeton University, Princeton, United States
- Anuj K Sharma (https://orcid.org/0000-0001-5061-9731): Department of Physics, Princeton University, Princeton, United States
- Scott W Linderman (https://orcid.org/0000-0002-3878-9073): Department of Statistics, Stanford University, Stanford, United States; Wu Tsai Neurosciences Institute, Stanford University, Stanford, United States
- Andrew M Leifer (https://orcid.org/0000-0002-5362-5093): Department of Physics, Princeton University, Princeton, United States; Princeton Neuroscience Institute, Princeton University, Princeton, United States
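As a toy illustration of the correspondence task described in the abstract, the sketch below matches the neurons of an unlabelled "test" worm to a labelled template using only 3D positions. This is not the fDNC model: where fDNC uses a trained transformer, this stand-in uses a hypothetical untrained linear embedding `W` and nearest-neighbour matching, purely to show what "predicting neural correspondence" means operationally.

```python
import numpy as np

rng = np.random.default_rng(0)

def match(template, test, W):
    """For each test neuron, return the index of its best template match.

    template: (N, 3) labelled neuron positions; test: (M, 3) unlabelled
    positions; W: (3, d) embedding matrix (here a random stand-in for a
    learned model).
    """
    f_tmpl = template @ W                  # (N, d) embedded template
    f_test = test @ W                      # (M, d) embedded test worm
    # Pairwise squared distances in embedding space, shape (M, N).
    d2 = ((f_test[:, None, :] - f_tmpl[None, :, :]) ** 2).sum(-1)
    return d2.argmin(axis=1)               # greedy nearest-neighbour assignment

# Template: 5 neurons at random 3D positions. Test worm: the same
# neurons, permuted and slightly jittered (a crude stand-in for a
# posture change between recordings).
template = rng.normal(size=(5, 3))
perm = rng.permutation(5)
test = template[perm] + 1e-3 * rng.normal(size=(5, 3))

W = rng.normal(size=(3, 8))                # hypothetical untrained embedding
pred = match(template, test, W)
print(np.array_equal(pred, perm))          # True: labels recovered
```

With large posture deformations this naive matching breaks down, which is the motivation for learning the correspondence model from semi-synthetic deformations instead, as the paper does.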
Collection: | DOAJ |