TweetyNet: A neural network that enables high-throughput, automated annotation of birdsong

Yarden Cohen, David Nicholson, Alexa Sanchioni, Emily Mallaber, Viktoriya Skidanova, Timothy J. Gardner

Research output: Contribution to journal › Article

Abstract

Songbirds have long been studied as a model system of sensory-motor learning. Many analyses of birdsong require time-consuming manual annotation of the individual elements of song, known as syllables or notes. Here we describe the first automated algorithm for birdsong annotation that is applicable to complex song such as canary song. We developed a neural network architecture, “TweetyNet”, that is trained with a small amount of hand-labeled data using supervised learning methods. We first show that TweetyNet achieves significantly lower error on Bengalese finch song than a similar method, using less training data, and maintains low error rates across days. Applied to canary song, TweetyNet achieves fully automated annotation, accurately capturing the complex statistical structure previously discovered in a manually annotated dataset. We conclude that TweetyNet will make it possible to ask a wide range of new questions focused on complex songs where manual annotation was impractical.
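The abstract describes TweetyNet only at a high level: a network trained with supervised learning on a small amount of hand-labeled song, which labels each time bin of a spectrogram with a syllable class. The sketch below is a minimal, illustrative PyTorch model in that spirit, assuming a convolutional front end followed by a bidirectional LSTM and a per-time-bin linear classifier; the class name, layer sizes, and other details are assumptions chosen for illustration, not the authors' published implementation.

```python
# Minimal, illustrative sketch of a TweetyNet-style frame classifier.
# All layer sizes and names are assumptions for illustration only.
import torch
from torch import nn


class TweetyNetSketch(nn.Module):
    """Maps a spectrogram (batch, 1, freq_bins, time_bins) to
    per-time-bin syllable-class logits (batch, n_classes, time_bins)."""

    def __init__(self, n_classes: int, n_freq_bins: int = 256, hidden_size: int = 64):
        super().__init__()
        # Convolutional front end; pooling only along the frequency axis
        # so the time resolution of the annotation is preserved.
        self.conv = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=(8, 1)),
            nn.Conv2d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=(8, 1)),
        )
        reduced_freq = n_freq_bins // 64  # after two 8x pools along frequency
        self.rnn = nn.LSTM(
            input_size=64 * reduced_freq,
            hidden_size=hidden_size,
            bidirectional=True,
            batch_first=True,
        )
        self.classifier = nn.Linear(2 * hidden_size, n_classes)

    def forward(self, spect: torch.Tensor) -> torch.Tensor:
        feats = self.conv(spect)                                 # (B, C, F', T)
        b, c, f, t = feats.shape
        feats = feats.permute(0, 3, 1, 2).reshape(b, t, c * f)   # (B, T, C*F')
        out, _ = self.rnn(feats)                                 # (B, T, 2*hidden)
        logits = self.classifier(out)                            # (B, T, n_classes)
        return logits.permute(0, 2, 1)                           # (B, n_classes, T)


if __name__ == "__main__":
    # Toy forward pass: one spectrogram with 256 frequency bins and 1000 time bins.
    model = TweetyNetSketch(n_classes=10)
    x = torch.randn(1, 1, 256, 1000)
    print(model(x).shape)  # torch.Size([1, 10, 1000])
```

In practice such a model would be trained with per-time-bin cross-entropy against framewise labels derived from hand-annotated syllable segments, and its predicted label sequences would be converted back into segment onsets, offsets, and labels to produce an annotation.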
Original language: English
Number of pages: 26
Journal: bioRxiv
Publication status: Submitted - 13 Oct 2020
Externally published: Yes
