Neurally driven synthesis of learned, complex vocalizations

Title: Neurally driven synthesis of learned, complex vocalizations
Publication Type: Journal Article
Year of Publication: 2021
Authors: Arneodo EM, Chen S, Brown DE, Gilja V, Gentner TQ
Journal: Current Biology
ISSN: 0960-9822
Keywords: bioprosthetics, birdsong, brain machine interfaces, electrophysiology, neural networks, nonlinear dynamics, speech
Abstract:

Brain machine interfaces (BMIs) hold promise to restore impaired motor function and serve as powerful tools to study learned motor skill. While limb-based motor prosthetic systems have leveraged nonhuman primates as an important animal model,1, 2, 3, 4 speech prostheses lack a similar animal model and are more limited in terms of neural interface technology, brain coverage, and behavioral study design.5, 6, 7 Songbirds are an attractive model for learned complex vocal behavior. Birdsong shares a number of unique similarities with human speech,8, 9, 10 and its study has yielded general insight into multiple mechanisms and circuits behind learning, execution, and maintenance of vocal motor skill.11, 12, 13, 14, 15, 16, 17, 18 In addition, the biomechanics of song production bear similarity to those of humans and some nonhuman primates.19, 20, 21, 22, 23 Here, we demonstrate a vocal synthesizer for birdsong, realized by mapping neural population activity recorded from electrode arrays implanted in the premotor nucleus HVC onto low-dimensional compressed representations of song, using simple computational methods that are implementable in real time. Using a generative biomechanical model of the vocal organ (syrinx) as the low-dimensional target for these mappings allows for the synthesis of vocalizations that match the bird’s own song. These results provide proof of concept that high-dimensional, complex natural behaviors can be directly synthesized from ongoing neural activity. This may inspire similar approaches to prosthetics in other species by exploiting knowledge of the peripheral systems and the temporal structure of their output.

URL: https://www.sciencedirect.com/science/article/pii/S0960982221007338
DOI: 10.1016/j.cub.2021.05.035
Category: IRG Funded
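
Note: The abstract describes a pipeline in which neural population activity from HVC is mapped by simple, real-time-capable computational methods onto a low-dimensional parameterization that drives a generative biomechanical model of the syrinx. The following is only a minimal illustrative sketch of that kind of pipeline, not the paper's implementation: the array dimensions, the ridge-regression decoder, and the amplitude/frequency-modulated sine "synthesizer" standing in for the biomechanical syrinx model are all assumptions made here for illustration.

import numpy as np

# Hypothetical dimensions, chosen only for this sketch
N_CHANNELS = 32   # electrode channels recorded in HVC
N_BINS = 200      # time bins of binned spike counts
N_PARAMS = 2      # low-dimensional drive, e.g., amplitude-like and pitch-like terms

rng = np.random.default_rng(0)

# Stand-in neural features: binned spike counts per channel
spike_counts = rng.poisson(lam=2.0, size=(N_BINS, N_CHANNELS)).astype(float)

def fit_ridge(X, Y, lam=1.0):
    # Closed-form ridge regression: W = (X^T X + lam I)^(-1) X^T Y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

# Fake training targets standing in for low-dimensional song-model parameter trajectories
target_params = rng.standard_normal((N_BINS, N_PARAMS))
W = fit_ridge(spike_counts, target_params)
decoded_params = spike_counts @ W          # shape (N_BINS, N_PARAMS)

def synthesize(params, fs=22050, bin_dur=0.005):
    # Toy synthesizer: amplitude- and frequency-modulated sine per time bin.
    # A stand-in for driving a generative biomechanical model of the syrinx.
    samples_per_bin = int(bin_dur * fs)
    phase = 0.0
    audio = []
    for a, b in params:
        amp = 1.0 / (1.0 + np.exp(-a))          # squash to (0, 1)
        freq = 2000.0 + 1500.0 * np.tanh(b)      # Hz, a songbird-like range
        for _ in range(samples_per_bin):
            phase += 2.0 * np.pi * freq / fs
            audio.append(amp * np.sin(phase))
    return np.asarray(audio)

waveform = synthesize(decoded_params)
print(waveform.shape)   # ~1 s of audio at 22.05 kHz

The design point emphasized in the abstract is the choice of low-dimensional target: rather than decoding sound directly, neural activity is mapped to the parameters of a generative model of the vocal organ, which then produces vocalizations matching the bird's own song.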