Adam Charles – Data science in neuroscience: from sensors to theory
When:
November 3, 2020 @ 12:00 pm – 1:00 pm
Abstract: The human brain has ~100×10^9 neurons. Unlike in the liver, this number is typically believed to matter for cognition and learning, as evidenced by the significant variability in the activity of collocated neurons. In furthering our understanding of neural systems, we have sought 1) technologies that constantly eclipse the capabilities of the past, and 2) meaningfully simplified models that can distil high-dimensional data into human-readable results. Data science algorithms and theory are becoming a centerpiece in both of these domains, driving new computational and adaptive data acquisition and demonstrating fundamental capabilities of mathematical models of computation. In this talk I will discuss two projects that highlight the potential for data science to have high impact on important, evolving problems in neuroscience.
First I will discuss “ML at the sensor”, i.e., adaptive sampling approaches that maximize neuron yield using state-of-the-art probes in non-human primates. This work points to the important role algorithms will have in bypassing sensing and bandwidth constraints given the unique and challenging conditions of real-time recording and experimentation with high-dimensional signals. Next I will discuss a fundamental model for interpreting cognitive data in both neuroscience and psychology: recurrent neural networks. Such models are often adopted based on mechanistic arguments (i.e., the brain is recurrent); however, the fundamental properties of these mathematical objects should play a more vital role in determining when and how they are deployed to elicit understanding. One such property is the information retention capability, or “short-term memory” (STM), of recurrent systems: a system cannot be considered a good model for a task that fundamentally needs more information than the system can hold! To this end I explore the particular case of echo-state networks and rigorously analyze how properties of the data impact the STM of random RNNs.
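To make the STM notion concrete, the following is a minimal sketch, not taken from the talk, of an echo-state network driven by random input, with its short-term memory estimated via the standard memory-capacity idea: train linear readouts to reconstruct delayed inputs from the current reservoir state and sum the squared correlations over delays. All sizes, the spectral radius, and the delay range are illustrative assumptions.

```python
# Minimal echo-state network and a crude short-term memory (memory capacity) estimate.
# All parameter values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N, T, max_delay = 100, 2000, 40          # reservoir size, time steps, delays tested

# Random recurrent and input weights; rescale recurrence to spectral radius 0.9
W = rng.standard_normal((N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.standard_normal(N)

# Drive the reservoir with i.i.d. uniform input and record its states
u = rng.uniform(-1, 1, T)
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x

# Memory capacity: for each delay k, fit a linear readout that reconstructs
# u(t - k) from x(t), then sum the squared correlations over delays
washout = 100
memory_capacity = 0.0
for k in range(1, max_delay + 1):
    X = states[washout:]                  # reservoir states x(t)
    y = u[washout - k : T - k]            # delayed inputs u(t - k)
    w_out, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = np.corrcoef(X @ w_out, y)[0, 1]
    memory_capacity += r ** 2

print(f"Estimated short-term memory capacity: {memory_capacity:.1f} (at most {N})")
```

Rerunning this sketch with structured (e.g., correlated or sparse) inputs in place of the i.i.d. sequence is one simple way to see how properties of the data change the effective STM of a random recurrent network, which is the kind of question the talk addresses rigorously.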
Bio: Adam Charles is an Assistant Professor in the Department of Biomedical Engineering at Johns Hopkins University. Adam completed his Master's and Bachelor's degrees at the Cooper Union in NYC, followed by a PhD in Electrical and Computer Engineering under the guidance of Chris Rozell at Georgia Tech. With a background in engineering, Adam continued to a postdoctoral position with Jonathan Pillow at the Princeton Neuroscience Institute. Adam's interests lie at the intersection of statistical signal processing, computational and theoretical neuroscience, and data science, with a focus on computational imaging and the development of next-generation algorithms for extracting meaning from complex neural data.