Neuroscientists Shed Light on How the Brain Links Objects with Meaning and Value

Jacqueline Mitchell, (617) 667-7306

OCTOBER 11, 2018

BIDMC Research Briefs showcase groundbreaking scientific advances that are transforming medical care.

As visual information streams in through the eyes, multiple groups of neurons process different kinds of information. In a new paper published in the journal Neuron, a team of researchers at Beth Israel Deaconess Medical Center (BIDMC) used a sophisticated microscope to watch the activity of hundreds of neurons in the lateral visual association cortex of mice as they viewed images associated either with food or with non-food reward outcomes. The researchers observed that one subset identifies shapes, colors and objects – a colorful roadside sign, for example – while another group assigns meaning to those visual stimuli, so that the colorful roadside sign becomes a logo signaling the availability of food.

What’s more, the team – led by corresponding author Mark Andermann, PhD – observed that the object-identifying neurons responded equally to a given stimulus regardless of whether the mice were hungry or sated. In contrast, the neurons tracking an object’s predicted outcome – is it a traffic sign or a restaurant logo that signals the potential availability of food? – showed highly flexible responses that depended on the animals’ hunger state.

“The findings were quite surprising, because we expected that both groups of neurons would be more sensitive to food-predicting cues and hunger state,” said Andermann, a member of the Division of Endocrinology, Diabetes and Metabolism at BIDMC and an Associate Professor of Medicine at Harvard Medical School. “But that was true only in one subset of neurons. Understanding how this group of neurons becomes attuned to initially arbitrary shapes during learning is at the core of understanding how the brain’s representations of objects in the world become imbued with meaning and value.”

Based on these findings, Andermann’s team developed a hypothetical neural wiring diagram in which different parts of the brain send specific inputs either to identity-tracking neurons or to outcome-tracking neurons.

“Our observations suggest there is a division of labor between neighboring neurons that react to the presentation of the same image,” Andermann added. “One neuron may react to the sudden appearance of an unexpected shape, while another may react to the cue that something valuable is now available. We hope that this work can help us understand how the brain links the identity of a given object with the meaning that we learn to associate with that object.”

Investigators included co-lead authors Rohan N. Ramesh and Christian R. Burgess, as well as Arthur U. Sugden and Michael Gyetvan, of BIDMC’s Division of Endocrinology, Diabetes and Metabolism.

Support was provided by a Davis Family Foundation Postdoctoral Fellowship (CRB), NIH F31 105678 (RNR), NIH T32 5T32DK007516 (AUS), an NIH New Innovator Award DP2 DK105570 and R01 DK109930, a McKnight Scholar Award, a Pew Scholar Award, a Smith Family Foundation Award, and grants from the Klarman Family Foundation, the American Federation for Aging Research, and the Boston Nutrition and Obesity Research Center (MLA).

About Beth Israel Deaconess Medical Center

Beth Israel Deaconess Medical Center is a leading academic medical center, where extraordinary care is supported by high-quality education and research. BIDMC is a teaching affiliate of Harvard Medical School, and consistently ranks as a national leader among independent hospitals in National Institutes of Health funding. BIDMC is the official hospital of the Boston Red Sox.

Beth Israel Deaconess Medical Center is a part of Beth Israel Lahey Health, a health care system that brings together academic medical centers and teaching hospitals, community and specialty hospitals, more than 4,700 physicians and 39,000 employees in a shared mission to expand access to great care and advance the science and practice of medicine through groundbreaking research and education.