The Brain Adjusts Coding Precision To Save Energy

Post by Leanna Kalinowski

The takeaway

When access to food is limited, the brain must conserve energy by reducing information processing. Neurons in the visual cortex do this following food restriction by decreasing the coding precision of visual information.

What's the science?

The brain uses considerable energy when processing information from the world around us: it consumes 20% of the body’s total caloric intake while constituting only around 2% of the body’s total mass. Given the high energy cost of neural processing, coupled with limited energy resources, the brain is thought to have evolved an energy-efficient strategy that maximizes the amount of information transmitted per unit of energy used (i.e., ATP, the main source of energy for cells). During times of food scarcity, neuronal networks in the brain are believed to conserve energy by reducing information processing. Evidence suggests this is the case in invertebrates, but it is unclear how the mammalian brain regulates information processing and energy use when access to food is limited. This week in Neuron, Padamsey and colleagues used the mouse primary visual cortex as a model system to examine how food restriction affects information processing and energy consumption in neuronal networks.

How did they do it?

Adult male mice were separated into one of two groups: a control group, given unrestricted access to food, and a food-restricted group, given limited access to food that led to a 15% reduction in body weight over 2-3 weeks. First, the researchers examined ATP use in the primary visual cortex by taking whole-cell voltage-clamp recordings of neurons in this brain region while exposing mice to videos of the outdoors (i.e., a natural setting). These recordings measured excitatory currents (indicative of neural activity in the visual cortex), which pose the greatest ATP burden on the cortex. Next, the researchers measured visual cortex activity using two-photon calcium imaging during a coding precision task. In this task, mice were shown videos of the outdoors or of their home cage, and the researchers applied a maximum likelihood decoder to the recorded neuronal activity to determine how well distinct visual stimuli were encoded in the brains of food-restricted mice. The authors also tested visual discrimination behaviorally using a two-alternative forced-choice task. Finally, the researchers examined the role of the hormone leptin in information processing and energy use. They did this by (1) measuring serum leptin levels in control vs. food-restricted mice and (2) delivering synthetic leptin for 10 days and re-assessing the coding precision task.
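
For readers curious about the decoding step, the sketch below illustrates the general idea of maximum likelihood decoding of stimuli from population activity. It is a minimal illustration rather than the authors’ actual pipeline: it assumes each neuron’s response is an independent Poisson count given the stimulus, and all data in the example are simulated.

```python
# Minimal sketch of maximum likelihood decoding of visual stimuli from
# population activity. Illustrative only (not the authors' pipeline):
# it assumes each neuron's response to a stimulus is an independent
# Poisson count, and the data below are simulated.
import numpy as np

def fit_tuning(train_responses, train_labels, n_stimuli):
    """Estimate each neuron's mean response to each stimulus from training trials."""
    n_neurons = train_responses.shape[1]
    tuning = np.zeros((n_stimuli, n_neurons))
    for s in range(n_stimuli):
        tuning[s] = train_responses[train_labels == s].mean(axis=0)
    return tuning + 1e-6  # avoid log(0)

def decode(response, tuning):
    """Return the stimulus whose Poisson likelihood best explains a response vector."""
    # log P(r | s) = sum_i [ r_i * log(lambda_s,i) - lambda_s,i ] + const
    log_lik = response @ np.log(tuning).T - tuning.sum(axis=1)
    return int(np.argmax(log_lik))

# Toy usage: two "scenes", 50 neurons, Poisson-noisy responses
rng = np.random.default_rng(0)
true_tuning = rng.uniform(0.5, 5.0, size=(2, 50))
labels = rng.integers(0, 2, size=200)
responses = rng.poisson(true_tuning[labels]).astype(float)
tuning = fit_tuning(responses[:150], labels[:150], n_stimuli=2)
accuracy = np.mean([decode(r, tuning) == y
                    for r, y in zip(responses[150:], labels[150:])])
print(f"decoding accuracy on held-out trials: {accuracy:.2f}")
```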

What did they find?

First, the researchers found that mouse visual cortex neurons save ATP by decreasing excitatory postsynaptic currents through a reduction in AMPA receptor conductance. The ATP use associated with these currents was reduced by 29% in food-restricted mice compared to controls, while the rate of neuronal spiking remained similar between the two groups owing to compensatory changes in neuronal input resistance and resting potential. Next, they found that these ATP savings were accompanied by reduced decoding accuracy and visual discrimination ability in the food-restricted mice: in particular, decoding of similar scenes from the same environment was impaired. Taken together, these results demonstrate that when food is scarce, neurons reduce ATP expenditure on synaptic currents at the expense of coding precision. Finally, they found that this reduction in coding precision was associated with reduced levels of the hormone leptin and was reversed following administration of synthetic leptin.

What's the impact?

This study is the first to show that, in times of food scarcity, coding precision in the mammalian visual cortex is reduced in order to save energy. This suggests that the brain is able to dynamically adapt coding precision and energy use in a context-dependent manner, which is overall beneficial to survival. Further work is needed to understand the full impact of food restriction on total brain energy use and information processing, particularly in other cell types and brain regions.

The Role of the Immune System in Neurodegeneration

Post by Andrew Vo

The takeaway

The immune system may play a role in neurodegenerative diseases such as Lewy body dementia. The mechanism by which an inflammatory response is trafficked to the brains of these patients could be a therapeutic target.

What's the science?

Lewy body dementia (LBD) is distinguished by the abnormal accumulation of α-synuclein protein in the brain, leading to changes in patients’ memory and behavior. Animal studies have suggested a role for the immune system in LBD, although the mechanism by which T cells (specialized cells in our bodies that identify and attack substances carrying foreign “antigens,” or markers) migrate and function in the brain remains unknown. This week in Science, Gate et al. examined biological samples collected from living patients and postmortem brains to investigate the relationship between immunity-related T cells and LBD pathology.

How did they do it?

The authors first compared cognitive function and cerebrospinal fluid (CSF) levels of neurofilament light chain (a protein marker of neurodegeneration) in LBD patients and healthy controls. To directly test whether the immune system interacted with LBD pathology, they then examined postmortem brains of LBD patients for T cell localization with α-synuclein accumulation. Next, they used sequencing analyses to measure T cell binding molecules, namely CXCR4 and CXCL12, in the CSF and meninges (protective membranes inside the skull and surrounding the brain) of LBD patients. Finally, they measured T cell immune activation in CSF samples from LBD patients through stimulation with a pool of peptides derived from the α-synuclein protein.

What did they find?

LBD patients were found to have reduced cognitive function and increased neurofilament light chain levels in their CSF compared with healthy controls. In postmortem brains, T cells were observed next to α-synuclein deposits and were mostly concentrated near the substantia nigra (a brain region containing dopamine neurons that degenerate in Parkinson’s disease) in LBD patients. This demonstrated a link between the immune system and brain pathology in LBD.

Sequencing of T cells in the CSF revealed greater expression of CXCR4 and CXCL12 markers in LBD patients compared to controls. A similar pattern was found in the meninges of LBD patients and specifically localized to the brain’s vasculature (or blood supply). Increased CXCL12 levels measured in CSF were related to lower cognitive function and higher neurofilament light chain levels. Together, these results show that increased CXCR4-CXCL12 signaling is associated with neurodegeneration in LBD.

Even before stimulation with the α-synuclein peptide pool, T cells in the CSF of LBD patients showed greater baseline activation than those of controls. This immune activation was further enhanced following peptide stimulation. Sequencing of these stimulated cells showed increased expression of interleukin 17A, which is related to inflammatory responses mediated by TH17 cells. Examining postmortem brains, the researchers found T cells localized near TH17 cells in the substantia nigra of LBD patients. These findings suggest the involvement of TH17 cells and immunoreactivity in LBD neurodegeneration.

What's the impact?

In summary, this study demonstrated a link between the immune system, specifically CXCR4-CXCL12 signaling that recruits T cells to the brain, and neurodegeneration in LBD. The authors highlight a pathological mechanism in human patients that was previously only established in animal models of neurodegeneration. This signaling mechanism may be a therapeutic target for the treatment of LBD.


Morning Larks and Night Owls: The Impact of Chronotype

Post by Anastasia Sares

The takeaway

Chronotype, or day/night preference, is a genetically influenced trait that affects how active and alert we are at various points in the day. In the past few years, studies have linked chronotype to both mental and physical health, and people can suffer adverse effects when their daily schedule doesn’t align with their chronotype—a phenomenon called “social jet lag.”

How do we measure chronotype?

The simplest way to measure a person’s chronotype is by asking about their sleeping habits. It’s important to do this for both workdays and free days, as people may change their habits between the two and may try to make up for missed sleep on their free days by oversleeping. There are more objective ways to measure chronotype, but they take longer. Actimetry measures the amount of movement a person makes throughout the day using a wearable device. Another popular measure is dim light melatonin onset, or DLMO: the time in the evening when levels of the sleep-related hormone melatonin begin to rise, measured under dim lighting because bright light suppresses melatonin. All of these measures agree with each other fairly well, including an ultra-short version of the chronotype questionnaire that was recently developed for clinics and for studies that don’t have much extra time.
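
For the questionnaire-based measure, chronotype is typically summarized as the midpoint of sleep on free days, corrected for the sleep debt built up over the work week (the Munich Chronotype Questionnaire’s “corrected mid-sleep on free days,” MSFsc). The sketch below shows that style of calculation; the bed times and the 5-workday/2-free-day split are illustrative assumptions, not values from the cited studies.

```python
# Minimal sketch of an MCTQ-style chronotype score (MSFsc): mid-sleep on
# free days, corrected for workweek sleep debt. Times are in hours past
# midnight; the example values below are illustrative assumptions.
def mid_sleep(onset, duration):
    """Midpoint of a sleep episode, as clock time in hours past midnight."""
    return (onset + duration / 2) % 24

def msfsc(onset_work, dur_work, onset_free, dur_free, workdays=5, freedays=2):
    """Corrected mid-sleep on free days, the usual questionnaire chronotype score."""
    avg_dur = (dur_work * workdays + dur_free * freedays) / (workdays + freedays)
    msf = mid_sleep(onset_free, dur_free)
    if dur_free > dur_work:
        # Part of free-day sleep is "oversleep" repaying workday sleep debt,
        # so shift the midpoint back by half of that extra sleep.
        return (msf - (dur_free - avg_dur) / 2) % 24
    return msf

# Example: sleeps 23:30-06:30 on workdays and 01:00-10:00 on free days
score = msfsc(onset_work=23.5, dur_work=7.0, onset_free=1.0, dur_free=9.0)
print(f"chronotype (MSFsc): {score:.2f} h past midnight")  # ~4.79, a fairly late type
```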

Chronotype also varies hugely over the lifespan, with adolescents having very late chronotypes and older adults having earlier chronotypes. People in urban environments have later and more varied chronotypes than those in rural environments, and even a person’s location within a time zone (eastern vs. western edge) can affect their chronotype due to small differences in sunlight hours.

How is chronotype related to health outcomes?

Late chronotypes (night owls) are prone to more adverse health outcomes like hypertension and depression. However, it is not clear that being a night owl itself causes these effects. The less our daily schedule is synchronized with our natural sleep rhythm, the more stress we experience, and it is this stress that can lead to health problems. This mismatch is called “social jet lag,” and it is more likely to affect night owls because societal structures tend to follow earlier schedules (the writer of this article, being a moderately late chronotype, still remembers getting up at 6:15 am during high school with much chagrin). Social jet lag is at its most extreme in shift workers, like hospital staff who work during the night.
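
Social jet lag is usually quantified as the absolute difference between mid-sleep on free days and mid-sleep on workdays. Here is a short sketch with illustrative times; the bedtimes are assumptions, and only the 6:15 am wake-up echoes the anecdote above.

```python
# Minimal sketch of the usual social jet lag measure: the absolute gap
# between mid-sleep on free days and on workdays. Bedtimes are
# illustrative assumptions; only the 6:15 am wake-up comes from the text.
def mid_sleep(onset, duration):
    """Midpoint of a sleep episode, as clock time in hours past midnight."""
    return (onset + duration / 2) % 24

ms_work = mid_sleep(onset=23.0, duration=7.25)  # asleep 23:00, up at 06:15 -> ~02:37
ms_free = mid_sleep(onset=1.0, duration=8.5)    # asleep 01:00, up at 09:30 -> 05:15
# Simple difference; assumes both midpoints fall after midnight.
social_jet_lag = abs(ms_free - ms_work)
print(f"social jet lag: {social_jet_lag:.1f} h")  # ~2.6 hours of weekly "time-zone" shift
```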

Chronotype isn’t just a matter of psychology; every cell in the body has a circadian rhythm and is affected by these day-night cycles. Under social jet lag, the body’s cellular clocks adjust at different rates and cannot keep up with the switch between workdays and free days. It is these unsynchronized cellular clocks that may be responsible for the health risks of social jet lag. Worldwide, workers’ sleep habits shifted later when they worked from home during the COVID-19 pandemic, suggesting that society as a whole had been operating under social jet lag.

How can we lessen the impact of social jet lag?

Chronotype can be manipulated to some degree. Being outside during the day can help to naturally regulate your sleep cycle. On the other hand, exposure to light before bed can disturb natural sleep-wake cycles, and so it is helpful to limit your evening screen time if you want to shift your schedule earlier (put away your phone!). A regular sleep schedule will also lower your risk for adverse health effects, especially if your work hours are very early or very late compared to your natural rhythm. However, don’t sacrifice your free-day sleep in order to keep a normal wake-up time. Rather, try to go to sleep at the same time each night.

On a societal level, we can take chronotype into account in school start times and in assigning shifts to workers, something that is already starting to be done. Some researchers are also calling for governments to abolish daylight saving time, which can cause a host of sleep-related problems.

What's the impact?

In our industrialized society, many people live predominantly indoors and are more detached from natural day-night cycles, making their chronotypes later and more varied. It is important to provide daily structures that accommodate differences in chronotype; doing so will have a significant impact on human health and well-being, as well as on work productivity and quality.

References

  1. Roenneberg, T., Pilz, L. K., Zerbini, G., & Winnebeck, E. C. (2019). Chronotype and social jetlag: A (self-) critical review. Biology, 8(3), 1–19. https://doi.org/10.3390/biology8030054
  2. Shahid, A., Wilkinson, K., Marcu, S., & Shapiro, C. M. (2011). Munich Chronotype Questionnaire (MCTQ). STOP, THAT and One Hundred Other Sleep Scales, 245–247. https://doi.org/10.1007/978-1-4419-9893-4_58
  3. Ghotbi, N., Pilz, L. K., Winnebeck, E. C., Vetter, C., Zerbini, G., Lenssen, D., … Roenneberg, T. (2020). The µMCTQ: An Ultra-Short Version of the Munich ChronoType Questionnaire. Journal of Biological Rhythms, 35(1), 98–110. https://doi.org/10.1177/0748730419886986
  4. Kalmbach, D. A., Schneider, L. D., Cheung, J., Bertrand, S. J., Kariharan, T., Pack, A. I., & Gehrman, P. R. (2017). Genetic Basis of Chronotype in Humans: Insights From Three Landmark GWAS. Sleep, 40(2). https://doi.org/10.1093/sleep/zsw048
  5. Wittmann, M., Dinich, J., Merrow, M., & Roenneberg, T. (2006). Social jetlag: Misalignment of biological and social time. Chronobiology International, 23(1–2), 497–509. https://doi.org/10.1080/07420520500545979
  6. Hulsegge, G., Loef, B., van Kerkhof, L. W., Roenneberg, T., van der Beek, A. J., & Proper, K. I. (2019). Shift work, sleep disturbances and social jetlag in healthcare workers. Journal of Sleep Research, 28(4). https://doi.org/10.1111/jsr.12802
  7. Korman, M., Tkachev, V., Reis, C., Komada, Y., Kitamura, S., Gubin, D., … Roenneberg, T. (2020). COVID-19-mandated social restrictions unveil the impact of social time pressure on sleep and body clock. Scientific Reports, 10(1), 1–10. https://doi.org/10.1038/s41598-020-79299-7
  8. Roenneberg, T., Wirz-Justice, A., Skene, D. J., Ancoli-Israel, S., Wright, K. P., Dijk, D. J., … Klerman, E. B. (2019). Why Should We Abolish Daylight Saving Time? Journal of Biological Rhythms, 34(3), 227–230. https://doi.org/10.1177/0748730419854197