Decoding of Natural Sounds in Congenitally Blind Individuals

Post by Stephanie Williams

What's the science?

Previous work has shown that patterns of brain activity measured with functional magnetic resonance imaging (fMRI) can be used to classify sounds. Typically, these studies use complex sounds (traffic, nature sounds) as the stimuli, and classifiers (machine learning models) are built to predict categories of sounds. For example, fMRI could be used to predict whether an individual was listening to traffic noise or to a group of people speaking. One region that can be used for this decoding of auditory information is the early “visual” cortex (V1, V2, V3), which suggests that early visual cortex processes non-visual auditory information. Earlier work on auditory decoding in the early visual cortex was performed in sighted individuals only, leaving open the question of whether the same auditory information could be decoded from the visual cortex of blind individuals. This week in Current Biology, Vetter and colleagues show that sound decoding can be performed in both sighted and blind individuals with similar accuracy.

How did they do it?                             

The authors collected fMRI data from 8 congenitally blind individuals while they listened to three different natural scene sounds. They compared these data to previously published data (N=10) from sighted individuals, which were collected with similar stimuli and MRI acquisition parameters. The sounds were 1) a bird singing and a stream, 2) people talking without any clear semantic information, and 3) traffic noise with cars and motorbikes. Participants listened to four rounds (‘runs’) of 18 randomized repetitions of the three sounds. The authors focused their analysis primarily on three visual areas called V1, V2, and V3, further subdividing each into three eccentricities: foveal, peripheral, and far peripheral regions. They also conducted whole-brain analyses, searching voxel by voxel across the brain, rather than within predefined regions, for voxels that could be used to predict which sounds the subjects were listening to. The authors used multivariate pattern analysis (MVPA) to predict which of the three sounds participants were listening to based on the activity patterns derived from the fMRI data. They trained their classifier on three of the four runs and tested it on the left-out fourth run for each subject. They compared decoding accuracy in the early visual cortex to decoding accuracy in the auditory cortex (a positive control) and the motor cortex (a negative control). The authors then analyzed how the sounds were represented in the eccentricity pattern across the early visual cortex.
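The train-on-three-runs, test-on-the-left-out-run scheme described above is standard leave-one-run-out cross-validation. A minimal sketch with simulated data (a logistic-regression classifier stands in for the paper's actual MVPA classifier, and all array sizes and values here are placeholders, not the study's data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated data: 4 runs x 18 trials x 100 voxels, 3 sound classes.
# In the real analysis each trial would be an fMRI activity pattern
# from a region of interest (e.g., V1); here values are random.
n_runs, n_trials, n_voxels = 4, 18, 100
X = rng.normal(size=(n_runs, n_trials, n_voxels))
y = np.tile(np.arange(n_trials) % 3, (n_runs, 1))  # labels 0, 1, 2

# Leave-one-run-out cross-validation: train on 3 runs, test on the 4th.
accuracies = []
for test_run in range(n_runs):
    train = [r for r in range(n_runs) if r != test_run]
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X[train].reshape(-1, n_voxels), y[train].ravel())
    accuracies.append(clf.score(X[test_run], y[test_run]))

mean_accuracy = float(np.mean(accuracies))  # chance level is 1/3
```

Because the simulated patterns carry no class information, this sketch should decode near chance; in the study, above-chance accuracy in visual cortex is what indicates that sound identity is represented there.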

What did they find?

The authors successfully decoded natural sounds from the early visual cortex of congenitally blind individuals, showing that visual imagery and visual experience are not prerequisites for the representation of auditory information in the early visual cortex, and that the cortical organization of auditory feedback in visual cortex is similar between sighted and congenitally blind individuals. The authors saw both higher decoding accuracy in the early visual cortex and lower decoding accuracy in the auditory cortex in the blind group compared to the sighted group. This result indicates that visual deprivation may cause sound representations to be more distributed across the auditory and visual cortices in congenitally blind individuals. When the authors analyzed how eccentricity affected decoding, they found higher decoding accuracy in peripheral regions of visual cortex than in foveal regions. This finding is consistent with previous research showing that the peripheral visual cortex is connected to many non-visual brain regions. Interestingly, the authors point out that none of the three sounds induced a statistically significant change in overall brain activity, compared to rest, in any of the three early visual areas. This suggests that their decoding accuracy is driven by small activity differences across voxels in each region of interest.


What's the impact?

The authors extend previous work on auditory decoding in the early visual cortex to include blind individuals, showing that there may be a similar organization of auditory information in the early visual cortex of both sighted and blind individuals. This study provides further evidence that the early visual cortex is involved in functions other than the feedforward processing of visual information in both sighted and blind individuals.  


Vetter et al. Decoding Natural Sounds in Early “Visual” Cortex of Congenitally Blind Individuals. Current Biology (2020). Access the original scientific publication here.

Abnormal Circadian Rhythm Can Predict Parkinson’s Disease

Post by D. Chloe Chung

What's the science?

Parkinson’s disease is a debilitating neurodegenerative disease characterized by the loss of dopaminergic neurons in a brain area called the substantia nigra. In addition to severe motor symptoms, Parkinson’s patients often experience a disrupted sleep-wake cycle, sometimes early in the disease course. However, no study had actually measured behavioral markers of circadian rhythm to find out whether disruption of the internal biological clock can precede the development of Parkinson’s disease. This week in JAMA Neurology, Leng and colleagues report that abnormal circadian rhythm in healthy older adults can be regarded as an early sign of later Parkinson’s disease.

How did they do it?

The authors enrolled almost 3,000 healthy older men (average age 76.3 years) for an initial evaluation of circadian rhythm and followed up with them for the following 11 years. The participants were mostly Caucasian and lived in a community setting. At the beginning of the study, participants wore a wristband-like device that tracked movement during sleep. Over a minimum of three separate 24-hour periods, the monitoring device recorded various circadian rhythm parameters of wake and rest. Sleep efficiency was determined as the percentage of time the participants were asleep after “lights off”. Other important factors, such as sleep apnea and periodic limb movement, were also taken into account. During the 11-year follow-up, participants completed in-person visits or questionnaires five times and reported whether they had been diagnosed with Parkinson’s disease, as well as their medication history.
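Circadian parameters such as amplitude (strength of the rhythm), mesor (rhythm-adjusted mean), and acrophase (time of peak activity) are commonly extracted from wrist-actigraphy data by cosinor analysis, i.e., fitting a 24-hour cosine to the activity counts. A hedged sketch with simulated data (the study's exact parameterization and device output may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated activity counts: 3 days of 15-minute epochs, with a true
# rhythm peaking at 14:00 (mesor 50, amplitude 30) plus noise.
t = np.arange(0, 72, 0.25)  # time in hours
activity = 50 + 30 * np.cos(2 * np.pi * (t - 14) / 24) + rng.normal(0, 5, t.size)

# Cosinor fit: linear least squares on cosine/sine components,
# with the period fixed at 24 hours.
X = np.column_stack([np.ones_like(t),
                     np.cos(2 * np.pi * t / 24),
                     np.sin(2 * np.pi * t / 24)])
mesor, b, c = np.linalg.lstsq(X, activity, rcond=None)[0]

amplitude = np.hypot(b, c)                              # rhythm strength
acrophase_h = (np.arctan2(c, b) * 24 / (2 * np.pi)) % 24  # peak time (h)
```

Lower fitted amplitude or a poorly fitting cosine would correspond to the weaker, more irregular rhythms that the study links to later Parkinson’s risk.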

What did they find?

While none of the participants had Parkinson’s disease at the beginning of this longitudinal study, 78 of the 2,930 subjects were diagnosed with Parkinson’s disease over the 11-year follow-up. After adjusting for variables such as demographics, education level, medication and substance use, comorbidities, and baseline cognition, the authors found a strong association between a decrease in three of the four circadian rhythm parameters and the later development of Parkinson’s disease. Strikingly, participants who showed the most irregular circadian rhythms were three times more likely to develop Parkinson’s disease than those with the most regular rhythms. These findings indicate that decreased circadian rhythmicity can act as an important early symptom of Parkinson’s disease.
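The "three times more likely" comparison is, in essence, a ratio of incidence between the extreme groups (the study reports adjusted risk estimates; the counts below are purely hypothetical, chosen only to show how such a threefold ratio arises):

```python
# Hypothetical quartile counts (NOT the study's actual numbers):
# participants in the most-irregular vs. most-regular rhythm groups.
irregular = {"n": 732, "cases": 36}
regular = {"n": 732, "cases": 12}

risk_irregular = irregular["cases"] / irregular["n"]
risk_regular = regular["cases"] / regular["n"]

relative_risk = risk_irregular / risk_regular  # 3.0 in this toy example
```

The study's actual estimates additionally adjust for demographics, comorbidities, and other covariates, which a raw ratio like this does not.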


What’s the impact?

This study is the first to follow a large cohort over an extended period and reveal that circadian rhythm abnormalities in healthy adults are associated with their risk of developing Parkinson’s disease as they get older. The authors therefore suggest that detecting abnormal circadian rhythm in healthy adults may help early prediction and diagnosis of Parkinson’s disease, ultimately allowing for early disease intervention. It will be interesting to further investigate whether circadian rhythm might directly contribute to the onset of Parkinson’s disease.


Leng et al. Association of Circadian Abnormalities in Older Adults With an Increased Risk of Developing Parkinson Disease. JAMA Neurology (2020). Access the original scientific publication here.

The Role of the Anterior Cingulate Cortex in Effort-Based Decision Making

Post by Lincoln Tracy

What's the science?

Decision making often requires comparing possible options, such as the amount of effort required to achieve a desired outcome. The anterior cingulate cortex (ACC) is a brain region that plays a role in weighing up the costs of different options prior to making a decision. However, it is not clear how the ACC mediates these cost-benefit evaluations. This week in The Journal of Neuroscience, Hart and colleagues used chemogenetics and calcium imaging to determine the mechanism underlying the role of the ACC in effort-based decision making.

How did they do it?

Male rats were trained to exert effort to receive a reward (sucrose pellets) in a lever-pressing task that progressively became more difficult. In the initial stages, one lever press earned a single sucrose pellet, but in later stages, the number of lever presses required for each subsequent pellet increased. After completing the training, the rats were tested under two conditions. The first was a ‘no-choice’ condition, in which rats could only receive sucrose pellets by pressing a lever. The second, ‘choice’, condition involved choosing between two food options: pressing a lever to receive a sucrose pellet or taking standard rat chow from a small bowl. Prior to testing, the authors injected either inhibitory or excitatory Designer Receptors Exclusively Activated by Designer Drugs (DREADDs) into the ACC via a surgically inserted cannula to alter its activity. They examined the number of times the rats pressed the levers in each condition to determine their food preferences. The authors also implanted calcium imaging recording equipment into the rats’ brains to compare the activity of ACC cells during the choice and ‘no-choice’ tasks.
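The escalating press requirement describes a progressive-ratio-style schedule. A trivial sketch (the linear ramp and step size here are illustrative assumptions, not the study's actual schedule):

```python
def presses_required(pellet_index: int, step: int = 2) -> int:
    """Lever presses needed to earn the nth pellet (0-indexed).

    Assumes a linear ramp: each successive pellet costs `step` more
    presses than the last. Real progressive-ratio schedules may ramp
    differently (e.g., exponentially).
    """
    return 1 + step * pellet_index

# Cost of the first five pellets under this assumed schedule.
schedule = [presses_required(i) for i in range(5)]  # [1, 3, 5, 7, 9]
```

The rising per-pellet cost is what makes the lever option "high effort" relative to the freely available chow in the choice condition.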

What did they find?

Rats made more lever presses during the ‘no-choice’ condition than during the choice condition, where standard chow was freely available from a bowl. Rats injected with DREADDs that either excited or inhibited ACC activity made fewer lever presses during the choice condition, but their lever pressing did not differ during the ‘no-choice’ condition. The authors therefore concluded that interfering with the ACC disrupts high-effort lever-pressing behavior when a choice between two food options is available. Calcium imaging revealed that cell activity preceding lever presses was greater during the ‘no-choice’ condition than during the choice condition. That is, ACC neurons were less responsive prior to the onset of lever pressing when the rats had a choice of taking chow from the bowl or pressing the lever for a sucrose pellet. The neurons were also less responsive when rats collected the reward in the choice condition.


What's the impact?

This was the first study to examine the role of the ACC in a cell-specific and temporally restricted manner. Taken together, the findings suggest that the ACC regulates effort-based decision making by providing a stable population code to discriminate between the usefulness of the available options. Further study into the contributions of the ACC in effort-based choices may enhance our understanding of the motivational mechanisms in disorders such as depression and addiction.

Hart et al. Chemogenetic modulation and single-photon calcium imaging in anterior cingulate cortex reveal a mechanism for effort-based decisions. Journal of Neuroscience (2020). Access the original scientific publication here.