A Critical Period for Development of the Prefrontal Cortex

Post by Megan McCullough

The takeaway

Thalamic activity during a critical period in development shapes the prefrontal cortex. Inhibiting thalamic input to the prefrontal cortex during this critical period produces anatomical and behavioral deficits that persist into adulthood.

What's the science?

Critical periods are windows of time after birth when there is increased plasticity in the brain. Experiences during these periods are essential for proper development, and disruptions during this time can have long-lasting consequences. Although previous research identified a critical period in which signals from the thalamus drive the development of the sensory cortex, it was unknown whether a similar critical period exists for thalamic activity shaping the development of the prefrontal cortex. The prefrontal cortex is a crucial brain region to study, as it supports processes like memory and attention, and its dysfunction is linked to neurodevelopmental disorders. This week in Nature Neuroscience, Benoit and colleagues examined the role of thalamic activity in the maturation of the prefrontal cortex by inhibiting the thalamus in adolescent mice.

How did they do it?

First, the authors inhibited the mediodorsal and midline thalamus in a group of adolescent mice by injecting the mice with a virus expressing hM4D(Gi), an inhibitory designer receptor. To test the long-term cognitive effects of inhibiting thalamic activity in adolescence, these mice completed working memory and attention tasks – tasks that rely on the prefrontal cortex – in adulthood. The authors then assessed changes in prefrontal cortex circuit function due to thalamic inhibition via slice physiology, measuring excitatory and inhibitory activity in a cortical layer that receives projections from the thalamus. Next, green fluorescent protein (GFP) was injected into the adult mice to label thalamic projections and identify any anatomical changes in neuronal tracts from the thalamus.

These tests were administered both to adult mice that had experienced thalamic inhibition in adolescence and to adult mice that had experienced thalamic inhibition in adulthood. The authors studied both age groups to determine whether a critical period exists for prefrontal cortical maturation driven by signals from the thalamus.

What did they find?

The authors found that inhibiting the thalamus during adolescence led to deficits in prefrontal cortex circuit function as well as cognitive deficits that persisted into adulthood. These deficits were not found when the thalamus was inhibited in adult mice. These results suggest that there is a critical period in development during which excitatory input from the thalamus drives the maturation of the prefrontal cortex; disrupting these signals during this window has long-lasting anatomical, functional, and cognitive effects. Specifically, adult mice that underwent thalamic inhibition during adolescence showed reduced excitatory drive to pyramidal cells in the prefrontal cortex, a reduction in the density of thalamic-prefrontal projections, and reduced performance on memory and attention tasks. Importantly, the authors also found that the behavioral deficits seen in adulthood could be rescued by exciting the thalamus in the adult mouse.

What's the impact?

This study is the first to show that thalamic input to the prefrontal cortex during a critical period in development is essential for proper prefrontal maturation. Inhibiting activity in the thalamus during this period led to deficits that persisted into adulthood. This has therapeutic relevance, as neurodevelopmental disorders such as schizophrenia are linked to disruptions in cortical development thought to occur during adolescence.

What is Neurofeedback and How is it Used?

Post by Ewina Pun

Sensory feedback and neuroplasticity

Have you ever tried to walk in a straight line with your eyes closed, or eat a meal in complete darkness? Movement control becomes significantly harder when sensory feedback is limited. When we’re learning, we rely heavily on various forms of feedback, such as tactile, visual, and auditory feedback. For some individuals, sensory feedback is lost due to injury or neurological deficits. Fortunately, our brain has the ability to quickly adapt to new circumstances – an ability known as neuroplasticity.

Some researchers study neuroplasticity with neurofeedback: a form of biofeedback that presents the recorded neural activity back to the individual in real time as a visual, auditory, or other signal to facilitate self-regulation. In other words, by receiving information about their own brain activity, individuals can learn to modulate that activity and, in turn, change their behavior or symptoms. Neuroplasticity not only enables cognitive and perceptual learning but also forms the basis of clinical neurorehabilitation, and researchers have begun to investigate the use of neurofeedback as a treatment for brain and behavioral disorders.
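To make the closed loop concrete, here is a minimal sketch of one neurofeedback iteration in Python. Everything specific is an assumption made for illustration – the 250 Hz sampling rate, the alpha (8–12 Hz) training band, and the mapping of band power to a 0–1 feedback value – and a real system would read each window from an EEG amplifier and drive a display rather than printing numbers.

```python
import numpy as np

FS = 250  # sampling rate in Hz (assumed for illustration)

def band_power(eeg_window, fs, low, high):
    """Estimate average power in a frequency band via the FFT."""
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg_window)) ** 2
    return psd[(freqs >= low) & (freqs < high)].mean()

def feedback_signal(eeg_window, baseline_power):
    """Map current alpha (8-12 Hz) power to a 0-1 feedback value.

    Higher-than-baseline alpha moves the value up, rewarding the
    participant for self-regulating toward the target brain state.
    """
    power = band_power(eeg_window, FS, 8, 12)
    return float(np.clip(power / (2 * baseline_power), 0.0, 1.0))

# Simulated closed loop: each window stands in for one second of EEG.
rng = np.random.default_rng(0)
baseline = band_power(rng.standard_normal(FS), FS, 8, 12)
for _ in range(3):
    window = rng.standard_normal(FS)
    print(feedback_signal(window, baseline))  # would drive a visual display
```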

Neurofeedback to study brain networks

Research suggests that a specific network of brain regions is involved in self-regulation. Changes in brain activity during neurofeedback can be seen via changes in electroencephalography (EEG) amplitudes and blood oxygen level-dependent signals in functional magnetic resonance imaging (fMRI) in brain areas such as the anterior insular cortex, anterior cingulate cortex, dorsolateral prefrontal cortex, inferior parietal lobule, basal ganglia, and thalamus. This indicates that neurofeedback engages networks involved in reward processing, cognitive control, and learning and memory.

Neurofeedback for clinical application

Neurofeedback has been extensively studied as a treatment for attention deficit hyperactivity disorder (ADHD). For instance, neurofeedback therapy can provide information about the patient’s brain state, allowing the patient to consciously match and maintain a desired brain state through reinforcement. EEG-based neurofeedback has been shown to reduce the elevated low-frequency (theta/delta) synchronization observed in children with ADHD and to improve ADHD symptoms.
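As an illustration of the kind of quantity such protocols train, the sketch below computes a theta-to-beta power ratio from a window of EEG; feedback would then reward the patient when the ratio drops. The band edges, sampling rate, and use of Welch’s method are common choices but are assumptions here, not details of any specific study (and some protocols target theta/delta power directly, as described above).

```python
import numpy as np
from scipy.signal import welch

FS = 250  # sampling rate in Hz (assumed)

def band_mean(freqs, psd, low, high):
    """Mean spectral power between `low` and `high` Hz."""
    return psd[(freqs >= low) & (freqs < high)].mean()

def theta_beta_ratio(eeg, fs=FS):
    """Theta (4-8 Hz) to beta (13-30 Hz) power ratio.

    Elevated slow-wave power relative to faster rhythms is the kind
    of signature downtrained in ADHD neurofeedback protocols.
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
    return band_mean(freqs, psd, 4, 8) / band_mean(freqs, psd, 13, 30)

eeg = np.random.default_rng(1).standard_normal(10 * FS)  # 10 s of "EEG"
print(theta_beta_ratio(eeg))
```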

Studies have also found several benefits of neurofeedback for stroke recovery. One study showed that severely impaired, chronic stroke patients learned to upregulate the ipsilesional sensorimotor rhythm (SMR) by controlling a brain-computer interface (BCI) coupled to a hand orthosis, and the neurofeedback training also improved their upper limb function. In addition, virtual reality combined with an EEG-based BCI can offer a more immersive form of neurofeedback for stroke rehabilitation and increase perceived embodiment (the sense that the controlled limb is one’s own). However, more robust evidence is needed to support the efficacy of neurofeedback therapies for ADHD and stroke rehabilitation.

Neurofeedback for BCI motor control

Neurofeedback is not used exclusively for self-regulating one’s neural state. It is also widely used in BCIs for closed-loop motor control. In a motor BCI, individuals directly manipulate an external device as their neural signals are translated into action commands. For example, after training, a BCI maps neural activity patterns to control commands in real time and provides visual feedback of the current position of the external device being controlled. Such real-time feedback allows users to reevaluate, refine, and correct their control. Through trial and error, people with paralysis can determine which set of mental motor imagery (e.g., imagining controlling a joystick or a computer mouse) is most effective for control. With minimal practice using an intracortical BCI, people with paralysis have been able to coordinate movements of a seven-degree-of-freedom robotic arm, control a computer cursor, or functionally stimulate muscles for movement restoration.
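The sketch below shows this decode-and-feedback loop in its simplest form: a linear decoder turns a vector of firing rates into a cursor velocity, and the updated cursor position is the visual feedback. The decoder weights, unit count, and Poisson "spike counts" are placeholders; in practice the weights are fit during a calibration session and the cursor is rendered on screen.

```python
import numpy as np

rng = np.random.default_rng(2)

N_UNITS = 64  # number of recorded neurons (assumed)
DT = 0.02     # decoding interval in seconds (20 ms, assumed)

# Placeholder decoder weights; a real decoder is fit during calibration,
# e.g., while the user imagines or observes cursor movements.
W = 0.05 * rng.standard_normal((2, N_UNITS))

def decode_velocity(firing_rates):
    """Linear decoder: map a vector of firing rates to (vx, vy)."""
    return W @ firing_rates

cursor = np.zeros(2)
for step in range(5):
    rates = rng.poisson(10, N_UNITS).astype(float)  # stand-in for real spikes
    cursor += decode_velocity(rates) * DT           # integrate velocity
    # Rendering `cursor` on screen is the visual feedback the user relies
    # on to refine and correct their imagined movements trial by trial.
    print(f"step {step}: cursor at {np.round(cursor, 3)}")
```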

Bidirectional closed-loop neurofeedback

In addition to providing visual feedback, researchers are trying to directly restore tactile and cutaneous sensation to the arm and hand, which is important for grasping and manipulating objects. A bidirectional BCI for motor control refers to a system that (1) decodes neural signals from the motor cortex into commands to control a device and (2) provides somatosensory feedback by delivering electrical stimulation patterns to the primary somatosensory cortex (S1) or the spinal cord. Microstimulation of the cortical surface of the primary motor cortex (M1) and S1 using high-density electrocorticography (ECoG) has produced tactile sensations such as “buzzing”, “tingling”, “brushing”, “light tapping,” or a “feeling of movement” in participants with paralysis. An adaptive deep brain stimulation framework targeting the thalamus is capable of concurrent biomimetic stimulation and sensing, enabling better closed-loop therapies for psychiatric disorders, epilepsy, or chronic pain. More research is needed to quantify perceptual qualities and improve naturalistic sensations to provide functional benefits for BCI control.
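Conceptually, a bidirectional system is just two concurrent mappings running in the same loop, sketched below. Everything here is illustrative: the linear grip decoder, the pressure-to-amplitude mapping, and the 1–100 µA range are invented for the example, and real systems deliver calibrated, safety-limited stimulation through implanted hardware.

```python
import numpy as np

rng = np.random.default_rng(3)
w = 0.02 * rng.standard_normal(32)  # placeholder decoder weights

def decode_grip(motor_rates):
    """Forward path: decode intended grip force (0-1) from motor-cortex rates."""
    return float(np.clip(w @ motor_rates, 0.0, 1.0))

def touch_to_stim_amplitude(contact_pressure):
    """Return path: map the hand's pressure sensor reading (0-1) to a
    stimulation amplitude for S1. The linear map and 1-100 uA range are
    purely illustrative, not a real stimulation protocol."""
    pressure = min(max(contact_pressure, 0.0), 1.0)
    return 1.0 + 99.0 * pressure

rates = rng.poisson(12, 32).astype(float)  # stand-in for recorded spikes
grip = decode_grip(rates)
pressure = 0.8 * grip  # stand-in for the robotic hand's pressure sensor
print(f"grip={grip:.2f}, stimulation={touch_to_stim_amplitude(pressure):.1f} uA")
```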

What’s next?

Neurofeedback is a novel and valuable way to study brain function and neuroplasticity. Further, neurofeedback has exciting potential as a therapeutic tool. Although researchers have begun to understand some of the mechanisms underlying neurofeedback, future research will likely further clarify the psychological and neural underpinnings of self-regulation, which will help in designing more effective neurofeedback technologies for treating a variety of diseases and conditions.

References

Sitaram et al. Closed-loop brain training: the science of neurofeedback. Nat. Rev. Neurosci. (2016).

Saha et al. Progress in Brain Computer Interface: Challenges and Opportunities. Front. Syst. Neurosci. (2021).

Ramos-Murguialday et al. Brain–machine interface in chronic stroke rehabilitation: a controlled study. Ann. Neurol. (2013).

Zotev et al. Self-regulation of human brain activity using simultaneous real-time fMRI and EEG neurofeedback. Neuroimage (2014).

Collinger et al. High-performance neuroprosthetic control by an individual with tetraplegia. Lancet (2013).

Hochberg et al. Neuronal ensemble control of prosthetic devices by a human with tetraplegia. Nature (2006).

Hughes et al. Bidirectional brain-computer interfaces. Handbook of Clinical Neurology (2020).

Ansó et al. Concurrent stimulation and sensing in bi-directional brain interfaces: a multi-site translational experience. Journal of Neural Engineering (2022).

Predicting and Tracking Hallucinations

Post by Leanna Kalinowski

The takeaway

Hallucinations, a common symptom in disorders like schizophrenia, have traditionally been difficult to study given that they cannot be directly observed. Scientists have successfully applied a computational framework to screen for and track hallucinations.

What's the science?

Hallucinations, which are perceptions that occur in the absence of a stimulus, are a hallmark sign of several psychosis-spectrum disorders, such as schizophrenia. Traditionally, hallucinations have been difficult to study given that scientists cannot directly observe them. However, the rise of the field of computational psychiatry now allows scientists to use mathematical frameworks to better understand the neurological underpinnings of psychosis-spectrum disorders.

One such framework, predictive processing theory, shows promise as a tool for better understanding hallucinations. In this framework, “perception” is described as the process of determining the cause of one’s sensations by considering (1) one’s internal expectations about one’s surroundings based on prior knowledge (called “priors”) and (2) the available sensory evidence, weighted by one’s certainty in the source of the information. Evidence suggests that hallucinations arise when priors are over-weighted relative to incoming sensory evidence, but the exact relationship is unclear. This week in Biological Psychiatry, Kafadar, Fisher, and colleagues used mathematical modeling to determine the relationship between over-weighted priors and susceptibility to hallucinations.
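A toy example makes the weighting intuition concrete. Under a Gaussian model, a percept can be written as a precision-weighted average of the prior expectation and the sensory evidence; the function and numbers below are invented for illustration and are not the authors’ model.

```python
def percept(prior_mean, prior_precision, sensory_mean, sensory_precision):
    """Precision-weighted combination of a prior expectation and sensory
    evidence: the posterior mean under a Gaussian model, with weights
    proportional to each source's precision (1 / variance)."""
    w_prior = prior_precision / (prior_precision + sensory_precision)
    return w_prior * prior_mean + (1 - w_prior) * sensory_mean

# Expecting a tone (prior_mean = 1.0) while the actual input is silence
# (sensory_mean = 0.0). With a normally weighted prior the percept stays
# near silence; over-weighting the prior pulls it toward the expected
# tone -- the predictive-processing account of a hallucination.
print(percept(1.0, prior_precision=0.5, sensory_mean=0.0, sensory_precision=2.0))  # 0.2
print(percept(1.0, prior_precision=4.0, sensory_mean=0.0, sensory_precision=2.0))  # ~0.67
```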

How did they do it?

First, 458 participants were screened for the presence of auditory hallucinations and separated into two groups: hallucinators and non-hallucinators. They then completed the Auditory Conditioned Hallucinations task, in which participants are first trained to associate a visual pattern with an auditory tone. Once the association was learned, the researchers recorded the conditioned hallucination rate: the proportion of trials on which participants reported hearing the tone when the visual pattern was displayed without the tone. Finally, a subset of the hallucinator group was invited back to the lab 6–12 months later to determine whether performance on this task is related to changes in symptom severity.
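In other words, the conditioned hallucination rate is simply a proportion over the pattern-only trials. A hypothetical example:

```python
def conditioned_hallucination_rate(reported_tone):
    """Proportion of no-tone trials on which a tone was reported.

    `reported_tone` holds one boolean per trial in which the visual
    pattern appeared WITHOUT the tone; True means the participant
    nonetheless reported hearing it.
    """
    return sum(reported_tone) / len(reported_tone)

# Hypothetical participant who reports the absent tone on 3 of 10 trials.
trials = [True, False, False, True, False, False, True, False, False, False]
print(conditioned_hallucination_rate(trials))  # 0.3
```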

What did they find?

The researchers found that conditioned hallucination rates predicted the frequency of self-reported hallucinations. These rates were sensitive to hallucination state and to the over-weighting of priors relative to incoming sensory evidence. They also found that conditioned hallucination rates and prior weighting were higher in the hallucinator group. Changes in these rates were further associated with changes in the frequency of reported hallucinations at the follow-up test, suggesting that this approach may inform future clinical screening tools.

What's the impact?

Taken together, these results indicate that conditioned hallucination rates and over-weighting of priors can be used as markers of hallucination status. This can be useful when tracking the development, trajectory, and treatment response of psychosis-spectrum disorders.