Modeling the Dose-Dependent Effects of Ketamine

Post by Lani Cupo

The takeaway

Ketamine produces sedation and dissociation at low doses and anesthesia at high doses, with each state accompanied by characteristic patterns of brain activity. Disinhibition of neural circuits, leading to a global increase in excitation, may underlie both the low- and high-dose states.

What's the science?

The dose-dependent effects of ketamine are well known, with low doses producing psychoactive effects and high doses producing anesthesia. Likewise, it is known that ketamine administration produces patterns of brain activity consistent with gamma oscillations (associated with cognitive function) at low doses, but these are interrupted by slow-delta oscillations (associated with deep sleep) at higher doses. Nevertheless, it’s still an open question how cellular processes relate to the emergence of these patterns of brain activity. This week in PNAS, Adam and colleagues present a biophysical model to simulate cellular changes and observe the effect on brain oscillatory behavior, finding that interactions between inhibitory and excitatory neurotransmitters play a role in the distinctive patterns of brain oscillations observed following ketamine exposure.

How did they do it?

First, the authors acquired electroencephalogram (EEG) data from a human volunteer and a nonhuman primate administered ketamine at doses sufficient to induce anesthesia. Then, they created a biophysical model (a simulation of biological processes) representing interactions between excitatory pyramidal neurons and inhibitory interneurons. The model focused on the activity of NMDA receptors (a major receptor of interest for ketamine), allowing the receptors to change state (“open” ones can become “closed”) based on other activity in the system. Ketamine is known to block excitatory NMDA receptors, which is puzzling given that low doses of ketamine create an excitatory state. It is thought that this is because ketamine blocks NMDA receptors on inhibitory neurons, preventing them from firing and leading to an overall excitatory state. The authors tested this hypothesis in their biophysical model. They then examined what changes in neuronal activity could explain the gamma oscillations seen after ketamine exposure, and why slow-delta oscillations emerge when ketamine is “increased” in the model.
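To make the disinhibition logic concrete, here is a minimal sketch in Python of a toy two-population rate model. This is not the authors' biophysical model (which includes explicit NMDA receptor gating); the weights, drives, and time constants are illustrative assumptions, and "ketamine" is modeled simply as removing a fraction of the excitatory drive onto the inhibitory population.

```python
# Toy sketch only (not the authors' model): a two-population firing-rate model in which
# "ketamine" removes a fraction of the NMDA-mediated drive onto inhibitory interneurons.
# All parameter values are illustrative assumptions.
import numpy as np

def mean_rates(nmda_block=0.0, T=2.0, dt=1e-3):
    """nmda_block: fraction of excitatory drive onto interneurons removed (0 to 1)."""
    n = int(T / dt)
    rE, rI = np.zeros(n), np.zeros(n)     # excitatory / inhibitory population rates
    tauE, tauI = 0.02, 0.01               # time constants in seconds (assumed)
    wEE, wEI, wIE = 0.5, 1.0, 1.0         # synaptic weights (assumed)
    driveE, driveI = 1.0, 0.5             # tonic background drive (assumed)
    for t in range(1, n):
        inputE = driveE + wEE * rE[t-1] - wEI * rI[t-1]
        inputI = driveI * (1.0 - nmda_block) + wIE * rE[t-1]   # "ketamine" acts here
        rE[t] = rE[t-1] + dt / tauE * (-rE[t-1] + max(inputE, 0.0))
        rI[t] = rI[t-1] + dt / tauI * (-rI[t-1] + max(inputI, 0.0))
    return rE[n//2:].mean(), rI[n//2:].mean()   # steady-state averages

for blk in (0.0, 0.5, 0.9):   # increasing "dose"
    E, I = mean_rates(nmda_block=blk)
    print(f"block on interneurons = {blk:.1f} -> E rate {E:.2f}, I rate {I:.2f}")
```

Running the sketch shows the excitatory rate rising and the inhibitory rate falling as the blockade increases, which is the disinhibition hypothesis described above in its simplest form.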

What did they find?

First, the authors found characteristic patterns of EEG activity: at low levels of ketamine, gamma oscillations, reflecting active cortical processing, were evident, whereas at higher levels of ketamine exposure, the gamma oscillations were interrupted by slow-delta waves, characteristic of deep sleep.
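As a generic illustration of how these two signatures can be quantified from an EEG trace (this is not the authors' analysis pipeline; the sampling rate, band limits, and synthetic signal are all assumptions), one can compare power in the slow-delta and gamma bands of a Welch power spectrum:

```python
# Generic band-power illustration on a synthetic EEG-like trace (not the authors' pipeline).
import numpy as np
from scipy.signal import welch

fs = 250.0                                # assumed sampling rate in Hz
t = np.arange(0, 30, 1 / fs)              # 30 s of signal
rng = np.random.default_rng(0)
# Synthetic trace: a 40 Hz "gamma" rhythm riding on a 1 Hz "slow-delta" wave plus noise.
eeg = 0.5 * np.sin(2 * np.pi * 40 * t) + 2.0 * np.sin(2 * np.pi * 1 * t) + rng.normal(0, 0.3, t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))   # 4 s windows for low-frequency resolution
df = freqs[1] - freqs[0]

def bandpower(lo, hi):
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].sum() * df

print(f"slow-delta (0.5-4 Hz) power: {bandpower(0.5, 4):.3f}")   # approximate band limits
print(f"gamma (30-50 Hz) power:      {bandpower(30, 50):.3f}")
```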

Then, using the biophysical model, the authors identified cellular mechanisms that could give rise to the gamma oscillations. They found that ketamine blocked NMDA receptors on inhibitory interneurons, contributing to an overall excitatory state. Specifically, some interneurons rest in a subthreshold excitatory state, meaning that at baseline they sit close to, but not quite over, the threshold that makes them fire. Blocking these neurons’ NMDA receptors with ketamine can shut them down. Because these neurons release inhibitory neurotransmitters, shutting them down produces a downstream increase in excitatory neurotransmitter release, a global increase in excitation referred to as disinhibition, because the excitatory neurons are no longer being inhibited.

With their biophysical model, the authors next observed that this global excitation gave rise to gamma patterns of brain activity. This behavior depends on inhibitory GABAergic neurons, some of which are not blocked by ketamine and which can pace individual neurons to fire on a gamma timescale. These individual neurons are then synchronized across the brain, giving rise to global gamma-wave activity.

In their model, the authors also found that higher doses of ketamine can induce “down-states” associated with slow-delta oscillations. Neurons with background excitatory states shut down under increased ketamine, while other neurons fire on a slower timescale, together contributing to the slower delta waves.

What's the impact?

The authors demonstrate that ketamine can produce characteristic brain waves in a biophysical model by blocking NMDA receptors. Their findings increase our understanding of the cellular mechanisms contributing to global brain activity.

Access the original scientific publication here.

Identifying an fMRI Biomarker for Cognitive Decline in Alzheimer’s Disease

Post by Kelly Kadlec

The takeaway

Two fMRI-based metrics previously used to evaluate cognitive decline with age may also be useful for assessing both the risk and severity of Alzheimer’s disease. These scores can help distinguish between rates of memory decline in healthy individuals and those with varying levels of risk for developing Alzheimer’s.

What's the science?

A formal diagnosis of Alzheimer’s disease (AD) is often preceded by progressive stages of cognitive decline. At each of these stages, patients are at varying risk for advancing to AD, but assessing the risk of an individual is difficult due to a high degree of heterogeneity in neurocognitive aging. Previously, functional magnetic resonance imaging (fMRI) contrast maps for novelty and memory tasks have yielded two corresponding single-value scores that have been proposed as biomarkers of neurocognitive aging. This week in Brain, Soch and colleagues compare these fMRI-based scores in healthy individuals, individuals with AD, and individuals in different risk categories for developing AD, to assess their ability to distinguish between clinical and healthy rates of cognitive decline.

How did they do it?

This study comprised five groups of individuals: healthy controls with no family history of AD; healthy individuals with a first-degree relative with AD; patients with AD; and patients in one of two symptom-based risk states for AD, mild cognitive impairment (MCI) or subjective cognitive decline (SCD), of which MCI is considered the more severe.

The authors collected fMRI data from the participants during image-based novelty and memory tasks. They used the resulting contrast maps to calculate two scores: Functional Activity Deviation during Encoding (FADE) and Similarity of Activations during Memory Encoding (SAME). Additionally, psychometric and genetic testing was done for each participant, and a subset had amyloid positivity testing. The authors hypothesized that increasing FADE scores and decreasing SAME scores would be associated with worse AD severity and higher risk for AD.
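The exact published formulas for FADE and SAME are not reproduced here, but the general idea, scoring each participant's contrast map by its deviation from, and similarity to, a reference-group map, can be sketched on synthetic data as follows (the threshold, maps, and score definitions below are simplified assumptions, not the authors' definitions):

```python
# Simplified illustration of deviation- and similarity-style scores on synthetic data.
# These are NOT the published FADE/SAME formulas; they only convey the general idea.
import numpy as np

rng = np.random.default_rng(0)
n_voxels = 5000
reference_map = rng.normal(0, 1, n_voxels)                            # stand-in for a reference-group contrast map
participant_map = 0.7 * reference_map + rng.normal(0, 0.5, n_voxels)  # a weaker, noisier individual map

active = reference_map > 1.0     # voxels the reference group "activates" (arbitrary cutoff)

# Deviation-style score: how far the participant falls below the reference in those voxels.
deviation_score = float(np.mean(reference_map[active] - participant_map[active]))

# Similarity-style score: spatial correlation between participant and reference maps.
similarity_score = float(np.corrcoef(participant_map, reference_map)[0, 1])

print(f"deviation-style score:  {deviation_score:.3f}  (higher = more atypical)")
print(f"similarity-style score: {similarity_score:.3f}  (lower = more atypical)")
```

Under the authors' hypothesis, a participant at higher risk for AD would tend to show a higher deviation-style (FADE-like) score and a lower similarity-style (SAME-like) score.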

What did they find?

The authors found that memory and novelty-based FADE and SAME scores could be used to distinguish between the different participant groups and also correlated with known risk factors and cognitive assessments.

The authors reported that increasing risk for AD corresponded to larger deviations in FADE and SAME scores (i.e. more atypical fMRI results). Only memory-based FADE and SAME scores differentiated between the two more severe clinical groups (AD and MCI) and all other participant groups, and only novelty-based scores distinguished between AD and MCI patients.

The authors confirmed that FADE and SAME scores for memory and novelty tasks corresponded to other currently used psychometric tests of cognitive decline and AD severity. They also found that, within the AD-related participant groups, higher FADE and lower SAME scores corresponded to the presence of an AD genotype. In addition, the novelty-based scores in particular were sensitive to amyloid positivity.

To demonstrate the potential clinical value of these fMRI biomarkers, the authors used FADE and SAME scores to predict diagnostic groups for the participants and classified each group with above-chance accuracy. They also used these scores to predict the presence of an AD genotype in participants with AD relatives.
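As a rough illustration of that classification step (synthetic data and a simple logistic regression, not the authors' analysis), two scalar scores per participant can be fed to a classifier and evaluated with cross-validation:

```python
# Synthetic two-score classification sketch (not the authors' analysis).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 100
# Toy "controls" (label 0) and "patients" (label 1) with shifted score distributions.
fade_like = np.concatenate([rng.normal(0.0, 1.0, n), rng.normal(0.8, 1.0, n)])
same_like = np.concatenate([rng.normal(0.0, 1.0, n), rng.normal(-0.8, 1.0, n)])
X = np.column_stack([fade_like, same_like])
y = np.concatenate([np.zeros(n), np.ones(n)])

acc = cross_val_score(LogisticRegression(), X, y, cv=5, scoring="accuracy")
print(f"cross-validated accuracy: {acc.mean():.2f} (chance = 0.50)")
```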

What's the impact?

Proper assessment of AD risk and severity is challenging, and this study proposes two promising neural biomarkers. These fMRI-based scores distinguished between different stages of the disease and predicted other proposed risk factors for AD. Such tools could help improve diagnostic accuracy earlier in the development of AD and guide the choice of treatment. Further, a longitudinal study is needed to determine how well these scores predict future outcomes (e.g., whether MCI progresses to AD).

Access the original scientific publication here.

How Brain Connectivity Contributes to Different Types of Goal Pursuit

Post by Lila Metko

The takeaway

Individuals with a higher propensity to use the ‘prevention system’, which pursues goals that avert a negative outcome, rather than the ‘promotion system’, which pursues goals that achieve a positive outcome, show lower connectivity between movement-related brain regions.

What's the science?

According to regulatory focus theory, there are two major cognitive-motivational systems involved in accomplishing goals: the promotion system, involved in achieving hopes and dreams, and the prevention system, involved in fulfilling duties and obligations. The promotion system is oriented towards making good things happen, while the prevention system is focused on preventing negative things from happening. Individual differences in these systems of self-regulation have been linked to psychopathology. Previous work has identified brain regions involved in each of these systems, and the two sets of regions partly overlap. This week in PNAS Nexus, Kim and colleagues used functional magnetic resonance imaging (fMRI) data to build a network model that predicts individual differences in propensity towards each regulatory focus system from connectivity between brain regions.

How did they do it?

The authors studied 1,307 university students enrolled in the Duke Neurogenetics Study. Participants completed the Adolescent Regulatory Focus Questionnaire (RFQ), which measures their inclination towards a promotion-oriented or prevention-oriented way of attaining goals. fMRI scans were acquired both at rest (participants awake with eyes open but performing no task) and during emotional face matching, card guessing, working memory, and face naming tasks. The authors then estimated general functional connectivity for each participant by combining the resting-state and task fMRI data and regressing out task-related events, to obtain more reliable data than resting-state fMRI alone. A predictive model of regulatory focus orientation was built by relating participants' functional connectivity to their RFQ scores. A subset of participants was held out from model building, and the model's ability to predict regulatory focus orientation was then tested by comparing their actual and predicted scores.
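A simplified sketch of this kind of connectivity-based prediction pipeline is below; it is written in the spirit of the authors' approach rather than reproducing their code, and the synthetic data, edge-selection threshold, and linear model are all assumptions:

```python
# Simplified connectivity-based prediction sketch on synthetic data (not the authors' code).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subj, n_edges = 300, 1000
connectivity = rng.normal(size=(n_subj, n_edges))           # edge strengths per participant
# Toy "prevention" score driven by the first 10 edges plus noise (purely illustrative).
rfq_score = connectivity[:, :10].mean(axis=1) + rng.normal(0, 0.2, n_subj)

train, test = np.arange(0, 250), np.arange(250, 300)        # held-out participants for validation

# Edge selection: correlate each edge with the score in the training set.
r = np.array([stats.pearsonr(connectivity[train, e], rfq_score[train])[0] for e in range(n_edges)])
selected = np.where(np.abs(r) > 0.2)[0]                     # arbitrary illustrative threshold

# Simple predictive model: linear fit on the summed strength of the selected edges.
train_feature = connectivity[np.ix_(train, selected)].sum(axis=1)
slope, intercept = np.polyfit(train_feature, rfq_score[train], 1)

test_feature = connectivity[np.ix_(test, selected)].sum(axis=1)
predicted = slope * test_feature + intercept
print("held-out prediction r =", round(stats.pearsonr(predicted, rfq_score[test])[0], 2))
```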

What did they find?

The model generated by the authors was predictive of prevention scores but not promotion scores. The lack of prediction for promotion scores may be because promotion-oriented processes do not require cognitive processes as complex as those involved in prevention, and thus may not be detectable in these measures of functional connectivity. The authors found that lower functional connectivity between association cortices (brain regions involved in interpreting sensory information and planning a behavioral response) was correlated with higher prevention scores. It had previously been established that these association regions are important for prevention behaviors, but interestingly the authors also identified a novel region associated with prevention: the primary motor cortex. More than half of the brain connectivity measures that were negatively correlated with prevention scores involved the primary motor cortex.

What's the impact?

This study is the first to show that the primary motor cortex, a region involved in initiating voluntary movements, contributes to prevention system function. It is also the first to create a predictive model of prevention-oriented goal pursuit in a large sample. Importantly, regulatory focus orientations are predictive of vulnerability to psychopathology, and impaired function of these regulatory systems may also predict generalized anxiety disorder and depression. Thus, the study of the brain regions involved in regulatory focus is of high clinical significance.

Access the original scientific publication here.