Brain Training: Do Computerized Games Improve Cognitive Ability?

Post by Flora Moujaes

What's the science?

In recent years, a billion-dollar industry has emerged around the claim that you can enhance your cognitive ability, and even your IQ, simply by playing computerized games. But does the science really support this claim? A limited number of studies have shown that training on a cognitive task can improve performance on other tasks that recruit similar cognitive mechanisms; however, many studies have failed to replicate these results. Even when training involves different tasks that engage multiple cognitive systems, research has shown that participants simply get better at the specific trained tasks rather than improving their general cognitive ability. This week in the Journal of Experimental Psychology: General, Stojanoski and colleagues conducted a large-scale online study investigating whether brain-training tasks improve cognitive ability.

How did they do it?

The authors recruited 11,000 individuals online from a total of 145 countries. Over 1,000 participants were active users of commercially available brain-training programs, including Lumosity, Peak, Elevate, Brain HQ, and NeuroNation. The brain-training participants had used such programs for an average of 8.5 months. The authors assessed the participants’ general cognitive function using the Cambridge Brain Sciences online assessment battery, which measures cognitive skills such as working memory, verbal ability, reasoning, decision-making, and inhibitory control.

What did they find?

To see if brain training produces generalizable improvements in high-level cognition, the authors compared whether, on average, the 1,009 participants with a history of active brain training performed better than a demographically matched group with no such history. They found no difference in performance between active brain trainers and non-brain trainers, even when non-brain trainers were compared with the brain trainers who had been training the longest. The authors also examined whether the amount of time spent using brain-training programs was related to cognitive performance, but found no relationship between the self-reported length of time participants devoted to brain training and their cognitive performance.
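For readers curious what this kind of analysis looks like in practice, the sketch below illustrates the two comparisons described above on hypothetical data. The variable names, group sizes, and test choices are illustrative assumptions, not the authors’ actual analysis code.

```python
# Illustrative sketch only -- not the authors' analysis. Assumes composite
# cognitive scores for two demographically matched groups, plus self-reported
# months of training for the brain trainers (all data here are simulated).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
trainer_scores = rng.normal(100, 15, size=1009)   # hypothetical scores
control_scores = rng.normal(100, 15, size=1009)   # hypothetical scores
months_training = rng.uniform(1, 36, size=1009)   # hypothetical durations

# Group comparison: do brain trainers outperform matched non-trainers?
t_stat, p_group = stats.ttest_ind(trainer_scores, control_scores)
print(f"group difference: t = {t_stat:.2f}, p = {p_group:.3f}")

# Dose-response check: does more self-reported training predict higher scores?
r, p_corr = stats.pearsonr(months_training, trainer_scores)
print(f"training duration vs. score: r = {r:.2f}, p = {p_corr:.3f}")
```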


What's the impact?

The findings of this study do not support computerized games as a way to improve cognitive performance. Given that a billion-dollar industry with over 70 million active users has been built on this premise, more research is needed into whether such brain-training programs are really worth people’s time and money. In particular, studies that employ a within-subject design or follow participants over time are needed in order to draw stronger conclusions.

A word of caution: The methodology in this study did not take into account individual improvements within-subjects (comparing each participant’s cognitive ability before and after brain training), and participants were not randomly assigned to receive brain training.

Stojanoski et al. Brain training habits are not associated with generalized benefits to cognition: An online study of over 1000 “brain trainers”. Journal of Experimental Psychology: General (2020). Access the original scientific publication here.

More Than a Watch: How Wearable Tech is Helping Advance Neuroscience

Post by Lincoln Tracy

What's the science?

Wearable technology is a fast-moving and financially lucrative industry. The adoption of this technology is primarily being driven by smartwatches. These devices can measure a range of physiological signals including heart rate, blood oxygen levels, and sweat gland activation – all while looking sleek and stylish on your wrist. Smartwatches allow scientists to collect a wide range of long-term, real-time physiological data from large numbers of people at the same time. This week in Neuron, Johnson and Picard provide a broad overview of wearable technology and highlight its potential clinical applications.

What have we learned?

One type of smartwatch data is electrodermal activity, or EDA. EDA refers to changes in the electrical properties of our skin. When our sympathetic nervous system is activated, we begin to sweat. These increases in sympathetic activity can be measured by passing a small electric current across two electrodes on the surface of the skin and calculating the electrical conductance, giving us a measure of EDA. Measuring EDA allows scientists to examine purely sympathetic activity, as EDA does not have any known parasympathetic drivers.
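As a rough illustration of the measurement itself, the sketch below derives skin conductance from an assumed constant applied voltage and a measured current using Ohm’s law. The numbers and variable names are made up for illustration and do not come from any particular device.

```python
# Minimal sketch: deriving electrodermal activity (skin conductance) from a
# small constant applied voltage and the current measured between two
# electrodes. Values are illustrative, not from a real smartwatch.
import numpy as np

applied_voltage = 0.5                                            # volts
measured_current = np.array([2.1e-6, 2.3e-6, 3.0e-6, 2.8e-6])    # amperes

conductance_siemens = measured_current / applied_voltage  # Ohm's law: G = I / V
eda_microsiemens = conductance_siemens * 1e6               # EDA is reported in microsiemens
print(eda_microsiemens)  # higher values = more sweat gland (sympathetic) activity
```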

However, EDA is more than a simple measure of arousal. One broader application of EDA is the detection and characterization of seizures, particularly generalized tonic-clonic seizures. The association between EDA and seizure activity has been identified from concurrent electroencephalography (EEG) and EDA recordings from patients having a seizure. These recordings show unusual electrical activity across all EEG channels during the seizure, but after the seizure stops, there is a flattening of EEG activity and a surge in EDA. Several studies in both children and adults have shown that measures of EDA correlate with the duration of this post-seizure suppression of EEG activity.

Why does it matter?

Correlations such as these allow for major advances in real-world diagnostic tests and interventions. For example, EEG suppression has been observed in all EEG-monitored cases of sudden unexpected death in epilepsy (SUDEP), the second leading cause of years of potential life lost among neurological conditions. The exact mechanisms of SUDEP are unknown, but the risk is greatest in people who have frequent seizures and who are alone at the time of a seizure. However, by applying machine learning to EDA and accelerometry data, life-threatening seizures can be detected in real time with almost 100% sensitivity. Detecting seizures before they end allows caregivers and loved ones to attend to the person seizing and provide aid, potentially preventing death.
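To make the idea concrete, here is a minimal sketch of how a wrist-worn seizure detector could be trained on windowed EDA and accelerometer features. It is an assumption-laden illustration (toy data, hand-picked features, a generic scikit-learn classifier), not the published model.

```python
# Illustrative sketch of a wrist-worn seizure detector, not the published model.
# Assumes windowed EDA and 3-axis accelerometer data with binary seizure labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

def window_features(eda_win, acc_win):
    """Simple hand-crafted features per window (assumed, for illustration)."""
    movement = np.linalg.norm(acc_win, axis=1)        # per-sample motion magnitude
    return np.array([
        eda_win.mean(), eda_win.std(), np.ptp(eda_win),   # EDA level and surge
        movement.mean(), movement.std(),                  # movement intensity/variability
    ])

# Toy data: 500 windows, 128 EDA samples and 128x3 accelerometer samples each.
rng = np.random.default_rng(0)
eda = rng.random((500, 128))
acc = rng.random((500, 128, 3))
labels = rng.integers(0, 2, size=500)   # 1 = seizure window (toy labels)

X = np.stack([window_features(e, a) for e, a in zip(eda, acc)])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
# Sensitivity (recall on seizure windows) is the metric emphasized in this setting.
print("sensitivity:", recall_score(y_te, clf.predict(X_te)))
```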

What are the next steps?

Although EDA has been studied since the 19th century, the advent and popularity of wearable technology now allow scientists to explore the role of this and other physiological signals in everyday life. The simplicity and practicality of smartwatches allow their use in vulnerable or underserved populations where typical neuroimaging technologies cannot be used. But it is important to note that wearable technology is not without its limitations. Many devices rely on proprietary algorithms, meaning that the underlying raw data are kept under strict lock and key. There are also issues surrounding the privacy, transparency, and ethical use of the data collected by these devices. These limitations aside, adding wearable technology to the research toolkit and combining it with existing neuroimaging techniques offer the possibility of a pipeline for the effective translation of basic science into new therapies, drug delivery, and personalized medicine.


Johnson and Picard. Advancing Neuroscience through Wearable Devices. Neuron (2020). Access the original scientific publication here.

Sawtooth Wave Oscillations During REM Sleep

Post by Cody Walters

What’s the science?

Sawtooth waves (STWs) are jagged oscillations seen in EEG recordings prior to bursts of rapid eye movement (REM) during sleep. Although REM sleep has been the focus of much research, little is known about the cortical origin and functional significance of STWs. This week in The Journal of Neuroscience, Frauscher et al. performed the first ever intracranial study of STWs in human subjects.

How did they do it?

The authors analyzed data from 26 patients with epilepsy who underwent preoperative stereoelectroencephalography (SEEG) recordings to localize epileptogenic brain tissue. In addition to the SEEG recordings, the authors had access to scalp EEG and polysomnography data.

What did they find?

The authors identified that STWs occur in a variety of brain structures including the parietal, lateral, and medial areas; the anterior insula; the lateral and orbital frontal cortices; and mesiotemporal structures. They then quantified the spectral power (by decomposing the SEEG signal into a sum of sine waves with varying frequencies and amplitudes) in a variety of brain regions during STWs. There was an increase in both low (2-4 Hz) and high (20-240 Hz) frequency power (i.e., the amplitude of the sine wave in those frequency ranges significantly contributed to the overall SEEG signal) during STWs in most brain regions.
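As an illustration of this kind of spectral analysis, the sketch below estimates power in a low (2-4 Hz) and a high (20-240 Hz) band for a single channel using Welch’s method. The sampling rate and the toy signal are assumptions; this is not the authors’ pipeline.

```python
# Sketch of band-limited power estimation for one SEEG channel (illustrative,
# not the authors' pipeline). Assumes a 1000 Hz sampling rate and a toy trace.
import numpy as np
from scipy.signal import welch

fs = 1000                                   # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
seeg = np.sin(2 * np.pi * 3 * t) + 0.3 * np.random.randn(t.size)  # toy signal

freqs, psd = welch(seeg, fs=fs, nperseg=2 * fs)   # power spectral density

def band_power(freqs, psd, lo, hi):
    """Integrate the power spectral density over a frequency band."""
    mask = (freqs >= lo) & (freqs <= hi)
    return np.trapz(psd[mask], freqs[mask])

print("2-4 Hz power:   ", band_power(freqs, psd, 2, 4))
print("20-240 Hz power:", band_power(freqs, psd, 20, 240))
```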

Ripples are oscillations in the local field potential that occur in the high frequency range (i.e., 80-240 Hz) and are known to play a role in memory consolidation and replay. The authors found a significant increase in the ripple rate in a variety of brain structures (e.g., the angular gyrus, posterior cingulate gyrus, temporal pole, superior temporal gyrus, and mesiotemporal structures) during STWs. Furthermore, STWs were highly variable, both temporally and spatially, within and between brain regions. These data suggest that STWs may synchronize the reactivation of neural activity to aid in complex memory consolidation.
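A common way to detect ripple events (not necessarily the authors’ exact method) is to band-pass the signal to 80-240 Hz, compute an amplitude envelope, and flag segments where the envelope exceeds a threshold, as in the hedged sketch below; the sampling rate and toy data are assumptions.

```python
# Illustrative ripple-detection sketch (a common approach, not necessarily the
# authors' exact method): band-pass 80-240 Hz, take the Hilbert envelope, and
# flag samples where the envelope exceeds a threshold above its mean.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def detect_ripples(lfp, fs, lo=80.0, hi=240.0, n_sd=3.0):
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, lfp)              # zero-phase band-pass filter
    envelope = np.abs(hilbert(filtered))        # instantaneous amplitude
    threshold = envelope.mean() + n_sd * envelope.std()
    return envelope > threshold                 # boolean mask of candidate ripples

fs = 1000                                       # assumed sampling rate (Hz)
lfp = np.random.randn(10 * fs)                  # toy local field potential
mask = detect_ripples(lfp, fs)
print("candidate ripple samples:", int(mask.sum()))
```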


What’s the impact?

This study is the first intracranial investigation of the electrophysiological correlates of sawtooth waves in human subjects. The authors found that (1) STWs occurred in various brain regions, (2) STWs co-occurred with high-frequency ripple events, and (3) STWs were both temporally and spatially variable. These data suggest that STWs may play an important role in coordinating brain activity to aid memory consolidation during REM sleep. Further, this study sheds light on a largely unexplored aspect of REM sleep.

Frauscher et al. (2020). REM sleep sawtooth waves are associated with widespread cortical activations. The Journal of Neuroscience. Access the publication here.