How the Brain Creates False Memories Based on Misinformation

Post by Kulpreet Cheema

The takeaway

Hippocampal activity, together with the hippocampus's connectivity to prefrontal and parietal cortices, underlies the creation of misinformation-induced false memories.

What's the science?

The misinformation effect occurs when the memory of an event changes after exposure to misleading information, and it can give rise to false memories. The classic three-stage misinformation paradigm involves witnessing an event, being exposed to misleading information about it, and finally taking a memory test about the original event. The hippocampus is involved in all three stages of memory; however, how hippocampal representations change across these stages is not well understood. This week in Nature Communications, Shao and colleagues investigated the hippocampal-cortical network involved in creating misinformation-induced false memories.

How did they do it?

The authors performed behavioral and functional magnetic resonance imaging (fMRI) studies to investigate the misinformation effect. In study one, 122 participants were randomly assigned to a misinformation, neutral, or consistent group. The misinformation group produced the most false memories, confirming that the misinformation paradigm did increase false memories. In study two, another 57 participants completed a memory test in an fMRI scanner. This study had three stages: in the original-event stage, participants viewed photos of eight events; in the post-event stage, they heard narratives describing those photos; and nineteen hours later, their memory of the original events was tested. In the misinformation condition, the narratives described critical elements of the original photos inaccurately.

The activity of hippocampal and prefrontal brain regions during the three stages was analyzed to investigate their involvement in true and false memories. This was followed by a whole-brain ‘searchlight’ analysis to see whether other brain regions were involved in false memory and the misinformation effect.
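The study's actual analysis pipeline is more involved, but the core idea of comparing activity patterns across stages can be illustrated briefly. Below is a minimal Python sketch (array names, shapes, and the random data are assumptions, not the authors' data): it correlates each event's hypothetical hippocampal voxel pattern from the original-event stage with the corresponding pattern from the post-event stage, so that higher similarity would indicate reactivation of the original information.

```python
import numpy as np

# Illustrative data for one participant: rows = events, columns = hippocampal
# voxels. Real data would come from the fMRI scans, not a random generator.
rng = np.random.default_rng(0)
original_stage = rng.normal(size=(8, 300))   # 8 events x 300 voxels
postevent_stage = rng.normal(size=(8, 300))  # same events, misinformation stage

# Pattern similarity: Pearson correlation of each event's voxel pattern
# across the two stages.
similarities = np.array([
    np.corrcoef(original_stage[i], postevent_stage[i])[0, 1]
    for i in range(original_stage.shape[0])
])

# A high mean similarity would suggest the original-event representation is
# reactivated in the hippocampus during the post-event (misinformation) stage.
print("Mean pattern similarity across events:", round(similarities.mean(), 3))
```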

What did they find?

Hippocampal activity patterns during the original-event and post-event stages were similar, suggesting that the original information was reactivated in the hippocampus during the misinformation (i.e., post-event) stage. When false memories were formed, prefrontal activity correlated more strongly with hippocampal reactivation of the post-event (misleading) information than with representation of the original-event information. This suggests that the prefrontal cortex works with the hippocampus to monitor memory traces and to resolve conflict when false memories arise. In addition to the hippocampus, participant-specific representations in the lateral parietal cortex predicted true memory, whereas false memory was supported by hippocampal and medial parietal cortex representations of the misinformation. This suggests that the lateral and medial parietal cortices are distinctly connected to the hippocampus, carrying original-event and post-event information to create true and false memories, respectively.

What's the impact?

This study shows that dynamic changes in the activity and connectivity of the hippocampus and of prefrontal and parietal cortices create the misinformation effect. Representations of the original information in the hippocampus weaken when a memory is retrieved, and representations of the misleading post-event information compete with them. The results also support multiple-trace memory theory and confirm the fragile, reconstructive nature of human memory.

Access the original scientific publication here.

How Brain-to-Brain Synchrony Between Students and Teachers Relates to Learning Outcomes

Post by Elisa Guma

The takeaway

Synchrony of brain activity amongst students and between students and teachers predicts test performance following lecture-based learning. Furthermore, brain-to-brain synchrony is elevated during lecture segments associated with correctly answered questions.  

What's the science?

Social interactions between students and teachers have a profound impact on students’ learning and engagement. Students feel a greater sense of belonging and tend to have better outcomes in synchronous learning (i.e., where students and the teacher interact in real time) than in asynchronous learning (where students view prerecorded lectures). Yet little is known about the brain mechanisms that support this type of learning. Synchronous brain activity across individuals, referred to as brain-to-brain synchrony, may play a role. This week in Psychological Science, Davidesco and colleagues recorded brain activity from students and teachers in a classroom setting to determine whether brain-to-brain synchrony was associated with learning outcomes.

How did they do it?

The authors recruited 31 healthy young adult males and females and two professional high school science teachers (one male and one female) to participate in the study. Students were divided into nine groups of four and attended four 7-minute teacher-led science lectures covering topics such as bipedalism, insulin, habitats and niches, and lipids. To assess learning, students took a 10-question multiple-choice test at three time points: (1) a pretest one week before the lectures, (2) an immediate posttest right after each 7-minute lecture, and (3) a delayed posttest one week after the lectures.

Electroencephalography (EEG) recordings were acquired from both students and teachers during each lecture and testing session to measure brain activity in real time with high temporal precision. The data were preprocessed and filtered into three frequency bands: theta (3-7 Hz), alpha (8-12 Hz), and beta (13-20 Hz). Activity was then averaged within three predefined regions of interest, based on where each recording electrode was positioned on the participant's head: posterior, central, and frontal regions.
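As a rough illustration of that preprocessing step (a sketch under an assumed sampling rate, channel layout, and region grouping, not the authors' pipeline), the snippet below band-pass filters a simulated EEG array into the three frequency bands and averages channels within each region of interest.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250  # assumed sampling rate in Hz (illustrative)
bands = {"theta": (3, 7), "alpha": (8, 12), "beta": (13, 20)}
# Hypothetical electrode-to-region grouping; real groupings depend on the montage.
rois = {"frontal": [0, 1, 2], "central": [3, 4, 5], "posterior": [6, 7, 8]}

def bandpass(data, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter applied along the time axis."""
    b, a = butter(order, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, data, axis=-1)

# Simulated EEG: 9 channels x 60 seconds of data.
eeg = np.random.default_rng(1).normal(size=(9, fs * 60))

# Filter into each band, then average the channels within each region of interest.
roi_band_signals = {
    (roi, band): bandpass(eeg[channels], low, high, fs).mean(axis=0)
    for roi, channels in rois.items()
    for band, (low, high) in bands.items()
}
print(roi_band_signals[("central", "alpha")].shape)  # one time series per ROI/band
```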

To quantify learning outcomes, the authors categorized a question as “learned” if it was answered incorrectly in the pretest but correctly in either posttest, and as “not learned” if the student’s answer was unchanged from pretest to posttest. The authors compared brain activity patterns (1) across pairs of students and (2) between students and the teacher to determine whether there was any brain-to-brain synchrony. Next, they evaluated whether periods of brain-to-brain synchrony during the lectures were associated with learning outcomes (pretest-to-posttest change). Finally, they evaluated whether brain-to-brain synchrony was higher during lecture segments that students successfully learned compared with those they did not.
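The learned / not-learned rule is simple enough to state directly in code. Here is a minimal sketch of that categorization (the function name, arguments, and the "other" fallback are illustrative assumptions, not from the paper): a question counts as learned if it was answered incorrectly at pretest but correctly on either posttest, and as not learned if the answer did not change from pretest to posttest.

```python
def categorize_question(pretest_answer, posttest_answers, correct_answer):
    """Apply the learned / not-learned rule described above (illustrative names)."""
    if pretest_answer != correct_answer and correct_answer in posttest_answers:
        return "learned"
    if all(answer == pretest_answer for answer in posttest_answers):
        return "not learned"
    return "other"  # cases the summary does not describe

# Example: the student chose "B" before the lecture and the correct "C" afterward.
print(categorize_question("B", ["C", "C"], "C"))  # -> learned
```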

What did they find?

First, the authors found that test performance improved significantly from the pretest to the immediate posttest, and also to the delayed posttest, though to a lesser extent. Next, they found evidence for brain-to-brain synchrony, most prominently in alpha-band activity recorded from the central electrodes. Interestingly, this synchronous activity predicted both pretest-to-immediate-posttest and pretest-to-delayed-posttest learning; however, there was no effect for the change between the two posttest sessions. Additionally, alpha-band synchrony was higher during lecture segments corresponding to learned versus not-learned questions.

Next, the authors found a temporal lag in student-teacher brain-to-brain synchrony: the teacher’s brain activity patterns preceded those of the students by 300 ms. This is likely explained by the fact that the teacher served as the speaker and the students as the listeners. Furthermore, student-teacher brain-to-brain synchrony significantly predicted pretest-to-delayed-posttest learning but not pretest-to-immediate-posttest learning.
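One common way to estimate such a lead-lag relationship is to cross-correlate the two signals and find the offset at which their correlation peaks. The toy example below (a generic sketch with simulated signals and an assumed sampling rate, not necessarily the authors' method) recovers a built-in 300 ms teacher-to-student delay.

```python
import numpy as np

fs = 250  # assumed sampling rate in Hz
rng = np.random.default_rng(2)

# Toy signals: the "student" signal is a noisy copy of the "teacher" signal
# delayed by 300 ms, mimicking the lag reported in the study.
teacher = rng.normal(size=fs * 30)
delay_samples = int(0.3 * fs)
student = np.roll(teacher, delay_samples) + 0.5 * rng.normal(size=teacher.size)

# Cross-correlate the mean-centered signals and locate the best-fitting lag.
t = teacher - teacher.mean()
s = student - student.mean()
xcorr = np.correlate(s, t, mode="full")
lags = np.arange(-len(t) + 1, len(t))
best_lag_ms = lags[np.argmax(xcorr)] / fs * 1000

# A positive lag means the student signal follows the teacher signal.
print(f"Estimated teacher-to-student lag: {best_lag_ms:.0f} ms")  # ~300 ms
```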

What's the impact?

This study extends our understanding of how synchronous brain activity between students, and between students and their teachers, may relate to learning. Alpha-band activity recorded over central electrode sites appears particularly relevant for this type of synchrony. Future studies could acquire other physiological signals, such as heart rate, body motion, or eye movements, to examine how they relate to the EEG-measured brain activity that supports learning.

Access the original scientific publication here.

Encoding Numerical Information is as Easy as 1-2-3 for Infants

Post by Lincoln Tracy

The takeaway

Infants as young as three months old have ‘number sense’, meaning they can automatically encode the number of tones they hear or the number of objects they see.

What's the science?

The ability to discriminate numbers – independently of physical quantities such as size or density – is an important human ability that is also observed in mammals, birds, and fish. However, it is unclear whether humans are born with an innate ‘number sense’ or whether it is learned. This week in Current Biology, Gennari and colleagues tested for a genuine ‘number sense’ by examining the neural activity of three-month-olds, measured with electroencephalography (EEG), in response to stimuli containing numerical and non-numerical information.

How did they do it?

The authors played a variety of auditory sequences differing in length, rate, instrument, and pitch to 26 drowsy or sleeping three-month-old infants while a high-density EEG system recorded their neural responses. The tones composing the sequences could be “short” (a 40 ms tone with a 20 ms gap between tones), “medium” (a 120 ms tone with a 60 ms gap), or “long” (a 360 ms tone with a 180 ms gap). For the EEG analysis, the authors used multivariate pattern analysis to isolate any purely numerical neural code from the activity patterns reflecting other characteristics of the auditory stimuli, such as tone rate and duration. A key contrast compared sequences of 4 “long” tones with 12 “medium” tones, and sequences of 4 “medium” tones with 12 “short” tones; each pair of sequences lasted the same total time but contained a different number of tones.
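To give a flavor of what such a multivariate pattern analysis involves (a generic decoding sketch with simulated data and assumed feature shapes, not the infant EEG pipeline itself), the code below trains a cross-validated linear classifier to distinguish 4-tone from 12-tone trials; decoding accuracy above the 50% chance level would indicate a neural code for number.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)

# Simulated trials: 80 trials x 128 EEG features (e.g., channels x time bins,
# flattened). Labels: 0 = 4-tone sequence, 1 = 12-tone sequence (duration matched).
n_trials, n_features = 80, 128
labels = rng.integers(0, 2, size=n_trials)
patterns = rng.normal(size=(n_trials, n_features))
patterns[labels == 1, :10] += 0.8  # inject a weak "number" signal for illustration

# Cross-validated decoding of the number condition from the activity patterns.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, patterns, labels, cv=5, scoring="accuracy")
print(f"Mean decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```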

What did they find?

The authors found distinct neural responses to the different number conditions, demonstrating that three-month-old infants’ brains can estimate the number of tones in an auditory sequence independently of other magnitudes. Further, infants encoded number even during sleep. This implies that number is a fundamental dimension for representing the auditory environment around us.

What's the impact?

These findings confirm that the brain treats number as a basic dimension of the environment from a very young age. Because the ability to process approximate number is thought to be a starting point for a deeper understanding of mathematics, these findings may have practical implications for educational and rehabilitative interventions.