The Behaviour and Neurobiology Underlying Leadership Decisions

What's the science?

Leadership is critical within our society. Teachers, soldiers, politicians, and parents, to name a few, continuously make decisions that affect others. One aspect of leadership is the acceptance of responsibility: leaders are responsible for others through the choices they make. Although leadership is central to human societies, we still don't understand the neurobiology of leadership or why some choose to lead while others choose to follow. This week in Science, Edelson and colleagues use a decision-making task to examine leadership choices and the brain regions involved.

How did they do it?

They developed a behavioural task to examine leadership preferences. Participants performed two tasks: a baseline task and a delegation task. In both tasks, the participant's payoff depended on their choices, which allowed the authors to assess preferences related to risk, loss, and ambiguity. In the baseline task, participants chose over several trials whether to accept gambles with varying probabilities of loss or gain. On some trials the exact probabilities of gains and losses were shown to the participant (to assess risk preferences), while on others these probabilities were hidden (i.e. ambiguous, and closer to real-life decisions). In the delegation task, participants faced the same gambles, but could either lead and make the decision on behalf of their group, or defer and follow the choice of the group members. This task included two trial types, 'self' and 'group' trials, in which the leader's choice affected either their own payoff or the payoff of the group members. The group members collectively had more information about the probabilities of an outcome, mimicking real-life scenarios in which deciding as a group may be advantageous. The authors analyzed the baseline choice data to determine whether preferences for risk, loss, or ambiguity were associated with leadership scores (obtained using established scales as well as real-life data) and whether these preferences shifted when choices impacted others. They then used computational modelling and fMRI analysis to understand preferences for leadership and the brain activity underlying these choices.
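
To make the task structure concrete, here is a minimal Python sketch of one baseline-task gamble and a standard prospect-theory-style valuation. The functional form and the parameter names (risk_tolerance, loss_aversion, ambiguity_tolerance) are illustrative assumptions, not the model actually fit in the paper.

```python
from dataclasses import dataclass

@dataclass
class Gamble:
    gain: float      # amount won if the gamble pays off
    loss: float      # amount lost otherwise (a positive number)
    p_gain: float    # probability of the gain (hidden on ambiguous trials)
    ambiguous: bool  # True when probabilities are not shown to the participant

def subjective_value(g: Gamble, risk_tolerance: float,
                     loss_aversion: float, ambiguity_tolerance: float) -> float:
    """Value of accepting the gamble, relative to rejecting it (worth 0)."""
    p = g.p_gain
    if g.ambiguous:
        # With hidden probabilities, shrink the believed win probability
        # toward 0.5 in proportion to ambiguity aversion (one common choice).
        p = ambiguity_tolerance * p + (1.0 - ambiguity_tolerance) * 0.5
    # Concave utility for gains; losses weighted more steeply (loss aversion).
    return p * g.gain ** risk_tolerance - loss_aversion * (1.0 - p) * g.loss ** risk_tolerance

# Example: accept a 50/50 gamble only if its subjective value is positive.
g = Gamble(gain=20.0, loss=10.0, p_gain=0.5, ambiguous=False)
print(subjective_value(g, risk_tolerance=0.9, loss_aversion=1.5, ambiguity_tolerance=1.0))
```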

What did they find?

In the ‘group’ trials, participants deferred to the decision of the group more often than in the ‘self’ trials (a 17.3% increase in deferral rate), demonstrating an overall preference for avoiding responsibility. Responsibility aversion (the avoidance of responsibility) was correlated with leadership scores: those with lower responsibility aversion had higher leadership scores. Responsibility aversion was not related to individual preferences or values regarding risk, loss, and ambiguity, nor was it related to regret, blame, or guilt in social situations (assessed in a separate experiment). The authors then built a computational model of these decisions that took the participants' preferences into account. From these preferences they derived the subjective value of choices to lead or defer (i.e. the difference in value between accepting and rejecting a gamble for each participant). They found that deferral was more frequent when the subjective value difference was low (i.e. when the values of the two choices were less discriminable). In other words, when participants were more certain of their choice (high discriminability), they were more likely to take on responsibility. From this, the authors derived 'deferral thresholds': the critical level of certainty that determines whether a participant will defer on a given gamble. Computational modelling showed that when taking responsibility for others, the deferral threshold shifted, indicating a higher demand for certainty about the choice. These shifts in deferral threshold were correlated with leadership scores: the greater the shift, the more likely an individual was to defer (i.e. not lead). Since individual values about risk, loss, and ambiguity were not related to leadership scores, these results suggest that the key to whether one will lead lies in how much additional certainty (about the decision) one demands once responsible for others.
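
To make the deferral-threshold idea concrete, here is a minimal sketch assuming a hard threshold on the absolute subjective value difference, with an additive upward shift when others are affected. The numerical values are illustrative, not the paper's fitted parameters.

```python
def chooses_to_lead(sv_difference: float, deferral_threshold: float) -> bool:
    """Lead (decide oneself) only when the evidence is decisive enough."""
    return abs(sv_difference) >= deferral_threshold

baseline_threshold = 1.0  # certainty demanded on 'self' trials (assumed value)
threshold_shift = 0.8     # extra certainty demanded when others are affected (assumed)

for sv_diff in (0.5, 1.2, 2.5):
    on_self = chooses_to_lead(sv_diff, baseline_threshold)
    on_group = chooses_to_lead(sv_diff, baseline_threshold + threshold_shift)
    print(f"|dSV| = {sv_diff}: leads on self trial: {on_self}, on group trial: {on_group}")
```

The middle case illustrates the key result: evidence decisive enough to lead on a self trial can fall below the shifted threshold on a group trial, producing the higher deferral rate observed when others are affected.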

                                      Brain, Servier Medical Art, image by BrainPost, CC BY-SA 3.0

Brain activation was higher in the middle temporal gyrus during group trials than during self trials, and temporo-parietal junction activity was correlated with the decision to defer (i.e. not lead). Activity in several brain regions, including the medial prefrontal cortex (involved in self-reflection), was associated with the subjective value difference, while activity in the anterior insula was associated with the probability of choosing to lead. The authors fit a dynamic causal model to these four brain regions and found a relationship between individual differences in activity within this network, the shift in deferral threshold, and leadership scores. Temporal gyrus activity was also associated with an inhibited influence of the medial prefrontal cortex on anterior insula activity, and this inhibitory influence was weaker in leaders, suggesting a potential neural mechanism underlying the shift in deferral threshold and responsibility aversion.

What's the impact?

This is the first study to examine the behaviour and neurobiology underlying leadership preferences. We now know that the majority of individuals show responsibility aversion and that this is a critical driver of the decision to lead. Those who were less likely to lead showed a greater shift in their demand for certainty when deciding whether to take responsibility for others. Further, this study provides important insight into the brain regions involved in leadership choices. These findings have important implications for understanding leadership in society and can help inform leadership decisions and their consequences.

Edelson et al., Computational and neurobiological foundations of leadership decisions. Science (2018). Access the original scientific publication here.

Stress Impairs Episodic Memory Retrieval

What's the science?

Stress can impair episodic memory, meaning memory for specific events. In the brain, the hippocampus and the frontoparietal network (a network spanning the frontal and parietal lobes) support this type of memory. When remembering, medial temporal lobe structures provide a sense of familiarity ('I've seen that before'), while the hippocampus recalls the specific details of events ('I saw that yesterday at the park'). Frontoparietal networks are recruited when recall is difficult: they guide recollection by directing attention, for example toward cues that help someone remember. Stress can impair the hippocampus, which can in turn contribute to poor memory retrieval. But how exactly does stress impair these memory circuits? This week in Cerebral Cortex, Gagnon and colleagues used functional magnetic resonance imaging (fMRI) to understand how acute stress affects memory circuits in the human brain.

How did they do it?

Forty-seven healthy young men (18-35 years) completed the experiment (44 completed fMRI scans). Half of the participants were randomly assigned to a stress group, and half to a control group. On the first day, all participants completed a study session in which they memorized associations between random words and random scenes (192 word-scene pairs). On the next day, participants were shown words they had seen before as part of the pairs, in addition to new words, while undergoing fMRI scanning. Participants were asked to report whether each word was old (seen before) or new, and, if old, how sure they were about whether the scene paired with that word was indoor or outdoor. At the beginning of each run (6 runs totalling 252 trials), participants in the stress group were told whether the run would be a shock run (an orange border appeared on the screen during the run) or a safe run (a teal border appeared on the screen during the run). During the shock runs, participants in the stress group received 1-2 electric shocks at some point during the run (previously calibrated for each individual to be moderately painful) via electrodes on the ankle. Participants in the control group were instead told that the run would be an 'orange run' or a 'teal run', and received no shocks. The authors hypothesized that stress hormones (released during stress) would act on the brain throughout the entire experiment in the stress group, and that the stress group would additionally be distracted by the stressful prospect of a shock during the shock runs only. Both groups also rated whether they felt positive (happy, safe) or negative (anxious, stressed) during each run of the experiment. [The study also included cortisol testing and weak and strong levels of memory encoding; see the paper for details.]
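
For readers who want the session structure at a glance, here is a minimal sketch of the retrieval session described above (6 runs of 42 trials each, 252 trials in total). The alternating run order and the exact shock timing are illustrative assumptions; the study may have ordered runs and delivered shocks differently.

```python
import random

def build_session(stress_group: bool, n_runs: int = 6, trials_per_run: int = 42):
    """Assemble one retrieval session: 6 runs x 42 trials = 252 trials."""
    session = []
    for run in range(1, n_runs + 1):
        border = "orange" if run % 2 == 1 else "teal"  # run order assumed to alternate
        threat = stress_group and border == "orange"   # shocks threaten only the stress group
        n_shocks = random.choice((1, 2)) if threat else 0
        shock_trials = sorted(random.sample(range(trials_per_run), k=n_shocks))
        session.append({"run": run, "border": border,
                        "threat_of_shock": threat, "shock_trials": shock_trials})
    return session

random.seed(1)
for run_info in build_session(stress_group=True):
    print(run_info)
```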

What did they find?

Stressed participants reported feeling more negative than control participants during shock runs, and more negative during their shock runs than during their own safe runs. Associative memory performance (recalling whether the scene was indoor or outdoor) was reduced by stress (in the stress group), whereas performance when simply discriminating old words from new words was only marginally affected. Surprisingly, the threat of shock (shock runs versus safe runs) within the stressed group did not affect associative memory performance. These results suggest that recall is impaired by general exposure to stress (stress group vs. control group), but not by the immediate presence of a distracting threat (shock runs vs. safe runs). The authors then examined neural activation across trials to study hippocampal activity. A larger increase in the hippocampal BOLD response (an fMRI measure of brain activity) was seen on trials in which recall was correct. The relationship between hippocampal activity and associative memory was also disrupted by stress: in the stress group (but not the control group), hippocampal activation was reduced during correct recollection of old words compared with correct rejection of new words. On trials in which participants reported certainty about whether the scene was indoor or outdoor, frontoparietal engagement was related to reaction time in the control group, but this relationship was disrupted in the stressed group. This result indicates that the control group, but not the stressed group, was able to flexibly recruit these networks.


What's the impact?

This study examines the neural mechanisms underlying changes in memory recollection during acute stress using fMRI. Stress can impair episodic memory recollection, likely by disrupting the relationship between hippocampal activity and recollection. The results have important implications for understanding how stress can impact memory, especially in clinical populations in which these neural mechanisms may be altered, for example in post-traumatic stress disorder.

Gagnon et al., Stress Impairs Episodic Retrieval by Disrupting Hippocampal and Cortical Mechanisms of Remembering. Cerebral Cortex (2018). Access the original scientific publication here.

The Presence of a Decoy Increases Cooperation in Decision-Making

What's the science?

Behavioral economics is a field of research examining how human cognitive biases often interfere with fully rational or 'optimal' behavior. One such bias, the 'decoy effect', shows that when two options are available in a decision-making scenario, adding a third, less desirable option can change the desirability of the other two (even when it would not be expected to). Evolutionary game theory is a field of research that studies how humans make decisions to optimize their payoff or reward in relation to others. One of its concepts, 'selection', describes how rational thinking eventually eliminates suboptimal behavior over time. We don't know how the path to optimal behavior might change when decisions involving others are made in the presence of a decoy. This week in Nature Communications, Wang and colleagues examine how behavior is affected by the presence of a decoy during a decision-making task called the Prisoner's Dilemma.

How did they do it?

Behavior was examined in 388 volunteers playing the Prisoner's Dilemma, a well-established decision-making task in which a participant must either cooperate (a safe option with a smaller reward) or defect (betray the opponent for a greater payoff). The game centers on two players: here, a human participant and a computer opponent, against whom the participant plays to maximize their reward. The computer opponent learns the behavior of the participant, so the game becomes a balancing act of cooperation and defection. Typically, participants tend to defect more (the payoff is disproportionately greater), even though this is not in their best interest, as mutual cooperation would yield the optimal payoff. The authors introduced a third, irrelevant option: a decoy called 'reward', with which the participant could reward their opponent. They monitored the participants' tendency to cooperate and defect with and without the presence of the 'reward' decoy option.
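
To illustrate the setup, here is a minimal sketch of a Prisoner's Dilemma payoff table extended with a third 'reward' action. The payoff numbers are illustrative assumptions rather than the study's actual values; they are chosen so that 'reward' behaves like cooperation but transfers one extra point to the opponent at a cost of one point to oneself, making it a dominated (inferior) decoy.

```python
# Payoff to the 'row' player for each (own action, opponent action) pair.
PAYOFF = {
    ("cooperate", "cooperate"): 3, ("cooperate", "defect"): 0,  ("cooperate", "reward"): 4,
    ("defect",    "cooperate"): 5, ("defect",    "defect"): 1,  ("defect",    "reward"): 6,
    ("reward",    "cooperate"): 2, ("reward",    "defect"): -1, ("reward",    "reward"): 3,
}

def round_payoffs(mine: str, theirs: str) -> tuple:
    """Payoffs to both players for a single round."""
    return PAYOFF[(mine, theirs)], PAYOFF[(theirs, mine)]

print(round_payoffs("defect", "cooperate"))  # (5, 0): the temptation to defect
print(round_payoffs("reward", "cooperate"))  # (2, 4): the dominated decoy option
```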

What did they find?

When the ‘reward’ decoy option (i.e. the option to reward the opponent) was present, participants showed more cooperative behavior (median frequency of 60.5%) than in the control condition (median frequency of 31.4%), despite the fact that they rarely chose the decoy option itself. These results suggest that the mere presence of the option to reward an opponent produces greater cooperation. The presence of the decoy both increased cooperation before the reward (i.e. decoy) option had even been chosen (i.e. in the first round) and increased the choice to cooperate following a cooperative choice or a reward (i.e. decoy) choice. Further, this effect stabilized over time. These results suggest conditional cooperation by participants (also known as tit-for-tat), depending on the opponent's most recent actions. The authors also found that this increased cooperativeness did not lead to a greater overall payoff, nor did choosing the reward option; however, the probability of success was higher for individuals who were more cooperative.
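
Here is a minimal sketch of the conditional, tit-for-tat-like pattern described above, assuming fixed response probabilities: the player tends to cooperate after the opponent's last action was cooperation or a reward, and to defect after a defection. The probabilities are illustrative assumptions, not estimates from the study.

```python
import random

# P(cooperate | opponent's previous action); values are assumed for illustration.
RESPONSE_P_COOPERATE = {"cooperate": 0.8, "reward": 0.8, "defect": 0.2}

def next_action(opponent_last: str) -> str:
    """Reciprocate cooperation (or a reward) and retaliate against defection."""
    p = RESPONSE_P_COOPERATE[opponent_last]
    return "cooperate" if random.random() < p else "defect"

random.seed(0)
opponent_moves = ["cooperate", "reward", "defect", "cooperate", "reward", "defect"]
my_moves = [next_action(move) for move in opponent_moves]
print(list(zip(opponent_moves, my_moves)))
```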


What's the impact?

This is the first study to demonstrate that the presence of a decoy (despite it being an inferior option) promotes cooperative behavior in a game-theory decision-making task. These findings suggest that decoys may ignite prosocial behavior across a range of social activities, which could result in better outcomes for cooperative individuals. This study has important implications for social decision-making in society.

Wang et al., Exploiting a cognitive bias promotes cooperation in social dilemma experiments. Nature Communications (2018). Access the original scientific publication here.