We experience potentially emotive stimuli all the time. Some of us suffer intense outrage when we mistakenly tune into Fox News. Others have the same experience when they stumble upon CNN. We all have developed strategies to cope with those events, a skill known as emotion regulation.
Although emotions are often portrayed as “irresistible forces” that control our lives (there is even a band by that name), people are in fact very good at dealing with their emotions and regulating them. Most of us do not break into tears when we have slight indigestion, and we continue to function well even when we find the latest atrocities committed somewhere in the world appalling and distressing.
A large body of research has examined emotion regulation, and emphasis has recently shifted towards the important role of context in self-regulation. The type of strategy we use to regulate our emotions often differs between contexts. For example, if we encounter a rude or offensive person, we may seek to minimize our anger by reasoning that the person is simply having a bad hair day (a strategy known as reappraisal). In another context, for example when we feel embarrassed in public, we may wish to avoid expressing that feeling altogether (a strategy known as suppression).
Perhaps unsurprisingly, failure to choose a contextually appropriate emotion-regulation strategy has been associated with psychopathology. Expressing sadness in bereavement will elicit sympathy and support, but breaking into tears when someone tells a bad joke would usually be considered inappropriate and may disrupt a person’s social bonds. There is evidence that depressed individuals are particularly prone to constricted emotional behaviors in both positive and negative emotional contexts, suggesting that their emotion regulation is not contextually appropriate.
A recent article in the Psychonomic Society’s journal Cognitive, Affective, & Behavioral Neuroscience sought to shed further light on individual differences in emotion regulation, with a particular emphasis on its neurocognitive (i.e., physiological) markers and its relation to other components of flexible self-regulation. Researchers Sarah Myruski, George Bonanno, Olga Gulyayeva, Laura Egan, and Tracy Dennis-Tiwary presented participants with an emotional go/no-go task while recording their EEG.
The defining attribute of a go/no-go task is that participants are told to press a button only when certain conditions hold. For example, they may have to press a button when a green triangle appears, but withhold a response for any other triangle or other shape. In an emotional go/no-go task, the two classes of stimuli correspond to faces that display different emotions. Myruski and colleagues used six different conditions; within each, one face type was the “go” signal and the other face type the “no-go” stimulus.
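To make the logic concrete, here is a minimal sketch in Python (our illustration, not the authors’ experimental code) of how a single trial in such a task might be scored:

```python
def score_trial(face: str, go_face: str, pressed: bool) -> str:
    """Classify one trial, given which face type is the 'go' signal in this block."""
    is_go = (face == go_face)
    if is_go and pressed:
        return "hit"               # pressed on a go trial
    if is_go and not pressed:
        return "miss"              # withheld on a go trial
    if not is_go and pressed:
        return "false alarm"       # pressed on a no-go trial
    return "correct rejection"     # withheld on a no-go trial

# Example block: happy faces are "go", fearful faces are "no-go"
print(score_trial(face="fearful", go_face="happy", pressed=True))   # -> false alarm
```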
The figure below illustrates one of the conditions: in this case the happy faces signal “go” and fearful faces are the “no-go” stimulus.
All possible pairings of faces that were happy (H), neutral (N), or fearful (F) were used across conditions, although for purposes of analysis only three over-arching contexts were relevant: HN (which is formed by combining happy-go/neutral-no-go and neutral-go/happy-no-go), FH (analogous combination of conditions), and FN. Participants’ EEG was recorded during performance of the go/no-go task.
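The design is easy to reproduce: the six conditions are simply all ordered go/no-go pairings of the three face types, and the three analysis contexts are the corresponding unordered pairs. A brief sketch (illustrative only, not the authors’ code):

```python
from itertools import permutations

faces = ["H", "N", "F"]                    # happy, neutral, fearful

# The six conditions: every ordered (go, no-go) pairing of two face types.
conditions = list(permutations(faces, 2))
print(conditions)        # [('H','N'), ('H','F'), ('N','H'), ('N','F'), ('F','H'), ('F','N')]

# The three analysis contexts: the same pairings with the go/no-go roles collapsed.
contexts = {"".join(sorted(pair)) for pair in conditions}
print(sorted(contexts))  # ['FH', 'FN', 'HN']
```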
In addition, participants responded to questionnaires that probed their emotional well-being (via the Beck Depression Inventory and the Beck Anxiety Inventory). Participants also provided self-ratings of their emotion regulation and flexibility.
The results are best discussed by presenting the behavioural data first, before turning to the neurocognitive measures. The first question of interest to Myruski and colleagues was whether emotional faces (happy or fearful) would facilitate or disrupt performance on the go/no-go task relative to neutral faces. The second, related question was whether this disruption would differ depending on the broader “affective context”; that is, whether the emotional faces were happy or fearful.
The data were analysed using d’ scores, which combine the hit rate (pressing a key in response to a “go” stimulus) and the false-alarm rate (pressing a key in response to a “no-go” stimulus) into a single measure of how well participants discriminated “go” from “no-go” faces: each rate is converted to a z score, and d’ is the difference between the two, with higher values indicating better performance.
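For readers who would like to compute d’ for themselves, here is a brief sketch of the standard signal-detection calculation (our illustration; the authors may have used a different correction for extreme rates):

```python
from statistics import NormalDist

def d_prime(hits: int, misses: int, false_alarms: int, correct_rejections: int) -> float:
    """Standard signal-detection d': z(hit rate) - z(false-alarm rate)."""
    n_go = hits + misses
    n_nogo = false_alarms + correct_rejections
    # Nudge rates of exactly 0 or 1 so the z-transform stays finite
    # (a common convention; not necessarily the correction used in the paper).
    hit_rate = min(max(hits / n_go, 0.5 / n_go), 1 - 0.5 / n_go)
    fa_rate = min(max(false_alarms / n_nogo, 0.5 / n_nogo), 1 - 0.5 / n_nogo)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Hypothetical example: 45 hits, 5 misses, 8 false alarms, 42 correct rejections
print(round(d_prime(45, 5, 8, 42), 2))   # roughly 2.28
```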
Overall, d’ was greater when emotional (as opposed to neutral) faces were involved. This is shown in the figure below: the bars are cross-hatched in the two shades corresponding to the conditions in the legend, indicating that those two conditions were combined.
When the effect of emotion was disentangled further, by considering the trials involving fearfulness and happiness separately, a strong effect of valence emerged. This is shown in the next figure:
Taken together, the data suggest that emotion enhanced context sensitivity overall, as revealed by the benefit on d’ for emotional faces when they alternated with neutral faces. When the effect was broken down by valence, however, performance was considerably enhanced for happy faces but disrupted for fearful faces.
Turning to the physiological measures, Myruski and colleagues focused on two components of the event-related brain potential (ERP). As we noted on this blog earlier, ERPs are a long-standing tool in experimental cognitive psychology, and they offer the distinct advantage of providing a measure of cognitive activity without requiring a behavioural response. The first component of interest is the N170, which arises some 150-180 ms after stimulus presentation and is thought to reflect early attentional selection and discrimination, particularly for faces. A later component, known as the N2, kicks in approximately 200 to 300 ms after stimulus onset and is maximal over frontal regions of the scalp. Greater N2 amplitudes are thought to index successful response inhibition on no-go trials.
The ERP results largely mirrored the behavioural results. N170 amplitudes were larger for both fearful and happy faces than for neutral faces, with fearful faces eliciting the largest amplitudes. Myruski and colleagues interpreted this result as showing that attentional selection was greater for fearful than for neutral and happy faces. By contrast, N2 amplitudes were dampened by emotional stimuli, with fearful no-go faces eliciting the smallest N2 response of all stimuli. That is, fearful no-go faces elicited the weakest response inhibition, which contributed to the poor performance (as measured by d’) on those trials.
In addition, Myruski and colleagues found that performance correlated with people’s self-reported well-being: the greater a participant’s d’ score overall, the lower that person’s depressive symptoms. The two neurocognitive measures likewise predicted well-being: the greater the N170 and N2 amplitudes, the lower the depressive and anxiety symptoms on the self-report inventories. Myruski and colleagues interpret these correlations as showing that:
“… the ability to harness and overcome the respective influence of pleasant and unpleasant emotional contexts may be important predictors of emotional wellbeing that should be distinguished from context-independent discrete strategy use.”
In addition, the data show that emotional context was a determining factor in performance. Happy faces facilitated performance (and increased hit rates more than they decreased false alarms). Conversely, fearful faces reduced inhibition and thus also performance.
The results of Myruski and colleagues set the stage for a further examination of emotion regulation in the emotional go/no-go task in clinically anxious or clinically depressed participants. Given the relative simplicity of the task, it may offer a promising avenue to teach patients various emotion regulation strategies and to provide feedback about their performance.
Psychonomics article focused on in this post:
Myruski, S., Bonanno, G. A., Gulyayeva, O., Egan, L. J., & Dennis-Tiwary, T. A. (2017). Neurocognitive assessment of emotional context sensitivity. Cognitive, Affective, & Behavioral Neuroscience, 17, 1058-1071. DOI: 10.3758/s13415-017-0533-9.