NeuroImage

Volume 109, 1 April 2015, Pages 273-282

Affective responses to emotional words are boosted in communicative situations

https://doi.org/10.1016/j.neuroimage.2015.01.031

Highlights

  • We investigated emotional words in a social-communicative context.

  • Emotion effects in ERPs are strongly enhanced in a communicative situation.

  • Negativity bias in isolated words turns into positivity bias in communicated words.

  • Social-communicative contexts are highly relevant in emotion processing.

Abstract

Emotional verbal messages are typically encountered in meaningful contexts, for instance, during face-to-face communication in social situations. Yet, they are often investigated by confronting single participants with isolated words on a computer screen, thus potentially lacking ecological validity. In the present study we recorded event-related brain potentials (ERPs) during emotional word processing in communicative situations provided by videos of a speaker, assuming that emotion effects should be augmented by the presence of a speaker addressing the listener. Indeed, compared to non-communicative situations or isolated word processing, emotion effects were more pronounced, started earlier and lasted longer in communicative situations. Furthermore, while the brain responded most strongly to negative words when presented in isolation, a positivity bias with more pronounced emotion effects for positive words was observed in communicative situations. These findings demonstrate that communicative situations – in which verbal emotions are typically encountered – strongly enhance emotion effects, underlining the importance of social and meaningful contexts in processing emotional and verbal messages.

Introduction

Emotional verbal messages, even in the form of single words, are typically encountered in wider meaningful contexts and rarely seen or heard in isolation. For instance, during reading, emotional words are often embedded in longer sentences or text passages. Crucially, in social communicative situations they are experienced in the presence of a speaker directly addressing the listener. Such meaningful contexts may have a strong influence on how we experience and evaluate single words and their emotional and semantic content (e.g. Hagoort and van Berkum, 2007, Nieuwland and Van Berkum, 2006). Yet, and in contrast to natural language processing in real-life situations, communicative aspects have not yet been taken into account in studies on emotional word processing. In the typical lab situation, single participants are placed in front of a computer screen, reading visually presented words of varying emotional content. Arguably, such situations lack ecological validity, and the potential of verbal stimuli to induce emotion effects may be largely underestimated (e.g. Recio et al., 2011, Trautmann et al., 2009).

Indeed, emotional words (e.g. Kissler et al., 2007, Schacht and Sommer, 2009a, Schacht and Sommer, 2009b) tend to induce weaker affective responses than other emotionally arousing stimuli such as facial expressions (Schupp et al., 2004b), body postures (Aviezer et al., 2012), gestures (Flaisch et al., 2011), and visual scenes or objects (e.g. Lang et al., 1993, Schupp et al., 2003). Two recent studies have suggested that the higher visual complexity of pictorial stimuli contributes to these differences (Schlochtermeier et al., 2013, Tempel et al., 2013). However, because they used simple line drawings and only positive and neutral stimuli that were furthermore matched for arousal, these studies are not fully conclusive. Another possible explanation, following from the discussion above, is that isolated words lack personal relevance when they are not embedded in personally meaningful contexts, such as communicative situations, that frame the emotional meaning of the words. Initial evidence for the contribution of context relevance in emotional word processing derives from studies providing self-relevant verbal contexts using sentences or personal pronouns preceding the emotional target words (Fields and Kuperberg, 2012, Herbert et al., 2011a, Herbert et al., 2011b). Taking additionally into account the symbolic nature of verbal stimuli – in contrast to the more direct emotional appeal of facial expressions or arousing scenes – the personal relevance of emotional words such as “love” or “idiot” may be considerably enhanced when they are experienced during face-to-face communication, resulting in stronger and more immediate affective responses.

Within current two-dimensional theories of emotion processing that focus on valence and arousal (Bradley and Lang, 2000, Lang, 1995, Lang et al., 1993, Lang et al., 1998) communicative situations can be assumed to increase the subjective valence of the words and/or the arousal they induce, and may thus intensify the emotional experience induced by those words. Alternatively, in appraisal theories of emotion (Ellsworth and Scherer, 2003, Scherer and Peper, 2001) communicative situations can be assumed to directly affect appraisal of the pleasantness and self-relevance of emotional words, including their potential consequences for the listener (Grandjean and Scherer, 2008, Grandjean et al., 2008). In particular, the self-relevance of emotional words may differ largely between face-to-face communication and encounters of context-free single words.

To date, there is, to the best of our knowledge, no direct empirical evidence on the processing of emotional words in social-communicative contexts. Thus, this study was designed to investigate the consequences of the presence of a speaker on emotional word processing with electrophysiological measures of brain activity. Our goal was to provide insights into how the brain responds to emotional words in more realistic communicative, and thus personally more relevant and ecologically more valid, situations. We contrasted affective responses to emotional words experienced during communicative situations with the processing of the identical words in non-communicative situations. To this end we presented videos of a speaker with direct eye gaze, conveying a neutral facial expression, uttering emotional and emotionally neutral words. A speaker's gaze turned towards the perceiver during verbal and non-verbal communication signals attention to the perceiver, which can be seen as a basic ingredient of face-to-face communication (e.g. Kampe et al., 2003, Vertegaal et al., 2001), enhances attention to the seen face (e.g. Bockler et al., 2014), and may facilitate speech perception, especially when only speech but no accompanying gesture information is present (Holler et al., 2014). Thus, as a control that also included the presentation of a face, we introduced a non-communicative condition in which videos of the same speaker's face with closed eyes and mouth were presented, signaling that the words were not uttered by the person seen in the video. Seeing a person's face while hearing other persons talk is a rather common situation in real life.

Please note that our focus here is on the social-communicative effects of a speaker directly addressing the listener, rather than on mechanisms of audiovisual integration. Ample evidence has demonstrated that the congruency of multimodal stimuli facilitates the perception and identification of emotional (e.g. Paulmann and Pell, 2011, Paulmann et al., 2009; see Klasen et al., 2012 for a review) and non-emotional speech signals (e.g. Schwartz et al., 2004, van Wassenhove et al., 2005). Such congruency is processed mandatorily (e.g. de Gelder and Vroomen, 2000), already during early perceptual processing stages (e.g. de Gelder et al., 1999, Gerdes et al., 2013, Pourtois et al., 2000, Pourtois et al., 2002, Stekelenburg and Vroomen, 2007), and probably involves specialized structures (e.g. de Gelder and Van den Stock, 2011), while incongruent audiovisual input can even lead to perceptual illusions (cf. the McGurk effect; e.g. McGurk and MacDonald, 1976). Our aim here was not to add evidence on this issue, but instead to compare the effects of emotional and non-emotional words separately in communicative and non-communicative situations, rather than directly contrasting word processing in the presence vs. absence of converging information from the visual modality or the congruency between visual and auditory information. Of course, the processing and integration of combined auditory and visual information is an integral component of face-to-face communication that should affect our responses to verbal messages. However, other determinants, such as enhanced attention towards the speaker (e.g. Bockler et al., 2014) and the enhanced personal/social relevance induced by being directly addressed, should have strong, and yet to be determined, effects. We believe that here the social-communicative aspects play a crucial role. For instance, in contrast to the social relevance manipulated here, there is no a priori reason to assume that audiovisual integration affects the processing of emotional and neutral words differentially. Thus, while audiovisual integration plays an undisputed role in face-to-face communication, the social and communicative aspects can be expected to specifically influence the processing of communicated emotional and personally relevant messages.

Finally, because, in contrast to the well-investigated effects of emotional word reading, little is known about the electrophysiological correlates of these effects in the auditory modality, we additionally conducted a pre-experiment in which the identical words were presented in isolation in the visual and auditory modalities.

We focused on a temporal and functional characterization of affective responses to socially communicated emotional words, exploiting the high temporal resolution of event-related brain potentials (ERPs). Two ERP components have been repeatedly reported to reflect emotional responses to different types of visual stimuli such as faces, scenes, objects or words. The first component is the early posterior negativity (EPN), a relative negative deflection over posterior brain regions, occurring around 200 to 300 ms (e.g. Flaisch et al., 2011, Herbert et al., 2008, Kissler et al., 2006, Recio et al., 2011, Schacht and Sommer, 2009a, Schacht and Sommer, 2009b, Schupp et al., 2004a). The EPN has been reported primarily for visual stimuli and is taken to reflect early reflexive attention to, and enhanced visual perception of, affective stimuli (e.g. Junghöfer et al., 2001, Kissler et al., 2007, Schupp et al., 2003; for reviews see Citron, 2012, Kissler et al., 2006). The EPN does not seem to be strongly modulated by the (semantic) depth of word processing or the nature of the task (Kissler et al., 2006, Schacht and Sommer, 2009b). Furthermore, this component has been demonstrated to vary independently of the self-reference of emotional stimuli. Specifically, emotional visual words induced comparable EPN modulations when preceded by personal pronouns (“my”) or definite articles without self-reference (“the”; e.g. Herbert et al., 2011b).

At later stages, the late positive potential (LPP), peaking at about 400 to 700 ms over centro-parietal regions, has been associated with the more elaborate processing and appraisal of the intrinsic relevance of emotional stimuli (e.g. Bayer et al., 2010, Cacioppo et al., 1993, Cuthbert et al., 2000, Schacht and Sommer, 2009a). This component is directly affected by task demands and the relevance of emotion for the task (e.g. Fischler and Bradley, 2006, Rellecke et al., 2011, Schacht and Sommer, 2009b). Furthermore, LPP amplitude has been shown to be enhanced for self-referential emotional stimuli (e.g. “my success” vs. “the success”; Herbert et al., 2011a, Herbert et al., 2011b).

If social communicative contexts, as hypothesized above, increase the personal relevance of emotional words and therefore the subjective valence and arousal levels, this intensified experience should be reflected in augmented emotion effects at early points in time (reflecting fast reflexive processing of emotional stimuli) and in later, more sustained evaluative processes (effects in the LPP component). While the LPP effects can be expected to be present irrespective of the presentation modality, the predictions for the expected early reflexive effects cannot directly be related to a specific component because thus far, early emotion effects have mostly been found at posterior sites for visual materials (EPN). However, analogously to the visual effects, there might be fast reflexive responses to auditorily presented emotional words (possibly at fronto-central regions, see below), which should be enhanced in communicative situations.
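As an illustration of how such component effects are typically quantified (this is a generic, hypothetical sketch, not the authors' analysis code): emotion effects are often expressed as mean-amplitude differences between emotional and neutral conditions within a priori time windows, e.g. roughly 200-300 ms for the EPN and 400-700 ms for the LPP. The sampling rate, window bounds, and array layout below are assumptions for the example.

```python
import numpy as np

SRATE = 500  # Hz (assumed); epochs start at stimulus onset (t = 0)

def mean_amplitude(epochs, t_start, t_end, srate=SRATE):
    """Mean voltage in [t_start, t_end) seconds, averaged over trials,
    channels and samples. epochs: (n_trials, n_channels, n_samples), in uV."""
    i0, i1 = int(t_start * srate), int(t_end * srate)
    return epochs[:, :, i0:i1].mean()

def emotion_effect(emotional, neutral, window):
    """Emotion effect = emotional minus neutral mean amplitude in a window."""
    return mean_amplitude(emotional, *window) - mean_amplitude(neutral, *window)

# Synthetic demonstration: inject an EPN-like relative negativity for
# emotional words between 200 and 300 ms (samples 100-150 at 500 Hz).
rng = np.random.default_rng(0)
neutral = rng.normal(0.0, 1.0, size=(40, 8, 500))   # 40 trials, 8 sites, 1 s
emotional = neutral.copy()
emotional[:, :, 100:150] -= 2.0

epn_effect = emotion_effect(emotional, neutral, (0.2, 0.3))  # negative: EPN-like
lpp_effect = emotion_effect(emotional, neutral, (0.4, 0.7))  # ~0 in this toy data
```

In practice such window means would be computed per participant and condition and entered into repeated-measures statistics; dedicated toolboxes (e.g. MNE-Python) provide the same functionality on real epoched EEG data.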

Section snippets

Pre-experiment

In the pre-experiment the processing of context-free visual and auditory emotional words was investigated. The purpose was to evaluate the stimulus materials and to test whether the typical emotion effects of isolated visual words can be observed with the present materials. Furthermore, we aimed at comparing the effects for visual words to the processing of the same words in the auditory modality. As mentioned above, there is very little evidence on the electrophysiological correlates of

Participants

Thirty native speakers of German (all women, right-handed, mean age: 25 years, range 18–37) received payment or course credit for participation. All participants gave informed consent to participate in the study and reported normal or corrected-to-normal vision and normal hearing. Data of two additional participants were excluded due to EEG artifacts. None of the participants took part in the rating of the materials or in the pre-experiment. The experiment was approved by the local ethics

Discussion

In the present study we investigated the impact of social-communicative contexts on affective brain responses to emotional words, assuming that communicative situations enhance the personal relevance, and therefore the susceptibility, for verbal emotional messages. The pre-experiment established the baseline effects of the emotional words used in the present study. In a standard situation in which isolated words are visually presented to single participants we observed an often reported

Conclusions

To summarize, we have demonstrated that affective brain responses to emotional words are enhanced if the words are encountered in communicative situations: the effects are amplified, begin earlier and last longer than in different non-communicative control conditions. This is in line with recent evidence suggesting that verbal contexts and assumptions about the sender of verbal messages (e.g. human sender vs. computer) affect the processing of neutral and emotional words (e.g. Graß et al., 2014

Acknowledgments

We thank Julia Baum, Marie Borowikow, Philipp Glage, Martin Maier and Johannes Rost for their help during data collection and Guido Kiecker for technical assistance.

Funding

This research was supported by a grant (AB 277/5) from the German Research Council to Rasha Abdel Rahman.

Conflict of interest

All the authors declare no conflict of interest.

References

  • P. Kanske et al.

    Concreteness in emotional words: ERP evidence from a hemifield study

    Brain Res

    (2007)
  • J. Kissler et al.

    Emotional and semantic networks in visual word processing: insights from ERP studies

    Understanding Emotions

    (2006)
  • J. Kissler et al.

    Emotion and attention in visual word processing: an ERP study

    Biol Psychol

    (2009)
  • S.A. Kotz et al.

    When emotional prosody and semantics dance cheek to cheek: ERP evidence

    Brain Res

    (2007)
  • P.J. Lang et al.

    Emotion, motivation, and anxiety: brain mechanisms and psychophysiology

    Biological Psychiatry

    (1998)
  • M. Palazova et al.

    Are effects of emotion in single words non-lexical? Evidence from event-related brain potentials

    Neuropsychologia

    (2011)
  • S. Paulmann et al.

    An ERP investigation on the temporal dynamics of emotional prosody and emotional semantics in pseudo- and lexical-sentence context

    Brain Lang

    (2008)
  • M.M. Plichta et al.

    Auditory cortex activation is modulated by emotion: a functional near-infrared spectroscopy (fNIRS) study

    NeuroImage

    (2011)
  • G. Pourtois et al.

    Facial expressions modulate the time course of long latency auditory brain potentials

    Cognitive Brain Research

    (2002)
  • G. Recio et al.

    Electrophysiological correlates of perceiving and evaluating static and dynamic facial emotional expressions

    Brain Research

    (2011)
  • J. Rellecke et al.

    On the automaticity of emotion processing in words and faces: event-related brain potentials evidence from a superficial task

    Brain Cogn

    (2011)
  • A. Schacht et al.

    Emotions in word and face processing: early and late cortical responses

    Brain Cogn

    (2009)
  • J.-L. Schwartz et al.

    Seeing to hear better: evidence for early audio-visual interactions in speech identification

    Cognition

    (2004)
  • K. Tempel et al.

    Effects of positive pictograms and words: an emotional word superiority effect?

    Journal of Neurolinguistics

    (2013)
  • S.A. Trautmann et al.

    Emotions in motion: dynamic compared to static facial expressions of disgust and happiness reveal more widespread emotion-specific activations

    Brain Research

    (2009)
  • I.J. Wambacq et al.

    Processing of affective prosody and lexical-semantics in spoken utterances as differentiated by event-related potentials

    Cognitive Brain Research

    (2004)
  • L.A. Watson et al.

    Seeing yourself in a positive light: brain correlates of the self-positivity bias

    Brain Res

    (2007)
  • H. Aviezer et al.

    Body cues, not facial expressions, discriminate between intense positive and negative emotions

    Science

    (2012)
  • M. Bayer et al.

    P1 and beyond: functional separation of multiple emotion effects in word recognition

    Psychophysiology

    (2012)
  • P. Berg et al.

    Dipole modelling of eye activity and its application to the removal of eye artefacts from the EEG and MEG

    Clin Phys Physiol Meas

    (1991)
  • A. Bockler et al.

    Catching eyes: effects of social and nonsocial cues on attention capture

    Psychol Sci

    (2014)
  • M.M. Bradley et al.

    Affective reactions to acoustic stimuli

    Psychophysiology

    (2000)
  • T. Brosch et al.

    Cross-modal emotional attention: emotional voices modulate early stages of visual processing

    Journal of Cognitive Neuroscience

    (2009)
  • J.T. Cacioppo et al.

    If attitudes affect how stimuli are processed, should they not affect the event-related brain potential?

    Psychological Science

    (1993)
  • B. de Gelder et al.

    Real Faces, Real Emotions: Perceiving Facial Expressions in Naturalistic Contexts of Voices, Bodies and Scenes

    (2011)
  • B. de Gelder et al.

    The perception of emotions by ear and by eye

    Cognition & Emotion

    (2000)
  • P.C. Ellsworth et al.

    Appraisal processes in emotion

    Handbook of affective sciences

    (2003)
  • T. Flaisch et al.

    Emotion and the processing of symbolic gestures: an event-related brain potential study

    Soc Cogn Affect Neurosci

    (2011)
  • A.B. Gerdes et al.

    Emotional sounds modulate early neural processing of emotional pictures

    Front Psychol

    (2013)
  • M.H. Giard et al.

    Auditory–visual integration during multimodal object recognition in humans: a behavioral and electrophysiological study

    Journal of Cognitive Neuroscience

    (1999)