According to the Conceptual Act Theory of Emotion, the situated conceptualization used to construe a situation determines the emotion experienced. A neuroimaging experiment tested two core hypotheses of this theory: (1) different situated conceptualizations produce different forms of the same emotion in different situations, and (2) the composition of a situated conceptualization emerges from shared multimodal circuitry distributed across the brain that produces emotional states generally. To test these hypotheses, the situation in which participants experienced an emotion was manipulated. On each trial, participants immersed themselves in a physical danger or social evaluation situation and then experienced fear or anger. According to Hypothesis 1, the brain activations for the same emotion should differ as a function of the preceding situation (after removing activations that arose while constructing the situation). According to Hypothesis 2, the critical activations should reflect conceptual processing relevant to the emotion in the current situation, drawn from shared multimodal circuitry underlying emotion. The results supported these predictions and demonstrated the compositional process that produces situated conceptualizations dynamically.
Many objects typically occur in particular locations, and object words encode these spatial associations. We tested whether such object words (e.g., head, foot) orient attention toward the location where the denoted object typically occurs (i.e., up, down). Because object words elicit perceptual simulations of the denoted objects (i.e., the representations acquired during actual perception are reactivated), we predicted that an object word would interfere with identification of an unrelated visual target subsequently presented in the object's typical location. Consistent with this prediction, three experiments demonstrated that words denoting objects that typically occur high in the visual field hindered identification of targets appearing at the top of the display, whereas words denoting low objects hindered target identification at the bottom of the display. Thus, object words oriented attention to and activated perceptual simulations in the objects' typical locations. These results shed new light on how language affects perception.
Separate, extended series of positive, negative, and neutral pictures were presented to 24 undergraduates (12 men, 12 women). Each series was presented on a different day, with full counterbalancing of presentation orders. Affective state was measured using (a) orbicularis oculi activity in response to acoustic startle probes during picture presentation, (b) corrugator supercilii activity between and during picture presentation, and (c) changes in self-reports of positive and negative affect. Participants exhibited larger eyeblink reflex magnitudes when viewing negative than when viewing positive pictures. Corrugator activity was also greater during the negative than during the positive picture set, during both picture presentation and the period between pictures. Self-reports of negative affect increased in response to the negative picture set, and self-reports of positive affect were greatest following the positive picture set. These findings suggest that extended picture presentation is an effective method of manipulating affective state and further highlight the utility of startle probe and facial electromyographic measures in providing on-line readouts of affective state.
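The fully counterbalanced design described above follows directly from the arithmetic of the study: three picture series yield 3! = 6 possible presentation orders, which divide evenly among 24 participants. A minimal sketch of that scheme (the series labels and the even assignment are illustrative assumptions, not the authors' materials):

```python
# Full counterbalancing of three picture series (one series per day):
# enumerate every possible presentation order and note how many
# participants each order can receive with a sample of 24.
from itertools import permutations

series = ["positive", "negative", "neutral"]
orders = list(permutations(series))

print(len(orders))        # 6 distinct presentation orders (3!)
print(24 // len(orders))  # 4 participants per order
```

With 24 participants, each of the six orders can be assigned to exactly four people, which is what makes full (rather than partial) counterbalancing feasible here.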
The capacity to stabilize the content of attention over time varies among individuals, and its impairment is a hallmark of several mental illnesses. Impairments in sustained attention in patients with attention disorders have been associated with increased trial-to-trial variability in reaction time and event-related potential deficits during attention tasks. At present, it is unclear whether the ability to sustain attention and its underlying brain circuitry are transformable through training. Here, we show, with dichotic listening task performance and electroencephalography, that training attention, as cultivated by meditation, can improve the ability to sustain attention. Three months of intensive meditation training reduced variability in attentional processing of target tones, as indicated by both enhanced theta-band phase consistency of oscillatory neural responses over anterior brain areas and reduced reaction time variability. Furthermore, those individuals who showed the greatest increase in neural response consistency showed the largest decrease in behavioral response variability. Notably, we also observed reduced variability in neural processing, in particular in low-frequency bands, regardless of whether the deviant tone was attended or unattended. Focused attention meditation may thus affect both distracter and target processing, perhaps by enhancing entrainment of neuronal oscillations to sensory input rhythms, a mechanism important for controlling the content of attention. These novel findings highlight the mechanisms underlying focused attention meditation and support the notion that mental training can significantly affect attention and brain function.
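Trial-to-trial reaction-time variability, the behavioral index in the study above, is commonly quantified as the standard deviation of RTs across trials. A minimal sketch of that computation, using invented RT values rather than the study's data:

```python
# Trial-to-trial RT variability as the standard deviation of reaction
# times across trials; the RT values (in ms) are invented for illustration.
import statistics

pre_training_rts = [412, 455, 390, 520, 365, 480, 430, 505]
post_training_rts = [418, 430, 405, 445, 410, 440, 425, 450]

sd_pre = statistics.stdev(pre_training_rts)
sd_post = statistics.stdev(post_training_rts)

# A lower SD after training indicates more consistent responding.
print(sd_post < sd_pre)  # True: variability decreased
```

In practice the coefficient of variation (SD divided by mean RT) is often reported instead, to control for overall speed differences across sessions.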
Mindfulness is defined as paying attention in the present moment. We investigate the hypothesis that mindfulness training may alter or enhance specific aspects of attention. We examined three functionally and neuroanatomically distinct but overlapping attentional subsystems: alerting, orienting, and conflict monitoring. Functioning of each subsystem was indexed by performance on the Attention Network Test. Two types of mindfulness training (MT) programs were examined, and behavioral testing was conducted on participants before (Time 1) and after (Time 2) training. One training group consisted of individuals naive to mindfulness techniques who participated in an 8-week mindfulness-based stress reduction (MBSR) course that emphasized the development of concentrative meditation skills. The other training group consisted of individuals experienced in concentrative meditation techniques who participated in a 1-month intensive mindfulness retreat. Performance of these groups was compared with that of control participants who were meditation naive and received no MT. At Time 1, the participants in the retreat group demonstrated improved conflict monitoring performance relative to those in the MBSR and control groups. At Time 2, the participants in the MBSR course demonstrated significantly improved orienting in comparison with the control and retreat participants. In contrast, the participants in the retreat group demonstrated altered performance on the alerting component, with improvements in exogenous stimulus detection in comparison with the control and MBSR participants. The groups did not differ in conflict monitoring performance at Time 2. These results suggest that mindfulness training may improve attention-related behavioral responses by enhancing functioning of specific subcomponents of attention. 
Whereas participation in the MBSR course improved the ability to endogenously orient attention, retreat participation appeared to allow for the development and emergence of receptive attentional skills, which improved exogenous alerting-related processes.
OBJECTIVE: Positron emission tomography was used to investigate the neural substrates of normal human emotion and their dependence on the type of emotional stimulus. METHOD: Twelve healthy female subjects underwent 12 measurements of regional brain activity following the intravenous bolus administration of [15O]H2O as they alternated between emotion-generating and control film and recall tasks. Automated image analysis techniques were used to characterize and compare the increases in regional brain activity associated with the emotional response to complex visual (film) and cognitive (recall) stimuli. RESULTS: Film- and recall-generated emotion were each associated with significantly increased activity in the vicinity of the medial prefrontal cortex and thalamus, suggesting that these regions participate in aspects of emotion that do not depend on the nature of the emotional stimulus. Film-generated emotion was associated with significantly greater increases in activity bilaterally in the occipitotemporoparietal cortex, lateral cerebellum, hypothalamus, and a region that includes the anterior temporal cortex, amygdala, and hippocampal formation, suggesting that these regions participate in the emotional response to certain exteroceptive sensory stimuli. Recall-generated sadness was associated with significantly greater increases in activity in the vicinity of the anterior insular cortex, suggesting that this region participates in the emotional response to potentially distressing cognitive or interoceptive sensory stimuli. CONCLUSIONS: While this study should be considered preliminary, it identified brain regions that participate in externally and internally generated human emotion.
OBJECTIVE: Happiness, sadness, and disgust are three emotions that differ in their valence (positive or negative) and associated action tendencies (approach or withdrawal). This study was designed to investigate the neuroanatomical correlates of these discrete emotions. METHOD: Twelve healthy female subjects were studied. Positron emission tomography and [15O]H2O were used to measure regional brain activity. There were 12 conditions per subject: happiness, sadness, and disgust and three control conditions, each induced by film and recall. Emotion and control tasks were alternated throughout. Condition order was pseudo-randomized and counterbalanced across subjects. Analyses focused on brain activity patterns for each emotion when combining film and recall data. RESULTS: Happiness, sadness, and disgust were each associated with increases in activity in the thalamus and medial prefrontal cortex (Brodmann's area 9). These three emotions were also associated with activation of anterior and posterior temporal structures, primarily when induced by film. Recalled sadness was associated with increased activation in the anterior insula. Happiness was distinguished from sadness by greater activity in the vicinity of ventral mesial frontal cortex. CONCLUSIONS: While this study should be considered preliminary, it identifies regions of the brain that participate in happiness, sadness, and disgust, regions that distinguish between positive and negative emotions, and regions that depend on both the elicitor and valence of emotion or their interaction.
In this article, the authors elaborate on 3 ideas advanced in P. Rozin and A. B. Cohen's (2003) innovative study of facial expression. Taking a cue from their discovery of new expressive behaviors (e.g., the narrowed eyebrows), the authors review recent studies showing that emotions are conveyed in more channels than usually studied, including posture, gaze patterns, voice, and touch. Building on their claim that confusion has a distinct display, the authors review evidence showing distinct displays for 3 self-conscious emotions (embarrassment, shame, and pride), 5 positive emotions (amusement, desire, happiness, love, interest), and sympathy and compassion. Finally, the authors offer a functional definition of emotion to integrate these findings on "new" displays and emotions.
Emotions can color people’s attitudes toward unrelated objects in the environment. Existing evidence suggests that such emotional coloring is particularly strong when emotion-triggering information escapes conscious awareness. But is emotional reactivity stronger after nonconscious emotional provocation than after conscious emotional provocation, or does conscious processing specifically change the association between emotional reactivity and evaluations of unrelated objects? In this study, we independently indexed emotional reactivity and coloring as a function of emotional-stimulus awareness to disentangle these accounts. Specifically, we recorded skin-conductance responses to spiders and fearful faces, along with subsequent preferences for novel neutral faces during visually aware and unaware states. Fearful faces increased skin-conductance responses comparably in both stimulus-aware and stimulus-unaware conditions. Yet only when visual awareness was precluded did skin-conductance responses to fearful faces predict decreased likability of neutral faces. These findings suggest a regulatory role for conscious awareness in breaking otherwise automatic associations between physiological reactivity and evaluative emotional responses.
According to the Perceptual Symbols Theory of cognition (Barsalou, 1999), modality-specific simulations underlie the representation of concepts. A strong prediction of this view is that perceptual processing affects conceptual processing. In this study, participants performed a perceptual detection task and a conceptual property-verification task in alternation. Responses on the property-verification task were slower for those trials that were preceded by a perceptual trial in a different modality than for those that were preceded by a perceptual trial in the same modality. This finding of a modality-switch effect across perceptual processing and conceptual processing supports the hypothesis that perceptual and conceptual representations are partially based on the same systems.
This experiment was designed to assess the differential impact of initially presenting affective information to the left versus right hemisphere on both the perception of and response to the input. Nineteen right-handed subjects were presented with faces expressing happiness and sadness. Each face was presented twice to each visual field for an 8-sec duration. The electro-oculogram (EOG) was monitored and fed back to subjects to train them to keep their eyes focused on the central fixation point as well as to eliminate trials confounded by eye movement artifact. Following each slide presentation, subjects rated the intensity of the emotional expression depicted in the face and their emotional reaction to the face on a series of 7-point rating scales. Subjects reported perceiving more happiness in response to stimuli initially presented to the left hemisphere (right visual field) compared to presentations of the identical faces to the right hemisphere (left visual field). This effect was predominantly a function of ratings on sad faces. A similar, albeit less robust, effect was found on self-ratings of happiness (the degree to which the face elicited the emotion in the viewer). These data challenge the view that the right hemisphere is uniquely involved in all emotional behavior. The implications of these findings for theories concerning the lateralization of emotional behavior are discussed.
This experiment was designed to test whether reading disabled boys differ from matched controls on behavioral measures of interhemispheric transfer time (IHTT). Specifically, we proposed that language-disordered reading disabled children who had deficits in naming would show either faster or slower IHTTs compared with controls. From an initial group of 118 right-handed males, we selected a group of 25 disabled and 25 normal readers, matched on age. All subjects had to obtain a full scale IQ of 90 or above, a PIQ score of 85 or above, and a scaled score of 7 or above on the Block Design Subtest of the WISC-R. After meeting additional criteria for group assignment, manual reaction time (RT) measures of IHTT were obtained in response to simple visual and tactile stimuli during two laboratory testing sessions. Half the trials were conducted with the hands in an uncrossed orientation and half with the hands crossed in order to examine the effects of spatial compatibility on estimates of IHTT. The results revealed no overall group differences in IHTT for any of the conditions. However, correlations between IHTT measures and indices of cognitive performance indicated that faster IHTTs were significantly correlated with poorer performance on measures of reading and language function in the dyslexic group. These data are discussed within the context of a model of interhemispheric transfer deficits in disabled readers.
Working memory (WM) comprises operations whose coordinated action contributes to our ability to maintain focus on goal-relevant information in the presence of distraction. The present study investigated the impact of distraction on the neural correlates of WM maintenance operations by presenting task-irrelevant distracters during the interval between the memoranda and probes of a delayed-response WM task. The study used a region-of-interest (ROI) approach to investigate the role of anterior (e.g., lateral and medial prefrontal cortex--PFC) and posterior (e.g., parietal and fusiform cortices) brain regions that have been previously associated with WM operations. Behavioral results showed that distracters that were confusable with the memorandum impaired WM performance, compared to either the presence of non-confusable distracters or to the absence of distracters. These different levels of distraction led to differences in the regional patterns of delay interval activity measured with event-related functional magnetic resonance imaging (fMRI). In the anterior ROIs, dorsolateral PFC activation was associated with WM encoding and maintenance and with maintaining a preparatory state, and ventrolateral PFC activation was associated with the inhibition of distraction. In the posterior ROIs, activation of the posterior parietal and fusiform cortices was associated with WM and perceptual processing, respectively. These findings provide novel evidence concerning the neural systems mediating the cognitive and behavioral responses during distraction, and place the frontal cortex at the top of the hierarchy of the neural systems responsible for cognitive control.
Four U.S. sites formed a consortium to conduct a multisite study of fMRI methods. The primary purpose of this consortium was to examine the reliability and reproducibility of fMRI results. fMRI data were collected on healthy adults during performance of a spatial working memory task at four different institutions. Each institution contributed two data sets. First, data from two subjects per site were processed and analyzed as a pooled data set. Second, statistical maps from five to eight subjects per site were provided. These images were aligned in stereotactic space, and common regions of activation were examined to address the reproducibility of fMRI results when both image acquisition and analysis vary as a function of site. Our grouped and individual data analyses showed reliable patterns of activation in dorsolateral prefrontal cortex and posterior parietal cortex during performance of the working memory task across all four sites. This multisite study, the first of its kind using fMRI data, demonstrates highly consistent findings across sites.
Work in philosophy and psychology has argued for a dissociation between perceptually-based similarity and higher-level rules in conceptual thought. Although such a dissociation may be justified at times, our goal is to illustrate ways in which conceptual processing is grounded in perception, both for perceptual similarity and abstract rules. We discuss the advantages, power, and influences of perceptually-based representations. First, many of the properties associated with amodal symbol systems can be achieved with perceptually-based systems as well (e.g., productivity). Second, relatively raw perceptual representations are powerful because they can implicitly represent properties in an analog fashion. Third, perception naturally provides impressions of overall similarity, exactly the type of similarity useful for establishing many common categories. Fourth, perceptual similarity is not static but becomes tuned over time to conceptual demands. Fifth, the original motivation or basis for sophisticated cognition is often less sophisticated perceptual similarity. Sixth, perceptual simulation occurs even in conceptual tasks that have no explicit perceptual demands. Parallels between perceptual and conceptual processes suggest that many mechanisms typically associated with abstract thought are also present in perception, and that perceptual processes provide useful mechanisms that may be co-opted by abstract thought.
This paper reports three studies showing sex differences in EEG asymmetry during self-generated cognitive and affective tasks. In the first experiment, bilateral EEG, quantified for alpha on-line, was recorded from right-handed subjects while they whistled, sang, or recited lyrics of familiar songs. The results revealed significant asymmetry between the whistle and talk conditions only for subjects with no familial left-handedness and, within this group, only for females and not for males. In the second experiment, bilateral EEG was recorded while right-handed subjects (with no familial left-handedness) self-induced covert affective and non-affective states. Results revealed significantly greater relative right-hemisphere activation during emotion versus non-emotion trials only in females; males showed no significant task-dependent shifts in asymmetry between conditions. The third experiment was designed to test the hypothesis that females show greater percent time asymmetry than males during biofeedback training for symmetrical and asymmetrical EEG patterns. Results confirmed this prediction as well as indicating that females show better control of such asymmetrical cortical patterning. These findings provide new neuropsychological support for the hypothesis of greater bilateral flexibility in females during self-generation tasks.
For decades the importance of background situations has been documented across all areas of cognition. Nevertheless, theories of concepts generally ignore background situations, focusing largely on bottom-up, stimulus-based processing. Furthermore, empirical research on concepts typically ignores background situations, not incorporating them into experimental designs. A selective review of relevant literatures demonstrates that concepts are not abstracted out of situations but instead are situated. Background situations constrain conceptual processing in many tasks (e.g., recall, recognition, categorization, lexical decision, color naming, property verification, property generation) across many areas of cognition (e.g., episodic memory, conceptual processing, visual object recognition, language comprehension). A taxonomy of situations is proposed in which grain size, meaningfulness, and tangibility distinguish the cumulative situations that structure cognition hierarchically.
Recent evidence suggests that perceptions of social class rank influence a variety of social cognitive tendencies, from patterns of causal attribution to moral judgment. In the present studies we tested the hypotheses that upper-class rank individuals would be more likely to endorse essentialist lay theories of social class categories (i.e., that social class is founded in genetically based, biological differences) than would lower-class rank individuals and that these beliefs would decrease support for restorative justice--which seeks to rehabilitate offenders, rather than punish unlawful action. Across studies, higher social class rank was associated with increased essentialism of social class categories (Studies 1, 2, and 4) and decreased support for restorative justice (Study 4). Moreover, manipulated essentialist beliefs decreased preferences for restorative justice (Study 3), and the association between social class rank and class-based essentialist theories was explained by the tendency to endorse beliefs in a just world (Study 2). Implications for how class-based essentialist beliefs potentially constrain social opportunity and mobility are discussed.
The study of emotional signaling has focused almost exclusively on the face and voice. In 2 studies, the authors investigated whether people can identify emotions from the experience of being touched by a stranger on the arm (without seeing the touch). In the 3rd study, they investigated whether observers can identify emotions from watching someone being touched on the arm. Two kinds of evidence suggest that humans can communicate numerous emotions with touch. First, participants in the United States (Study 1) and Spain (Study 2) could decode anger, fear, disgust, love, gratitude, and sympathy via touch at much-better-than-chance levels. Second, fine-grained coding documented specific touch behaviors associated with different emotions. In Study 3, the authors provide evidence that participants can accurately decode distinct emotions by merely watching others communicate via touch. The findings are discussed in terms of their contributions to affective science and the evolution of altruism and cooperation.
Spatial working memory is a cognitive brain mechanism that enables the temporary maintenance and manipulation of spatial information. Recent neuroimaging and behavioral studies have led to the proposal that directed spatial attention is the mechanism by which location information is maintained in spatial working memory. Yet it is unclear whether attentional involvement is required throughout the period of active maintenance or is only invoked during discrete task-phases such as mnemonic encoding. In the current study, we aimed to track the time-course of attentional involvement during spatial working memory by recording event-related brain potentials (ERPs) from healthy volunteers. In Experiment 1, subjects performed a delayed-recognition task. Each trial began with the presentation of a brief stimulus (S1) that indicated the relevant location that subjects were to maintain in working memory. A 4.8-5.3 sec delay interval followed during which a single task-irrelevant probe was presented. The delay interval concluded with a test item (S2) to which subjects made a response indicating whether the S2-location was the same as the S1-memory location. To determine if attention was differentially engaged during discrete phases of the trial, task-irrelevant probes were presented early (400-800 msec following S1-offset) or late (2600-3000 msec following S1-offset) during the delay interval. Sensory-evoked ERPs (P1 and N1) elicited by these irrelevant probes showed attention-like modulations with greater amplitude responses for probes occurring at the S1-memory locations in comparison to probes presented at other locations. This pattern was obtained for both early- and late-delay probes. Probe-evoked activity during delayed-recognition trials was similar to activity observed when spatial attention was explicitly focused on a location in visual space (Experiment 2). 
These results are consistent with a model of spatial working memory in which perceptual-level selective attention is utilized throughout the entire period of active maintenance to keep relevant spatial information in mind.
Studies of emotion signaling inform claims about the taxonomic structure, evolutionary origins, and physiological correlates of emotions. Emotion vocalization research has tended to focus on a limited set of emotions: anger, disgust, fear, sadness, surprise, happiness, and for the voice, also tenderness. Here, we examine how well brief vocal bursts can communicate 22 different emotions: 9 negative (Study 1) and 13 positive (Study 2), and whether prototypical vocal bursts convey emotions more reliably than heterogeneous vocal bursts (Study 3). Results show that vocal bursts communicate emotions like anger, fear, and sadness, as well as seldom-studied states like awe, compassion, interest, and embarrassment. Ancillary analyses reveal family-wise patterns of vocal burst expression. Errors in classification were more common within emotion families (e.g., 'self-conscious,' 'pro-social') than between emotion families. The three studies reported highlight the voice as a rich modality for emotion display that can inform fundamental constructs about emotion.
Research on temporal-order judgments, reference frames, discrimination tasks, and links to oculomotor control suggests important differences between inhibition of return (IOR) and attentional costs and benefits. Yet, it is generally assumed that IOR is an attentional effect even though there is little supporting evidence. The authors evaluated this assumption by examining how several factors that are known to influence attentional costs and benefits affect the magnitude of IOR: target modality, target intensity, and response mode. Results similar to those previously reported for attention were observed: IOR was greater for visual than for auditory targets, showed an inverse relationship with target intensity, and was equivalent for manual and saccadic responses. Important parallels between IOR and attentional costs and benefits are indicated, suggesting that, like attention, IOR may in part affect sensory-perceptual processes.