We investigated the top-down influence of working memory (WM) maintenance on feedforward perceptual processing within occipito-temporal face processing structures. During event-related potential (ERP) recordings, subjects performed a delayed-recognition task requiring WM maintenance of faces or houses. The face-sensitive N170 component elicited by delay-spanning task-irrelevant grayscale noise probes was examined. If early feedforward perceptual activity is biased by maintenance requirements, the N170 elicited by probes should be larger in amplitude during face than during house WM trials. Consistent with this prediction, the N170 elicited by probes presented at the beginning, middle, and end of the delay interval was greater in amplitude during face relative to house WM. Thus, these results suggest that WM maintenance demands may modulate early feedforward perceptual processing throughout the delay period. We argue based on these results that temporally early biasing of domain-specific perceptual processing may be a critical mechanism by which WM maintenance is achieved.
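The analysis described above rests on comparing a component's mean amplitude in a fixed time window across conditions, within subjects. The sketch below illustrates that logic on simulated data; the sampling rate, window bounds, effect sizes, and array shapes are all illustrative assumptions, not the study's parameters.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)

# Hypothetical epoched ERPs: (subjects, timepoints) mean waveforms at a
# right occipito-temporal electrode, 500 Hz, starting 100 ms before the probe.
fs, t0 = 500, -0.1
n_subj, n_samp = 20, 300
times = t0 + np.arange(n_samp) / fs

def n170_mean_amplitude(erp, times, window=(0.15, 0.19)):
    """Mean amplitude per subject in the N170 window (negative-going)."""
    mask = (times >= window[0]) & (times <= window[1])
    return erp[:, mask].mean(axis=1)

# Simulated effect: probes during face WM evoke a larger (more negative) N170.
win = (times >= 0.15) & (times <= 0.19)
base = rng.normal(0, 1, (n_subj, n_samp))
face_wm = base + rng.normal(0, 0.5, (n_subj, n_samp))
face_wm[:, win] -= 2.0    # enhanced N170 under face maintenance
house_wm = base + rng.normal(0, 0.5, (n_subj, n_samp))
house_wm[:, win] -= 1.0   # smaller N170 under house maintenance

amp_face = n170_mean_amplitude(face_wm, times)
amp_house = n170_mean_amplitude(house_wm, times)
t, p = ttest_rel(amp_face, amp_house)  # paired comparison across subjects
print(f"face mean={amp_face.mean():.2f} uV, house mean={amp_house.mean():.2f} uV, p={p:.2e}")
```

Because the N170 is negative-going, a "greater" N170 corresponds to a more negative mean amplitude, which is why the face condition should yield the lower value here.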
The study of emotional signaling has focused almost exclusively on the face and voice. In 2 studies, the authors investigated whether people can identify emotions from the experience of being touched by a stranger on the arm (without seeing the touch). In the 3rd study, they investigated whether observers can identify emotions from watching someone being touched on the arm. Two kinds of evidence suggest that humans can communicate numerous emotions with touch. First, participants in the United States (Study 1) and Spain (Study 2) could decode anger, fear, disgust, love, gratitude, and sympathy via touch at much-better-than-chance levels. Second, fine-grained coding documented specific touch behaviors associated with different emotions. In Study 3, the authors provide evidence that participants can accurately decode distinct emotions by merely watching others communicate via touch. The findings are discussed in terms of their contributions to affective science and the evolution of altruism and cooperation.
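"Much better than chance" in a forced-choice decoding task is typically established with a binomial test against the chance rate. The fragment below shows that computation on made-up numbers; the number of response options, trial count, and accuracy are illustrative assumptions, not the study's data.

```python
from scipy.stats import binomtest

# Hypothetical forced-choice decoding: each decoder picks one of k emotion
# labels for a received touch. Illustrative counts, not the study's results.
k = 12                    # assumed number of response options
chance = 1 / k
correct, trials = 30, 60  # e.g., 50% accuracy for one emotion category

result = binomtest(correct, trials, p=chance, alternative="greater")
print(f"accuracy={correct / trials:.2f}, chance={chance:.3f}, p={result.pvalue:.2e}")
```

The one-sided alternative is appropriate because only above-chance accuracy counts as successful decoding.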
Planned and reflexive behaviors often occur in the presence of emotional stimuli and within the context of an individual's acute emotional state. Therefore, determining the manner in which emotion and attention interact is an important step toward understanding how we function in the real world. Participants in the current investigation viewed centrally displayed, task-irrelevant, face distractors (angry, neutral, happy) while performing a lateralized go/no-go continuous performance task. Lateralized go targets and no-go lures that did not spatially overlap with the faces were employed to differentially probe processing in the left (LH) and right (RH) cerebral hemispheres. There was a significant interaction between expression and hemisphere, with an overall pattern such that angry distractors were associated with relatively more RH inhibitory errors than neutral or happy distractors and happy distractors with relatively more LH inhibitory errors than angry or neutral distractors. Simple effects analyses confirmed that angry faces differentially interfered with RH relative to LH inhibition and with inhibition in the RH relative to happy faces. A significant three-way interaction further revealed that state anxiety moderated relations between emotional expression and hemisphere. Under conditions of low cognitive load, more intense anxiety was associated with relatively greater RH than LH impairment in the presence of both happy and threatening distractors. By contrast, under high load, only angry distractors produced greater RH than LH interference as a function of anxiety.
Recent neuroimaging and neuropsychological work has begun to shed light on how the brain responds to the viewing of facial expressions of emotion. However, one important category of facial expression that has not been studied on this level is the facial expression of pain. We investigated the neural response to pain expressions by performing functional magnetic resonance imaging (fMRI) as subjects viewed short video sequences showing faces expressing either moderate pain or, for comparison, no pain. In alternate blocks, the same subjects received both painful and non-painful thermal stimulation. Facial expressions of pain were found to engage cortical areas also engaged by the first-hand experience of pain, including anterior cingulate cortex and insula. The reported findings corroborate other work in which the neural response to witnessed pain has been examined from other perspectives. In addition, they lend support to the idea that common neural substrates are involved in representing one's own and others' affective states.
Four experiments testing right-handed adult males examined interhemispheric transfer time (IHTT) estimation with visual evoked potentials (EPs) elicited in response to hemiretinal presentations of checkerboard-flash stimuli. Experiment 1 was a study of the relation between reaction time (RT) and EP measures of IHTT. EP measures provided more valid estimates than RT measures because more subjects showed IHTT in the direction of anatomical prediction. Experiment 2 showed that EPs derived from lateral occipital sites provided more valid and longer estimates of IHTT compared with EPs from medial occipital sites. Experiment 3 showed no difference between random versus blocked hemiretinal stimuli. Experiment 4 showed that IHTT derived with a linked-ears reference provided more valid estimates than IHTT derived with a mid-frontal reference and that small changes in stimulus eccentricity did not influence IHTT. The findings of these experiments indicate that noninvasive estimates of visual IHTT can be obtained in humans.
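EP-based IHTT estimates of the kind described above are derived from the latency difference between the evoked response in the hemisphere contralateral to the stimulated hemifield (the direct response) and the response in the ipsilateral hemisphere (which arrives after callosal transfer). The sketch below illustrates this on simulated waveforms; the sampling rate, peak latencies, search window, and noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 1000                                # Hz, hypothetical sampling rate
times = np.arange(0, 400) / fs * 1000    # 0-399 ms post-stimulus

def peak_latency(erp, times, window=(80, 160)):
    """Latency (ms) of the largest positive deflection in a search window."""
    mask = (times >= window[0]) & (times <= window[1])
    return times[mask][np.argmax(erp[mask])]

def gaussian_peak(times, center_ms, width_ms=15.0):
    return np.exp(-0.5 * ((times - center_ms) / width_ms) ** 2)

# Left-hemifield flash: the right (contralateral) occipital response peaks
# first; the left (ipsilateral) response follows after interhemispheric transfer.
right_occ = gaussian_peak(times, 110) + rng.normal(0, 0.02, times.size)
left_occ = gaussian_peak(times, 125) + rng.normal(0, 0.02, times.size)

ihtt = peak_latency(left_occ, times) - peak_latency(right_occ, times)
print(f"estimated IHTT = {ihtt:.0f} ms")
```

Averaging estimates from both stimulation directions (left and right hemifield) is the usual way to cancel hemispheric latency asymmetries unrelated to transfer.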
Research on temporal-order judgments, reference frames, discrimination tasks, and links to oculomotor control suggests important differences between inhibition of return (IOR) and attentional costs and benefits. Yet, it is generally assumed that IOR is an attentional effect even though there is little supporting evidence. The authors evaluated this assumption by examining how several factors that are known to influence attentional costs and benefits affect the magnitude of IOR: target modality, target intensity, and response mode. Results similar to those previously reported for attention were observed: IOR was greater for visual than for auditory targets, showed an inverse relationship with target intensity, and was equivalent for manual and saccadic responses. Important parallels between IOR and attentional costs and benefits are indicated, suggesting that, like attention, IOR may in part affect sensory-perceptual processes.
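In cue-target paradigms, the magnitude of IOR is conventionally quantified as the reaction-time cost at the previously cued location: mean RT for cued-location targets minus mean RT for uncued-location targets at long cue-target intervals. The snippet below shows that computation on simulated reaction times; all RT distributions and the size of the modality difference are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def ior_magnitude(rt_cued, rt_uncued):
    """IOR effect (ms): slower responses at the previously cued location."""
    return np.mean(rt_cued) - np.mean(rt_uncued)

# Hypothetical RTs (ms) at long cue-target onset asynchronies, after early
# facilitation at the cued location has reversed into a cost.
rt_cued_visual = rng.normal(380, 30, 100)
rt_uncued_visual = rng.normal(350, 30, 100)
rt_cued_auditory = rng.normal(370, 30, 100)
rt_uncued_auditory = rng.normal(355, 30, 100)

ior_visual = ior_magnitude(rt_cued_visual, rt_uncued_visual)
ior_auditory = ior_magnitude(rt_cued_auditory, rt_uncued_auditory)
print(f"IOR visual={ior_visual:.0f} ms, auditory={ior_auditory:.0f} ms")
```

A positive difference indicates inhibition at the cued location; a larger difference for visual than auditory targets would mirror the modality effect reported above.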
Working memory (WM) representations serve as templates that guide behavior, but the neural basis of these templates remains elusive. We tested the hypothesis that WM templates are maintained by biasing activity in sensoriperceptual neurons that code for features of items being held in memory. Neural activity was recorded using event-related potentials (ERPs) as participants viewed a series of faces and responded when a face matched a target face held in WM. Our prediction was that if activity in neurons coding for the features of the target is preferentially weighted during maintenance of the target, then ERP activity evoked by a nontarget probe face should be commensurate with the visual similarity between target and probe. Visual similarity was operationalized as the degree of overlap in visual features between target and probe. A face-sensitive ERP response was modulated by target-probe similarity. Amplitude was largest for probes that were similar to the target, and decreased monotonically as a function of decreasing target-probe similarity. These results indicate that neural activity is weighted in favor of visual features that comprise an actively held memory representation. As such, our findings support the notion that WM templates rely on neural populations involved in forming percepts of memory items.
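The key claim above is a monotonic relation: ERP amplitude should decrease step by step as target-probe feature overlap decreases, which a rank-order statistic such as Spearman's rho captures directly. The sketch below illustrates this on made-up group-level values; the similarity bins and amplitudes are illustrative assumptions, not the study's measurements.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical group-level data: probe faces binned by the proportion of
# visual features shared with the target held in WM, and the mean magnitude
# (in microvolts) of a face-sensitive ERP response to probes in each bin.
similarity = np.array([1.00, 0.75, 0.50, 0.25, 0.00])  # target-probe overlap
amplitude = np.array([6.1, 5.4, 4.6, 3.9, 3.2])        # ERP response magnitude

rho, p = spearmanr(similarity, amplitude)
print(f"Spearman rho={rho:.2f} (monotonic similarity-amplitude relation)")
```

A rho near +1 is what the biased-template account predicts: probes sharing more features with the maintained target recruit more of the weighted neural population.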