
Study title: The effect of emotion on averted gaze cue time with human and nonhuman faces

Principal Investigator: Jessica Green and Brooke Troxell

Specific Aims
The purpose of this study is to investigate how displayed emotion affects target
identification when faces show either direct or averted gaze. We also want to determine
whether human and nonhuman faces are processed in the same way by attentional pathways in
the brain and whether they produce the same response times.

Background and Significance

Previous studies have shown that perceived gaze direction can be used to shift visual
attention (Frischen et al., 2007). This shift in attention is a socially evolved, automatic
response used to orient oneself to the object that another person is looking at (Adams et al.,
2010). Recent research has focused on identifying the brain pathways and mechanisms that
produce this automatic response and on how it influences cognitive abilities (Hietanen et al.,
2008). However, most studies to date have used only neutral faces or schematic eyes to
examine these pathways (George et al., 2008), and only a few have examined them in the
context of fear (Matthews et al., 2003; Wieser et al., 2014). Since perceived gaze direction is
hypothesized to be a socially evolved cue, and emotions are also a vital part of human social
interaction, we want to determine whether emotions other than fear influence the shifting of
visual attention and the pathways through which this information is processed.

Therefore, the goal of the current study is to assess whether displayed emotion influences
shifts in visual attention and whether these shifts influence cognitive responses to stimuli. The
study also examines whether human and nonhuman faces are processed in the same way and
whether face type interacts with displayed emotion to affect cognitive processing.

Research Design and Methods:


All data collection and analysis will take place in the PI's lab (1800 Gervais St.). The sound-
attenuated experimental chamber contains a computer monitor and speakers from which all
stimuli will be presented. Participants will sit in a comfortable chair facing the computer (at a
distance of ~60 cm) and respond to all stimuli using a computer gamepad.

Questionnaires: Prior to completing the attention task, each participant will complete a 20-
item checklist designed to screen adults for dyslexia (the Revised Adult Dyslexia Checklist;
RADC). Participants indicate if each item (e.g., Do you dislike reading long books?) applies to
them or not by checking a box for Yes or No (Vinegrad, 1994).
Because there is often comorbidity between disorders, many of which are also linked to
attention deficits, we will also collect non-diagnostic screening measures of ADHD and autism
from each participant. This will allow us not only to investigate our a priori hypotheses about
the link between visual attention and dyslexia while controlling for symptoms of other
disorders, but also to explore the data based on those measures. The Jasper-Goldberg Adult
ADD Screening Examination (version 5.0) is a 24-item questionnaire that assesses symptoms
of attention-deficit disorder in adults. Participants rate each item (e.g., Even when sitting
quietly, I am usually moving my hands or feet) on a scale from 0 to 5, with 0 meaning not at all
and 5 meaning very much (Jasper & Goldberg, 1993). The Adult Autism Spectrum Quotient is
a 50-item questionnaire assessing the degree of autism-like traits. Participants read each item
(e.g., I enjoy social occasions) and choose one of four options indicating how much they agree
with the statement: definitely agree, slightly agree, slightly disagree, or definitely disagree
(Baron-Cohen et al., 2001). We also include a demographic questionnaire containing questions
about age, gender, ethnicity, and race, as well as about cell phone and other mobile device
usage and the frequency of texting and instant messaging.
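
For illustration only, the sketch below shows one way responses to the three screening
questionnaires could be encoded and summed. The item counts and response formats follow the
descriptions above, but the function names and scoring conventions (simple sums of endorsed
items) are assumptions, not the instruments' published scoring keys.

# Illustrative sketch only: encodes the response formats described above.
# Simple-sum scoring here is an assumption, not the published scoring keys.

def score_radc(responses):
    """RADC: 20 Yes/No items; count the items endorsed 'Yes'."""
    assert len(responses) == 20
    return sum(1 for r in responses if r == "Yes")

def score_add_screener(ratings):
    """Jasper-Goldberg screener: 24 items rated 0 (not at all) to 5 (very much)."""
    assert len(ratings) == 24 and all(0 <= r <= 5 for r in ratings)
    return sum(ratings)

def score_aq(responses):
    """Adult Autism Spectrum Quotient: 50 four-option agreement items.
    Here any agreement scores 1 point; the published AQ also reverse-scores
    about half of its items, which this sketch omits."""
    assert len(responses) == 50
    agree = {"definitely agree", "slightly agree"}
    return sum(1 for r in responses if r.lower() in agree)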

Experimental Task: Participants will perform a simple visual search attention task on a
computer. Each trial presents a face, either human or nonhuman, displaying one of six
emotions: happiness, sadness, anger, surprise, fear, or a neutral expression. On some trials
the face's eye gaze will be averted, indicating the location where the visual target stimulus
may appear. Participants will indicate which side of the screen the target appeared on using
the game controller.
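
As a concrete illustration of the task structure, the sketch below crosses the factors described
above (face type, emotion, gaze, target side) into a randomized trial list. The specific
gaze-direction levels, repetition count, and Python representation are assumptions for
illustration; they are not the lab's actual stimulus-presentation code.

# Illustrative sketch of the trial design described above. The gaze levels,
# repetition count, and data layout are assumptions, not the lab's actual code.
import itertools
import random

FACE_TYPES = ["human", "nonhuman"]
EMOTIONS = ["happiness", "sadness", "anger", "surprise", "fear", "neutral"]
GAZE = ["direct", "averted-left", "averted-right"]   # assumed levels
TARGET_SIDES = ["left", "right"]

def build_trial_list(repetitions=2, seed=0):
    """Fully cross the factors, repeat each cell, and shuffle into a run order."""
    cells = list(itertools.product(FACE_TYPES, EMOTIONS, GAZE, TARGET_SIDES))
    trials = [{"face": f, "emotion": e, "gaze": g, "target_side": t}
              for _ in range(repetitions)
              for f, e, g, t in cells]
    random.Random(seed).shuffle(trials)
    return trials

# Example: 2 face types x 6 emotions x 3 gaze directions x 2 target sides
# = 72 cells, i.e., 144 trials with two repetitions per cell.
trials = build_trial_list()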

Data Analyses: Analyses of behavioural data (response times, accuracy) and correlations
between neural measures and questionnaire data will be performed using in-house software
written in MATLAB.
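
The analysis code itself is written in MATLAB; purely to illustrate the behavioural measures
named above, a minimal Python sketch of per-condition mean response time, accuracy, and a
correlation with a questionnaire score might look like the following (the column names and
data layout are assumed):

# Illustrative Python sketch of the behavioural summary described above;
# the real analyses use in-house MATLAB code, and these column names are assumed.
import pandas as pd
from scipy.stats import pearsonr

def summarize_behaviour(trials: pd.DataFrame) -> pd.DataFrame:
    """Per-subject, per-condition mean RT (correct trials only) and accuracy."""
    keys = ["subject", "face", "emotion", "gaze"]
    rt = trials[trials["correct"]].groupby(keys)["rt_ms"].mean()
    acc = trials.groupby(keys)["correct"].mean()
    return pd.DataFrame({"mean_rt_ms": rt, "accuracy": acc}).reset_index()

def correlate_rt_with_score(summary: pd.DataFrame, scores: pd.Series):
    """Correlate each subject's overall mean RT with a questionnaire score
    (scores indexed by subject number)."""
    per_subject = summary.groupby("subject")["mean_rt_ms"].mean()
    aligned = per_subject.loc[scores.index]
    return pearsonr(aligned, scores)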

Time-table: We expect to initiate data collection as soon as IRB approval has been obtained.
Data collection will continue until the previously stated inclusion goal is reached.

Human Subjects
1. Target Population:
A total of 30 participants will be included in the experiment. All participants will be healthy
young adults (age 18-40) who have normal or corrected-to-normal vision. Selection is
independent of gender and ethnicity.

Gender and minority inclusion: As stated earlier, no participants will be excluded based on
gender or race/ethnicity. Children will not be enrolled in the study. It is expected that the
gender and racial composition of the study sample will mirror that of the greater Columbia
population, shown in Table 1.

Table 1. Racial and ethnic composition of the greater Columbia, SC area.


                                 Am. Indian/     Asian/          Black, not of     White, not of     Other/
                                 Alaskan Native  Pacific Isl.    Hispanic Origin   Hispanic Origin   Unknown   Total
Greater Columbia Area (%)             0.3             1.5             32.5              64.7            1.0      100
Target enrollment, Male (n)           0               1               5                 9               0         15
Target enrollment, Female (n)         0               1               5                 9               0         15

2. Recruiting Plans: Recruitment will be performed through the psychology undergraduate
subject pool and through advertisements on the PI's website.

3. Existing Data/Samples: not applicable

4. Consent/Assent: Consent will be obtained by the PI or by lab researchers (postdocs,
graduate students, or research assistants) who have completed all required IRB training and
have been trained by the PI. The researcher will outline the main sections of the consent form
for the participant and remind them of their right to withdraw consent at any time during the
experiment. The participant will then be given as much time as needed to review the consent
form and ask any questions. The process typically takes less than 10 minutes.

5. Potential Risks: The risks to participants are minimal, although there is a slight risk of a
breach of confidentiality despite the steps taken to protect participants' privacy.
All questionnaires and screening questions are framed as requiring voluntary responses only.
Participants can choose to withdraw from the study without penalty if they do not wish to
provide the requested information. All study data are anonymized and only contain a subject
number. All identifying information is stored in locked cabinets that only study personnel can
access.

For EEG, participants may feel some mild discomfort from the pressure of the elastic
electrode cap on their head or from the application of the conductive gel. Some of the gel will
also remain in the participant's hair when the electrode cap is removed, but this gel easily
washes out. Standard non-invasive EEG recording techniques are used, and the PI and
research assistants have all been well trained to minimize participant discomfort. Facilities are
available in the lab for participants to wash their hair.

6. Potential Benefits: There will be no direct benefit to people who participate in this study.
However, some neurological and psychiatric disorders, such as autism, have characteristic
symptoms that involve difficulty interpreting social cues and emotions in others' faces
(Jong et al., 2008). Understanding how facial and emotional information is processed in the
brain by non-autistic people could inform therapies to aid autistic persons in social
functioning, which is a central part of human interaction and survival.

7. Confidentiality: Consent forms that contain identifying information will be housed in a
locked filing cabinet within a locked office that only the researchers will have access to. All
other materials (questionnaires, electronic data files, subsequent analyses) will identify
participants only by number and will not contain any information that could identify the
participant.

All desktop computers and laptops used for data acquisition and analysis are password
protected and only accessible to study personnel. In addition, no data contain any identifying
information and are only labeled with subject numbers.

8. Compensation: Students can receive experimental credit for participation in the study
based on the total number of hours of participation (e.g., one hour of credit per hour of
participation). If not receiving course credit, participants will receive $15 for their participation.

9. Withdrawal: The participants are given a full description of the study and informed both
verbally and in writing (in the consent form) that they can withdraw during the study at any
time, without negative consequence, by contacting the principal investigator.
References Cited

Adams, Reginald B., Kristin Pauker, and Max Weisbuch. "Looking the Other Way: The Role of
Gaze Direction in the Cross-race Memory Effect." Journal of Experimental Social
Psychology 46.2 (2010): 478-81. Web.
Frischen, Alexandra, Andrew P. Bayliss, and Steven P. Tipper. "Gaze Cueing of Attention:
Visual Attention, Social Cognition, and Individual Differences." Psychological
Bulletin 133.4 (2007): 694-724. Web.
George, N., and L. Conty. "Facing the Gaze of Others." Neurophysiologie Clinique/Clinical
Neurophysiology 38.3 (2008): 197-207. Web.
Hietanen, Jari K., Jukka M. Leppänen, Mikko J. Peltola, Kati Linna-Aho, and Heidi J.
Ruuhiala. "Seeing Direct and Averted Gaze Activates the Approach-Avoidance
Motivational Brain Systems." Neuropsychologia 46.9 (2008): 2423-430. Web.
Jong, Maartje Cathelijne De, Herman Van Engeland, and Chantal Kemner. "Attentional
Effects of Gaze Shifts Are Influenced by Emotion and Spatial Frequency, but Not in
Autism." Journal of the American Academy of Child & Adolescent Psychiatry 47.4
(2008): 443-54. Web.
Matthews, Andrew, Elaine Fox, Jenny Yiend, and Andy Calder. "The Face of Fear: Effects of
Eye Gaze and Emotion on Visual Attention." Visual Cognition 10.7 (2003): 823-35.
Web.
Wieser, Matthias J., Vladimir Miskovic, Sophie Rausch, and Andreas Keil. "Different Time
Course of Visuocortical Signal Changes to Fear-conditioned Faces with Direct or
Averted Gaze: A ssVEP Study with Single Trial Analysis." Neuropsychologia 62 (2014):
101-10. Web.
