
Proceedings of The Fourth International Conference on Informatics & Applications, Takamatsu, Japan, 2015

The Application of Facial Expression Recognition to the Evaluation of Abstract Graphics
Chih-Hsiang Ko and Chia-Yin Yu
Department of Industrial and Commercial Design
National Taiwan University of Science and Technology
43 Keelung Road, Section 4, Taipei, Taiwan
linko@mail.ntust.edu.tw, lisayu1202@gmail.com
ABSTRACT
While designing abstract graphics, designers transform original objects by selecting certain elements to emphasize individual concepts while preserving the original content. It is crucial for designers to understand viewers' emotions in order to build emotional connections between viewers and designers. Physiological signals can reveal people's feelings and correspond with their oral responses. Facial expressions are considered an effective tool for understanding human emotional reactions. In this research, an emotion recognition experiment based on facial expression recognition was conducted to understand subjects' emotional responses to different abstract graphics. FaceReader, an automatic facial expression recognition software package, was applied to measure the subjects' emotions. The results indicated that there were emotional differences across abstract graphic styles, confirming the possibility of applying facial expression recognition in the design field for understanding people's emotions. It is expected that the results of this study can help designers build emotional communication with viewers by linking designs with consumer interests to promote marketing activities.

KEYWORDS
Abstract graphics, facial expression recognition,
emotion, visual image, Chinese dragon, FaceReader.

1 BACKGROUND
Abstraction is the process of reducing an image or object to a simpler form [1]. It is a stylized method for designers to transform or imitate existing objects or phenomena, to simplify information, and to enhance design ideas. The result of abstraction depends on the designer's perspective on the object. It is critical for designers to ensure that the intended ideas are transmitted to viewers so that the abstract graphics can be understood. Therefore, designers need to understand viewers' interests in order to build emotional connections between the viewers and the graphic images. Verbal and nonverbal behaviors are bridges of human communication that transmit emotions between people. Nonverbal communication includes all forms of communication except words; it consists of physical behaviors commonly referred to as body language, gestures, and facial expressions. Among these channels, facial expression is regarded as the most important channel for expressing emotion, because the human face provides useful information on the emotion and inner state of the individual [2]. Some facial expressions of emotion are universal [3] and can be used as references for emotion recognition. In this research, abstract graphics were categorized on the basis of abstraction by a focus group. Test subjects' emotions for different types of abstract graphics were examined through an emotion recognition experiment based on the theory of facial expression recognition. Meanwhile, automatic facial expression recognition software called FaceReader was applied in the research to analyze the subjects' emotions. By examining the subjects' emotional data through facial expression recognition, the researchers could interpret their emotional responses to different forms of the abstract graphics.
2 LITERATURE REVIEW
2.1 Abstraction in Design
The general definition of abstraction has two aspects: one is to disassociate from any specific instance; the other is to express a quality apart from an object [4]. In the art field, abstract art is understood as using shapes, forms, colors, and lines to create artworks whose subject is independent of visual references in the real world [5]. Some of the well-known abstract artists were Kandinsky and Delaunay. In the design field, abstraction is commonly known as a method or technique for designers to transform certain objects into something else. The main difference between fine art and design in abstraction is that designers have the responsibility to make sure the information is transmitted to audiences correctly. That is, the result of abstraction depends on the designers' perspectives, including how they observe the objects, what information they decide to preserve, and how they select elements to simplify the original objects while preserving that information. Through the process of design abstraction, pictures, signs, and symbols represent affordances in abstractness related to their meanings or functions [6].
2.2 Facial Expressions in Emotion
People send emotional signals while communicating with others. Moreover, most psychological processes reflect emotions, and most emotional reactions are nonverbal. Our evolutionary heritage makes a major contribution to the shaping of our emotional responses: humans continue to make facial expressions because these expressions have acquired communicative value throughout evolutionary history [7][8]. In human communication, 55% of the message comes from facial expressions, 38% is accomplished by tone of voice, and only 7% relies on verbal exchange [9]. A human can make over 10,000 facial expressions to convey a wide variety of subtle emotions. Facial expressions of emotion are not culturally determined; they are universal across human cultures and thus biological in origin [10]. A facial expression is the result of the full or partial activation of a combination of facial muscles [11]. There are six common emotions that all humans share: happiness, sadness, anger, scare, surprise, and disgust [12][13].
2.3 Research on Facial Expression Recognition
The results of facial recognition studies suggest that people are able to infer a person's psychological state from facial expressions [14][15][16]. There are 20 or more muscles involved in a facial expression. The generation of emotional expressions requires sequenced movements of facial muscles, which highlights the important function of facial expressions in the social communication of changes in affective states [17]. When a unique facial feature is read and categorized into a certain emotion category, it expresses not only self-feelings but also social information [17][18]. In daily life, facial expressions can be used as a method to open or close a communication, to provide verbal or nonverbal reactions, or to replace the meaning of a conversation [19].
The most well-known research on facial expression is the Facial Action Coding System (FACS) [11]. With FACS, researchers distinguish subjects' facial muscle movements in order to infer possible emotions. Nowadays, the fundamental theory of FACS is used for understanding subjects' emotions in various studies, and related techniques for improving the recognition method have also been developed. FACS is based on the facial muscle movements that accompany facial expressions, because consistent patterns have been found in expressions across cultures. An action unit (AU) is defined as the minimum, visible, anatomically based action involved in the movement of the face. A facial expression is described in terms of a particular action unit that, singly or in combination with other units, produces facial movements [12][13].
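
To make the AU-based description concrete, the sketch below encodes a few commonly cited approximate AU prototypes for the six basic emotions and matches a set of detected AUs against them; the prototype table and the example detector output are illustrative assumptions rather than definitions quoted from this paper or from FACS itself.

# A minimal sketch: approximate AU prototypes for the six basic emotions
# (illustrative values drawn from commonly cited FACS summaries, not from
# this paper) and a helper that matches detected AUs to emotions.
AU_PROTOTYPES = {
    "happiness": {6, 12},          # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},       # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},    # brow raisers + upper lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},    # brow lowerer + lid/lip tighteners
    "fear":      {1, 2, 4, 5, 20}, # ("scare" in this paper's terminology)
    "disgust":   {9, 15},          # nose wrinkler + lip corner depressor
}

def match_emotions(detected_aus):
    """Return emotions whose full AU prototype appears in the detected set."""
    detected = set(detected_aus)
    return [emo for emo, proto in AU_PROTOTYPES.items() if proto <= detected]

if __name__ == "__main__":
    # Hypothetical detector output: AUs 6, 12, and 25 active -> happiness
    print(match_emotions([6, 12, 25]))  # ['happiness']
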
The descriptive explanations and details of facial muscles in FACS have made major contributions to facial expression recognition. Most FACS-related research topics concern the study of particular facial expressions [20][21]. Some studies explored how facial expression recognition could improve intelligent services [22][23]. These systems focused on how to make measurements automatically and reduce processing time [24]. Nowadays, the possibilities of commercially applicable facial expression recognition have been discussed, such as measuring consumer emotions as a marketing research tool [25]. In the design field, subjects' facial expressions in response to different colors were examined [26]. Such research indicated that subjects from different cultural backgrounds (Australian and European) performed identically in pairing colors with facial expressions. It was also believed that color information had significant potential to improve the performance of facial expression recognition [27]. In addition, considerable attention has been given to the relationship between modern product design and emotions. The facial expression recognition method has been recommended by many researchers as a good technique for interpreting a person's emotion compared with other recognition methods such as speech [28][29][30]. In other words, design elements in graphics could affect human emotions because people perceive them differently.
2.4 FaceReader
FaceReader was used as a tool in this research to analyze the subjects' emotions for different styles of abstract graphics. FaceReader is automatic facial expression analysis software based on the definition and recognition principles of FACS. It allows researchers to quantitatively analyze experimental subjects' six basic emotions, and its recognition accuracy is 90% [31]. FaceReader has been used to test the usability of computers and to examine its areas of applicability [32]. In that experiment, 12 subjects performed the same task and were asked to fill out self-stated reports after finishing the task to describe their emotions and feelings. After comparing the analytical statistics from FaceReader, the content of the self-stated reports, and the researchers' observation reports, the authors found high similarities in the results. FaceReader is considered an efficient tool and is capable of measuring instant emotions [32]. However, even though FaceReader can record all changes of facial expressions, it is still restricted to the six basic emotions, and more complex emotions cannot be analyzed. For instance, some subjects began the experimental task in earnest, but FaceReader classified the emotion as anger [32]. In other words, FaceReader can be used to objectively detect the subjects' instant and subtle changes in facial expressions and then provide information for judging the possible emotional components they represent. Nevertheless, in order to understand the subjects' feelings, the researchers' observation and the subjects' self-stated reports are needed for further analysis of the facial expression recognition results. On the other hand, even though the researchers can observe and record the reactions of the subjects to experimental stimuli without any interference, it is inevitable that errors occur while manually recording or recognizing emotions, such as missing some subjective identifications.
In a facial expression recognition experiment, the main task is to distinguish the subjects' emotional differences during the experiment. The duration of an emotion is short, approximately 0.5 to 4 seconds [10], so it is difficult to distinguish the differences among all samples using raw numerical values. To solve this problem, the maximum numerical value of each emotion from each subject was analyzed to understand whether there were emotional differences when the subjects tasted different juices [33]. To sum up, it is convincing that FaceReader is a reliable tool for analyzing the subjects' emotions by examining their facial expressions; however, the researchers' observation, the subjects' self-stated reports, and a well-conducted experiment are also required to minimize any possible contradictory results.


3 METHODS
The researchers designed an emotion recognition experiment based on the theory of facial expression recognition to understand the subjects' emotional reactions to different types of abstract graphics. The purpose was to evaluate the possibility of applying facial expression recognition as a method to understand the subjects' emotions in order to build a bridge between the viewers and the designers.
3.1 Experimental Design
The experiment required a quiet and comfortable space to keep the experimental subjects from being disturbed by the external surroundings, which could affect the correctness of the experiment. The experimental site was a classroom at National Taiwan University of Science and Technology with no surrounding noise. It was a one-to-one experiment in which the lighting was sufficient and each subject sat in a comfortable posture, facing the center of the screen with a field of vision of 10-20 degrees. The distance between the subject and the screen was about 50-60 cm, with slight adjustments depending on individual subjects' needs. There was a partition between the subject and the researchers, so the subject could take the experiment alone while the researchers observed and controlled the process. The whole experiment was recorded on video and audio for further analysis. For the convenience of organizing and analyzing the files, the video resolution was set at 640×480 pixels. The plan view of the experimental platform is illustrated in Figure 1.

Figure 1. The plan view of the experimental platform.

3.2 Stimuli
The researchers used thematic abstract graphics with different design elements as stimuli. Chinese dragon graphics were chosen because the dragon is an iconic legendary animal in Chinese culture, and it was presumed that the subjects would have various impressions of the dragon in abstract graphic designs. The researchers collected and numbered 204 Chinese dragon graphics from books and online databases, and the backgrounds of the images were unified in the sampling process. After the images were filed, seven experts in the design field, each with more than two years of working experience, were invited to a focus group to discuss the images and categorize them by visual similarity. The focus group used the KJ method to categorize the images and chose the final samples for the experiment. The procedures were as follows.
1. The resolution of each image was standardized to 300 dpi for the printing process, and the height was set at 12 cm. Each image was color-printed at the center of a 10×15 cm card, and each card was labeled with a number from 001 to 204 (a scripting sketch of this standardization step follows the list).
2. The researchers provided the focus group with information on abstraction in design from the literature review, including definitions, related images, and applications. Any disagreement was further discussed until a consensus was reached.
3. The focus group categorized the abstract
graphics by visual similarity into ink paintings, geometric figures, and color blocks. Images too similar to others were eliminated. Finally, six representative abstract graphics of each category were selected for the experiment (see Table 1).
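
As a concrete illustration of step 1, the sketch below batch-resizes and renumbers the collected images; the 300 dpi target and 12 cm height come from the procedure above, while the folder layout and the use of the Pillow library are assumptions for illustration.

# A minimal sketch of the image standardization in step 1, assuming the
# Pillow library and a hypothetical folder layout (raw/ -> cards/).
from pathlib import Path
from PIL import Image

DPI = 300
HEIGHT_CM = 12
HEIGHT_PX = round(HEIGHT_CM / 2.54 * DPI)  # 12 cm at 300 dpi ≈ 1417 px

def standardize(src_dir="raw", dst_dir="cards"):
    Path(dst_dir).mkdir(exist_ok=True)
    for i, path in enumerate(sorted(Path(src_dir).glob("*.jpg")), start=1):
        img = Image.open(path).convert("RGB")
        # Scale so the printed height is 12 cm, keeping the aspect ratio.
        scale = HEIGHT_PX / img.height
        img = img.resize((round(img.width * scale), HEIGHT_PX))
        # Number the cards 001-204 as in the sampling procedure.
        img.save(Path(dst_dir) / f"{i:03d}.jpg", dpi=(DPI, DPI))

if __name__ == "__main__":
    standardize()
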

Table 1. Sample images: ink paintings, geometric figures, and color blocks.

3.3 Research Subjects


The subjects were 120 Taiwanese university students, with a mean age of 22.62 years (SD = 1.7). All of the subjects had corrected visual acuity of 0.8 or above, with no color blindness or other visual dysfunction.
3.4 Procedure
One of the main challenges of the experiment was observing the subjects' emotions through facial movements. After a pre-test and gathering feedback from the subjects, the experimental procedure was conducted as follows.
1. The subjects viewed the introduction video.
2. The subjects viewed all types of the sample images, displayed in a Latin square arrangement to counterbalance order effects (see the sketch after this list).
3. The subjects were instructed to verbalize their feelings after each type of the sample images was shown.
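
The paper names a Latin square arrangement but does not spell out the scheme; the sketch below shows one standard cyclic construction for the three image categories from Table 1, with everything beyond the category names being an illustrative assumption.

# A minimal sketch of a cyclic Latin square for presentation order,
# assuming three categories as in Table 1. Each row is one presentation
# order; rows are assigned to subjects in rotation.
CATEGORIES = ["ink paintings", "geometric figures", "color blocks"]

def latin_square(items):
    """Cyclic Latin square: each item appears once per row and per column."""
    n = len(items)
    return [[items[(row + col) % n] for col in range(n)] for row in range(n)]

def order_for_subject(subject_index, square):
    """Rotate through the square's rows across subjects."""
    return square[subject_index % len(square)]

if __name__ == "__main__":
    square = latin_square(CATEGORIES)
    for s in range(4):
        print(s, order_for_subject(s, square))
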
After the experiment, all of the subjects' videos were analyzed by FaceReader into numerical values. Each subject's timing sections for watching the sample images and for the interview stage were extracted. The researchers referred to the method used for analyzing FaceReader data in [33] and took the maximum numerical value of each emotion from each subject to understand whether there were emotional differences among the abstract graphic styles.
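
As an illustration of this step, the sketch below takes frame-level intensities in a long-format table and keeps the maximum of each emotion per subject and style; the column names and the pandas-based layout are assumptions for illustration and do not mirror FaceReader's actual export format.

# A minimal sketch of the per-subject maximum extraction described above,
# assuming pandas and a hypothetical long-format frame-level table with
# columns: subject, style, emotion, intensity (0-1).
import pandas as pd

def max_per_subject(frames: pd.DataFrame) -> pd.DataFrame:
    """Take the maximum intensity of each emotion per subject and style."""
    return (frames
            .groupby(["subject", "style", "emotion"], as_index=False)["intensity"]
            .max())

if __name__ == "__main__":
    demo = pd.DataFrame({
        "subject":   [1, 1, 1, 1],
        "style":     ["ink paintings"] * 4,
        "emotion":   ["happy", "happy", "sad", "sad"],
        "intensity": [0.40, 0.73, 0.12, 0.33],
    })
    print(max_per_subject(demo))  # happy -> 0.73, sad -> 0.33
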
4 RESULTS
The researchers collected 112 valid samples after excluding 8 invalid ones, with 128,319 data points in all (94,080 from viewing the samples and 34,239 from orally expressing feelings). After capturing the maximum value of each emotion from every subject, the researchers imported the emotion data into SPSS to run repeated-measures one-way ANOVAs. The results were first examined with Mauchly's test of sphericity; a non-significant outcome (p > 0.05) meant that sphericity could be assumed, in which case the sphericity-assumed values were used when examining the main ANOVA outcomes (tests of within-subjects effects). If Mauchly's test was significant (p < 0.05), the Greenhouse-Geisser correction was applied as a sphericity adjustment. When the tests of within-subjects effects showed significant differences, pairwise comparisons were used to locate the source of the emotional differences. Please refer to the resulting facial expression recognition data in Table 2 and the graph of the mean data for each sample image category in Figure 2.
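
The ANOVAs above were run in SPSS; purely as an illustration, the sketch below shows a roughly equivalent analysis for a single emotion in Python, assuming the pingouin package and a long-format table of per-subject maxima like the one built in the earlier sketch (hypothetical columns: subject, style, intensity). The Bonferroni adjustment in the pairwise step is this sketch's choice, not something stated in the paper.

# A minimal sketch of the repeated-measures one-way ANOVA with a sphericity
# check, Greenhouse-Geisser correction, and pairwise comparisons, assuming
# the pingouin package. `df` holds one emotion's per-subject maxima with
# hypothetical columns: subject, style, intensity.
import pingouin as pg

def analyze_emotion(df):
    # Mauchly's test of sphericity across the three styles.
    spher = pg.sphericity(df, dv="intensity", subject="subject", within="style")
    # Repeated-measures ANOVA; correction=True also reports the
    # Greenhouse-Geisser corrected p-value for when sphericity is violated.
    aov = pg.rm_anova(df, dv="intensity", subject="subject", within="style",
                      correction=True, detailed=True)
    # Pairwise comparisons between styles (pairwise_ttests in older pingouin).
    pairs = pg.pairwise_tests(df, dv="intensity", subject="subject",
                              within="style", padjust="bonf")
    return spher, aov, pairs
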
Table 2. The mean data of facial recognized emotions.

            Ink paintings   Geometric figures   Color blocks
Happy       0.731           0.502               0.507
Sad         0.331           0.290               0.329
Anger       0.105           0.106               0.236
Surprise    0.066           0.078               0.122
Scare       0.008           0.008               0.009
Disgust     0.066           0.055               0.048

Figure 2. The graph of mean data for each sample image category.

4.1 Happiness
The values of happiness show noticeable changes in the facial expression recognition results. Mauchly's test of sphericity shows that p = 0.000 < 0.05, which indicates that the sphericity assumption is violated. The Greenhouse-Geisser correction shows that p = 0.000 < 0.05, which indicates that there are significant differences in the subjects' happy emotion. The pairwise comparisons, detailed in Table 3, show that the subjects felt happier viewing ink paintings than the other types of abstract graphics.

Table 3. The result of happiness running repeated-measures one-way ANOVA.

                                                 Ink paintings   Geometric figures   Color blocks
Means                                            0.731           0.502               0.507
Std. Deviation                                   0.217           0.339               0.343
Mauchly's test of sphericity (sig.)              0.037*
Tests of within-subjects effects (G-G) (sig.)    0.000***
Pairwise comparisons (sig.)                      N/A             0.000***            0.000***
                                                 0.000***        N/A                 0.858
                                                 0.000***        0.858               N/A
Note: Differences are considered as significant at p < 0.05. * denotes p < 0.05, ** denotes p < 0.01, *** denotes p < 0.001.

4.2 Sadness
The subjects did not show notable changes in the sad emotion while viewing the three types of abstract graphics (see Table 4). Mauchly's test of sphericity shows that p = 0.751 > 0.05, which indicates that the sphericity assumption is not violated. The test of within-subjects effects indicates that p = 0.083 > 0.05, which suggests that there are no significant differences in the subjects' sad emotion.

Table 4. The result of sadness running repeated-measures one-way ANOVA.

                                                              Ink paintings   Geometric figures   Color blocks
Means                                                         0.331           0.290               0.329
Std. Deviation                                                0.262           0.239               0.267
Mauchly's test of sphericity (sig.)                           0.751
Tests of within-subjects effects (sphericity assumed) (sig.)  0.083
Note: Differences are considered as significant at p < 0.05. * denotes p < 0.05, ** denotes p < 0.01, *** denotes p < 0.001.

4.3 Anger
According to the results, the subjects' angry emotion stays at low values during the experiment. Mauchly's test of sphericity shows that p = 0.000 < 0.05, which indicates that the sphericity assumption is violated. The Greenhouse-Geisser correction shows that p = 0.000 < 0.05, which suggests that there are significant differences in the subjects' angry emotion across the different types of abstract graphics. Pairwise comparisons show that the subjects' values of anger are significantly higher while viewing color blocks. Please refer to Table 5.

Table 5. The result of anger running repeated-measures one-way ANOVA.

                                                 Ink paintings   Geometric figures   Color blocks
Means                                            0.105           0.106               0.236
Std. Deviation                                   0.140           0.138               0.219
Mauchly's test of sphericity (sig.)              0.000***
Tests of within-subjects effects (G-G) (sig.)    0.000***
Pairwise comparisons (sig.)                      N/A             0.928               0.000***
                                                 0.928           N/A                 0.000***
                                                 0.000***        0.000***            N/A
Note: Differences are considered as significant at p < 0.05. * denotes p < 0.05, ** denotes p < 0.01, *** denotes p < 0.001.

4.4 Surprise
The values of surprise do not show noticeable changes in Figure 2. However, Mauchly's test of sphericity shows that p = 0.000 < 0.05, which indicates that the sphericity assumption is violated. The Greenhouse-Geisser correction shows that p = 0.007 < 0.05, which indicates that there are significant differences in the subjects' surprise emotion. Pairwise comparisons indicate that color blocks made the subjects' values of surprise noticeably higher than the other two types of abstract graphics. Please refer to Table 6.

Table 6. The result of surprise running repeated-measures one-way ANOVA.

                                                 Ink paintings   Geometric figures   Color blocks
Means                                            0.066           0.078               0.122
Std. Deviation                                   0.138           0.168               0.194
Mauchly's test of sphericity (sig.)              0.000***
Tests of within-subjects effects (G-G) (sig.)    0.007**
Pairwise comparisons (sig.)                      N/A             0.310               0.002**
                                                 0.310           N/A                 0.037*
                                                 0.002**         0.037*              N/A
Note: Differences are considered as significant at p < 0.05. * denotes p < 0.05, ** denotes p < 0.01, *** denotes p < 0.001.

4.5 Scare
Scare stays at the lowest values during the experiment, as shown in Figure 2 and Table 7. The subjects did not show notable changes in the scare emotion. Mauchly's test of sphericity shows that p = 0.010 < 0.05, which indicates that the sphericity assumption is violated. The Greenhouse-Geisser correction shows that p = 0.895 > 0.05, which indicates that there are no significant differences in the subjects' scare emotion.

Table 7. The result of scare running repeated-measures one-way ANOVA.

                                                 Ink paintings   Geometric figures   Color blocks
Means                                            0.008           0.008               0.009
Std. Deviation                                   0.026           0.032               0.031
Mauchly's test of sphericity (sig.)              0.010**
Tests of within-subjects effects (G-G) (sig.)    0.895
Note: Differences are considered as significant at p < 0.05. * denotes p < 0.05, ** denotes p < 0.01, *** denotes p < 0.001.


4.6 Disgust
The disgust emotion has the second lowest values in the experiment. According to the results, the subjects' disgust emotion did not show notable changes. Mauchly's test of sphericity shows that p = 0.000 < 0.05, which indicates that the sphericity assumption is violated. The Greenhouse-Geisser correction shows that p = 0.391 > 0.05, which indicates that there are no significant differences in the subjects' disgust emotion. Please refer to Table 8.
Table 8. The result of disgust running repeated-measures one-way ANOVA.

                                                 Ink paintings   Geometric figures   Color blocks
Means                                            0.066           0.055               0.048
Std. Deviation                                   0.172           0.148               0.117
Mauchly's test of sphericity (sig.)              0.000***
Tests of within-subjects effects (G-G) (sig.)    0.391
Note: Differences are considered as significant at p < 0.05. * denotes p < 0.05, ** denotes p < 0.01, *** denotes p < 0.001.

4.7 Discussion
Facial expressions are a means of emotional communication. They can represent not only a person's current emotional status, but also convey emotions to other people in bilateral communication. According to previous research, over half of human communication depends on facial expressions, which implies that the verbalized content is actually a supplement in communication. However, when the subjects viewed the sample images, they only received information unilaterally. The researchers observed that the subjects seldom showed notable facial expressions at the stage of viewing the sample images. In contrast, when the subjects verbalized their feelings about the sample images, there were distinctive facial expressions, and the line chart produced by FaceReader showed that the different emotion curves crossed each other. This suggested that although the verbalized content was a supplement in communication, the process of verbalizing feelings enhanced the intensity of the facial expressions. When the subjects were instructed to verbalize their feelings, their main task was to express their feelings about the sample images verbally. The researchers' purpose was to capture the subjects' enhanced facial expressions in order to understand their emotional differences among the different types of abstract graphics.
The results for anger, surprise, scare, and disgust generally showed low numerical values, under 0.3, although the subjects had significant emotional differences in anger and surprise. On the other hand, sadness had the highest values among the negative emotions, but there were no significant differences in the subjects' sad emotion across the types of abstract graphics. The results suggested that the subjects had different emotional feelings even though the numerical values were low. For the emotions with high values but no significant differences, the subjects' emotions could stem from their feelings toward the thematic images; in this research, none of the subjects mentioned words related to sadness while verbalizing their feelings about the sample images. During the experiment, the video camera was set at the top of the screen to avoid interfering with the subjects' vision and to capture the changes in facial expressions (shown in Figure 1). Thus, when the subjects were watching the sample images, their facial expressions might have been classified as sadness due to the angle of the video camera.
The results of facial expression recognition indicated that ink paintings made the subjects feel the happiest. According to the subjects' oral content, ink paintings were a special form of illustrating the Chinese dragons: their appearances were portrayed like smoke, which made the images non-specific and dissimilar to conventional depictions of the Chinese dragons. Although the overall values were low, the subjects had significant emotional differences in anger and surprise while viewing color blocks. That is, the subjects felt angrier and more surprised by color blocks compared with the other sample images. Surprise is an emotion that can be either positive or negative; therefore, it was difficult to determine whether the subjects felt positively or negatively surprised by color blocks using only the facial expression recognition method. In the subjects' verbalized reports, some mentioned that color blocks made them concentrate on what they were watching. The subjects' oral responses in this research conformed to the findings of other researchers' experiment [32]. It was assumed that color blocks made the subjects more concentrated than the other samples did. As a result, facial expression recognition showed that the subjects were angrier while viewing color blocks than the other types of sample images. In addition, the researchers found that most of the subjects gave positive responses when describing geometric figures, but facial expression recognition did not detect noticeable emotions for any of the three types of abstract graphics. In short, the subjects might have overstated their feelings verbally, while their facial expressions were flat and reflected their actual feelings.
5 CONCLUSIONS
Abstraction in design is a common method for designers to transform specific objects, but the primary concern is to ensure that viewers are able to understand the content of the abstract graphics. Apart from form and aesthetics, it is also important for designers to understand the viewers' interests. It is believed that emotional connections can make the viewers impressed by the graphics. Therefore, viewers' emotions toward abstract graphics are important for designers to build emotional connections between viewers and graphics. Nonverbal communication is considered a convincing and direct tool for understanding human emotions, and analyzing facial expressions is regarded as an appropriate research method, since 55% of human communication comes from facial expressions. By using thematic images of the Chinese dragons as stimuli, the researchers have verified the possibility of applying facial expression recognition to evaluate viewers' emotions toward abstract graphics. The period of an emotion's occurrence was short, approximately 0.5 to 4 seconds, indicating that it was impossible for the subjects to stay in a certain emotional state for a long period of time. The researchers therefore extracted the maximum values of each subject's emotions in order to capture the actual emotional waves from each subject. The results showed that the subjects had emotional differences in happiness, anger, and surprise across the abstraction styles. According to facial expression recognition, ink paintings made the subjects happiest, and they felt angrier and more surprised at color blocks compared with the other two types of abstract graphics. Although the results could differ with other graphic themes, they suggest that viewers have different emotions toward abstract graphics composed of different design elements.
By using FaceReader to recognize the subjects' emotions, the researchers were able to overcome the difficulty of executing quantitative research in facial expression recognition. As the bias of manual recognition was avoided, the researchers controlled variables through multiple pre-tests as well as by setting up the environment for the facial expression recognition experiment. In conclusion, the subjects' verbalized content helped the researchers to collect the subjects' facial expressions and was used as a supplement to explain possible reasons for the various emotions. The results have confirmed the necessity and practicality of using facial expression recognition to evaluate viewers' emotions toward abstract graphics, and it can be applied in design practice to develop consumer-oriented strategies in marketing.
ACKNOWLEDGEMENT

This work was sponsored by the Ministry of Science and Technology, Taiwan, under Grant No. MOST103-2410-H-011-025.

REFERENCES
[1] A. Hashimoto and M. Clayton, Visual Design Fundamentals: A Digital Approach, 3rd ed. Boston, MA: Cengage Learning, 2009.
[2] H.S. Asthana, Facial Asymmetry in Expression of Emotion: A Neurocultural Perspective. New Delhi, India: Concept Publishing Company, 2012.
[3] P. Ekman, The Face of Man: Expressions of Universal Emotions in a New Guinea Village. New York: Garland, 1980.
[4] D. Druckman, Doing Research: Methods of Inquiry for Conflict Analysis. Thousand Oaks, CA: Sage, 2005.
[5] D. Elger, Abstract Art. Köln, Germany: Taschen, 2008.
[6] R. Arnheim, Visual Thinking. Berkeley, CA: University of California Press, 1997.
[7] C. Darwin, The Expression of the Emotions in Man and Animals. London: John Murray, 1872.
[8] A.J. Fridlund, Human Facial Expression: An Evolutionary View. London: Academic Press, 2014.
[9] A. Mehrabian, Silent Messages. Belmont, CA: Wadsworth, 1981.
[10] P. Ekman, Emotions Revealed: Recognizing Faces and Feelings to Improve Communication and Emotional Life. New York: Times Books, 2004.
[11] P. Ekman and W.V. Friesen, Facial Action Coding System: A Technique for the Measurement of Facial Movement. Palo Alto, CA: Consulting Psychologists Press, 1978.
[12] P. Ekman, W.V. Friesen, and J.C. Hager, Facial Action Coding System: Investigator's Guide. Salt Lake City, UT: Research Nexus, 2002.
[13] P. Ekman, W.V. Friesen, and J.C. Hager, Facial Action Coding System: The Manual. Salt Lake City, UT: Research Nexus, 2002.
[14] P. Ekman, W.V. Friesen, M. O'Sullivan, A. Chan, I. Diacoyanni-Tarlatzis, K. Heider, et al., "Universals and cultural differences in the judgments of facial expressions of emotion," J. Pers. Soc. Psychol., vol. 53, no. 4, pp. 712-717, 1987.
[15] P. Ekman, E.R. Sorenson, and W.V. Friesen, "Pan-cultural elements in facial displays of emotion," Science, vol. 164, no. 3875, pp. 86-88, 1969.
[16] C. Izard, The Face of Emotion. New York: Appleton-Century-Crofts, 1971.
[17] S.D. Pollak, M. Messner, D.J. Kistler, and J.F. Cohn, "Development of perceptual expertise in emotion recognition," Cognition, vol. 110, no. 2, pp. 242-247, 2009.
[18] W.E. Rinn, "The neuropsychology of facial expression: A review of the neurological and psychological mechanisms for producing facial expressions," Psychol. Bull., vol. 95, no. 1, pp. 52-77, 1984.
[19] M.L. Knapp and J.A. Hall, Nonverbal Communication in Human Interaction, 7th ed. Boston, MA: Cengage Learning, 2010.
[20] J. Reeve and G. Nix, "Expressing intrinsic motivation through acts of exploration and facial displays of interest," Motiv. Emotion, vol. 21, no. 3, pp. 237-250, 1997.
[21] K.R. Scherer, E. Clark-Polner, and M. Mortillaro, "In the eye of the beholder? Universality and cultural specificity in the expression and perception of emotion," Int. J. Psychol., vol. 46, no. 6, pp. 401-435, 2011.
[22] K.T. Song, M.J. Han, and J.W. Hong, "Online learning design of an image-based facial expression recognition system," Intell. Serv. Robot., vol. 3, no. 3, pp. 151-162, 2010.
[23] A. Ray and A. Chakrabarti, "Design and implementation of affective e-learning strategy based on facial emotion recognition," in Proceedings of InConINDIA 2012, AISC 132, S.C. Satapathy et al., Eds. Berlin/Heidelberg, Germany: Springer-Verlag, 2012, pp. 613-622.
[24] I.A. Essa and A.P. Pentland, "Coding, analysis, interpretation, and recognition of facial expressions," IEEE Trans. Pattern Anal. Mach. Intell., vol. 19, no. 7, pp. 757-763, 1997.
[25] M. Valstar, "Automatic facial expression analysis," in Understanding Facial Expressions in Communication: Cross-cultural and Multidisciplinary Perspectives, M.K. Mandal and A. Awasthi, Eds. New Delhi, India: Springer, 2015, pp. 143-172.
[26] O. Pos and P. Green-Armytage, "Facial expressions, colors and basic emotions," Color: Des. Creativity, vol. 1, no. 1, pp. 1-20, 2007.
[27] S.M. Lajevardi and H.R. Wu, "Facial expression recognition in perceptual color space," IEEE Trans. Image Process., vol. 21, no. 8, pp. 3721-3733, 2012.
[28] M. Pantic and I. Patras, "Dynamics of facial expression: Recognition of facial actions and their temporal segments from face profile image sequences," IEEE Trans. Syst. Man Cybern. B, vol. 36, no. 2, pp. 433-449, 2006.
[29] Y. Zhan and G. Zhou, "Facial expression recognition based on hybrid features and fusing discrete HMM," in Virtual Reality, HCII 2007, LNCS 4563, R. Shumaker, Ed. Berlin/Heidelberg, Germany: Springer-Verlag, 2007, pp. 408-417.
[30] B.T. Lau, "Portable real time needs expression for people with communication disabilities," in Proceedings of Communication in Computer and Information Science: Intelligent Interactive Assistance and Mobile Multimedia Computing, D. Versick, Ed. Berlin/Heidelberg, Germany: Springer-Verlag, 2009, pp. 85-95.
[31] L. Loijens and O. Krips, FaceReader Methodology. Wageningen, the Netherlands: Noldus Information Technology, 2012.
[32] B. Zaman and T. Shrimpton-Smith, "The FaceReader: Measuring instant fun of use," in Proceedings of the 4th Nordic Conference on Human-Computer Interaction: Changing Roles, K. Morgan, T. Bratteteig, G. Ghosh, and D. Svanaes, Eds. New York: ACM, 2006, pp. 457-460.
[33] L. Danner, S. Haindl, M. Joechl, and K. Duerrschmid, "Facial expressions and autonomous nervous system responses elicited by tasting different juices," Food Res. Int., vol. 64, pp. 81-90, 2014.