
11th IFAC Symposium on Robot Control
August 26-28, 2015. Salvador, BA, Brazil

Available online at www.sciencedirect.com

ScienceDirect

IFAC-PapersOnLine 48-19 (2015) 136-141

Development of a Human Machine Interface for Control of Robotic Wheelchair and Smart Environment

Richard J. M. G. Tello, Alexandre L. C. Bissoli, Flavio Ferrara, Sandra Müller, Andre Ferreira, Teodiano F. Bastos-Filho

Post-Graduate Program in Electrical Engineering (PPGEE). Federal University of Espirito Santo (UFES). Av. Fernando Ferrari 514. Vitoria, Brazil. (e-mail: richard@ele.ufes.br; alexandre-bissoli@hotmail.com; andrefer@ele.ufes.br; teodiano.bastos@ufes.br)

Politecnico di Milano: Piazza Leonardo da Vinci, 20133, Milano, Italy (e-mail: femferrara@gmail)

Electrical Engineering Department, Federal Institute of Espirito Santo (IFES). Av. Vitoria, 1729, 29040-780. Vitoria, Brazil (e-mail: sandra.muller@ifes.edu.br)

Abstract: In this work, we address the problem of integrating a robotic wheelchair into a smart environment. This approach allows people with disabilities to control home appliances of the environment using a Human Computer Interface (HCI) based on different biological signals. The home appliances include TV, radio, lights/lamp and fan. Three control paradigms using surface Electromyography (sEMG), Electrooculography (EOG) and Electroencephalography (EEG) signals were used. These signals are captured through a biosignal acquisition system. Three sub-paradigms for sEMG/EOG analyses were defined: moving eyes horizontally (left/right), raising brow and prolonged clench. On the other hand, the navigation of the wheelchair is executed through a Steady-State Visually Evoked Potentials (SSVEP)-BCI. Each stage of our proposed system showed a good performance for most subjects. Volunteers were recruited to participate in the study and were distributed in two groups (subjects for home appliances and subjects for SSVEP-BCI). The average accuracy for the prolonged clench approach was 95%, the raising brow reached 85% and moving eyes achieved 93%. Multivariate Synchronization Index (MSI) was used for feature extraction from the EEG signals. The flickering frequencies were 8.0 Hz (top), 11.0 Hz (right), 13.0 Hz (bottom) and 15.0 Hz (left). Results from this approach showed that classification varies in the range of 45-77% among subjects using a window length of 1 s.

© 2015, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.

Keywords: SSVEP-BCI, Robotic Wheelchair, EEG, sEMG, EOG, Smart Environment.
1. INTRODUCTION

A Human Machine Interface (HMI) is a platform that allows interaction between the user and an automatized system. On the other hand, a Brain-Computer Interface (BCI) is a technology that provides humans with direct communication between the user's brain signals and a computer, generating an alternative channel of communication that does not involve the traditional way as muscles and nerves (Wolpaw et al. (2000)). Among current BCIs, a noninvasive brain imaging method commonly employed is EEG, which has the advantages of lower risk, being inexpensive and easily measurable (Chen et al. (2014) and Kelly et al. (2005)). Further, EEG provides electrical signals of high temporal resolution generated by neuronal dynamics from the scalp. Therefore, a BCI records brain signals, and EEG signal features are then translated into artificial outputs or commands that act in the real world. BCI is a potential alternative and augmentative communication (AAC) and control solution for people with severe motor disabilities (Wolpaw et al. (2000), Kelly et al. (2005) and Gao et al. (2003)).

One kind of BCI named SSVEP-BCI uses the excitation of the retina of the eye by a stimulus at a certain frequency, making the brain generate an electrical activity at the same frequency and its multiples or harmonics. This stimulus produces a stable Visual Evoked Potential (VEP) of small amplitude termed Steady-State Visually Evoked Potentials (SSVEPs) of the human visual system. To produce such potentials, the user gazes at one flickering stimulus oscillating at a certain frequency (He (2013)). In a typical SSVEP-BCI system, several stimuli flickering at different frequencies are presented to the user. The subject

* The authors thank FAPES (a foundation of the Secretary of Science and Technology of the State of Espirito Santo, Brazil), CAPES (a foundation of the Brazilian Ministry of Education) and CNPq (The Brazilian National Council for Scientific and Technological Development) for the support given to this work.

Copyright © 2015 IFAC
2405-8963 © 2015, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.
Peer review under responsibility of International Federation of Automatic Control.
10.1016/j.ifacol.2015.12.023


overtly directs attention to one of the stimuli by changing his/her gaze (Zhang et al. (2010)). This kind of SSVEP-BCI was evaluated in this study and is commonly called dependent, since muscle activities, such as gaze shifting, are necessary.
One of the first studies related to control of smart home
applications using biological signals, such as EEG, was
reported in (Holzner et al. (2009)). In that work, a BCI based on the P300 approach is used for TV channel switching,
for opening and closing doors and windows, navigation
and conversation, but all in a controlled environment of a
virtual reality (VR) system. Twelve subjects were evaluated, and an average classification accuracy of 67.51% over all subjects and all decisions was achieved. Another study based on VR to create a controlled environment was performed (Ou et al. (2012)). In that work, the term Brain Computer Interface-based Smart Environmental Control System (BSECS) was introduced, and a BCI technique with Universal Plug and Play (UPnP) home networking for smart house applications was proposed. Also, an architecture was designed in which the air conditioner and lights/lamp can be successfully and automatically adjusted in real time based on changes in the cognitive state of users.
A hybrid BCI for improving the usability of a smart home
control was reported in (Edlinger and Guger (2012)). In
that study, P300 and SSVEP approaches were used. Results indicated that P300 is very suitable for applications
with several controllable devices, where a discrete control
command is desired. However, that study also reports that
SSVEP is more suitable if a continuous control signal is
needed and the number of commands is rather limited. A
simple threshold criterion was used to determine if the
user is looking at the flickering light. All the different
commands were summarized in 7 control masks: a light
mask, a music mask, a phone mask, a temperature mask,
a TV mask, a move mask and a go to mask. That study
was also tested in a VR. A similar approach using a hybrid
BCI paradigm based on P300 and SSVEP is reported
in (Wang et al. (2014)), where a Canonical Correlation
Analysis (CCA) technique was applied for the SSVEP
detection. Applications involving robotic wheelchairs and
SSVEP signals were also reported in (Muller et al. (2011);
Xu et al. (2012); Diez et al. (2013); Singla et al. (2014)).
A recent study using SSVEP and P300 approaches for
wheelchair control was reported in (Li et al. (2013)). On
the other hand, a hybrid BCI based on SSVEP and visual
motion stimulus was applied to a robotic wheelchair in
(Punsawad and Wongsawat (2013)). Finally, studies that
combine motor imagery (MI) and SSVEPs to control a real
wheelchair were reported in (Bastos et al. (2011) and Cao
et al. (2014)).
In this work, we address the problem of integrating a
wheelchair into a smart environment. Due to the variety
of disabilities that benefit from assistive technologies, an
optimal approach could allow the user to choose the preferred control paradigm according to the degree of his/her
disability. The system allows the handling of various devices in a real environment, e.g. a room, by means of
biological signals controlled from a robotic wheelchair. We
present this system with three kinds of assistive control paradigms using, respectively, muscle (sEMG), EOG and brain signals (EEG), as shown in Fig. 1.
Fig. 1. Levels of capacity of our proposed system: sEMG signal (face muscles: clenching and raising brow), EOG signal (ocular globe: left and right) and EEG signal (SSVEP).


2. METHODS
Different stages were addressed as follows:
2.1 Assistive System
In the context of smart environments applied to assistive
technologies, this paper proposes an input interface allowing people with disabilities to turn on and off appliances
without help, from the wheelchair. Part of this work was to
design and build a smart box that allows controlling up to
four appliances in an environment, including TV set, radio,
lamp/lights and fan. sEMG, EOG and EEG signals were
recorded using a biological signal acquisition device. The
user can issue commands from the wheelchair, and then the signal is transmitted through RF to the smart box, where the corresponding appliance is finally operated. We used RF communication to turn the appliances on and off remotely. The RF transmitter and receiver work at a frequency of 433 MHz, controlled by an Arduino Mega
microcontroller. The communication is unidirectional, that
is, only the transmitter sends the data to the receiver.
On the other hand, an SSVEP-BCI was used to control the
navigation of the wheelchair. The wheelchair is equipped
with a small 10.1-inch display that exhibits the Control
Interface (CI). In addition, four small boxes containing
four white Light-Emitting Diodes (LEDs) were placed on the four sides of this display as visual stimuli for
the generation of evoked potentials.
The CI uses the display to visualize a menu, through which
the user can navigate or operate the desired device. It is
worth noting that this menu is dynamic, as device options
can change according to the current room, or be customized by the user before running the system. Moreover,
for some devices we provide additional operations that can
be performed using a sub-menu. For example, after turning
on the TV, the display shows a sub-menu with options
such as Channel Up, Channel Down, Volume Up, Volume
Down. It is always possible to go back to the main menu
and turn off the system.
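A minimal sketch of such a dynamic menu with sub-menus follows; the device and option names here are illustrative, not taken from the actual CI implementation:

```python
# Illustrative dynamic menu: selecting a device either opens its
# sub-menu or returns the final option itself.
MENUS = {
    "main": ["TV", "Radio", "Lamp", "Fan", "Turn off system"],
    "TV": ["Channel Up", "Channel Down", "Volume Up", "Volume Down", "Back"],
}

def select(menu, index):
    """Return the sub-menu opened by the chosen option, or the option."""
    option = MENUS[menu][index]
    return MENUS.get(option, option)
```

Per-room customization would amount to swapping the lists stored under each key before running the system.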
The CI was presented in (Ferrara et al. (2015)). It offers
an interface of procedures that can be accessed through
Remote Procedure Call (RPC), that is, one for turning the
interface on, one for turning it off, and one for transmitting


logical commands encoded as integers. In (Ferrara et al. (2015)), it was demonstrated that choosing three kinds of paradigms is optimal to operate the menu. In this work, we used a novel method to assess the overall performance of the system, named Utility (Dal Seno et al. (2010)), while using the preferred control paradigm.

2.2 sEMG and EOG control

For individuals with disabilities that do not affect voluntary control of facial muscles, we propose a system based on sEMG and EOG signals. With the aim of ensuring high reliability, we defined the three paradigms aforementioned: moving eyes horizontally (left/right), raising brow and prolonged clench. For processing sEMG/EOG signals, a comparison of the signal amplitude with a predefined threshold value was performed. Aspects involving the width and duration of the signal components were considered in the classification decision. The signals corresponding to the left-right eye movements were dominant in the horizontal axes; for those signals, we found very prolonged amplitude and duration, opposite polarities and high amplitude. Thus, the user is able to control the options by moving the eyes, whereas raising the brow is used to confirm the highlighted option. Finally, the prolonged clench is used for activation and deactivation of the SSVEP-BCI. Fig. 2 shows a summary of the three paradigms allocated for the sEMG/EOG approaches.

Fig. 2. Biological signal transducer module for sEMG/EOG signals.

Each individual performed the sEMG/EOG tasks in a different personal fashion. Thus, it is quite optimistic to expect that a new user is able to achieve optimal performance from the first trial. However, our tests revealed that for most users only a very brief adaptation period was required to figure out the best way to perform each gesture. This adaptation period consisted of 2 or 3 minutes in which the user tries to execute a command and observes visual feedback on an LCD screen. Although this step is not required, it is recommended, considering that it can accelerate the development of the user's control skills. Online experiments with eight healthy subjects were performed. The subjects were seated on the wheelchair, in front of the display where the program was running, and asked to perform thirty repetitions per command, resulting in 90 trials per subject. In a control panel, the levels of clenching and raising brow are expressed as continuous values between 0 and 1, so it is sufficient to set predefined thresholds to detect the two movements. We opted to trigger an action when the corresponding value is greater than 0.8 and the other one is less than 0.2. These thresholds were obtained empirically; we found them to provide good results with most subjects. For detecting eye movements, our system provides a discrete number with values 0 and 1. We analyzed the classification accuracy of 720 total trials. Hence, the average accuracy and trial duration using Utility were computed.

2.3 Estimation of MSI for SSVEP-BCI

This stage is in charge of controlling the navigation of the robotic wheelchair. Five subjects (three males and two females), with ages from 21 to 27 years old, were recruited to participate in this study (average age: 25.6; Standard Deviation (STD): 2.61). The research was carried out in compliance with the Helsinki declaration, and the experiments were performed according to the rules of the ethics committee of UFES/Brazil, under registration number CEP-048/08.
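The dual-threshold rule used for the muscle commands in Section 2.2 can be sketched as follows; this is a minimal illustration with our own names, using the 0.8/0.2 thresholds reported there:

```python
def detect_gesture(clench, brow, hi=0.8, lo=0.2):
    """Classify one sample of the two normalized muscle levels (0..1).

    A command fires only when one level exceeds `hi` while the other
    stays below `lo`, which helps avoid accidental double triggers."""
    if clench > hi and brow < lo:
        return "prolonged_clench"
    if brow > hi and clench < lo:
        return "raise_brow"
    return None  # no command detected
```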

For the development of the SSVEP-BCI, 12 channels of


EEG with the reference at the left ear lobe were recorded
at 600 samples/s, with 1 to 100 Hz pass-band. The ground
electrode was placed on the forehead. The EEG electrode
placements were based on the International 10-20 System.
The electrodes used were: P7, PO7, PO5, PO3, POz, PO4,
PO6, PO8, P8, O1, O2 and Oz. The equipment used
for EEG signal recording was the BrainNet-36. The timing of the four LED flickers was precisely controlled by a microcontroller (PIC18F4550, Microchip Technology Inc., USA) with 50/50% on-off duty cycles, and frequencies of 8.0 Hz (top), 11.0 Hz (right), 13.0 Hz (bottom) and 15.0 Hz (left). To send commands to the wheelchair, the user has to fix his/her attention on one of the flickering stimuli. The EEG data are segmented and windowed in window lengths (WL) of 1 s with an overlap of 50%. Then, spatial filtering is applied using a Common Average Reference (CAR) filter and a band-pass filter from 3 to 60 Hz for
the twelve channels. Several studies (Vialatte et al. (2010);
Pastor et al. (2003)) confirm that visual evoked potentials
are generated with greater intensity on the occipital area
of the cortex. Thus, the twelve electrodes were used in
the initial stage only for application of a CAR spatial
filter. According to our observations, the application of
this spatial filter to the twelve electrodes improves the
classification performance when selecting O1, O2 and Oz
electrodes. Based on that fact, we have evaluated the
detection of SSVEPs using these three channels as input
vector for the feature extractor after the filtering process.
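The preprocessing chain described above (1 s windows with 50% overlap, CAR over the twelve channels, then selection of O1, O2 and Oz) can be sketched as follows; the channel ordering and index names are our assumptions, not the authors' code:

```python
import numpy as np

def car_filter(eeg):
    """Common Average Reference: subtract the instantaneous mean over
    all channels from each channel (rows = channels, cols = samples)."""
    return eeg - eeg.mean(axis=0, keepdims=True)

def windows(eeg, fs=600, wl=1.0, overlap=0.5):
    """Yield successive windows of `wl` seconds with the given overlap."""
    size = int(wl * fs)
    step = int(size * (1 - overlap))
    for start in range(0, eeg.shape[1] - size + 1, step):
        yield eeg[:, start:start + size]

# Illustrative only: positions of O1, O2 and Oz in our 12-channel montage
OCCIPITAL = [9, 10, 11]
```

Each window would then be reduced to its three occipital rows (`window[OCCIPITAL]`) before feature extraction.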
Multivariate Synchronization Index (MSI) was used for
feature extraction. A brief description of this technique is
explained below.
MSI is a novel method to estimate the synchronization between the actual mixed signals and the reference signals as a potential index for recognizing the stimulus frequency. (Zhang et al. (2014)) proposed the use of an S-estimator as index, which is based on the entropy of the normalized eigenvalues of the correlation matrix of multivariate signals. The autocorrelation matrices C11 and C22 for X and Yi, respectively, and the cross-correlation matrices C12 and C21 for X and Yi can be obtained as (Tello et al. (2014)), where i refers to the number of targets:

C11 = (1/N) X X^T      (1)
C22 = (1/N) Yi Yi^T    (2)
C12 = (1/N) X Yi^T     (3)
C21 = (1/N) Yi X^T     (4)

A correlation matrix Ci can be constructed as

Ci = [ C11  C12 ]
     [ C21  C22 ]      (5)

The internal correlation structure of X and Yi contained in the matrices C11 and C22, respectively, is irrelevant for the detection of the stimulus frequency (Carmeli et al. (2005)). It can be removed by constructing the linear transformation matrix

U = [ C11^(-1/2)       0      ]
    [      0      C22^(-1/2)  ]      (6)

where C11^(1/2) C11^(1/2) = C11 and C22^(1/2) C22^(1/2) = C22. Applying the transformation C~i = U Ci U^T results in a transformed correlation matrix of size P x P, where P = M + 2H (Carmeli et al. (2005)). The eigenvalues lambda^i_1, lambda^i_2, ..., lambda^i_P of C~i, normalized as lambda'^i_m = lambda^i_m / (sum_{m=1}^{P} lambda^i_m) for m = 1, 2, ..., P, can be used to evaluate the synchronization index Si for matrix Yi as

Si = 1 + [sum_{m=1}^{P} lambda'^i_m log(lambda'^i_m)] / log(P)      (7)

see (Zhang et al. (2014)). Using S1, S2, ..., SK computed for the stimulus frequencies f1, f2, ..., fK, the MSI output can be estimated as

S = max_{1 <= i <= K} Si      (8)
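Eqs. (1)-(8) can be sketched in a few lines of NumPy. This is a minimal illustration under our own naming; the reference set Yi is built, as is customary for MSI, from sine/cosine pairs at each stimulus frequency and its first H harmonics:

```python
import numpy as np

def _inv_sqrt(A):
    """Inverse matrix square root of a symmetric positive-definite matrix."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

def msi(X, Y):
    """Synchronization index S_i (Eq. (7)) between an EEG window X
    (M x N) and the reference set Y (2H x N) of one stimulus."""
    N = X.shape[1]
    C11, C22, C12 = X @ X.T / N, Y @ Y.T / N, X @ Y.T / N   # Eqs. (1)-(4)
    C = np.block([[C11, C12], [C12.T, C22]])                # Eq. (5)
    Z = np.zeros((X.shape[0], Y.shape[0]))
    U = np.block([[_inv_sqrt(C11), Z],
                  [Z.T, _inv_sqrt(C22)]])                   # Eq. (6)
    lam = np.clip(np.linalg.eigvalsh(U @ C @ U.T), 1e-12, None)
    lam = lam / lam.sum()                                   # normalized eigenvalues
    return 1.0 + (lam * np.log(lam)).sum() / np.log(lam.size)  # Eq. (7)

def msi_classify(X, freqs, fs=600.0, H=2):
    """Pick the stimulus frequency maximizing S_i (Eq. (8))."""
    t = np.arange(X.shape[1]) / fs
    def refs(f):
        return np.vstack([g(2.0 * np.pi * h * f * t)
                          for h in range(1, H + 1) for g in (np.sin, np.cos)])
    return max(freqs, key=lambda f: msi(X, refs(f)))
```

S_i lies in [0, 1]: it is 0 for uncorrelated signals (uniform eigenvalue entropy) and approaches 1 under strong synchronization.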

In this case, the way to assess the performance of the SSVEP-BCI system was Shannon's Information Transfer Rate (ITR); see details in (Vialatte et al. (2010)).
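For reference, a commonly used closed form of the ITR for K equiprobable targets with accuracy p and selection time T seconds is the Wolpaw-style formulation below; we show it only as an illustration, since (Vialatte et al. (2010)) discusses the details:

```python
import math

def itr_bits_per_min(K, p, T):
    """ITR in bits/min for K equiprobable targets, accuracy p (0 < p <= 1),
    and selection time T in seconds (Wolpaw-style formulation)."""
    if p >= 1.0:
        bits = math.log2(K)          # perfect accuracy
    else:
        bits = (math.log2(K) + p * math.log2(p)
                + (1.0 - p) * math.log2((1.0 - p) / (K - 1)))
    return bits * 60.0 / T
```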
3. SYSTEM ARCHITECTURE
sEMG, EOG and EEG signals are captured by the signal
acquisition equipment, which has inputs for electromyographic and electroencephalographic signals. Through a computational sniffer, the biological signals are read from the equipment; these signals are then transmitted to and processed in an embedded computer by algorithms developed in Matlab. This embedded computer in the wheelchair has the following specifications: Mini-ITX motherboard, 3.40 GHz Intel Core i5 processor, and 4 GB RAM. The
data are analyzed in the main routine and the prolonged
clench signal works as a switch, which determines the operation of the navigation or the smart environment control.
Raising the brow or moving the eyes is used to control the devices inside the house. Navigation of the wheelchair is executed through the SSVEP-BCI approach, whose commands are used for directional control of the wheelchair: the LED on the top (8 Hz) indicates moving forward, the LED on the right (11 Hz) indicates turning right, the LED on the left (15 Hz) indicates turning left and, finally, the LED on the bottom (13 Hz) indicates stopping.
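The frequency-to-command mapping described above can be written down directly. This is a trivial sketch; the constant and function names are ours, and the scores are assumed to be the synchronization indices S_i of Eq. (7):

```python
# LED stimulus frequency (Hz) -> wheelchair command, as described in the text.
COMMANDS = {8.0: "FORWARD", 11.0: "TURN_RIGHT", 15.0: "TURN_LEFT", 13.0: "STOP"}

def command_from_scores(scores):
    """scores: dict mapping each stimulus frequency to its synchronization
    index; the command of the best-matching stimulus is issued (Eq. 8)."""
    best = max(scores, key=scores.get)
    return COMMANDS[best]
```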
Fig. 3 shows all electrode placement locations and a user on the wheelchair. The sEMG/EOG electrodes were placed at the lateral canthus of the eyes (positions F7 and F8 according to the 10-20 standard) in order to monitor the frontal lobe. In addition, Fig. 4 shows the general block diagram of the proposed system.

Fig. 3. (a) Electrode placement locations using the 10-20 system for our system; (b) a user using the wheelchair.
4. EXPERIMENTAL RESULTS
Table 2 summarizes the classification outcome for the muscle movement tasks. Each movement is treated independently because a subject may be able to execute some movements much better (or worse) than others. It can be noticed that the average accuracy is remarkably higher than random guessing and often expresses a very good performance of the classifier (between 85% and 95%).
Nevertheless, as is often the case in assistive technologies, some users have trouble performing a certain movement, which prevents them from issuing the corresponding command. For example, subject 1 was not able to succeed in raising the brow. Since the computation of Utility depends on which menu option the user wants to operate, we consider the mean Utility as the arithmetic mean of the Utility in a menu with four options, and express the result in bits per minute to facilitate comparison with the ITR.
A critical advantage of control by means of muscle movements is the speed of recognition: an operation is triggered after a very short time. Fig. 5 presents the results of the online tests using facial expressions. Each point represents an expression made by a user; points in the top-left corner indicate better performance, and the size of the dots represents the value of Utility. This could lead to a fast and efficient control paradigm, especially after a period of training and self-improvement.
For evaluation of the EEG control, each volunteer fixed his/her attention on each stimulus for 30 seconds; the results are shown in Table 1. The results from the SSVEP-BCI were acceptable considering that the subjects had never used a BCI and had no previous training. The highest accuracy was achieved by subject 3, whose average accuracy of 77% corresponds to an ITR of 51.27 bits/min.
IFAC SYROCO 2015. August 26-28, 2015. Salvador, BA, Brazil
Richard J. M. G. Tello et al. / IFAC-PapersOnLine 48-19 (2015) 136-141
[Fig. 4 here: block diagram in which the biosignal acquisition equipment feeds brain signals (EEG), ocular globe signals (EOG) and face muscle signals (sEMG) to the main routine on the PC for pre-processing, feature extraction and classification; a prolonged-clench (sEMG) switch selects between smart environment control (EOG + sEMG, raising brow) and navigation control of the UFES robotic wheelchair via SSVEP visual stimuli at 8, 11, 13 and 15 Hz.]

Fig. 4. The system paradigm of our multimodal system.
Fig. 5. Comparison between the control of the system from different muscle movements and the speed of recognition.

5. CONCLUSION

In this paper we presented a multimodal system capable of employing different biological signals in order to control several home appliances and the navigation of a robotic wheelchair. Three control paradigms, using sEMG, EOG and EEG signals, were introduced, and each stage showed good performance for most subjects. Our strategy provides reliability in terms of classification results and safety of wheelchair control. To evaluate the sEMG/EOG system, eight subjects participated in the experiments: the average accuracy was 95% for the prolonged clench approach, 85% for raising the brow and 93% for moving the eyes. Five subjects participated in the SSVEP-BCI study; their overall classification accuracies varied in the range of 45-77%, considering a WL of 1 s. Our results could improve further, since it is widely known that accuracy increases with longer time windows (Tello et al. (2014)); this suggests that, as more information is processed, the feature extractor can detect visual evoked potentials with more precision.

ACKNOWLEDGEMENTS

The authors wish to thank all the volunteers for their participation in the experiments.

Table 1. Accuracy results for EEG control using WL of 1 s.

SSVEP Frequency   s1      s2      s3      s4      s5
8 Hz              0.88    0.82    0.90    0.68    0.58
11 Hz             0.80    0.70    0.78    0.92    0.52
13 Hz             0.55    0.73    0.82    0.67    0.42
15 Hz             0.54    0.42    0.58    0.56    0.27
Mean Acc.         0.70    0.67    0.77    0.71    0.45
ITR [bits/min]    38.29   33.49   51.27   39.71   7.90

REFERENCES
Bastos, T., Muller, S., Benevides, A., and Sarcinelli-Filho, M. (2011). Robotic wheelchair commanded by SSVEP, motor imagery and word generation. In IEEE Engineering in Medicine and Biology Society (EMBC).
Cao, L., Li, J., Ji, H., and Jiang, C. (2014). A hybrid brain computer interface system based on the neurophysiological protocol and brain-actuated switch for wheelchair control. Journal of Neuroscience Methods, 229, 33-43.
Carmeli, C., Knyazeva, M.G., Innocenti, G.M., and Feo, O.D. (2005). Assessment of EEG synchronization based on state-space analysis. NeuroImage, 25, 339-354.

Chen, C.H., Ho, M.S., Shyu, K.K., Hsu, K.C., Wang, K.W., and Lee, P.L. (2014). A noninvasive brain computer interface using visually-induced near-infrared spectroscopy responses. Neuroscience Letters, 580, 22-26.
Dal Seno, B., Matteucci, M., and Mainardi, L. (2010). The Utility Metric: A Novel Method to Assess the Overall Performance of Discrete Brain-Computer Interfaces. Neural Systems and Rehabilitation Engineering, IEEE Transactions on, 18(1), 20-28.
Diez, P.F., Muller, S.M.T., Mut, V.A., Laciar, E., Avila, E., Bastos-Filho, T.F., and Sarcinelli-Filho, M. (2013). Commanding a robotic wheelchair with a high-frequency steady-state visual evoked potential based brain-computer interface. Med Eng Phys.
Edlinger, G. and Guger, C. (2012). A hybrid Brain-Computer Interface for improving the usability of a smart home control. In Complex Medical Engineering (CME), 2012 ICME, 182-185.
Ferrara, F., Bissoli, A., and Bastos-Filho, T. (2015). Designing an Assistive Control Interface based on Utility. Proceedings of the 1st International Workshop on Assistive Technology, IWAT 2015, Vitoria, Brazil, 142-145.
Gao, X., Xu, D., Cheng, M., and Gao, S. (2003). A BCI-based environmental controller for the motion-disabled. Neural Systems and Rehabilitation Engineering, IEEE Transactions on, 11(2), 137-140.
He, B. (2013). Neural Engineering. Springer. 2nd ed.
Holzner, C., Guger, C., Edlinger, G., Gronegress, C., and Slater, M. (2009). Virtual Smart Home Controlled by Thoughts. In Enabling Technologies: Infrastructures for Collaborative Enterprises, 2009, WETICE '09, 18th IEEE International Workshops on, 236-239.
Kelly, S., Lalor, E., Reilly, R., and Foxe, J. (2005). Visual spatial attention tracking using high-density SSVEP data for independent brain-computer communication. Neural Systems and Rehabilitation Engineering, IEEE Transactions on, 13(2), 172-178.
Li, Y., Pan, J., Wang, F., and Yu, Z. (2013). A Hybrid BCI System Combining P300 and SSVEP and Its Application to Wheelchair Control. Biomedical Engineering, IEEE Transactions on, 60(11), 3156-3166.
Muller, S.M.T., de Sa, A.M.F.L.M., Bastos-Filho, T.F., and Sarcinelli-Filho, M. (2011). Spectral techniques for incremental SSVEP analysis applied to a BCI implementation. V CLAIB, La Habana, 1090-1093.
Ou, C.Z., Lin, B.S., Chang, C.J., and Lin, C.T. (2012). Brain Computer Interface-based Smart Environmental Control System. In Intelligent Information Hiding and Multimedia Signal Processing (IIH-MSP), 281-284.
Pastor, M., Artieda, J., Arbizu, J., Valencia, M., and Masdeu, J. (2003). Human cerebral activation during steady-state visual evoked response. J. Neurosci.
Punsawad, Y. and Wongsawat, Y. (2013). Hybrid SSVEP-motion visual stimulus based BCI system for intelligent wheelchair. In Engineering in Medicine and Biology Society (EMBC), 7416-7419.
Singla, R., Khosla, A., and Jha, R. (2014). Influence of stimuli colour in SSVEP-based BCI wheelchair control using support vector machines. Journal of Medical Engineering & Technology, 38, 125-134.
Tello, R., Muller, S., Bastos-Filho, T., and Ferreira, A. (2014). A comparison of techniques and technologies for SSVEP classification. In Biosignals and Biorobotics Conference (BRC), 5th ISSNIP-IEEE, 1-6.


Vialatte, F.B., Maurice, M., Dauwels, J., and Cichocki, A. (2010). Steady state visually evoked potentials: Focus on essential paradigms and future perspectives. Progress in Neurobiology, 90, 418-438.
Wang, M., Daly, I., Allison, B.Z., Jin, J., Zhang, Y., Chen, L., and Wang, X. (2014). A new hybrid BCI paradigm based on P300 and SSVEP. Journal of Neuroscience Methods.
Wolpaw, J., Birbaumer, N., Heetderks, W., McFarland, D., Peckham, P., Schalk, G., Donchin, E., Quatrano, L., Robinson, C., and Vaughan, T. (2000). Brain-computer interface technology: a review of the first international meeting. Rehabilitation Engineering, IEEE Transactions on, 8(2), 164-173.
Xu, Z., Li, J., Gu, R., and Xia, B. (2012). Steady-State Visually Evoked Potential (SSVEP)-Based Brain-Computer Interface (BCI): A Low-Delayed Asynchronous Wheelchair Control System. Neural Information Processing, 19th ICONIP, Springer, 305-314.
Zhang, D., Maye, A., Gao, X., Hong, B., Engel, A.K., and Gao, S. (2010). An independent brain-computer interface using covert non-spatial visual selective attention. J. Neural Eng.
Zhang, Y., Xu, P., Cheng, K., and Yao, D. (2014). Multivariate synchronization index for frequency recognition of SSVEP-based brain computer interface. Journal of Neuroscience Methods, 221(0), 32-40.
Table 2. Summary results for sEMG/EOG control.

Prolonged clench
Subject    Acc.    Time [s]    Utility [bits/min]
1          1.00    1.06        58.80
2          1.00    0.86        72.60
3          0.97    1.29        45.18
4          1.00    2.08        30.00
5          0.93    3.09        17.52
6          0.83    2.07        20.13
7          0.87    3.86        11.87
8          1.00    2.20        28.38
Average    0.95    2.06        35.56

Raising brow
Subject    Acc.    Time [s]    Utility [bits/min]
1          -       -           -
2          0.90    1.35        37.20
3          1.00    1.12        55.80
4          0.93    1.87        28.92
5          0.73    2.41        12.00
6          0.80    1.55        24.18
7          0.97    3.41        17.10
8          0.63    2.81        5.88
Average    0.85    2.07        25.87

Moving eyes
Subject    Acc.    Time [s]    Utility [bits/min]
1          1.00    1.60        39.00
2          1.00    1.58        39.60
3          1.00    1.87        33.60
4          0.73    6.32        4.56
5          1.00    2.39        26.15
6          0.97    1.43        40.79
7          0.73    4.07        7.16
8          0.97    3.67        15.89
Average    0.93    2.87        25.85
