
IEEE TRANSACTIONS ON INFORMATION TECHNOLOGY IN BIOMEDICINE, VOL. 15, NO. 5, SEPTEMBER 2011


A Novel Emotion Elicitation Index Using Frontal
Brain Asymmetry for Enhanced EEG-Based
Emotion Recognition
Panagiotis C. Petrantonakis, Student Member, IEEE, and Leontios J. Hadjileontiadis, Senior Member, IEEE
Abstract: This paper provides a novel method for evaluating the emotion elicitation procedures in an electroencephalogram (EEG)-based emotion recognition setup. Employing the frontal brain asymmetry theory, an index, namely the asymmetry index (AsI), is introduced in order to evaluate this asymmetry. This is accomplished by a multidimensional directed information analysis between different EEG sites from the two opposite brain hemispheres. The proposed approach was applied to three-channel (Fp1, Fp2, and F3/F4 10-20 sites) EEG recordings drawn from 16 healthy right-handed subjects. For the evaluation of the efficiency of the AsI, an extensive classification process was conducted using two feature-vector extraction techniques and an SVM classifier for six different classification scenarios in the valence/arousal space. This resulted in classification rates up to 62.58% for the user-independent case and 94.40% for the user-dependent one, confirming the efficacy of AsI as an index for emotion elicitation evaluation.

Index Terms: Emotion elicitation, emotion recognition, electroencephalogram, frontal brain asymmetry, multidimensional directed information.
I. INTRODUCTION

Human-machine interaction (HMI) has gained intense attention in the last decade, as machines have dramatically influenced many aspects of our lives, such as communication, profession, and entertainment. It has lately been argued [1] that if machines could understand a person's affective state, HMI may become more intuitive, smoother, and more efficient, defining a new approach in the HMI area known as affective computing (AC). AC is the research field that deals with the design of systems and devices that can recognize, interpret, and process human emotions, and would serve as the means of imbuing machines with the ability to act emotionally.

Emotion recognition is the first step toward the abovementioned ultimate endeavor of AC. In order to recognize emotions, many approaches have been proposed, basically using facial expressions [2], [3], speech [4], [5], signals from the autonomic nervous system (ANS) (e.g., heart rate, galvanic skin response) [6], [7], or even a combination of them
Manuscript received November 6, 2010; revised April 13, 2011; accepted May 20, 2011. Date of publication May 27, 2011; date of current version September 2, 2011.
The authors are with the Department of Electrical and Computer Engineering, Aristotle University of Thessaloniki, GR-54124 Thessaloniki, Greece (e-mail: ppetrant@auth.gr; leontios@auth.gr).
Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.
Digital Object Identifier 10.1109/TITB.2011.2157933
[8]. Lately, electroencephalogram-based emotion recognition (EEG-ER) [9]-[11] has been proposed, which offers great benefits with regard to its implementation, such as efficient time resolution, less intrusiveness, and signals captured from the origin of emotion genesis, i.e., the central nervous system (CNS).

During an EEG-ER realization, three major steps have to be implemented. First, the emotion elicitation step deals with the problem of efficiently evoking emotions in the subjects who participate in the corresponding experiment. The second and third steps have to do with preprocessing and classification of the captured data, respectively. Their effectiveness, however, is highly dependent on the emotion elicitation, which is therefore of major importance in an EEG-ER system. If the subjects have not effectively become emotionally aroused during the emotion elicitation step, the respective signals will not carry the corresponding emotional information, resulting in an incompetent emotion classification process.
Emotion elicitation procedures are mostly based on projections of videos or pictures that are assumed to evoke certain emotions. The majority of the studies that dealt with the emotion recognition problem (e.g., [11]) conducted emotion elicitation processes that consisted of such projections in multiple trials for each emotional state and used all of the signals corresponding to each one of the trials for the classification step. Considering that the situations depicted in the aforementioned videos/pictures are emotionally interpreted by the individuals in a way that is highly influenced by factors such as personality and personal experiences, it is significantly questionable whether all of the trials have the same emotional impact on all of the subjects. Different subjects may be emotionally affected by different videos/pictures.

In this work, a novel measure to evaluate the emotional impact of each emotion elicitation trial via the EEG signals is introduced. Consequently, by picking out the emotion elicitation trials that correspond to significant emotional responses, according to the introduced evaluation measure, the succeeding classification would exhibit considerable rate improvement.
In order to define an emotion elicitation evaluation measure, the frontal brain asymmetry concept [12] was used, exploiting the neuronal bases of emotion expression in the brain. To extract this evaluation measure in the form of an index, namely the asymmetry index (AsI), a multidimensional directed information (MDI) analysis [13] was adopted, resulting in a robust mathematical representation of the asymmetry concept, in a manner that efficiently handles the information flow (in bits) among multiple brain sites. In order to proceed with the assessment of the effectiveness of the AsI, an extended classification process took place. A support vector machine (SVM) classifier and two feature-vector extraction techniques were employed for the classification process. EEG signals were acquired from 16 healthy subjects using three EEG channels, which are sufficient both for the representation of emotion expression in the brain (see Section II) and for the implementation of the MDI analysis [13]. Experimental results demonstrated the potential of AsI as a robust evaluation criterion for the emotion elicitation step of an EEG-ER system.

The rest of the paper is structured as follows. Section II gives some background material with regard to the emotion elicitation process and the frontal brain asymmetry concept, thoroughly describes the MDI method, and introduces the proposed approach. Section III discusses some implementation issues, such as the construction of the dataset and the classification setup, whereas Section IV presents the results. Section V provides some discussion on the overall evaluation of the proposed methodology. Finally, Section VI concludes the paper.
II. METHOD

A. Background

1) Valence/Arousal-Based Emotion Elicitation: Psychologists do not present emotions as discrete states but rather as continuous ones, and therefore place them in an n-dimensional (n-D) space; usually the 2-D valence/arousal space (VAS) is adopted. Valence stands for one's judgment of a situation as positive or negative, and arousal spans from calmness to excitement, expressing the degree of one's excitation. The most frequently used technique to evoke emotion during an EEG-ER procedure is to use pictures depicting situations that are supposed to elicit affective states lying in the VAS. The International Affective Picture System (IAPS) [14] is a widely used dataset containing such pictures, which come with their individual values of valence and arousal. Consequently, the projection of the IAPS pictures to a subject with simultaneous recording of EEG signals formulates an emotion elicitation process originating from the VAS theory.
2) Frontal Brain Asymmetry: The asymmetry between the left and right brain hemispheres forms the most prominent expression of emotion in brain signals. Davidson et al. [12] developed a model that relates this asymmetric behavior to emotions, with the latter analyzed in the VAS. According to that model, emotions are: 1) organized around approach-withdrawal tendencies and 2) differentially lateralized in the frontal region of the brain. The left frontal area is involved in the experience of positive emotions (high values of valence), such as joy or happiness (the experience of positive affect facilitates and maintains approach behaviors), whereas the right frontal region is involved in the experience of negative emotions (lower valence values), such as fear or disgust (the experience of negative affect facilitates and maintains withdrawal behaviors). In order to quantify the abovementioned asymmetry, Davidson used the asymmetry index DI = (L - R)/(L + R), where L and R are the powers of specific bands of the left and right hemispheres, respectively. Later, the asymmetry concept was also confirmed by other studies. For example, Davidson et al. [15] tried to differentiate the emotions of happiness and disgust with EEG signals captured from the left and right frontal, central, anterior temporal, and parietal regions (F3, F4, C3, C4, T3, T4, P3, P4 positions according to the 10-20 system [16]). The results revealed a more right-sided activation, as far as the power of the alpha (8-12 Hz) band of the EEG signal is concerned, for the disgust condition in both the frontal and anterior temporal regions. Thus, the results enhanced the applicability of the aforementioned model (this model is used by the method described in succeeding sections to define the AsI as a metric of emotion arousal (see Section II.C)) and confirmed the evidenced extensive anatomical reciprocity of both regions with limbic circuits that have been directly implicated in the control of emotion [17]. Furthermore, Davidson [18] cites several studies that have examined frequency bands other than alpha, including theta, beta, and gamma. In these studies, alpha power was also examined and, in a number of cases, asymmetrical effects were found in bands other than alpha, while effects in the alpha band were absent. Moreover, Bos [11] examined the efficacy of alpha, beta, and their combination to discriminate emotions within the VAS and concluded that both bands include important information for the aforementioned discrimination. This perspective is adopted in this study (see Section III.A).
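Davidson's band-power ratio DI = (L - R)/(L + R) can be sketched in code. The following is an illustrative implementation only: the periodogram-based band-power estimate and the function names are our assumptions, not the original study's exact pipeline.

```python
import numpy as np

def band_power(x, fs, lo=8.0, hi=12.0):
    """Mean periodogram power of x within the [lo, hi] Hz band (alpha by default)."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].mean()

def davidson_di(left, right, fs):
    """Davidson's DI = (L - R) / (L + R) on the band powers of a left/right
    electrode pair; DI > 0 indicates relatively greater left-side power."""
    L, R = band_power(left, fs), band_power(right, fs)
    return (L - R) / (L + R)
```

By construction, DI is bounded in [-1, 1] and is sign-interpretable: positive values point to left-dominant band power.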
B. Multidimensional Directed Information

Correlations among multiple time series are in general expected when they are simultaneously observed from an object. If a relation of temporal ordering is noted in the correlation relation among these time series, some are interpreted as causes and others as results, suggesting a cause-effect relation among the time series (causality analysis). When causality in such a sense is noted in multiple time series, the relation is defined as directed information [19]. There are methods suited to causality analysis, such as directed-coherence analysis [20], directed-information analysis [19], multidimensional directed information (MDI) analysis [13], Kaminski's method (DTF) [21], partial directed coherence [22], and Granger causality [23]. In this work, the MDI analysis was employed as a means to identify the causality between any two series considering all acquired series. One of the main advantages of MDI is that the amount of information propagation is presented as an absolute value in bits and not as a correlation, which is a relative value (e.g., in directed-coherence analysis [20]). A description of the MDI follows.
Consider the simple case of two stationary time series X and Y of length N divided into n epochs of length L = N/n; each epoch of length L = P + 1 + M is written as a sequence of two sections of lengths P and M before and after the sampled values x_k and y_k of time series X and Y at time k, respectively, i.e.,

X = x_{k-P} ... x_{k-1} x_k x_{k+1} ... x_{k+M} = X_P x_k X_M     (1)

Y = y_{k-P} ... y_{k-1} y_k y_{k+1} ... y_{k+M} = Y_P y_k Y_M     (2)

where X_P = x_{k-P} ... x_{k-1}; X_M = x_{k+1} ... x_{k+M}; Y_P = y_{k-P} ... y_{k-1}; Y_M = y_{k+1} ... y_{k+M}.
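The epoch notation of (1) and (2) can be sketched in code; this is a minimal NumPy illustration (function and variable names are ours, not the paper's):

```python
import numpy as np

def epochs(series, n):
    """Divide a series of length N into n non-overlapping epochs of length L = N // n."""
    L = len(series) // n
    return series[:n * L].reshape(n, L)

def split_epoch(epoch, P, M):
    """Split one epoch of length P + 1 + M into the past section X_P, the present
    sample x_k (at index P, i.e., time k), and the future section X_M, as in (1)."""
    assert len(epoch) == P + 1 + M
    return epoch[:P], epoch[P], epoch[P + 1:]
```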
The mutual information between the time series X and Y is written as

I(X; Y) = Σ_k I_k(X; Y)     (3)

where

I_k(X; Y) = I(x_k; Y_M | X_P Y_P y_k) + I(y_k; X_M | X_P Y_P x_k) + I(x_k; y_k | X_P Y_P).     (4)
The three terms on the right-hand side of (4) correspond to: the information shared by the sample x_k of X at time k and the future part Y_M of Y after time k (first term); the information shared by the sample y_k of Y at time k and the future part X_M of X after time k (second term); and the information that is not contained in the past parts X_P and Y_P of X and Y but is shared by x_k and y_k (third term). Since I_k(X; Y) represents mutual information, which is symmetric, we have I_k(X; Y) = I_k(Y; X), meaning that it contains no directivity, while the three terms on the right-hand side of (4) contain a temporal relation that produces directivity. This directivity is defined [19] as directed information and is denoted by an arrow for clarity. For example, the first term on the right-hand side of (4) can be written as

I(x_k; Y_M | X_P Y_P y_k) = I(x_k → Y_M | X_P Y_P y_k)     (5)
and analyzed as

I(x_k → Y_M | X_P Y_P y_k) = Σ_{m=1}^{M} I(x_k → y_{k+m} | X_P Y_P y_k)     (6)
where each term on the right-hand side of (6) can be interpreted as information that is first generated in X at time k and propagated with a time delay of m to Y, and can be calculated through the conditional mutual information as a sum of joint entropy functions:

I(x_k → y_{k+m} | X_P Y_P y_k) = H(X_P Y_P x_k y_k) + H(X_P Y_P y_k y_{k+m}) - H(X_P Y_P y_k) - H(X_P Y_P x_k y_k y_{k+m}).     (7)
According to [13], the joint entropy H(z_1 ... z_n) of n Gaussian stochastic variables z_1, ..., z_n can be calculated using the covariance matrix R(z_1 ... z_n) as

H(z_1 ... z_n) = (1/2) log[(2πe)^n |R(z_1 ... z_n)|]     (8)
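A sketch of (8) with an empirically estimated covariance matrix follows. Base-2 logarithms are assumed here so that entropies (and hence the MDI) come out in bits, as the paper states; the estimation from samples is our illustrative choice.

```python
import numpy as np

def joint_entropy(samples):
    """Joint entropy (in bits) of n jointly Gaussian variables, per (8):
    H = (1/2) * log2[(2*pi*e)^n * |R|], with R the covariance matrix estimated
    from `samples`, an (n_variables, n_observations) array."""
    n = samples.shape[0]
    R = np.atleast_2d(np.cov(samples))
    _, logdet = np.linalg.slogdet(R)          # natural-log determinant, stable
    return 0.5 * (n * np.log2(2 * np.pi * np.e) + logdet / np.log(2))
```

Using `slogdet` instead of `det` avoids overflow/underflow when many variables are stacked, as happens in (9)-(11).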
where |·| denotes the determinant; by using (8), (7) can be written as

I(x_k → y_{k+m} | X_P Y_P y_k) = (1/2) log [ |R(X_P Y_P x_k y_k)| |R(X_P Y_P y_k y_{k+m})| / ( |R(X_P Y_P y_k)| |R(X_P Y_P x_k y_k y_{k+m})| ) ].     (9)
So far, the calculation of the information flow between two series was presented. When the relations among three or more series are to be examined, however, the correct result of the analysis is generally not obtained if the aforementioned method is applied to each pair of series [13]. To better comprehend this, consider that there exist three series X, Y, and Z (instead of just the two X and Y) with information flow from Z to both X and Y but not between X and Y. When conventional directed-information analysis, based only on the relation between X and Y, is applied, an information flow would wrongly be identified, as if there exists a flow between X and Y, since they contain the common component from Z. To circumvent this ambiguity, the aforementioned method has to be expanded accordingly, in order to consider all measured time series, and the MDI, which represents the flow between two arbitrary series, must be defined. Consequently, the following expression for the MDI is obtained for the simple case of three interacting signals X, Y, and Z:
I(x_k → y_{k+m} | X_P Y_P Z_P y_k z_k) = (1/2) log [ |R(X_P Y_P Z_P x_k y_k z_k)| |R(X_P Y_P Z_P y_k z_k y_{k+m})| / ( |R(X_P Y_P Z_P y_k z_k)| |R(X_P Y_P Z_P x_k y_k z_k y_{k+m})| ) ].     (10)
Using (10) and (6), the total amount of information, namely S, that is first generated in X and propagated to Y, taking into account the existence of Z, across the time-delay range is

S_{X→Y}: I(x_k → Y_M | X_P Y_P Z_P y_k z_k) = Σ_{m=1}^{M} (1/2) log [ |R(X_P Y_P Z_P x_k y_k z_k)| |R(X_P Y_P Z_P y_k z_k y_{k+m})| / ( |R(X_P Y_P Z_P y_k z_k)| |R(X_P Y_P Z_P x_k y_k z_k y_{k+m})| ) ].     (11)
In the subsequent section, (11) will be used to consolidate the AsI measure by estimating the mutual information shared between the left and right brain hemispheres, exploiting in that way the frontal brain asymmetry concept.
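Equation (11) can be estimated across epochs, as in the sketch below. Estimating each covariance matrix over aligned epochs is our assumption about the estimation procedure, which the text does not spell out; all names are ours.

```python
import numpy as np

def directed_info_sum(X, Y, Z, P, M):
    """Estimate S_{X->Y} of (11): information first generated in X at time k and
    propagated to Y with delays m = 1..M, conditioned on the third series Z.
    X, Y, Z are (n_epochs, P + 1 + M) arrays of aligned epochs; covariance
    matrices are estimated across epochs. Returns bits."""
    XP, YP, ZP = X[:, :P], Y[:, :P], Z[:, :P]
    xk, yk, zk = X[:, P], Y[:, P], Z[:, P]
    past = np.column_stack([XP, YP, ZP])       # X_P Y_P Z_P

    def logdet(cols):
        R = np.cov(np.column_stack(cols), rowvar=False)
        return np.linalg.slogdet(R)[1]

    S = 0.0
    for m in range(1, M + 1):
        ykm = Y[:, P + m]                      # y_{k+m}
        num = logdet([past, xk, yk, zk]) + logdet([past, yk, zk, ykm])
        den = logdet([past, yk, zk]) + logdet([past, xk, yk, zk, ykm])
        S += 0.5 * (num - den) / np.log(2)     # natural log -> bits
    return S
```

With synthetic data in which x_k drives y_{k+1}, S_{X→Y} should clearly exceed S_{Y→X}, which is the asymmetry the AsI exploits.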
C. The Proposed Approach

According to the frontal brain asymmetry concept, the experience of negative emotions is related to increased right frontal and prefrontal hemisphere activity, whereas positive emotions evoke enhanced left-hemisphere activity. Let us assume that an EEG channel recorded from the left hemisphere (Channel 1), another EEG channel from the other hemisphere (Channel 2), and a channel recorded from both hemispheres as a dipole channel (Channel 3) represent the signals X, Y, and Z previously introduced in the MDI analysis, respectively. If we now consider the asymmetry concept, a measure evaluating this asymmetry information in signals X and Y, taking into account the information propagated by signal Z to both of them (and as a result isolating the information shared between the X, Y pair more effectively), would introduce an index of how effectively an emotion has been elicited. Toward this, it is assumed that the total amount of information S [see (11)], hidden in the EEG signals and shared between the left and right hemispheres (signals X and Y, respectively), would become maximum when the subject is calm (information symmetry), whereas S would become minimum when the subject is emotionally aroused (information asymmetry). In order to investigate the aforementioned
assumption, two values were calculated according to the MDI analysis, i.e., the S_r and S_p values. S_r refers to the bidirectional information sharing between X and Y, taking into account Z, when the subject does not feel any emotion, hence, she/he is relaxed, i.e.,

S_r = S_{X→Y}^r + S_{Y→X}^r     (12)

whereas S_p is the same shared information during the period in which she/he is supposed to feel an emotion, i.e.,

S_p = S_{X→Y}^p + S_{Y→X}^p.     (13)

According to what has already been discussed, S_p will presumably be smaller than S_r if the asymmetry concept holds. Finally, in order to directly define a measure of emotion experience, the asymmetry index (AsI) is introduced, defined as the distance of the point (S_p, S_r), corresponding to a specific picture, from the line S_r = S_p, i.e.,

AsI = (S_r - S_p) √2 / 2.     (14)
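Equation (14) is a one-liner; a minimal sketch:

```python
import numpy as np

def asymmetry_index(S_r, S_p):
    """AsI of (14): the signed distance of the point (S_p, S_r) from the line
    S_r = S_p; positive when S_p < S_r, i.e., when information sharing drops
    during picture projection (information asymmetry)."""
    return (S_r - S_p) * np.sqrt(2) / 2
```

Note that the factor √2/2 is exactly the point-to-line distance for the line S_r = S_p, so AsI = 0 means no asymmetry was elicited.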
Later in this paper, AsI will serve as an index of the efficiency of the emotion elicitation of each trial during the corresponding experiment (see the subsequent section for the dataset construction) and will be further evaluated, with regard to its foundation as a metric of emotion arousal, through an extensive classification setup.
III. IMPLEMENTATION ISSUES

A. Dataset Construction

For the construction of the EEG dataset, a specifically designed experiment was conducted through an emotion elicitation process. In this experiment, 16 healthy right-handed volunteers (nine males and seven females) in the age group of 19-32 years participated. The whole experiment was designed to induce emotion within the VAS and, specifically, for four affective states, i.e., LALV (low arousal-low valence), LAHV (low arousal-high valence), HAHV (high arousal-high valence), and HALV (high arousal-low valence).[1] Ten pictures per affective state were selected from the IAPS database according to the arousal and valence values provided by the norms of the IAPS database. In particular, the low (L) grade previously mentioned was considered to be lower than 4 (<4) on a scale of 1-9, and the high (H) grade greater than 6 (>6) on the same scale. Moreover, the standard deviation of these values was required to be lower than 2.2. According to these criteria, 11, 47, 61, and 17 pictures were extracted for the LALV, LAHV, HAHV, and HALV affective states, respectively. From these pictures, ten per affective state were randomly selected and constituted the 40-picture database used in the experiment. Fig. 1(a) shows the location of the finally selected pictures in the VAS.
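The selection criteria above can be sketched as a filter over an IAPS-style norms table. The field names (`valence`, `arousal`, `valence_sd`, `arousal_sd`, `id`) are hypothetical; the real IAPS norms file is formatted differently.

```python
import random

def select_pictures(norms, n_per_state=10, low=4.0, high=6.0, max_sd=2.2, seed=0):
    """Filter IAPS-style norms (list of dicts) into the four affective states
    using the paper's criteria (L < 4, H > 6 on the 1-9 scale, SD < 2.2),
    then randomly sample up to n_per_state pictures per state."""
    rng = random.Random(seed)

    def grade(v):  # 'L', 'H', or None (mid-scale pictures are discarded)
        return 'L' if v < low else ('H' if v > high else None)

    states = {'LALV': [], 'LAHV': [], 'HAHV': [], 'HALV': []}
    for p in norms:
        if p['valence_sd'] >= max_sd or p['arousal_sd'] >= max_sd:
            continue
        a, v = grade(p['arousal']), grade(p['valence'])
        if a and v:
            states[a + 'A' + v + 'V'].append(p['id'])
    return {s: rng.sample(ids, min(n_per_state, len(ids))) for s, ids in states.items()}
```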
Fig. 1. (a) Location of the selected pictures (big black dots) for the experiment's conduction, along with the rest of the pictures of the IAPS database (small black dots), in the arousal-valence space. (b) Schematic representation of the experimental protocol followed. (c) The Fp1, Fp2, F3, and F4 electrode positions in the 10-20 system (marked with black), used for the EEG acquisition.

[1] Valence is commonly associated with positive/negative instead of high/low, but the latter is used here for the sake of simplicity.

The experimental protocol included the series of steps schematically depicted in Fig. 1(b). In particular, before the projection of each picture, an 11-s period took place, consisting sequentially of: 1) a 5-s black-screen period; 2) a 5-s period in which countdown frames (5 down to 1) were shown; and 3) a 1-s projection of a cross shape in the middle of the screen to attract the sight of the subject. The 5-s countdown phase was employed to accomplish a relaxation phase and an emotion reset [11], due to its naught emotional content before the projection of the new picture. During the experiment, the selected pictures were projected (in sequence: 10 for LALV, 10 for LAHV, 10 for HAHV, and 10 for HALV) for 5 s each. After the picture projection, a computerized 20-s self-assessment manikin (SAM) [24] procedure took place. The same 36-s procedure was repeated for each of the 40 pictures. The sequence of the projection of the affective states was chosen in order to show emotionally
intensive pictures (i.e., pictures with a high arousal score) at the end of the experiment, as a picture that is supposed to evoke an intense emotion would emotionally cover up a subsequent one characterized by a lower arousal value. In other words, an intensive emotion would dominate over a milder one. Thus, the HAHV and HALV picture sets were shown after the LALV and LAHV ones.

The EEG signals from each subject were recorded during the whole projection phase. The EEG signals were acquired from the Fp1, Fp2, F3, and F4 positions, according to the 10-20 system [see Fig. 1(c)], related to the emotion expression in the brain based on the asymmetry concept. The Fp1 and Fp2 positions were recorded as monopole channels (Channels 1 and 2, respectively, according to the description in the aforementioned section), whereas the F3 and F4 positions were recorded as a dipole (Channel 3), resulting in a three-channel EEG set. Thus, the shared information between the Fp1 and Fp2 channels (see Section II.B) is measured as effectively as possible, by taking into account the respective information propagated by the F3/F4 dipole to both of them. The ground and reference electrodes were placed at the right (A2) and left (A1) earlobes, respectively [Fig. 1(c)].
After the acquisition part, the EEG signals were subjected to band-pass Butterworth filtering to retain only the frequencies within the alpha (8-12 Hz) and beta (13-30 Hz) bands. Thus, the mutual relation regarding prefrontal cortical activation or inactivation (see Section II.A) is exploited, and the elimination of superimposed artifacts from various sources is accomplished. As has been reported [25]-[27], such artifacts are effectively eliminated by appropriate band-pass filtering. Specifically, the influence of eye movement/blinking is most dominant below 4 Hz, heart functioning causes artifacts around 1.2 Hz, whereas muscle artifacts affect the EEG spectrum above 30 Hz. Nonphysiological artifacts caused by power lines are clearly above 30 Hz (50-60 Hz). Consequently, by extracting only the alpha and beta frequency bands from the acquired EEG recordings, most of the noise influence is circumvented. Afterwards, the EEG signals were segmented into 5-s segments corresponding to the duration of each picture projection. Finally, the EEG signals referring to the countdown phase were also cut and filtered, eliminating any undesired artifacts, as they were intended to be used as the ground truth for the evaluation of the emotion elicitation. It must be stressed that the S_p and S_r values were calculated for the signals that correspond to the projection of the IAPS picture and to the countdown phase that took place immediately before the projection of that picture, respectively.
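The filtering and segmentation steps can be sketched with SciPy as follows. The filter order is an assumption (the paper does not report it), and the 8-30 Hz passband is our reading of "alpha plus beta".

```python
import numpy as np
from scipy.signal import butter, filtfilt

def alpha_beta_filter(eeg, fs, order=4):
    """Zero-phase Butterworth band-pass to 8-30 Hz (alpha 8-12 Hz plus beta
    13-30 Hz), suppressing eye/heart artifacts (< 4 Hz) and muscle/power-line
    artifacts (> 30 Hz)."""
    b, a = butter(order, [8.0, 30.0], btype='bandpass', fs=fs)
    return filtfilt(b, a, eeg)

def segment_projections(eeg, fs, onsets_s, dur_s=5.0):
    """Cut one dur_s-second segment per picture projection, given onsets in seconds."""
    n = int(dur_s * fs)
    return [eeg[int(t * fs): int(t * fs) + n] for t in onsets_s]
```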
EEG recordings were conducted using the g.MOBIlab (g.tec medical and electrical engineering, Guger Technologies, www.gtec.at) portable biosignal acquisition system [four EEG bipolar channels; passive electrodes (i.e., with no inbuilt circuitry); filters: 0.5-30 Hz; sensitivity: 100 μV; data acquisition: A/D converter with 16-bit resolution and a sampling frequency of 256 Hz; data transfer: wireless, Bluetooth Class I technology].
B. Classification Setup

To better evaluate the efficiency of the AsI in serving as a metric of the reliability of an emotion elicitation procedure, a thorough classification process was designed. In particular, two feature-vector extraction techniques, one classifier, and six different classification scenarios were employed.

The feature-vector set was constructed based on two methods, i.e., higher order crossings (HOC) [28] and cross-correlation (CC) [29]. HOC as a technique for feature-vector extraction was proposed by the authors [28], resulting in good classification results. The CC method was chosen here for further evaluation of both the HOC scheme and the AsI, as mental and emotional brain activities strongly depend on synaptic connections; hence, the state of mind is partly manifested in the spatial correlation of EEGs. To this end, cross-correlation coefficients of EEGs associated with emotional states are used to form the feature vector of the CC approach [29]. A brief description of the implementation issues of the two feature-vector extraction methods is presented in the Appendix.

As there is no single best, one-size-fits-all classification algorithm, the choice of the most efficient classifier strongly depends on the examined problem and the relevant dataset to be classified [30]. After having tested several classifiers, such as quadratic discriminant analysis (QDA) [31], Mahalanobis distance (MD) [32], k-nearest neighbor (k-NN) [33], and support vector machines (SVM) [34], we chose the SVM, in particular with a fifth-order polynomial kernel, which yielded the highest recognition accuracy in our case.
According to the AsI concept, emotion elicitation trials, i.e., corresponding (S_p, S_r) pairs, with big AsI values are supposed to elicit the respective emotional state more effectively. On the other hand, trials with lower AsI values should exhibit the opposite result. Toward the investigation of this assumption, the AsI values were normalized in the range [0, 1], and the emotion elicitation trials that correspond to an AsI larger than 0.5 (AsI > 0.5) formed the big-AsI group. The others, with an AsI lower than 0.1 (AsI < 0.1), formed the corresponding small-AsI group. The threshold for the construction of the small-AsI group was chosen to result in a group with a number of signals per affective state comparable to that of the big-AsI group. In particular, the big-AsI group consisted of 31, 29, 21, and 15 signals (gathered from approximately all subjects) for the LALV, LAHV, HAHV, and HALV affective states, respectively. On the other hand, the small-AsI group incorporated 34, 35, 35, and 37 signals for the same affective states. Afterwards, a classification procedure was implemented for both the big- and small-AsI groups under six three-class classification scenarios, i.e., 1) S1: class 1: LA, class 2: HA, class 3: the respective relax signals; 2) S2: class 1: LV, class 2: HV, class 3: the respective relax signals; 3) S3: class 1: LALV, class 2: HALV, class 3: the respective relax signals; 4) S4: class 1: LAHV, class 2: HAHV, class 3: the respective relax signals; 5) S5: class 1: LALV, class 2: LAHV, class 3: the respective relax signals; 6) S6: class 1: HALV, class 2: HAHV, class 3: the respective relax signals.
Fig. 2. Overall system diagram presenting the collection and processing of the EEG data.
These classification scenarios were chosen in order to: 1) emphasize the discrimination between the adverse expressions of the valence and arousal coordinates in the VAS and the relax state and 2) create different classification setups for testing the consistency of the AsI in performing efficiently in every one of them. For each classification scenario, 70% of the signals were used as the training set, whereas the remaining 30% served as the testing set. The cross-validation technique was adopted for a better evaluation of the classification performance, resulting in a 20-iteration classification process. The mean classification rate C̄ over the 20 iterations was finally extracted. The same classification procedure was also conducted for the whole dataset, without the grouping according to the AsI value. Furthermore, until now, the categorization of a signal to an affective state, corresponding to a specific picture, was done according to the norms of the IAPS database. A final classification setup was also conducted for the whole dataset, but with the categorization following the self-assessment of each picture, as described in Section III.A.
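The classification setup can be sketched with scikit-learn. This is a stand-in for the paper's implementation: the HOC/CC feature extraction is assumed to have been performed already, and the repeated random 70/30 resampling is our reading of the paper's 20-iteration cross-validation.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def mean_classification_rate(features, labels, n_iter=20, seed=0):
    """Mean accuracy over n_iter random 70/30 train/test splits, using an SVM
    with a fifth-order polynomial kernel as in the paper's setup."""
    rates = []
    for i in range(n_iter):
        Xtr, Xte, ytr, yte = train_test_split(
            features, labels, test_size=0.3, random_state=seed + i, stratify=labels)
        clf = SVC(kernel='poly', degree=5).fit(Xtr, ytr)
        rates.append(clf.score(Xte, yte))
    return float(np.mean(rates))
```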
All the above processes were conducted for both the HOC and CC feature extraction methods (see the Appendix). A system diagram defining each one of the implementation steps is depicted in Fig. 2.
IV. RESULTS

The efficiency of AsI as a metric of the reliability of the emotion elicitation procedure is evaluated through an extensive classification performance analysis. Before that, in order to investigate whether the assumption made in Section II.C, i.e., that S_p should generally be smaller than S_r, actually holds, the S_r values were plotted versus the S_p ones. Fig. 3 shows the aforementioned plot for each one of the emotion elicitation trials, for all subjects and all affective states. Indeed, the assumption seems to hold for a great proportion of the (S_p, S_r) pairs.
Fig. 4 shows the results of the classification procedure. Black solid and black dashed lines represent the corresponding C̄ values of the big- and small-AsI groups, respectively, whereas the black dotted and black dashed-dotted lines correspond to the whole dataset categorized by norms and by subjective criteria, respectively. The circle (o) and triangle (Δ) marks correspond to the HOC and CC methods, respectively. It is obvious that signals within
Fig. 3. S_r versus S_p values for the four affective states of IAPS-based emotion elicitation.
the big-AsI group exhibit profoundly higher C̄ values compared with the C̄ of the small-AsI group and of the whole dataset. In particular, the C̄ values for the big-AsI group are {61.19, 55.24, 58.16, 54.69, 62.58, 58.55}% for the HOC feature extraction method and {43.71, 48.66, 49.95, 47.72, 50.91, 51.12}% for the CC method, corresponding to the S1, S2, S3, S4, S5, and S6 classification scenarios, respectively, also confirming the superiority of the HOC method over the CC one. It should be noted that we also unified the frequency bands in (23) (see the Appendix), assuming nonorthogonality of the frequency components, and calculated the classification performance of the CC approach. This, however, resulted in reduced mean classification accuracy (6.46% ± 5.7%), when compared with the one derived from the adoption of separate frequency bands.
Moreover, in order to compare the classification results of the big-AsI group with other groups consisting of the same number of signals per affective state, 50 groups were randomly extracted and subjected to the abovementioned classification process. Fig. 5 shows the results of this procedure. The black dotted line depicts the median of the C̄ values
Fig. 4. Classification rates, C̄, for the HOC and CC methods for the big and small AsI groups, and for the norm- and subjective-based categorization of EEG signals.
Fig. 5. Classification rates, C̄, for the HOC and CC methods for the big and small AsI groups, and the median C̄ derived from all 50 randomly created groups with the same number of signals as the big AsI groups.
extracted for the 50 randomly constructed groups for each classification scenario. Once again, the superiority of the classification performance in the big AsI group is evident, leading to the conclusion that the sets of signals constituting the big AsI group represent the most effective emotion elicitation trials.
From all the aforementioned analyses, it is clear that the results discussed so far refer to the user-independent case. In an attempt to evaluate the efficiency of the AsI in a user-dependent concept, the classification setup previously discussed was also implemented for each of the 16 subjects. Fig. 6 depicts the corresponding results. The black dotted line represents the mean normalized AsI of each subject, extracted from all affective states and plotted in descending order. The black solid and black dashed lines present the mean classification rate, C̄,
Fig. 6. Mean classification rates, C̄, for the HOC and CC methods, for all subjects. The black dotted line depicts the mean AsI in descending order.
TABLE I. MEAN CLASSIFICATION RATES (%) FOR SUBJECTS 4 AND 13 FOR ALL CLASSIFICATION SCENARIOS.
(normalized to the [0, 1] range), i.e., the mean of the C̄ values across the six classification scenarios S1–S6, for the HOC and CC methods, respectively. The important conclusion from this figure is that the HOC and CC lines incline in the same way as the AsI line: a subject with a small mean AsI tends to exhibit poorer classification performance than one with higher mean AsI values. It must be noted that the HOC, CC, and mean AsI lines correspond to different quantities, i.e., C̄ and AsI, respectively. Furthermore, Fig. 6 shows that Subjects 4 and 13 exhibit the best and the worst classification performance, respectively, as far as the HOC method is concerned. The C̄ values for each of the six classification scenarios for these two subjects are tabulated in Table I. In this way, not only can single trials be evaluated for their ability to evoke emotions, but individual subjects can also be characterized according to how easily they are emotionally aroused by emotion elicitation procedures.
It should be stressed that the HOC method outperformed the CC one in all classification setups, reaching 62.58% in the S5 classification scenario (see Fig. 4) in the user-independent setup, while in the user-dependent concept it also exhibited the maximum mean classification rate across all classification scenarios, 86.92% (Subject 4). In particular, in the S3 classification scenario (three classes), the HOC method reached 94.4%. On the other hand, the CC method resulted in 51.12% (S6), 51.62% (Subject 1), and 59.6% (Subject 1, S3) in the respective classification cases. All of the above
Fig. 7. Classification rates, C̄, for the HOC method for the big groups of the AsI and DI indexes.
confirm the superiority of the HOC approach over the CC one.
Finally, in order to compare the efficiency of the AsI with that of the DI, the corresponding big group according to the DI was constructed, taking its absolute values into consideration. The threshold for categorization into the big group was DI ≥ 0.5, in accordance with the definition of the big AsI-based group (see Section III-B). Subsequently, the selected trials were classified using the more efficient HOC-based feature-vector extraction method. The corresponding results are depicted in Fig. 7, from which it is obvious that the AsI-based big group provides higher C̄ rates for all classification scenarios, i.e., (S1–S6): {61.19, 55.24, 58.16, 54.69, 62.58, 58.55}%, compared with the corresponding rates of the DI-based one, i.e., (S1–S6): {49.90, 54.89, 55.27, 40.40, 53.20, 43.41}%.
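The index-thresholded evaluation described above (keep only trials whose index exceeds a threshold, then classify the survivors) can be sketched as follows. The feature vectors, labels, index values, and 0.5 threshold here are synthetic stand-ins; in the paper the features are HOC/CC vectors, the index is the AsI or |DI|, and an SVM is the classifier, so this is only a minimal illustration of the pipeline, not the authors' implementation.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-ins: per-trial feature vectors, binary emotion labels,
# and an elicitation-quality index (AsI- or DI-like) in [0, 1].
features = rng.normal(size=(120, 6))
labels = rng.integers(0, 2, size=120)
index = rng.random(120)

def big_group_accuracy(features, labels, index, threshold=0.5):
    """Keep only trials whose elicitation index reaches the threshold,
    then score an SVM on the retained subset via cross-validation."""
    keep = index >= threshold
    scores = cross_val_score(SVC(kernel="rbf"), features[keep], labels[keep], cv=5)
    return scores.mean()

print(f"mean accuracy on the 'big' group: {big_group_accuracy(features, labels, index):.3f}")
```

With random features the score hovers around chance; with real HOC or CC features, the gap between the thresholded ("big") group and the full dataset is the quantity of interest.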
V. DISCUSSION
The presented results have shown that the AsI can serve as an indication of how well an emotion has been evoked during an emotion elicitation process. First, Fig. 3 shows that the assumption that S_r is larger than S_p holds for the great majority of the emotion elicitation trials. Excluding the remaining trials with the opposite behavior (i.e., low AsI values) resulted in better classification rates compared with the whole dataset (see Fig. 4) or any random group with the same number of signals as the big AsI group (see Fig. 5). It is noteworthy, regarding the construction of the big AsI group, that the (S_p, S_r) pairs corresponding to the HALV state exhibited, on average, lower AsI values, which also resulted in the selection of only 15 signals from this affective state (see the previous section). This might be due to the intensity of the emotions that the pictures in this category elicit. During the projection of the pictures in this category, many subjects reported that some of the pictures produced such negative emotional charges that they kept thinking about them as the procedure proceeded. As a result, the negative emotions experienced during the relaxation (countdown) phase would reduce the symmetry during its realization; hence, the S_r values would tend to be almost equal to the S_p ones. It must also be stressed that all S_r and S_p values are normalized to the range [0, 1] by the maximum value found across both value sets. That maximum was an S_r value from the LALV set. This fact confirms the previous remark, as it implies that during the relaxation (countdown) phases of the projection period of the LALV pictures the maximum symmetry was
accomplished. Furthermore, obtaining the maximum S_p values for the affective states after normalization (see Fig. 3) yields {0.124, 0.106, 0.093, 0.087} for LALV, LAHV, HAHV, and HALV, respectively. This is a further clue that during the projection of pictures with high arousal scores, lower S_p values are obtained, implying a smaller amount of shared information between the left and right hemispheres, i.e., a more intense asymmetry.
To the authors' knowledge, no other attempt to evaluate emotion elicitation reliability has been reported in the literature so far. In [9], the final selection of the emotional stimuli was based on the assessment of a set of emotional films by subjects who did not participate in the corresponding final data-collection experiment. This approach raises various issues, as different emotional stimuli are apprehended differently by each individual. Furthermore, Fig. 4 shows that even categorizing the signals according to the emotional self-assessment of the subjects who participated in the experiments leads to low classification performance, perhaps due to a subconscious evaluation of the emotional stimulus that is not later reflected in the self-assessment procedure. The latter may be a corollary of the inability of self-assessment methods (in this work, the SAM method) to effectively convey the meaning of such an assessment. Thus, a more objective approach, such as the one introduced in this paper, may overcome these issues by evaluating the emotional impact on subjects using only the information transferred from the origin of the emotion genesis. Moreover, the comparison of the AsI with the DI revealed the ability of the former to better single out the trials of the emotion elicitation procedure that probably convey emotion information. A possible reason for this is that the DI does not take into consideration a ground truth regarding how the absence of asymmetry is reflected in the EEG signals of each subject. Due to the definition of the AsI (see Section II-C), the final measure of the efficiency of each emotion elicitation trial is based on the nonasymmetry detected immediately before that trial. This implies a more robust evaluation of how effectively an emotion is elicited, as the preceding emotional state of the subject is taken into account.
In this work, the MDI method was employed to estimate the amount of information flow between the two hemispheres of the brain. The total amount of information measured [see (11)] was estimated over the whole 5-s signal corresponding to the projection of the IAPS picture. It would be interesting to investigate whether the brain asymmetry described in Section II-A is exhibited throughout the whole 5-s period or only within a shorter time interval inside the signal. Such an investigation would lead to a further purge of the dataset, not only at the subject level (see Fig. 6) or the trial level, but also at the time-segment level within each emotion elicitation trial. This should result in a more efficient isolation of the emotional information in EEG signals and, thus, in better classification performance during an EEG-ER task.
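The time-segment-level analysis suggested above could start from a simple sliding-window segmentation of each 5-s trial, with an asymmetry measure then estimated per window instead of over the whole trial. The window length, step, and sampling rate below are hypothetical choices for illustration, not values from the paper.

```python
import numpy as np

def sliding_windows(signal, fs, win_s=1.0, step_s=0.5):
    """Yield (start_time, segment) pairs of overlapping windows of a
    single-channel EEG trace, so that an asymmetry measure could be
    estimated per segment rather than over the whole trial."""
    win, step = int(win_s * fs), int(step_s * fs)
    for start in range(0, len(signal) - win + 1, step):
        yield start / fs, signal[start:start + win]

fs = 128                                  # assumed sampling rate
eeg = np.random.randn(5 * fs)             # stand-in for one 5-s trial
segments = list(sliding_windows(eeg, fs))
print(len(segments), "segments")          # 1-s windows, 0.5-s step -> 9
```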
VI. CONCLUSION
A novel approach to evaluate the effectiveness of an emotion elicitation trial was presented in this paper. The frontal EEG asymmetry, along with the MDI approach, led to the definition of the AsI, which was used to discard the signals with less emotional information and thus perform more reliable EEG-ER processes. The classification-based evaluation of the efficacy of the AsI yielded promising results, which pave the way for more robust emotion elicitation processes. In particular, signals with higher AsI values seem to demonstrate better classification behavior, leading to the conclusion that they are imbued with a greater volume of emotional information than those with lower AsI values. Moreover, apart from selecting the most effective EEG signals, the AsI also offers the potential to discard individual subjects that do not seem to react emotionally to the respective stimuli. In addition, the AsI appears to be quite an efficient evaluation criterion for an emotion elicitation process, compared to the known DI measure. However, despite the potential of the presented results, the reliability of the AsI must be further justified by applying the proposed approach to larger-scale experiments.
APPENDIX
Consider a finite zero-mean series $\{Z_t\}$, $t = 1, \ldots, N$, oscillating about level zero. Let $\nabla$ be the backward difference operator defined by
$$\nabla Z_t \equiv Z_t - Z_{t-1}. \qquad (15)$$
If we define the following sequence of filters:
$$\mathcal{F}_k \equiv \nabla^{k-1}, \quad k = 1, 2, 3, \ldots, \qquad (16)$$
with $\mathcal{F}_1 \equiv \nabla^0$ being the identity filter, we can estimate the corresponding HOC, namely simple HOC [35], by
$$D_k = \mathrm{NZC}\{\mathcal{F}_k(Z_t)\}, \quad k = 1, 2, 3, \ldots;\ t = 1, \ldots, N \qquad (17)$$
where $\mathrm{NZC}\{\cdot\}$ denotes the estimation of the number of zero-crossings and
$$\nabla^{k-1} Z_t = \sum_{j=1}^{k} \binom{k-1}{j-1} (-1)^{j-1} Z_{t-j+1}. \qquad (18)$$
In practice, we only have finite time series and lose an observation with each difference. Hence, to avoid this effect, we must index the data by moving to the right, i.e., for the evaluation of the $k$-th simple HOC, the index $t = 1$ should be given to the $k$-th or a later observation. For the estimation of the number of zero-crossings in (17), a binary time series $X_t(k)$ is initially constructed, given by
$$X_t(k) = \begin{cases} 1, & \text{if } \mathcal{F}_k(Z_t) \ge 0 \\ 0, & \text{if } \mathcal{F}_k(Z_t) < 0 \end{cases}, \quad k = 1, 2, 3, \ldots;\ t = 1, \ldots, N \qquad (19)$$
and the desired simple HOC are then estimated by counting symbol changes in $X_1(k), \ldots, X_N(k)$, i.e.,
$$D_k = \sum_{t=2}^{N} \left[X_t(k) - X_{t-1}(k)\right]^2. \qquad (20)$$
In finite data records it holds that $D_{k+1} \ge D_k$, $k \ge 1$ [35].
The HOC were used to construct the feature vector $\mathrm{FV}_{\mathrm{HOC}}$, formed as follows:
$$\mathrm{FV}_{\mathrm{HOC}} = [D_1, D_2, \ldots, D_L], \quad 1 < L \le J \qquad (21)$$
where $J$ denotes the maximum order of the estimated HOC and $L$ the order up to which they were used to form $\mathrm{FV}_{\mathrm{HOC}}$. Due to the three-channel EEG recording setup, the final $\mathrm{FV}^{c}_{\mathrm{HOC}}$, where $c$ stands for "combined," was structured as
$$\mathrm{FV}^{c}_{\mathrm{HOC}} = [D^{i}_{1}, D^{i}_{2}, \ldots, D^{i}_{L}, D^{j}_{1}, D^{j}_{2}, \ldots, D^{j}_{L}, D^{s}_{1}, D^{s}_{2}, \ldots, D^{s}_{L}], \quad i = 1;\ j = 2;\ s = 3 \qquad (22)$$
where $i$, $j$, $s$ denote the channel numbers that participate in the combination.
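A minimal sketch of the simple-HOC feature extraction of (15)-(22) might look as follows. The order L = 5, the signal length, and the random stand-in data are illustrative assumptions, not the paper's settings; the re-indexing trims every difference order to the same number of samples, as described above.

```python
import numpy as np

def simple_hoc(z, L):
    """Simple higher order crossings D_1..D_L of a zero-mean series z:
    D_k counts the zero-crossings of the (k-1)-th backward difference.
    All orders are trimmed to the last N - L + 1 samples so no D_k is
    biased by the observation lost at each differencing step."""
    z = np.asarray(z, dtype=float) - np.mean(z)
    d = []
    for k in range(1, L + 1):
        diffed = np.diff(z, n=k - 1)[-(len(z) - L + 1):]
        x = (diffed >= 0).astype(int)               # binary series, eq. (19)
        d.append(int(np.sum(np.abs(np.diff(x)))))   # symbol changes, eq. (20)
    return d

def fv_hoc_combined(channels, L):
    """Concatenate per-channel HOC vectors into FV^c_HOC, eq. (22)."""
    return [dk for ch in channels for dk in simple_hoc(ch, L)]

rng = np.random.default_rng(1)
three_ch = rng.standard_normal((3, 640))   # stand-in 3-channel, 5-s EEG
print(fv_hoc_combined(three_ch, L=5))      # 3 * 5 = 15 features
```

For |X_t - X_{t-1}| on a binary series, the absolute value equals the square in (20), so the count is unchanged.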
The CC method introduced in [29] estimates the cross-correlation coefficient $c(\omega; ij)$ between the potentials of EEG electrodes $i$ and $j$ for the frequency band $\omega$, i.e.,
$$c(\omega; ij) = \frac{\sum_{n} X_i(\omega_n) X_j^{*}(\omega_n)}{\sqrt{\sum_{n} |X_i(\omega_n)|^2 \sum_{n} |X_j(\omega_n)|^2}} \qquad (23)$$
where $X_i(\omega_n)$ is the Fourier transform of the EEG at the $i$-th electrode site and the $n$-th frequency bin, and the summations run over the frequency bins in the band $\omega$. In this paper, the cross-correlation coefficient was estimated for both the alpha and beta bands, resulting in a six-feature vector, i.e., three features per band, as a three-channel EEG recording setup was used. The final feature vector was named $\mathrm{FV}^{c}_{\mathrm{CC}}$.
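Equation (23) can be sketched as below. The alpha/beta band edges (8-13 Hz, 13-30 Hz), the sampling rate, and the use of the magnitude of the complex numerator (so the feature is real-valued) are assumptions made for illustration, not details taken from [29].

```python
import numpy as np

def cc_features(eeg, fs, bands=((8.0, 13.0), (13.0, 30.0))):
    """Band-wise spectral cross-correlation coefficients c(band; ij) for
    every electrode pair, in the spirit of eq. (23). With 3 channels and
    2 bands this yields 6 features, as in FV^c_CC."""
    n_ch, n = eeg.shape
    spectra = np.fft.rfft(eeg, axis=1)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    feats = []
    for lo, hi in bands:
        sel = (freqs >= lo) & (freqs < hi)           # bins inside the band
        for i in range(n_ch):
            for j in range(i + 1, n_ch):
                # Magnitude of the cross-spectral sum, normalized by the
                # per-channel band powers (Cauchy-Schwarz keeps it in [0, 1]).
                num = np.abs(np.sum(spectra[i, sel] * np.conj(spectra[j, sel])))
                den = np.sqrt(np.sum(np.abs(spectra[i, sel]) ** 2) *
                              np.sum(np.abs(spectra[j, sel]) ** 2))
                feats.append(num / den)
    return np.array(feats)

fs = 128
eeg = np.random.default_rng(3).standard_normal((3, 5 * fs))
print(cc_features(eeg, fs))   # six values in [0, 1]
```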
ACKNOWLEDGMENT
The authors would like to thank all 16 subjects who participated in the experiment for their patience during the tedious EEG recording phase.
REFERENCES
[1] P. Rani, N. Sarkar, C. A. Smith, and J. A. Adams, "Affective communication for implicit human-machine interaction," in Proc. IEEE Int. Conf. Syst., Man, Cybern., 2003, pp. 4896-4903.
[2] I. Cohen, A. Garg, and T. S. Huang, "Emotion recognition from facial expressions using multilevel HMM," presented at the Neural Inf. Process. Syst. Workshop on Affective Computing, Colorado, 2000. [Online]. Available: http://www.ifp.uiuc.edu/ashutosh/papers/NIPS_emotion.pdf
[3] F. Bourel, C. C. Chibelushi, and A. A. Low, "Robust facial expression recognition using a state-based model of spatially-localized facial dynamics," in Proc. Fifth IEEE Int. Conf. Automatic Face Gesture Recog., 2002, pp. 106-111.
[4] B. Schuller, S. Reiter, R. Mueller, M. Al-Hames, and G. Rigoll, "Speaker independent speech emotion recognition by ensemble classification," in Proc. 6th Int. Conf. Multimedia Expo., 2005, pp. 864-867.
[5] F. Yu, E. Chang, Y. Q. Xu, and H. Y. Shum, "Emotion detection from speech to enrich multimedia content," in Proc. IEEE Pacific Rim Conf. Multimedia, 2001, pp. 550-557.
[6] R. W. Picard, E. Vyzas, and J. Healey, "Toward machine emotional intelligence: Analysis of affective physiological state," IEEE Trans. Pattern Anal. Mach. Intell., vol. 23, no. 10, pp. 1175-1191, Oct. 2001.
[7] F. Nasoz, C. L. Lisetti, K. Alvarez, and N. Finkelstein, "Emotion recognition from physiological signals for user modeling of affect," in Proc. 9th Int. Conf. User Model., Pittsburgh, PA, Jun. 22-26, 2003. [Online]. Available: http://www.eurecom.fr/util/publidownload.en.htm?id=1806
[8] C. Busso, Z. Deng, S. Yildirim, M. Bulut, C. M. Lee, A. Kazemzadeh, S. Lee, U. Neumann, and S. Narayanan, "Analysis of emotion recognition using facial expressions, speech and multimodal information," in Proc. 6th Int. Conf. Multimodal Interfaces, 2004, pp. 205-211.
[9] K. Takahashi, "Remarks on emotion recognition from bio-potential signals," in Proc. 2nd Int. Conf. Autonomous Robots and Agents, 2004, pp. 186-191.
[10] A. Heraz and C. Frasson, "Predicting the three major dimensions of the learner's emotions from brainwaves," Int. J. Comput. Sci., vol. 2, no. 3, pp. 187-193, 2008.
[11] D. O. Bos, "EEG-based emotion recognition: The influence of visual and auditory stimuli," 2006. [Online]. Available: http://hmi.ewi.utwente.nl/verslagen/capita-selecta/CS-Oude_Bos-Danny.pdf
[12] R. J. Davidson, G. E. Schwartz, C. Saron, J. Bennett, and D. J. Goleman, "Frontal versus parietal EEG asymmetry during positive and negative affect," Psychophysiology, vol. 16, pp. 202-203, 1979.
[13] O. Sakata, T. Shiina, and Y. Saito, "Multidimensional directed information and its application," Electron. Commun. Jpn., vol. 4, pp. 385, 2002.
[14] P. J. Lang, M. M. Bradley, and B. N. Cuthbert, "International affective picture system (IAPS): Affective ratings of pictures and instruction manual," Univ. Florida, Gainesville, FL, Tech. Rep. A-8, 2008.
[15] R. J. Davidson, P. Ekman, C. D. Saron, J. A. Senulis, and W. V. Friesen, "Approach-withdrawal and cerebral asymmetry: Emotional expression and brain physiology," J. Personality Soc. Psychol., vol. 58, pp. 330-341, 1990.
[16] H. Jasper, "The ten-twenty electrode system of the international federation," Electroencephalogr. Clin. Neurophysiol., vol. 39, pp. 371-375, 1958.
[17] W. J. H. Nauta, "The problem of the frontal lobe: A reinterpretation," J. Psychiatric Res., vol. 8, pp. 167-187, 1971.
[18] R. J. Davidson, "What does the prefrontal cortex do in affect: Perspectives on frontal EEG asymmetry research," Biol. Psychol., vol. 67, pp. 219-233, 2004.
[19] T. Kamitake, H. Harashima, and H. Miyakawa, "Time series analysis based on directed information," Trans. Inst. Electron. Inf. Commun. Eng., vol. 67-A, pp. 103-110, 1984.
[20] G. Wang and M. Takigawa, "Directed coherence as a measure of interhemispheric correlation of EEG," Int. J. Psychophysiol., vol. 13, pp. 119-128, 1992.
[21] M. Kaminski and K. J. Blinowska, "A new method of the description of the information flow in the brain structures," Biol. Cybern., vol. 65, pp. 203-210, 1991.
[22] L. A. Baccala and K. Sameshima, "Partial directed coherence: A new concept in neural structure determination," Biol. Cybern., vol. 84, pp. 463-474, 2001.
[23] W. Hesse, E. Moller, M. Arnold, and B. Schack, "The use of time-variant EEG Granger causality for inspecting directed interdependencies of neural assemblies," J. Neurosci. Methods, vol. 124, pp. 27-44, 2003.
[24] J. D. Morris, "Observations: SAM: The self-assessment manikin. An efficient cross-cultural measurement of emotional response," J. Advertising Res., vol. 35, pp. 63-68, 1995.
[25] K. Coburn and M. Moreno, "Facts and artifacts in brain electrical activity mapping," Brain Topography, vol. 1, pp. 37-45, 1988.
[26] D. O. Olguin, "Adaptive digital filtering algorithms for the elimination of power line interference in electroencephalographic signals," Master's thesis, Instituto Tecnologico y de Estudios Superiores de Monterrey, Monterrey, Mexico, 2005.
[27] M. Fatourechi, A. Bashashati, R. K. Ward, and G. E. Birch, "EMG and EOG artifacts in brain computer interface systems: A survey," Clin. Neurophysiol., vol. 118, pp. 480-494, 2007.
[28] P. C. Petrantonakis and L. J. Hadjileontiadis, "Emotion recognition from EEG using higher order crossings," IEEE Trans. Inf. Technol. Biomed., vol. 14, no. 2, pp. 186-197, Mar. 2010.
[29] T. Musha, Y. Terasaki, H. A. Haque, and G. A. Ivamitsky, "Feature extraction from EEGs associated with emotions," Artif. Life Robot., vol. 1, no. 1, pp. 15-19, 1997.
[30] R. D. King, C. Feng, and A. Shutherland, "StatLog: Comparison of classification algorithms on large real-world problems," Appl. Artif. Intell., vol. 9, no. 3, pp. 259-287, 1995.
[31] W. J. Krzanowski, Principles of Multivariate Analysis. Oxford, U.K.: Oxford Univ. Press, 1988.
[32] P. C. Mahalanobis, "On the generalized distance in statistics," Proc. Natl. Inst. Sci. India, vol. 2, pp. 49-55, 1936.
[33] T. Mitchell, Machine Learning. New York, NY: McGraw-Hill, 1997.
[34] N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge, U.K.: Cambridge Univ. Press, 2000.
[35] B. Kedem, Time Series Analysis by Higher Order Crossings. Piscataway, NJ: IEEE Press, 1994.
Panagiotis C. Petrantonakis (S'08) was born in Ierapetra, Crete, Greece, in 1984. He received the Diploma degree in electrical and computer engineering from the Aristotle University of Thessaloniki (AUTH), Thessaloniki, Greece, in 2007, where he is currently working toward the Ph.D. degree at the Signal Processing and Biomedical Technology Unit of the Telecommunications Laboratory.
His current research interests include advanced signal processing techniques, nonlinear transforms, and affective computing.
Mr. Petrantonakis is a member of the Technical Chamber of Greece.
Leontios J. Hadjileontiadis (S'87-M'98-SM'11) was born in Kastoria, Greece, in 1966. He received the Diploma degree in electrical engineering and the Ph.D. degree in electrical and computer engineering, both from the Aristotle University of Thessaloniki, Thessaloniki, Greece, in 1989 and 1997, respectively. He also received the Ph.D. degree in music composition from the University of York, York, U.K., in 2004.
Since December 1999, he has been with the Department of Electrical and Computer Engineering, Aristotle University of Thessaloniki, as a faculty member, where he is currently an Associate Professor, working on lung sounds, heart sounds, bowel sounds, ECG data compression, seismic data analysis, and crack detection in the Signal Processing and Biomedical Technology Unit of the Telecommunications Laboratory. He is also currently a Professor in composition at the State Conservatory of Thessaloniki, Thessaloniki, Greece. His research interests include higher-order statistics, alpha-stable distributions, higher-order zero crossings, wavelets, polyspectra, fractals, and neuro-fuzzy modeling for medical, mobile, and digital signal processing applications.
Dr. Hadjileontiadis is a member of the Technical Chamber of Greece, the Higher-Order Statistics Society, the International Lung Sounds Association, and the American College of Chest Physicians. He received the second award at the Best Paper Competition of the 9th Panhellenic Medical Conference on Thorax Diseases, Thessaloniki, in 1997. He was also an open finalist at the Student Paper Competition (Whitaker Foundation) of the IEEE EMBS, Chicago, IL, in 1997; a finalist at the Student Paper Competition (in memory of Dick Poortvliet) of MEDICON'98, Lemesos, Cyprus; and the recipient of the Young Scientist Award of the 24th International Lung Sounds Conference '99, Marburg, Germany. In 2004, 2005, and 2007, he organized and served as a mentor to three five-student teams that ranked third, second, and seventh worldwide, respectively, at the Imagine Cup Competition (Microsoft) in Sao Paulo, Brazil (2004), Yokohama, Japan (2005), and Seoul, Korea (2007), with projects involving technology-based solutions for people with disabilities.