
A self-paced BCI system to control an electric wheelchair: evaluation of a commercial, low-cost EEG device

Francesco Carrino (1,2), Joel Dumoulin (1), Elena Mugellini (1), Omar Abou Khaled (1), Rolf Ingold (2)
(1) University of Applied Sciences Western Switzerland, Fribourg, Switzerland
(2) University of Fribourg, Fribourg, Switzerland
{francesco.carrino, joel.dumoulin, elena.mugellini, omar.aboukhaled}@hefr.ch, rolf.ingold@unifr.ch
Abstract—Electrical cerebral activity has already been used in several applications aiming at improving the daily life of people with strong motor disabilities. In particular, electroencephalogram (EEG) signals have been used to provide new ways of communication and control. However, this kind of technology presents some important drawbacks, such as its price and the difficulty of preparing the system without an expert's support. This work intends to build a user-friendly, self-paced Brain-Computer Interface (BCI) system that allows using commercial EEG headsets to drive an electric wheelchair with a motor imagery approach. Furthermore, the conceived system has been used for a first evaluation of a commercial, low-cost EEG device compared with data coming from a professional device. The results show that the low-cost EEG device, at the current state of the art, provides interesting results but can hardly be used for self-paced systems in error-sensitive contexts.

Keywords: Brain-computer interface; Human-computer interaction; Electroencephalography; Wheelchairs; Wearable sensors

I. INTRODUCTION

The first studies on the brain's electrical activity date back to the end of the 19th century but, in spite of the technological progress, the knowledge about the human brain is still partial and incomplete. Despite that, nowadays brain electrical activity is successfully used in several applications developed to improve the quality of life of impaired people (e.g. with severe neuromuscular disorders, such as ALS, brain stem stroke, and spinal cord injury). It can also be used for rehabilitation or, more simply, for gaming. In particular, electroencephalography (EEG) represents the most used Brain-Computer Interface (BCI) technology, owing to its good time resolution (compared with neuronal activity) but also for three practical reasons: it is non-invasive, it is cheaper than other solutions, and it is portable.
These three features are considered very important in the medical domain, in comparison with other BCI technologies (such as electrocorticography, functional magnetic resonance imaging, magnetoencephalography, etc.), which are more invasive, expensive or cumbersome. However, an EEG system is still difficult to use for out-of-the-lab applications, because of its high cost, the difficulty of installing the system without expert advice, and the time that electrode preparation requires (in particular the correct electrode placement and the application of conductive gel).
In this respect, in recent years several companies have produced cheaper commercial EEG devices that work with a reduced number of dry electrodes (i.e. they do not need the application of conductive gel on the scalp). Obviously, these limitations affect the signal quality and therefore the performance of the system; for this reason, commercial EEG devices are usually developed aiming at error-proof contexts (such as gaming or neuromarketing). Moreover, they often make use of signals coming from the facial muscular activity detected by electromyography (EMG), which in the BCI context are usually considered as artifacts to avoid or reject.
This work analyzes the possibilities given by a commercial device (the Emotiv EPOC headset) to improve the quality of life of people with reduced mobility, aiming to use motor imagery techniques to control an electric wheelchair. For this reason, only the EEG signals will be considered. In order to process the signals gathered with the headset, we developed an application called GERBIL. The developed application leans on the OpenViBE platform [1,2], which is used to perform real-time EEG signal classification. The results of the EEG recognition are interpreted and used to command a smart wheelchair equipped with several types of sensors (optical and inertial) to aid the user's navigation in various tasks (such as avoiding obstacles, correcting wrong or dangerous maneuvers, etc.). However, this paper aims to evaluate the performance of the developed BCI system, in particular using a commercial EEG device, without concerning itself with the technologies on the wheelchair and the related issues. For this reason, none of the navigation assistance systems are used or considered in this work.
This paper is organized as follows: Section 2 describes the state of the art of related research. Section 3 illustrates the technological analysis, focusing on the characteristics of the chosen EEG headset and on the OpenViBE features. Section 4 describes the GERBIL architecture and the signal processing chain from the brain to the wheelchair. Section 5 presents the tests and a first evaluation. Finally, Section 6 concludes the paper and discusses future work.
II. RELATED WORK

To date, commercial EEG devices have rarely been used for research purposes. Existing works aim principally to propose new types of interaction: reference [3] presents a P300 [4] based interface for a smartphone. The user has to stare at one picture in order to dial the related contact. Only six contact photos are shown simultaneously. Moreover, the authors compare the results obtained by the P300-based application with a similar approach based on wink detection (with the same device). The researchers of the Wadsworth Center tested a commercial EEG headset with their BCI2000 system [5], showing that it is still possible to use a P300 speller (but with significantly reduced performance compared to more professional BCI devices). The same device is used in [6] to move a robotic arm, but instead of EEG signals they exploit the facial muscular activity. A multimodal approach is proposed in [7] that uses EEG, EMG and the gyroscope present in the headset in order to navigate through virtual spaces using only the head, including basic motor imagery tasks [8] and using concentration analysis (like a sort of brain switch [9]). Works more related to medical applications, such as [10,11], discuss neurofeedback game designs and algorithms, acquiring the signals with low-cost devices.
Using professional devices, some important projects [12,13] achieve good results in driving an electric wheelchair through a motor imagery approach. In these cases most of the navigation tasks are handled by the wheelchair (equipped with several sensors), leaving in the hands of the user just the decisional tasks (like selecting the direction or the destination).
The work presented in this paper studies the possibilities given by a commercial EEG device focusing on motor imagery tasks, without using EMG signals or other sensors. The developed prototype aims to drive an electric wheelchair and to evaluate the feasibility and the reliability of such a device in real-world settings.
The next section provides more information about the features of the commercial EEG device used. Moreover, it presents in detail the characteristics and the architecture of OpenViBE, the platform used for signal acquisition and processing.
III. HARDWARE AND SOFTWARE ANALYSIS

The first step in the analysis process was a preliminary study of three different devices in order to select the one best fitting the purposes of this work. The considered devices were Emotiv's EPOC [14] (Fig. 1), OCZ's NIA [15] and NeuroSky's MindSet [16].

Figure 1. The Emotiv EPOC headset.

Another interesting device is the Mynd, created by NeuroFocus [17] and essentially developed for neuromarketing applications (brainwaves are captured in order to establish how consumers perceive brands, products, packaging, etc.). NeuroFocus is also supported by the European Tools for Brain-Computer Interaction consortium (TOBI). However, due to its very recent production and the limited documentation available, this device was not taken into account.
The features considered for the choice of the EEG device are various: the number of electrodes and their placement (motor imagery detection requires sensors placed near the Cz position, as defined by the international 10-20 system), the price, the usability, and the possibility to access raw data. All these devices are wireless and based on Bluetooth technology. Considering these parameters, the only possible choice was the Emotiv EPOC headset (see Fig. 1), which with 14 sensors covers the brainwave activity across the full brain (the other systems present just one or three electrodes placed on the frontal lobe). Moreover, it grants access to raw data and provides an SDK for developers. The device price is about $300, which can go up to $3000 with the full SDK (depending on the license type). Regarding usability, unlike the NIA and the MindSet, the electrodes used by the Emotiv EPOC are not dry but need to be humidified with a saline solution before use. The EPOC technical specifications are listed in Table I.
TABLE I. EMOTIV EPOC SPECIFICATIONS

Number of channels: 14 (plus CMS/DRL references)
Channel names (Int. 10-20 locations): AF3, AF4, F3, F4, F7, F8, FC5, FC6, P7, P8, T7, T8, O1, O2
Sampling method: Sequential sampling, single ADC
Sampling rate: ~128 Hz (2048 Hz internal)
Resolution: 16 bits (14 bits effective), 1 LSB = 0.51 µV
Bandwidth: 0.2 - 45 Hz, digital notch filters at 50 Hz and 60 Hz
Dynamic range (input referred): 256 mVpp
Coupling mode: AC coupled
Connectivity: Proprietary wireless, 2.4 GHz band
Battery life (typical): 12 hrs
Impedance measurement: Contact quality using patented system

In order to process the raw signals coming from the EEG sensors, several solutions were analyzed and OpenViBE was finally adopted. OpenViBE is a platform for the real-time processing of brain signals developed by the French National Institute for Research in Computer Science and Control (INRIA). It can be used to acquire, filter, process, classify and visualize brain signals in real time. The principal assets of OpenViBE are modularity, portability and flexibility [1,2].
This platform allows designing and executing OpenViBE "scenarios". A scenario is a set of signal processing boxes communicating with each other, configured in order to accomplish the scenario's goal. The main scenarios used in this project are listed below:
- Visualization scenario. The system shows the signals detected in real time or from a previously recorded acquisition.
- Acquisition scenario. The data coming from the acquisition device are stored in a file.
- Training scenario. The selected classifiers are trained (offline) with the recorded data.
- Online classification. The detected EEG signals are classified online, allowing a real-time interaction based on brainwaves.
- Online evaluation. The system shows the user the task to perform with a visual stimulation. Then the system classifies the detected signals. The stimulus and the classification outputs are both available to the system, making it possible to evaluate the performance of the recognition.
In our work, these five scenarios are implemented and configured with the main goal of distinguishing two mental tasks corresponding to left or right hand movement. The classification output is transmitted via the VRPN communication to the GERBIL application and then to the electric wheelchair. A right hand movement makes the wheelchair turn right; a left hand movement makes it turn left.
The next section shows in more detail the structure of the developed system, based on the OpenViBE scenarios.
IV. SYSTEM ARCHITECTURE

The developed BCI system follows the structure shown in Fig. 2. Two principal modules are present: OpenViBE and the developed application, called GERBIL. OpenViBE manages the first four signal processing phases: acquisition, preprocessing, feature extraction and classification. Then, the Virtual Reality Peripheral Network (VRPN) module present in GERBIL continuously checks the classification output (VRPN is based on the TCP/IP protocol). Afterwards, GERBIL has the responsibility of taking decisions based on the classification results and translating them into commands for the electric wheelchair (according to the user configuration). Finally, the wheelchair movements give feedback to the user as to whether the command has been correctly interpreted or not.

Figure 2. The system architecture.
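For illustration, the following Python sketch shows how a client loop on the GERBIL side could turn incoming classification labels into wheelchair commands. It is a simplified stand-in under stated assumptions: the real system receives VRPN messages from OpenViBE, whereas here the labels are assumed to arrive as newline-terminated text over a plain TCP socket, and the command names are hypothetical.

```python
import socket

# Hypothetical mapping from classifier labels to wheelchair commands.
COMMANDS = {"left": "TURN_LEFT", "right": "TURN_RIGHT"}

def listen_and_translate(host="127.0.0.1", port=5678):
    """Read classification labels from a TCP stream and yield wheelchair commands.

    Assumption: the classifier output is forwarded as one text label per line
    ("left" or "right"); the real system uses VRPN instead of raw TCP.
    """
    with socket.create_connection((host, port)) as sock:
        buffer = b""
        while True:
            chunk = sock.recv(1024)
            if not chunk:          # connection closed by the sender
                break
            buffer += chunk
            while b"\n" in buffer:
                line, buffer = buffer.split(b"\n", 1)
                label = line.decode().strip().lower()
                if label in COMMANDS:
                    yield COMMANDS[label]

if __name__ == "__main__":
    for command in listen_and_translate():
        print("sending to wheelchair:", command)
```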

The following subsections detail the signal processing phase and GERBIL's functionalities.

A. Signal Processing
A specific OpenViBE driver manages the signal acquisition from the device. The detected brainwave signals then undergo a preprocessing treatment. In this phase, according to the user's needs, OpenViBE selects the channels to enable and performs a first spatial filtering: a surface Laplacian is calculated in order to combine the signals coming from the different channels; as output it provides only two signals corresponding to the sensorimotor cortex areas related to the left or right hand movement. Then, a temporal filter is used to extract the required bandwidth (usually between 8 and 24 Hz for motor imagery tasks). When the preprocessing is over, the signal passes through a temporal epoching phase and then the feature extraction can begin. The obtained feature vector is sent to the LDA classifier. Furthermore, a k-fold cross-validation is used to evaluate the results. When the classification output is available, the VRPN module handles the communication with GERBIL. A simplified sketch of this chain is given below.
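The following Python sketch is a minimal approximation of the described chain, not the actual OpenViBE implementation: a small surface Laplacian around C3 and C4, an 8-24 Hz band-pass filter, fixed-length epoching, and an LDA classifier. The sampling rate, channel indices, epoch length, log band-power features and placeholder data are all assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 128  # assumed sampling rate (Hz), matching the EPOC's ~128 Hz output

def surface_laplacian(eeg, center, neighbors):
    """Center channel minus the mean of its neighbors (small Laplacian)."""
    return eeg[center] - eeg[neighbors].mean(axis=0)

def bandpass(signal, low=8.0, high=24.0, fs=FS, order=4):
    """Zero-phase Butterworth band-pass over the 8-24 Hz motor imagery band."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def epoch_features(c3, c4, epoch_s=1.0, fs=FS):
    """Cut the two Laplacian signals into epochs and compute log band power."""
    n = int(epoch_s * fs)
    feats = []
    for start in range(0, len(c3) - n + 1, n):
        seg3, seg4 = c3[start:start + n], c4[start:start + n]
        feats.append([np.log(np.mean(seg3 ** 2)), np.log(np.mean(seg4 ** 2))])
    return np.array(feats)

# --- hypothetical usage on a recorded session ----------------------------
# eeg: (n_channels, n_samples) array; the indices below are assumptions made
# to mimic C3/C4 surrounded by FC3, C5, C1, CP3 and FC4, C2, C6, CP4.
eeg = np.random.randn(10, 60 * FS)            # placeholder data
c3 = bandpass(surface_laplacian(eeg, 0, [2, 4, 5, 8]))
c4 = bandpass(surface_laplacian(eeg, 1, [3, 6, 7, 9]))
X = epoch_features(c3, c4)
y = np.random.randint(0, 2, len(X))           # placeholder left/right labels

clf = LinearDiscriminantAnalysis().fit(X, y)  # LDA, as in the paper
print("training accuracy:", clf.score(X, y))
```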
B. GERBIL
The application called GERBIL was conceived for several reasons. Its principal role is to translate the OpenViBE output into commands for the wheelchair, but also to provide a friendly interface to manage the whole system. Hereafter, the five principal functions of GERBIL are detailed.
Acquisition Management. In order to improve the system usability, GERBIL provides a simple interface to handle the various acquisition aspects (add, edit, delete) not really covered in OpenViBE. Moreover, a single acquisition can be chosen as "target". The target acquisition will be used by the GERBIL Scenarios Generator to easily link the different scenarios (e.g. Visualization and Training) that are expected to work with the same data source.
Subjects Management. This function allows handling (add, edit, delete) information about the subjects. Thus it is possible to link the acquired data with the information about the subject to which they belong (age, gender, etc.).
Scenarios Generation. The main goal of the GERBIL Scenarios Generator is to aid the user in building the scenarios and managing their configuration. Modifying a parameter that involves several scenarios will update all the concerned scenarios. Moreover, the interface helps the user in the most difficult phases, such as the choice of the coefficients to assign to the spatial filter, which depend on the role and importance given to each channel. Fig. 3 shows the Scenarios Generator and the Channel Selector, the interface that allows the user to accomplish this operation with simple drag-and-drop actions (for a more complete demonstration of the GERBIL interface, a video example is available at [18]).

Figure 3. On the left, the Scenarios Generator interface, where the type of scenario can be selected and the related parameters can be configured. On the right, the Channel Selector, which allows the user to configure the spatial filter with simple drag-and-drop actions: the user has to select the reference channels and then the channels related to them.
VRPN. The VRPN module performs several functions:
- It manages the communication with the VRPN server on the OpenViBE side.
- It provides graphical feedback to the user in terms of several parameters. It allows visualizing the classification output, the stimulus sent to the user (in the case of the Online Evaluation scenario), the false positives, etc.
- It offers the possibility to set several thresholds online, thus allowing the user to improve the classification behavior in a sort of online post-processing phase (see the sketch after this list).
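As an illustration of this post-processing idea, the sketch below applies a user-adjustable decision threshold and a dwell requirement to a stream of classifier scores, emitting a command only when the evidence is strong and stable; otherwise it stays idle, which is the behavior a self-paced system needs. The threshold values and the score convention (positive for "right", negative for "left") are assumptions, not the actual GERBIL implementation.

```python
from collections import deque

def threshold_postprocess(scores, threshold=0.6, dwell=5):
    """Turn a stream of signed classifier scores into sparse left/right decisions.

    Assumptions: scores lie in [-1, 1], positive meaning "right imagery" and
    negative "left"; a command is issued only if the last `dwell` scores all
    exceed `threshold` in magnitude with the same sign.
    """
    window = deque(maxlen=dwell)
    for s in scores:
        window.append(s)
        if len(window) == dwell:
            if all(v > threshold for v in window):
                window.clear()
                yield "right"
            elif all(v < -threshold for v in window):
                window.clear()
                yield "left"
        # otherwise: no output, the system stays in its idle (no-control) state

# Example: only the sustained run of high scores produces a command.
stream = [0.1, 0.7, 0.8, 0.9, 0.7, 0.8, -0.2, -0.9, -0.7]
print(list(threshold_postprocess(stream, threshold=0.6, dwell=3)))
# -> ['right']
```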
Smart Wheelchair Management. This function handles all the aspects related to the electric wheelchair: from the configuration of the serial port (port number, baud rate, etc.) to the transmission of the commands received via VRPN. Moreover, a virtual joystick was implemented, giving the possibility to drive the wheelchair directly (essentially for test purposes), bypassing the signal coming from the classification. A minimal sketch of the serial transmission step is shown below.
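The following sketch illustrates how such a serial transmission step could look using the pyserial library. The port name, baud rate and the single-character command protocol are purely hypothetical; the actual wheelchair protocol is not described here.

```python
import serial  # pyserial

# Hypothetical command bytes; the real wheelchair protocol is not specified here.
COMMAND_BYTES = {"TURN_LEFT": b"L", "TURN_RIGHT": b"R", "STOP": b"S"}

def open_wheelchair_port(port="COM3", baudrate=9600, timeout=1.0):
    """Open the configured serial port (port and baud rate are user settings)."""
    return serial.Serial(port=port, baudrate=baudrate, timeout=timeout)

def send_command(link, command):
    """Write one command byte to the wheelchair and flush the output buffer."""
    link.write(COMMAND_BYTES[command])
    link.flush()

if __name__ == "__main__":
    link = open_wheelchair_port()
    send_command(link, "TURN_RIGHT")   # e.g. triggered by a "right" classification
    link.close()
```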
In addition, GERBIL was designed and developed as a separate application and not as an OpenViBE plug-in. This was done to increase the system's independence, allowing future developments also with platforms different from OpenViBE. GERBIL is also independent from the employed EEG device; this will facilitate future integration with other systems.

In conclusion, OpenViBE and GERBIL cover all the needed BCI functions, from signal acquisition to the management of the electric wheelchair displacements, realizing a self-paced (asynchronous) BCI application. The developed system was used to test the capabilities of commercial EEG devices, executing the Online Evaluation scenario. The next section illustrates how the tests were conducted and the achieved results.
V. TESTS AND RESULTS

The tests conducted during this work aim principally to investigate the feasibility and reliability of a commercial EEG headset in a BCI context. The test setup was limited to the recognition of only two classes. The motor imagery tasks, left/right hand movement, were taken into account to make the wheelchair turn left or right, respectively. Obviously, driving an electric wheelchair with only two commands and without relying on other technologies (such as a system for semi-autonomous navigation) is too strong a simplification for an application in real-world settings. However, this prototype proved very useful to understand all the difficulties that the use of a commercial EEG device involves.
After a few test sessions, some significant problems about the sensors' placement emerged. Firstly, due to the device structure, with the electrodes placed on flexible arms, it is not possible to guarantee the same sensor position over different tests (even with the same subject). Secondly, the chosen device is equipped with 14 sensors all around the scalp, but none of them is really near the sensorimotor cortex. For these reasons, the user was equipped with a standard EEG headset respecting the international 10-20 system. Then, exploiting the flexibility of the device arms, the sensors were placed in the desired positions (see Fig. 4).

Figure 4. Sensors re-positioning to achieve the C3 and C4 location.

Testing the system directly on the wheelchair (Fig. 5), further problems appeared. The sometimes unexpected behavior of the wheelchair (due to classification errors) represents a strong perturbation for the user, who is concentrating on motor imagery tasks. This problem makes any sort of navigation practically impossible for non-trained subjects. To avoid this problem, and to focus on the performance of commercial EEG devices, another approach was used. Since a motor imagery task entails the activation of the same brain zone as the corresponding real movement [19], the test was performed using real gestures (which are less sensitive to emotional perturbations).
Finally, the system was trained on two gestures, following a standard Graz scenario [20] with "left" and "right" inputs. The result was compared with a dataset provided by OpenViBE and created with the following characteristics:
- Based on a standard Graz scenario ("left" and "right" trials).
- Acquired (hardware) with a MindMedia NeXuS 32b.
- Channels: C3, C4, FC3, FC4, C5, C1, C2, C6, CP3, CP4.
- Acquired (software) with OpenViBE.
For both datasets, the classification was performed with an LDA classifier (the SVC gave very similar results), with a k-fold test composed of 7 partitions, as sketched below.
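As an illustration of this evaluation step, the short sketch below estimates the classification rate of an LDA classifier with a 7-fold cross-validation, mirroring the k-fold test described above; the feature matrix and labels are placeholders, not the paper's data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Placeholder features/labels standing in for the epoch features.
rng = np.random.default_rng(0)
X = rng.normal(size=(140, 2))          # e.g. 140 epochs, 2 features (C3/C4 power)
y = rng.integers(0, 2, size=140)       # 0 = left, 1 = right

cv = StratifiedKFold(n_splits=7, shuffle=True, random_state=0)  # 7 partitions
scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=cv)
print("mean classification rate: %.1f%%" % (100 * scores.mean()))
```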
The datasets collected with the commercial device (several configurations were tested) rarely reached a classification rate above 60%. The dataset collected with the professional headset achieved a rate of about 80%. The results showed a remarkable improvement when the post-processing phase implemented in GERBIL was introduced: with the first dataset the classification rate reached about 67.5%, while with the OpenViBE dataset the result increased up to 91%.
In conclusion, it is important to specify that the results presented here could be improved by broadening the number of subjects involved and extending the comparison to other datasets acquired with professional headsets. In this respect, during the project other online datasets were considered (Physionet [21] and the BCI Competitions [22]), but an OpenViBE compatibility problem prevented completing those tests. However, on the one hand the obtained results clearly show that a self-paced BCI application based on a commercial EEG device is not yet realistic (especially for contexts with high error sensitivity); on the other hand, this is a problem common to most EEG-based BCI technologies, which currently cannot achieve performance near 100%.
VI. CONCLUSION AND FUTURE WORK

This paper presented the design and realization of a self-paced BCI system, built in order to drive an electric wheelchair with a motor imagery approach. Furthermore, the developed prototype was used to test the capabilities of commercial EEG devices for BCI applications. The obtained results confirm that these devices, at this moment, cannot be used successfully for self-paced applications in overly error-sensitive contexts. However, a further interesting analysis could be to integrate a real-time analysis of EEG error-related potentials [23], in order to decrease the occurrence of false positives. As for driving an electric wheelchair, as shown in previous works [12,13], the most promising direction is probably to integrate sensors in the wheelchair, leaving to the user just the decisional tasks. Another complementary approach could be to use more sensors on the subject (e.g. EMG, inertial sensors, etc.), trying to adopt the least invasive technologies possible for a multimodal interaction.
Figure 5. A subject testing the system on the wheelchair.

REFERENCES
[1] OpenViBE: Software for Brain Computer Interfaces and Real-Time Neurosciences, http://openvibe.inria.fr/.
[2] F. Lotte, Y. Renard, and A. Lécuyer, "Self-paced brain-computer interaction with virtual worlds: A quantitative and qualitative study out of the lab", Proc. 4th Intl. Brain-Computer Interface Workshop and Training Course, Citeseer, 2008, pp. 3-8.
[3] M. Mukerjee, "NeuroPhone: Brain-Mobile Phone Interface using a Wireless EEG Headset", cs.dartmouth.edu, 2010.
[4] E.W. Sellers and E. Donchin, "A P300-based brain-computer interface: initial tests by ALS patients", Clinical Neurophysiology, vol. 117, Mar. 2006, pp. 538-548.
[5] G. Schalk, D.J. McFarland, T. Hinterberger, N. Birbaumer, and J.R. Wolpaw, "BCI2000: a general-purpose brain-computer interface (BCI) system", IEEE Transactions on Biomedical Engineering, vol. 51, Jun. 2004, pp. 1034-1043.
[6] G.N. Ranky and S. Adamovich, "Analysis of a commercial EEG device for the control of a robot arm", Proceedings of the 2010 IEEE 36th Annual Northeast Bioengineering Conference (NEBEC), Mar. 2010, pp. 1-2.
[7] F. Carrino, J. Tscherrig, E. Mugellini, O. Abou Khaled, and R. Ingold, "Head-Computer Interface: A Multimodal Approach to Navigate Through Real and Virtual Worlds", 14th International Conference on Human-Computer Interaction, Orlando, Florida, in press, 2011.
[8] C. Neuper, R. Scherer, S. Wriessnegger, and G. Pfurtscheller, "Motor imagery and action observation: modulation of sensorimotor brain rhythms during mental control of a brain-computer interface", Clinical Neurophysiology, vol. 120, Feb. 2009, pp. 239-247.
[9] S.G. Mason and G.E. Birch, "A brain-controlled switch for asynchronous control applications", IEEE Transactions on Biomedical Engineering, vol. 47, Oct. 2000, pp. 1297-1307.
[10] W. Qiang, O. Sourina, and N.M. Khoa, "A Fractal Dimension Based Algorithm for Neurofeedback Games", Proc. CGI 2010, SP25, Singapore, 2010.
[11] Q. Wang, O. Sourina, and M.K. Nguyen, "EEG-Based Serious Games Design for Medical Applications", 2010 International Conference on Cyberworlds, Oct. 2010, pp. 270-276.
[12] R. Blatt, S. Ceriani, B. Dal Seno, G. Fontana, M. Matteucci, and D. Migliore, "Brain control of a smart wheelchair", 10th International Conference on Intelligent Autonomous Systems, 2008.
[13] X. Perrin, R. Chavarriaga, F. Colas, R. Siegwart, and J.d.R. Millán, "Brain-coupled interaction for semi-autonomous navigation of an assistive robot", Robotics and Autonomous Systems, vol. 58, Dec. 2010, pp. 1246-1255.
[14] Emotiv - Brain Computer Interface Technology, http://www.emotiv.com/.
[15] OCZ Neural Impulse Actuator, http://www.ocztechnology.com/.
[16] NeuroSky - Brainwave Sensors for Everybody, http://www.neurosky.com/.
[17] Mynd, a dry wireless full-brain EEG, http://www.neurofocus.com/.
[18] MAGI - Mind Augmented Gesture Interaction, http://magisystem.project.eia-fr.ch/index.html.
[19] C. Neuper, R. Scherer, M. Reiner, and G. Pfurtscheller, "Imagery of motor actions: differential effects of kinesthetic and visual-motor mode of imagery in single-trial EEG", Brain Research. Cognitive Brain Research, vol. 25, Dec. 2005, pp. 668-677.
[20] G. Pfurtscheller, C. Neuper, C. Guger, W. Harkam, H. Ramoser, A. Schlögl, B. Obermaier, and M. Pregenzer, "Current trends in Graz Brain-Computer Interface (BCI) research", IEEE Transactions on Rehabilitation Engineering, vol. 8, Jun. 2000, pp. 216-219.
[21] EEG Motor Movement/Imagery Dataset, http://www.physionet.org/pn4/eegmmidb/.
[22] BCI Competitions, http://www.bbci.de/competition/.
[23] R. Chavarriaga, A. Biasiucci, K. Förster, D. Roggen, G. Tröster, and J.d.R. Millán, "Adaptation of hybrid human-computer interaction systems using EEG error-related potentials", Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2010, pp. 4226-4229.
