Abstract—This paper provides a comprehensive computational account of hand-centred research, covering the principles, methodologies and practical issues behind human hands, robot hands, rehabilitation hands, prosthetic hands and their applications. In order to help readers understand hand-centred research, this paper presents recent scientific findings and technologies, including human hand analysis and synthesis, hand motion capture, and recognition algorithms and applications; it shows how to transfer human hand manipulation skills to related hand-centred applications in a computational context. The concluding discussion assesses the progress thus far and outlines some research challenges and future directions, whose solutions are essential to achieve the goals of human hand manipulation skill transfer. It is expected that the survey will also provide profound insights into an in-depth understanding of real-time hand-centred algorithms, human perception-action and potential hand-centred healthcare solutions.

Index Terms—Motion recognition, human hand modeling, motion capture, multifingered robot manipulation and cognitive robotics.

I. INTRODUCTION

IT is evident that the dexterity and multi-purpose manipulation capability of the human hand inspires cross-disciplinary research and applications in robotics and artificial intelligence. This research has been driven by the dream of developing an artificial hand with the human hand's properties [1]. However, five decades on, it is clear that priority in realising this dream has been given to computational hand models [2], [3]. The primary challenge is how to transfer human hand capabilities into multifingered object manipulation, especially in a real-time context; it underlies advanced artificial intelligence, robotics and their related disciplines and applications.

Recent innovations in motor technology and robotics have achieved impressive results in the hardware of robotic hands such as the Southampton Hand, the DLR hands, the Robonaut hand, the Barrett hand, the DEKA hand, the Emolution hand, the Shadow hand and the iLimb hand [4]. In particular, the ACT hand [5] has not only the same kinematics but also a similar anatomical structure to the human hand, providing a good start for the new generation of anatomical robotic hands. However, the anatomically correct robotic hand still has a long way to go due to the lack of appropriate sensory systems, unsolved human-robot interaction problems, mysterious neuroscience issues, etc. In recent decades, owing to significant innovations in multifingered robot hands and mature algorithms in robot planning, priority has been given to multifingered robot object manipulation. As the hardware of multifingered robotic systems has developed, there have been parallel advances in the three most important engineering challenges for the robotics community, namely, the optimal manipulation synthesis problem, the real-time grasping force optimisation problem, and coordinated manipulation with finger gaiting [6], [7]. Stable multifingered robot manipulation is determined by engineering criteria, that is, force closure. Computer scientists have made significant advances in computational intelligence for robot manipulation. Gomez et al. [8] developed an adaptive learning mechanism which allows a tendon-driven robotic hand to explore its own movement possibilities, to interact with objects of different shapes, sizes and materials, and to learn how to grasp and manipulate them. The state of the art in the hardware aspects of artificial hands confirms that hardware platforms are capable of accommodating advanced computational models for adapting human hand manipulation skills, though it is still a challenge to operate them in a real-time context [9]–[11].

However, the manipulation systems of robotic hands are hardcoded to handle specific objects by their corresponding robot hands. It is evident that robot hand control and optimisation problems are very difficult to resolve in mathematical terms, whereas humans can solve their hand manipulation tasks easily using skills and experience. Object manipulation algorithms are needed that have human-like manipulation capabilities, are independent of robot hand hardware and run in real time. Hence, the main challenge that researchers now face is how to enable robot hands to use what can be learned from human hands to manipulate objects with the same degree of skill and delicacy as human hands in a reasonably fast manner. For instance, given the location and shape of a cup by off-the-shelf image processing algorithms, a robotic hand is required, inspired by human hand biological capabilities, to reach and manipulate the cup by continuously responding with appropriate shape configuration and force distribution among the fingers and palm.

Transferring human manipulation skills to artificial hands involves modeling and understanding human hand motion capabilities, as well as advanced multifingered manipulation planning and control systems, manipulation abilities, sensory perception and motion algorithms of artificial hand systems. It connects user manipulation commands/intentions to the

H. Liu is with the Intelligent Systems and Biomedical Robotics Group, School of Creative Technologies, University of Portsmouth, England, PO1 2DJ UK. Email: honghai.liu@port.ac.uk
The author would like to acknowledge the projects under grant No. EP/G041377/1 funded by the Engineering and Physical Sciences Research Council, and grant No. IJP08/R2 by the Royal Society.
Copyright (c) 2009 IEEE. Personal use of this material is permitted. However, permission to use this material for any other purposes must be obtained from the IEEE by sending a request to pubs-permissions@ieee.org.
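The force-closure criterion noted in the introduction rests, at each contact, on a Coulomb friction condition. As a minimal sketch (illustrative only, not from the paper; the friction coefficient and force values below are assumed numbers), a point-contact force is non-slipping while its tangential component stays inside the friction cone:

```python
def in_friction_cone(normal_force, tangential_force, mu):
    """Coulomb point-contact model: the contact does not slip while the
    tangential load stays within mu times the (positive) normal load."""
    return normal_force > 0 and abs(tangential_force) <= mu * normal_force

# Illustrative numbers: a fingertip pressing with 4 N while resisting a
# tangential load, with an assumed friction coefficient of 0.5.
print(in_friction_cone(4.0, 1.5, mu=0.5))  # True: inside the cone
print(in_friction_cone(4.0, 2.5, mu=0.5))  # False: the fingertip would slip
```

A full force-closure test goes further: it checks that the contact wrenches, each constrained to its friction cone, positively span the object's wrench space, which is what iterative grasp-synthesis algorithms such as [6], [7] compute.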
TABLE I
ADVANCED DATA GLOVE PRODUCTS SELECTED TO HIGHLIGHT THE STATE OF THE ART
tactile pattern obtained from multi-contact, finally to differentiate the manipulations [47]. A dynamic haptic pattern refers to the time series of haptic sensing changes and is capable of providing more haptic information than a static haptic pattern during the interactions between the hand and objects. Watanabe et al. utilized tactile spatiotemporal differential information with a soft tactile sensor array attached to a robot hand to classify object shapes [48], while Hosoda et al. described the learning of robust haptic recognition by a bionic hand using a regrasping strategy and a neural network, through dynamic interaction with the objects [49].

One of the remaining challenges in haptics-based recognition, however, is how to model hand friction and its effects. Existing approaches are based on point-contact friction models, either the point friction model, the cone friction model, or compensating for friction via a control strategy. These models are idealistic for practical robot manipulation in that the contact between an object and the corresponding robot hand is a surface instead of a point. Force-distribution sensors have been introduced to capture the magnitude of the force applied on a surface, but most of them are incapable of capturing the force directions [50]. The area and the complexity of the hand interaction space are limited due to the small size and resolution of the haptic sensors. Additionally, the locality of the haptic sensor is another challenge, which depends heavily on the contact conditions [51]. Although compliant joints and soft finger tips [49] have been proposed to simulate the hand's soft tissues, it is still an open problem to imitate the non-linearly viscoelastic property possessed by the finger soft tissues, including the inner skin and subcutaneous tissue.

E. EMG-Based Hand Motion Recognition

There are two types of electromyogram: the intramuscular EMG, nEMG, and the surface EMG, sEMG. The former involves inserting a needle electrode through the skin into the muscle whose electrical activity is to be measured; the latter refers to placing electrodes on the skin overlying the muscle to detect its electrical activity. nEMG is predominantly used to evaluate motor unit function in clinical neurophysiology. nEMG can provide focal recordings from deep muscles and independent signals relatively free of crosstalk. Due to improvements in reliable and implantable electrodes, the use of nEMG for human hand movement studies has been increasingly explored. Farrell et al. [52] reported that intramuscular electrodes have the same performance as surface electrodes in pattern classification accuracy for prosthesis control. Kamavuako et al. [53] showed that a selective iEMG recording is representative of the applied grasping force and can potentially be suitable for proportional control of prosthetic devices.

sEMG signals have been used as a dominant method of interaction with machines [54]. In an EMG-based interaction system, hand gestures are captured using sEMG sensors, which evaluate and record physiological properties of muscles at rest and while contracting [21]. Various classification methodologies have been proposed for processing and discriminating sEMG signals for hand motions. As a computational technique that evolved from mathematical models of neurons and systems of neurons, the neural network has become one of the most useful methods. Other neural network based classification algorithms include the log-linearized Gaussian mixture network, the probabilistic neural network, the fuzzy min-max neural network and the radial basis function artificial neural network [55]. Statistical classifiers such as the HMM, the Gaussian mixture model and the support vector machine have also been used intensively in sEMG recognition. A few studies have compared several different methods [56]; e.g., Castellini et al. [57] reported that the support vector machine achieved higher recognition in comparison with neural networks and locally weighted projection regression, while Liu et al. proposed a cascaded kernel learning machine, which has been compared with other classifiers such as the k-nearest neighbour, the multilayer neural network and the support vector machine [58]. However, none of them has explained why the performance is enhanced. In addition, there is a lack of consideration of sEMG's uncertainties, such as its non-stationary nature, muscle wasting, electrode position, different subjects and temperature impact. Muscle wasting, or muscle fatigue, can be considered as the decrease in the force-generating capacity of a muscle and has been evidenced in numerous studies. For the same hand motion, muscle fatigue results in a different sEMG signal, which may cause a failure of the recognition method. Electrode position is also critical for a valid sEMG signal and leads to estimates of sEMG variables that differ from those obtained in other nearby locations [59]. Temperature has additionally been proved to have an important effect on nerve conduction velocities and muscle actions [60]. These uncertainties need more consideration in extracting sEMG features, which are determinant to the performance of classifiers.

IV. HAND-CENTRED APPLICATIONS

In recent decades, with the developments and innovations in motor technology and robotics, exciting results have been seen in the design of physical artificial hands aiming to improve the flexibility of robotized systems. Artificial hands can be generally categorized into mechanical grippers, robotic hands, rehabilitation hands and prosthetic hands. Mechanical grippers, whose mechanisms are usually different from that of the human hand, have been widely used in industrial applications for fast and effective grasping and handling of a limited set of known objects [61], [62]. They are usually designed for a specific task, executing a preprogrammed motion trajectory and featuring low anthropomorphism and low manipulation capability [63]. Robotic hands and prosthetic hands are anthropomorphically designed to mimic the performance of human hands, aiming to learn human hand skills, to adapt in dynamic unstructured environments, and even to be competent at work of which humans are incapable. A survey of artificial hands focusing on manipulative dexterity, grasp robustness and human operability can be found in [64]; an attempt to summarize up-to-date applications of artificial hands is provided below.

Various anthropomorphic robotic hands have been developed or improved over the past ten years. Not only does the anthropomorphic design make the skill transfer from human hand
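To make the sEMG classification pipeline discussed in this section concrete, the following sketch computes standard time-domain features (mean absolute value, root mean square, zero crossings) over one signal window and assigns it to the nearest class centroid. This is illustrative only: the window, threshold, centroid values and two-class setup are assumptions made for the example, and practical systems replace the toy classifier with an SVM, HMM or neural network as surveyed above.

```python
import math

def td_features(window, zc_threshold=0.01):
    """Standard time-domain sEMG features for one analysis window:
    mean absolute value (MAV), root mean square (RMS) and the number
    of zero crossings (ZC) above a small noise threshold."""
    n = len(window)
    mav = sum(abs(x) for x in window) / n
    rms = math.sqrt(sum(x * x for x in window) / n)
    zc = sum(1 for a, b in zip(window, window[1:])
             if a * b < 0 and abs(a - b) >= zc_threshold)
    return (mav, rms, zc)

def nearest_centroid(features, centroids):
    """Toy classifier: pick the label whose centroid is closest in
    feature space (a stand-in for the SVM/HMM/neural classifiers)."""
    return min(centroids, key=lambda label: math.dist(features, centroids[label]))

# Assumed centroids on comparable feature scales: low-activity 'rest'
# versus high-activity 'grasp'; one incoming high-amplitude window.
centroids = {"rest": (0.02, 0.03, 2.0), "grasp": (0.45, 0.50, 8.0)}
window = [0.5, -0.4, 0.6, -0.5, 0.45, -0.55, 0.5, -0.4]
print(nearest_centroid(td_features(window), centroids))  # grasp
```

In practice the features would be normalised per channel before any distance computation, and the uncertainties noted above (fatigue, electrode shift, temperature) drift the class statistics over time, which is precisely why fixed centroids or fixed decision boundaries degrade.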
V. CONCLUDING REMARKS

analyzing the human hands to hand-centred applications. The primary challenge that researchers now confront is how to enable artificial hands to use what can be learned from human hands to manipulate objects with the same degree of skill and delicacy as human hands. The challenging issues in hand-centred research can be summarized as follows: a) there is a lack of a generalized framework which dynamically merges hybrid representations for hand motion description rather than employing hardcoded methods; for instance, how to autonomously interpret the semantic meanings of a hand motion; b) there does not exist a scheme which provides hand sensory feedback to artificial hands; for instance, a sensor or model is required which generates the necessary sensory feedback information for haptics perception understanding and view-invariant hand motion of artificial hands, e.g. a feasible slip model; c) existing algorithms fail to recognize and model human manipulation intention due to a variety of uncertainties, e.g., the quality of sensory information, individual manipulation habits and clinical issues; d) it is evident that reliable interfacing/implanting into the peripheral sensory nervous system and contextual information such as environmental models are missing for further bridging the gap between artificial hands and human/environment interaction; e) feasible embedded algorithms are crucial, in terms of sensory hand/context information fusion, to make artificial hands operational and functional in human environments [9], [76]. It is expected that this paper has provided a relatively unified feasibility account of the problems, challenges and technical specifications for hand-centred research, considering all the major disciplines involved, namely, human hand analysis and synthesis, hand motion capture, hand skill transfer and hand-centred applications. This account of the state of the art has also provided partial insights into an in-depth understanding of human perception-action and potential healthcare solutions.

REFERENCES

[1] N. Palastanga, D. Field, and R. Soames, "Anatomy and Human Movement - Structure and Function, Fifth Edition," Elsevier Press, 2006.
[2] L. Dipietro, A. Sabatini, and P. Dario, "A Survey of Glove-Based Systems and Their Applications," IEEE Transactions on Systems, Man and Cybernetics, Part C, vol. 38, no. 4, pp. 461–482, 2008.
[3] B. Argall and A. Billard, "A Survey of Tactile Human-Robot Interactions," Robotics and Autonomous Systems, vol. 58, pp. 1159–1176, 2010.
[4] A. Muzumdar, "Powered upper limb prostheses," Springer, p. 208, 2004.
[5] Y. Matsuoka, P. Afshar, and M. Oh, "On the design of robotic hands for brain–machine interface," Neurosurgical Focus, vol. 20, no. 5, pp. 1–9, 2006.
[6] X. Zhu and H. Ding, "Computation of force-closure grasps: An iterative algorithm," IEEE Transactions on Robotics, vol. 22, no. 1, pp. 172–179, 2006.
[7] ——, "An efficient algorithm for grasp synthesis and fixture layout design in discrete domain," IEEE Transactions on Robotics, vol. 23, no. 1, pp. 157–163, 2007.
[8] G. Gomez, A. Hernandez, P. Eggenberger Hotz, and R. Pfeifer, "An adaptive learning mechanism for teaching a robot to grasp," Proc. International Symposium on Adaptive Motion of Animals and Machines, pp. 1–8, 2005.
[9] A. Malinowski and H. Yu, "Comparison of embedded system design for industrial application," IEEE Transactions on Industrial Informatics, vol. 7, no. 2, pp. 244–254, 2011.
[10] S. Fischmeister and P. Lam, "Time-aware instrumentation of embedded software," IEEE Transactions on Industrial Informatics, vol. 6, no. 4, pp. 652–663, 2010.
[11] A. Quagli, D. Fontanelli, D. Greco, L. Palopoli, and A. Bicchi, "Design of embedded controllers based on anytime computing," IEEE Transactions on Industrial Informatics, vol. 6, no. 4, pp. 492–502, 2010.
[12] A. Chella, H. Džindo, I. Infantino, and I. Macaluso, "A posture sequence learning system for an anthropomorphic robotic hand," Robotics and Autonomous Systems, vol. 47, no. 2-3, pp. 143–152, 2004.
[13] M. Carrozza, G. Cappiello, S. Micera, B. Edin, L. Beccai, and C. Cipriani, "Design of a cybernetic hand for perception and action," Biological Cybernetics, vol. 95, no. 6, pp. 629–644, 2006.
[14] S. Calinon, F. Guenter, and A. Billard, "On learning, representing, and generalizing a task in a humanoid robot," IEEE Transactions on Systems, Man and Cybernetics, Part B, vol. 37, no. 2, pp. 286–298, 2007.
[15] T. J. Bradberry, R. J. Gentili, and J. Contreras-Vidal, "Reconstructing Three-Dimensional Hand Movements from Non-Invasive Electroencephalographic Signals," Journal of Neuroscience, in press.
[16] R. Gentner and J. Classen, "Development and evaluation of a low-cost sensor glove for assessment of human finger movements in neurophysiological settings," Journal of Neuroscience Methods, vol. 178, no. 1, pp. 138–147, 2009.
[17] C. Metcalf, S. Notley, P. Chappell, J. Burridge, and V. Yule, "Validation and Application of a Computational Model for Wrist Movements Using Surface Markers," IEEE Transactions on Biomedical Engineering, vol. 55, no. 3, pp. 1199–1210, 2008.
[18] T. Moeslund and E. Granum, "A survey of computer vision-based human motion capture," Computer Vision and Image Understanding, vol. 81, no. 3, pp. 231–268, 2001.
[19] X. Ji and H. Liu, "Advances in view-invariant human motion analysis: A review," IEEE Transactions on Systems, Man and Cybernetics, Part C, vol. 40, no. 1, pp. 13–24, 2010.
[20] C. Metcalf and S. Notley, "Modified Kinematic Technique for Measuring Pathological Hyperextension and Hypermobility of the Interphalangeal Joints," IEEE Transactions on Biomedical Engineering, in press.
[21] C. Fleischer and G. Hommel, "Calibration of an EMG-Based Body Model with six Muscles to control a Leg Exoskeleton," Proc. International Conference on Robotics and Automation, pp. 2514–2519, 2007.
[22] "Delsys Inc." http://www.delsys.com.
[23] H. Liu, "A Fuzzy Qualitative Framework for Connecting Robot Qualitative and Quantitative Representations," IEEE Transactions on Fuzzy Systems, vol. 16, no. 6, pp. 1522–1530, 2008.
[24] T. Yoshikawa, "Multifingered robot hands: Control for grasping and manipulation," Annual Reviews in Control, vol. 34, pp. 199–208, 2010.
[25] S. Mitra and T. Acharya, "Gesture Recognition: A Survey," IEEE Transactions on Systems, Man and Cybernetics, Part C, vol. 37, no. 3, pp. 311–324, 2007.
[26] Z. Ju and H. Liu, "A unified fuzzy framework for human hand motion recognition," IEEE Transactions on Fuzzy Systems, in press.
[27] S. Fels and G. Hinton, "Glove-TalkII: a neural-network interface which maps gestures to parallel formant speech synthesizer controls," IEEE Transactions on Neural Networks, vol. 9, no. 1, pp. 205–212, 1998.
[28] E. Stergiopoulou and N. Papamarkos, "Hand gesture recognition using a neural network shape fitting technique," Engineering Applications of Artificial Intelligence, vol. 22, pp. 1141–1158, 2009.
[29] G. Heumer, H. Amor, and B. Jung, "Grasp recognition for uncalibrated data gloves: A machine learning approach," Presence: Teleoperators and Virtual Environments, vol. 17, no. 2, pp. 121–142, 2008.
[30] R. Palm, B. Iliev, and B. Kadmiry, "Recognition of human grasps by time-clustering and fuzzy modeling," Robotics and Autonomous Systems, vol. 57, no. 5, pp. 484–495, 2009.
[31] E. Sudderth, M. Mandel, W. Freeman, and A. Willsky, "Visual Hand Tracking Using Nonparametric Belief Propagation," Proc. International Conference on Computer Vision and Pattern Recognition, pp. 189–189, 2004.
[32] J. Lin and T. Wu, "Modeling the Constraints of Human Hand Motion," Urbana, vol. 51, no. 61, pp. 801–809.
[33] C. Chua, H. Guan, and Y. Ho, "Model-based 3D hand posture estimation from a single 2D image," Image and Vision Computing, vol. 20, no. 3, pp. 191–202, 2002.
[34] B. Miners, O. Basir, and M. Kamel, "Understanding hand gestures using approximate graph matching," IEEE Transactions on Systems, Man and Cybernetics, Part A, vol. 35, no. 2, pp. 239–248, 2005.
[35] C. Chan and H. Liu, "Fuzzy qualitative human motion analysis," IEEE Transactions on Fuzzy Systems, vol. 17, no. 4, pp. 851–862, 2009.
[36] N. Binh, E. Shuichi, and T. Ejima, "Real-Time Hand Tracking and Gesture Recognition System," Proc. International Conference on Graphics, Vision and Image, pp. 362–368, 2005.
[37] J. Triesch and C. von der Malsburg, "Classification of hand postures against complex backgrounds using elastic graph matching," Image and Vision Computing, vol. 20, no. 13-14, pp. 937–943, 2002.
[38] S. Ge, Y. Yang, and T. Lee, "Hand gesture recognition and tracking based on distributed locally linear embedding," Image and Vision Computing, vol. 26, no. 12, pp. 1607–1620, 2008.
[39] B. Stenger, A. Thayananthan, P. Torr, and R. Cipolla, "Estimating 3D hand pose using hierarchical multi-label classification," Image and Vision Computing, vol. 25, no. 12, pp. 1885–1894, 2007.
[40] A. Just and S. Marcel, "A comparative study of two state-of-the-art sequence processing techniques for hand gesture recognition," Computer Vision and Image Understanding, vol. 113, no. 4, pp. 532–543, 2009.
[41] N. Binh and T. Ejima, "Real-Time Hand Gesture Recognition Using Pseudo 3-D Hidden Markov Model," Proc. International Conference on Cognitive Informatics, vol. 2, pp. 1–8, 2006.
[42] M. Yeasin and S. Chaudhuri, "Visual understanding of dynamic hand gestures," Pattern Recognition, vol. 33, no. 11, pp. 1805–1817, 2000.
[43] J. Lin, "Visual hand tracking and gesture analysis," Ph.D. dissertation, University of Illinois at Urbana-Champaign, 2004.
[44] H. Kawasaki, T. Furukawa, S. Ueki, and T. Mouri, "Virtual Robot Teaching Based on Motion Analysis and Hand Manipulability for Multi-Fingered Robot," Journal of Advanced Mechanical Design, Systems, and Manufacturing, vol. 3, no. 1, pp. 1–12, 2009.
[45] Z. Wang, J. Yuan, and M. Buss, "Modelling of human haptic skill: A framework and preliminary results," Proc. 17th IFAC World Congress, pp. 1–8, 2008.
[46] M. Kondo, J. Ueda, and T. Ogasawara, "Recognition of in-hand manipulation using contact state transition for multifingered robot hand control," Robotics and Autonomous Systems, vol. 56, no. 1, pp. 66–81, 2008.
[47] N. Gorges, S. Navarro, D. Goger, and H. Worn, "Haptic object recognition using passive joints and haptic key features," Proc. IEEE International Conference on Robotics and Automation, pp. 2349–2355, 2010.
[48] K. Watanabe, K. Ohkubo, S. Ichikawa, and F. Hara, "Classification of Prism Object Shapes Utilizing Tactile Spatiotemporal Differential Information Obtained from Grasping by Single-Finger Robot Hand with Soft Tactile Sensor Array," Journal of Robotics and Mechatronics, vol. 19, no. 1, pp. 85–96, 2007.
[49] K. Hosoda and T. Iwase, "Robust haptic recognition by anthropomorphic bionic hand through dynamic interaction," Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1236–1241, 2010.
[50] K. Sato, K. Kamiyama, N. Kawakami, and S. Tachi, "Finger-Shaped GelForce: Sensor for Measuring Surface Traction Fields for Robotic Hand," IEEE Transactions on Haptics, vol. 3, no. 1, pp. 37–47, 2010.
[51] S. Takamuku, A. Fukuda, and K. Hosoda, "Repetitive grasping with anthropomorphic skin-covered hand enables robust haptic recognition," Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3212–3217, 2008.
[52] T. Farrell and R. Weir, "A comparison of the effects of electrode implantation and targeting on pattern classification accuracy for prosthesis control," IEEE Transactions on Biomedical Engineering, vol. 55, no. 9, pp. 2198–2211, 2008.
[53] E. N. Kamavuako, D. Farina, K. Yoshida, and W. Jensen, "Relationship between grasping force and features of single-channel intramuscular EMG signals," Journal of Neuroscience Methods, vol. 185, no. 1, pp. 143–150, 2009.
[54] K. Wheeler, M. Chang, and K. Knuth, "Gesture-based control and EMG decomposition," IEEE Transactions on Systems, Man and Cybernetics, Part C, vol. 36, no. 4, pp. 503–514, 2006.
[55] F. Mobasser and K. Hashtrudi-Zaad, "A method for online estimation of human arm dynamics," Proc. IEEE International Conference on Engineering in Medicine and Biology Society, vol. 1, pp. 2412–2416, 2006.
[56] S. Kawano, D. Okumura, H. Tamura, H. Tanaka, and K. Tanno, "Online learning method using support vector machine for surface-electromyogram recognition," Artificial Life and Robotics, vol. 13, no. 2, pp. 483–487, 2009.
[57] C. Castellini and P. van der Smagt, "Surface EMG in advanced hand prosthetics," Biological Cybernetics, vol. 100, no. 1, pp. 35–47, 2009.
[58] Y. Liu, H. Huang, and C. Weng, "Recognition of electromyographic signals using cascaded kernel learning machine," IEEE/ASME Transactions on Mechatronics, vol. 12, no. 3, pp. 253–264, 2007.
[59] L. Mesin, R. Merletti, and A. Rainoldi, "Surface EMG: The issue of electrode location," Journal of Electromyography and Kinesiology, vol. 19, no. 5, pp. 719–726, 2009.
[60] J. Feinberg, "EMG: Myths and Facts," HSS Journal, vol. 2, no. 1, pp. 19–21, 2006.
[61] X. Zhu, H. Ding, and M. Wang, "A numerical test for the closure properties of 3-D grasps," IEEE Transactions on Robotics and Automation, vol. 20, no. 3, pp. 543–549, 2004.
[62] X. Zhu and J. Wang, "Synthesis of force-closure grasps on 3-D objects based on the Q distance," IEEE Transactions on Robotics and Automation, vol. 19, no. 4, pp. 669–679, 2003.
[63] L. Zollo, S. Roccella, E. Guglielmelli, M. Carrozza, and P. Dario, "Biomechatronic design and control of an anthropomorphic artificial hand for prosthetic and robotic applications," IEEE/ASME Transactions on Mechatronics, vol. 12, no. 4, pp. 418–429, 2007.
[64] A. Bicchi, "Hands for dexterous manipulation and robust grasping: A difficult road toward simplicity," IEEE Transactions on Robotics and Automation, vol. 16, no. 6, pp. 652–662, 2000.
[65] H. Liu, P. Meusel, N. Seitz, B. Willberg, G. Hirzinger, M. Jin, Y. Liu, R. Wei, and Z. Xie, "The modular multisensory DLR-HIT-Hand," Mechanism and Machine Theory, vol. 42, no. 5, pp. 612–625, 2007.
[66] H. Kawasaki, T. Komatsu, and K. Uchiyama, "Dexterous anthropomorphic robot hand with distributed tactile sensor: Gifu Hand II," IEEE/ASME Transactions on Mechatronics, vol. 7, no. 3, pp. 296–303, 2002.
[67] S. Sun, C. Rosales, and R. Suarez, "Study of coordinated motions of the human hand for robotic applications," Proc. IEEE International Conference on Information and Automation, pp. 776–781, 2010.
[68] T. Laliberte and C. Gosselin, "Simulation and design of underactuated mechanical hands," Mechanism and Machine Theory, vol. 33, no. 1-2, pp. 39–57, 1998.
[69] C. Connolly, "Prosthetic hands from Touch Bionics," Industrial Robot: An International Journal, vol. 35, no. 4, pp. 290–293, 2008.
[70] Bebionic, "Bebionic hand," http://www.bebionic.com/, 2011.
[71] P. Kyberd, C. Wartenberg, L. Sandsoj, S. Jonsson, D. Gow, J. Frid, C. Almstrom, and L. Sperling, "Survey of upper-extremity prosthesis users in Sweden and the United Kingdom," Journal of Prosthetics and Orthotics, vol. 19, no. 2, pp. 55–62, 2007.
[72] C. Cipriani, F. Zaccone, S. Micera, and M. Carrozza, "On the shared control of an EMG-controlled prosthetic hand: Analysis of user–prosthesis interaction," IEEE Transactions on Robotics, vol. 24, no. 1, pp. 170–184, 2008.
[73] C. Pylatiuk, A. Kargov, and S. Schulz, "Design and evaluation of a low-cost force feedback system for myoelectric prosthetic hands," Journal of Prosthetics and Orthotics, vol. 18, no. 2, pp. 57–61, 2006.
[74] I. Gaiser, C. Pylatiuk, S. Schulz, A. Kargov, R. Oberle, and T. Werner, "The FLUIDHAND III: A Multifunctional Prosthetic Hand," Journal of Prosthetics and Orthotics, vol. 21, no. 2, pp. 91–97, 2009.
[75] T. Rhee, U. Neumann, and J. Lewis, "Human Hand Modeling from Surface Anatomy," Proc. Symposium on Interactive 3D Graphics and Games, pp. 1–6, 2006.
[76] N. Motoi, M. Ikebe, and K. Ohnishi, "Real-time gait planning for pushing motion of humanoid robot," IEEE Transactions on Industrial Informatics, vol. 3, no. 2, pp. 154–163, 2007.

Honghai Liu (M'02–SM'06) received his Ph.D. degree in robotics from King's College London, UK, in 2003. He joined the University of Portsmouth, UK, in September 2005. He previously held research appointments at the Universities of London and Aberdeen, and project leader appointments in the large-scale industrial control and system integration industry.
Dr Liu has published over 250 peer-reviewed international journal and conference papers, including four best paper awards. He is interested in approximate computation, pattern recognition, intelligent video analytics and cognitive robotics and their practical applications, with an emphasis on approaches which could contribute to the intelligent connection of perception to action using contextual information. He is an Associate Editor of IEEE Transactions on Industrial Informatics, IEEE Transactions on Systems, Man and Cybernetics, Part C, and the International Journal of Fuzzy Systems.