
Directions, Methods and Metrics for Mapping Human to Robot Motion with Functional Anthropomorphism: A Review

Minas V. Liarokapis, Charalampos P. Bechlioulis and Kostas J. Kyriakopoulos
School of Mechanical Engineering
National Technical University of Athens (NTUA)
9 Heroon Polytechniou Str, Athens, 15780, Greece
mliaro | chmpechl | kkyria @mail.ntua.gr

Panagiotis K. Artemiadis
School for Engineering of Matter, Transport and Energy
Ira A. Fulton Schools of Engineering
Arizona State University (ASU)
501 E. Tyler Mall, ECG 301, Tempe, AZ 85287-6106, USA
[email protected]

Abstract—In this paper we provide directions, methods and metrics that can be used to synthesize a complete framework for mapping human to robot motion with Functional Anthropomorphism. Such a mapping can be used to perform skill transfer from humans to robot arm hand systems with arbitrary kinematics. The proposed mapping schemes first guarantee the execution of specific functionalities by the robotic artifact in 3D space (task space functional constraints), and then optimize anthropomorphism of robot motion. Human-likeness of robot motion is achieved through minimization of the structural dissimilarity between human and robot arm hand system configurations. The proposed methodology is suitable for a wide range of applications, ranging from learning by demonstration for autonomous grasp planning, to real-time teleoperation and telemanipulation with robot arm hand systems in remote or dangerous environments.

Index Terms—Mapping Human to Robot Motion, Anthropomorphism, Teleoperation, Telemanipulation, Analytic IK, Optimization, Robot Arm Hand Systems.

I. INTRODUCTION

Mapping human to robot motion has been an active topic of research for almost 50 years. During the last decade, many robots have been designed and manufactured in order to become a vital part of our lives: humanoids and toy robots created to interact with children, industrial robots that cooperate advantageously with humans, robot companions for elderly people, and prosthetic or assistive robotic devices that help patients or amputees regain lost dexterity.

A common characteristic of these robots is that they have been designed to serve, interact with or mimic humans; thus, their design is intrinsically human-centric. Anthropomorphism, in terms of morphological and/or kinematic similarity, has become a major trend in robotics for two reasons: safety in human robot interaction (HRI) applications (humanlike motion can be predicted more easily by humans, helping to avoid injuries during interaction) and social connection through robot likeability (humanlikeness of robots increases their social acceptability). Anthropomorphism also affects the complexity of human to robot motion mapping, as for "humanlike" robots mapping is easy, while for "non-humanlike" robots mapping becomes complex [1].

Recently, we proposed Functional Anthropomorphism as a mapping approach whose first priority is to guarantee the execution of a specific functionality in task space and then, having accomplished that, to optimize anthropomorphism of structure or form, minimizing a "distance" between the human and the robot motion [2]. Such a functional constraint is most evident for robot arm hand systems, where the human and the robot end-effectors must achieve the same position and orientation in 3D space. Another useful application of anthropomorphism is to handle the redundancy present in the solution space of complex robotic artifacts, such as redundant or even hyper-redundant robot arms.

Regarding human to robot hand motion mapping, various methodologies have been proposed over the years: fingertips mapping, joint-to-joint mapping, functional pose mapping and object-specific mapping. Fingertips mapping first computes the forward kinematics (FK) for each human finger and then the inverse kinematics (IK) for each robot finger, so that each pair of human and robot fingertips achieves the same position in 3D space [3]–[7]. Joint-to-joint mapping is a one-to-one mapping, where the joint angles of the human hand are mapped to the corresponding joints of the robot hand, so that the postures replicated by the robot hand are identical to the human hand postures [8], [9]. In functional pose mapping, the human and the robot hand are driven to a number of similar poses and a relationship for each pair of human and robot joints is then found, using a method such as a least squares fit [10]. Finally, object-specific mapping, originally proposed by Griffin et al. [11], provides a mapping between human and robot hand configurations assuming that a virtual sphere is held between the human thumb and index fingers. The parameters of the virtual object (size, position and orientation) are scaled independently and non-linearly to create a corresponding virtual object in the robot hand workspace, which is then used to compute the robot fingertip locations. Recently, Gioioso et al. presented in [12] and [13] an extension of the object-based approach to map synergies from human to robot hands with dissimilar kinematics.


Regarding human to robot arm motion mapping, most previous studies followed a forward/inverse kinematics approach in order to achieve the same position and orientation for the end-effectors (wrists) of the human and the robot arm. An analytical computation of inverse kinematics for seven Degrees of Freedom (DoF) redundant arms was performed in [14] and [15]. Artemiadis et al. proposed in [16] a biomimetic approach for the inverse kinematics of a seven DoF redundant robot arm (Mitsubishi PA10), using captured human arm kinematics to describe and model human joint angle dependencies with a Bayesian Network. The extracted model was then used in a closed loop, iterative inverse kinematics algorithm.

For more complicated kinematic models, such as highly articulated figures, various optimization based approaches have been proposed. In [17] the manipulation of an articulated figure is formulated as a non-linear optimization problem. In [18] the authors handle the inverse kinematics of highly articulated figures with nonlinear programming, performing constrained minimization of a nonlinear function. Recently we formulated the problem of human to robot arm hand system motion mapping as a constrained non-linear optimization problem, achieving humanlike motion for robot arms and hands with arbitrary kinematics (hyper-redundant robot arms and m-fingered hands) [1]. It must be noted that nonlinear programming algorithms may terminate at local minima, but typically even the local minima are sufficient. The latter hypothesis can be validated with a series of experimental paradigms. Anthropomorphism of robot motion has been examined in [19], where the authors focused on the extraction of humanlike goal configurations for robotic arm hand systems, using a criterion "borrowed" from ergonomics studies [20] that yields a score of a posture's ergonomic quality. In [21] the authors used a combination of bio-inspired optimization principles (e.g., minimization of hand jerk) to formulate an optimization problem and compute robot reaching motions similar to human ones. Finally, in [22] a nonlinear optimization problem that used obstacle constraints between the arm hand system and the environment was formulated to generate humanlike motion.

In this paper, we propose a generic methodology for transferring skills from humans to robot artifacts with arbitrary kinematics, mapping human to robot motion using criteria of Functional Anthropomorphism. More specifically, we formulate different mapping schemes for different applications. A real-time analytical scheme, proposed for real-time teleoperation and telemanipulation studies with robot arm hand systems [23], performs human to robot arm motion mapping following a forward/inverse kinematics approach and applies the joint-to-joint motion mapping method for the case of the hand. Redundancies are handled by selecting the most anthropomorphic solution, for which the scores of specific criteria of Functional Anthropomorphism become optimal. Another scheme, based on numerical methods, formulates human to robot motion mapping as an optimization problem that solves the inverse kinematics under position and orientation goals (task space functional constraints) and once again handles redundancies with specific criteria of Functional Anthropomorphism.

A series of criteria of Functional Anthropomorphism and mapping schemes proposed for different applications are discussed in detail in this paper. The rest of the document is organized as follows: Section II presents the equipment required and the kinematic models of human and robot arm hand systems, Section III presents the different methods used to map human to robot motion, Section IV focuses on a series of metrics for the quantification of Functional Anthropomorphism, Section V presents the results of a series of simulated paradigms, and Section VI concludes the paper.

II. APPARATUS AND KINEMATIC MODELS

A. Equipment

In order to record the motion of the human arm hand system and to extract the corresponding joint angles (27 DoFs modeled), appropriate motion capture systems must be used. Our equipment consists of a magnetic position tracking system and a dataglove. The magnetic position tracking system is the Liberty® (Polhemus Inc.), which is equipped with four position tracking sensors and a reference system. In order to capture human arm kinematics, three sensors are placed on the human shoulder, elbow, and wrist respectively.

In order to measure the remaining 22 DoFs of the human hand and the wrist, we use the Cyberglove II® (Cyberglove Systems). The Cyberglove II has 22 flex sensors capturing all twenty DoFs of the human hand and the two DoFs of the human wrist. More specifically, the abduction/adduction and flexion/extension of the wrist, the flexion/extension of the MetaCarpoPhalangeal (MCP), Proximal Interphalangeal (PIP) and Distal Interphalangeal (DIP) joints of each finger, and the abduction/adduction of the fingers can be measured.

The acquisition frequency of the Cyberglove II is 90 Hz and its accuracy is 1 degree. The position measurements are provided by the Liberty system at a frequency of 240 Hz. The Liberty system provides high accuracy in both position and orientation, 0.03 in and 0.15 degrees respectively.

Fig. 1. Motion capture systems used to measure human arm hand system motion: Liberty (Polhemus Inc.) and Cyberglove II (Cyberglove Systems).

B. Kinematic Model of the Human Arm Hand System

In this paper we use a kinematic model of the human hand inspired by the positioning of the Cyberglove II flex sensors. The model consists of twenty DoFs: four DoFs for the index, middle, ring and pinky fingers (three for flexion/extension and one for abduction/adduction) and four DoFs for the thumb (two for flexion/extension, one for abduction/adduction and one to model the thumb's opposition to the other fingers). A more sophisticated human hand model, like the one proposed in [24], could have been used if a more sophisticated motion capture system, capable of measuring all DoFs of the human hand, were available. Typically, human hand digit/link lengths can be easily measured, but in order to define the finger base frame positions and orientations relative to the hand reference frame at the base of the wrist, advanced techniques like fMRI [25] are required. In order to define the link lengths of each finger, we use parametric models defined relative to the hand length and hand breadth in hand anthropometry studies like the one reported in [26].

In order to define the human arm kinematics, we use a kinematic model with seven DoFs. The human arm model consists of three DoFs for the shoulder (abduction/adduction, flexion/extension and internal/external rotation), two DoFs for the elbow (flexion/extension and pronation/supination) and two DoFs for the wrist (flexion/extension and abduction/adduction).

C. Kinematics of the Robot Arm Hand Systems

Regarding robot hand kinematics, we examine the problem of mapping human to robot hand motion for the DLR/HIT II five-fingered robot hand [27], which has 15 DoFs: three DoFs for each finger (one for abduction/adduction and two for flexion/extension), while the last two joints of each finger (the joints corresponding to the middle interphalangeal (MIP) and distal interphalangeal (DIP) joints) are coupled with a steel-wire based mechanical coupling.

Regarding robot arm kinematics, we examine different redundant and hyper-redundant robot arms. More specifically, the first robot arm is the Mitsubishi PA10 7 DoF manipulator [28], which is considered to be anthropomorphic. The remaining arms are hyper-redundant robot arm models with 9, 18 and 27 DoFs that consist of n/3 spherical joints and n/3 links of equal length, where n is the number of DoFs. The total length of the hyper-redundant robot arm has been set equal to 90% and to 125% of the mean human arm length used in this study. In order to arrive at a common human arm length, we used the mean of the 50th percentile values for men and women, as reported in [29].

III. METHODS

A. Mapping Human to Robot Arm Motion Methods

1) Mapping Human to Robot Arm Motion using a Forward/Inverse Kinematics Approach: The first human to robot arm motion mapping scheme described in this paper is a typical forward/inverse kinematics mapping like the one proposed in [2]. The human and robot arm end-effectors once again achieve the same position and orientation in 3D space (task space functional constraint), and for the redundant Mitsubishi PA10 robot arm all possible inverse kinematics (IK) solutions are computed in an analytical manner. Closed-form analytical solutions are desirable for fast motion planning for two main reasons: they are much faster than numerical IK solvers and they can explore the null space of the solution set, as all solutions are computed. Thus, from the set of all possible IK solutions we choose, as the most anthropomorphic solution, the one that minimizes the distance between the human and robot elbows in 3D space. The human and robot elbow positions can be easily computed using the human and robot forward kinematics. More details regarding the forward/inverse kinematics approach can be found in [2]. A study focusing on teleoperation and telemanipulation with the Mitsubishi PA10 7 DoF robot manipulator, using the analytical IK approach, can be found in [23].
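The selection of the most anthropomorphic analytical IK solution reduces to a simple minimum search over the candidate solutions. The following Python sketch illustrates the idea only; it is not the authors' implementation, and the helper robot_elbow_position (forward kinematics up to the robot elbow) as well as the list of analytic IK solutions are assumed to be supplied by the caller.

```python
import numpy as np

def most_anthropomorphic_solution(ik_solutions, human_elbow, robot_elbow_position):
    """Pick the IK solution whose elbow lies closest to the human elbow.

    ik_solutions: list of candidate joint-angle vectors from an analytic IK solver
    human_elbow: (3,) human elbow position expressed in the common base frame
    robot_elbow_position: assumed FK helper mapping a joint vector to the robot
        elbow position (3,) in the same frame
    """
    distances = [np.linalg.norm(robot_elbow_position(q) - human_elbow)
                 for q in ik_solutions]
    return ik_solutions[int(np.argmin(distances))]
```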

2) Mapping Human to Robot Arm Motion as an Optimization Problem: In this section, we present the second mapping scheme, which follows the optimization approach for mapping human to robot motion. The second scheme formulates the mapping as a constrained non-linear optimization problem. Even if the algorithm terminates at a local minimum, the solutions acquired suffice for our purposes. Similar hypotheses are typical in related studies [18], and the efficiency of the optimization schemes is typically demonstrated using extensive experimental paradigms.

Formulation: Let $x_{RA} = f_{RA}(q_{RA})$ denote the forward kinematics mapping from joint to task space for the robot arm, let $x_{RA}^{goal} \in \mathbb{R}^3$ denote the desired end-effector position (i.e., the human end-effector position) and let $o_{RA}^{goal} \in \mathbb{R}^4$ denote the desired end-effector orientation represented with quaternions. We can define an objective function using position and orientation goals, as follows:

$$F_{RA}(q_{RA}) = w_x^{RA}\,\big\|x_{RA} - x_{RA}^{goal}\big\|^2 + w_o^{RA}\,\big\|o_{RA} - o_{RA}^{goal}\big\|^2 \quad (1)$$

where $w_x^{RA}$ and $w_o^{RA}$ are weights that adjust the relative importance of the translation goal with respect to the rotation goal. Typically $w_x^{RA} = 1$ and $w_o^{RA} = 10$.

We handle multiple goals by combining individual goals into a global objective function using a series of appropriate weight factors. The problem formulation for the global objective function can be defined as follows:

$$\text{minimize} \quad F_{RA}(q_{RA}) \quad (2)$$

subject to the inequality constraints of the joint limits

$$q_{RA}^- < q_{RA} < q_{RA}^+ \quad (3)$$

where $q_{RA} \in \mathbb{R}^n$ is the vector of the joint angles of the hyper-redundant robot arm with $n$ DoFs and $q_{RA}^-$, $q_{RA}^+$ are the lower and upper limits of the joints respectively. More details regarding the optimization approach can be found in [1].
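For concreteness, a minimal Python/SciPy sketch of the structure of eqs. (1)-(3) is given below. It is an illustration only (the authors' implementation is in C++ with NLopt, as described later in this section): the forward kinematics helper fk, the goal vectors, the initial guess and the joint limits are assumed to be supplied by the caller, and the joint limits are imposed as bounds.

```python
import numpy as np
from scipy.optimize import minimize

W_X, W_O = 1.0, 10.0  # typical weights reported in the text (w_x = 1, w_o = 10)

def arm_objective(q, fk, x_goal, o_goal):
    """Eq. (1): weighted squared position and orientation (quaternion) error.

    fk: assumed helper returning (position (3,), quaternion (4,)) for joint vector q.
    """
    x, o = fk(q)
    return W_X * np.sum((x - x_goal) ** 2) + W_O * np.sum((o - o_goal) ** 2)

def map_arm_pose(fk, x_goal, o_goal, q0, q_lower, q_upper):
    """Eqs. (2)-(3): minimize F_RA subject to the joint limits (as bounds)."""
    res = minimize(arm_objective, q0, args=(fk, x_goal, o_goal),
                   method="SLSQP", bounds=list(zip(q_lower, q_upper)))
    return res.x
```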

3) Shoulder Offset to Increase Dexterous Workspace in Teleoperation and Telemanipulation Studies: When we compute the human arm end-effector (wrist) position and orientation relative to the human shoulder and use them as position and orientation goals for the robot arm end-effector, the solution of the robot arm (Mitsubishi PA10) inverse kinematics results in configurations where the robot arm is quite folded. This is an undesired behavior caused by the fact that the robot arm typically has a total length (sum of link lengths) considerably larger than the human arm length. In order for the user of the teleoperation system to be able to explore a greater part of the robot arm workspace, we choose to assign the reference system of the Liberty motion capture system as a "virtual" human shoulder, introducing a dynamic offset between the actual human shoulder and the assigned "virtual" shoulder, which varies as the user moves. More details regarding the shoulder offset can be found in [23].

4) Algorithms for Real Time Computation of IK: Some applications that require mapping of human to robot motion may also require real time computation of the inverse kinematics (IK). For example, in teleoperation studies an interface that controls the robot at a rate of at least 100 Hz (preferably faster) is required. In this section we present the algorithms used to perform real time computation of IK, following either the analytical or the optimization approach described above.

Analytical IK using the IKFast library of OpenRAVE: The IKFast module of the OpenRAVE simulation environment is used to compute the robot inverse kinematics equations analytically, as well as to generate optimized C++ files. Some features of the IKFast module are: it can handle robots with arbitrary joint complexity, it calculates all possible discrete solutions, it generates C++ code independent of OpenRAVE or any other library, and it detects cases where two or more axes align and cause infinite solutions. More details regarding the IKFast algorithm can be found in [30] and [31].

Real Time Implementation of the Optimization Scheme: In order for the proposed scheme to compute the inverse kinematics of the robot arm efficiently and fast, and consequently to compute anthropomorphic robot motion, we chose to implement the formulated optimization problem in C++ using the NLopt open-source library for nonlinear optimization and, more specifically, the COBYLA algorithm. More information regarding NLopt can be found in [32] and regarding the COBYLA algorithm in [33].
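Wiring the objective into a derivative-free COBYLA solve is straightforward. The sketch below uses NLopt's Python bindings purely for illustration (the actual implementation is the C++ one mentioned above); the objective (e.g., the arm_objective sketched earlier), the initial guess and the joint limits are assumed to be available.

```python
import nlopt
import numpy as np

def solve_with_cobyla(objective, q0, q_lower, q_upper, xtol=1e-4):
    """Minimize a scalar objective (e.g., eq. (1)) with NLopt's COBYLA under joint limits.

    objective: callable q -> float, supplied by the caller.
    """
    opt = nlopt.opt(nlopt.LN_COBYLA, len(q0))
    opt.set_lower_bounds(list(q_lower))
    opt.set_upper_bounds(list(q_upper))
    # NLopt objectives take (x, grad); grad is unused by the derivative-free COBYLA.
    opt.set_min_objective(lambda q, grad: float(objective(q)))
    opt.set_xtol_rel(xtol)  # illustrative stopping tolerance
    return np.array(opt.optimize(list(q0)))
```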

B. Mapping Human to Robot Hand Motion Methods

Regarding the DLR/HIT II robot hand, two different human to robot motion mapping schemes can be applied: the joint-to-joint mapping, which is a simple yet effective method for teleoperation and telemanipulation studies, and the optimization-based approach, which is mainly intended for applications that do not require an online human to robot motion mapping.

1) Joint to Joint Mapping: In order to map human to robot hand motion using the joint-to-joint mapping scheme, we must first calibrate the dataglove used (Cyberglove II), mapping the dataglove raw values to joint angles. To do so, we use the robot-hand-specific fast calibration procedure that we described in [34]. In this paper we use a modified version of the joint-to-joint method, actually mapping the Cyberglove II raw values to robot joint angles, ignoring some dataglove measurements and keeping some robot DoFs fixed to zero (e.g., middle finger abduction/adduction). The last two joints of each DLR/HIT II robot finger are coupled with a mechanical coupling based on a steel wire, so we are not able to map all the measurements of the Distal Interphalangeal (DIP), Proximal Interphalangeal (PIP) and MetaCarpoPhalangeal (MCP) joints of the human hand to the corresponding joints of the robot hand. Thus, some correspondences must be defined and some Cyberglove II values must be ignored.

In this paper we map the Cyberglove II values of the PIP joints of the human hand to both the PIP and the DIP joints of the robot hand (due to the mechanical coupling). We choose the PIP joint because the human is able to flex the PIP independently but not the DIP (due to tendon coupling); had we selected the DIP, there would be cases in which the user flexes only the PIP joint of the human hand and the corresponding robot finger would not move, as the DIP value provided by the Cyberglove II motion capture system would be zero. The MetaCarpoPhalangeal (MCP) joints of the human hand are mapped one-to-one to the corresponding joints of the robot hand. Finally, the abduction/adduction of the human fingers is mapped in a one-to-one manner to the robot fingers, with the exception of the middle finger, whose abduction/adduction DoF is kept fixed to zero, as it cannot be measured by the Cyberglove II.
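As an illustration of these correspondences only (the key names and data layout below are hypothetical, not the Cyberglove II or DLR/HIT II APIs), the joint-to-joint rules can be summarized in a few lines:

```python
def joint_to_joint(glove):
    """Map calibrated human joint angles (rad) to DLR/HIT II finger commands.

    glove: dict of human joint angles with illustrative keys such as
    'index_mcp', 'index_pip', 'index_abd'; returns one command dict per finger.
    """
    robot = {}
    for finger in ('thumb', 'index', 'middle', 'ring', 'pinky'):
        pip = glove[f'{finger}_pip']
        robot[finger] = {
            'abd': 0.0 if finger == 'middle' else glove[f'{finger}_abd'],
            'mcp': glove[f'{finger}_mcp'],  # MCP mapped one-to-one
            'pip': pip,                     # PIP drives the coupled pair
            'dip': pip,                     # DIP follows PIP (steel-wire coupling)
        }
    return robot
```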

2) Mapping Human to Robot Hand Motion as an Optimization Problem: In order to formulate mapping human to robot hand motion as an optimization problem for a five-fingered robot hand, we use five objective functions, one for each finger, under position goals or position and orientation goals, according to the task specifications.

Let $x_{RF} = f_{RF}(q_{RF})$ be the forward kinematics mapping from joint to task space for each robot finger and let $x_{RF}^{goal} \in \mathbb{R}^3$ and $o_{RF}^{goal} \in \mathbb{R}^4$ denote the desired fingertip position and orientation (in quaternions) respectively. Then, the objective function for each robot finger can be written as follows:

$$F_{RF}(q_{RF}) = w_x^{RF}\,\big\|x_{RF} - x_{RF}^{goal}\big\| + w_o^{RF}\,\big\|o_{RF} - o_{RF}^{goal}\big\| \quad (4)$$

In order to incorporate in our analysis possible couplings between subsequent joints of a robot finger, we introduce them as equality constraints. Thus, the problem formulation for mapping human to robot finger motion can be defined as:

$$\text{minimize} \quad F_{RF}(q_{RF}) \quad (5)$$

subject to the inequality constraints of the joint limits

$$q_{RF}^- < q_{RF} < q_{RF}^+ \quad (6)$$

and equality constraints set for possible joint couplings among sequential joints

$$q_n = q_{n-1} \quad (7)$$

where $q_{RF} \in \mathbb{R}^4$ is the vector of the joint angles of each finger of the five-fingered robot hand. More details regarding the optimization approach can be found in [1]. A wrist offset can also be applied during the robot arm IK computation, in case the human and the robot hands have significant dimensional differences, following the directions provided in [2]. A much more sophisticated optimization problem has been formulated for the whole arm hand system in [1].
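A minimal Python/SciPy sketch of the per-finger problem of eqs. (4)-(7) follows; it is illustrative only, the per-finger forward kinematics helper fk_finger is assumed to be supplied by the caller, and SLSQP is used here simply because SciPy's COBYLA does not accept the equality constraint that encodes the joint coupling.

```python
import numpy as np
from scipy.optimize import minimize

def finger_objective(q, fk_finger, x_goal, o_goal, w_x=1.0, w_o=10.0):
    """Eq. (4): weighted fingertip position/orientation (quaternion) error."""
    x, o = fk_finger(q)
    return w_x * np.linalg.norm(x - x_goal) + w_o * np.linalg.norm(o - o_goal)

def map_finger_pose(fk_finger, x_goal, o_goal, q0, q_lower, q_upper):
    """Eqs. (5)-(7): joint limits as bounds, the coupling q_n = q_{n-1} as equality."""
    coupling = {'type': 'eq', 'fun': lambda q: q[-1] - q[-2]}  # last two joints coupled
    res = minimize(finger_objective, q0,
                   args=(fk_finger, x_goal, o_goal),
                   method='SLSQP',
                   bounds=list(zip(q_lower, q_upper)),
                   constraints=[coupling])
    return res.x
```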


IV. METRICS FOR THE QUANTIFICATION OF FUNCTIONAL ANTHROPOMORPHISM

In this section we present a series of criteria of anthropomorphism that result in the minimization of the structural dissimilarity between the human and the robot arm, using the position of the human elbow as a reference.

A. Volume of the convex hull created by human and robot joint positions

In order to incorporate in the objective function an anthropomorphic criterion that handles the redundancy present in the solution space of a hyper-redundant robot arm, or to use a metric capable of extracting the most humanlike of all the solutions computed with the analytical IK approach, we first examine the volume of the convex hull created by the human and the robot joint positions, the common base frame (shoulder) and the common end-effector (wrist).

The convex hull of a set of points S in three dimensions is the intersection of all convex sets containing S. For N points $s_1, s_2, \dots, s_N$, the convex hull C is given by the expression:

$$C \equiv \left\{ \sum_{k=1}^{N} a_k s_k : a_k \geq 0 \ \text{for all} \ k \ \text{and} \ \sum_{k=1}^{N} a_k = 1 \right\} \quad (8)$$

The volume formula of a simplex is:

$$\mathrm{Vol}(\Delta(s_1, \dots, s_n)) = \frac{|\det(s_2 - s_1, \dots, s_n - s_1)|}{n!} \quad (9)$$

where $\Delta(s_1, \dots, s_n)$ denotes the simplex in $\mathbb{R}^n$ with vertices $s_1, \dots, s_n \in \mathbb{R}^n$. Moreover, when the triangulation method is used to decompose the polytope into simplices, the volume of P is simply the sum of the volumes of the simplices:

$$\mathrm{Vol}(P) = \sum_{i=1}^{N} \mathrm{Vol}(\Delta(i)) \quad (10)$$

There are plenty of methods available to compute the convex hull of a set S of points. In this study we choose to use the well-known quickhull algorithm for convex hulls, proposed in [35]. More details regarding the decomposition of convex hulls and the computation of their volumes can be found in [36] and [37]. Thus, the objective function $F_{RA}$ defined for the hyper-redundant robot arm in eq. (1), under position goals, orientation goals, and the convex hull based anthropomorphic criterion, becomes:

$$F_{RA}(q) = w_x^{RA}\,\big\|x_{RA} - x_{RA}^{goal}\big\|^2 + w_o^{RA}\,\big\|o_{RA} - o_{RA}^{goal}\big\|^2 + w_{CH}\,\mathrm{Vol}\big(C(S_{RA} \cup S_{HA})\big) \quad (11)$$

where $S_{RA}$ and $S_{HA}$ are the sets of the joint positions of the robot and the human arm respectively, $w_{CH}$ is the weight that adjusts the relative importance of the convex hull volume relative to the translation and orientation goals, and C denotes the convex hull created by the human and the robot joint positions in 3D space.

If we choose to use the analytical IK approach (when applicable; hyper-redundant robot arms, for example, typically do not have an analytical IK solution), we select from all computed solutions the robot arm configuration that minimizes the metric's value.
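A sketch of this criterion using SciPy's Qhull-based ConvexHull (Qhull implements the quickhull algorithm of [35]) is shown below; the joint-position arrays are assumed to be expressed in the common base frame and to include the shared base and end-effector points.

```python
import numpy as np
from scipy.spatial import ConvexHull  # Qhull-based (quickhull, as in [35])

def convex_hull_criterion(robot_joints, human_joints):
    """Vol(C(S_RA U S_HA)) of eq. (11): volume of the joint-position convex hull.

    robot_joints, human_joints: (k, 3) arrays of joint positions in the common
    base (shoulder) frame, including the shared base and end-effector points.
    """
    points = np.vstack([robot_joints, human_joints])
    return ConvexHull(points).volume
```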

B. Distances between robot joint positions and human elbow.

Another useful criterion of anthropomorphism is to minimize the distances between the robot joint positions in 3D space and the human elbow. Let $s_{elbow} \in \mathbb{R}^3$ be the position of the human elbow in 3D space and $S_{RA}$ be the set of the n robot joint positions in 3D space. For n points $s_1, s_2, \dots, s_n$, the distance between the robot joint positions and the human elbow is given by the expression:

$$D = \sum_{j=1}^{n} \big\|s_j - s_{elbow}\big\|^2 \quad (12)$$

Thus, the objective function $F_{RA}$ for the hyper-redundant robot arm, under position goals, orientation goals, and the "distances" anthropomorphic criterion, becomes:

$$F_{RA}(q) = w_x^{RA}\,\big\|x_{RA} - x_{RA}^{goal}\big\|^2 + w_o^{RA}\,\big\|o_{RA} - o_{RA}^{goal}\big\|^2 + w_D\,D \quad (13)$$

where $w_D$ is the weight that adjusts the relative importance of the metric of anthropomorphism relative to the translation and orientation goals, and D denotes the sum of distances between the human elbow and the robot joint positions in 3D space. Once again, the metric can also be used for selecting the most anthropomorphic of the solutions computed with the analytical IK approach.
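The corresponding computation is essentially a one-liner; the sketch below assumes the robot joint positions and the human elbow position are given in the same base frame.

```python
import numpy as np

def elbow_distance_criterion(robot_joints, human_elbow):
    """D of eq. (12): sum of squared distances between robot joints and the human elbow.

    robot_joints: (n, 3) array of robot joint positions in the common base frame.
    human_elbow: (3,) human elbow position in the same frame.
    """
    return float(np.sum((robot_joints - human_elbow) ** 2))
```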

C. Area of the triangles defined by human and robot joint positions

Consider an n-link robotic arm, where n is in general different from the number of the human arm links. We initially interpolate extra "virtual" joints in both the human and the robotic arm, according to the normalized length along their links from the common base to the end-effector. In this respect, both arms possess an equal number of "virtual" joints with the same normalized length. Selecting one of the arms (e.g., the human arm), the structural similarity is quantified via the sum of the areas of all facet triangles that are formed by joining every internal joint (besides the common base and the end-effector) of the human arm with the corresponding joint and its subsequent joint of the robot arm. Such a similarity criterion is reasonable since its minimum value, which is zero, implies that all triangle areas are zero and consequently that the triplets of joints forming each triangle are collinear. Thus, when the criterion reaches its minimum, the human arm coincides with the robot arm. To compute the area of the triangles defined by the human and the robot joint positions in 3D space, we use Heron's formula:

$$T = \frac{1}{4}\sqrt{(a+b+c)(-a+b+c)(a-b+c)(a+b-c)} \quad (14)$$

where a, b and c are the lengths of the sides of each triangle.
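A short sketch of this criterion is given below, assuming the "virtual" joints have already been interpolated at the same normalized lengths on both arms and are ordered from the common base to the end-effector.

```python
import numpy as np

def triangle_area(p1, p2, p3):
    """Heron's formula, eq. (14), for a triangle with 3D vertices p1, p2, p3."""
    a = np.linalg.norm(p2 - p3)
    b = np.linalg.norm(p1 - p3)
    c = np.linalg.norm(p1 - p2)
    s = (a + b + c) * (-a + b + c) * (a - b + c) * (a + b - c)
    return 0.25 * np.sqrt(max(s, 0.0))  # clamp tiny negatives from round-off

def triangles_criterion(human_virtual, robot_virtual):
    """Sum of facet-triangle areas between corresponding interpolated joints.

    human_virtual, robot_virtual: (m, 3) arrays of "virtual" joint positions with
    the same normalized spacing, ordered from the common base to the end-effector.
    """
    total = 0.0
    for i in range(1, len(human_virtual) - 1):
        # triangle: human internal joint, corresponding robot joint, next robot joint
        total += triangle_area(human_virtual[i], robot_virtual[i], robot_virtual[i + 1])
    return total
```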


Fig. 2. Comparison of optimization solutions with and without a criterion for anthropomorphism, for an 18 DoF robot arm. Panels: No Criterion, Distance Criterion (DC), Convex Hull Criterion (CC), Triangles Criterion (TC).

V. RESULTS AND SIMULATIONS

In order to test the aforementioned methodologies and check the correctness of the forward and inverse kinematics computations we used the ninth version of the Robotics Toolbox, while to visualize the convex hulls and compute their volumes we used the Multi-Parametric Toolbox (MPT) [38].

The efficacy of the presented criteria of anthropomorphism is assessed with simulated paradigms in Fig. 2. All three criteria result in anthropomorphic configurations for the 18 DoF robot arm, in contrast to the no-criterion case.

In Fig. 3, Fig. 4 and Fig. 5, we perform an extensive comparison of the different criteria of anthropomorphism proposed, for 9, 18 and 27 DoF hyper-redundant robot manipulators (i.e., robot arms). It must be noted that, following the directions provided in [1], our methodology can be applied to robot arm hand systems that consist of hyper-redundant robot arms with an arbitrary number and type of DoFs and m-fingered hands (e.g., three-, four- or even six-fingered hands) with an arbitrary number and type of DoFs per finger.

It must be noted that all metrics perform satisfactorily in terms of achieving humanlike configurations for redundant and hyper-redundant robot arms, while their speed of execution is considerably high (the C++ implementations perform in "real time"). Generally we choose the "Distance Criterion", because we concluded that it provides the most anthropomorphic solutions (through qualitative assessment) and because it is the fastest method examined.

VI. DISCUSSION AND CONCLUSIONS

In this paper, we proposed a series of directions, methods and metrics that can be used to synthesize a complete human to robot motion mapping scheme that facilitates skill transfer from humans to robot artifacts with arbitrary kinematics (e.g., even for the case of hyper-redundant robot arms). Different human to robot motion mapping schemes are examined and proposed for different applications. A fast analytical scheme based on a forward/inverse kinematics approach for mapping human to robot arm motion, together with a joint-to-joint mapping for mapping human to robot hand motion, can be used for teleoperation/telemanipulation studies. A numerical scheme formulates mapping human to robot motion as a constrained non-linear optimization problem. A series of criteria of anthropomorphism can be incorporated in a composite objective function with position and orientation goals in order to derive humanlike robot motion and handle redundancies, or can be used to select the most anthropomorphic of all solutions computed using an analytical IK approach. These criteria minimize the structural dissimilarity between the human and the robotic artifact, while guaranteeing specific human-imposed task space functional constraints. The proposed methodology can be used for teleoperation or autonomous operation applications, where anthropomorphism is required and skill transfer must be achieved in a learn-by-demonstration manner.

Regarding future directions, the authors plan to map human to robot motion for the Mitsubishi PA10 and DLR/HIT II robot arm hand system in order to perform various telemanipulation tasks, extending the work presented in [23].

ACKNOWLEDGMENT

This work has been partially supported by the European Commission with the Integrated Project no. 248587, The Hand Embodied, within the FP7-ICT-2009-4-2-1 program Cognitive Systems and Robotics.

REFERENCES

[1] M. V. Liarokapis, P. K. Artemiadis, C. P. Bechlioulis, and K. J. Kyriakopoulos, "Mapping human to robot motion with functional anthropomorphism: The case of hyper redundant robot arm hand systems," in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), September 2014 (submitted).

[2] M. V. Liarokapis, P. K. Artemiadis, and K. J. Kyriakopoulos, "Functional anthropomorphism for human to robot motion mapping," in 21st IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), September 2012, pp. 31–36.

[3] J. Hong and X. Tan, "Calibrating a VPL dataglove for teleoperating the Utah/MIT hand," in IEEE International Conference on Robotics and Automation (ICRA), May 1989, pp. 1752–1757, vol. 3.

[4] T. H. Speeter, "Transforming human hand motion for telemanipulation," Presence: Teleoperators and Virtual Environments, vol. 1, no. 1, pp. 63–79, January 1992.

[5] R. Rohling, "Optimized fingertip mapping and hand modeling for telemanipulation," Master's thesis, McGill University, 1993.

[6] M. Fischer, P. van der Smagt, and G. Hirzinger, "Learning techniques in a dataglove based telemanipulation system for the DLR hand," in IEEE International Conference on Robotics and Automation (ICRA), 1998, pp. 1603–1608.

[7] H. Hu, X. Gao, J. Li, J. Wang, and H. Liu, "Calibrating human hand for teleoperating the HIT/DLR hand," in IEEE International Conference on Robotics and Automation (ICRA), vol. 5, April 2004, pp. 4571–4576.

[8] M. Ciocarlie, C. Goldfeder, and P. Allen, "Dexterous grasping via eigengrasps: A low-dimensional approach to a high-complexity problem," in Robotics: Science and Systems Manipulation Workshop - Sensing and Adapting to the Real World, 2007.


[9] ——, "Dimensionality reduction for hand-independent dexterous robotic grasping," in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), November 2007, pp. 3270–3275.

[10] L. Pao and T. Speeter, "Transformation of human hand positions for robotic hand control," in IEEE International Conference on Robotics and Automation (ICRA), vol. 3, May 1989, pp. 1758–1763.

[11] W. B. Griffin, R. P. Findley, M. L. Turner, and M. R. Cutkosky, "Calibration and mapping of a human hand for dexterous telemanipulation," in ASME IMECE 2000 Symposium on Haptic Interfaces for Virtual Environments and Teleoperator Systems, 2000.

[12] G. Gioioso, G. Salvietti, M. Malvezzi, and D. Prattichizzo, "Mapping synergies from human to robotic hands with dissimilar kinematics: an approach in the object domain," IEEE Transactions on Robotics, 2013.

[13] ——, "Mapping synergies from human to robotic hands with dissimilar kinematics: an object based approach," in IEEE ICRA 2011 Workshop on Manipulation Under Uncertainty, Shanghai, China, May 2011.

[14] D. Tolani, A. Goswami, and N. I. Badler, "Real-time inverse kinematics techniques for anthropomorphic limbs," Graphical Models and Image Processing, vol. 62, no. 5, pp. 353–388, September 2000.

[15] M. Shimizu, H. Kakuya, W.-K. Yoon, K. Kitagaki, and K. Kosuge, "Analytical inverse kinematic computation for 7-DoF redundant manipulators with joint limits and its application to redundancy resolution," IEEE Transactions on Robotics (TRO), vol. 24, no. 5, pp. 1131–1142, October 2008.

[16] P. K. Artemiadis, P. T. Katsiaris, and K. J. Kyriakopoulos, "A biomimetic approach to inverse kinematics for a redundant robot arm," Autonomous Robots, vol. 29, no. 3-4, pp. 293–308, November 2010.

[17] M. Engell-Nørregård and K. Erleben, "A projected back-tracking line-search for constrained interactive inverse kinematics," Computers & Graphics, vol. 35, no. 2, pp. 288–298, 2011.

[18] J. Zhao, Z. Xie, L. Jiang, H. Cai, H. Liu, and G. Hirzinger, "Levenberg-Marquardt based neural network control for a five-fingered prosthetic hand," in IEEE International Conference on Robotics and Automation (ICRA), April 2005, pp. 4482–4487.

[19] F. Zacharias, C. Schlette, F. Schmidt, C. Borst, J. Rossmann, and G. Hirzinger, "Making planned paths look more human-like in humanoid robot manipulation planning," in IEEE International Conference on Robotics and Automation (ICRA), May 2011, pp. 1192–1198.

[20] L. McAtamney and E. N. Corlett, "RULA: a survey method for the investigation of work-related upper limb disorders," Applied Ergonomics, vol. 24, no. 2, pp. 91–99, 1993.

[21] S. Albrecht, K. Ramirez-Amaro, F. Ruiz-Ugalde, D. Weikersdorfer, M. Leibold, M. Ulbrich, and M. Beetz, "Imitating human reaching motions using physically inspired optimization principles," in 11th IEEE-RAS International Conference on Humanoid Robots (Humanoids), October 2011, pp. 602–607.

[22] E. C. e Silva, F. Costa, E. Bicho, and W. Erlhagen, "Nonlinear optimization for human-like movements of a high degree of freedom robotics arm-hand system," in Proceedings of the 2011 International Conference on Computational Science and its Applications - Volume Part III, ser. ICCSA'11, 2011, pp. 327–342.

[23] M. V. Liarokapis, P. K. Artemiadis, and K. J. Kyriakopoulos, "Mapping human to robot motion with functional anthropomorphism for teleoperation and telemanipulation with robot arm hand systems," in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), November 2013.

[24] ——, "Quantifying anthropomorphism of robot hands," in IEEE International Conference on Robotics and Automation (ICRA), 2013.

[25] G. Stillfried and P. van der Smagt, "Movement model of a human hand based on magnetic resonance imaging (MRI)," in International Conference on Applied Bionics and Biomechanics (ICABB), 2010.

[26] E. P. Pitarch, "Virtual human hand: Grasping strategy and simulation," Ph.D. dissertation, Universitat Politècnica de Catalunya, 2007.

[27] H. Liu, K. Wu, P. Meusel, N. Seitz, G. Hirzinger, M. Jin, Y. Liu, S. Fan, T. Lan, and Z. Chen, "Multisensory five-finger dexterous hand: The DLR/HIT Hand II," in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), September 2008, pp. 3692–3697.

[28] N. Bompos, P. Artemiadis, A. Oikonomopoulos, and K. Kyriakopoulos, "Modeling, full identification and control of the Mitsubishi PA-10 robot arm," in IEEE/ASME International Conference on Advanced Intelligent Mechatronics, 2007, pp. 1–6.

[29] A. Poston, Human Engineering Design Data Digest: Human Factors Standardization Systems / Human Factors Standardization SubTAG. The Group, Washington, D.C., 2000.

[30] R. Diankov and J. Kuffner, "OpenRAVE: A planning architecture for autonomous robotics," Robotics Institute, Pittsburgh, PA, Tech. Rep. CMU-RI-TR-08-34, July 2008.

[31] R. Diankov, "Automated construction of robotic manipulation programs," Ph.D. dissertation, Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, September 2010.

[32] S. G. Johnson, "The NLopt nonlinear-optimization package," 2013. [Online]. Available: http://ab-initio.mit.edu/nlopt

[33] M. Powell, "A direct search optimization method that models the objective and constraint functions by linear interpolation," in Advances in Optimization and Numerical Analysis, ser. Mathematics and Its Applications, S. Gomez and J.-P. Hennart, Eds. Springer Netherlands, 1994, vol. 275, pp. 51–67.

[34] M. V. Liarokapis, P. K. Artemiadis, and K. J. Kyriakopoulos, "Telemanipulation with the DLR/HIT II robot hand using a dataglove and a low cost force feedback device," in IEEE Mediterranean Conference on Control and Automation (MED), June 2013.

[35] C. B. Barber, D. P. Dobkin, and H. Huhdanpaa, "The quickhull algorithm for convex hulls," ACM Transactions on Mathematical Software, vol. 22, no. 4, pp. 469–483, December 1996.

[36] J. Lawrence, "Polytope volume computation," Mathematics of Computation, vol. 57, no. 195, pp. 259–271, July 1991.

[37] A. Enge and K. Fukuda, "Exact volume computation for polytopes: a practical study," in Polytopes: Combinatorics and Computation (Oberwolfach), 1997, pp. 131–154.

[38] M. Kvasnica, P. Grieder, and M. Baotic, "Multi-Parametric Toolbox (MPT)," 2004. [Online]. Available: http://control.ee.ethz.ch/~mpt/


Fig. 3. Human to robot motion mapping using the joint positions distance minimization criterion for hyper-redundant robot arms with 9, 18 and 27 DoFs (columns), for RL = HL, RL = 0.9 HL and RL = 1.25 HL (rows). HL = Human Arm Length, RL = Robot Arm Length.


Fig. 4. Human to robot motion mapping using the joint positions convex hull minimization criterion for hyper-redundant robot arms with 9, 18 and 27 DoFs (columns), for RL = HL, RL = 0.9 HL and RL = 1.25 HL (rows). HL = Human Arm Length, RL = Robot Arm Length.


Fig. 5. Human to robot motion mapping using the joint positions triangles area minimization criterion for hyper-redundant robot arms with 9, 18 and 27 DoFs (columns), for RL = HL, RL = 0.9 HL and RL = 1.25 HL (rows). HL = Human Arm Length, RL = Robot Arm Length.