
Int J Soc Robot (2012) 4:97–105

DOI 10.1007/s12369-011-0131-x

Two-Hand-Arm Manipulation Based on Tri-axial Tactile Data


Masahiro Ohka · Sukarnur Che Abdullah · Jiro Wada · Hanafiah Bin Yussof

Accepted: 19 November 2011 / Published online: 21 December 2011


© Springer Science & Business Media BV 2011

Abstract Tactile sensing ability is important for social robots, which perform daily work in place of humans. The authors have developed a three-axis tactile sensor based on an optical measurement method. Since our optical three-axis tactile sensor can measure distributed tri-axial tactile data, a robot equipped with the tactile sensors can detect not only grasping force but also slippage on its hands. In this paper, the authors have two objectives: one is the evaluation of the three-axis tactile sensor in actual robotic tasks; the other is to demonstrate the effectiveness of tri-axial tactile data for motion control. To accomplish these objectives, the authors have developed a two-hand-arm robot equipped with three-axis tactile sensors. In the robot motion control, we implement a recurrent mechanism in which the next behavior is induced by the tactile data, to make the robot accept intentions embedded in the environment. Since this mechanism is based on the tactile data, it is easy to apply it to communication between the hand-arms to obtain the best timing for cooperative work. In a series of experiments, the two-hand-arm robot performed object-transfer and assembling tasks. Experimental results show that this programming based on tri-axial tactile data works well, because appropriate behaviors are induced according to the slippage direction.
Keywords Robot tactile systems · Tactile sensors · Tri-axial tactile data · Two-manipulator systems
M. Ohka (✉) · S.C. Abdullah · J. Wada
Graduate School of Information Science, Nagoya University,
Nagoya, Japan
e-mail: ohka@is.nagoya-u.ac.jp
H.B. Yussof
Faculty of Mechanical Engineering, Universiti Teknologi MARA,
Shah Alam, Malaysia

1 Introduction
Tactile sensation possesses a salient characteristic compared to vision and hearing in that it does not occur without interaction between sensory organs and objects. Touching an object induces deformation of both the object and the sensory organ. Since tactile sensing is important for a robot performing any task [1–3], we are developing an optical three-axis tactile sensor that measures distributed tri-axial tactile data using the principle of a uni-axial optical tactile sensor [4–7]. If a robot is equipped with tactile sensors on its fingers, it can detect not only grasping force but also slippage caused on its hands [8–10].
In this paper, the authors have two objectives: one is the evaluation of the three-axis tactile sensor in actual robotic tasks; the other is to demonstrate the effectiveness of tri-axial tactile data for motion control. To accomplish these objectives, the authors have developed a two-hand-arm robot equipped with three-axis tactile sensors on both hands. In the robot motion control, we implement a recurrent mechanism in which the next behavior is induced by the tactile data, to make the robot accept intentions embedded in the environment. Since this mechanism is based on the tri-axial tactile data, it is easy to apply it to communication between the hand-arms to obtain the best timing for cooperative work.
In the experiments, which were performed for these objectives, a small set of sensing-action programs was first produced to accomplish basic tasks such as object transfer and the assembly of two parts. In the former task, a fragile object is grasped by the left hand and transferred to the right hand. In the latter task, a cap grasped by the right hand is twisted onto a bottle grasped by the left hand. These tasks require cooperative work of the robot's two hand-arms, in which appropriate grasping force should be generated to prevent the object from being dropped or crushed.

Fig. 1 Two-hand-arm robot equipped with optical three-axis tactile sensors

Fig. 2 DOFs of two-hand-arm robot

The program modules included in the experimental program software are very simple; they adopt the tangential force vector and slippage information (defined as the time derivative of tangential force) as key signals that induce robot behaviors. Since each hand-arm determines its motion based on the above keys without
a wired connection between them, each is controlled by an
isolated controller. In a series of verification tests, we show that the above tasks are accomplished using the information from these keys.
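As a rough illustration of how such a slippage key could be computed, the following Python sketch estimates the time derivative of the tangential force by finite differences; the function name, vector representation and sampling period are our own assumptions, since the paper does not give an implementation:

    import numpy as np

    def slippage_key(ft_prev, ft_curr, dt=0.01):
        # Hypothetical helper: slippage dFt estimated as the finite-difference
        # time derivative of the tangential force magnitude over dt seconds.
        return (np.linalg.norm(ft_curr) - np.linalg.norm(ft_prev)) / dt

    # Example: tangential force growing from 0.10 N to 0.16 N within 10 msec
    print(slippage_key([0.10, 0.0], [0.16, 0.0]))  # -> 6.0 (N/sec)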

2 Two-Hand-Arm Robot Equipped with Optical Three-Axis Tactile Sensors
2.1 Robot Hardware
We adopted a two-hand-arm robot to accomplish the present
objectives. Figure 1 shows the two-hand-arm robot, which is
an advancement over the one-finger robot presented in a previous article [10]. Figure 2 shows the structure of the present
robot; the arm system's DOF is 5, and each finger's DOF is 3. To compensate for the lack of arm DOF, this robot uses its fingers' root joints as its wrist DOF (two joints of the finger mounted on the left hand are fixed). The actuator of each finger joint has the following specifications: maximum torque, 0.7 Nm, to generate a 10 N force; encoder resolution, 40,960
pulses/revolution.
On each fingertip, the robot has a novel optical three-axis tactile sensor (Fig. 3) that can obtain not only the normal force distribution but also the tangential force distribution. We presented this sensor in previous articles [8–10], which are helpful for its explanation. Whereas ordinary tactile sensors can detect only either the normal or the tangential force, this tactile sensor can detect both, so the robot can use the sensor information as an effective key to induce a specified behavior.

Fig. 3 Optical three-axis tactile sensor

A local coordinate is embedded on the sensing elements of the three-axis sensor attached to the fingertip. Figure 4 shows the relationship between the locations of the sensing elements and the local coordinate. Although the three-axis tactile sensor has 41 sensing elements, experimental results for specified sensing elements will be demonstrated in a subsequent section. Using a coordinate transformation, component slippage vectors are calculated with respect to the global coordinate O-XG YG ZG in Fig. 2.
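A minimal sketch of such a coordinate transformation is given below; the helper name and the rotation matrix R_fingertip are hypothetical, assuming the fingertip pose is available from the arm and finger kinematics:

    import numpy as np

    def to_global(slip_local, R_fingertip):
        # Transform a tri-axial slippage vector from the sensing element's
        # local coordinate to the global coordinate O-XG YG ZG.
        # R_fingertip: 3x3 rotation matrix of the fingertip pose, assumed
        # available from the arm/finger kinematics.
        return R_fingertip @ np.asarray(slip_local)

    # Example: a purely tangential slip, fingertip rotated 90 degrees
    # about the local z-axis.
    theta = np.pi / 2
    R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])
    print(to_global([1.0, 0.0, 0.0], R))  # -> approximately [0, 1, 0]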
2.2 Control Program
The architecture of the present control program resembles that of many ordinary behavior-based robots, such as the subsumption architecture (SSA) [11, 12], which is a hardware architecture that mimics the reflexive behavior of insects. However, since the present architecture is established as a software architecture to accomplish more complicated tasks than those performed by insects, it differs from SSA as follows.


Fig. 4 Local coordinate on sensor

Behaviors included in the present software have no marked priority, while behaviors in SSA have priorities. When tactile information obtained from the environment is input into this software, a command for a robot behavior is output, and the robot's actuators are controlled based on that command. After that, the environment changes due to the deformation and transformation caused by the robot's behavior, and the result is sent back as tactile information to the program software's input through a feedback loop. In this software, basic tasks such as pick up-and-place, cap twisting and object transfer are accomplished with several program modules, as described in the following sections.
In each module, a specific pattern of tactile data is related to a specific simple behavior, such as release or grasp, to realize a reflexive mapping from sensor input to behavior. This control-program concept resembles an expert system in artificial intelligence [13]. The difference between this program and an expert system is that in an expert system the fact database is stored inside a computer, whereas in this program the whole environment is treated as the fact database. In an action module, we can include a relatively complex procedure if we need one.
In our architecture, each module is programmed based on a simple algorithm. This is an advantage of the architecture, because each module can be checked individually. The timing of activating these modules is not decided in advance; a specific module is activated when its requirement on the input sensor data is satisfied.
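The following Python sketch renders this recurrent mechanism schematically (class and function names are ours, not the authors' code): modules carry no priority, each fires when its tactile-data requirement is met, and the actuated behavior changes the environment and thereby the next tactile input:

    class Module:
        def __init__(self, condition, action):
            self.condition = condition  # predicate over tri-axial tactile data
            self.action = action        # simple behavior command generator

    def control_loop(modules, read_tactile, actuate):
        while True:
            tactile = read_tactile()           # tri-axial tactile data
            for m in modules:                  # no priority ordering among modules
                if m.condition(tactile):       # input-data requirement satisfied
                    actuate(m.action(tactile)) # the behavior deforms the environment,
                                               # which feeds back as new tactile data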

3 Basic Experiment
3.1 Object Grasping
In this section, the simplest and most basic case is explained. Since the relationship between sensor input and action is complicated, we use a schematic diagram, as shown in Fig. 5, and use this style in subsequent sections. For simplicity, the object-grasping task is adopted as the most basic case.

Fig. 5 Object grasping task

First, the object-grasping task is a common problem for both one hand and two hands. Figure 5 shows a typical program module. To accomplish object-grasping tasks, we used two basic behaviors: (1) grasping force is enhanced in response to slippage to prevent instability in grasping an object; (2) grasping force is maintained, to prevent crushing, if no slippage occurs.
In subsequent sections, we will introduce several parameters, summarized in Table 1. The threshold values were adjusted, based on ad hoc trial and error, so that the robot could grasp several specimens, such as aluminum, wood, Styrofoam and paper dies of 30 mm per segment, and perform pick up-and-place.
Although we produced three program modules for the grasping task, the action contents change according to two modes: search mode and grasping mode. Before the hand touches an object, the search mode is valid and the grasping mode is not.
In the search mode, touching is judged through the time derivative of the tangential force caused on one element, because it is more sensitive than the time derivative of the normal force. This derivative, designated dFt, is measured on the element accepting the largest normal force among the 41 elements. The three modules' functions are as follows.


Table 1 Parameters used for experiments

Module 1: if dFt ≤ dr (dr is a threshold), the initial velocity v0 (magnitude: 2 mm/sec) is substituted into the fingertip velocity v. Module 2: if dFt > dr, it is judged that the first touch has occurred, and then the mode changes into the grasping mode. Module 3: if the normal force of even one element exceeds the normal-force threshold Fn max, the grasping-force enhancement is immediately stopped to prevent crushing the object and the sensing element itself.
In the grasping mode, the grasping force is changed if slippage occurs; otherwise, the current grasping force is maintained. Module 1: if dFt ≤ dr, zero velocity is substituted into the fingertip velocity v to maintain the current grasping force. Module 2: if dFt > dr, the re-push velocity cv0 is substituted into the fingertip velocity v, where c is determined by the object's hardness (soft: c = 0.25; medium: c = 0.5; hard: c = 0.6). Module 3: this is the same as Module 3 of the search mode.
During the 100 msec after the first touch, the robot measures the object's hardness (three levels: soft, medium, hard). The coefficient for the re-push velocity, c, is set to an appropriate value based on the maximum normal-force increase among the elements, which specifies the hardness of the object.
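A minimal sketch of these modules is given below, with hypothetical helper names; the threshold values dr and Fn_max and the hardness limits are left as parameters, since in the paper they come from Table 1 and ad hoc tuning:

    V0 = 2.0  # mm/sec, initial fingertip closing velocity (from the text)

    def hardness_coefficient(dFn_max_increase, soft_limit, hard_limit):
        # Select the re-push coefficient c from the maximum normal-force
        # increase measured during the 100 msec after first touch.
        if dFn_max_increase < soft_limit:
            return 0.25  # soft
        if dFn_max_increase < hard_limit:
            return 0.5   # medium
        return 0.6       # hard

    def search_mode(dFt, dr):
        # Modules 1-2 of the search mode: approach until first touch.
        if dFt <= dr:
            return V0, "search"     # Module 1: keep closing the fingers
        return 0.0, "grasping"      # Module 2: first touch -> grasping mode

    def grasping_mode(dFt, dr, c):
        # Modules 1-2 of the grasping mode: hold force, re-push on slippage.
        if dFt <= dr:
            return 0.0              # Module 1: maintain the current grasp
        return c * V0               # Module 2: enhance the grasp on slippage

    def overload(Fn_elements, Fn_max):
        # Module 3 (both modes): stop force enhancement if any element
        # exceeds the normal-force threshold Fn_max.
        return max(Fn_elements) > Fn_max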
With these modules, the robot can grasp not only a hard object but also a very fragile object, such as a paper die, without suffering from insufficient grasping force. Since around seven tactile elements touch an object under common conditions and slippage starts from the outer elements, the robot can enhance its grasping force to prevent fatal slippage. Consequently, the robot can grasp an object with the minimum grasping force needed to avoid crushing it.
Since the rubber sensing element acts as a nonskid pad, the hand can grasp a light object with even one element in contact on each finger, except in cases of changing weight, such as pouring water into a paper cup.

3.2 Cap-Twisting Task


The cap-twisting task is performed based on Modules 1, 2 and 3 described in the previous section. An initial trajectory for the fingers is applied to the robot to start cap twisting, and the termination condition of the cap-twisting task must be determined. For these behaviors, we added two modules to the previous modules in Fig. 5.
Module 4 provides the initial cap-twisting motion, in which the fingers follow the square trajectories shown in the inset of Fig. 6. This behavior is repeated until it is terminated by Module 5, defined below. Since the finger trajectory intersects the cap's contour, the fingertips slip on its surface. If slippage occurs, an increment of the shearing force is observed. At this moment, Module 2 of the grasping mode is activated to enhance the grasping force.
Module 5 decides when to terminate the twisting task. Empirically, we know that much slippage occurs when the cap is tightened. If the increment of the shearing force on an element, dFt, exceeds 1.4 times dFt max, the maximum observed during the first touch, the finger motion is terminated. In the authors' previous work [14], this termination rule was not included.
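A sketch of this termination rule, under the same hedged assumptions as the previous listings (only the factor 1.4 is taken from the text):

    def twisting_terminated(dFt, dFt_max):
        # Module 5: terminate the finger motion when the shearing-force
        # increment exceeds 1.4 times the maximum observed at first touch.
        return dFt > 1.4 * dFt_max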
The fingertip trajectory is shown in Fig. 6. The modification of the initial trajectory saturates after the cap is closed. Although the initial finger trajectory is a rough rectangle determined to touch and turn the cap, a segment of it is changed from a straight line to a curved line to fit the cap's contour. The curved line is obtained through this task.
3.3 Pick up and Place
Since the pick-up task is accomplished by the object grasping explained in Fig. 5, a releasing-object behavior is added



Fig. 6 Modified trajectory based on interaction

Fig. 7 Picking up and placing task

to accomplish the placing task. Due to this, Modules 4 and 5 for the cap-twisting mode are replaced by new Modules 6 and 7.
Module 6 conveys an object and sets it down at a specified planar position after picking it up from a specified position. The setting-down is terminated by the activation of Module 7, described below.
Module 7 releases the object when the tactile sensors
catch upward slippage. If the robotic hand feels upward slippage while moving down, the object's bottom has touched a table or the floor. In this module, the upward slippage induces opening of the hand. Using this program, the
pick up-and-place task shown in Fig. 7 is accomplished.
4 Task Accomplished by Tri-axial Tactile Data

4.1 Object Transfer
When an object grasped by one hand of a robot is transferred to the other hand, both hands must adjust their grasping forces. When a person transfers an object from one hand to the other, one hand increases its grasping force while the other reduces it. In this study, we attempt to achieve this type of handover between the two hand-arms of a robot.
The object-transfer task is performed as an application of the pick up-and-place task mentioned in the preceding section. While the key for releasing an object is defined as the direction of the reaction from the floor in the pick up-and-place task, we improve Module 7 of the pick up-and-place task to specify an arbitrary direction of the reaction vector d for this object-transfer task.
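One plausible way to realize such a direction-sensitive release key is to test the alignment of the global slippage vector with d, e.g. with a dot product. In the sketch below, the magnitude and cosine thresholds are our own assumptions, since the paper states only that the reaction direction is made arbitrary:

    import numpy as np

    def release_triggered(slip_global, d, min_magnitude, min_cos=0.9):
        # Generalized Module 7 sketch: release when the global slippage
        # vector is large enough and aligned with the expected reaction
        # direction d (thresholds min_magnitude and min_cos are assumed).
        slip = np.asarray(slip_global, dtype=float)
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        mag = np.linalg.norm(slip)
        if mag < min_magnitude:
            return False
        return float(slip @ d) / mag >= min_cos

    # Pick up-and-place: the reaction from the floor is upward (+ZG);
    # object transfer: the reaction from the receiving hand is horizontal.
    print(release_triggered([0.0, 0.0, 0.8], d=[0.0, 0.0, 1.0],
                            min_magnitude=0.5))  # -> True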
In this experiment, the robot grasps an object (a paper die; each segment is 30 mm) at point A with the left hand and puts it at point B. Points A and B are located at the tops of posts. The planar positions of A and B are programmed into the robot, while the vertical position of B is not provided. Figure 8 shows the relationship between the modules of the left and right hands.
After picking up the object with the left hand, the robot passes it to the right hand. At that moment, Modules 6 and 7 of the pick up-and-place mode for the right and left hands work as shown in Fig. 8(a). After that, the right hand places the object according to the module activation shown in Fig. 8(b). This task proceeded as shown in Fig. 9.
To illustrate this successive task, the variation in normal force applied to each sensing element is shown in Fig. 10, for 17 of the 41 sensing elements, because these elements usually touch the object. In this figure, the variation in normal force of each selected element is shown, and the selected element numbers appear in the figure. At around 7 sec, the normal force abruptly increases in Fig. 10(a) because the left hand grasps the object. At around 40 sec, the normal force abruptly increases in Fig. 10(b) as well, because the right hand grasps the object.
Next, for the variation in slippage (the time derivative of tangential force in the global coordinate, obtained from the element causing the largest normal force), we select one finger from each hand and exemplify their variations as shown in Fig. 11. Since the finger position in the local hand coordinate (the open-and-close axis) is shown by a solid line, the opening and closing of the hand are indicated by the decrease and increase of the solid line. At around 40 seconds, the right hand grasps and retrieves the object, and horizontal slippage is generated on the left hand. This horizontal slippage induces a releasing motion of the left hand. After that, on the right hand, upward slippage occurs as the bottom of the object contacts the end of post B.


Fig. 8 Object-transfer task

Fig. 9 Sequential pictures in object-transfer task (http://ns1.ohka.cs.is.nagoya-u.ac.jp/ohka_lab2/images/Passing(short).wmv)

Fig. 10 Variation in normal force of each sensor element

Fig. 11 Observed slippage during object transfer task on sensor element accepting largest normal force

4.2 Assembly

If all the modules described in the previous sections are combined, the robot can perform simple assembly tasks. In this experiment, the right and left hands first grasp a cap and a bottle-like object, respectively. The cap and bottle are made of plastic and wood, respectively; their sizes are 32 × 14 mm and 30 × 30 × 93 mm, and their masses are around 2.5 g and 32.8 g. The cap and bottle were set in place by a tester before the start of this experiment. Then, the left hand approaches the right hand to mate the screw thread of the bottle with the cap. After that, the twisting task is started in the same manner as the twisting task described in Sect. 3. After screwing is completed, the left hand moves the assembled bottle to the home position. At that time, horizontal slippage occurs and induces release of the cap.
Figure 12 shows the chain reaction of modules during this assembly task, with respect to the important modules. In this chain reaction, the XG-, YG-, and ZG-directional slippages become keys for specified behaviors. These directions are defined in Fig. 3. Using this program software, the assembly task is completed as shown in Fig. 13. This assembly in the air using the two-hand-arm robot is an advance over our previous work [14].

Fig. 12 Chain reaction of modules
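The chain reaction can be pictured as a dispatch table from directional slippage keys to behaviors. The following schematic rendering paraphrases Fig. 12 and the surrounding text; the key and behavior names are ours, not the authors' code:

    # Directional slippage key observed on a hand -> induced behavior.
    CHAIN = {
        ("left hand",  "ZG slippage"):         "bottle met cap: proceed with screwing",
        ("left hand",  "+YG slippage"):        "cap tightened: move bottle to home position",
        ("right hand", "horizontal slippage"): "bottle withdrawn: release the cap",
    }

    def next_behavior(hand, slippage_key):
        # Return the behavior induced by a directional slippage key,
        # or None when no module's requirement is satisfied.
        return CHAIN.get((hand, slippage_key))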
To show this task succession, we select one finger from the left hand and exemplify the variations as shown in Fig. 14, which shows the variation in slippage during the interaction between the right and left hands. In Fig. 14(a), ZG-directional slippage occurs at around 45 seconds. At this time, the mouth of the bottle hits the cap grasped by the right hand. Since the finger position in the local hand coordinate (the axis perpendicular to the open-and-close axis) is shown by a solid line, the hand movement can be observed from the solid line.

The reaction causes ZG-directional slippage on the fingertips of the left hand. At around 104 seconds in Fig. 14(b), +YG-directional slippage occurs, caused by the tightening of the cap. Since the right hand screws on the cap, information about the completion of screwing is transmitted from the right hand to the left hand through this +YG-directional slippage. The former slippage causes the behavior in which the right hand screws the cap; the latter causes the behavior in which the left hand moves the assembled bottle to the home position.

In the passing-object test, the intention of the right hand's taking the object is acquired by the left hand through the slippage. In the assembling test, the former slippage intends to make the right hand screw the cap, and the latter slippage intends to make the left hand return the assembled bottle to the home position. Since the present tactile sensor can distinguish the direction of slippage, the intentions embedded in the slippage direction are acquired through the tactile sensor to perform the passing-object and assembly tasks.

Since the present robot does not have a vision system, it cannot perform any task without a priori knowledge. For example, the global coordinates (XG, YG) of the cap grasped by the right hand are provided in advance in the assembly task, when the task is begun. However, coordinate ZG is obtained by hitting the bottle against the cap.

Fig. 13 Sequential pictures in assembly task (http://ns1.ohka.cs.is.nagoya-u.ac.jp/ohka_lab2/images/Assy(short).wmv)

Fig. 14 Observed slippage in left-hand finger during assembly task

5 Conclusion

To evaluate the three-axis tactile sensor in actual robotic tasks and to demonstrate the effectiveness of tri-axial tactile data for motion control, we implemented program software that induces the behavior of a robot from the tri-axial tactile information acquired by the two-hand-arm robot. After each rule is transformed into an algorithm, a program module is coded based on that algorithm. The program software is composed of several program modules, and a module is selected from the set of modules based on sensor information. We established program software composed of 3 to 7 modules to accomplish tasks such as object grasping, picking up and placing, cap screwing, and assembling. The two-hand-arm robot equipped with the optical three-axis tactile sensors performed the above tasks. Since the logged tri-axial tactile data show several keys for specified behaviors, the effectiveness of the present tactile sensor is confirmed.
In the tasks performed in this study, a series of reflexive responses is accomplished based on tri-axial tactile data. In this series, since a short interval is required between behaviors, the task is not always performed smoothly. Although excluding this idle period is difficult, because the basic behavior is switched based on tactile data, we will treat this issue in future work by increasing the processing speed of the tactile information.

The robot cannot assemble the cap and bottle if there is misalignment. Furthermore, if an unknown external force coincides with a sensory key inducing a specific behavior, the robot might demonstrate incorrect behavior. To overcome these problems, we will incorporate visual keys for specified behaviors into the present software in future work.

References
1. Nicholls HR, Lee MH (1989) A survey of robot tactile sensing technology. Int J Robot Res 8(3):3–30
2. Göger D, Gorges N, Wörn H (2009) Tactile sensing for an anthropomorphic robotic hand: hardware and signal processing. In: 2009 IEEE int conf on robotics and automation, pp 895–901
3. Ho VA, Dao DV, Sugiyama S, Hirai S (2009) Analysis of sliding of a soft fingertip embedded with a novel micro force/moment sensor: simulation, experiment, and application. In: 2009 IEEE int conf on robotics and automation, pp 889–894
4. Mott H, Lee MH, Nicholls HR (1984) An experimental very-high-resolution tactile sensor array. In: Proc 4th int conf on robot vision and sensory control, pp 241–250
5. Tanie K, Komoriya K, Kaneko M, Tachi S, Fujiwara A (1986) A high-resolution tactile sensor array. In: Robot sensors vol 2: Tactile and non-vision. IFS, Kempston, pp 189–198
6. Kaneko M, Maekawa H, Tanie K (1992) Active tactile sensing by robotic fingers based on minimum-external-sensor-realization. In: 1992 IEEE int conf on robotics and automation, pp 1289–1294
7. Maekawa H, Tanie K, Komoriya K, Kaneko M, Horiguchi C, Sugawara T (1992) Development of a finger-shaped tactile sensor and its evaluation by active touch. In: 1992 IEEE int conf on robotics and automation, pp 1327–1334
8. Ohka M, Mitsuya Y, Matsunaga Y, Takeuchi S (2004) Sensing characteristics of an optical three-axis tactile sensor under combined loading. Robotica 22(2):213–221
9. Ohka M, Kobayashi H, Takata J, Mitsuya Y (2008) An experimental optical three-axis tactile sensor featured with hemispherical surface. J Adv Mech Des Syst Manuf 2(5):860–873
10. Ohka M, Takata J, Kobayashi H, Suzuki H, Morisawa N, Yussof HB (2009) Object exploration and manipulation using a robotic finger equipped with an optical three-axis tactile sensor. Robotica 27:763–770
11. Brooks RA (1986) A robust layered control system for a mobile robot. IEEE J Robot Autom 2(1):14–23
12. Kube CR, Zhang H (1993) Controlling collective tasks with an ALN. In: IEEE/RSJ int conf on intelligent robots and systems, pp 289–293
13. Winston PH (1984) Artificial intelligence, 2nd edn. Addison-Wesley, Reading, pp 159–204
14. Ohka M, Morisawa N, Yussof HB (2009) Trajectory generation of robotic fingers based on tri-axial tactile data for cap screwing task. In: 2009 IEEE int conf on robotics and automation, pp 883–888

Masahiro Ohka is a Professor in the Graduate School of Information Science at Nagoya University. In 1986, he received a Doctorate in Engineering from Nagoya University. He was a Researcher at Fuji Electric Company, working to produce three-axis tactile sensors for an advanced robot of the Japanese National Project promoted by MITI. His research focuses on the areas of robotic tactile sensors, complex systems science, human-robot interaction, and behavior-based robotics.
Sukarnur Che Abdullah is a postgraduate student in the Doctorate Course of the Graduate School of Information Science, Nagoya University. He received an M.Sc. in Manufacturing System Engineering from University Putra Malaysia in 2005. He has been an academic staff member and an assistant researcher at Universiti Teknologi MARA, Malaysia, since 2005 and is currently on leave to pursue his Ph.D. His research focuses on the areas of robotic tactile sensors, humanoid robots and robot vision sensors.
Jiro Wada is an employee at Mitsubishi Heavy Industries. In 2009, he received a Master's Degree in Engineering from Nagoya University.
Hanafiah Bin Yussof is a Senior Lecturer at MARA University of Technology (UiTM). In 2008, he received a Doctorate in Information Science from Nagoya University. His research focuses on the areas of humanoid robotics, tactile sensors and behavior-based robotics.
