DOI 10.1007/s12369-011-0131-x
1 Introduction
Tactile sensation possesses a salient characteristic compared to vision and hearing: it does not occur without interaction between the sensory organ and an object. Touching an object deforms both the object and the sensory organ. Since tactile sensing is important for robots performing almost any task [1-3], we are developing an optical three-axis tactile sensor that measures distributed tri-axial tactile data using the principle of a uni-axial optical tactile sensor [4-7]. If a robot is equipped with tactile sensors on its fingers, it can detect not only grasping force but also slippage occurring on its hands [8-10].
In this paper, the authors have two objectives: one is to evaluate the three-axis tactile sensor in actual robotic tasks; the other is to demonstrate the effectiveness of tri-axial tactile data for motion control. To accomplish these objectives, the authors have developed a two-hand-arm robot equipped with three-axis tactile sensors on both hands. In the robot's motion control, we implement a recurrent mechanism in which the next behavior is induced by the tactile data, so that the robot accepts the intention embedded in the environment. Since this mechanism is based on the tri-axial tactile data, it is easy to apply it to communication between the hand-arms to obtain the best timing for cooperative work.
In the experiments, which were performed for these objectives, a small set of sensing-action programs was first produced to accomplish basic tasks such as object transfer and assembly of two parts. In the former task, a fragile object is grasped by the left hand and transferred to the right hand. In the latter task, a cap grasped by the right hand is twisted onto a bottle grasped by the left hand. These tasks require cooperative work of the robot's two hand-arms, in which appropriate grasping force should be generated to prevent dropping or crushing the object. The program modules included in the experimental software are very simple; they adopt the tangential force vector and slippage information (defined as the time derivative of tangential force) as key signals that induce robot behaviors. Since each hand-arm determines its motion based on the above keys without a wired connection between them, each is controlled by an isolated controller. In a series of verification tests, we show that the above tasks are accomplished by the information of these keys.
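The key signals above can be made concrete with a short sketch. Assuming each sensing element reports a tangential force vector (fx, fy) at a fixed sampling period, the slippage key (time derivative of tangential force) might be computed as follows; the function names and the backward-difference discretization are our assumptions, not the authors' implementation:

```python
import math

def tangential_force(fx, fy):
    # Magnitude of the tangential force vector on one sensing element.
    return math.hypot(fx, fy)

def slippage(ft_prev, ft_curr, dt):
    # Slippage key: time derivative of tangential force, dFt/dt,
    # approximated by a backward difference over one sampling period dt.
    return (ft_curr - ft_prev) / dt

def key_fired(ft_prev, ft_curr, dt, dr):
    # A behavior is induced when the slippage exceeds the threshold dr.
    return slippage(ft_prev, ft_curr, dt) > dr
```

Because each hand-arm evaluates this key from its own sensor readings only, no wired connection between the controllers is needed.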
3 Basic Experiment
3.1 Object Grasping
In this section, the simplest and most basic case is explained. Since the relationship between sensor input and action is complicated, we use a schematic diagram as shown in Fig. 5, and use it in subsequent sections. For simplicity, the object-grasping task is adopted as the most basic case.
the following. Module 1: if dFt ≤ dr (dr is a threshold), the initial velocity v0 (magnitude 2 mm/s) is substituted into the fingertip velocity v. Module 2: if dFt > dr, it is judged that the first touch has occurred, and the mode changes into grasping mode. Module 3: if the normal force of even one element exceeds the normal-force threshold Fn max, grasping-force enhancement is immediately stopped to prevent crushing the object and the sensing element itself.
In the grasping mode, the grasping force is changed if slippage occurs; otherwise the current grasping force is maintained. Module 1: if dFt ≤ dr, zero velocity is substituted into the fingertip velocity v to maintain the current grasping force. Module 2: if dFt > dr, the re-push velocity cv0 is substituted into the fingertip velocity v, where c is determined by the hardness (soft: c = 0.25, medium: c = 0.5, hard: c = 0.6). Module 3: this is the same as Module 3 of the search mode.
During the first 100 ms after the first touch, the robot measures the object's hardness (three levels: soft, medium, hard). The coefficient c for the re-push velocity is set to an appropriate value based on the maximum normal-force increase among the elements, which specifies the hardness of the object.
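The search and grasping modules described above amount to a small state machine. The sketch below shows one way to express them; the class name, the normal-force limit value, and the units are our assumptions, while the threshold dr, v0 = 2 mm/s, and the coefficients c come from the text. In the paper, c is selected during the first 100 ms after touch; here it is fixed at construction for brevity:

```python
# Hardness-dependent re-push coefficients, as given in the text.
C_BY_HARDNESS = {"soft": 0.25, "medium": 0.5, "hard": 0.6}

class GraspController:
    def __init__(self, dr, v0=2.0, fn_max=5.0, hardness="medium"):
        self.dr = dr            # slippage threshold on dFt
        self.v0 = v0            # initial fingertip speed, mm/s
        self.fn_max = fn_max    # normal-force limit (assumed value)
        self.c = C_BY_HARDNESS[hardness]
        self.mode = "search"

    def step(self, dFt, fn_largest):
        # Module 3 (both modes): stop enhancing grasping force if any
        # element's normal force exceeds the limit, protecting the
        # object and the sensing element itself.
        if fn_largest > self.fn_max:
            return 0.0
        if self.mode == "search":
            if dFt <= self.dr:          # Module 1: keep closing at v0
                return self.v0
            self.mode = "grasping"      # Module 2: first touch detected
            return 0.0
        # Grasping mode.
        if dFt <= self.dr:              # Module 1: hold current force
            return 0.0
        return self.c * self.v0         # Module 2: re-push at c*v0
```

The returned value is the fingertip velocity v commanded at each sampling step.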
With these modules, the robot can grasp not only a hard object but also a very fragile object, such as a paper die, without applying insufficient grasping force. Since around seven tactile elements touch an object under common conditions and slippage begins at the outer elements, the robot can enhance the grasping force before fatal slippage occurs. Consequently, the robot can grasp an object with the minimum grasping force that avoids crushing it.
Since the rubber sensing element acts as a nonskid pad, the hand can grasp a light object with even one element in contact on each finger, except in cases of changing weight, such as pouring water into a paper cup.
When an object grasped by one hand of a robot is transferred to the other hand, both hands must adjust their grasping forces. When a person transfers an object from one hand to the other, one hand increases its grasping force while the other reduces it. In this study, we attempt to achieve this type of handover between the two hand-arms of a robot.
The object-transfer task is performed as an application of the pick-up-and-place task mentioned in the preceding section. While the key for releasing an object is defined as the direction of the reaction from the floor in the pick-up-and-place task, we improve Module 7 of the pick-up-and-place task to specify an arbitrary direction of the reaction vector d for this object-transfer task.
In this experiment, the robot grasps an object (a paper die; each edge is 30 mm) at point A with the left hand and puts it at point B. Points A and B are located at the ends of posts. The planar positions of A and B are programmed into the robot, while the vertical position of B is not provided. Figure 8 shows the relationship between the modules of the left and right hands.
After picking up the object with the left hand, the robot passes it to the right hand. At that moment, Modules 6 and 7 of the pick-up-and-place mode for the right and left hands work as shown in Fig. 8(a). After that, the right hand places the object according to module activation as shown in Fig. 8(b). This task proceeded as shown in Fig. 9.
To illustrate this task sequence, the variation in the normal force applied to each sensing element is shown in Fig. 10, which presents 17 of the 41 sensing elements because these elements usually touch the object. In this figure, the variation in normal force of each selected element is shown, and the selected element numbers appear in the figure. At around 7 s, the normal force abruptly increases in Fig. 10(a) because the left hand grasps the object. At around 40 s, the normal force abruptly increases in Fig. 10(b) as well because the right hand grasps the object.
Next, for the variation in slippage (the time derivative of tangential force in the global coordinate system, obtained from the element bearing the largest normal force), we select one finger from each hand and show the variations in Fig. 11. Since the finger position along the local hand coordinate (the open-and-close axis) is shown by a solid line, opening and closing of the hand correspond to the decrease and increase of the solid line. At around 40 s, the right hand grasps and retrieves the object, and horizontal slippage is generated on the left hand. This horizontal slippage induces a releasing motion of the left hand. After that, on the right hand, upward slippage occurs when the bottom of the object contacts the end of post B.
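The handover logic above reads as a mapping from slippage direction to behavior. A minimal sketch follows, assuming the slippage vector is expressed in the global frame; the function names, threshold value, and the axis conventions (X horizontal, Z vertical upward) are our guesses, not the authors' definitions:

```python
def dominant_slip(slip_vec, threshold=0.05):
    # Return the dominant global-axis slippage direction, e.g. "+X",
    # or None if the slippage magnitude is below the threshold.
    # slip_vec is (dFx/dt, dFy/dt, dFz/dt) for the element bearing
    # the largest normal force.
    if max(abs(v) for v in slip_vec) < threshold:
        return None
    axis = max(range(3), key=lambda i: abs(slip_vec[i]))
    return ("+" if slip_vec[axis] > 0 else "-") + "XYZ"[axis]

def left_hand_behavior(slip):
    # Horizontal slippage on the left hand means the right hand is
    # retrieving the object, which induces a release.
    return "release" if slip in ("+X", "-X") else "hold"

def right_hand_behavior(slip):
    # Upward slippage on the right hand means the object bottom has
    # contacted the end of post B, which induces placing.
    return "place" if slip == "+Z" else "hold"
```

Each hand evaluates only its own sensor data, so the "communication" between the two controllers happens entirely through the shared object.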
4.2 Assembly
If all the modules described in the previous sections are combined, the robot can perform simple assembly tasks. In this experiment, the right and left hands first grasp a cap and a bottle-like object, respectively. The cap and bottle are made of plastic and wood, respectively; their sizes are 32 × 14 mm and 30 × 30 × 93 mm, and their masses are around 2.5 g and 32.8 g. The cap and bottle were set in place by a tester before the start of the experiment. Then, the left hand approaches the right hand to mate the screw thread of the bottle with the cap. After that, the twisting task is started in the same manner as the twisting task described in Sect. 3. After screwing is completed, the left hand moves the assembled bottle to the home position. At that time, horizontal slippage occurs, and it induces release of the cap.
Fig. 11 Observed slippage during the object-transfer task on the sensor element accepting the largest normal force
Figure 12 shows the chain reaction of modules during this assembly task, with respect to the important modules. In this chain reaction, the XG-, YG-, and ZG-directional slippages become keys for specified behaviors. These directions are defined in Fig. 3. Using this program software, the assembly task is completed as shown in Fig. 13. This assembly in the air using the two-hand-arm robot is an advance over our previous work [14].
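The chain reaction in Fig. 12 can be viewed as a lookup from slippage keys to the next behavior. A hypothetical dispatch table follows; the entries paraphrase the chain described in the text, and all identifiers are ours:

```python
# Each detected slippage key triggers the next behavior in the chain.
CHAIN = {
    "ZG": "right_hand_screws_cap",     # bottle hit against the cap
    "+YG": "left_hand_returns_home",   # cap fully tightened
}

def next_behavior(slip_key):
    # Unrecognized keys leave the current behavior running.
    return CHAIN.get(slip_key, "continue_current_behavior")
```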
Next, for the variation in slippage (the time derivative of tangential force in the global coordinate system), we select one finger from each hand and show the variations in Fig. 14. At around 40 s, the right hand grasps and retrieves the object, and horizontal slippage is generated on the left hand. This horizontal slippage induces a releasing motion of the left hand.
To show this task succession, we select one finger from the left hand and show the variations in Fig. 14. The reaction from hitting the bottle against the cap causes ZG-directional slippage on the fingertips of the left hand. At around 104 s in Fig. 14(b), +YG-directional slippage occurs, caused by tightening of the cap. Since the right hand screws on the cap, information about the completion of screwing is transmitted from the right hand to the left hand through this +YG-directional slippage. The former slippage causes the behavior in which the right hand screws the cap; the latter causes the behavior in which the left hand moves the assembled bottle to the home position.
In the passing-object test, the intention of the right hand's taking the object is acquired by the left hand through slippage. In the assembly test, the former slippage serves to make the right hand screw on the cap, and the latter slippage serves to make the left hand return the assembled bottle to the home position. Since the present tactile sensor can distinguish the direction of slippage, intentions embedded in the slippage direction are acquired through the tactile sensor to perform the passing-object and assembly tasks.
Since the present robot does not have a vision system, it cannot perform any task without a priori knowledge. For example, the global coordinates (XG, YG) of the cap grasped by the right hand are provided in advance when the assembly task is begun. However, coordinate ZG is obtained by hitting the bottle against the cap.
5 Conclusion
References
1. Nicholls HR, Lee MH (1989) A survey of robot tactile sensing technology. Int J Robot Res 8(3):3-30
2. Göger D, Gorges N, Wörn H (2009) Tactile sensing for an anthropomorphic robotic hand: hardware and signal processing. In: 2009 IEEE int conf on robotics and automation, pp 895-901
3. Ho VA, Dao DV, Sugiyama S, Hirai S (2009) Analysis of sliding of a soft fingertip embedded with a novel micro force/moment sensor: simulation, experiment, and application. In: 2009 IEEE int conf on robotics and automation, pp 889-894
4. Mott H, Lee MH, Nicholls HR (1984) An experimental very-high-resolution tactile sensor array. In: Proc 4th int conf on robot vision and sensory control, pp 241-250
5. Tanie K, Komoriya K, Kaneko M, Tachi S, Fujiwara A (1986) A high-resolution tactile sensor array. In: Robot sensors vol 2: Tactile and non-vision. IFS, Kempston, pp 189-198
6. Kaneko M, Maekawa H, Tanie K (1992) Active tactile sensing by robotic fingers based on minimum-external-sensor-realization. In: 1992 IEEE int conf on robotics and automation, pp 1289-1294
7. Maekawa H, Tanie K, Komoriya K, Kaneko M, Horiguchi C, Sugawara T (1992) Development of a finger-shaped tactile sensor and its evaluation by active touch. In: 1992 IEEE int conf on robotics and automation, pp 1327-1334
8. Ohka M, Mitsuya Y, Matsunaga Y, Takeuchi S (2004) Sensing characteristics of an optical three-axis tactile sensor under combined loading. Robotica 22(2):213-221
9. Ohka M, Kobayashi H, Takata J, Mitsuya Y (2008) An experimental optical three-axis tactile sensor featured with hemispherical surface. J Adv Mech Des Syst Manuf 2(5):860-873
10. Ohka M, Takata J, Kobayashi H, Suzuki H, Morisawa N, Yussof HB (2009) Object exploration and manipulation using a robotic finger equipped with an optical three-axis tactile sensor. Robotica 27:763-770
11. Brooks RA (1986) A robust layered control system for a mobile robot. IEEE J Robot Autom 2(1):14-23
12. Kube CR, Zhang H (1993) Controlling collective tasks with an ALN. In: IEEE/RSJ int conf on intelligent robots and systems, pp 289-293
13. Winston PH (1984) Artificial intelligence, 2nd edn. Addison-Wesley, Reading, pp 159-204
14. Ohka M, Morisawa N, Yussof HB (2009) Trajectory generation of robotic fingers based on tri-axial tactile data for cap screwing task. In: 2009 IEEE int conf on robotics and automation, pp 883-888