Wireless Integration of Tactile Sensing on the Hand of a Humanoid Robot NAO
Liya Grace Ni, Senior Member, IEEE, David P. Kari, Member, IEEE, Alex Muganza, Member, IEEE,
Bertrand Dushime, Member, IEEE, and Andre N. Zebaze
2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication. September 9-13, 2012. Paris, France.

Abstract — The new-generation humanoid robots, such as
the NAO robot made by Aldebaran Robotics, are characterized
by autonomous learning and close interaction with the
environment including humans. While equipped with advanced
vision and audio features including object and face recognition,
speech recognition, sound source recognition, speech synthesis,
etc., the NAO robot’s tactile sensing is limited to several
buttons that can be used to trigger associated actions. This
paper presents the wireless integration of tactile sensing on the
hand of a NAO robot. Without any replacement or
modification of its existing hardware, the add-on allows the
NAO robot to differentiate objects with similar size, color and
shape but different weight, stiffness, or texture.
I. INTRODUCTION
The “sense of touch” is particularly important among
various modalities that are needed to perceive and react to the
dynamics of the real world. It allows the assessment of object
properties such as size, shape, weight, stiffness, and texture.
The new-generation humanoid robots are characterized by
autonomous learning and close interaction with their
environments including humans. Tactile sensing, combined
with vision and/or audio in many cases, enhances
multisensory perception of humanoid robots. Considerable
research has been conducted on robot tactile sensing [1]-[7],
especially on humanoid robots [5]-[7]. The advancements in
intelligent humanoid robots provide new opportunities and
challenges for tactile sensing, for example, the smaller finger
area compared to that of robot manipulators, the coordination
between vision, touch, and other senses, and the requirement
that the touch sensory system not interfere with mobility.
The humanoid robot NAO (shown in Fig. 1), made by
Aldebaran Robotics, has been used by more than 350
universities and research labs around the world for a wide
variety of research topics in robotics as well as computer
science, human-machine interaction, and social sciences.
While equipped with advanced vision and audio features
including object and face recognition, speech recognition,
sound source recognition, speech synthesis, etc., the NAO
robot’s tactile sensing is limited to several buttons that can be
used to trigger associated actions. The main objective of our
project is to enhance the NAO robot’s perception and
intelligence by giving it the capability of identifying different
objects by their weight, stiffness and texture.

L. G. Ni is an associate professor at the Electrical and Computer
Engineering Department, College of Engineering, California Baptist
University, Riverside, CA 92504 USA (corresponding author, phone:
951-343-4470; fax: 951-343-4782; e-mail: gni@calbaptist.edu).
D. P. Kari recently received his BS ECE degree from California Baptist
University. He is currently a Master EE degree candidate at the Bourns
College of Engineering, University of California - Riverside, CA 92521
USA (e-mail: David.Kari@calbaptist.edu).
A. Muganza, B. Dushime, and A. Zebaze are recent BS ECE, BS ECE, and
BS ME graduates from California Baptist University (e-mail:
Alex.Muganza@calbaptist.edu, Bertrand.Dushime@calbaptist.edu,
Andre.Zebaze@calbaptist.edu).
Figure 1. The humanoid robot NAO.
Several features of the NAO robot make it appealing to
our research. Choregraphe, the programming software developed
by Aldebaran Robotics for the NAO robot, provides an intuitive
graphical interface that allows easy programming of the behaviors
required for active exploration of objects through touch.
For instance, the robot needs to grasp the object
with one hand and stroke the object surface with the other
hand in order to identify the surface texture. The built-in
speech recognition and speech synthesis capabilities of the
NAO robot make the development more interactive, as the
robot asks questions about the object it is touching or tells the
name of the object based on its tactile properties. The built-in
visual object recognition capability provides opportunity for
fusion of vision and touch in future research.
This paper is organized as follows: The system
configuration including hardware setup and software
components are presented in Section II. The calibration and
testing of tactile sensors are discussed in Section III. The
experimental results of active exploration and identification
of objects are presented in Section IV. Conclusions and
directions for future work are summarized in Section V.
II. SYSTEM CONFIGURATION
From the system integration point of view, the
framework of our wireless integration of tactile sensing on
the NAO robot can be applied to other commercially
available humanoid robots or to the addition of other external
sensors to the NAO robot.
A. Hardware Setup
To avoid voiding the warranty, we imposed the constraint of
not replacing or modifying any existing hardware on the NAO
robot. The hardware components we
used in the system, besides the NAO robot itself, are listed
in Table I, along with simple descriptions of their
functionalities, their locations, and mounting methods. Fig. 2
shows how the hardware components are physically
mounted on the NAO robot.
TABLE I. HARDWARE COMPONENTS

Component Name | Functionality | Location | Mounting Method
FlexiForce sensors | Tactile sensing | Fingers | Double sticky tape
Printed Circuit Board (PCB) | Auxiliary circuit for sensors | Upper arm | Velcro strap & tape
RF Link transmitter | Transmits sensor data | Back | Enclosure box
Arduino Mega 2560 w. battery | Interface between PCB and RF transmitter | Back | Enclosure box
RF Link receiver | Receives sensor data | Not on the robot | N/A
Arduino Uno | Interface between RF receiver and computer | Not on the robot | N/A
Computer | Processes sensor data; delivers behavioral commands to the NAO robot | Not on the robot | N/A
Figure 2. Hardware mounted on the NAO robot.
The configuration of the five FlexiForce Sensors on the
NAO robot’s three-fingered right hand is shown in Fig. 3.
On both the left and right fingers, one sensor is mounted at
the tip and one is at the center. The fifth one is at the tip of
the thumb. The same sensor labeling as shown in Fig. 3 will
be used later in Section IV.
Figure 3. Configuration of the five sensors on NAO’s hand.
The data flow in the system is illustrated in Fig. 4. The
tactile sensor measurements are sent to the computer
wirelessly through the RF module and microcontrollers. The
computer analyzes and logs the sensor measurements, and
sends speech and behavior commands to the CPU on the
NAO robot itself wirelessly. The connection between the
NAO robot and the tactile sensors is only a physical
attachment without data flow in between.
Figure 4. Data flow chart.
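As a concrete illustration of the receive path, the following minimal Python sketch reads the sensor packets forwarded by the Arduino Uno over USB serial. The port name, baud rate, and comma-separated packet format are assumptions for illustration only, not the exact protocol of our implementation.

```python
# Minimal sketch of the computer-side receive path (assumed packet format:
# one comma-separated line of five voltages per RF packet).
import serial  # pyserial

PORT = "/dev/ttyACM0"   # hypothetical serial port of the Arduino Uno
BAUD = 9600             # hypothetical USB baud rate (independent of the 4800 bps RF link)

def read_sensor_packets(port=PORT, baud=BAUD):
    """Yield one reading per packet: a list of five sensor voltages in volts."""
    with serial.Serial(port, baud, timeout=1) as ser:
        while True:
            line = ser.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue                       # timeout or empty line
            fields = line.split(",")
            if len(fields) != 5:
                continue                       # discard malformed packets
            try:
                yield [float(v) for v in fields]
            except ValueError:
                continue                       # discard non-numeric packets

if __name__ == "__main__":
    gen = read_sensor_packets()
    samples = [next(gen) for _ in range(125)]  # about 5 s of data at 25 Hz
    print(samples[-1])
```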
B. Software Components
The software we developed for this project contains
several components, as listed in Table II.
Choregraphe allows easy capture of the joint angles for
the starting and ending positions of each motion that we later
implemented on the NAO robot for the integration of
touch sensing. Fig. 5 shows how the arm angles were
captured in Choregraphe.
The cross-platform Arduino IDE is used to program the
microcontrollers that interface with the RF transmitter and
receiver, which communicate with 434 MHz radio frequency
signals. With the RF module restricted to a maximum data rate
of 4800 bits per second (bps), the data packets containing
measurements of all five sensors are currently transmitted at a
frequency of 25 Hz.
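A quick link-budget check confirms that a 25 Hz packet rate fits comfortably within the 4800 bps limit; the packet layout below is an assumption for illustration, since the exact framing is not detailed here.

```python
# Back-of-the-envelope budget for the 4800 bps RF link (assumed packet layout:
# five 10-bit readings sent as 2 bytes each plus 2 framing bytes, with one
# start bit and one stop bit per byte on the serial link).
PACKET_BYTES = 5 * 2 + 2                        # payload + framing (assumed)
BITS_PER_BYTE = 10                              # 8 data bits + start/stop bits
PACKET_BITS = PACKET_BYTES * BITS_PER_BYTE      # 120 bits per packet
LINK_RATE_BPS = 4800
MAX_PACKET_RATE = LINK_RATE_BPS / PACKET_BITS   # 40 packets per second
assert MAX_PACKET_RATE >= 25, "25 Hz packet rate must fit within 4800 bps"
print(MAX_PACKET_RATE)                          # -> 40.0, so 25 Hz leaves some margin
```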
TABLE II. SOFTWARE COMPONENTS

Component Description | Development Language | Development Stage | Location of Execution
Motion recording | Choregraphe | Preparation | Computer
Wireless communication | C | Integration | Microcontrollers
Main application | C# | Integration | Computer
Speech and behavioral modules | Python | Integration | The NAO robot
Display of measurements | C# | Testing | Computer
Figure 5. Arm angles obtained in Choregraphe.
The main application involves a learning process for the
NAO robot based on tactile information including weight,
stiffness and roughness. Just as a toddler learns about
objects in his or her surroundings, the NAO robot goes
through the following steps during its learning process:
Step 1. Pick up an object and learn how heavy/light, how
hard/soft, and how rough/smooth it is, with measurements
from tactile sensors and associated actions.
Step 2. Characteristics extracted from the measurements are
compared with the corresponding features of objects in the
database, and decisions are made as follows (a code sketch of
this logic appears after Step 3).
- If the measured weight, stiffness and roughness features are
close to those of the current object in the database, i.e., the
absolute values of the differences are all below predefined
thresholds, say the name of the current object.
- If the actual features do not match the features of the
current object, and the current object is not the last one in the
database, move on to the next object.
- If the actual features do not match the features of the
current object, and the current object is the last one in the
database, go to Step 3.
Step 3. Ask the name of the object and add it to the
database.
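A minimal sketch of this decision logic is given below. The threshold values and helper functions are illustrative placeholders, not the values or interfaces used in the actual C# application.

```python
# Minimal sketch of the learning/identification loop described in Steps 1-3.
# Threshold values and function names are illustrative, not taken from the paper.
WEIGHT_TOL = 0.05      # V, hypothetical
STIFFNESS_TOL = 0.50   # V, hypothetical
ROUGHNESS_TOL = 0.10   # FFT magnitude, hypothetical

database = {}          # object name -> (weight, stiffness, roughness)

def identify(features, ask_name, say):
    """features: (weight, stiffness, roughness) extracted from the sensor data."""
    w, s, r = features
    for name, (dw, ds, dr) in database.items():
        if (abs(w - dw) < WEIGHT_TOL and
                abs(s - ds) < STIFFNESS_TOL and
                abs(r - dr) < ROUGHNESS_TOL):
            say("This is a %s." % name)       # features match: say the object's name
            return name
    # No match in the database: Step 3, ask for the name and learn the object
    name = ask_name("What is the name of this object?")
    database[name] = features
    return name
```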
The software flow of the main application is shown in
Fig. 6. Although the main application was developed in C#,
in order to use the NAO SDK to send speech and behavioral
commands to the NAO robot, a Python script was written
for each action and was invoked in the C# program.
Figure 6. The Unified Modeling Language (UML) diagram of software
flow in the C# application.
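A minimal sketch of one such per-action Python script is shown below. The robot address, joint names, and angle values are placeholders; the real scripts used poses captured in Choregraphe.

```python
# Minimal sketch of a per-action Python script invoked from the C# application,
# using the NAO SDK (naoqi). IP, joints, and angles below are placeholders.
from naoqi import ALProxy

NAO_IP = "192.168.1.10"   # hypothetical robot address
NAO_PORT = 9559

def reach_out_and_ask():
    tts = ALProxy("ALTextToSpeech", NAO_IP, NAO_PORT)
    motion = ALProxy("ALMotion", NAO_IP, NAO_PORT)
    motion.setStiffnesses("RArm", 1.0)
    # Move the right arm to a pre-recorded pose captured in Choregraphe
    names = ["RShoulderPitch", "RElbowRoll", "RHand"]
    angles = [0.3, 0.5, 1.0]                     # placeholder targets (rad / open hand)
    motion.angleInterpolation(names, angles, [1.5, 1.5, 1.5], True)
    tts.say("Give me the ball")

if __name__ == "__main__":
    reach_out_and_ask()
```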
A graphical user interface programmed in C# is
embedded in the main application to display the weight,
stiffness and roughness data during testing and
demonstrations.
III. SENSOR CALIBRATION AND TESTING
A. Design of Printed Circuit Board (PCB)
The FlexiForce sensor is an ultra-thin and flexible printed
circuit that uses a resistive-based technology. The application
of a force to the active sensing area of the sensor results in a
change in the resistance of the sensing element in inverse
proportion to the force applied. A modified version of the
recommended amplifier circuit in the user manual [8] is
shown in Fig. 7.
Figure 7. Amplifier circuit for FlexiForce sensors [8].
The feedback resistance RF as well as the drive voltage
VT can be used to adjust the sensitivity of the sensor. A
feedback resistance value of 100 kΩ and a drive voltage of
-1.5 V were selected in our design. A two-step process was
implemented to supply the -1.5 V to the sensors. First, a
voltage regulator consisting of two 1N914 diodes connected
in series and a 240 Ω resistor provides +1.5 V with a 5 V
supply from the microcontroller. Next, an ADM660 Switched
Capacitor Voltage Converter was used to convert it to -1.5 V.
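As a worked check of these choices, assume the standard inverting configuration recommended in [8], with the sensor as the input resistance RS and RF in the feedback path: Vout = -VT (RF / RS). Since the sensor resistance is approximately inversely proportional to the applied force, RS ≈ k/F for some sensor constant k, so Vout ≈ -(VT RF / k) F = (1.5 V)(100 kΩ) F / k. With VT = -1.5 V the output is therefore positive and grows linearly with the applied force, consistent with the flat-load calibration shown later in Fig. 9. This derivation is our reading of the recommended circuit rather than a formula from the sensor manual.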
Considering the PCB size constraint imposed by mounting it
on the robot’s upper arm, we chose a quad op-amp chip
(MCP6004) and a dual op-amp chip (MCP6002) to
provide all the op-amps needed in the circuit. The layout of
the custom PCB is shown in Fig. 8.
Figure 8. PCB layout.
B. Sensor Calibration
Two different methods were used to calibrate the sensor.
First, a flat load was applied to the sensor with its weight
changed by adding additional mass on top of it. Second, a
plastic ball, or spherical load, was used as the test object. The
weight was also changed by adding additional mass on top of
the ball. The sensor was calibrated over a range of 0 to
approximately 2 N. The results are shown in Fig. 9. There is
a linear relationship for a flat load between the voltage output
from the sensor and the force applied to the sensor. The
linear equation fits the data with an R² value of 0.9875 (a
statistical metric that indicates how closely the fitted line
matches the data points). The values obtained for the spherical loads are
higher than the values obtained for the flat loads, likely
because of the smaller contact area for the spherical loads.
The results indicate that at higher values of weight, the
spherical load more closely approximates a flat load because
of more even distribution of the load due to compression of
the spherical object.
Figure 9. Experimental results of FlexiForce sensor measurements (square:
flat loads, circle: spherical loads).
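For reference, the flat-load line fit and its R² value can be computed as in the following sketch; the force/voltage pairs below are placeholders, not the measured data from Fig. 9.

```python
# Minimal sketch of the flat-load calibration fit and R^2 computation.
import numpy as np

force_N = np.array([0.0, 0.5, 1.0, 1.5, 2.0])          # applied flat loads (placeholder)
voltage_V = np.array([0.02, 0.55, 1.08, 1.62, 2.15])   # sensor output (placeholder)

# Least-squares line V = a*F + b
a, b = np.polyfit(force_N, voltage_V, 1)

# Coefficient of determination R^2
predicted = a * force_N + b
ss_res = np.sum((voltage_V - predicted) ** 2)
ss_tot = np.sum((voltage_V - voltage_V.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

print("slope=%.3f V/N, intercept=%.3f V, R^2=%.4f" % (a, b, r_squared))
```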
Next, the FlexiForce sensors were attached on the fingers
of the NAO robot and calibration was performed with the
following configuration: the robot’s right arm and hand
were kept still and a plastic ball was placed at a fixed
position in its hand. Measurements were taken from two sensors
mounted at the centers of both left and right fingers. Then
the sensors were replaced by two other sensors, and so on.
The weight of the plastic ball was adjusted by adding water
through a hole on its top. The voltage for a particular mass
was obtained by averaging the voltages measured by the
sensors. Fig. 10 shows the experimental results of the ranges
and average voltages of the five sensors versus different
weight inputs.
Figure 10. Calibration results with sensors on the NAO robot
(bar: range of voltages, square: average of voltages).
IV. EXPERIMENTAL RESULTS
In this preliminary study, three objects, namely a golf
ball, a ping pong ball and a cotton ball, which are of similar
size, shape and color but different weight, stiffness and
roughness, were chosen for our experiments.
A. Comparison of Weight
The NAO robot was programmed to reach out with its
right forearm and open its right hand. At the same time, it
asked “Give me the ball.” The golf ball was then placed in its
right hand. The measurements from the two sensors mounted
at the center of the fingers are shown in Fig. 11. The sensors
at the fingertips and on the thumb were not pressed, due to
the size and shape of the objects in our experiments; therefore,
their readings were discarded. A period of five seconds was
allotted to complete the test, and the voltage samples from
each of the above two sensors throughout the testing period
were averaged and logged in the database. A progress bar on
the display shows how much of the five-second period has
elapsed. As can be observed from Fig. 11, the center of
gravity was closer to one of the two fingers during that
particular test. Therefore, the average of the sensor #3 and sensor
#4 voltages is recorded as the indicator of object weight.
Figure 11. Display of weight test result for a golf ball.
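A minimal sketch of the weight-indicator computation follows; the zero-based index mapping of sensors #3 and #4 onto the packet fields is an assumption for illustration.

```python
# Minimal sketch of the weight indicator: average the center-finger sensors
# (#3 and #4, assumed here to be indices 2 and 3) over the 5-second window.
# `samples` is a list of 5-element voltage readings, e.g. 125 packets at 25 Hz.
def weight_indicator(samples):
    v3 = sum(s[2] for s in samples) / len(samples)
    v4 = sum(s[3] for s in samples) / len(samples)
    return (v3 + v4) / 2.0
```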
This process was repeated for a ping pong ball and a
cotton ball. The average voltages over a period of five
seconds for sensor #3 and #4 are shown in Table III for all
three objects. It can easily be observed from the sensor data
that the weight of the golf ball is much higher compared to
that of either the ping pong ball or the cotton ball.
TABLE III. WEIGHT MEASUREMENTS

Object | Sensor #3 Average Voltage (V) | Sensor #4 Average Voltage (V)
Golf Ball | 0.08 | 0.24
Ping Pong Ball | 0.00 | 0.00
Cotton Ball | 0.08 | 0.07
B. Comparison of Stiffness
Stiffness is an important property of an object that can be
obtained using the sense of touch. Stiffness is defined as the
extent to which an object resists deformation in response to
an applied force. Ideally, stiffness of the object being
identified by the NAO robot should be calculated as:
Fx,
where F is the force applied on the object and x is the amount
of deformation of the object surface at the contact point.
Unfortunately, the positions of the fingertips of the NAO
robot cannot be obtained programmatically, which means that
measuring object deformation from the penetration of the
fingertip into the object would be very difficult, if not
impossible, to implement. Therefore, the sensor measurements
were used to characterize stiffness instead.
The NAO robot’s right hand was open for the weight test.
When the stiffness test started immediately after the weight
data were logged, the robot was commanded to secure the
ball using its left hand from above and then close its right
hand slowly. The assistance by the left hand was necessary,
especially for the ping pong ball which could have easily
slipped out of the NAO robot’s hand. Display of sensor
voltage readings and corresponding forces for a ping pong
ball is shown in Fig. 12.
Figure 12. Display of stiffness test result for a ping pong ball.
As can be observed from Fig. 12, the sensor on the thumb
(labelled as sensor #5) and the one at the center of the left
finger (labelled as sensor #4) showed high voltages but the
others showed zero voltages. This can be explained by how
the ping pong ball was grasped by the NAO robot’s right
hand. The ping pong ball was mainly between the thumb
and the left finger while the right finger was almost simply
resting on the surface of the ball. Due to the relative size of
the ball versus the hand, the finger tips were slightly above
the ball.
Sensor voltages from the stiffness test are shown in Table IV
for all three objects. Although both the ping pong ball and the
cotton ball are very light, as mentioned in Section IV-A, the
sensor voltages in Table IV clearly show their difference in
stiffness, which was used later for object identification.
The sensor voltages for the golf ball were even higher
than those for the ping pong ball, which matched our expectation.
TABLE IV. STIFFNESS MEASUREMENTS

Object | Sensor #1 (V) | Sensor #2 (V) | Sensor #3 (V) | Sensor #4 (V) | Sensor #5 (V)
Golf Ball | 0.03 | 0.61 | 0.01 | 2.05 | 4.99
Ping Pong Ball | 0.0 | 0.0 | 0.0 | 2.12 | 3.89
Cotton Ball | 0.02 | 0.02 | 0.02 | 0.02 | 0.02
In our experiment with only three objects (a golf ball, a
ping pong ball, and a cotton ball), the average of all sensor
measurements is sufficient to serve as an indicator of
stiffness. However, for objects with less difference in
stiffness, the sensor measurements should be analyzed more
selectively.
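For completeness, a minimal sketch of this simple stiffness indicator follows; averaging over the five sensors and over the logged grasp samples is an illustrative choice, since the exact averaging window is not specified above.

```python
# Minimal sketch of the stiffness indicator used for these three objects:
# the mean of all five sensor voltages over the logged grasp readings.
# `grasp_samples` is a list of 5-element voltage readings.
def stiffness_indicator(grasp_samples):
    return sum(sum(s) for s in grasp_samples) / (5.0 * len(grasp_samples))
```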
C. Comparison of Roughness
Because currently only the right hand of the NAO robot is
equipped with the FlexiForce sensors, we programmed the
robot to transfer the ball to its left hand immediately after
the stiffness data were logged in the database.
When the left hand grasped the ball firmly, the right hand
started stroking the surface of the ball with the tip of one
finger.
The interpretation of tactile sensor measurements for the
roughness of object surface is more challenging than weight
and stiffness. The research in surface texture discrimination
by robots has been advanced by both the development of
tactile sensing arrays and algorithms for temporal or
spatiotemporal analysis of the sensor data [9][10]. The Fast
Fourier Transform (FFT) was performed on the voltage data
collected from the sensor mounted on the tip of the finger
that stroked the object surface. By comparing the spectrum
of the golf ball data with that of the ping pong ball data
shown in Fig. 13, we noticed a significant difference at the
high end of the frequency range. The sampling rate is
limited to 25 Hz by the data rate of the RF module. The
magnitude at frequencies close to 12.5 Hz (half of the
sampling frequency) for the golf ball, which has a rough
surface, is noticeably higher than that of the ping pong ball,
which has a smooth surface. Therefore, the magnitude of the
FFT at the highest resolvable frequency, 12.5 Hz, was logged
in the database for each object as the indicator of roughness.
Figure 13. Roughness test results: (a) golf ball; (b) ping pong ball.
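A minimal sketch of this roughness indicator follows; the 25 Hz sampling rate and the use of the magnitude near 12.5 Hz are taken from the text above, while the DC-removal step and array handling are illustrative details.

```python
# Minimal sketch of the roughness indicator: FFT of the fingertip-sensor
# voltage recorded while stroking the surface, sampled at 25 Hz; the
# magnitude at the highest-frequency bin (about 12.5 Hz) is returned.
import numpy as np

FS = 25.0  # sampling frequency in Hz, limited by the RF data rate

def roughness_indicator(stroke_samples):
    samples = np.asarray(stroke_samples, dtype=float)
    spectrum = np.abs(np.fft.rfft(samples - samples.mean()))  # remove DC offset first
    freqs = np.fft.rfftfreq(samples.size, d=1.0 / FS)
    return spectrum[np.argmax(freqs)]   # magnitude at the bin closest to 12.5 Hz

# A rougher surface should yield a larger high-frequency magnitude.
```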
D. Object Identification
The NAO robot asked for the name of the object after the
weight, stiffness and roughness data were all logged in the
database. The learning process was repeated for all three
objects. The main application continued with object
identification following the learning process. The NAO
robot was programmed to ask the user to give it a ball. After
a ball was randomly selected and placed in its right hand, it
was able to identify whether it was a golf ball, a ping pong
ball, or a cotton ball.
V. CONCLUSIONS AND FUTURE RESEARCH
A tactile sensing system with five FlexiForce sensors and
wireless communication was successfully integrated into the
NAO humanoid robot. Active exploration behaviors were
programmed on the NAO robot, and software interpretation
of the sensor voltages was implemented for weight,
stiffness, and roughness respectively. The NAO robot was
able to learn these properties of a golf ball, a ping pong ball
and a cotton ball, and identify them based on their
differences.
Future research includes investigation of more advanced
tactile sensing technology such as MEMS tactile sensor
arrays; improvement of hardware integration, for example,
using wearable LilyPad microcontrollers; integration of
tactile sensing with the NAO robot’s existing visual object
recognition capability; and last but not least, evaluation of
the accuracy of tactile-sensing based object identification
with testing on a large variety of objects with different
weight, stiffness and roughness.
REFERENCES
[1] R. D. Howe, “Tactile sensing and control of robotic manipulation,” J.
Adv. Robot., vol. 8, no. 3, pp. 245-261, 1994.
[2] J. S. Son, “Integration of Tactile Sensing and Robot Hand Control,”
Ph.D. dissertation, School of Engineering and Applied Sciences,
Harvard Univ., Cambridge, MA, 1996.
[3] K. Suwanratchatamanee, M. Matsumoto, and S. Hashimoto, “Human-
machine interaction through object using robot arm with tactile
sensors,” in Proc. 17th IEEE Int. Symp. Robot Human Interactive
Commun., Munich, Germany, 2008, pp. 683-688.
[4] M. Ohka, H. Kobayashi, J. Takata, and Y. Mitsuya, “Sensing precision
of an optical three-axis tactile sensor for a robotic finger,” in Proc.
15th IEEE Int. Symp. Robot Human Interactive Commun., Hatfield,
U.K., 2006, pp. 214-219.
[5] R. Kageyama, S. Kagami, M. Inaba, and H. Inoue, “Development of
soft and distributed tactile sensors and the application to a humanoid
robot,” in Proc. IEEE Int. Conf. Systems, Man, and Cybernetics,
Tokyo, Japan, 1999, pp. 981-986.
[6] P. Mittendorfer and G. Cheng, “Humanoid multimodal tactile-sensing
modules,” IEEE Trans. Robotics, vol. 27, no. 3, pp. 401-410, 2011.
[7] R. S. Dahiya, G. Metta, M. Valle, and G. Sandini, “Tactile sensing –
from humans to humanoids,” IEEE Trans. Robotics, vol. 26, no. 1, pp. 1–20,
Feb. 2010.
[8] Tekscan Inc., FlexiForce Sensors User Manual, 2008.
http://www.tekscan.com/pdf/FlexiForce-Sensors-Manual.pdf
[9] H. B. Muhammad, C. Recchiuto, C. M. Oddo, L. Beccai, C. J.
Anthony, M. J. Adams, M. C. Carrozza, and M. C. L. Ward, “A
capacitive tactile sensor array for surface texture discrimination,”
Microelectronic Engineering, vol. 88, Jan. 2011, pp. 1811–1813.
[10] C. J. Cascio and K. Sathian, “Temporal cues contribute to tactile
perception of roughness,” J. Neurosci., vol. 21, no. 14, pp. 5289-5296,
2001.
