Journal of International Council on Electrical Engineering Vol. 1, No. 2, pp. 151~155, 2011


A Study on Robot Motion Authoring System using Finger-Robot Interaction

Kwang-Ho Seok*, Junho Ko** and Yoon Sang Kim†

Abstract – This paper proposes a robot motion authoring system using finger-robot interaction. The proposed robot motion authoring tool is a user-friendly method that easily creates and controls robot motion according to the number of fingers. Furthermore, the system can be used to simulate user-created robot contents in a 3D virtual environment. This allows the user to not only view the authoring process in real time but also transmit the final authored contents. The effectiveness of the proposed motion authoring system was verified based on motion authoring simulation of an industrial robot.

Keywords: Finger-robot interaction, Industrial robot, Robot motion authoring system



1. Introduction

   With rapid developments in today's robot technology, robots have come to be used in a wide range of fields. As an increasing number of applications adopt intelligence, a new type of robot software environment is required that allows users to author various commands for robots. Robots must become easier to use for general users in order to expand the scope of applications of intelligent robots in our everyday life. Based on this need, a number of studies are being conducted on new types of user-friendly and intuitive robot motion authoring that can render various motions [1]-[4]. There are also active research projects in human-robot interaction, thanks to the progress that has been made in vision technology. One study is developing a finger pad that can replace the computer mouse [5], and other studies propose algorithms to improve hand gesture recognition [6]-[8]. However, user-intuitive or user-friendly robot command authoring tools aimed at general users are still rare. The lack of such tools is likely to create a barrier to providing robot services (commands/motions) that correspond with the preferences and demands of general users, including children, the elderly and housewives, who are expected to be the major clients in the service robot industry.

   This paper proposes a robot motion authoring system using finger-robot interaction. The proposed system is capable of authoring (creating and controlling) robot motion according to the number of fingers. It is an intuitive robot motion authoring method that allows the user to easily author robot motions. The effectiveness of the proposed method is verified based on motion authoring simulation of an actual industrial robot.

2. Proposed Robot Motion Authoring System using Finger-Robot Interaction

   With the exception of language, the hand is the most frequently used of our body parts (hands, eyes, mouth, arms and legs) for human communication. This chapter explains how industrial robot motions can be authored according to the number of fingers and verifies the effectiveness of the proposed approach based on simulation. Fig. 1 illustrates the overall authoring process of the robot motion authoring system implemented for this study.
Fig. 1. Robot motion authoring system.

†  Corresponding Author: School of Computer Science and Engineering, Korea University of Technology and Education, Korea (yoonsang@kut.ac.kr)
*  Dept. of Computer Science and Engineering, Korea University of Technology and Education, Korea (mono@kut.ac.kr)
** Dept. of Computer Science and Engineering, Korea University of Technology and Education, Korea (lich126@kut.ac.kr)
Received: January 17, 2011; Accepted: March 23, 2011
2.1 Finger recognition

   There are a number of techniques for finger recognition, and they are used in various fields. In this section, the finger recognition unit is implemented using a color-value scheme [8], [9]. The finger recognition unit first converts RGB colors into gray scale and YCrCb for binary representation. Then the region inside the hand is filled by masking and noise is removed. The binary image is examined in 25-pixel units, as shown in Fig. 2. If the sum of 1's examined is greater than ten, every digit in the unit is set to 1. With the image obtained by the masking performed by the finger recognition unit, the center of the hand can be calculated to identify the hand's location as well as the hand region furthest from the center. The center of the hand is marked with a red dot, the palm region with a red circle and the hand region with a white circle. If the number of fingers is 0 or 5, a circle image is displayed. For 1, 2 and 3 fingers, the image is shown in blue, green and red, respectively. If there are 4 fingers, the hand is shown in white on a black background (Fig. 2).

Fig. 2. Finger recognition process.
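The recognition pipeline above can be summarized in code. The following is a minimal sketch assuming OpenCV and NumPy; the YCrCb skin-tone thresholds and the morphological cleanup are illustrative assumptions, since the paper does not list its exact values.

```python
import cv2
import numpy as np

def binarize_hand(frame_bgr):
    """Binarize a camera frame: convert to YCrCb, threshold a skin-tone
    range, then fill the hand interior and remove noise by masking."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    # Cr/Cb skin bounds are assumed values, not the authors' thresholds.
    lower = np.array([0, 133, 77], np.uint8)
    upper = np.array([255, 173, 127], np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill interior
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # drop speckles
    return (mask // 255).astype(np.uint8)                   # 0/1 image

def examine_25px_units(binary):
    """Scan the binary image in 5x5 (25-pixel) units: if more than ten of
    the 25 samples are 1, the whole unit is set to 1 (cf. Fig. 2)."""
    h, w = binary.shape
    out = np.zeros_like(binary)
    for y in range(0, h - 4, 5):
        for x in range(0, w - 4, 5):
            if binary[y:y + 5, x:x + 5].sum() > 10:
                out[y:y + 5, x:x + 5] = 1
    return out

def hand_center(binary):
    """Centroid of the hand region (the red dot in Fig. 2); the fingertip
    candidate is the hand pixel furthest from this center."""
    ys, xs = np.nonzero(binary)
    cx, cy = xs.mean(), ys.mean()
    d2 = (xs - cx) ** 2 + (ys - cy) ** 2
    i = int(d2.argmax())
    return (int(cx), int(cy)), (int(xs[i]), int(ys[i]))
```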

2.2 Robot Motion Authoring Tool

   In this section, a motion authoring unit capable of creating and controlling robot motions (of an industrial robot in this paper) is implemented using finger-robot interaction based on the finger recognition unit explained in the previous section. The motion authoring unit (editor) consists of four panels: the finger interface panel, mode panel, simulation panel and data panel (Fig. 3). The finger interface panel is the finger recognition unit that recognizes the number of fingers (from 0 to 5) in the finger images taken by a camera. The number of fingers recognized by the finger interface panel allows the user to control the industrial robots (such as SCARA, articulated robots, etc.) displayed on the simulation panel. The mode panel provides three modes – WIRE, SOLID and ZOOM/VIEW – as well as the TRANSMISSION and STOP buttons. The WIRE mode allows viewing of the robot's internal structure, whereas the SOLID mode provides only the robot's external view. In the ZOOM/VIEW mode, the view can be zoomed in/out and rotated, allowing the user to view the robot's entire structure in detail. The simulation panel provides the user with real-time 3D viewing of the robot motion being authored with finger-robot interaction. The authored contents (motions) can be transmitted to control the robot by clicking the TRANSMISSION button, and the actual robot motion can be stopped using the STOP button. The data panel provides the virtual robot motion degrees and the PWMs sent to the actual industrial robot in real time, and offers RESET, PAUSE and RESTART buttons. If the RESET button is clicked, the robot is initialized to its default values. If the RESTART button is clicked, the robot is restarted.

Fig. 3. Robot motion authoring tool.
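Since the data panel reports both joint angles (degrees) and the PWM values driven to the servos, a typical angle-to-pulse conversion is sketched below. The 1000-2000 µs pulse endpoints are a common RC-servo convention assumed here for illustration; the paper does not specify its PWM scaling.

```python
def angle_to_pulse_us(angle_deg, angle_min, angle_max,
                      pulse_min=1000, pulse_max=2000):
    """Map a joint angle (degrees) to an RC-servo pulse width (us).

    The 1000-2000 us endpoints are a common RC-servo convention assumed
    here; the paper does not give its actual PWM scaling.
    """
    angle_deg = max(angle_min, min(angle_max, angle_deg))  # clamp to range
    t = (angle_deg - angle_min) / (angle_max - angle_min)
    return round(pulse_min + t * (pulse_max - pulse_min))

# Example: an axis spanning -60..60 degrees sits at 1500 us when centered.
assert angle_to_pulse_us(0, -60, 60) == 1500
```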
2.3 Virtual Robot Motion Authoring Simulation

   This section evaluates the effectiveness of the proposed authoring method based on robot motion authoring simulation. The virtual industrial robot used for the robot motion authoring simulation is an articulated robot with 6 DOF; its motion ranges and definition are given in Table 1 and Fig. 4.

Table 1. Motion Range

     AXIS      ANGLE (degree)
     1-Axis    0 ~ -360
     2-Axis    -60 ~ 60
     3-Axis    -90 ~ 90
     4-Axis    0 ~ -60
     5-Axis    0 ~ -360
     6-Axis    0 ~ 90

   Fig. 4 depicts the virtual and actual articulated robots used for the motion authoring simulation, respectively. The motion ranges and directions of the virtual articulated robot closely match those of the actual robot.
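Table 1 translates directly into a lookup that the authoring unit can use to validate authored poses. A minimal sketch follows; ranges written high-to-low in the table (e.g. 0 ~ -360) are stored low-to-high here.

```python
# Motion ranges from Table 1 as (min_deg, max_deg) per axis.
MOTION_RANGE = {
    "1-Axis": (-360, 0),
    "2-Axis": (-60, 60),
    "3-Axis": (-90, 90),
    "4-Axis": (-60, 0),
    "5-Axis": (-360, 0),
    "6-Axis": (0, 90),
}

def pose_is_valid(pose):
    """Check an authored pose, e.g. {"2-Axis": 30.0}, against Table 1."""
    return all(MOTION_RANGE[axis][0] <= deg <= MOTION_RANGE[axis][1]
               for axis, deg in pose.items())
```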
Fig. 4. Definition of the articulated robot used for the motion authoring simulation.

   In the WIRE and SOLID modes, the internal and external structures of the robot can be viewed, respectively. In either mode, if the number of fingers recognized by the finger interface is 0 or 4, the 1-Axis or 5-Axis rotates around the Y axis. For 1, 2 and 3 fingers, the 2-Axis, 3-Axis and 4-Axis rotate around the Z axis, respectively (Fig. 5). The ZOOM/VIEW mode allows the user to zoom in and out of the robot structure and vary the camera angle for a detailed view. As in the WIRE mode, events are driven by the finger count: the camera is rotated clockwise and counter-clockwise for 3 and 4 fingers, respectively, according to the definitions of Table 3, so that the user can view the robot from a desired angle. Accordingly, the user is able not only to create but also to control motions for a part of the robot based on the number of fingers recognized.

Fig. 5. The motion authoring simulation in WIRE and SOLID mode.

   Fig. 6(a) depicts how a robot rotation part is enlarged in the ZOOM/VIEW mode when the number of fingers recognized is 2 or 3. Fig. 6(b) depicts how a robot part is downsized in the ZOOM/VIEW mode when the number of fingers recognized is 1.

Fig. 6. (a) Enlarged robot rotation part in ZOOM/VIEW mode with 2 and 3 recognized; (b) downsized robot in ZOOM/VIEW mode with 1 recognized.

   Tables 2 and 3 list the definitions of finger-value events for the WIRE, SOLID and ZOOM/VIEW modes; a sketch of this mapping follows the tables.

Table 2. Definitions of finger-value events for WIRE and SOLID modes

     MODE           FINGER   EVENT
     A. WIRE        0        1-Axis rotation (base Y)
     B. SOLID       1        2-Axis rotation (base Z)
                    2        3-Axis rotation (base Z)
                    3        4-Axis rotation (base Z)
                    4        5-Axis rotation (base Y)
                    5        6-Axis rotation (Gripper ON/OFF)

Table 3. Definitions of finger-value events for ZOOM/VIEW mode

     MODE           FINGER   EVENT
     C. ZOOM/VIEW   0        default
                    1        ZOOM OUT
                    2        ZOOM IN
                    3        VIEW rotation (CW)
                    4        VIEW rotation (CCW)
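The finger-value events of Tables 2 and 3 amount to a per-mode dispatch table. The sketch below simply transcribes them, with event names standing in for the actual rotation and camera handlers of the authoring tool.

```python
# Finger-count -> event, transcribed from Tables 2 and 3. The strings are
# placeholders for the rotation/zoom handlers of the authoring tool.
WIRE_SOLID_EVENTS = {
    0: "1-Axis rotation (base Y)",
    1: "2-Axis rotation (base Z)",
    2: "3-Axis rotation (base Z)",
    3: "4-Axis rotation (base Z)",
    4: "5-Axis rotation (base Y)",
    5: "6-Axis rotation (Gripper ON/OFF)",
}
ZOOM_VIEW_EVENTS = {
    0: "default",
    1: "ZOOM OUT",
    2: "ZOOM IN",
    3: "VIEW rotation (CW)",
    4: "VIEW rotation (CCW)",
}

def finger_event(mode, fingers):
    """Return the event name for a recognized finger count (0-5)."""
    if mode in ("WIRE", "SOLID"):
        return WIRE_SOLID_EVENTS.get(fingers, "default")
    return ZOOM_VIEW_EVENTS.get(fingers, "default")
```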
3. Experimental Results and Discussions

3.1 Virtual Robot Experimental Result

   The robot motion authoring system proposed in this paper was implemented on a personal computer with 1 GB of memory, a 1.80 GHz AMD Athlon™ 64-bit CPU and an ATI Radeon X1600 graphics card supporting DirectX. A Logitech Webcam Pro 9000 was also used. The robot motion authoring tool in this paper was implemented with OpenGL, the open graphics library.

   Authoring was effective because different colors were displayed according to the number of fingers recognized, allowing the user to immediately discern the part of the robot being authored. Fig. 7 is a captured image of the robot motion authoring process using the proposed method. The experimental results confirmed that the robot motion authoring system proposed by this study provides general users with an intuitive, fun and easy way to create motions.
Fig. 7. Robot motion authoring process using the proposed method.

3.2 Actual Robot Experimental Result

   The actual articulated robot is a miniature-type robot (NT-Robot Arm, a product of NTREX CO., LTD). Seven RC-servo motors were used, and five ball casters were used on the BASE part for smooth turning. The gripper can seize an object of maximum 80 mm. For control, the robot arm can be easily driven by a computer or an embedded board using the NT-SERVO-16CH-1 and NT-USB2UART (Fig. 8). Table 4 gives the NT-Robot Arm specification.

Fig. 8. Actual articulated robot structure.

Table 4. NT-Robot Arm Specification [10]

     Speed            6V servomotor
     Voltage          DC 6V
     Size             100 × ø550
     Power            16.1 kg-cm (max)
     Weight           1 kg
     Structure        6 axis + gripper
     Center distance  130 m/m
     Current          12.5 A

   The purpose of the experiment in Section 3.2 was to confirm whether or not robot motions authored by the proposed tool work well with an actual robot. The motions authored by the tool were applied to an actual robot for the experiment (Fig. 9). The motion data was transmitted to the robot using serial communication, and it was examined whether the contents produced by the robot motion authoring tool controlled the robot according to the user's intention (a minimal sketch of such a transmission follows Fig. 9).

Fig. 9. Experiment results using the actual robot.
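As a sketch of the serial transmission described above, assuming pyserial: the "#<channel>P<pulse>\r" ASCII framing is a common hobby-servo-controller convention used purely for illustration, and the port name and baud rate are likewise assumptions, since the paper does not specify the NT-USB2UART protocol.

```python
import serial  # pyserial

def send_pose(port_name, pulses_us, baud=115200):
    """Transmit one authored pose as per-channel servo pulse widths.

    The "#<ch>P<us>\r" ASCII framing is a common servo-controller
    convention used here for illustration; the NT-USB2UART protocol is
    not specified in the paper.
    """
    with serial.Serial(port_name, baudrate=baud, timeout=1) as link:
        for channel, pulse_us in enumerate(pulses_us):
            link.write(f"#{channel}P{int(pulse_us)}\r".encode("ascii"))

# Example: drive the 7 RC servos (6 axes + gripper) to neutral.
# send_pose("COM3", [1500] * 7)
```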
4. Conclusion

   This paper proposed an authoring system capable of creating and controlling motions of industrial robots based on finger-robot interaction. The proposed system is user-friendly and intuitive, and facilitates motion authoring of industrial robots using the fingers, which are second only to language as a means of communication. The proposed authoring system also uses graphic motion data to simulate user-created robot contents in a 3D virtual environment, so that the user can view the authoring process in real time and transmit the authored robot contents to control the robot. The validity of the proposed robot motion authoring system was examined based on simulations using the authored robot motion contents as well as experiments with actual robot motions.
The proposed robot motion authoring method is expected to provide user-friendly and intuitive solutions not only for various industrial robots, but also for other types of robots including humanoids.

References

[1] http://www.cyberbotics.com/ (Cyberbotics Webots™)
[2] http://www.microsoft.com/korea/robotics/studio.mspx
[3] Sangyep Nam, Hyoyoung Lee, Sukjoong Kim, Yichul Kang and Keuneun Kim, "A study on design and implementation of the intelligent robot simulator which is connected to an URC system", IE, vol. 44, no. 4, pp. 157-164, 2007.
[4] Kwangho Seok and Yoonsang Kim, "A new robot motion authoring method using HTM", International Conference on Control, Automation and Systems (ICCAS), pp. 2058-2061, Oct. 2008.
[5] Hyun Kim, Myeongyun Ock, Taewook Kang and Sangwan Kim, "Development of Finger Pad by Webcam", The Korea Society of Marine Engineering, pp. 397-398, June 2008.
[6] Kangho Lee and Jongho Choi, "Hand Gesture Sequence Recognition using Morphological Chain Code Edge Vector", Korea Society of Computer Information, vol. 9, pp. 85-91.
[7] Attila Licsar and Tamas Sziranyi, "User-adaptive hand gesture recognition system with interactive training", Image and Vision Computing, vol. 23, pp. 1102-1114, 2005.
[8] Kunwoo Kim, Wonjoo Lee and Changho Jeon, "A Hand Gesture Recognition Scheme using WebCAM", The Institute of Electronics Engineers of Korea, pp. 619-620, June 2008.
[9] Mark W. Spong, Seth Hutchinson and Mathukumalli Vidyasagar, Robot Modeling and Control, Wiley, 2006.
[10] http://www.ntrex.co.kr/


Kwang-Ho Seok received the B.S. degree in Internet-media engineering from Korea University of Technology and Education (KUT), Korea, in 2007, and the M.S. degree in Information media engineering from KUT in 2009. Since 2009, he has been a Ph.D. student in the Humane Interaction Lab (HILab), Korea University of Technology and Education (KUT), Cheonan, Korea. His current research interests include immersive virtual reality, human-robot interaction, and power systems.

Junho Ko received the B.S. degree in Internet-software engineering from Korea University of Technology and Education (KUT), Korea, in 2009, and the M.S. degree in Information media engineering from KUT in 2011. Since 2011, he has been a Ph.D. student in the Humane Interaction Lab (HILab), KUT, Cheonan, Korea. His research fields are artificial intelligence and vehicle interaction.

Yoon Sang Kim received the B.S., M.S., and Ph.D. degrees in electrical engineering from Sungkyunkwan University, Seoul, Korea, in 1993, 1995, and 1999, respectively. He was a Member of the Postdoctoral Research Staff of the Korea Institute of Science and Technology (KIST), Seoul, Korea. He was also a Faculty Research Associate in the Department of Electrical Engineering, University of Washington, Seattle. He was a Member of the Senior Research Staff, Samsung Advanced Institute of Technology (SAIT), Suwon, Korea. Since March 2005, he has been an Associate Professor at the School of Computer Science and Engineering, Korea University of Technology and Education (KUT), Cheonan, Korea. His current research interests include virtual simulation, Power-IT technology, and device-based interactive applications. Dr. Kim was awarded a Korea Science and Engineering Foundation (KOSEF) Overseas Postdoctoral Fellowship in 2000. He is a member of IEEE, IEICE, ICASE, KIPS and KIEE.
