Gesture encoding and reproduction for human‐robot interaction in text‐to‐gesture systems

Industrial Robot: An International Journal, Vol. 39 No. 6, pp. 551-563, 2012
Heon-Hui Kim¹, Yun-Su Ha², Zeungnam Bien³, Kwang-Hyun Park⁴
¹Art & Robotics Institute, Kwangwoon University, Seoul, Republic of Korea
²Division of Information Technology, Korea Maritime University, Pusan, Republic of Korea
³School of Electrical & Computer Engineering, Ulsan National Institute of Science and Technology, Ulsan, Republic of Korea
⁴School of Robotics, Kwangwoon University, Seoul, Republic of Korea

Abstract

Purpose – The purpose of this paper is to present a method for gesture encoding and reproduction, aimed at a text-to-gesture (TTG) system that enables robotic agents to generate appropriate gestures automatically and naturally in human-robot interaction.

Design/methodology/approach – Under the TTG concept, it is important to reproduce appropriate gestures that are naturally synchronized with speech. The authors first introduce a gesture model that effectively abstracts and describes a variety of human gestures. Based on this model, a gesture encoding/decoding scheme is proposed that encodes observed gestures symbolically and parametrically and reproduces robot gestures from the resulting codes. In particular, the paper addresses a gesture scheduling method for aligning and refining gestural motions so that robotic gesticulation is reproduced in a human-like, natural fashion.

Findings – The proposed method was evaluated through a series of questionnaire surveys, which showed that gestures reproduced by a robotic agent could appeal satisfactorily to human beings.

Originality/value – The paper provides a series of algorithms that handle overlapped motions and refine their timing parameters, enabling robotic agents to reproduce human-like, natural gestures.
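To make the encoding-and-scheduling idea concrete, the Python sketch below shows one plausible shape for a symbolic/parametric gesture code and a greedy scheduler that resolves overlapped motions by delaying later strokes on the same effector. It is a minimal illustration under assumed semantics: GestureCode, schedule_gestures, and min_gap are hypothetical names, not the authors' actual encoding scheme or scheduling algorithm, which additionally refines timing parameters against the synthesized speech.

# A minimal sketch (not the authors' method): each gesture phase is encoded
# symbolically (a label) and parametrically (timing and amplitude), and a
# greedy scheduler resolves overlapped motions by delaying later strokes.
from dataclasses import dataclass

@dataclass
class GestureCode:
    symbol: str             # symbolic part, e.g. "beat", "deictic", "iconic"
    stroke_onset: float     # desired stroke start, seconds from utterance start
    duration: float         # nominal stroke duration in seconds
    amplitude: float = 1.0  # parametric spatial scaling of the motion

def schedule_gestures(codes, min_gap=0.1):
    """Greedily align strokes on one effector so motions never overlap."""
    scheduled, free_at = [], 0.0
    for code in sorted(codes, key=lambda c: c.stroke_onset):
        onset = max(code.stroke_onset, free_at)  # delay if the effector is busy
        scheduled.append((onset, code))
        free_at = onset + code.duration + min_gap
    return scheduled

if __name__ == "__main__":
    codes = [
        GestureCode("deictic", stroke_onset=0.2, duration=0.8),
        GestureCode("beat", stroke_onset=0.7, duration=0.3),  # overlaps the deictic stroke
        GestureCode("iconic", stroke_onset=2.0, duration=1.0),
    ]
    for onset, code in schedule_gestures(codes):
        print(f"{code.symbol:8s} stroke at {onset:.2f}s")

Running the example delays the overlapping beat gesture until the deictic stroke has finished; this is the kind of alignment decision the paper's scheduling method automates before re-synchronizing the refined motions with speech.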
