


Impact of Real time on Active Perception Systems applied to Social Robotics

  • Antonio Alberto García Gómez Jacinto; Juan Diego Peña Narváez [1]; Rodrigo Pérez Rodríguez; José Miguel Guerrero Hernández; Francisco Martín Rico; Juan Carlos Manzanares Serrano [1]
    1. [1] Universidad Rey Juan Carlos, Madrid, Spain

  • Published in: Proceedings of the XXIV Workshop of Physical Agents: September 5-6, 2024 / coordinated by Miguel Cazorla Quevedo, Francisco Gómez Donoso, Félix Escalona Moncholi, 2024, ISBN 978-84-09-63822-2, pp. 169-185
  • Language: Spanish
  • Abstract
    • Real-time capabilities are crucial for enabling robots to interact effectively in dynamic human environments, addressing the latency and computational constraints that hinder traditional systems. This paper examines the essential role of real-time processing in active perception systems within social robotics. We propose an integrated approach within the ROS 2 framework, leveraging advanced object detection models and cascade lifecycle nodes to ensure robust and efficient tracking of individuals and objects (a minimal illustrative sketch of this pattern follows the record below). The robot’s head or camera must move to orient itself toward visual stimuli. Our experimental validation demonstrates significant improvements in orientation error rates with real-time configurations, particularly under high-stress scenarios. The findings highlight the practical advantages of real-time systems in enhancing situational awareness and interaction quality in social robotics.

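  • Illustrative sketch
    The abstract describes ROS 2 lifecycle nodes that keep the robot's head or camera oriented toward detected people and objects. The following minimal Python (rclpy) sketch shows that general pattern only, assuming a standard lifecycle node, a hypothetical attention_target topic carrying the stimulus pixel coordinates, and a hypothetical head_pan_cmd output; the image width and control gain are likewise illustrative, and the paper's cascade lifecycle nodes and actual interfaces may differ.

    import rclpy
    from rclpy.lifecycle import Node, State, TransitionCallbackReturn
    from geometry_msgs.msg import PointStamped
    from std_msgs.msg import Float64


    class HeadTracker(Node):
        """Sketch: lifecycle node that pans the head toward a visual stimulus."""

        def __init__(self):
            super().__init__('head_tracker')
            self._active = False
            self._pub = None
            self._sub = None

        def on_configure(self, state: State) -> TransitionCallbackReturn:
            # Create interfaces while inactive; commands are only sent once active.
            self._pub = self.create_publisher(Float64, 'head_pan_cmd', 10)
            self._sub = self.create_subscription(
                PointStamped, 'attention_target', self._on_target, 10)
            return TransitionCallbackReturn.SUCCESS

        def on_activate(self, state: State) -> TransitionCallbackReturn:
            self._active = True
            return super().on_activate(state)

        def on_deactivate(self, state: State) -> TransitionCallbackReturn:
            self._active = False
            return super().on_deactivate(state)

        def _on_target(self, msg: PointStamped) -> None:
            if not self._active:
                return  # ignore stimuli until the node is activated
            # Proportional pan command toward the stimulus; a 640 px image
            # width and a 0.5 gain are assumed for illustration.
            error = (msg.point.x - 320.0) / 320.0
            cmd = Float64()
            cmd.data = -0.5 * error
            self._pub.publish(cmd)


    def main():
        rclpy.init()
        rclpy.spin(HeadTracker())
        rclpy.shutdown()


    if __name__ == '__main__':
        main()

    Driven through its configure/activate transitions by another node or a lifecycle manager, a node like this can be paused or redirected without restarting processes, which is the kind of coordination the abstract attributes to cascade lifecycle nodes.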
