Design of a Head Movement Navigation System for Mobile Telepresence Robot Using Open-source Electronics Software and Hardware

Authors

  • Tan Jia Wee Universiti Teknologi Malaysia
  • Herman Wahid Universiti Teknologi Malaysia

Abstract

Head movement is frequently associated with human motion and navigation, and is an indispensable aspect of how humans interact with the surrounding environment. Despite this, the combination of head motion and navigation is used more often in virtual reality (VR) environments than in the physical environment. This study aims to develop a robot car capable of simple teleoperation, incorporating telepresence and head movement control for an on-robot, real-time head motion mimicking mechanism and directional control, in an attempt to give users the experience of an avatar-like, third-person point of view within the physical environment. The design consists of three processes running in parallel: Motion JPEG (MJPEG) live streaming to an HTML page via a local server, Bluetooth communication, and the corresponding movements of the head motion mimicking mechanism and the drive motors, which act in accordance with the head motion captured by the attitude sensor and the commands issued by the user. The design serves its purpose as a demonstration using basic components and is not intended to provide or investigate findings on user experience.
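
To make the three-process architecture concrete, the sketch below shows one possible arrangement on a Raspberry Pi: a Flask route serving the MJPEG stream to the HTML page, a thread reading head-motion and drive commands over a Bluetooth serial link, and a thread driving the pan/tilt servos and DC motors accordingly. The pin assignments, the PAN/TILT/DRIVE command format, and the choice of libraries (OpenCV, pyserial, RPi.GPIO, Flask) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the three parallel processes described in the abstract,
# assuming a Raspberry Pi, a USB camera, an HC-05-style Bluetooth serial link on
# /dev/rfcomm0, an L298N motor driver, and two pan/tilt servos. Pins and the
# command protocol are illustrative assumptions.

import threading
import queue

import cv2                      # camera capture + JPEG encoding
import serial                   # pyserial: Bluetooth SPP appears as a serial port
import RPi.GPIO as GPIO
from flask import Flask, Response

# --- assumed wiring (BCM numbering) ---
MOTOR_IN1, MOTOR_IN2, MOTOR_EN = 17, 27, 22   # L298N channel A
PAN_SERVO, TILT_SERVO = 18, 13                # head-mimicking servos

commands = queue.Queue()        # Bluetooth thread -> actuator thread
app = Flask(__name__)


def mjpeg_frames():
    """Yield camera frames as an MJPEG multipart stream for the HTML page."""
    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            continue
        ok, jpg = cv2.imencode(".jpg", frame)
        if not ok:
            continue
        yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n"
               + jpg.tobytes() + b"\r\n")


@app.route("/stream")
def stream():
    return Response(mjpeg_frames(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")


def bluetooth_reader():
    """Read lines such as 'PAN:90,TILT:45,DRIVE:F' and queue them."""
    link = serial.Serial("/dev/rfcomm0", 9600, timeout=1)
    while True:
        line = link.readline().decode(errors="ignore").strip()
        if line:
            commands.put(line)


def actuator_loop():
    """Drive the servos and DC motors according to the latest command."""
    GPIO.setmode(GPIO.BCM)
    for pin in (MOTOR_IN1, MOTOR_IN2, MOTOR_EN, PAN_SERVO, TILT_SERVO):
        GPIO.setup(pin, GPIO.OUT)
    pan = GPIO.PWM(PAN_SERVO, 50)    # 50 Hz hobby-servo PWM
    tilt = GPIO.PWM(TILT_SERVO, 50)
    speed = GPIO.PWM(MOTOR_EN, 1000)
    pan.start(7.5); tilt.start(7.5); speed.start(0)

    while True:
        cmd = commands.get()         # blocks until Bluetooth delivers a command
        for field in cmd.split(","):
            key, _, value = field.partition(":")
            if key == "PAN":         # map 0-180 deg to ~2.5-12.5 % duty cycle
                pan.ChangeDutyCycle(2.5 + float(value) / 18.0)
            elif key == "TILT":
                tilt.ChangeDutyCycle(2.5 + float(value) / 18.0)
            elif key == "DRIVE":     # F = forward, B = backward, S = stop
                GPIO.output(MOTOR_IN1, value == "F")
                GPIO.output(MOTOR_IN2, value == "B")
                speed.ChangeDutyCycle(0 if value == "S" else 60)


if __name__ == "__main__":
    threading.Thread(target=bluetooth_reader, daemon=True).start()
    threading.Thread(target=actuator_loop, daemon=True).start()
    app.run(host="0.0.0.0", port=8000)   # serves the MJPEG stream to the page
```

In this sketch a queue decouples the Bluetooth reader from the actuator loop, so the video stream, command reception, and actuation can run concurrently, mirroring the parallel structure described in the abstract.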

Published

2024-04-19

Section

Signals, Circuits, Systems