Effective and Immersive Teleoperation with Real-World Constraints
Abstract
This study presents a telepresence system built from standard, commercially available hardware to support cost-effective and accessible teleoperation. Conventional telepresence solutions often rely on advanced technologies such as 360-degree or stereoscopic cameras, high-end haptic feedback devices, and specialized robotic platforms. While these approaches can deliver highly immersive experiences, their high implementation cost, limited accessibility, or insufficient locomotion support restricts broader adoption. There is therefore a clear need for a telepresence method that balances usability, immersion, and affordability while maintaining precise and reliable control. The proposed solution integrates off-the-shelf virtual reality (VR) equipment with a mobile robotic platform to construct a virtual environment that enhances user interaction and spatial awareness. Real-time video from the robot’s onboard camera is projected into the VR environment, enabling users to perceive the remote physical space intuitively. To compensate for hardware limitations, the system incorporates visual cues that represent the robot’s orientation, movement direction, and control latency. These cues improve situational awareness and help users make informed navigation decisions during teleoperation tasks. The study evaluates the system in terms of control simplicity, precision, and overall usability, with particular emphasis on how the virtual environment mitigates latency effects and provides smooth locomotion feedback, resulting in a fluid user experience. The findings demonstrate that effective telepresence can be achieved with standard hardware, offering a practical alternative to more complex and expensive systems while preserving immersive and accurate teleoperation.
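The abstract describes visual cues that convey control latency to the operator. As a rough illustration of how such a cue might be driven, the sketch below measures round-trip command latency from timestamps and maps it to a cue color for display in the VR scene. The function name, thresholds, and color scheme are illustrative assumptions, not details taken from the paper.

```python
def latency_cue(sent_ts: float, echoed_ts: float,
                warn: float = 0.15, bad: float = 0.40) -> str:
    """Map round-trip control latency (seconds) to a cue color.

    sent_ts   -- time the command left the operator's station
    echoed_ts -- time the robot's acknowledgment arrived back
    warn/bad  -- hypothetical thresholds for degraded/unsafe control
    """
    latency = echoed_ts - sent_ts
    if latency < warn:
        return "green"   # responsive: direct control feels immediate
    if latency < bad:
        return "yellow"  # noticeable lag: operator should slow down
    return "red"         # severe lag: cautious, small movements only
```

In a real system the timestamps would come from the command channel (e.g., ROS message headers on the ROSbot platform mentioned in the study), and the returned color would tint an indicator in the virtual environment rather than being printed.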
Article Details

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal the right of first publication. The work is simultaneously licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0), which permits others to copy, distribute, remix, adapt, and build upon the work, even commercially, provided proper attribution is given to the original authors and the journal as the source of first publication.
- Authors may enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal’s published version of the work (e.g., posting to institutional repositories, inclusion in books), with an acknowledgment of its initial publication in this journal.
- Authors are encouraged to post their work online (e.g., in institutional repositories, personal websites, or preprint servers) prior to and during the submission process. This practice can foster productive exchanges and may lead to earlier and increased citation of the published work.
User Rights:
All articles published Open Access are immediately and permanently free for everyone to read and download. We work continuously with our author communities to offer the best choice of license options; for this journal, the license is currently the Creative Commons Attribution 4.0 International License (CC BY 4.0).