This repository contains a modular software architecture designed for immersive teleoperation of the Boston Dynamics Spot robot using the Meta Quest 2.
Authors:
- Ali Yousefi, [email protected]
- Carmine Tommaso Recchiuto, [email protected]
- Antonio Sgorbissa, [email protected]
©2025 RICE Lab - DIBRIS, University of Genova
The package includes source code developed for the following functionalities:
- Control Algorithm: Implements the control logic for the robot (`scripts/spot_client/spot_interface.py`).
- Sensor Measurements and Control Commands: Handles robot sensor data and applies control commands (`scripts/spot_client/spot_controller.py`).
- Stereo Image Capture: Captures stereo images from the ZED camera (`scripts/spot_client/zed_interface.py`).
- Robot Control and Image Compression: Manages robot control and compresses images before transmission using a GStreamer pipeline (`scripts/spot_client/spot_client.py`).
- HMD Rendering and Command Reading: Renders stereo images on the head-mounted display (HMD) and reads control commands (`src/main.cpp`).
- Control Command Transmission: Sends the measured control commands to the robot (`scripts/oculus_client.py`).
The last two source codes run on the user's PC (Windows), while the others run on the Jetson board (Ubuntu) mounted on the robot.
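Control commands are exchanged between the user's PC and the Jetson over MQTT (paho-mqtt is a dependency on both sides), with the Mosquitto broker on the cloud server relaying the messages. The sketch below only illustrates the idea: the topic name and payload fields are hypothetical, and the actual message format is defined in `scripts/oculus_client.py` and `scripts/spot_client/spot_client.py`.

```python
# Minimal MQTT command-channel sketch (hypothetical topic and payload names).
import json
import paho.mqtt.publish as publish
import paho.mqtt.subscribe as subscribe

BROKER_IP = "<cloud-server-public-ip>"   # Mosquitto broker on the cloud VM
COMMAND_TOPIC = "spot/cmd_vel"           # hypothetical topic name

# User's PC side: publish a velocity command measured from the HMD/controllers.
command = {"v_x": 0.3, "v_y": 0.0, "v_rot": 0.1}
publish.single(COMMAND_TOPIC, json.dumps(command), hostname=BROKER_IP)

# Jetson side: block on incoming commands and hand them to the Spot controller.
def on_message(client, userdata, msg):
    cmd = json.loads(msg.payload)
    print("received command:", cmd)      # the real code applies this via the Spot SDK

subscribe.callback(on_message, topics=[COMMAND_TOPIC], hostname=BROKER_IP)
```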
The required software on the user's PC is the following:
- Windows (64-bit)
- python (3.7.0 or later)
- CMake
- Visual Studio 2022
- Oculus SDK (1.17 or later)
- CUDA
- GLEW included in the ZED SDK dependencies folder
- SDL
- GStreamer
- OpenCV
- paho-mqtt
On the Jetson board:
- Ubuntu 18.04
- python (3.7.0 or later)
- CMake
- Spot SDK
- ZED SDK 3.x
- ZED Python API
- GStreamer
- OpenCV
- do-mpc
- paho-mqtt
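The ZED SDK and its Python API listed above are what `scripts/spot_client/zed_interface.py` relies on to grab stereo frames. A minimal capture sketch (ZED SDK 3.x), with illustrative resolution and frame-rate settings rather than the ones actually configured in `zed_interface.py`:

```python
# Minimal ZED stereo-capture sketch; resolution/FPS values are illustrative.
import pyzed.sl as sl

zed = sl.Camera()
init_params = sl.InitParameters()
init_params.camera_resolution = sl.RESOLUTION.HD720
init_params.camera_fps = 30

if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Failed to open the ZED camera")

image = sl.Mat()
runtime = sl.RuntimeParameters()
if zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
    # The side-by-side view packs the left and right images into one frame,
    # which is convenient for stereo passthrough on the HMD.
    zed.retrieve_image(image, sl.VIEW.SIDE_BY_SIDE)
    frame = image.get_data()   # numpy array (BGRA)
    print("captured frame:", frame.shape)

zed.close()
```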
On the cloud server: Create a virtual machine on a cloud provider (e.g., Google Cloud or Microsoft Azure) and install the following dependencies:
- Mosquitto (MQTT broker)
- MediaMTX
On the user's PC: Clone the source files from the repository and follow the instructions below:
- Create a folder called "build" in the root folder
- Open cmake-gui and select the source and build folders
- Generate the Visual Studio Win64 solution
- Open the resulting solution and change the configuration to Release. You may have to modify the dependency paths to match your setup
- Build the solution
- Build OpenCV with GStreamer (Tutorial); a quick check of the resulting build is shown below
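Whether a given OpenCV build actually has GStreamer enabled (on the PC or on the Jetson) can be verified from Python:

```python
# Quick check that the installed OpenCV was built with GStreamer support.
import cv2

info = cv2.getBuildInformation()
gst = next((line.strip() for line in info.splitlines() if "GStreamer" in line),
           "GStreamer: not found in build information")
print(gst)   # should report something like "GStreamer: YES (1.x)"
```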
On the Jetson: The software does not require a build step; just clone the script files in the `spot_client` folder from the repository. However, you have to build OpenCV with GStreamer on the Jetson as well (Tutorial); a sketch of how a GStreamer pipeline is typically driven from OpenCV follows below.
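The GStreamer-enabled OpenCV build is needed because the compression pipeline is presumably opened through OpenCV's GStreamer backend. The sketch below shows one common way to do this with `cv2.VideoWriter`; the pipeline string, resolution, and stream path are hypothetical, not the exact pipeline built in `scripts/spot_client/spot_client.py`.

```python
# Sketch of pushing frames into a GStreamer pipeline from OpenCV.
# Pipeline, resolution, and stream path are illustrative only; the real
# pipeline is constructed in spot_client.py and targets the MediaMTX server.
import cv2

WIDTH, HEIGHT, FPS = 1280, 720, 30
pipeline = (
    "appsrc ! videoconvert ! "
    "x264enc tune=zerolatency bitrate=4000 speed-preset=ultrafast ! "
    "rtspclientsink location=rtsp://<cloud-server-public-ip>:8554/zed"
)

writer = cv2.VideoWriter(pipeline, cv2.CAP_GSTREAMER, 0, FPS, (WIDTH, HEIGHT), True)
if not writer.isOpened():
    raise RuntimeError("OpenCV lacks GStreamer support, or the pipeline is invalid")

# frame must be a BGR image of shape (HEIGHT, WIDTH, 3), e.g. grabbed from the ZED:
# writer.write(frame)
```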
On the robot side (Linux/Jetson): Run the `spot_client.py` script as follows:
python spot_client.py <cloud-server-public-ip> ZED
On the user side (Windows): Run `ZED_Stereo_Passthrough.exe` in a terminal as follows:
./ZED_Stereo_Passthrough.exe <cloud-server-public-ip> ZED
On the cloud server: Run the following commands to start the MQTT broker and the media server:
sudo systemctl start mosquitto
./mediamtx
For future work, we aim to address the limitations of the current system and enhance its functionality. Planned improvements include:
- Semi-Autonomous Control Logic: Incorporate advanced algorithms to enable semi-autonomous navigation and task execution, reducing the cognitive load on the operator.
- Enhanced Perception Capabilities: Integrate additional sensors such as LIDAR or thermal cameras to improve environmental awareness and support complex tasks in challenging scenarios.
- Real-Time Feedback Optimization: Optimize latency in image and control data transmission to enhance real-time responsiveness, particularly in low-bandwidth network conditions.
- Robust Communication Framework: Implement fail-safe mechanisms and adaptive communication protocols to maintain system reliability in varying network conditions.
- User-Centered Design Improvements: Conduct user evaluations to refine the system interface, improving usability and immersion in teleoperation tasks.
- Augmented Reality Integration: Explore the use of augmented reality (AR) overlays on the HMD to provide contextual information such as obstacle warnings, path planning, or task-specific guidance.
- Expanding Robot Compatibility: Extend the system to support other robot platforms, making it a versatile solution for diverse robotic teleoperation applications.
By addressing these objectives, we aim to make the teleoperation system more efficient, reliable, and user-friendly, enabling a broader range of applications in fields such as search and rescue, inspection, and industrial automation.