Think of CyC comm as an online Zoom meeting for robots: instead of people talking and exchanging information, you have robots doing the same thing.
Let's analyze the use-case of two robots, a legged robot and a drone, surveying a patch of forest for forest fire prevention. The hardware components used in this example are:
- a legged robot as an Unmanned Ground Vehicle (UGV) equipped with a powerful embedded computer (Nvidia AGX Xavier), LiDAR and cameras.
- an Unmanned Aerial Vehicle (UAV) equipped with a small computer for flight control (Raspberry Pi).
- a mission control system, where the user can upload geolocation data for the forest area to be surveilled. The mission control system can be located either in the cloud, directly on a laptop, or on a smartphone.
Both autonomous robots are driven through perception-and-control pipelines, where the paths and waypoints are planned using data from the Environment Perception and Localization & State Estimation modules. Motion and flight control are executed based on the calculated paths and waypoints, respectively.
Apart from its local control system, each autonomous robot can benefit in its mission objectives by using, for example, environment perception information obtained by the other robot. The UGV can plan its path based on obstacles visible solely from the air, while the UAV can plan its waypoints using obstacles detectable only with the LiDAR installed on the UGV. Additionally, computationally intensive operations, such as the forward pass through a large deep neural network, should be performed on the most powerful computing platform shared by the robots, that is, on the embedded computer installed on the legged robot. Both functionalities require fast and stable communication between the two platforms.
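The compute-offloading idea above can be sketched as a simple placement decision: pick the most powerful platform shared by the robots and dispatch the heavy task there. This is a minimal illustrative sketch, not the CyberCortex API; the platform names and scores are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Platform:
    name: str
    compute_score: float  # rough capability metric, e.g. TOPS (assumed values)

def select_compute_platform(platforms):
    """Pick the platform with the highest compute score for heavy tasks."""
    return max(platforms, key=lambda p: p.compute_score)

platforms = [
    Platform("UGV (Nvidia AGX Xavier)", 32.0),  # assumed score
    Platform("UAV (Raspberry Pi)", 0.5),        # assumed score
]

target = select_compute_platform(platforms)
print(target.name)  # the DNN forward pass would be dispatched here
```

In this example the UGV's embedded computer is selected, matching the text: the drone forwards its sensor data over the shared link and receives the inference result back.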
The figure below illustrates the initialization and flow of data during operation. The connections between the three systems (legged robot, drone and mission control) are established via the Signaling Servers. Once the robots have discovered each other using the servers, they can directly exchange position, sensory data, perception information and computational resources. In the datastreams diagram on the right, the Processing column indicates whether a Filter's output is computed locally on the robot or remotely (Network). For example, on the legged robot (DataBlock 1), the Filter having ID 1 and BlockID 2 represents the output of the drone's camera, which is also mirrored on the legged robot.
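The datastream bookkeeping described above can be modeled as a table of Filter entries per DataBlock, each tagged with where its output is produced. This is a hypothetical sketch; the field names are illustrative and not the CyberCortex data structures.

```python
from dataclasses import dataclass

@dataclass
class FilterEntry:
    filter_id: int
    block_id: int   # DataBlock that owns the Filter's output
    name: str
    processing: str  # "local" or "network" (mirrored from a remote robot)

# DataBlock 1: the legged robot's view of the available datastreams.
legged_robot_block = [
    FilterEntry(filter_id=1, block_id=1, name="lidar", processing="local"),
    # Filter ID 1 / BlockID 2: the drone's camera, mirrored over the network.
    FilterEntry(filter_id=1, block_id=2, name="drone_camera", processing="network"),
]

remote = [f.name for f in legged_robot_block if f.processing == "network"]
print(remote)  # datastreams received from peers rather than computed on-board
```

The (filter_id, block_id) pair acts as a global key, so a mirrored entry on the legged robot still identifies the drone's DataBlock as the true producer.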
The CyC comm communication pipeline is implemented as a CyberCortex Filter, allowing for interchangeable data transfer protocols. In our work, we have routed the communication over the internet using the WebRTC protocol. The WebRTC implementation of CyC comm enables direct peer-to-peer communication, established via a set of redundant Signaling Servers that facilitate the discovery of the robots by one another once a robot starts its CyberCortex DataBlock and registers it with the signaling servers.
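The register-then-discover handshake can be sketched with an in-process mock of the signaling step. The real system uses WebRTC signaling servers over the internet; the classes and method names below are illustrative assumptions, showing only the registration, redundancy, and discovery logic.

```python
class SignalingServer:
    """In-process stand-in for one signaling server (illustrative only)."""

    def __init__(self):
        self.registry = {}  # robot name -> DataBlock descriptor

    def register(self, robot, datablock):
        self.registry[robot] = datablock

    def discover(self, requester):
        # Return every registered peer except the requester itself.
        return {r: d for r, d in self.registry.items() if r != requester}

# Redundancy: a robot registers its DataBlock with every server,
# so discovery still works if some servers are unreachable.
servers = [SignalingServer(), SignalingServer()]

def register_everywhere(robot, datablock):
    for s in servers:
        s.register(robot, datablock)

register_everywhere("legged_robot", {"block_id": 1})
register_everywhere("drone", {"block_id": 2})

# The drone queries any reachable server and learns about its peers,
# after which a direct peer-to-peer WebRTC connection can be opened.
peers = servers[0].discover("drone")
print(list(peers))
```

Once discovery completes, the signaling servers drop out of the data path: sensor and perception streams flow directly between the robots.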
Our approach enables the deployment of complex robotics applications, where heavy computation (e.g. optimization algorithms) can be executed in real time in the cloud, using clusters of high-end computers, while the results are used on the low-cost embedded microcontrollers powering the robotic systems.