CyC GraphSense

Simultaneous Perception, Localization and Mapping

A safe and reliable autonomous robotic navigation stack is based on a robust Simultaneous Perception, Localization, and Mapping solution. By integrating high-fidelity spatial and inertial data with real-time semantic perception, our CyC GraphSense solution enables platforms to not only know where they are, but to understand what surrounds them—distinguishing between static infrastructure and dynamic obstacles with unprecedented accuracy.
Our approach is composed of two systems running in parallel, and builds on top of our Visual-Inertial Odometry (VIO) solution:
  • a robot-state and environment-mapping algorithm that uses two DNNs for perception and keypoint matching, together with a State Graph based Map Optimizer responsible for incrementally building the environment's map and estimating the state of the robot
  • a path-guided trajectory tracking algorithm for safely tracking the given global reference \(\mathbf{z}_{ref}\)
The two subsystems of the proposed framework run in parallel and interact through the joint robot and environment state \(\mathbf{x}_{t}\). The first system builds the map and estimates the state of the robot, while the second reads the current map and computes the desired local robot trajectory by predicting the dynamics of \(\mathbf{x}_{t}\).
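The parallel interaction above can be sketched in a few lines of Python. The names (`SharedState`, `mapper`, `planner`) and the toy state fields are illustrative assumptions for this sketch, not the CyC GraphSense API:

```python
import threading

# Illustrative sketch only: two subsystems exchange the joint robot-and-
# environment state x_t through a thread-safe shared container.

class SharedState:
    """Thread-safe container for the joint state x_t (hypothetical fields)."""
    def __init__(self, x0):
        self._x = dict(x0)
        self._lock = threading.Lock()

    def read(self):
        with self._lock:
            return dict(self._x)

    def write(self, x):
        with self._lock:
            self._x.update(x)

def mapper(state, observations):
    # Subsystem 1: fuse each observation into the map / robot state estimate.
    for obs in observations:
        x = state.read()
        x["map_cells"] += 1   # toy stand-in for incremental map building
        x["pose"] = obs       # toy stand-in for state estimation
        state.write(x)

def planner(state):
    # Subsystem 2: read the current joint state and emit a local trajectory.
    x = state.read()
    return [x["pose"] + k * 0.1 for k in range(3)]  # toy local trajectory

state = SharedState({"pose": 0.0, "map_cells": 0})
t = threading.Thread(target=mapper, args=(state, [1.0, 2.0, 3.0]))
t.start()
t.join()
traj = planner(state)
print(state.read()["map_cells"])  # 3
print(traj[0])                    # 3.0
```

In a real system the two loops would run continuously at different rates; here the mapper is joined before planning only to keep the sketch deterministic.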
The interaction between state estimation and the trajectory planner is performed through the environment's map. The map is updated using new observations, while the planner uses the map to predict the dynamics of the environment and subsequently to plan a safe trajectory:
CyC GraphSense
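The map update from new observations can be illustrated with a standard log-odds occupancy formulation. This is a minimal sketch under that assumption; the source does not specify the exact update rule CyC GraphSense uses:

```python
import math

# Minimal log-odds occupancy update over a sparse voxel grid (illustrative).

def logit(p):
    return math.log(p / (1.0 - p))

def inv_logit(l):
    return 1.0 / (1.0 + math.exp(-l))

class OccupancyMap:
    def __init__(self):
        self.log_odds = {}  # sparse voxel grid: (i, j, k) -> log-odds value

    def update(self, voxel, p_hit):
        # Bayesian update: add the measurement's log-odds to the prior.
        self.log_odds[voxel] = self.log_odds.get(voxel, 0.0) + logit(p_hit)

    def occupancy(self, voxel):
        # Unobserved voxels default to log-odds 0, i.e. probability 0.5.
        return inv_logit(self.log_odds.get(voxel, 0.0))

m = OccupancyMap()
for _ in range(3):               # three consistent "hit" observations
    m.update((4, 2, 0), 0.7)
print(round(m.occupancy((4, 2, 0)), 3))  # 0.927
```

Repeated consistent observations drive the voxel's occupancy toward 1.0, while the additive log-odds form keeps each update cheap enough for real-time mapping.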

The state of the environment is represented by a probabilistic semantic volumetric map denoted as \(\mathbf{m}_t \in \mathscr{M} \subseteq \mathbb{R}^{n_x \times n_y \times n_z}\). Alongside \(\mathbf{m}_t\), we also store relevant keyframes \(K\), which are used to refine the state graph and to improve localization. Keyframes are pivotal frames within the sequence of observations captured by the sensor, selected for relocalization or loop closure.

These keyframes typically represent significant poses in the sensor's motion trajectory or in the environment being mapped, and can be visualized in the top-right corner of the video:
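Keyframe selection along the motion trajectory can be illustrated with a simple translation-and-rotation heuristic. The thresholds and criteria below are assumptions for illustration, not CyC GraphSense's actual selection logic:

```python
import math

# Illustrative keyframe selection: promote a frame to keyframe when the
# robot has translated or rotated enough since the last keyframe.

def select_keyframes(poses, trans_thresh=1.0, rot_thresh=math.radians(15)):
    """poses: list of (x, y, yaw). Returns indices of selected keyframes."""
    keyframes = [0]                       # the first frame is always a keyframe
    kx, ky, kyaw = poses[0]
    for i, (x, y, yaw) in enumerate(poses[1:], start=1):
        moved = math.hypot(x - kx, y - ky) >= trans_thresh
        turned = abs(yaw - kyaw) >= rot_thresh
        if moved or turned:
            keyframes.append(i)
            kx, ky, kyaw = x, y, yaw      # new reference keyframe pose
    return keyframes

poses = [(0, 0, 0), (0.3, 0, 0), (1.1, 0, 0), (1.2, 0, 0.5), (1.3, 0, 0.5)]
print(select_keyframes(poses))  # [0, 2, 3]
```

Frames 2 and 3 are selected because frame 2 exceeds the translation threshold and frame 3 exceeds the rotation threshold relative to the previous keyframe; near-duplicate frames are skipped, keeping the state graph compact.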

Simultaneous perception, localization and mapping using CyC GraphSense.

CyC GraphSense and CyC GraphPlan for Autonomous Ground Vehicle (AGV) control.


Request a demo via CyC droids.