Unlocking the Potential of Unrecognized Autonomous Visualization Solutions in Developing the Core Autonomy Stack

Preamble

In today's self-driving vehicle development industry, many algorithms and applications are designed, developed, integrated, and tested before deployment on the road. We are all aware that fine-tuning the subsystems of a self-driving vehicle application, such as drive-by-wire, navigation, perception, and localization, and validating them against critical scenarios in a real-time environment, is challenging and requires a significant amount of effort, time, and cost.

To reduce the effort involved in precision-tuning these subsystems across various operational design domain (ODD) features, an autonomous visualization solution is essential. Such a solution represents a system's multi-dimensional data visually, making the petabyte-scale information collected by the vehicle easier to analyze and, in turn, making it easier to fine-tune the performance of self-driving applications, which is necessary for effective testing and validation.

Abstract

A user interface (UI) for autonomous vehicle data analysis is a graphical interface that provides a way for users to interact with and analyze data collected from autonomous vehicles. The UI gives users a visual representation of the data, making it easier to identify patterns, trends, and anomalies.

The UI typically includes tools for data visualization, such as graphs, charts, and maps, which allow users to explore the data in different ways. For example, a graph might show how the vehicle’s speed varies over time, while a map might show the vehicle’s location and route.
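As a rough illustration, the sketch below uses Python with pandas and matplotlib, assuming a hypothetical CSV export named drive_log.csv with timestamp and speed_mps columns (both names are my own placeholders, not tied to any particular tool), to produce such a speed-over-time view.

```python
# Minimal sketch: plotting vehicle speed over time from a hypothetical
# CSV log with "timestamp" (seconds) and "speed_mps" columns.
import pandas as pd
import matplotlib.pyplot as plt

log = pd.read_csv("drive_log.csv")  # hypothetical export from an AV data pipeline
log["time_s"] = log["timestamp"] - log["timestamp"].iloc[0]

plt.plot(log["time_s"], log["speed_mps"])
plt.xlabel("Time since start of drive (s)")
plt.ylabel("Speed (m/s)")
plt.title("Vehicle speed over time")
plt.show()
```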

In addition to visualization tools, the UI may also include features for data filtering, search, and manipulation. These features allow users to focus on specific parts of the data, such as a particular time period or type of event. Users can also use the interface to drill down into the data and explore it in more detail.
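Continuing the same hypothetical drive_log.csv example, the following sketch shows one way such filtering could look in pandas, narrowing the log to a time window and to an assumed event_type value before drilling into the details; the column names and the "hard_brake" label are illustrative assumptions, not part of any specific platform.

```python
# Minimal sketch: filtering a hypothetical drive log down to a time window
# and a specific event type before inspecting it in more detail.
import pandas as pd

log = pd.read_csv("drive_log.csv")  # hypothetical columns: timestamp, event_type, speed_mps

window = log[(log["timestamp"] >= 1000.0) & (log["timestamp"] <= 1060.0)]
hard_brakes = window[window["event_type"] == "hard_brake"]

print(hard_brakes[["timestamp", "speed_mps"]].describe())
```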

A well-designed UI for autonomous vehicle data analysis should be intuitive and easy to use, even for users who may not have a technical background. It should also be customizable, allowing users to tailor the interface to their specific needs and preferences.

Overall, a UI for autonomous vehicle data analysis is an essential tool for understanding how autonomous vehicles are performing in the real world. It allows researchers, engineers, and other stakeholders to analyze and interpret large amounts of data quickly and efficiently, helping to improve the safety, reliability, and effectiveness of autonomous vehicle technology.

In this article, I aim to showcase the different user interfaces that I have encountered and experimented with during the final stage of autonomous vehicle stack testing within my organization. I will also point to various open autonomous visualization solutions that anyone can access and benefit from.

Visualization Solutions for Autonomy and Its Subsystems

I would like to highlight several industry-recognized, open-source autonomy visualization solutions.

VizViewer

VizViewer is a web-based platform designed for collaborative and interactive visualization of complex datasets. The platform provides a suite of tools for data processing, communication, and visualization, all accessible through an easy-to-use dashboard UI. One of the key strengths of VizViewer is its ability to handle and analyze data from multiple modalities. This makes it a valuable tool for researchers, analysts, and other professionals who need to work with diverse datasets. The platform is also designed to be highly configurable and customizable, so users can tailor it to their specific needs and workflows.

References: https://www.vizviewer.com/

Uber's Autonomy Visualization Solution (AVS)

As a stand-alone, standardized visualization layer, AVS frees developers from having to build custom visualization software for their autonomous vehicles. With AVS abstracting visualization, developers can focus on core autonomy capabilities for drive systems, remote assistance, mapping, and simulation.

The system is built around two key pieces: XVIZ provides the data layer (including its management and specification), while streetscape.gl is the component toolkit that powers the web applications.
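To make that division of labor concrete, below is a loose Python sketch of packaging a single vehicle pose into an XVIZ-style state update. The field names follow the general shape of the public XVIZ protocol, but they are illustrative here and should be verified against the specification published at avs.auto.

```python
# Hypothetical sketch: packaging one pose sample into an XVIZ-style
# "state update" message. Field names roughly follow the public XVIZ
# protocol but should be checked against the spec at avs.auto.
import json
import time

def build_state_update(x, y, z, yaw, timestamp=None):
    ts = timestamp if timestamp is not None else time.time()
    return {
        "update_type": "snapshot",
        "updates": [
            {
                "timestamp": ts,
                "poses": {
                    "/vehicle_pose": {
                        "timestamp": ts,
                        "position": [x, y, z],
                        "orientation": [0.0, 0.0, yaw],
                    }
                },
            }
        ],
    }

print(json.dumps(build_state_update(1.0, 2.0, 0.0, 0.3), indent=2))
```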

References: https://avs.auto/

Cruise's Webviz – Data Visualization Framework

At its foundation, Webviz is a web application that lets users configure different layouts of panels. Each panel is a data exploration tool, displaying information such as text logs, 2D charts, and 3D depictions of the AV's environment. Cruise's initial focus was to build a suite of panels corresponding to existing open-source tools like rviz, rqt_console, rqt_runtime_monitor, rostopic echo, and rqt_plot. With further development, Cruise added custom panels beyond the scope of existing tools to serve Cruise-specific needs. Through continued customization and a gradual migration away from legacy tools, these panels became more finely tuned to help Cruise's engineers solve their problems while remaining useful to the open-source robotics community.
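Because these panels largely mirror ROS's own log and topic tools, the data they display typically originates from recorded bag files. The hedged sketch below uses the standard rosbag Python API in a ROS 1 environment, with a hypothetical recording named drive.bag, to read odometry messages of the kind a text-log or plot panel would consume.

```python
# Minimal sketch (ROS 1): reading odometry messages out of a recorded bag,
# similar to the data a "rostopic echo" or plotting panel would display.
# Assumes a hypothetical recording named drive.bag containing an /odom topic.
import rosbag

with rosbag.Bag("drive.bag") as bag:
    for topic, msg, t in bag.read_messages(topics=["/odom"]):
        # nav_msgs/Odometry: print the timestamp and forward speed
        print(t.to_sec(), msg.twist.twist.linear.x)
```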

References: https://webviz.io/

Rviz

RViz is a 3D visualization tool for ROS. On one hand, it graphically displays information published by a robot's sensors and software; on the other hand, it can send control information (for example, navigation goals or interactive marker updates) back to the robot, enabling both monitoring and control.
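As a simple example of the display side, the hedged rospy sketch below publishes a Marker message on the conventional visualization_marker topic for RViz to render; the frame name, shape, and dimensions are arbitrary placeholders.

```python
# Minimal sketch (ROS 1 / rospy): publishing a Marker that RViz can display
# on the "visualization_marker" topic. Frame and dimensions are illustrative.
import rospy
from visualization_msgs.msg import Marker

rospy.init_node("marker_demo")
pub = rospy.Publisher("visualization_marker", Marker, queue_size=1)

rate = rospy.Rate(1)
while not rospy.is_shutdown():
    marker = Marker()
    marker.header.frame_id = "map"
    marker.header.stamp = rospy.Time.now()
    marker.type = Marker.SPHERE
    marker.action = Marker.ADD
    marker.pose.position.x = 1.0
    marker.pose.orientation.w = 1.0
    marker.scale.x = marker.scale.y = marker.scale.z = 0.5
    marker.color.r = 1.0
    marker.color.a = 1.0
    pub.publish(marker)
    rate.sleep()
```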

Reference: http://wiki.ros.org/rviz  

Recommendation

Given the considerable effort required to develop autonomy and its subsystems, it would be highly beneficial to leverage existing open-source components for multi-dimensional data visualization. With limited time and resources to build my own autonomy visualization solution, and with existing interfaces such as RViz proving less user-friendly and insightful, I began searching for better alternatives that integrate well with ROS. To my surprise, I discovered that Uber had released an autonomy visualization solution in 2019 that had garnered only 14 contributions from the open-source community. This prompted me to explore the platform, which is open, user-friendly, and has a rich, well-designed interface. However, it has limited compatibility with existing AV platforms and middleware.

After a high-level review of the AVS, I decided to use it for my autonomy visualization needs, and have been utilizing it for over two months now to analyze my AV data. I greatly appreciate the development team’s efforts, as the AVS has saved me significant time and effort in UI development. For those who are fully engaged in developing the core AV stack and don’t have sufficient time to devote to UI development, I strongly recommend considering AVS as a dependable visualization solution.
