The Uber Visualization & ATG teams today are excited to announce the open source release of the Autonomous Visualization System (AVS)—a powerful, web-based 3D visualization toolkit for building applications with self-driving and robotics data!
AVS is a new standard for describing and visualizing source-agnostic autonomous vehicle perception, motion, and planning data—offering a new way for developers to build applications for exploring, interacting with, and, most critically, making important development decisions around autonomy-specific data. Developers can quickly deploy applications for viewing logs, triaging results, remote assistance, or previewing simulation runs, freeing up time to focus on core autonomy development instead.
“At Applied Intuition, we're working with the most sophisticated AV teams in the world, and they require the most sophisticated tools,” says Peter Ludwig, CTO of Applied Intuition. “AVS falls in line with this, and what's notably great is that it's web-based and fills a need in the community to not rebuild the same visualization tools again and again. This is an awesome move from Uber for the rest of the AV community.”
We are thrilled to be sharing AVS with the broader autonomous community in the hope that collaboration across the industry will unlock more advancement, define a new standard, and lead to safer, more efficient transportation solutions for everyone.
Check out the full announcement on our blog!
Interested in partnering? Please reach out to firstname.lastname@example.org or join the AVS Slack group.