Blog

Apr 24, 2019

University Research with Ouster Lidar

Raffi Mardirosian
The Kumar Lab’s Ouster and Ghost Robotics platform for their first SubT trial

A large fraction of Ouster’s employees started their careers in research and academia. We recognize the critical role research continues to play in robotics and from the start have supported researchers through steep discounts on our products. Today we’re furthering our support with the first of a series of blog posts highlighting the incredible work of our university partners. Email us if you’d like to be mentioned in a future post!

 

University of Pennsylvania, GRASP Lab

UPenn’s GRASP Lab is a robotics lab of 250 graduate students tackling challenges in ground and aerial vehicles, machine learning, and automation. Here we highlight the Kumar Lab research group working on the DARPA Subterranean Challenge.

The SubT Challenge is a 3-year competition to develop solutions to map, navigate, and search underground environments during time-sensitive disaster response scenarios, when settings can degrade drastically and create unknown hazards for first responders. The competition consists of three circuit events in different underground environments, culminating in one final event in August 2021.

The competition kicked off in September 2018 when DARPA selected nine teams to compete, including UPenn’s GRASP Lab working in collaboration with Ghost Robotics and Exyn Technologies. The team is currently preparing for the first circuit event scheduled to take place this August!  


The Kumar Lab is creating multiple robotics platforms that can achieve long periods of autonomy underground. To navigate through tunnel systems and collect high-resolution data, the team chose to use the Ouster OS1-64 lidar sensor. We spoke with team member Shreyas Shivakumar, a computer vision and robotics PhD student, to learn more about why they picked Ouster for a critical function of their SubT robot.

The OS1-64 was integrated with a color camera outdoors


“What first stood out to us was the ease of use and integration. It takes about 10 minutes to go from taking the OS1-64 out of the box to having it stream a nice point cloud to you in Rviz,” said Shreyas. A quick outdoor test fusing the sensor with a color camera also demonstrated how easy it is to discern objects from the point cloud alone.

The OS1-64 mounted on a SubT legged platform from Ghost Robotics


This month, the team tested their Ouster and Ghost Robotics platform at a SubT Integration Exercise and we hope to have more updates soon on their accomplishments!

 

Stevens Institute of Technology

SIT is a research university in New Jersey dedicated to engineering. One project at SIT’s Robust Field Autonomy Lab, led by Professor Brendan Englot, uses machine learning to enhance lidar resolution for ground vehicles. The team has successfully increased the resolution of sparse point cloud data by projecting the 3D data into 2D range images to simplify downstream processing, applying a convolutional neural network to upsample the data, and then converting the upsampled 2D data back to 3D.
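The 3D-to-2D projection at the heart of this pipeline can be sketched in a few lines of NumPy. The bin parameters here are assumptions for illustration (roughly a ±16.6° vertical field of view and 1024 azimuth columns, in line with the OS1-64's published specifications), and the CNN itself is omitted; this only shows the round trip between a point cloud and a range image.

```python
import numpy as np

def points_to_range_image(points, n_rows=64, n_cols=1024, v_fov=(-16.6, 16.6)):
    """Project an (N, 3) point cloud into an (n_rows, n_cols) range image.

    Each pixel stores the range (meters) of a point whose azimuth and
    elevation fall in that column/row bin; empty bins stay 0.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.sqrt(x**2 + y**2 + z**2)
    azimuth = np.arctan2(y, x)  # [-pi, pi]
    elevation = np.degrees(np.arcsin(z / np.maximum(r, 1e-9)))
    cols = ((azimuth + np.pi) / (2 * np.pi) * n_cols).astype(int) % n_cols
    rows = (elevation - v_fov[0]) / (v_fov[1] - v_fov[0]) * (n_rows - 1)
    rows = np.clip(rows.round().astype(int), 0, n_rows - 1)
    image = np.zeros((n_rows, n_cols))
    image[rows, cols] = r
    return image

def range_image_to_points(image, v_fov=(-16.6, 16.6)):
    """Invert the projection: every non-empty pixel becomes one 3D point."""
    n_rows, n_cols = image.shape
    rows, cols = np.nonzero(image)
    r = image[rows, cols]
    azimuth = cols / n_cols * 2 * np.pi - np.pi
    elevation = np.radians(v_fov[0] + rows / (n_rows - 1) * (v_fov[1] - v_fov[0]))
    x = r * np.cos(elevation) * np.cos(azimuth)
    y = r * np.cos(elevation) * np.sin(azimuth)
    z = r * np.sin(elevation)
    return np.stack([x, y, z], axis=1)
```

Working in the 2D range-image domain is what lets an ordinary image-style convolutional network operate on lidar data at all, which is the key design choice behind the approach.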

SIT is the first group to attempt this approach to lidar super-resolution enabled by deep learning. While to date they have trained their neural network using only computer-generated data from a simulation environment, they are using the OS1-64 lidar to validate their algorithm on real-world data.

Architecture for high-resolution lidar predictions using low-resolution data


In order to create their validation dataset, the team mounted an OS1-64 on a ground vehicle, gathered 64-beam data, and removed 75% of the scan lines to obtain 16-beam data. Because the Ouster lidar already outputs its data in an efficient 2D format, the team was able to skip the step of projecting the 3D point cloud into a 2D range image. This produced two datasets: 16-beam data to pass through their algorithm and 64-beam ground truth data to compare against the result. As you can see below, the predicted data is very close in resolution to the ground truth data.
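The dataset-generation step can be sketched simply: dropping 75% of the scan lines means keeping every fourth row of the 64-row range image. The nearest-neighbor upsampler below is only a trivial baseline for comparison, not the team's network.

```python
import numpy as np

def decimate_scan(range_image_64, factor=4):
    """Keep every `factor`-th scan line: 64-beam data becomes 16-beam input."""
    return range_image_64[::factor, :]

def naive_upsample(range_image_16, factor=4):
    """Nearest-neighbor row repetition, a trivial stand-in for the CNN."""
    return np.repeat(range_image_16, factor, axis=0)
```

A learned upsampler is evaluated against the held-back 64-beam rows exactly as this baseline would be, which is what makes the decimated data useful as ground truth.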

Testing with an Ouster dataset from the OS1-64


The team at SIT believes lidar users could apply this neural network to gather denser point clouds of difficult environments or heavily populated areas. Even as lidar resolution improves to 128-beam resolution and beyond, this approach could be applied to achieve still greater resolution.

 

Mississippi State University, HALO Project

The Center for Advanced Vehicular Systems (CAVS) at MSU is a multi-disciplinary automotive research center with over 300 researchers working on projects ranging from designing new steel alloys to modeling airflow around rockets. The MSU Halo Project, named after the exotic cars automakers sell to create a “halo” effect around the brand, is a self-driving, all-electric supercar designed to showcase the team’s expertise in automotive engineering.

“Within the Halo Project, our autonomous tech team is the largest group. They are dedicated to finding solutions for the 99% of the Earth that is not paved,” said CAVS Associate Director and Halo Project leader Matt Doude. There’s currently a lot of cool autonomous vehicle research focused on on-road use cases, but the team believes off-road environments are where autonomous technology can most exceed human abilities. Off-road, humans have to make imprecise judgments about traversability, whereas lidar can measure the exact dimensions of obstacles, and a vehicle dynamics model can determine with certainty whether the vehicle can clear a given log, hill, or hole.

MSU Halo has one of the only off-road autonomous vehicle proving grounds in the world, where they’ve been testing Ouster lidar. The team has three Ouster lidar units on their vehicle: one to scan the horizon for objects and potential obstacles, and two to map the terrain directly in front of the vehicle. The horizon-scanning lidar on top of the vehicle feeds data into a neural network used to classify objects and generate a 3D occupancy grid. The other two lidar sensors are mounted on the front of the vehicle, tilted on two axes, to ensure that the beam paths are roughly normal to the ground and that all of the beams intersect the ground in front of the vehicle. Using the data from these three sensors, the team can precisely model the terrain ahead of the vehicle and complete traversability estimation.
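A 2D occupancy grid of the kind the horizon-scanning lidar feeds can be sketched as follows. This is an illustrative simplification, not MSU's pipeline (which uses a neural network for classification); the cell size, grid extent, and obstacle-height threshold are all hypothetical parameters chosen for the example.

```python
import numpy as np

def occupancy_grid(points, cell_size=0.5, extent=20.0, min_height=0.3):
    """Mark a 2D grid cell occupied when any point higher than `min_height`
    meters (relative to the sensor's ground plane) falls inside it.

    `extent` is the half-width, in meters, of the square grid around the
    vehicle; `cell_size` is the side length of one cell.
    """
    n = int(2 * extent / cell_size)
    grid = np.zeros((n, n), dtype=bool)
    # keep points inside the grid and tall enough to count as obstacles
    mask = (np.abs(points[:, 0]) < extent) & (np.abs(points[:, 1]) < extent) \
           & (points[:, 2] > min_height)
    ix = ((points[mask, 0] + extent) / cell_size).astype(int)
    iy = ((points[mask, 1] + extent) / cell_size).astype(int)
    grid[ix.clip(0, n - 1), iy.clip(0, n - 1)] = True
    return grid
```

In an off-road setting the interesting part is what happens below the height threshold: rather than discarding low returns, the terrain-mapping lidars use them to build the elevation model that the traversability estimate depends on.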

MSU Halo vehicle mapping the woods and off-road terrain with Ouster OS1-64


The MSU Halo team is compiling the most comprehensive, detailed, multi-modal set of off-road autonomous vehicle data, and plans to open-source that data set on their website. They hope to spur mobility beyond urban centers to rural, undeveloped, and remote environments.

 

Stay tuned for more

Ouster is currently supporting many universities on their autonomous vehicle and robotics competitions through reduced pricing on sensors. We also can’t wait to share more university projects this year. If you are working in non-profit research, reach out to us for discounted pricing or a mention on our blog!