In the last blog post, we saw how easy it is to install the Ouster Python SDK and launch its visualizer with the simple-viz command to view point clouds. While point clouds provide great visual validation, the Ouster SDK Visualizer exposes much more of the information collected by our lidar sensors. To understand it better, let’s review our data layers.
Our lidar sensor outputs four data layers for each pixel: Range, Signal, Near-IR, and Reflectivity.
⇒ Range: The distance of the point from the sensor origin, calculated using the time of flight of the laser pulse
⇒ Signal: The strength of the light returned to the sensor for a given point. For Ouster digital lidar sensors, signal is expressed as the number of photons of light detected
⇒ Near-IR: The strength of sunlight at the 865 nm wavelength collected for a given point, also expressed as the number of photons detected, that was not produced by the sensor’s laser pulse
⇒ Reflectivity: The reflectivity of the surface (or object) that was detected by the sensor
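The time-of-flight relationship behind the Range layer is easy to sketch: the one-way distance is half the round-trip travel time multiplied by the speed of light. A minimal illustration (the pulse time below is a made-up value, not an actual sensor reading):

```python
# Sketch: converting a laser pulse's round-trip time of flight to range.
# The 333 ns time value is illustrative, not real sensor output.

SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def range_from_time_of_flight(round_trip_s: float) -> float:
    """Return the one-way distance in metres for a round-trip time in seconds."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

# A return detected ~333 ns after the pulse left corresponds to roughly 50 m:
print(round(range_from_time_of_flight(333e-9), 1))  # 49.9
```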
In the Ouster SDK Visualizer, you will see two of the four screens below at the top of the window:
These 2D image outputs can be cycled through by pressing ‘b’ for the top image and ‘n’ for the bottom image. You can also enlarge or shrink the 2D images by pressing ‘e’ or ‘E.’ If you’ve tried the Ouster SDK Visualizer, you might have noticed selections called ‘Signal2,’ ‘Range2,’ and ‘Reflectivity2.’ These layers are only available on Rev 6 lidar sensors, which offer dual returns, and are also shown as point clouds. You can tell whether the output includes dual returns from the on-screen display at the bottom left corner.
The on-screen display, which can be toggled by pressing ‘o,’ shows:
- Image[X, Y] – X and Y show the top and bottom 2D image selections.
- Cloud[1, 2]: X
- [1, 2] indicates which return points are toggled on. If the file or lidar sensor does not support dual returns, toggling 2 on and off has no visual effect.
- X shows the current data layer.
- Frame – Indicates the current frame number, which can be useful when extracting a single frame.
- Sensor ts – Displays the current timestamp of the lidar sensor.
- Profile – Shows the current data profile:
- Single Return Profile: RNG19_RFL8_SIG16_NIR16
- Dual Return Profile: RNG19_RFL8_SIG16_NIR16_DUAL
- Low Data Rate Profile: RNG15_RFL8_NIR8
- Legacy Profile (Before FW 2.3): LEGACY
- Lidar sensor information – Shows the product version, firmware version, and the scan details [columns per frame] x [rotations per second].
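The profile names above encode each per-pixel field and its nominal bit depth (RNG19 is a 19-bit range value, RFL8 an 8-bit reflectivity value, and so on). As a quick illustration, here is a hypothetical helper that pulls those widths out of a profile name — the parsing logic is our own sketch, not part of the Ouster SDK:

```python
import re

def profile_fields(profile: str) -> dict:
    """Map each field token of a profile name (e.g. 'RNG19_RFL8_SIG16_NIR16')
    to its nominal bit width. Tokens without a width ('DUAL', 'LEGACY') are
    skipped."""
    fields = {}
    for token in profile.split("_"):
        match = re.fullmatch(r"([A-Z]+)(\d+)", token)
        if match:
            fields[match.group(1)] = int(match.group(2))
    return fields

print(profile_fields("RNG19_RFL8_SIG16_NIR16_DUAL"))
# {'RNG': 19, 'RFL': 8, 'SIG': 16, 'NIR': 16}
```

Note how the low data rate profile trades bit depth for bandwidth: `profile_fields("RNG15_RFL8_NIR8")` shows a 15-bit range and no signal field at all.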
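Those scan details also give you a quick sanity check on throughput: columns per frame times rotations per second times the number of channels yields points per second per return. Using an assumed 128-channel sensor in 1024 x 10 mode (example numbers, not taken from the on-screen display):

```python
# Illustrative throughput arithmetic. The 128-channel sensor and 1024x10
# scan mode are assumed example values, not output from the visualizer.
columns_per_frame = 1024
rotations_per_second = 10
channels = 128

points_per_second = columns_per_frame * rotations_per_second * channels
print(points_per_second)  # 1310720
```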
If you have dual returns, you can toggle the first return or second return point cloud visibility by pressing ‘1’ or ‘2’:
Similar to the 2D images, the point cloud can be cycled through the data layers by pressing ‘m,’ and you can change the size of points by pressing ‘p’ and ‘P’:
You can easily change the camera angle with left-click and drag, move the camera view with middle-click or shift left-click and drag, and zoom in or out by scrolling up and down. If you want to reset the view, simply press ‘R.’
Another useful set of keyboard shortcuts is the frame controls. You can pause the playback by pressing ‘space’ and move frame by frame with ‘,’ and ‘.’ or by 10 frames using ‘ctrl + ,’ and ‘ctrl + .’ The playback speed can also be changed using ‘<’ and ‘>’. For the full list of keyboard shortcuts, visit this documentation page.
As you can see, it’s really easy to launch the Ouster SDK Visualizer to stream or replay point clouds and examine various settings visually. In the following blog posts, we will examine how the Ouster Python SDK can be used in other ways to speed up the development of lidar applications. If you have any questions, feel free to visit our GitHub page and contact us.