Author

  • Manasi Muglikar*, Diederik Paul Moeys, Davide Scaramuzza*
  • * External authors

Company

  • Sony Europe B.V.

Venue

  • 3DV

Date

  • 2021

Event Guided Depth Sensing

Abstract

Active depth sensors like structured light, lidar, and time-of-flight systems sample the depth of the entire scene uniformly at a fixed scan rate. This leads to limited spatiotemporal resolution, where redundant static information is over-sampled and precious motion information might be under-sampled. In this paper, we present an efficient bio-inspired event-camera-driven depth estimation algorithm. In our approach, we dynamically illuminate areas of interest densely, depending on the scene activity detected by the event camera, and sparsely illuminate areas in the field of view with no motion. The depth estimation is achieved by an event-based structured-light system consisting of a laser point projector coupled with a second event-based sensor tuned to detect the reflection of the laser from the scene. We show the feasibility of our approach in a simulated autonomous driving scenario and in real indoor sequences using our prototype. We show that, in natural scenes like autonomous driving and indoor environments, moving edges correspond to less than 10% of the scene on average. Thus, our setup requires the sensor to scan only 10% of the scene, which could lead to almost 90% less power consumption by the illumination source. While we present the evaluation and proof of concept for an event-based structured-light system, the ideas presented here are applicable to a wide range of depth-sensing modalities like lidar, time-of-flight, and standard stereo. Video is available at https://youtu.be/Rvv9IQLYjCQ.
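
To make the adaptive-sampling idea concrete, here is a minimal Python sketch, assuming events arrive as (x, y, t, polarity) tuples: it accumulates events into a per-pixel activity map, thresholds it to find moving edges, and builds a scan mask that samples those regions densely while covering the static background with a sparse grid. This is an illustrative reconstruction, not the authors' implementation; all function names, thresholds, and grid steps are assumptions.

    # Hypothetical sketch of event-guided adaptive laser scanning.
    # Not the paper's code: event format, threshold, and grid steps are assumed.
    import numpy as np

    def activity_mask(events, height, width, threshold=3):
        """Count events per pixel and keep pixels with enough activity (moving edges)."""
        counts = np.zeros((height, width), dtype=np.int32)
        for x, y, _t, _polarity in events:
            counts[y, x] += 1
        return counts >= threshold

    def scan_mask(active, dense_step=1, sparse_step=8):
        """Sample moving regions densely and the static background on a sparse grid."""
        mask = np.zeros_like(active)
        mask[::sparse_step, ::sparse_step] = True   # sparse grid over the whole scene
        dense = np.zeros_like(active)
        dense[::dense_step, ::dense_step] = True    # dense grid, restricted below
        mask |= active & dense                      # dense sampling only where motion was seen
        return mask

    # Toy example: a 480x640 view with one small moving patch.
    h, w = 480, 640
    events = [(320 + dx, 240 + dy, 0.0, 1)
              for dx in range(20) for dy in range(20) for _ in range(3)]
    mask = scan_mask(activity_mask(events, h, w))
    print(f"fraction of scene scanned: {mask.mean():.1%}")

With sparse_step=8 the static background alone costs only about 1.6% of a full scan, so the total scan budget is dominated by the moving-edge fraction, consistent with the abstract's observation that moving edges cover under 10% of typical scenes.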
