Bio-Inspired Vision with Events


Abstract

This presentation will review recent progress in bio-inspired visual processing hardware and algorithms, with a focus on activity-driven, event-based cameras. These cameras mimic the brain's use of spikes by asynchronously outputting a digital stream of events signifying scene activity. Such sensors have recently reached QVGA spatial resolution with microsecond temporal resolution and 10 µs latency. Their local gain control and sparse output make them ideal for fast object tracking and high-speed visual robotics under uncontrolled lighting conditions, but they are also useful for sustained observation of scenes with sparse temporal activity, e.g., surveillance. Parallel hardware convolution processing of this event data is also under active development.
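To make the sparse, asynchronous output concrete, the sketch below shows a minimal software model of an event stream and a time-windowed activity map. The `Event` fields (pixel address, microsecond timestamp, polarity) mirror the general address-event style of such cameras, but the names, the `accumulate` helper, and the example stream are all illustrative assumptions, not any specific camera's API.

```python
from collections import namedtuple

# An event-camera style event: pixel address, microsecond timestamp, and
# polarity (+1 brightness increase, -1 decrease). The field names here
# are illustrative; real sensors emit a similar address-event record.
Event = namedtuple("Event", ["x", "y", "t_us", "polarity"])

def accumulate(events, width, height, window_us, now_us):
    """Build a simple activity map by summing event polarities within a
    recent time window -- one basic way to exploit sparse, asynchronous
    output without waiting for full frames."""
    frame = [[0] * width for _ in range(height)]
    for ev in events:
        # Keep only events that occurred within the last window_us.
        if 0 <= now_us - ev.t_us <= window_us:
            frame[ev.y][ev.x] += ev.polarity
    return frame

# Hypothetical stream: a bright edge crossing three pixels of a 4x1 array.
stream = [Event(0, 0, 100, +1), Event(1, 0, 150, +1), Event(2, 0, 5000, -1)]
frame = accumulate(stream, width=4, height=1, window_us=1000, now_us=1000)
# Only the two events inside the 1000 us window contribute to the map.
```

Because each event carries its own timestamp, a tracker can update state per event at microsecond granularity rather than at a fixed frame rate, which is the property the abstract highlights for fast tracking and robotics.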