This workshop is dedicated to event-based vision sensors and algorithms. Event-based cameras are revolutionary vision sensors with three key advantages: a measurement rate almost 1 million times faster than that of standard cameras, microsecond latency, and a dynamic range six orders of magnitude larger than that of standard cameras. Event-based sensors open frontiers that are unthinkable with standard cameras, which have been the dominant sensing technology for the past 50 years. These revolutionary sensors enable a new class of algorithms: tracking a baseball in the moonlight, building a flying robot with the agility of a fly, and localizing and mapping in challenging lighting conditions and at remarkable speeds. These sensors became commercially available in 2008 and are slowly being adopted in mobile robotics. They made headlines in 2016, with Intel and Bosch announcing a $15 million investment in the event-camera company Chronocam and Samsung announcing the use of its event camera with IBM's brain-inspired TrueNorth processor to recognize human gestures. This workshop will cover the sensing hardware as well as the processing, learning, and control methods needed to take advantage of these sensors.

Call for Papers, Extended Abstracts, and Live Demos!
We encourage the submission of extended abstracts (1-2 pages), full-length papers (6 pages), and live demos (IEEE LaTeX template).

List of confirmed speakers:


Previous related workshops: