Time and Date: Monday, June 17, 2019. Full-day workshop: 8:15 AM - 5:00 PM.

Location: room XX - Long Beach Convention & Entertainment Center, Long Beach, California

Confirmed Speakers and Companies:

Schedule: TBD

Objectives:
This workshop is dedicated to event-based cameras, smart cameras, and algorithms. Event-based cameras are revolutionary vision sensors with three key advantages: a measurement rate almost one million times faster than that of standard cameras, a latency of microseconds, and a dynamic range eight orders of magnitude larger than that of standard cameras. Because of these advantages, event-based cameras open frontiers that are unthinkable with standard cameras, which have been the main sensing technology for the past 60 years. These revolutionary sensors enable the design of a new class of algorithms to track a baseball in the moonlight, build a flying robot with the agility of a fly, and perform structure from motion in challenging lighting conditions and at remarkable speeds. These sensors became commercially available in 2008 and are slowly being adopted in computer vision and robotics. They have made mainstream news in recent years, with event-camera company Prophesee receiving $40 million in investment from Intel and Bosch, and Samsung announcing mass production as well as the sensor's use in combination with the IBM TrueNorth processor to recognize human gestures.

Cellular processor arrays (CPAs), such as the SCAMP sensor, are novel sensors in which each pixel has a programmable processor; they thus yield massively parallel processing near the image plane. Unlike a conventional image sensor, the SCAMP does not output raw images, but rather the results of on-sensor computations, for instance a feature map or an optic-flow map. Because early vision computations are carried out entirely on-sensor, the resulting system achieves high speed and low power consumption, enabling new embedded vision applications in areas such as robotics, AR/VR, automotive, gaming, and surveillance.

This workshop will cover the sensing hardware, as well as the processing and learning methods needed to take advantage of these novel cameras.
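To make the event-based data model concrete, below is a minimal Python sketch of the standard event-generation principle: a pixel fires an event (x, y, t, polarity) whenever its log-intensity changes by more than a contrast threshold, and batches of events can be summed into a frame before applying conventional vision algorithms. All names and the threshold value are illustrative assumptions, not part of any vendor SDK, and real sensors are asynchronous rather than frame-based.

    # A minimal, frame-based simulation of the event-camera data model
    # (illustrative only; real sensors are asynchronous and track a
    # per-pixel reference level rather than comparing whole frames).
    import numpy as np

    C = 0.2  # contrast threshold; the value here is illustrative

    def generate_events(prev_log_I, curr_log_I, t):
        """Emit events (x, y, t, polarity) at pixels whose log-intensity
        changed by at least C between the two frames."""
        diff = curr_log_I - prev_log_I
        ys, xs = np.nonzero(np.abs(diff) >= C)
        # polarity: +1 for a brightness increase, -1 for a decrease
        return [(x, y, t, 1 if diff[y, x] > 0 else -1) for y, x in zip(ys, xs)]

    def accumulate(events, shape):
        """Sum event polarities into a 2D frame -- a common first step
        before applying conventional vision algorithms to event data."""
        frame = np.zeros(shape, dtype=np.int32)
        for x, y, _, p in events:
            frame[y, x] += p
        return frame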

Call for Papers and Demos:
Research papers and demos are solicited in, but not limited to, the following topics:

  • Event-based / neuromorphic vision.
  • Near-focal plane processing, such as cellular processor arrays (e.g., SCAMP sensor).
  • Algorithms: visual odometry, SLAM, 3D reconstruction, optical flow estimation, image intensity reconstruction, recognition, stereo depth reconstruction, feature/object detection and tracking, calibration, sensor fusion.
  • Model based, embedded or learning approaches.
  • Event-based signal processing, control, bandwidth control.
  • Event-based active vision.
  • Datasets and/or simulators.
  • Applications in: robotics (navigation, manipulation, drones...), automotive, IoT, AR/VR, space, inspection, surveillance, crowd counting, physics.
  • Biologically-inspired vision and smart cameras.
  • Novel hardware (cameras, neuromorphic processors, etc.) and/or software platforms.
  • New trends and challenges in event-based and/or biologically-inspired vision.

Accepted papers at the main conference:
Authors of papers on the above topics accepted at the main conference (CVPR'19) are encouraged to contact the workshop organizers (contact details below) to arrange to showcase their work at the workshop.

Demos:
We solicit live demonstrations of event-based vision systems and prototypes. We plan to have a dedicated poster and demonstration session for authors to interact with the audience and show their systems and solutions.

Important Dates:
See the right panel.

Paper/Demo Submission:
Submission website: https://cmt3.research.microsoft.com/EVENTVISION2019
Author guidelines: http://cvpr2019.thecvf.com/submission/main_conference/author_guidelines

FAQs:

Organizers:

Previous related workshops: