This workshop is dedicated to event-based cameras, smart cameras, and algorithms. Event-based cameras are revolutionary vision sensors with three key advantages: a measurement rate almost one million times faster than that of standard cameras, a latency of microseconds, and a dynamic range eight orders of magnitude larger than that of standard cameras. Because of these advantages, event-based cameras open frontiers that are unthinkable with standard cameras, which have been the dominant sensing technology for the past 60 years. These revolutionary sensors enable the design of a new class of algorithms that can track a baseball in the moonlight, build a flying robot with the same agility as a fly, and perform structure from motion in challenging lighting conditions and at remarkable speeds. These sensors became commercially available in 2008 and are slowly being adopted in computer vision and robotics. They have made headlines in recent years, with event-camera company Prophesee receiving $40 million in investment from Intel and Bosch, and Samsung announcing mass production as well as the sensor's use in combination with the IBM TrueNorth processor to recognize human gestures.
Cellular processor arrays (CPAs), such as the SCAMP sensor, are novel sensors in which each pixel has a programmable processor; they thus enable massively parallel processing near the image plane. Unlike a conventional image sensor, the SCAMP does not output raw images, but rather the results of on-sensor computations, for instance a feature map or an optic-flow map. Because early vision computations are carried out entirely on-sensor, the resulting system achieves high speed and low power consumption, enabling new embedded vision applications in areas such as robotics, AR/VR, automotive, gaming, and surveillance.
This workshop will cover the sensing hardware, as well as the processing and learning methods needed to take advantage of these novel cameras.
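As a concrete illustration (not part of the workshop materials): an event camera outputs a stream of asynchronous events, each carrying pixel coordinates, a microsecond timestamp, and a polarity sign. A minimal sketch of accumulating such events into a simple "event frame" might look like the following; the `Event` record and `accumulate` helper are hypothetical names chosen for this example.

```python
from dataclasses import dataclass

# Hypothetical minimal event record: pixel location, timestamp in
# microseconds, and polarity (+1 for a brightness increase, -1 for a decrease).
@dataclass
class Event:
    x: int
    y: int
    t_us: int
    polarity: int

def accumulate(events, width, height):
    """Sum event polarities per pixel into a simple event frame."""
    frame = [[0] * width for _ in range(height)]
    for e in events:
        frame[e.y][e.x] += e.polarity
    return frame

# Example: three events at two pixels of a 2x2 sensor.
events = [Event(1, 0, 10, +1), Event(1, 0, 25, +1), Event(0, 1, 30, -1)]
frame = accumulate(events, width=2, height=2)
# frame == [[0, 2], [-1, 0]]
```

Many of the algorithms solicited below (optical flow, tracking, reconstruction) operate on such event streams either event-by-event or via accumulated representations like this one.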
Call for Papers and Demos:
Research papers and demos are solicited in, but not limited to, the following topics:
- Event-based / neuromorphic vision.
- Near-focal plane processing, such as cellular processor arrays (e.g., SCAMP sensor).
- Algorithms: visual odometry, SLAM, 3D reconstruction, optical flow estimation, image intensity reconstruction, recognition, stereo depth reconstruction, feature/object detection and tracking, calibration, sensor fusion.
- Model based, embedded or learning approaches.
- Event-based signal processing, control, bandwidth control.
- Event-based active vision.
- Datasets and/or simulators.
- Applications in: robotics (navigation, manipulation, drones...), automotive, IoT, AR/VR, space, inspection, surveillance, crowd counting, physics.
- Biologically-inspired vision and smart cameras.
- Novel hardware (cameras, neuromorphic processors, etc.) and/or software platforms.
- New trends and challenges in event-based and/or biologically-inspired vision.
Accepted papers at the main conference:
Authors of accepted papers on the above topics at the main conference (CVPR'19) are encouraged to contact the workshop organizers (contact details below) to make arrangements to showcase their work at the workshop.
We solicit live demonstrations of event-based vision systems and prototypes. We plan to have a dedicated poster and demonstration session for authors to interact with the audience and show their systems and solutions.
Submission website: https://cmt3.research.microsoft.com/EVENTVISION2019
Author guidelines: http://cvpr2019.thecvf.com/submission/main_conference/author_guidelines
- For paper submission, please refer to CVPR guidelines.
- See also the policy on dual/double submissions of concurrently reviewed conferences, such as ICCV'19. For this reason, authors may wish to limit their submission to four pages, including references.
- For demo abstract submission, authors are encouraged to submit an abstract of up to two pages.
FAQ:
- What is an event camera? Watch this video explanation.
- What are possible applications of event cameras? A list of papers can be found here.
- Where can I buy an event camera? From iniVation, Insightness, Prophesee, Hillhouse Technology.
- Are there datasets and simulators I can play with, so that I don't have to buy the sensor? Yes: see the linked dataset, simulator, and more.
- What is the SCAMP sensor? Read the explanation on this page.
- What are possible applications of the SCAMP sensor? Some applications can be found here.
- Where can I buy a SCAMP sensor? It is not commercially available. Contact Prof. Piotr Dudek.
- Where can I find more information? Check out this List of Event-based Vision Resources.
Organizers:
- Davide Scaramuzza - University of Zurich
- Guillermo Gallego - University of Zurich
- Kostas Daniilidis - UPenn
Previous related workshops:
- ICRA 2017 First International Workshop on Event-based Vision.
- ICRA 2015 Workshop on Innovative Sensing for Robotics, with focus on Neuromorphic Sensors.
- IROS 2015 Workshop on Alternative Sensing for Robot Perception: Event-Based Vision for High-Speed Robotics (slides).
- The Telluride Neuromorphic Cognition Engineering Workshops.
- Capo Caccia Workshops toward Cognitive Neuromorphic Engineering.