8:00 | SESSION 1
8:00 | Introduction to the workshop. Slides |
8:05 | Andrew Davison (Imperial College London and SLAMCore) Novel Hardware for Spatial AI. Slides, YouTube |
8:30 | Davide Scaramuzza and Guillermo Gallego (University of Zurich) Event-based Cameras: Challenges and Opportunities. Slides, YouTube |
8:55 | Hyunsurk Eric Ryu (Samsung Electronics) Industrial DVS Design: Key Features and Applications. Slides, YouTube |
9:20 | Contributed papers pitch
10:00 | Posters and Live Demos during coffee break |
10:30 | SESSION 2 |
10:30 | Kostas Daniilidis and Alex Zhu (University of Pennsylvania) Unsupervised Learning of Optical Flow and Camera Motion from Event Data. Slides, YouTube |
10:55 | Mike Davies (Intel Corp.) Realizing the Promise of Spiking Neuromorphic Hardware. Slides, YouTube |
11:20 | Garrick Orchard (Intel Corp.) Spiking Neural Networks for Event-based Vision. Slides, YouTube |
11:45 | Cornelia Fermüller (University of Maryland) Object Motion Estimation and Grouping from Event Data. Slides, YouTube
12:15 | Lunch Break |
13:15 | SESSION 3 |
13:15 | Piotr Dudek (University of Manchester) SCAMP-5: Vision Sensor with Pixel Parallel SIMD Processor Array. Slides, YouTube |
13:40 | Robert Mahony (Australian National University) Asynchronous Convolutions and Image Reconstruction. Slides, YouTube |
14:05 | Yuchao Dai (Australian National University) Bringing a Blurry Frame Alive at High Frame-Rate with an Event Camera. Slides, YouTube |
14:15 | Henri Rebecq (University of Zurich) Events-to-Video: Bringing Modern Computer Vision to Event Cameras. Slides, YouTube |
14:25 | Yusuke Sekikawa (Denso IT Laboratory) EventNet: Asynchronous recursive event processing. Slides, YouTube |
14:35 | Yulia Sandamirskaya (University of Zurich and ETH Zurich) Neuromorphic Computing: towards event-based cognitive sensing and control. Slides, YouTube |
14:50 | Julien N.P. Martel (Stanford University) Bringing computation on the focal plane: algorithms and systems for sensors with in-pixel processing capabilities. Slides, YouTube |
15:15 | Posters and Live Demos during coffee break |
15:45 | SESSION 4 |
15:45 | Amos Sironi (Prophesee) Learning from Events: on the Future of Machine Learning for Event-based Cameras. Slides, YouTube |
16:05 | Kynan Eng (CEO, iniVation) Applications, Software and Hardware for Event-Based Vision. Slides, YouTube
16:25 | Stefan Isler (Insightness) Event-based Vision for Augmented Reality. Slides, YouTube |
16:45 | Shoushun Chen (founder, CelePixel Technology) Introduction of the CeleX Family Sensor and Event/Frame/Optical-flow Hybrid Processing. Slides, YouTube
17:05 | Award Ceremony. Slides. Best Paper Award: Asynchronous Convolutional Networks for Object Detection in Neuromorphic Cameras. Best Paper Award Finalist: Star Tracking using an Event Camera
17:15 | Panel discussion. Slides |
Live Demos (during breaks)
- Laurie Bose, Jianing Chen, Stephen Carey, Piotr Dudek, Walterio Mayol-Cuevas: Digit Recognition on Pixel Processor Arrays
- Qian Liu, Ole Richter, Carsten Nielsen, Sadique Sheik, Giacomo Indiveri, Ning Qiao: Face Recognition on an Ultra-low Power Event-driven Convolutional Neural Network ASIC
- Min Liu, Wei-Tse Kao, Tobi Delbruck: A Real-time Event-based Fast Corner Detection Demo based on FPGA
- Gongyu Yang, Qilin Ye, Wanjun He, Lifeng Zhou, Xinyu Chen, Lei Yu, Wen Yang, Shoushun Chen, Wei Li: Real-time VI-SLAM with High-Resolution Event Camera
- Shoushun Chen, Menghan Guo: CeleX-V: a 1M Pixel Multi-Mode Event-based Sensor
- Prasan Shedligeri, Kaushik Mitra: Joint Estimation of Optical Flow and Intensity Image from Event Sensors
- Alex Z. Zhu, Liangzhe Yuan, Kenneth Chaney, Kostas Daniilidis: Unsupervised Event-based Learning of Optical Flow, Depth and Egomotion
Objectives:
This workshop is dedicated to event-based cameras, smart cameras, and the algorithms that process their data.
Event-based cameras are revolutionary vision sensors with three key advantages: a measurement rate almost one million times faster than that of standard cameras, microsecond latency, and a dynamic range eight orders of magnitude larger than that of standard cameras. Because of these advantages, event-based cameras open frontiers that are unthinkable with standard cameras, which have been the dominant sensing technology for the past 60 years. These sensors enable the design of a new class of algorithms that can track a baseball in the moonlight, give a flying robot the same agility as a fly, and perform structure from motion in challenging lighting conditions and at remarkable speeds. Event cameras became commercially available in 2008 and are slowly being adopted in computer vision and robotics. They have made mainstream news in recent years, with event-camera company Prophesee receiving a $40 million investment from Intel and Bosch, and Samsung announcing mass production as well as the use of its sensor in combination with the IBM TrueNorth processor to recognize human gestures.
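To make the event data model concrete, below is a minimal Python/NumPy sketch. It is an illustrative assumption, not code from any of the talks: the Event type, the events_to_frame helper, and the toy stream are hypothetical. It shows the essential idea that each pixel asynchronously emits a timestamped polarity event when it detects a brightness change, and that a common first processing step is to accumulate events into a frame-like representation.

```python
# Minimal sketch (illustrative, not from the workshop materials) of the
# asynchronous event data model: each pixel independently reports a
# timestamped brightness change, instead of full frames at a fixed rate.
from typing import List, NamedTuple

import numpy as np


class Event(NamedTuple):
    t: float  # timestamp in seconds (microsecond resolution in real sensors)
    x: int    # pixel column
    y: int    # pixel row
    p: int    # polarity: +1 for brightness increase, -1 for decrease


def events_to_frame(events: List[Event], height: int, width: int) -> np.ndarray:
    """Accumulate signed events into a 2D histogram, a common first step
    before applying frame-based vision algorithms to event data."""
    frame = np.zeros((height, width), dtype=np.int32)
    for e in events:
        frame[e.y, e.x] += e.p
    return frame


# Toy stream: events sweeping across a 4x4 sensor, one microsecond apart.
stream = [Event(t=1e-6 * i, x=i % 4, y=i // 4, p=1) for i in range(16)]
print(events_to_frame(stream, height=4, width=4))
```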
Cellular processor arrays (CPAs), such as the SCAMP sensor, are novel sensors in which each pixel has a programmable processor, enabling massively parallel processing near the image plane. Unlike a conventional image sensor, the SCAMP does not output raw images, but rather the results of on-sensor computations, for instance a feature map or an optic-flow map. Because early vision computations are carried out entirely on-sensor, the resulting system achieves high speed and low power consumption, enabling new embedded vision applications in areas such as robotics, AR/VR, automotive, gaming, and surveillance.
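As a rough illustration of this pixel-parallel model, the following Python/NumPy sketch (an assumption for exposition only, not SCAMP's actual instruction set or API) emulates a processor array in which the same operation is applied at every pixel simultaneously, so that only a computed edge map, rather than the raw image, leaves the sensor.

```python
# Illustrative emulation (not SCAMP's real programming model) of a
# pixel-parallel array: every pixel applies the same operation to its own
# value and its neighbours, and only the resulting feature map is output.
import numpy as np


def ppa_edge_map(image: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """One 'instruction' broadcast to all pixels at once: each pixel
    differences itself against its right and lower neighbours and
    thresholds the result into a binary edge map. np.roll wraps around at
    the borders, a simplification of real boundary handling."""
    gx = np.abs(image - np.roll(image, -1, axis=1))  # horizontal difference
    gy = np.abs(image - np.roll(image, -1, axis=0))  # vertical difference
    return ((gx + gy) > threshold).astype(np.uint8)


rng = np.random.default_rng(0)
img = rng.random((8, 8)).astype(np.float32)  # stand-in for the focal-plane image
print(ppa_edge_map(img))  # only this map would leave the sensor, not img
```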
This workshop will cover the sensing hardware, as well as the processing and learning methods needed to take
advantage of the above-mentioned novel cameras.
FAQs:
- What is an event camera? Watch this video explanation.
- What are possible applications of event cameras? A list of papers can be found here.
- Where can I buy an event camera? From iniVation, Insightness, Prophesee, or Hillhouse Technology.
- Are there datasets and simulators I can play with, so that I don't have to buy the sensor? Yes: Dataset, Simulator, More.
- What is the SCAMP sensor? Read this page.
- What are possible applications of the SCAMP sensor? Some applications can be found here.
- Where can I buy a SCAMP sensor? It is not commercially available. Contact Prof. Piotr Dudek.
- Where can I find more information? Check out this List of Event-based Vision Resources.
Organizers:
- Davide Scaramuzza - University of Zurich
- Guillermo Gallego - University of Zurich
- Kostas Daniilidis - University of Pennsylvania
Previous related workshops:
- IROS 2018 Workshop on Unconventional Sensing and Processing for Robotic Visual Perception.
- ICRA 2017 First International Workshop on Event-based Vision.
- ICRA 2015 Workshop on Innovative Sensing for Robotics, with focus on Neuromorphic Sensors.
- IROS 2015 Workshop on Alternative Sensing for Robot Perception: Event-Based Vision for High-Speed Robotics (slides).
- The Telluride Neuromorphic Cognition Engineering Workshops.
- Capo Caccia Workshops toward Cognitive Neuromorphic Engineering.