Time Lens: Event-based Video Frame Interpolation



Time Lens (CVPR 2021)


Time Lens++ (CVPR 2022)



Description

Overview: State-of-the-art frame interpolation methods generate intermediate frames by inferring object motion in the image from consecutive key-frames. In the absence of additional information, first-order approximations, i.e., optical flow, must be used, but this choice restricts the types of motion that can be modeled, leading to errors in highly dynamic scenarios. Event cameras are novel sensors that address this limitation by providing auxiliary visual information in the blind time between frames.
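To make the first-order limitation concrete, here is a minimal sketch (plain NumPy, with a hypothetical curved trajectory chosen purely for illustration) of how a linear motion model drifts from the true path between two key-frames:

```python
import numpy as np

# First-order (optical-flow) interpolation assumes pixels travel on straight
# lines between key-frames: x(t) = x0 + t * (x1 - x0), t in [0, 1].
def linear_position(x0, x1, t):
    return x0 + t * (x1 - x0)

# A hypothetical curved trajectory (half circle) that a linear model cannot capture.
def true_position(t):
    angle = np.pi * t
    return np.array([np.cos(angle), np.sin(angle)])

x0, x1 = true_position(0.0), true_position(1.0)
for t in (0.25, 0.5, 0.75):
    err = np.linalg.norm(linear_position(x0, x1, t) - true_position(t))
    print(f"t={t:.2f}: linear-approximation error = {err:.3f}")
```

At the midpoint the linear model is off by the full radius of the arc; events recorded between the frames reveal the actual trajectory.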

Time Lens: Our seminal work, Time Lens, leverages the advantages of both sensors, outperforming purely frame-based methods by up to 5.21 dB in PSNR on various real-world and synthetic datasets. It does this by combining warping- and synthesis-based interpolation: while warping-based interpolation relies on non-linear flow estimated from events, synthesis-based interpolation handles violations of brightness constancy, where optical flow is ill-defined.
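As a rough illustration of this fusion idea (a sketch in the spirit of Time Lens, not the released code; tensor names and shapes are assumptions), the two candidate interpolations can be blended by a learned per-pixel attention map:

```python
import torch

# Hypothetical fusion step: a per-pixel attention map in [0, 1] blends the
# warping-based and synthesis-based candidate frames.
def fuse(warped, synthesized, attention):
    # attention -> 1 favours the warping branch (accurate where flow is valid),
    # attention -> 0 favours the synthesis branch (robust to brightness changes).
    return attention * warped + (1.0 - attention) * synthesized

B, C, H, W = 1, 3, 128, 128
warped = torch.rand(B, C, H, W)       # warping-based candidate frame
synthesized = torch.rand(B, C, H, W)  # synthesis-based candidate frame
attention = torch.rand(B, 1, H, W)    # predicted blending weights
frame = fuse(warped, synthesized, attention)
```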

Time Lens++: In our recent follow-up work, Time Lens++, we address the remaining failure cases of Time Lens and improve its runtime by introducing multi-scale feature-level fusion and computing one-shot non-linear inter-frame motion. These modules eliminate significant artefacts caused by (i) brittle image-level fusion, (ii) potentially temporally inconsistent motion estimation, and (iii) ghosting in low-event-rate regions due to optical-flow failure.
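A minimal sketch of what "one-shot" parametric non-linear motion buys (the cubic parameterization below is an assumption for illustration): the network predicts per-pixel motion coefficients once, and the flow to any target time follows in closed form, with no re-estimation per intermediate frame:

```python
import torch

# coeffs: (B, 3, 2, H, W) holding (a1, a2, a3) for the x/y displacement,
# with flow(t) = a1*t + a2*t**2 + a3*t**3, so flow(0) = 0 by construction.
def flow_at(coeffs, t):
    a1, a2, a3 = coeffs[:, 0], coeffs[:, 1], coeffs[:, 2]
    return a1 * t + a2 * t**2 + a3 * t**3

coeffs = torch.randn(1, 3, 2, 64, 64)                    # one network prediction
flows = [flow_at(coeffs, t) for t in (0.25, 0.5, 0.75)]  # many target times
```

Because all intermediate flows share one set of coefficients, the resulting motion is temporally consistent by construction.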

Datasets: In both works, we introduce novel large-scale datasets with synchronized and aligned events and frames, intended to spur further research on event- and frame-based video frame interpolation.

Citing

If you use this work in your research, please cite the following papers:

Time Lens: Event-based Video Frame Interpolation

S. Tulyakov*, D. Gehrig*, S. Georgoulis, J. Erbach, M. Gehrig, Y. Li, D. Scaramuzza

Time Lens: Event-based Video Frame Interpolation

IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2021.

PDF Video Code Project Page and Dataset Slides


Time Lens++: Event-based Frame Interpolation with Parametric Non-linear Flow and Multi-scale Fusion

S. Tulyakov, A. Bochicchio, D. Gehrig, S. Georgoulis, Y. Li, D. Scaramuzza

Time Lens++: Event-based Frame Interpolation with Parametric Non-linear Flow and Multi-scale Fusion

IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2022, New Orleans, USA.

PDF YouTube Dataset Project Webpage


Google Colab

Run Time Lens on your own data using our Google Colab notebook here.

Evaluation Code

Code for evaluation can be downloaded after filling out this form.

Beam Splitter Event and RGB (BS-ERGB) dataset

The test Beam Splitter Event and RGB (BS-ERGB) dataset used in our paper Time Lens++: Event-based Frame Interpolation with Parametric Non-linear Flow and Multi-scale Fusion can be downloaded after filling out this form.
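For orientation, a hypothetical loader for a single event packet; the file layout and the key names (x, y, t, p) are assumptions, so consult the documentation shipped with the dataset:

```python
import numpy as np

# Hypothetical loader for one inter-frame event packet (format assumed).
def load_events(path):
    data = np.load(path)
    x = data["x"].astype(np.int32)  # pixel column of each event
    y = data["y"].astype(np.int32)  # pixel row of each event
    t = data["t"].astype(np.int64)  # timestamp (microseconds assumed)
    p = data["p"].astype(np.int8)   # polarity: brightness increase/decrease
    return x, y, t, p
```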

(GIF comparison: Slow Input Video | Events | TimeLens++)

High Speed Event and RGB (HS-ERGB) dataset

The test High Speed Event and RGB (HS-ERGB) dataset used in our paper Time Lens: Event-based Video Frame Interpolation can also be downloaded after filling out this form.


(GIF comparison: Slow Input Video | Events | TimeLens)