Video to Events: Recycling Video Datasets for Event Cameras
Recent learning methods applied to event cameras require large amounts
of training data, which are scarce due to the novelty of event
sensors in computer vision research. In this work, we address this
need by converting any existing video dataset recorded with
conventional cameras into synthetic event data. This unlocks a
virtually unlimited number of existing video datasets for training
networks designed for real event data.
We achieve this by applying an event camera simulator to high
frame-rate video obtained with state-of-the-art frame interpolation
techniques.
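The core idea can be sketched as follows: an event camera fires an event at a pixel whenever the log-intensity change at that pixel crosses a contrast threshold. The snippet below is a minimal illustrative sketch of this generation model applied to a (high frame-rate) frame sequence; the function name, threshold value, and linear timestamp interpolation are simplifying assumptions, not the actual ESIM implementation.

```python
import numpy as np

def frames_to_events(frames, timestamps, threshold=0.2, eps=1e-3):
    """Toy event generation: emit an event (x, y, t, polarity) whenever the
    per-pixel log-intensity change since the last event crosses +/- threshold.
    Linear interpolation of the crossing time within each frame interval
    stands in for the continuous intensity signal a real simulator uses."""
    ref = np.log(frames[0].astype(np.float64) + eps)  # per-pixel reference level
    events = []
    for frame, t0, t1 in zip(frames[1:], timestamps[:-1], timestamps[1:]):
        log_cur = np.log(frame.astype(np.float64) + eps)
        delta = log_cur - ref
        ys, xs = np.nonzero(np.abs(delta) >= threshold)
        for y, x in zip(ys, xs):
            pol = 1 if delta[y, x] > 0 else -1
            n = int(abs(delta[y, x]) // threshold)  # number of full crossings
            for k in range(1, n + 1):
                # assumed: timestamp interpolated linearly within the interval
                frac = (k * threshold) / abs(delta[y, x])
                events.append((x, y, t0 + frac * (t1 - t0), pol))
            ref[y, x] += pol * n * threshold  # advance the reference level
    return events
```

Because event rate depends on the temporal resolution of the input, feeding the model with interpolated high frame-rate video (rather than the original frames) is what makes the generated event streams realistic.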
We show that networks trained on synthetic events generalize to real
events, especially after fine-tuning.
The code is open-source!
PDF: http://rpg.ifi.uzh.ch/docs/CVPR20_Gehrig.pdf
Video: https://youtu.be/uX6XknBGg0w
Code: https://github.com/uzh-rpg/rpg_vid2e
ESIM (event camera simulator): https://github.com/uzh-rpg/rpg_esim
Best regards,
Daniel Gehrig, Mathias Gehrig, Javier Hidalgo-Carrió, Davide Scaramuzza