Dear colleagues,

We are very happy to announce the release of the first public collection of datasets recorded with an event camera (DAVIS) for pose estimation, visual odometry, and SLAM applications! The data also include intensity images, inertial measurements, ground truth from a motion-capture system, synthetic data, and an event camera simulator that allows you to create your own sequences! All the data are released both as standard text files and as binary files (i.e., rosbag).

Dataset: http://rpg.ifi.uzh.ch/davis_data.html
Paper: https://arxiv.org/pdf/1610.08336v1
Video of some of the sequences: https://youtu.be/bVVBTQ7l36I
More on our research on event vision: http://rpg.ifi.uzh.ch/research_dvs.html

We provide data:

* from a large variety of scenarios, ranging from indoors to outdoors, including high-dynamic-range scenes
* featuring a variety of motions, from slow to fast and from 1-DOF to 6-DOF
* with several sequences recorded using a motorized linear slider, leading to very smooth motions!
* including synthetic data and an event camera simulator that allows you to create your own sequences!
* including intensity images and inertial measurements at high frequencies
* with precise ground truth from a motion-capture system
* with accurate intrinsic and extrinsic calibration

--------------------------------------------------------------
About event cameras and the DAVIS sensor
--------------------------------------------------------------

Event cameras are revolutionary vision sensors that overcome the limitations of standard cameras in scenes characterized by high dynamic range and high-speed motion: https://youtu.be/iZZ77F-hwzs

However, as these cameras are still expensive and not yet widespread, we hope that this dataset will accelerate research on event-based algorithms!

Our dataset was recorded with a DAVIS240C sensor, which incorporates a conventional global-shutter camera and an event-based sensor in the same pixel array. This sensor has great potential for high-speed and high-dynamic-range robotics and computer vision applications because it combines the benefits of conventional cameras with those of event-based sensors: low latency, high temporal resolution (~1 microsecond), and very high dynamic range (120 dB). However, new algorithms are required to exploit the sensor characteristics and cope with its unconventional output, which consists of a stream of asynchronous brightness changes (called "events") together with synchronous grayscale frames.

We gratefully acknowledge our sponsors: the DARPA FLA Program, the Google Faculty Research Award, the Qualcomm Innovation Fellowship, the SNSF-ERC Starting Grant, NCCR Robotics, the Swiss National Science Foundation, and the UZH Forschungskredit.

All feedback is very welcome!

Elias Mueggler, Henri Rebecq, Guillermo Gallego, Tobi Delbruck, Davide Scaramuzza
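P.S. For those who want to start experimenting with the plain-text release right away, below is a minimal Python sketch for loading an event file. It assumes a per-line layout of "timestamp x y polarity" (timestamp in seconds, pixel coordinates, and a 0/1 polarity flag) and uses a placeholder sequence path; please check these assumptions against the format description on the dataset page.

```python
# Minimal sketch for loading events from the plain-text release.
# Assumed per-line layout of events.txt: "timestamp x y polarity",
# with the timestamp in seconds, (x, y) pixel coordinates, and a
# 0/1 polarity flag. Verify against the dataset page before use.

from collections import namedtuple

Event = namedtuple("Event", ["t", "x", "y", "polarity"])

def load_events(path):
    """Parse a plain-text event file into a list of Event tuples."""
    events = []
    with open(path) as f:
        for line in f:
            fields = line.split()
            if len(fields) != 4:
                continue  # skip blank or malformed lines
            t, x, y, p = fields
            events.append(Event(float(t), int(x), int(y), int(p)))
    return events

if __name__ == "__main__":
    # "some_sequence/events.txt" is a placeholder path; point it at
    # the events file of any downloaded sequence.
    events = load_events("some_sequence/events.txt")
    print(f"Loaded {len(events)} events; first event: {events[0]}")
```

The rosbag release carries the same data as ROS messages, so the text files are mainly convenient for quick, dependency-free prototyping like this.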