The Multivehicle Stereo Event Camera Dataset: An Event Camera Dataset for 3D Perception


Zhu A. Z. , Thakur D., Özaslan T. , Pfrommer B., Kumar V., Daniilidis K.

IEEE Robotics and Automation Letters, vol. 3, no. 3, pp. 2032-2039, 2018 (Journal Indexed in SCI-Expanded)

  • Publication Type: Article
  • Volume: 3, Issue: 3
  • Publication Date: 2018
  • DOI: 10.1109/lra.2018.2800793
  • Journal: IEEE Robotics and Automation Letters
  • Page Numbers: pp. 2032-2039


© 2016 IEEE. Event-based cameras are a new passive sensing modality with a number of benefits over traditional cameras, including extremely low latency, asynchronous data acquisition, high dynamic range, and very low power consumption. There has been considerable recent interest in developing algorithms that use events to perform a variety of three-dimensional perception tasks, such as feature tracking, visual odometry, and stereo depth estimation. However, the field currently lacks the wealth of labeled data that exists for traditional cameras for both testing and development. In this letter, we present a large dataset with a synchronized stereo pair of event-based cameras, carried on a handheld rig, flown by a hexacopter, driven on top of a car, and mounted on a motorcycle, in a variety of different illumination levels and environments. From each camera, we provide the event stream, grayscale images, and inertial measurement unit (IMU) readings. In addition, we utilize a combination of IMU, a rigidly mounted lidar system, indoor and outdoor motion capture, and GPS to provide accurate pose and depth images for each camera at up to 100 Hz. For comparison, we also provide synchronized grayscale images and IMU readings from a frame-based stereo camera system.
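To make the event-stream data concrete: each event is a tuple (x, y, t, p) giving the pixel coordinates, timestamp, and polarity of an asynchronous brightness change. The sketch below is a minimal, hypothetical illustration (not the dataset's actual loading API) of accumulating a batch of events into a 2D "event frame," a common first step when visualizing or processing event data; the 346×260 sensor resolution is assumed here.

```python
import numpy as np

# Assumed sensor resolution (346x260); actual resolution depends on the camera.
H, W = 260, 346

def accumulate_events(events, height=H, width=W):
    """Sum event polarities per pixel over a time window.

    events: structured array with fields x, y, t, p, where p is +1/-1.
    Returns an (height, width) int32 image of net polarity per pixel.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    # np.add.at handles repeated indices correctly (unbuffered accumulation).
    np.add.at(frame, (events["y"], events["x"]), events["p"])
    return frame

# Synthetic example: three events, two at the same pixel.
dtype = np.dtype([("x", np.int32), ("y", np.int32),
                  ("t", np.float64), ("p", np.int32)])
events = np.array([(10, 5, 0.001, 1),
                   (10, 5, 0.002, 1),
                   (20, 7, 0.003, -1)], dtype=dtype)

frame = accumulate_events(events)
print(frame[5, 10], frame[7, 20])  # 2 -1
```

Note the use of `np.add.at` rather than `frame[y, x] += p`: plain fancy-index assignment would drop repeated events at the same pixel, whereas unbuffered accumulation counts every one.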