
Exclusively Dark (ExDark) Image Dataset (Official Site)


Updated on June 02, 2019 (Code for low-light image enhancement is released)

Updated on Oct. 31, 2018 (Accepted for publication in CVIU)

Released on May 29, 2018


To facilitate new research on object detection and image enhancement, particularly in low-light environments, we introduce the Exclusively Dark (ExDark) dataset (CVIU 2019). The ExDark dataset is a collection of 7,363 low-light images captured in 10 different conditions, ranging from very low-light environments to twilight, with 12 object classes (similar to PASCAL VOC) annotated at both the image-class level and with local object bounding boxes.
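Because each image carries per-object bounding-box annotations, a small loader is often the first piece of code a user writes. The sketch below is only an illustration under assumed conventions (header/comment lines prefixed with "%", then one object per line as a class name followed by left, top, width, height in pixels); check the actual layout against the dataset's own annotation documentation before relying on it.

```python
# Minimal sketch of reading ExDark-style object annotations.
# ASSUMPTION (not stated in this README): each image ships with a plain-text
# annotation file whose comment/header lines start with "%" and whose data
# lines hold a class name followed by a bounding box as
# <left> <top> <width> <height> (any extra trailing fields are ignored).
from dataclasses import dataclass
from typing import List


@dataclass
class BBox:
    label: str   # one of the 12 object classes (e.g. "Bicycle")
    left: int    # x of the top-left corner, in pixels
    top: int     # y of the top-left corner, in pixels
    width: int
    height: int


def parse_annotation(text: str) -> List[BBox]:
    """Parse one annotation file's contents into bounding boxes."""
    boxes = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("%"):  # skip header/comment lines
            continue
        parts = line.split()
        label = parts[0]
        left, top, width, height = (int(v) for v in parts[1:5])
        boxes.append(BBox(label, left, top, width, height))
    return boxes
```

A loader like this makes it straightforward to feed the annotations into whatever detection framework is being evaluated on the dataset.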


Source Code

The source code for our work on low-light image enhancement is now available. Please refer to the SPIC folder.


If you find this dataset useful for your research, please cite:

  @article{Exdark,
    title={Getting to Know Low-light Images with The Exclusively Dark Dataset},
    author={Loh, Yuen Peng and Chan, Chee Seng},
    journal={Computer Vision and Image Understanding},
    year={2019},
  }


Suggestions and opinions on this dataset (both positive and negative) are most welcome. Please contact the authors by email:

lexloh2009 at hotmail.com
cs.chan at um.edu.my

License and Copyright

The project is open source under the BSD-3-Clause license (see the LICENSE file for details).


For commercial usage, please contact Dr. Chee Seng Chan at

cs.chan at um.edu.my

©2018-2019 Center of Image and Signal Processing, Faculty of Computer Science and Information Technology, University of Malaya.
