This workshop assesses current evaluation procedures for object detection, highlights their shortcomings and opens discussion for possible improvements.
Through a focus on evaluation using challenges, the object detection community has been able to quickly identify which methods are effective by examining performance metrics. However, as this technological boom progresses, it is important to assess whether our evaluation metrics and procedures adequately align with how object detection will be used in practical applications. Quantitative results should be easily reconciled with a detector’s performance in applied tasks. This workshop provides a forum to discuss these ideas and evaluate whether current standards meet the needs of the object detection community.
We invite authors to contribute papers to the workshop. Topics of interest include, but are not limited to:
- New evaluation measures/metrics for object detection
- New evaluation/visualization tools to analyze object detection systems
- New evaluation procedures for better understanding object detection performance
- Examinations of current evaluation procedures
- New datasets designed to examine specific challenges in object detection
- New detection methods that provide contributions/insights unrewarded by current evaluation procedures (e.g. improved detector calibration, probabilistic object detection, etc.)
- Submissions must follow the ECCV format and be at most 4 pages in length, including references
- Abbreviated versions of larger papers published elsewhere are acceptable, provided the original work is properly referenced
- Submit your paper through CMT (link)
- Accepted papers will be presented at a poster session
Dr David Hall
Research Fellow
Robotic Vision Benchmarking and Evaluation Project
Australian Centre for Robotic Vision
Queensland University of Technology
e-mail: d20.hall@qut.edu.au
Phone: +61 7 31380656
ORCiD: 0000-0002-5520-0128