@InProceedings{Jeong_2024_CVPR,
  author    = {Jeong, Kiyoon and Lee, Woojun and Nam, Woongchan and Ma, Minjeong and Kang, Pilsung},
  title     = {Technical Report of NICE Challenge at CVPR 2024: Caption Re-ranking Evaluation Using Ensembled CLIP and Consensus Scores},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2024},
  pages     = {7366-7372}
}
Technical Report of NICE Challenge at CVPR 2024: Caption Re-ranking Evaluation Using Ensembled CLIP and Consensus Scores
Abstract
This report presents the ECO (Ensembled CLIP score and cOnsensus score) pipeline from team DSBA LAB, a new framework for evaluating and ranking captions for a given image. ECO selects the caption that most accurately describes the image. It does so by combining an Ensembled CLIP score, which measures the semantic alignment between the image and the captions, with a Consensus score, which accounts for the essentialness of the captions. Using this framework, we achieved notable success in the CVPR 2024 Workshop Challenge on Caption Re-ranking Evaluation at the New Frontiers for Zero-Shot Image Captioning Evaluation (NICE) workshop. Specifically, we secured third place in the CIDEr metric, second in both the SPICE and METEOR metrics, and first in ROUGE-L and all BLEU score metrics. The code and configuration for the ECO framework are available at https://github.com/DSBA-Lab/ECO.
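The combination described above can be sketched as a weighted sum of an image-text similarity score and a consensus score over the candidate set. The sketch below is illustrative only: the `alpha` weight, the token-overlap consensus measure, and the assumption that per-caption CLIP similarities are supplied as plain floats are all simplifications, not the authors' actual ECO implementation (see the linked repository for that).

```python
def consensus_score(caption, others):
    """Illustrative consensus: average Jaccard token overlap between a
    caption and the other candidates (a stand-in for the paper's measure)."""
    toks = set(caption.lower().split())
    if not others:
        return 0.0
    overlaps = []
    for other in others:
        other_toks = set(other.lower().split())
        overlaps.append(len(toks & other_toks) / max(len(toks | other_toks), 1))
    return sum(overlaps) / len(overlaps)


def rank_captions(captions, clip_scores, alpha=0.7):
    """Re-rank candidate captions by a weighted combination of a
    precomputed CLIP-style similarity and the consensus score.
    `alpha` is a hypothetical mixing weight, not taken from the paper."""
    scored = []
    for i, cap in enumerate(captions):
        others = captions[:i] + captions[i + 1:]
        combined = alpha * clip_scores[i] + (1 - alpha) * consensus_score(cap, others)
        scored.append((combined, cap))
    return [cap for _, cap in sorted(scored, reverse=True)]


# Example: the caption that both aligns well with the image (high CLIP
# score) and agrees with the other candidates is ranked first.
captions = ["a dog on grass", "a dog running on grass", "a cat"]
clip_scores = [0.8, 0.9, 0.2]
print(rank_captions(captions, clip_scores)[0])  # → "a dog running on grass"
```

In this sketch the consensus term rewards captions that share content with the rest of the candidate pool, which tends to demote outlier captions even when their standalone image-text similarity is moderate.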