FLIP: A Difference Evaluator for Alternating Images

Andersson, Pontus; Akenine-Möller, Tomas; Nilsson, Jim; Åström, Kalle, et al. (2020-08). FLIP: A Difference Evaluator for Alternating Images. Proceedings of the ACM on Computer Graphics and Interactive Techniques, 3(2), 1-23.
Status: Published | Language: English
Authors:
Andersson, Pontus; Akenine-Möller, Tomas; Nilsson, Jim; Åström, Kalle, et al.
Department:
Computer Vision and Machine Learning
ELLIIT: the Linköping-Lund initiative on IT and mobile communication
eSSENCE: The e-Science Collaboration
Mathematics (Faculty of Engineering)
Project:
Evaluating and Improving Rendered Visual Experiences
WASP: Wallenberg AI, Autonomous Systems and Software Program at Lund University
Research Group:
Computer Vision and Machine Learning
Abstract:
Image quality measures are becoming increasingly important in the field of computer graphics. For example, there is currently a major focus on generating photorealistic images in real time by combining path tracing with denoising, for which such quality assessment is integral. We present FLIP, a difference evaluator with a particular focus on the differences between rendered images and corresponding ground truths. Our algorithm produces a map that approximates the difference perceived by humans when alternating between two images. FLIP is a combination of modified existing building blocks, and the net result is surprisingly powerful. We have compared our work against a wide range of existing image difference algorithms, and we have visually inspected over a thousand image pairs that were either retrieved from image databases or generated in-house. We also present results of a user study which indicate that our method performs substantially better, on average, than the other algorithms. To facilitate the use of FLIP, we provide source code in C++, MATLAB, NumPy/SciPy, and PyTorch.
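
As a rough illustration of how a difference evaluator of this kind might be called from Python, the sketch below assumes a hypothetical module flip exposing a compute(reference, test) function that returns a per-pixel error map in [0, 1]. The module name, the function, the image file names, and the imageio dependency are assumptions for illustration, not the published API; consult the released source code for the actual interface.

    import numpy as np
    import imageio.v3 as iio

    import flip  # hypothetical wrapper around the released NumPy implementation

    # Load sRGB images as float arrays in [0, 1], shape (height, width, 3).
    reference = iio.imread("ground_truth.png").astype(np.float32) / 255.0
    test = iio.imread("rendered.png").astype(np.float32) / 255.0

    # Per-pixel error map in [0, 1]; higher values mark differences that are
    # more visible when alternating (flipping) between the two images.
    error_map = flip.compute(reference, test)

    print(f"Mean FLIP error: {error_map.mean():.4f}")

A pooled scalar such as the mean is convenient for regression testing a renderer or denoiser, while the full map shows where the test image deviates perceptibly from the ground truth.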
ISSN:
2577-6193
LUP-ID:
397ced35-879a-43a6-a225-ce31b0e984a2 | Link: https://lup.lub.lu.se/record/397ced35-879a-43a6-a225-ce31b0e984a2
