Paper · 27 September 2024
Efficient calibration dataset sampling for post-training quantization of diffusion models
Yanwei Wang, Wei Huang, Le Yang, Yujie Dai, Kun Zhao, Yu Fang, Kefeng Zhu
Proceedings Volume 13275, Sixth International Conference on Information Science, Electrical, and Automation Engineering (ISEAE 2024); 132751L (2024) https://doi.org/10.1117/12.3037447
Event: 6th International Conference on Information Science, Electrical, and Automation Engineering (ISEAE 2024), 2024, Wuhan, China
Abstract
Diffusion models have emerged as a leading method for generating diverse and realistic samples, revolutionizing tasks like text-to-image generation, super-resolution, and inpainting. However, their slow sampling process hinders practical deployment. This work focuses on post-training quantization (PTQ) optimization for diffusion models. We propose an error-contribution-aware calibration dataset sampling scheme that effectively reduces calibration dataset size while preserving model precision. Experimental results demonstrate the effectiveness of our approach, achieving performance comparable to uniform sampling methods with a significant reduction in calibration dataset size.
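The abstract does not detail the sampling scheme itself, but the core idea of error-contribution-aware calibration sampling can be illustrated with a minimal sketch: instead of drawing calibration samples uniformly across denoising timesteps, draw them with probability proportional to each timestep's estimated contribution to quantization error. All function names, the error-score profile, and the timestep count below are hypothetical, chosen only for illustration:

```python
import numpy as np

def sample_calibration_timesteps(error_contrib, n_samples, rng=None):
    """Draw calibration timesteps with probability proportional to each
    timestep's estimated quantization-error contribution.

    error_contrib : 1-D array of non-negative per-timestep error scores
                    (hypothetical; the paper's actual metric is not given
                    in the abstract).
    n_samples     : size of the reduced calibration set.
    """
    rng = np.random.default_rng() if rng is None else rng
    p = np.asarray(error_contrib, dtype=float)
    p = p / p.sum()  # normalize scores into a sampling distribution
    return rng.choice(len(p), size=n_samples, replace=True, p=p)

# Toy example: 1000 denoising timesteps whose (assumed) error scores decay
# linearly with t, so high-error steps are sampled more often than uniform
# sampling would select them.
scores = np.linspace(2.0, 0.1, 1000)
idx = sample_calibration_timesteps(scores, n_samples=64,
                                   rng=np.random.default_rng(0))
```

Under this sketch, a 64-sample calibration set concentrates on the timesteps that matter most for quantization error, which is one plausible way a method could match uniform sampling's accuracy with far fewer calibration samples.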
© 2024 Published by SPIE. Downloading of the abstract is permitted for personal use only.
Yanwei Wang, Wei Huang, Le Yang, Yujie Dai, Kun Zhao, Yu Fang, and Kefeng Zhu "Efficient calibration dataset sampling for post-training quantization of diffusion models", Proc. SPIE 13275, Sixth International Conference on Information Science, Electrical, and Automation Engineering (ISEAE 2024), 132751L (27 September 2024); https://doi.org/10.1117/12.3037447
KEYWORDS: Calibration, Diffusion, Quantization, Data modeling, Performance modeling, Mathematical optimization, Image processing