Distilling ODE Solvers of Diffusion Models into Smaller Steps

Sanghwan Kim, Hao Tang, Fisher Yu; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024, pp. 9410-9419

Abstract


Diffusion models have recently gained prominence as a novel category of generative models. Despite their success, these models face a notable drawback in terms of slow sampling speed, requiring a large number of function evaluations (NFE), on the order of hundreds or thousands. In response, both learning-free and learning-based sampling strategies have been explored to expedite the sampling process. Learning-free sampling employs various ordinary differential equation (ODE) solvers based on the formulation of diffusion ODEs. However, it encounters challenges in faithfully tracking the true sampling trajectory, particularly for small NFE. Conversely, learning-based sampling methods, such as knowledge distillation, demand extensive additional training, limiting their practical applicability. To overcome these limitations, we introduce Distilled-ODE solvers (D-ODE solvers), a straightforward distillation approach grounded in ODE solver formulations. Our method seamlessly integrates the strengths of both learning-free and learning-based sampling. D-ODE solvers are constructed by introducing a single parameter adjustment to existing ODE solvers. Furthermore, we optimize D-ODE solvers with smaller steps using knowledge distillation from ODE solvers with larger steps across a batch of samples. Comprehensive experiments demonstrate the superior performance of D-ODE solvers compared to existing ODE solvers, including DDIM, PNDM, DPM-Solver, DEIS, and EDM, particularly in scenarios with fewer NFE. Notably, our method incurs negligible computational overhead compared to previous distillation techniques, facilitating straightforward and rapid integration with existing samplers. Qualitative analysis reveals that D-ODE solvers not only enhance image quality but also faithfully follow the target ODE trajectory.
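The abstract outlines the mechanism at a high level: each step of an existing ODE solver receives a single additional parameter, and those parameters are fitted by distilling targets produced by the same solver run with many more steps over a batch of samples. Below is a minimal, illustrative sketch of that idea with a DDIM-style update in PyTorch; the names (ToyDenoiser, ddim_step, sample, distill_lambdas) and the exact placement of the per-step scalar are assumptions made for illustration, not the paper's actual parameterization.

```python
# Minimal sketch of the D-ODE idea described in the abstract: a few-step
# DDIM-style sampler whose per-step denoiser output is rescaled by one
# learnable scalar, fitted by distilling targets generated with many more steps.
# All names below are illustrative assumptions, not the authors' implementation.
import torch

class ToyDenoiser(torch.nn.Module):
    """Stand-in for a pretrained noise-prediction network eps_theta(x, t)."""
    def __init__(self, dim=16):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(dim + 1, 64), torch.nn.SiLU(), torch.nn.Linear(64, dim)
        )

    def forward(self, x, t):
        t_emb = torch.full((x.shape[0], 1), float(t), device=x.device)
        return self.net(torch.cat([x, t_emb], dim=1))

def ddim_step(x, eps, alpha_t, alpha_s):
    """One deterministic DDIM update from noise level alpha_t to alpha_s."""
    x0 = (x - (1 - alpha_t).sqrt() * eps) / alpha_t.sqrt()
    return alpha_s.sqrt() * x0 + (1 - alpha_s).sqrt() * eps

def sample(model, x, alphas, ts, lambdas=None):
    """Run the sampler; if lambdas is given, rescale each eps prediction (D-ODE-style)."""
    for i in range(len(ts) - 1):
        eps = model(x, ts[i])
        if lambdas is not None:
            eps = lambdas[i] * eps  # single learnable scalar per student step
        x = ddim_step(x, eps, alphas[i], alphas[i + 1])
    return x

def distill_lambdas(model, x_T, alphas_student, ts_student, targets, iters=200):
    """Fit one scalar per student step so few-step samples match teacher targets."""
    for p in model.parameters():          # the pretrained denoiser stays frozen
        p.requires_grad_(False)
    lambdas = torch.nn.Parameter(torch.ones(len(ts_student) - 1))
    opt = torch.optim.Adam([lambdas], lr=1e-2)
    for _ in range(iters):
        pred = sample(model, x_T, alphas_student, ts_student, lambdas)
        loss = torch.mean((pred - targets) ** 2)   # match teacher samples over the batch
        opt.zero_grad()
        loss.backward()
        opt.step()
    return lambdas.detach()
```

In this sketch, the teacher targets would be produced by running sample with a fine step schedule (for example, 50 steps) and no lambdas from the same batch of initial noise x_T, and the student's few-step schedule (for example, 5 steps) is then fitted to reproduce those targets. At sampling time the fitted scalars add essentially no overhead, consistent with the negligible computational cost claimed in the abstract.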

Related Material


[bibtex]
@InProceedings{Kim_2024_CVPR,
  author    = {Kim, Sanghwan and Tang, Hao and Yu, Fisher},
  title     = {Distilling ODE Solvers of Diffusion Models into Smaller Steps},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2024},
  pages     = {9410-9419}
}