An fp8 (e4m3fn) version of Wan2.1 VACE 14B, converted from the repackaged release by Comfy-Org

UPDATE: Added an fp8 e5m2 version, converted from the original 58.9 GB model
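The two fp8 variants trade range for precision differently: e4m3fn spends more bits on the mantissa (and reclaims the top exponent for finite values), while e5m2 spends more bits on the exponent. A minimal sketch of that difference, assuming only the standard fp8 format definitions (nothing specific to these checkpoints):

```python
def fp8_max(exp_bits: int, man_bits: int, finite_only: bool = False) -> float:
    """Largest finite value of a miniature float format.

    finite_only=True models e4m3fn, where the all-ones exponent is
    reused for normal numbers (only the all-ones mantissa is NaN).
    """
    bias = (1 << (exp_bits - 1)) - 1
    if finite_only:
        # Top exponent is usable; largest mantissa pattern below NaN.
        max_exp = (1 << exp_bits) - 1 - bias
        frac = 1 + ((1 << man_bits) - 2) / (1 << man_bits)
    else:
        # IEEE-style: all-ones exponent is reserved for inf/NaN.
        max_exp = (1 << exp_bits) - 2 - bias
        frac = 1 + ((1 << man_bits) - 1) / (1 << man_bits)
    return frac * 2.0 ** max_exp

print(fp8_max(4, 3, finite_only=True))  # e4m3fn -> 448.0
print(fp8_max(5, 2))                    # e5m2   -> 57344.0
```

The much larger dynamic range of e5m2 is why it can tolerate a direct cast from a high-precision source, at the cost of coarser mantissa steps.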

Credits:

Original VACE model from Wan-AI:

https://huggingface.co/Wan-AI/Wan2.1-VACE-14B

Repackaged fp16 from Comfy-Org:

https://huggingface.co/Comfy-Org/Wan_2.1_ComfyUI_repackaged

ComfyUI node used to convert to fp8:

https://github.com/Shiba-2-shiba/ComfyUI_DiffusionModel_fp8_converter

e5m2 conversion scripts from:

https://huggingface.co/phazei
