Load Diffusion Model INT8 (W8A8)
OTUNetLoaderW8A8
Loads a diffusion model whose weights are INT8 per-tensor (tensorwise) quantized, running W8A8 inference via the fast `torch._int_mm` kernel.
Pack: ComfyUI-Flux2-INT8
custom_nodes.ComfyUI-Flux2-INT8
Inputs (4)
Outputs (1)
| Name | Type |
|---|---|
| MODEL | MODEL |
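For context on what W8A8 tensorwise quantization means here, the sketch below shows the basic arithmetic: both weights and activations are quantized to INT8 with a single per-tensor scale, multiplied with INT32 accumulation, then dequantized. On CUDA, `torch._int_mm(qa, qw)` performs the INT8 x INT8 -> INT32 matmul as a fast fused kernel; the sketch uses a plain INT32 matmul so it also runs on CPU. This is an illustrative example, not the node's actual implementation.

```python
import torch

def quantize_tensorwise(x: torch.Tensor):
    # Symmetric per-tensor scale: map max |value| onto the int8 range [-127, 127].
    scale = x.abs().max().clamp(min=1e-8) / 127.0
    q = torch.clamp(torch.round(x / scale), -127, 127).to(torch.int8)
    return q, scale

torch.manual_seed(0)
act = torch.randn(16, 64)   # activations (the "A8" in W8A8)
w = torch.randn(64, 32)     # weights (the "W8" in W8A8)

qa, sa = quantize_tensorwise(act)
qw, sw = quantize_tensorwise(w)

# INT32-accumulated integer matmul. On CUDA this is what
# torch._int_mm(qa, qw) computes, just much faster.
acc = qa.to(torch.int32) @ qw.to(torch.int32)

# Dequantize: one multiply by the product of the two per-tensor scales.
out = acc.to(torch.float32) * (sa * sw)

# Compare against the full-precision result.
err = (out - act @ w).abs().max()
print(out.shape, float(err))
```

Because the scales are per-tensor (one scalar each for weights and activations), dequantization is a single scalar multiply after the integer matmul, which is what makes this scheme cheap at inference time.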