
Loading other diffusion models after installing QuantOps #26

@FearL0rd

Description


Hi,

I'm having a few issues after installing the node.

If I load a non-QuantOps FLUX.1 or FLUX.2 model using the regular or KJ diffusion loader, I get the error message below. Loading the QuantOps models with the QuantOps node works fine. There are no issues with a non-QuantOps Z_IMAGE diffusion model using the KJ loader — only FLUX. I haven't tested WAN.

After removing the ComfyUI-QuantOps node, everything works normally.

I have an RTX 3090 card.

```
triton.compiler.errors.CompilationError: at 1:0:
def dequantize_fp8_kernel_tl(
^
ValueError("type fp8e4nv not supported in this architecture. The supported fp8 dtypes are ('fp8e4b15', 'fp8e5')")
```
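For context, the error suggests Triton's `fp8e4nv` dtype isn't available on this GPU architecture. A minimal sketch of a guard a node could use before compiling the kernel — assuming (not confirmed in this issue) that `fp8e4nv` requires CUDA compute capability 8.9 or newer, while an Ampere RTX 3090 reports (8, 6):

```python
def fp8e4nv_supported(capability: tuple[int, int]) -> bool:
    """Return True if a GPU with the given CUDA compute capability
    can plausibly use Triton's fp8e4nv dtype.

    Assumption: the cutoff is compute capability >= (8, 9);
    older cards would need a fallback dequantization path using
    fp8e5 / fp8e4b15 or plain fp16.
    """
    return capability >= (8, 9)

# In a ComfyUI node you could query the card with PyTorch, e.g.:
#   import torch
#   fp8e4nv_supported(torch.cuda.get_device_capability())

print(fp8e4nv_supported((8, 6)))  # RTX 3090 (Ampere)
print(fp8e4nv_supported((9, 0)))  # Hopper-class card
```

A check like this would let the node fall back to a non-fp8e4nv path instead of raising a `CompilationError` at sampling time.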







# ComfyUI Error Report

## Error Details

- **Node ID:** 61

- **Node Type:** KSampler

- **Exception Type:** triton.compiler.errors.CompilationError

- **Exception Message:**

  ```
  triton.compiler.errors.CompilationError: at 1:0:
  def dequantize_fp8_kernel_tl(
  ^
  ValueError("type fp8e4nv not supported in this architecture. The supported fp8 dtypes are ('fp8e4b15', 'fp8e5')")
  ```



## Stack Trace

```
File "/home/comfy/ComfyUI/execution.py", line 525, in execute
    output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
```
