
ChatGLM3 Loader: CUDA Error: no kernel image is available for execution on the device #96

@Ethan-delkal

Description

The ChatGLM3 Loader node throws an error.
torch version: 2.7.0+cu128
!!! Exception during processing !!! CUDA Error: no kernel image is available for execution on the device
Traceback (most recent call last):
File "D:\AI\ComfyUI-aki-v1.4\execution.py", line 349, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\AI\ComfyUI-aki-v1.4\execution.py", line 224, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\AI\ComfyUI-aki-v1.4\execution.py", line 196, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\AI\ComfyUI-aki-v1.4\execution.py", line 185, in process_inputs
results.append(getattr(obj, func)(**inputs))
File "D:\AI\ComfyUI-aki-v1.4\custom_nodes\ComfyUI-Kolors-MZ\__init__.py", line 47, in load_chatglm3
return mz_kolors_core.MZ_ChatGLM3Loader_call(**kwargs)
File "D:\AI\ComfyUI-aki-v1.4\custom_nodes\ComfyUI-Kolors-MZ\mz_kolors_core.py", line 115, in MZ_ChatGLM3Loader_call
text_encoder.quantize(4)
File "D:\AI\ComfyUI-aki-v1.4\custom_nodes\ComfyUI-Kolors-MZ\chatglm3\modeling_chatglm.py", line 852, in quantize
quantize(self.encoder, weight_bit_width)
File "D:\AI\ComfyUI-aki-v1.4\custom_nodes\ComfyUI-Kolors-MZ\chatglm3\quantization.py", line 178, in quantize
layer.self_attention.query_key_value = QuantizedLinear(
File "D:\AI\ComfyUI-aki-v1.4\custom_nodes\ComfyUI-Kolors-MZ\chatglm3\quantization.py", line 159, in __init__
self.weight = compress_int4_weight(self.weight)
File "D:\AI\ComfyUI-aki-v1.4\custom_nodes\ComfyUI-Kolors-MZ\chatglm3\quantization.py", line 98, in compress_int4_weight
kernels.int4WeightCompression(
File "D:\AI\ComfyUI-aki-v1.4\python\lib\site-packages\cpm_kernels\kernels\base.py", line 48, in __call__
func = self._prepare_func()
File "D:\AI\ComfyUI-aki-v1.4\python\lib\site-packages\cpm_kernels\kernels\base.py", line 40, in _prepare_func
self._module.get_module(), self._func_name
File "D:\AI\ComfyUI-aki-v1.4\python\lib\site-packages\cpm_kernels\kernels\base.py", line 24, in get_module
self._module[curr_device] = cuda.cuModuleLoadData(self._code)
File "D:\AI\ComfyUI-aki-v1.4\python\lib\site-packages\cpm_kernels\library\base.py", line 94, in wrapper
return f(*args, **kwargs)
File "D:\AI\ComfyUI-aki-v1.4\python\lib\site-packages\cpm_kernels\library\cuda.py", line 233, in cuModuleLoadData
checkCUStatus(cuda.cuModuleLoadData(ctypes.byref(module), data))
File "D:\AI\ComfyUI-aki-v1.4\python\lib\site-packages\cpm_kernels\library\cuda.py", line 216, in checkCUStatus
raise RuntimeError("CUDA Error: %s" % cuGetErrorString(error))
RuntimeError: CUDA Error: no kernel image is available for execution on the device
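For context (this is background, not part of the original report): `cuModuleLoadData` raises "no kernel image is available" when the module being loaded contains neither a cubin built for the device's architecture nor PTX that can be JIT-compiled for it. The sketch below models that selection rule with a hypothetical helper (`has_runnable_image` and the example arch tuples are illustrative, not from cpm_kernels); it suggests why prebuilt kernels from an older package can fail on a GPU new enough to require `torch 2.7.0+cu128`.

```python
def has_runnable_image(device_cc, cubin_archs, ptx_archs):
    """Model CUDA's kernel-image selection rule.

    device_cc:   the GPU's compute capability, e.g. (8, 6)
    cubin_archs: architectures the module's native cubins target
    ptx_archs:   architectures the module's embedded PTX targets
    """
    # A cubin is binary-compatible only within its major architecture:
    # it runs on devices with the same major version and a minor
    # version >= the one it was compiled for.
    cubin_ok = any(maj == device_cc[0] and mnr <= device_cc[1]
                   for (maj, mnr) in cubin_archs)
    # PTX is forward-compatible: the driver can JIT it for any device
    # at least as new as the PTX target architecture.
    ptx_ok = any((maj, mnr) <= device_cc for (maj, mnr) in ptx_archs)
    return cubin_ok or ptx_ok


# An sm_80 cubin runs on an sm_86 device (same major, newer minor).
print(has_runnable_image((8, 6), [(8, 0)], []))        # True
# A newer-generation device with no matching cubin and no PTX fails,
# which is the "no kernel image is available" case.
print(has_runnable_image((12, 0), [(7, 0), (8, 0)], []))  # False
```

If the GPU really is newer than every architecture the package's kernels were built for, the fix is a kernel build (or package release) that targets the new architecture, or skipping the `quantize(4)` path; the model above only explains the failure mode.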
