ComfyUI-CogVideoX-MZ

Trying to convert Float8_e4m3fn to the MPS backend but it does not have support for that dtype.

Open · ZeeMenng opened this issue on Sep 28, 2024 · 8 comments
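The title error is reproducible outside ComfyUI. A minimal sketch, assuming PyTorch 2.1+ on Apple Silicon:

```python
import torch

# float8_e4m3fn tensors can be created on the CPU...
x = torch.zeros(4, dtype=torch.float8_e4m3fn)

# ...but the MPS backend has no float8 support, so moving the tensor there
# fails with: "Trying to convert Float8_e4m3fn to the MPS backend but it
# does not have support for that dtype."
x = x.to("mps")
```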

I have tried every weight_type and none of them work.

ZeeMenng commented on Sep 28, 2024

Can you try turning off fp8_fast_mode?

wailovet commented on Sep 28, 2024

No luck. With it turned off I get `RuntimeError: unsupported scalarType`:

```
got prompt
transformer type: 5b
GGUF: False
model weight dtype: torch.float8_e4m3fn manual cast dtype: torch.float16
Encoded latents shape: torch.Size([1, 1, 16, 60, 90])
/opt/homebrew/Caskroom/miniconda/base/lib/python3.12/site-packages/transformers/tokenization_utils_base.py:1601: FutureWarning: clean_up_tokenization_spaces was not set. It will be set to True by default. This behavior will be depracted in transformers v4.45, and will be then set to False by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
  warnings.warn(
Requested to load SD3ClipModel_
Loading 1 new model
loaded completely 0.0 4541.693359375 True
!!! Exception during processing !!! unsupported scalarType
Traceback (most recent call last):
  File "/Users/ZeeMenng/Project/ComfyUI/execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "/Users/ZeeMenng/Project/ComfyUI/execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "/Users/ZeeMenng/Project/ComfyUI/execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "/Users/ZeeMenng/Project/ComfyUI/execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "/Users/ZeeMenng/Project/ComfyUI/custom_nodes/ComfyUI-CogVideoXWrapper/nodes.py", line 841, in process
    autocast_context = torch.autocast(mm.get_autocast_device(device)) if autocastcondition else nullcontext()
  File "/opt/homebrew/Caskroom/miniconda/base/lib/python3.12/site-packages/torch/amp/autocast_mode.py", line 229, in __init__
    dtype = torch.get_autocast_dtype(device_type)
RuntimeError: unsupported scalarType
```
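Note that this traceback ends in `torch.autocast`, not in the fp8 weight path: `torch.get_autocast_dtype(device_type)` rejects `"mps"` on this PyTorch build. A possible guard around the failing line in nodes.py, sketched with the names visible in the traceback (`mm.get_autocast_device`, `autocastcondition`, `device`), is to skip autocast on MPS:

```python
from contextlib import nullcontext

import torch

# Sketch of a guard for nodes.py line 841 (per the traceback): on PyTorch
# builds with no autocast dtype registered for "mps", fall back to a
# nullcontext instead of constructing torch.autocast.
device_type = mm.get_autocast_device(device)  # helper name taken from the traceback
if autocastcondition and device_type != "mps":
    autocast_context = torch.autocast(device_type)
else:
    autocast_context = nullcontext()
```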

ZeeMenng commented on Sep 28, 2024

Try updating ComfyUI-CogVideoXWrapper.

wailovet commented on Sep 28, 2024

Everything is already updated, and I still get the same error. I have tried all sorts of things with no luck; it's very strange.

I'm on an M1 Pro. I tried Python 3.12 and 3.11 separately and neither works. I don't know whether some dependency version is the problem, or what else it could be.

ZeeMenng commented on Sep 29, 2024

It looks like the MPS support still has some problems at the moment; see https://github.com/kijai/ComfyUI-CogVideoXWrapper/issues/59

wailovet commented on Sep 29, 2024

For MPS, you can try the following in the CogVideoXLoader node in ComfyUI:

- set weight_type to bf16
- set enable_vae_encode_tiling to true

Then make one code change in custom_cogvideox_transformer_3d.py, line 103; change it to:

```python
query.to(torch.bfloat16), key.to(torch.bfloat16), value.to(torch.bfloat16),
attn_mask=attention_mask, dropout_p=0.0, is_causal=False
```
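For context, that line is the argument list of `F.scaled_dot_product_attention` in the wrapper's attention processor. A self-contained sketch of the same idea (the helper name and the cast-back step are illustrative, not the wrapper's actual code; it assumes a macOS/PyTorch combination where MPS supports bfloat16):

```python
import torch
import torch.nn.functional as F

def sdpa_bf16(query, key, value, attention_mask=None):
    # Hypothetical helper mirroring the suggested patch: run attention in
    # bfloat16, which MPS can execute, instead of the unsupported dtype.
    if attention_mask is not None and attention_mask.is_floating_point():
        # A floating-point mask must match the query/key/value dtype.
        attention_mask = attention_mask.to(torch.bfloat16)
    out = F.scaled_dot_product_attention(
        query.to(torch.bfloat16),
        key.to(torch.bfloat16),
        value.to(torch.bfloat16),
        attn_mask=attention_mask,
        dropout_p=0.0,
        is_causal=False,
    )
    return out.to(query.dtype)  # cast back to the caller's dtype
```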

YAY-3M-TA3 commented on Oct 22, 2024

Did you ever solve this?

LONGG1126 commented on Nov 26, 2024

I'm on an M1 Max and hit this problem when running Flux models. My workaround is to launch ComfyUI on the CPU:

```
python main.py --force-fp16 --use-split-cross-attention --cpu
```

Running on the CPU gets through the whole workflow, but it is very, very slow =。= It feels like a Mac support problem; if you want speed, the only option is a different machine.

lvboy1 commented on Dec 4, 2024