
VAEDecode Mac BFloat16 is not supported on MPS

Open Lyonns opened this issue 1 year ago • 21 comments

Expected Behavior

I am running a basic ComfyUI workflow that was working just moments before. Something happened when I reloaded the page, and now every run fails with this persistent issue.

Actual Behavior

The VAEDecode node aborts the run with `TypeError: BFloat16 is not supported on MPS`. (screenshot attached)

Steps to Reproduce

1. Download ComfyUI.
2. Download the node manager.
3. Download and use a ControlNet workflow. (screenshot attached)

I have also tried a normal workflow with no extra nodes, but the issue persists. Everything is up to date as well.

Debug Logs

# ComfyUI Error Report
## Error Details
- **Node ID:** 24
- **Node Type:** VAEDecode
- **Exception Type:** TypeError
- **Exception Message:** BFloat16 is not supported on MPS
## Stack Trace

  File "/Users/tashielyonns/ComfyUI/execution.py", line 327, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/Users/tashielyonns/ComfyUI/execution.py", line 202, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/Users/tashielyonns/ComfyUI/execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)

  File "/Users/tashielyonns/ComfyUI/execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/Users/tashielyonns/ComfyUI/nodes.py", line 285, in decode
    images = vae.decode(samples["samples"])
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/Users/tashielyonns/ComfyUI/comfy/sd.py", line 465, in decode
    model_management.load_models_gpu([self.patcher], memory_required=memory_used)

  File "/Users/tashielyonns/ComfyUI/comfy/model_management.py", line 550, in load_models_gpu
    loaded_model.model_load(lowvram_model_memory, force_patch_weights=force_patch_weights)

  File "/Users/tashielyonns/ComfyUI/comfy/model_management.py", line 366, in model_load
    self.model_use_more_vram(use_more_vram, force_patch_weights=force_patch_weights)

  File "/Users/tashielyonns/ComfyUI/comfy/model_management.py", line 395, in model_use_more_vram
    return self.model.partially_load(self.device, extra_memory, force_patch_weights=force_patch_weights)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/Users/tashielyonns/ComfyUI/comfy/model_patcher.py", line 759, in partially_load
    raise e

  File "/Users/tashielyonns/ComfyUI/comfy/model_patcher.py", line 756, in partially_load
    self.load(device_to, lowvram_model_memory=current_used + extra_memory, force_patch_weights=force_patch_weights, full_load=full_load)

  File "/Users/tashielyonns/ComfyUI/comfy/model_patcher.py", line 607, in load
    x[2].to(device_to)

  File "/Users/tashielyonns/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1160, in to
    return self._apply(convert)
           ^^^^^^^^^^^^^^^^^^^^

  File "/Users/tashielyonns/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 833, in _apply
    param_applied = fn(param)
                    ^^^^^^^^^

  File "/Users/tashielyonns/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1158, in convert
    return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
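The failing call is simply moving the VAE's bfloat16 parameters onto the `mps` device; PyTorch 2.1.x rejects this. A minimal, standalone probe of that same operation (this snippet is illustrative, not part of ComfyUI; the `mps_supports_bfloat16` helper name is mine):

```python
def mps_supports_bfloat16():
    """Probe whether a bfloat16 tensor can be moved to the MPS device.

    Returns True/False on Apple-silicon machines, or None when torch is
    missing or no MPS device exists. On PyTorch 2.1.x this reproduces the
    'BFloat16 is not supported on MPS' TypeError from the report above.
    """
    try:
        import torch
    except ImportError:
        return None  # torch not installed in this environment
    if not torch.backends.mps.is_available():
        return None  # no MPS device (non-Apple hardware)
    try:
        # Same operation as the failing `x[2].to(device_to)` in the trace:
        # a dtype-preserving device transfer of a bfloat16 tensor.
        torch.zeros(1, dtype=torch.bfloat16).to("mps")
        return True
    except TypeError:
        return False

print(mps_supports_bfloat16())
```

On the reporter's setup (torch 2.1.2) this should print `False`; newer PyTorch releases added bfloat16 support on MPS, so upgrading makes the probe pass.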

System Information

  • ComfyUI Version: v0.3.10-9-ge1dec3c
  • Arguments: main.py
  • OS: posix
  • Python Version: 3.11.5 (main, Sep 11 2023, 08:31:25) [Clang 14.0.6 ]
  • Embedded Python: false
  • PyTorch Version: 2.1.2
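The logs below show the VAE being loaded with `dtype: torch.bfloat16`, while PyTorch 2.1.2 predates bfloat16 support on the MPS backend. Two commonly suggested workarounds, sketched here under the assumption that your ComfyUI version exposes the standard VAE dtype flags from `comfy/cli_args.py` (verify against `python main.py --help`):

```shell
# Option 1: upgrade PyTorch inside the venv; newer releases support
# bfloat16 on MPS, so the VAE can load as-is.
pip install --upgrade torch torchvision torchaudio

# Option 2: force the VAE to a dtype MPS already supports.
python main.py --fp32-vae   # full-precision VAE (slower, more memory)
python main.py --fp16-vae   # half-precision VAE (some VAEs show artifacts)
```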

Devices

  • Name: mps
    • Type: mps
    • VRAM Total: 34359738368
    • VRAM Free: 14767276032
    • Torch VRAM Total: 34359738368
    • Torch VRAM Free: 14767276032

Logs

2024-12-28T20:38:18.645701 - [START] Security scan
2024-12-28T20:38:19.152609 - [DONE] Security scan
2024-12-28T20:38:19.193693 - ## ComfyUI-Manager: installing dependencies done.
2024-12-28T20:38:19.193745 - ** ComfyUI startup time: 2024-12-28 20:38:19.193736
2024-12-28T20:38:19.193808 - ** Platform: Darwin
2024-12-28T20:38:19.193854 - ** Python version: 3.11.5 (main, Sep 11 2023, 08:31:25) [Clang 14.0.6 ]
2024-12-28T20:38:19.193890 - ** Python executable: /Users/tashielyonns/ComfyUI/venv/bin/python
2024-12-28T20:38:19.193922 - ** ComfyUI Path: /Users/tashielyonns/ComfyUI
2024-12-28T20:38:19.193980 - ** Log path: /Users/tashielyonns/ComfyUI/comfyui.log
2024-12-28T20:38:19.723184 - #######################################################################
2024-12-28T20:38:19.723252 - [ComfyUI-Manager] Starting dependency installation/(de)activation for the extension
2024-12-28T20:38:19.723428 - Install: pip packages for '/Users/tashielyonns/ComfyUI/custom_nodes/ComfyUI-Manager'
2024-12-28T20:38:20.817585 - [!] [notice] A new release of pip is available: 23.2.1 -> 24.3.1
2024-12-28T20:38:20.817711 - [!] [notice] To update, run: python -m pip install --upgrade pip
2024-12-28T20:38:21.883943 - [!] [notice] A new release of pip is available: 23.2.1 -> 24.3.1
2024-12-28T20:38:21.884081 - [!] [notice] To update, run: python -m pip install --upgrade pip
2024-12-28T20:38:21.923375 - [SKIP] Downgrading pip package isn't allowed: huggingface-hub (cur=0.27.0)
2024-12-28T20:38:22.935137 - [!] [notice] A new release of pip is available: 23.2.1 -> 24.3.1
2024-12-28T20:38:22.935281 - [!] [notice] To update, run: python -m pip install --upgrade pip
2024-12-28T20:38:22.966627 - [ComfyUI-Manager] Startup script completed.
2024-12-28T20:38:22.966682 - #######################################################################
2024-12-28T20:38:23.390896 - 
Prestartup times for custom nodes:
2024-12-28T20:38:23.391037 -    4.8 seconds: /Users/tashielyonns/ComfyUI/custom_nodes/ComfyUI-Manager
2024-12-28T20:38:23.391075 - 
2024-12-28T20:38:24.226648 - Total VRAM 32768 MB, total RAM 32768 MB
2024-12-28T20:38:24.226745 - pytorch version: 2.1.2
2024-12-28T20:38:24.226839 - Set vram state to: SHARED
2024-12-28T20:38:24.226876 - Device: mps
2024-12-28T20:38:24.642940 - Using sub quadratic optimization for attention, if you have memory or speed issues try using: --use-split-cross-attention
2024-12-28T20:38:25.061279 - [Prompt Server] web root: /Users/tashielyonns/ComfyUI/web
2024-12-28T20:38:25.316569 - ### Loading: ComfyUI-Manager (V2.55.5)
2024-12-28T20:38:25.378977 - ### ComfyUI Version: v0.3.10-9-ge1dec3c | Released on '2024-12-28'
2024-12-28T20:38:25.522107 - [comfyui_controlnet_aux] | INFO -> Using ckpts path: /Users/tashielyonns/ComfyUI/custom_nodes/comfyui_controlnet_aux/ckpts
2024-12-28T20:38:25.522161 - [comfyui_controlnet_aux] | INFO -> Using symlinks: False
2024-12-28T20:38:25.522196 - [comfyui_controlnet_aux] | INFO -> Using ort providers: ['CUDAExecutionProvider', 'DirectMLExecutionProvider', 'OpenVINOExecutionProvider', 'ROCMExecutionProvider', 'CPUExecutionProvider', 'CoreMLExecutionProvider']
2024-12-28T20:38:25.626584 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
2024-12-28T20:38:25.647135 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
2024-12-28T20:38:25.678254 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
2024-12-28T20:38:25.704516 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
2024-12-28T20:38:25.713720 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
2024-12-28T20:38:25.794230 - ### ComfyUI-FaceSwap: Check dependencies
2024-12-28T20:38:25.796062 - ### ComfyUI-FaceSwap: Check basic models
2024-12-28T20:38:25.943355 - Adding /Users/tashielyonns/ComfyUI/custom_nodes to sys.path
2024-12-28T20:38:25.991730 - Could not find efficiency nodes
2024-12-28T20:38:25.993320 - Loaded ControlNetPreprocessors nodes from /Users/tashielyonns/ComfyUI/custom_nodes/comfyui_controlnet_aux
2024-12-28T20:38:25.993505 - Could not find AdvancedControlNet nodes
2024-12-28T20:38:25.993879 - Could not find AnimateDiff nodes
2024-12-28T20:38:25.994054 - Could not find IPAdapter nodes
2024-12-28T20:38:25.995088 - Could not find VideoHelperSuite nodes
2024-12-28T20:38:25.995324 - Could not load ImpactPack nodes: Could not find ImpactPack nodes
2024-12-28T20:38:26.050554 - 
Import times for custom nodes:
2024-12-28T20:38:26.050618 -    0.0 seconds: /Users/tashielyonns/ComfyUI/custom_nodes/websocket_image_save.py
2024-12-28T20:38:26.050643 -    0.0 seconds: /Users/tashielyonns/ComfyUI/custom_nodes/ComfyUI_UltimateSDUpscale
2024-12-28T20:38:26.050661 -    0.1 seconds: /Users/tashielyonns/ComfyUI/custom_nodes/ComfyUI-Manager
2024-12-28T20:38:26.050681 -    0.1 seconds: /Users/tashielyonns/ComfyUI/custom_nodes/comfyui-art-venture
2024-12-28T20:38:26.050696 -    0.1 seconds: /Users/tashielyonns/ComfyUI/custom_nodes/ComfyUI-FaceSwap
2024-12-28T20:38:26.050711 -    0.4 seconds: /Users/tashielyonns/ComfyUI/custom_nodes/comfyui_controlnet_aux
2024-12-28T20:38:26.050725 - 
2024-12-28T20:38:26.053092 - Starting server

2024-12-28T20:38:26.053327 - To see the GUI go to: http://127.0.0.1:8188
2024-12-28T20:39:04.314658 - FETCH DATA from: /Users/tashielyonns/ComfyUI/custom_nodes/ComfyUI-Manager/extension-node-map.json [DONE]
2024-12-28T20:39:09.004365 - FETCH DATA from: /Users/tashielyonns/ComfyUI/custom_nodes/ComfyUI-Manager/.cache/1742899825_extension-node-map.json [DONE]
2024-12-28T20:39:09.035911 - FETCH DATA from: /Users/tashielyonns/ComfyUI/custom_nodes/ComfyUI-Manager/.cache/1514988643_custom-node-list.json [DONE]
2024-12-28T20:39:09.040955 - FETCH DATA from: /Users/tashielyonns/ComfyUI/custom_nodes/ComfyUI-Manager/.cache/746607195_github-stats.json [DONE]
2024-12-28T20:39:09.131997 - FETCH DATA from: /Users/tashielyonns/ComfyUI/custom_nodes/ComfyUI-Manager/.cache/1742899825_extension-node-map.json [DONE]
2024-12-28T20:39:20.260562 - got prompt
2024-12-28T20:39:20.584805 - model weight dtype torch.float16, manual cast: None
2024-12-28T20:39:20.585489 - model_type EPS
2024-12-28T20:39:23.761357 - Using split attention in VAE
2024-12-28T20:39:23.761797 - Using split attention in VAE
2024-12-28T20:39:23.857244 - VAE load device: mps, offload device: cpu, dtype: torch.bfloat16
2024-12-28T20:39:23.945538 - Requested to load SDXLClipModel
2024-12-28T20:39:23.951176 - loaded completely 9.5367431640625e+25 1560.802734375 True
2024-12-28T20:39:23.952638 - CLIP model load device: cpu, offload device: cpu, current: cpu, dtype: torch.float16
2024-12-28T20:39:25.954577 - loaded straight to GPU
2024-12-28T20:39:25.954912 - Requested to load SDXL
2024-12-28T20:39:25.970480 - loaded completely 9.5367431640625e+25 4897.0483474731445 True
2024-12-28T20:39:26.250895 - model_path is /Users/tashielyonns/ComfyUI/custom_nodes/comfyui_controlnet_aux/ckpts/lllyasviel/Annotators/ControlNetHED.pth
2024-12-28T20:39:30.485382 - Requested to load ControlNet
2024-12-28T20:39:32.135651 - loaded completely 9.5367431640625e+25 2395.547637939453 True
2024-12-28T20:41:11.373878 - 100%|███████████████████████████████████████████| 20/20 [01:39<00:00,  4.96s/it]
2024-12-28T20:41:11.539077 - Requested to load AutoencoderKL
2024-12-28T20:41:11.640917 - !!! Exception during processing !!! BFloat16 is not supported on MPS
2024-12-28T20:41:11.647614 - Traceback (most recent call last):
  [identical to the stack trace in the Error Report above]
TypeError: BFloat16 is not supported on MPS

2024-12-28T20:41:11.649024 - Prompt executed in 111.38 seconds
2024-12-28T20:49:20.520590 - got prompt
2024-12-28T20:49:20.545350 - xFormers not available
2024-12-28T20:49:20.546556 - xFormers not available
2024-12-28T20:49:20.548660 - model_path is /Users/tashielyonns/ComfyUI/custom_nodes/comfyui_controlnet_aux/ckpts/depth-anything/Depth-Anything-V2-Large/depth_anything_v2_vitl.pth
2024-12-28T20:49:20.553273 - using MLP layer as FFN
2024-12-28T20:49:24.990108 - /Users/tashielyonns/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/functional.py:4028: UserWarning: The operator 'aten::upsample_bicubic2d.out' is not currently supported on the MPS backend and will fall back to run on the CPU. This may have performance implications. (Triggered internally at /Users/runner/work/pytorch/pytorch/pytorch/aten/src/ATen/mps/MPSFallback.mm:13.)
  return torch._C._nn.upsample_bicubic2d(input, output_size, align_corners, scale_factors)
2024-12-28T20:49:25.828108 -   0%|                                                    | 0/20 [00:00<?, ?it/s]
2024-12-28T20:49:26.637174 - FETCH DATA from: /Users/tashielyonns/ComfyUI/custom_nodes/ComfyUI-Manager/.cache/4245046894_model-list.json [DONE]
2024-12-28T20:49:26.641252 - [ComfyUI-Manager] The target custom node for model download is not installed: custom_nodes/ControlNet-LLLite-ComfyUI/models
2024-12-28T20:49:26.641897 - [ComfyUI-Manager] The target custom node for model download is not installed: custom_nodes/pfg-ComfyUI/models
2024-12-28T20:49:26.642581 - [ComfyUI-Manager] The target custom node for model download is not installed: custom_nodes/ComfyUI-YoloWorld-EfficientSAM
2024-12-28T20:49:26.642690 - [ComfyUI-Manager] The target custom node for model download is not installed: custom_nodes/ComfyUI_FaceAnalysis/dlib
2024-12-28T20:49:26.642826 - [ComfyUI-Manager] The target custom node for model download is not installed: custom_nodes/ComfyUI_ID_Animator/models
2024-12-28T20:49:26.642983 - [ComfyUI-Manager] The target custom node for model download is not installed: custom_nodes/ComfyUI_ID_Animator/models/animatediff_models
2024-12-28T20:49:26.643283 - [ComfyUI-Manager] The target custom node for model download is not installed: custom_nodes/ComfyUI_ID_Animator/models/image_encoder
2024-12-28T20:49:26.643689 - [ComfyUI-Manager] The target custom node for model download is not installed: custom_nodes/ComfyUI_CustomNet/pretrain
2024-12-28T20:49:26.643770 - [ComfyUI-Manager] The target custom node for model download is not installed: custom_nodes/ComfyUI-ToonCrafter/ToonCrafter/checkpoints/tooncrafter_512_interp_v1
2024-12-28T20:49:26.644062 - [ComfyUI-Manager] The target custom node for model download is not installed: custom_nodes/comfyui-SegGPT
2024-12-28T20:51:06.262537 - 100%|███████████████████████████████████████████| 20/20 [01:40<00:00,  5.02s/it]
2024-12-28T20:51:06.411027 - Requested to load AutoencoderKL
2024-12-28T20:51:06.519035 - !!! Exception during processing !!! BFloat16 is not supported on MPS
2024-12-28T20:51:06.519901 - Traceback (most recent call last):
  [identical to the stack trace in the Error Report above]
TypeError: BFloat16 is not supported on MPS

2024-12-28T20:51:06.520816 - Prompt executed in 105.99 seconds

Attached Workflow

Please make sure that workflow does not contain any sensitive information such as API keys or passwords.

{"last_node_id":24,"last_link_id":48,"nodes":[{"id":9,"type":"SaveImage","pos":[1781.1925048828125,1079.08740234375],"size":[210,147.104736328125],"flags":{"collapsed":true},"order":20,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":46}],"outputs":[],"properties":{},"widgets_values":["ComfyUI"]},{"id":10,"type":"UltimateSDUpscale","pos":[392.36395263671875,1027.5501708984375],"size":[315,614],"flags":{},"order":15,"mode":4,"inputs":[{"name":"image","type":"IMAGE","link":13},{"name":"model","type":"MODEL","link":16},{"name":"positive","type":"CONDITIONING","link":22},{"name":"negative","type":"CONDITIONING","link":25},{"name":"vae","type":"VAE","link":28},{"name":"upscale_model","type":"UPSCALE_MODEL","link":29}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[30],"slot_index":0}],"properties":{"Node name for S&R":"UltimateSDUpscale"},"widgets_values":[2,853737773163045,"randomize",25,7,"euler","normal",0.2,"Linear",768,1344,8,32,"None",1,64,8,16,true,false]},{"id":17,"type":"UpscaleModelLoader","pos":[34.029170989990234,1328.7333984375],"size":[315,58],"flags":{},"order":0,"mode":4,"inputs":[],"outputs":[{"name":"UPSCALE_MODEL","type":"UPSCALE_MODEL","links":[29]}],"properties":{"Node name for S&R":"UpscaleModelLoader"},"widgets_values":["4xUltrasharp_4xUltrasharpV10.pt"]},{"id":11,"type":"LoadImage","pos":[37.3960075378418,1035.669189453125],"size":[304.5188903808594,314],"flags":{},"order":1,"mode":4,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[13]},{"name":"MASK","type":"MASK","links":null}],"properties":{"Node name for S&R":"LoadImage"},"widgets_values":["ComfyUI_00481_.png","image"]},{"id":18,"type":"SaveImage","pos":[744.3792114257812,1024.2474365234375],"size":[323.91571044921875,612.0211181640625],"flags":{},"order":17,"mode":4,"inputs":[{"name":"images","type":"IMAGE","link":30}],"outputs":[],"properties":{},"widgets_values":["ComfyUI"]},{"id":12,"type":"Reroute","pos":[1008.0634765625,811.454833984375],"size":[75,26],"flags":{},"order":6,"mode":0,"inputs":[{"name":"","type":"*","link":14}],"outputs":[{"name":"","type":"MODEL","links":[15,16],"slot_index":0}],"properties":{"showOutputText":false,"horizontal":false}},{"id":1,"type":"CheckpointLoaderSimple","pos":[-184.21978759765625,762.20654296875],"size":[315,98],"flags":{},"order":2,"mode":0,"inputs":[],"outputs":[{"name":"MODEL","type":"MODEL","links":[14],"slot_index":0},{"name":"CLIP","type":"CLIP","links":[17],"slot_index":1},{"name":"VAE","type":"VAE","links":[26],"slot_index":2}],"properties":{"Node name for S&R":"CheckpointLoaderSimple"},"widgets_values":["animagineXLV31_v31.safetensors"]},{"id":13,"type":"Reroute","pos":[145.2758331298828,740.0143432617188],"size":[75,26],"flags":{},"order":7,"mode":0,"inputs":[{"name":"","type":"*","link":17}],"outputs":[{"name":"","type":"CLIP","links":[18,19],"slot_index":0}],"properties":{"showOutputText":false,"horizontal":false}},{"id":3,"type":"CLIPTextEncode","pos":[-187.29647827148438,513.5159301757812],"size":[400,200],"flags":{},"order":11,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":19}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[23],"slot_index":0}],"properties":{"Node name for S&R":"CLIPTextEncode"},"widgets_values":["low quality, worst quality, normal quality, lowres, bad anatomy, bad hands, missing fingers, extra digit, fewer digits, watermark, artist name, signature, comic strips, multiple images, deformed, bad proportions"]},{"id":14,"type":"Reroute","pos":[1008.1453247070312,845.4545288085938],"size":[75,26],"flags":{},"order":13,"mode":0,"inputs":[{"name":"","type":"*","link":32}],"outputs":[{"name":"","type":"CONDITIONING","links":[22,36],"slot_index":0}],"properties":{"showOutputText":false,"horizontal":false},"color":"#232","bgcolor":"#353"},{"id":15,"type":"Reroute","pos":[1007.6373901367188,879.398681640625],"size":[75,26],"flags":{},"order":14,"mode":0,"inputs":[{"name":"","type":"*","link":23}],"outputs":[{"name":"","type":"CONDITIONING","links":[25,37],"slot_index":0}],"properties":{"showOutputText":false,"horizontal":false},"color":"#322","bgcolor":"#533"},{"id":20,"type":"ControlNetLoader","pos":[1205.95166015625,886.4567260742188],"size":[315,58],"flags":{},"order":3,"mode":0,"inputs":[],"outputs":[{"name":"CONTROL_NET","type":"CONTROL_NET","links":[38]}],"properties":{"Node name for S&R":"ControlNetLoader"},"widgets_values":["diffusion_pytorch_model.safetensors"]},{"id":23,"type":"PreviewImage","pos":[1872.6842041015625,568.694091796875],"size":[309.97857666015625,246],"flags":{},"order":12,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":43}],"outputs":[],"properties":{"Node name for S&R":"PreviewImage"},"widgets_values":[]},{"id":8,"type":"PreviewImage","pos":[1547.629150390625,1133.2681884765625],"size":[353.3922424316406,322.2351989746094],"flags":{},"order":21,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":47}],"outputs":[],"properties":{"Node name for S&R":"PreviewImage"},"widgets_values":[]},{"id":6,"type":"EmptyLatentImage","pos":[1211.9224853515625,1344.1856689453125],"size":[315,106],"flags":{},"order":4,"mode":0,"inputs":[],"outputs":[{"name":"LATENT","type":"LATENT","links":[7]}],"properties":{"Node name for S&R":"EmptyLatentImage"},"widgets_values":[896,1152,1]},{"id":2,"type":"CLIPTextEncode","pos":[-187.18206787109375,266.0978088378906],"size":[400,200],"flags":{},"order":10,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":18}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[32],"slot_index":0}],"properties":{"Node name for S&R":"CLIPTextEncode"},"widgets_values":["ren shikimura, (masterpiece), (best quality), (ultra-detailed), very aesthetic, illustration, perfect composition, absurdres,\nblonde girl, long hair, white mini skirt, sexy, leaning against car, (white porsche car:1.1), countryside, blue skies,\ndepth of field, style of Pascal Campion"]},{"id":21,"type":"LoadImage","pos":[1204.0423583984375,527.7063598632812],"size":[316.836181640625,314],"flags":{},"order":5,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[40],"slot_index":0},{"name":"MASK","type":"MASK","links":null,"slot_index":1}],"properties":{"Node name for S&R":"LoadImage"},"widgets_values":["IMG_0476.png","image"]},{"id":19,"type":"ControlNetApplyAdvanced","pos":[1537.489013671875,760.0439453125],"size":[315,186],"flags":{},"order":16,"mode":0,"inputs":[{"name":"positive","type":"CONDITIONING","link":36},{"name":"negative","type":"CONDITIONING","link":37},{"name":"control_net","type":"CONTROL_NET","link":38},{"name":"image","type":"IMAGE","link":41},{"name":"vae","type":"VAE","link":44,"shape":7}],"outputs":[{"name":"positive","type":"CONDITIONING","links":[33],"slot_index":0},{"name":"negative","type":"CONDITIONING","links":[42],"slot_index":1}],"properties":{"Node name for S&R":"ControlNetApplyAdvanced"},"widgets_values":[0.8,0,1]},{"id":4,"type":"KSampler","pos":[1209.17919921875,1035.5013427734375],"size":[315,262],"flags":{},"order":18,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":15},{"name":"positive","type":"CONDITIONING","link":33},{"name":"negative","type":"CONDITIONING","link":42},{"name":"latent_image","type":"LATENT","link":7}],"outputs":[{"name":"LATENT","type":"LATENT","links":[45],"slot_index":0}],"properties":{"Node name for S&R":"KSampler"},"widgets_values":[1025830907027343,"randomize",20,7,"euler","normal",1]},{"id":24,"type":"VAEDecode","pos":[1546.99853515625,1038.4605712890625],"size":[210,46],"flags":{},"order":19,"mode":0,"inputs":[{"name":"samples","type":"LATENT","link":45},{"name":"vae","type":"VAE","link":48}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[46,47],"slot_index":0}],"properties":{"Node name for S&R":"VAEDecode"}},{"id":16,"type":"Reroute","pos":[1007.2833251953125,913.4507446289062],"size":[75,26],"flags":{},"order":8,"mode":0,"inputs":[{"name":"","type":"*","link":26}],"outputs":[{"name":"","type":"VAE","links":[28,44,48],"slot_index":0}],"properties":{"showOutputText":false,"horizontal":false}},{"id":22,"type":"AV_ControlNetPreprocessor","pos":[1539.00341796875,568.6806030273438],"size":[315,150],"flags":{},"order":9,"mode":0,"inputs":[{"name":"image","type":"IMAGE","link":40}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[41,43],"slot_index":0},{"name":"CNET_NAME","type":"STRING","links":null,"slot_index":1}],"properties":{"Node name for 
S&R":"AV_ControlNetPreprocessor"},"widgets_values":["depth","sdxl",896,"None"]}],"links":[[7,6,0,4,3,"LATENT"],[13,11,0,10,0,"IMAGE"],[14,1,0,12,0,"*"],[15,12,0,4,0,"MODEL"],[16,12,0,10,1,"MODEL"],[17,1,1,13,0,"*"],[18,13,0,2,0,"CLIP"],[19,13,0,3,0,"CLIP"],[22,14,0,10,2,"CONDITIONING"],[23,3,0,15,0,"*"],[25,15,0,10,3,"CONDITIONING"],[26,1,2,16,0,"*"],[28,16,0,10,4,"VAE"],[29,17,0,10,5,"UPSCALE_MODEL"],[30,10,0,18,0,"IMAGE"],[32,2,0,14,0,"*"],[33,19,0,4,1,"CONDITIONING"],[36,14,0,19,0,"CONDITIONING"],[37,15,0,19,1,"CONDITIONING"],[38,20,0,19,2,"CONTROL_NET"],[40,21,0,22,0,"IMAGE"],[41,22,0,19,3,"IMAGE"],[42,19,1,4,2,"CONDITIONING"],[43,22,0,23,0,"IMAGE"],[44,16,0,19,4,"VAE"],[45,4,0,24,0,"LATENT"],[46,24,0,9,0,"IMAGE"],[47,24,0,8,0,"IMAGE"],[48,16,0,24,1,"VAE"]],"groups":[{"id":1,"title":"Group","bounding":[1201.098876953125,963.481689453125,731.9964599609375,752.3034057617188],"color":"#3f789e","font_size":24,"flags":{}},{"id":3,"title":"Group","bounding":[22.693655014038086,958.749755859375,1133.987060546875,690.6932373046875],"color":"#3f789e","font_size":24,"flags":{}},{"id":4,"title":"Group","bounding":[1001.7008056640625,770.2759399414062,140,174.17152404785156],"color":"#3f789e","font_size":24,"flags":{}}],"config":{},"extra":{"ds":{"scale":0.5730855330116827,"offset":[300.5192518185021,-95.18316480387057]},"VHS_latentpreview":false,"VHS_latentpreviewrate":0},"version":0.4}


### Other

I do not want to have to run this on my CPU; I've heard a similar issue was resolved that way, but it was extremely slow.

Lyonns avatar Dec 29 '24 02:12 Lyonns

Try updating pytorch.

comfyanonymous avatar Dec 29 '24 02:12 comfyanonymous

This has also been happening to me since ComfyUI Manager updated a couple of days ago. VAE won't work; same "BFloat16 is not supported on MPS" error. If an image is VAE encoded it throws the error right away; if I start with a blank latent it runs the prompt and crashes with the error as soon as it goes from the KSampler to the VAE. I have tried various updates of PyTorch and even tried going back to older PyTorch versions. Using SD1.5; have tried various VAEs also. Apple M1 on Sonoma 14.5.

** ComfyUI startup time: 2024-12-29 19:22:57.818427
** Platform: Darwin
** Python version: 3.11.6 (main, Oct 2 2023, 13:45:54) [Clang 15.0.0 (clang-1500.0.40.1)]
** Python executable: /Users/joeyscat21/ComfyUI/venv/bin/python
** ComfyUI Path: /Users/joeyscat21/ComfyUI
** Log path: /Users/joeyscat21/ComfyUI/comfyui.log

Prestartup times for custom nodes: 1.2 seconds: /Users/joeyscat21/ComfyUI/custom_nodes/ComfyUI-Manager

Total VRAM 16384 MB, total RAM 16384 MB
pytorch version: 2.1.2
Set vram state to: SHARED
Device: mps

joeyscat21 avatar Dec 29 '24 19:12 joeyscat21

You need at least PyTorch 2.3 for bf16 support on MPS.
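If you want to sanity-check this from code, here is a minimal stdlib-only sketch (the 2.3 threshold comes from this thread, and `supports_mps_bf16` is a hypothetical helper for illustration, not a torch API):

```python
# Hypothetical helper: compare an installed torch version string against
# the 2.3 minimum mentioned in this thread (an assumption, not official docs).
def supports_mps_bf16(version_string):
    """Return True if the major.minor version is at least 2.3 (ignoring +/dev suffixes)."""
    base = version_string.split("+")[0]
    parts = []
    for piece in base.split(".")[:2]:
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits or 0))
    return tuple(parts) >= (2, 3)

# Example, run from inside the ComfyUI venv:
#   import torch
#   print(supports_mps_bf16(torch.__version__))
```

This catches cases like the 2.1.2 reported above, where the error is expected.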

Adreitz avatar Dec 30 '24 00:12 Adreitz

Thanks very much for your reply. Attempted to update torch.

If I run pip3 show torch I get this

~ % pip3 show torch
Name: torch
Version: 2.5.1
Summary: Tensors and Dynamic neural networks in Python with strong GPU acceleration
Home-page: https://pytorch.org/
Author: PyTorch Team
Author-email: @.***
License: BSD-3-Clause
Location: /Users/joeyscat25/miniconda3/lib/python3.12/site-packages

Seems to indicate torch 2.5.1?

If I then start up the Comfy UI script I get

venv/bin/python main.py
[START] Security scan
[DONE] Security scan

ComfyUI-Manager: installing dependencies done.

** ComfyUI startup time: 2024-12-30 18:53:13.000691
** Platform: Darwin
** Python version: 3.11.6 (main, Oct 2 2023, 13:45:54) [Clang 15.0.0 (clang-1500.0.40.1)]
** Python executable: /Users/joeyscat21/ComfyUI/venv/bin/python
** ComfyUI Path: /Users/joeyscat21/ComfyUI
** Log path: /Users/joeyscat21/ComfyUI/comfyui.log

Prestartup times for custom nodes: 1.2 seconds: /Users/joeyscat21/ComfyUI/custom_nodes/ComfyUI-Manager

Total VRAM 16384 MB, total RAM 16384 MB
pytorch version: 2.1.2

Which is not what I need. Any help would be appreciated.

joeyscat21 avatar Dec 30 '24 19:12 joeyscat21

@joeyscat21 Did you run pip show from within the venv? If you just run it from the regular terminal, it will show what is installed in your OS, which does not apply to comfyui -- virtual environments are used so that CLI programs with different dependency requirements can all be run without messing each other up. Comfyui can only see the packages that are installed within its virtual environment, not the host system or any other virtual environment.

Since you have Comfyui Manager, I recommend using it to update to the latest torch nightlies. Open the manager window, click "Install PIP packages" and copy/paste the following into the dialog that appears: --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cpu Then click the button to restart the backend and it will download and install the new versions.
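To confirm which interpreter (and therefore which torch install) you are actually inspecting, a small stdlib-only sketch helps; run it with the same python that launches ComfyUI, e.g. `venv/bin/python` (nothing here is a ComfyUI API):

```python
import sys

def in_virtualenv():
    # Inside a venv, sys.prefix points at the venv directory while
    # sys.base_prefix still points at the base interpreter.
    return sys.prefix != sys.base_prefix

print("interpreter:", sys.executable)
print("inside a venv:", in_virtualenv())
```

If this prints `inside a venv: False`, your `pip show torch` was inspecting the system Python, not ComfyUI's environment.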

Adreitz avatar Dec 31 '24 03:12 Adreitz

Thanks for the reply again. Yes you're correct I ran the pip show in the terminal.

As recommended, I tried the pip update in the Manager, but it won't let me; it gives me this error: "This action is not allowed with this security level configuration."

I can't see any security settings in the Manager to change anything.

Appreciate the help,

joeyscat21 avatar Dec 31 '24 10:12 joeyscat21

@joeyscat21 True, I ran into this issue a couple months ago, but forgot about it after I fixed it. It's a security thing to try to prevent people from installing malicious nodes and packages, which has been a problem a couple times that I know of.

In order to get around it, you'll need to find the config.ini for the Manager. Since I installed comfyui through Pinokio, I have it within [home folder]/pinokio/api/ComfyUI/app/custom_nodes/ComfyUI-Manager. Open it in TextEdit and you should see a line, security_level = <something>. If you set it to normal- and restart comfy, it shouldn't complain about installing pip packages anymore. The details of the different security settings are here: https://github.com/ltdrdata/ComfyUI-Manager#security-policy.
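For reference, the relevant line in the Manager's config.ini looks roughly like this (the section name and file location vary per install; this is a sketch based on the description above, not the exact shipped file):

```ini
[default]
security_level = normal-
```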

Adreitz avatar Dec 31 '24 14:12 Adreitz

Madness :-) finally found the config. It hadn't actually got a security_level line, so I added it as normal- as suggested, and Manager has allowed pip to update. Now have Manager saying:

OS: posix
Python Version: 3.11.6 (main, Oct 2 2023, 13:45:54) [Clang 15.0.0 (clang-1500.0.40.1)]
Embedded Python: false
Pytorch Version: 2.6.0.dev20241231

And Comfy is successfully running prompts without VAE crashing it. Result.

Thanks for your help!!

joeyscat21 avatar Dec 31 '24 17:12 joeyscat21

I'm trying to run COMFYUI with flux and am getting a similar BFloat 16 error. Would updating pytorch solve the issue?

bigeye-studios avatar Jan 07 '25 05:01 bigeye-studios

I upgraded pytorch and things are working; however, now I'm getting this message: SamplerCustomAdvanced: Trying to convert Float8_e4m3fn to the MPS backend but it does not have support for that dtype.

bigeye-studios avatar Jan 08 '25 05:01 bigeye-studios

I upgraded pytorch and things are working however now I'm getting this message: SamplerCustomAdvanced Trying to convert Float8_e4m3fn to the MPS backend but it does not have support for that dtype.

I use ComfyUI Desktop on macOS and have the same problem. Did you find how to solve it?

salmazov avatar Jan 09 '25 16:01 salmazov

Madness :-) finally found the config it hadn’t actually got a security level line so I added it normal- as suggested and Manager has allowed pip to update. Now have manager saying

Which file did you update for this. I am on Macos and don't have an ini file

spipe007 avatar Jan 14 '25 01:01 spipe007

It’s in /custom_nodes/ComfyUI-Manager/

I added the line security_level = normal-

joeyscat21 avatar Jan 14 '25 09:01 joeyscat21

ATTENTION: A recent update to ComfyUI-Manager has moved the location of the config.ini file!

In my case, the old config.ini was not relocated and there was no notice that there was a new location or what it was. I simply lost the ability to install pip packages because Manager was no longer reading the config.ini that I knew about. I filed a bug report (https://github.com/ltdrdata/ComfyUI-Manager/issues/1416) and ltdrdata updated Manager to list the config file location in the terminal when ComfyUI launches. In my case, it is now located at: /Users/[my user folder]/pinokio/api/ComfyUI/app/user/default/ComfyUI-Manager/config.ini

In order to regain your ability to install pip packages, you should look for this line in the terminal output to find and edit the security setting in the new file.

Adreitz avatar Jan 14 '25 16:01 Adreitz

I can run it using python3 main.py --cpu, but it runs very slowly. If I use python3 main.py, it prompts the error shown in the picture.

Image

The following is the output of pip show torch:

Name: torch
Version: 2.6.0.dev20241112
Summary: Tensors and Dynamic neural networks in Python with strong GPU acceleration
Home-page: https://pytorch.org/
Author: PyTorch Team
Author-email: [email protected]
License: BSD-3-Clause
Location: /Users/shaqiushi/miniconda3/lib/python3.12/site-packages
Requires: filelock, fsspec, jinja2, networkx, setuptools, sympy, typing-extensions
Required-by: accelerate, clip-interrogator, facexlib, fairscale, kornia, open_clip_torch, pixeloe, spandrel, stanza, timm, torchaudio, torchsde, torchvision, transparent-background, ultralytics, ultralytics-thop

crazyshaqiushi avatar Jan 18 '25 18:01 crazyshaqiushi

Hi. I have the same problem, and it seems that my pytorch is stuck at 2.2.2. I have tried manually updating from the nightly build, but it still defaults to 2.2.2. Not sure what else to do; I'm in a bit of a bind.

denbuzzsg avatar Jan 24 '25 09:01 denbuzzsg

I have the same issue, pytorch also stuck on 2.2.2 and BFloat16 is not supported on MPS. Might need to go back to my old Windows NVidia Laptop for AI related work.

staticfuzz1 avatar Feb 02 '25 11:02 staticfuzz1

I'm still struggling with the VAEDecode "BFloat16 is not supported on MPS" issue. None of the solutions mentioned above have worked for me.

Torch Version: 2.7.0.dev20250205
Python Version: 3.13.1

What's interesting is that the error logs show the following config info:

ComfyUI Version: 0.3.13
Arguments: main.py
OS: posix
Python Version: 3.10.16 (main, Dec 3 2024, 17:27:57) [Clang 16.0.0 (clang-1600.0.26.6)]
Embedded Python: false
PyTorch Version: 2.2.2

The Python and PyTorch versions in the logs don't match the ones installed on my Mac.

I updated both Python and PyTorch accordingly, but the BFloat16 is not supported on MPS error still persists.

After researching, I discovered that if you open the comfy/model_management.py file and replace all instances of torch.bfloat16 with torch.float32, then save the file and relaunch ComfyUI using python main.py, the error should be resolved.

But after that, you might have another issue like SaveImage Numpy is not available

duskat avatar Feb 05 '25 19:02 duskat

Hello MacOS Users. There are several things that could be causing these issues for you.

  1. Old PyTorch.
  2. Old Python.

I can force BF16 unet and BF16 vae on MPS and run fine, but I have manually installed everything many times before. Here is a random image to prove this is so, made with a bf16 checkpoint.

Image

M4 on macOS Sequoia 15.3. I have installed with uv.

torch 2.5.1
torchaudio 2.5.1
torchsde 0.2.6
torchvision 0.20.1

Run the torch install with --pre from the pytorch /nightly index. Use the `where python` and `python --version` commands to check your versions (there can *and should be* multiple installs of python on one computer). Also, if you get a `float8_e4m3fn` error, verify you are not using an fp8-converted model (or VAE, if that's a thing). It simply won't work.

Beyond that, my advice is to try the beta desktop client if you are still lost on how to proceed.

exdysa avatar Feb 08 '25 22:02 exdysa

I have a plan to deal with the problem (if your Mac (M1/M2/M3) is on macOS 11 or above):

  1. torch version: ^2.3~
  2. python version: ^3.0~

The most important thing is to ensure that your terminal is not running in Rosetta 2 mode. Run the command uname -m; the output must be arm64. Then reinstall all extensions, and it should be OK.

zhendonger avatar Mar 05 '25 10:03 zhendonger

For me this worked:

  1. Open ComfyUI/comfy/model_management.py Change (line 868 atm):
# NOTE: bfloat16 seems to work on AMD for the VAE but is extremely slow in some cases compared to fp32
        if d == torch.bfloat16 and (not is_amd()) and should_use_bf16(device):
            return d

to

# NOTE: bfloat16 seems to work on AMD for the VAE but is extremely slow in some cases compared to fp32
        if d == torch.bfloat16 and (not is_amd()) and should_use_bf16(device) and device.type != 'mps':
            return d

userH309 avatar Apr 26 '25 11:04 userH309

🆒, solved perfectly.

personalityb0y avatar May 06 '25 08:05 personalityb0y

Adding a note: if you are on Apple Silicon and the latest nightly PyTorch version you can get is only 2.2, check whether your current Python environment is x86 or arm (https://blog.slray.com/2024/01/10/Check-Which-System-is-the-Python-based-on/). An x86 Python is only compatible up to PyTorch v2.2.

You might want to reinstall brew (which will install the ARM version of brew) and reinstall python. With ARM version of python, you can install later versions of pytorch. https://docs.brew.sh/Common-Issues#other-local-issues
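A quick stdlib-only way to check which architecture your Python build reports (arm64 native vs x86_64 under Rosetta 2); `python_arch` is just an illustrative wrapper:

```python
import platform

def python_arch():
    # Reports the architecture of the running Python build:
    # 'arm64' for a native Apple Silicon build, 'x86_64' under Rosetta 2.
    return platform.machine()

print(python_arch())
```

If this prints `x86_64` on an Apple Silicon Mac, that Python build is the one capped at the older torch wheels.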

nhanpdnguyen avatar May 09 '25 05:05 nhanpdnguyen

I upgraded PyTorch and it handles MPS better. I was able to output a 33-frame, 640x376 video to WebP in 10 minutes. Mac mini pro 32GB

Charlie-Perso avatar Jul 29 '25 02:07 Charlie-Perso