
WanVideoSampler TypeError: expected str, bytes or os.PathLike object, not NoneType

nana10255 opened this issue 1 month ago · 2 comments

Custom Node Testing

Expected Behavior

The sampler should run to completion without errors.

Actual Behavior

An error occurred when running Kijai's wan2.2_I2V workflow.

Steps to Reproduce

[workflow screenshots attached]

Debug Logs

# ComfyUI Error Report
## Error Details
- **Node ID:** 27
- **Node Type:** WanVideoSampler
- **Exception Type:** torch._inductor.exc.InductorError
- **Exception Message:** TypeError: expected str, bytes or os.PathLike object, not NoneType

Set TORCHDYNAMO_VERBOSE=1 for the internal stack trace (please do this especially if you're reporting a bug to PyTorch). For even more developer context, set TORCH_LOGS="+dynamo"


## Stack Trace

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\ComfyUI\execution.py", line 515, in execute
    output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
                                                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\ComfyUI\execution.py", line 329, in get_output_data
    return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\ComfyUI\execution.py", line 303, in _async_map_node_over_list
    await process_inputs(input_dict, i)

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\ComfyUI\execution.py", line 291, in process_inputs
    result = f(**inputs)
             ^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\ComfyUI\custom_nodes\ComfyUI-WanVideoWrapper\nodes_sampler.py", line 3190, in process
    raise e

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\ComfyUI\custom_nodes\ComfyUI-WanVideoWrapper\nodes_sampler.py", line 3054, in process
    noise_pred, noise_pred_ovi, self.cache_state = predict_with_cfg(
                                                   ^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\ComfyUI\custom_nodes\ComfyUI-WanVideoWrapper\nodes_sampler.py", line 1664, in predict_with_cfg
    raise e

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\ComfyUI\custom_nodes\ComfyUI-WanVideoWrapper\nodes_sampler.py", line 1533, in predict_with_cfg
    noise_pred_cond, noise_pred_ovi, cache_state_cond = transformer(
                                                        ^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\nn\modules\module.py", line 1751, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\nn\modules\module.py", line 1762, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\ComfyUI\custom_nodes\ComfyUI-WanVideoWrapper\wanvideo\modules\model.py", line 3014, in forward
    x, x_ip, lynx_ref_feature, x_ovi = block(x, x_ip=x_ip, lynx_ref_feature=lynx_ref_feature, x_ovi=x_ovi, x_onetoall_ref=x_onetoall_ref, onetoall_freqs=onetoall_freqs, **kwargs)
                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\nn\modules\module.py", line 1751, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\nn\modules\module.py", line 1762, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\_dynamo\eval_frame.py", line 663, in _fn
    raise e.remove_dynamo_frames() from None  # see TORCHDYNAMO_VERBOSE=1
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\_inductor\compile_fx.py", line 760, in _compile_fx_inner
    raise InductorError(e, currentframe()).with_traceback(

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\_inductor\compile_fx.py", line 745, in _compile_fx_inner
    mb_compiled_graph = fx_codegen_and_compile(
                        ^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\_inductor\compile_fx.py", line 1295, in fx_codegen_and_compile
    return scheme.codegen_and_compile(gm, example_inputs, inputs_to_check, graph_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\_inductor\compile_fx.py", line 1197, in codegen_and_compile
    compiled_fn = graph.compile_to_module().call
                  ^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\_inductor\graph.py", line 2083, in compile_to_module
    return self._compile_to_module()
           ^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\_inductor\graph.py", line 2091, in _compile_to_module
    self.codegen_with_cpp_wrapper() if self.cpp_wrapper else self.codegen()
                                                             ^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\_inductor\graph.py", line 2002, in codegen
    self.scheduler.codegen()

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\_inductor\scheduler.py", line 4135, in codegen
    else self._codegen(self.nodes)
         ^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\_inductor\scheduler.py", line 4264, in _codegen
    self.get_backend(device).codegen_node(node)

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\_inductor\codegen\cuda_combined_scheduling.py", line 104, in codegen_node
    return self._triton_scheduling.codegen_node(node)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\_inductor\codegen\simd.py", line 1320, in codegen_node
    return self.codegen_node_schedule(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\_inductor\codegen\simd.py", line 1365, in codegen_node_schedule
    src_code = kernel.codegen_kernel()
               ^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\_inductor\codegen\triton.py", line 3623, in codegen_kernel
    **self.inductor_meta_common(),
      ^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\_inductor\codegen\triton.py", line 3447, in inductor_meta_common
    "backend_hash": torch.utils._triton.triton_hash_with_backend(),
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\torch\utils\_triton.py", line 112, in triton_hash_with_backend
    key = f"{triton_key()}-{backend.hash()}"
                            ^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\triton\backends\nvidia\compiler.py", line 336, in hash
    version = get_ptxas_version()
              ^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\triton\backends\nvidia\compiler.py", line 38, in get_ptxas_version
    version = subprocess.check_output([_path_to_binary("ptxas")[0], "--version"]).decode("utf-8")
                                       ^^^^^^^^^^^^^^^^^^^^^^^^

  File "D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\python\Lib\site-packages\triton\backends\nvidia\compiler.py", line 23, in _path_to_binary
    os.path.join(os.environ.get("CUDA_PATH"), "bin", binary),
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "<frozen ntpath>", line 108, in join


## System Information
- **ComfyUI Version:** 0.4.0
- **Arguments:** D:\AIGC\ComfyUI-aki(2)\ComfyUI-aki-v1.7\ComfyUI-aki-v1.7\ComfyUI\main.py --auto-launch --preview-method auto --use-sage-attention --normalvram --disable-smart-memory --disable-cuda-malloc --cpu-vae --reserve-vram 8
- **OS:** win32
- **Python Version:** 3.11.9 (tags/v3.11.9:de54cf5, Apr  2 2024, 10:12:12) [MSC v.1938 64 bit (AMD64)]
- **Embedded Python:** false
- **PyTorch Version:** 2.7.0+cu128
## Devices

- **Name:** cuda:0 NVIDIA GeForce RTX 3070 Laptop GPU : cudaMallocAsync
  - **Type:** cuda
  - **VRAM Total:** 8589410304
  - **VRAM Free:** 7341307092
  - **Torch VRAM Total:** 201326592
  - **Torch VRAM Free:** 139687124

Other

[screenshot attached]

— nana10255, Dec 12 '25 09:12

Help me solve this problem, please — thanks.

— nana10255, Dec 12 '25 09:12

Try `export TORCHDYNAMO_VERBOSE=1` and then `python main.py` to get the exact error.

— Vijay2359, Dec 16 '25 06:12
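For what it's worth, the bottom frames of the trace already pin down the cause: `triton\backends\nvidia\compiler.py` calls `os.path.join(os.environ.get("CUDA_PATH"), "bin", binary)`, and `os.environ.get` returns `None` when the `CUDA_PATH` environment variable is unset, which produces exactly this `TypeError`. A minimal diagnostic sketch (the `check_cuda_path` helper below is hypothetical, not part of Triton or the node pack):

```python
import os

def check_cuda_path():
    """Reproduce the check Triton's _path_to_binary implicitly relies on.

    The failing line in the trace is effectively:
        os.path.join(os.environ.get("CUDA_PATH"), "bin", binary)
    os.path.join(None, ...) raises
    "TypeError: expected str, bytes or os.PathLike object, not NoneType".
    """
    cuda_path = os.environ.get("CUDA_PATH")
    if cuda_path is None:
        raise RuntimeError(
            "CUDA_PATH is not set; Triton cannot locate ptxas. "
            "Point it at your CUDA Toolkit root before launching ComfyUI."
        )
    # ptxas ships with the CUDA Toolkit, not with the GPU driver.
    ptxas = os.path.join(
        cuda_path, "bin", "ptxas.exe" if os.name == "nt" else "ptxas"
    )
    if not os.path.isfile(ptxas):
        raise RuntimeError(f"ptxas not found at {ptxas}")
    return ptxas
```

In other words, installing a CUDA Toolkit (one matching the cu128 PyTorch build) and setting `CUDA_PATH` to its root, or simply bypassing the WanVideo compile/`torch.compile` node, should avoid the crash. Also note that `export` is Linux/macOS shell syntax; in a Windows cmd prompt the equivalent is `set TORCHDYNAMO_VERBOSE=1`.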