An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
When I try to use some nodes, this issue always happens. How can I change all the default "huggingface.co" endpoints to "https://hf-mirror.com/"? Many thanks.
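For nodes that download through huggingface_hub, the endpoint can usually be redirected to the mirror by setting the HF_ENDPOINT environment variable before ComfyUI (and therefore huggingface_hub) starts. A minimal sketch, assuming the node calls hf_hub_download/snapshot_download rather than a hard-coded huggingface.co URL:

```python
import os

# Assumption: this must be set before huggingface_hub is imported anywhere
# (e.g. in the launcher environment), otherwise the default endpoint is already cached.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

from huggingface_hub import hf_hub_download

# Quick check that downloads now resolve against the mirror.
print(hf_hub_download("openai/clip-vit-large-patch14", "config.json"))
```

Setting the same variable in the shell or launcher before starting main.py should make every custom node that goes through huggingface_hub use the mirror; nodes that hard-code the full huggingface.co URL are not affected.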
This issue is being marked stale because it has not had any activity for 30 days. Reply below within 7 days if your issue still isn't solved, and it will be left open. Otherwise, the issue will be closed automatically.
ComfyUI itself does not access external sites; this is most likely caused by ComfyUI-Manager or another custom node. In the case of ComfyUI-Manager, functionality similar to a reverse proxy is being considered for the future.
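Another workaround, since the logs below show snapshot_download failing for xtuner/llava-llama-3-8b-v1_1-transformers, is to fetch the repository once (for example via the mirror) into the folder the node already expects, so no Hub access is needed at run time. A rough sketch, assuming the paths from this report:

```python
import os

# Assumption: use the mirror for this one-off download; set before importing huggingface_hub.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

from huggingface_hub import snapshot_download

# Target folder taken from the "Downloading model to:" line in the log below.
snapshot_download(
    repo_id="xtuner/llava-llama-3-8b-v1_1-transformers",
    local_dir=r"D:\ComfyUI-aki\models\LLM\llava-llama-3-8b-v1_1-transformers",
)
```

The same idea applies to the EVA-CLIP checkpoint that PulidFluxEvaClipLoader tries to fetch; once the files are available locally, the LocalEntryNotFoundError should no longer be raised.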
ComfyUI Error Report
Error Details
- Node ID: 51
- Node Type: PulidFluxEvaClipLoader
- Exception Type: huggingface_hub.errors.LocalEntryNotFoundError
- Exception Message: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
Stack Trace
File "D:\ComfyUI-aki\execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 170, in _map_node_over_list
process_inputs({})
File "D:\ComfyUI-aki\execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-PuLID-Flux-Enhanced\pulidflux.py", line 298, in load_eva_clip
model, _, _ = create_model_and_transforms('EVA02-CLIP-L-14-336', 'eva_clip', force_custom_clip=True)
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-PuLID-Flux-Enhanced\eva_clip\factory.py", line 377, in create_model_and_transforms
model = create_model(
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-PuLID-Flux-Enhanced\eva_clip\factory.py", line 279, in create_model
checkpoint_path = download_pretrained(pretrained_cfg, cache_dir=cache_dir)
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-PuLID-Flux-Enhanced\eva_clip\pretrained.py", line 328, in download_pretrained
target = download_pretrained_from_hf(model_id, filename=filename, cache_dir=cache_dir)
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-PuLID-Flux-Enhanced\eva_clip\pretrained.py", line 300, in download_pretrained_from_hf
cached_file = hf_hub_download(model_id, filename, revision=revision, cache_dir=cache_dir)
File "<enhanced_experience patches.hfmirror.huggingface_hub>", line 47, in hf_hub_download_wrapper_inner
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\file_download.py", line 860, in hf_hub_download
return _hf_hub_download_to_cache_dir(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\file_download.py", line 967, in _hf_hub_download_to_cache_dir
_raise_on_head_call_error(head_call_error, force_download, local_files_only)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\file_download.py", line 1485, in _raise_on_head_call_error
raise LocalEntryNotFoundError(
System Information
- ComfyUI Version: 0.3.12
- Arguments: D:\ComfyUI-aki\main.py --auto-launch --preview-method auto --disable-cuda-malloc --fast
- OS: nt
- Python Version: 3.10.11 (tags/v3.10.11:7d4cc5a, Apr 5 2023, 00:38:17) [MSC v.1929 64 bit (AMD64)]
- Embedded Python: false
- PyTorch Version: 2.5.0+cu124
Devices
- Name: cuda:0 NVIDIA GeForce RTX 4060 Ti : cudaMallocAsync
- Type: cuda
- VRAM Total: 17175150592
- VRAM Free: 9956713976
- Torch VRAM Total: 5133828096
- Torch VRAM Free: 90662392
Logs
2025-01-18T16:24:33.104607 - end_vram - start_vram: 13301993064 - 134745576 = 131672474882025-01-18T16:24:33.104607 -
2025-01-18T16:24:33.109618 - #59 [HyVideoSampler]: 84.95s - vram 13167247488b2025-01-18T16:24:33.109618 -
2025-01-18T16:24:39.874030 - end_vram - start_vram: 7011184030 - 134745576 = 68764384542025-01-18T16:24:39.874030 -
2025-01-18T16:24:39.875026 - #60 [HyVideoDecode]: 6.76s - vram 6876438454b2025-01-18T16:24:39.875026 -
2025-01-18T16:24:40.120205 - end_vram - start_vram: 134745576 - 134745576 = 02025-01-18T16:24:40.120205 -
2025-01-18T16:24:40.121202 - #72 [easy cleanGpuUsed]: 0.25s - vram 0b2025-01-18T16:24:40.121202 -
2025-01-18T16:24:40.401265 - end_vram - start_vram: 134745576 - 134745576 = 02025-01-18T16:24:40.401265 -
2025-01-18T16:24:40.402262 - #71 [VHS_VideoCombine]: 0.27s - vram 0b2025-01-18T16:24:40.402262 -
2025-01-18T16:24:40.404255 - Prompt executed in 106.24 seconds
2025-01-18T16:33:59.651513 - got prompt
2025-01-18T16:33:59.704706 - end_vram - start_vram: 132644984 - 132644984 = 02025-01-18T16:33:59.704706 -
2025-01-18T16:33:59.705702 - #65 [LoadImage]: 0.02s - vram 0b2025-01-18T16:33:59.705702 -
2025-01-18T16:34:10.733260 - !!! Exception during processing !!! Only vision_languague models support image input
2025-01-18T16:34:10.736249 - Traceback (most recent call last):
File "D:\ComfyUI-aki\execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 174, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\ComfyUI-aki\execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-HunyuanVideoWrapper\nodes.py", line 884, in process
prompt_embeds, negative_prompt_embeds, attention_mask, negative_attention_mask = encode_prompt(self,
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-HunyuanVideoWrapper\nodes.py", line 809, in encode_prompt
text_inputs = text_encoder.text2tokens(prompt,
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-HunyuanVideoWrapper\hyvideo\text_encoder\__init__.py", line 214, in text2tokens
raise ValueError("Only vision_languague models support image input")
ValueError: Only vision_languague models support image input
2025-01-18T16:34:10.737247 - end_vram - start_vram: 15143018744 - 132644984 = 150103737602025-01-18T16:34:10.737247 -
2025-01-18T16:34:10.737247 - #73 [HyVideoTextImageEncode]: 11.03s - vram 15010373760b2025-01-18T16:34:10.738243 -
2025-01-18T16:34:10.738243 - Prompt executed in 11.07 seconds
2025-01-18T16:34:42.696567 - got prompt
2025-01-18T16:34:43.844335 - Loading text encoder model (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:34:44.203174 - Text encoder to dtype: torch.float16
2025-01-18T16:34:44.338722 - Loading tokenizer (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:34:44.433482 - Loading text encoder model (llm) from: D:\ComfyUI-aki\models\LLM\llava-llama-3-8b-text-encoder-tokenizer
2025-01-18T16:35:13.222955 - Text encoder to dtype: torch.bfloat16
2025-01-18T16:35:17.056240 - Loading tokenizer (llm) from: D:\ComfyUI-aki\models\LLM\llava-llama-3-8b-text-encoder-tokenizer
2025-01-18T16:35:17.516114 - end_vram - start_vram: 15815648488 - 33554432 = 157820940562025-01-18T16:35:17.516114 -
2025-01-18T16:35:17.517111 - #71 [DownloadAndLoadHyVideoTextEncoder]: 33.67s - vram 15782094056b2025-01-18T16:35:17.517111 -
2025-01-18T16:35:17.547432 - !!! Exception during processing !!! Only vision_languague models support image input
2025-01-18T16:35:17.548429 - Traceback (most recent call last):
File "D:\ComfyUI-aki\execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 174, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\ComfyUI-aki\execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-HunyuanVideoWrapper\nodes.py", line 884, in process
prompt_embeds, negative_prompt_embeds, attention_mask, negative_attention_mask = encode_prompt(self,
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-HunyuanVideoWrapper\nodes.py", line 809, in encode_prompt
text_inputs = text_encoder.text2tokens(prompt,
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-HunyuanVideoWrapper\hyvideo\text_encoder\__init__.py", line 214, in text2tokens
raise ValueError("Only vision_languague models support image input")
ValueError: Only vision_languague models support image input
2025-01-18T16:35:17.548429 - end_vram - start_vram: 8310461608 - 8310461608 = 02025-01-18T16:35:17.548429 -
2025-01-18T16:35:17.549425 - #73 [HyVideoTextImageEncode]: 0.03s - vram 0b2025-01-18T16:35:17.549425 -
2025-01-18T16:35:17.550422 - Prompt executed in 34.84 seconds
2025-01-18T16:35:40.535099 - got prompt
2025-01-18T16:35:40.615029 - Loading text encoder model (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:35:41.104704 - Text encoder to dtype: torch.float16
2025-01-18T16:35:41.153541 - Loading tokenizer (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:35:41.241742 - Downloading model to: D:\ComfyUI-aki\models\LLM\llava-llama-3-8b-v1_1-transformers
2025-01-18T16:36:23.329035 - !!! Exception during processing !!! An error happened while trying to locate the files on the Hub and we cannot find the appropriate snapshot folder for the specified revision on the local disk. Please check your internet connection and try again.
2025-01-18T16:36:23.354970 - Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 174, in _new_conn
conn = connection.create_connection(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\connection.py", line 95, in create_connection
raise err
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\connection.py", line 85, in create_connection
sock.connect(sa)
TimeoutError: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 715, in urlopen
httplib_response = self._make_request(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 404, in _make_request
self._validate_conn(conn)
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 1058, in _validate_conn
conn.connect()
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 363, in connect
self.sock = conn = self._new_conn()
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 179, in _new_conn
raise ConnectTimeoutError(
urllib3.exceptions.ConnectTimeoutError: (<urllib3.connection.HTTPSConnection object at 0x0000027C8FEDA230>, 'Connection to huggingface.co timed out. (connect timeout=None)')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\requests\adapters.py", line 486, in send
resp = conn.urlopen(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 799, in urlopen
retries = retries.increment(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/xtuner/llava-llama-3-8b-v1_1-transformers/revision/main (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000027C8FEDA230>, 'Connection to huggingface.co timed out. (connect timeout=None)'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\_snapshot_download.py", line 155, in snapshot_download
repo_info = api.repo_info(repo_id=repo_id, repo_type=repo_type, revision=revision, token=token)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\hf_api.py", line 2748, in repo_info
return method(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\hf_api.py", line 2532, in model_info
r = get_session().get(path, headers=headers, timeout=timeout, params=params)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 602, in get
return self.request("GET", url, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_http.py", line 93, in send
return super().send(request, *args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\adapters.py", line 507, in send
raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/xtuner/llava-llama-3-8b-v1_1-transformers/revision/main (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000027C8FEDA230>, 'Connection to huggingface.co timed out. (connect timeout=None)'))"), '(Request ID: 5268a8c5-45a5-4306-b169-f592a6c9b03e)')
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "D:\ComfyUI-aki\execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 174, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\ComfyUI-aki\execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-HunyuanVideoWrapper\nodes.py", line 679, in loadmodel
snapshot_download(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\_snapshot_download.py", line 235, in snapshot_download
raise LocalEntryNotFoundError(
huggingface_hub.errors.LocalEntryNotFoundError: An error happened while trying to locate the files on the Hub and we cannot find the appropriate snapshot folder for the specified revision on the local disk. Please check your internet connection and try again.
2025-01-18T16:36:23.355969 - end_vram - start_vram: 279942248 - 33820672 = 2461215762025-01-18T16:36:23.355969 -
2025-01-18T16:36:23.356966 - #71 [DownloadAndLoadHyVideoTextEncoder]: 42.74s - vram 246121576b2025-01-18T16:36:23.356966 -
2025-01-18T16:36:23.357962 - Prompt executed in 42.81 seconds
2025-01-18T16:37:16.620766 - got prompt
2025-01-18T16:37:16.637217 - Loading text encoder model (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:37:16.768314 - Text encoder to dtype: torch.float16
2025-01-18T16:37:16.915821 - Loading tokenizer (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:37:16.961617 - Downloading model to: D:\ComfyUI-aki\models\LLM\llava-llama-3-8b-v1_1-transformers
2025-01-18T16:37:59.045429 - !!! Exception during processing !!! An error happened while trying to locate the files on the Hub and we cannot find the appropriate snapshot folder for the specified revision on the local disk. Please check your internet connection and try again.
2025-01-18T16:37:59.049354 - Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 174, in _new_conn
conn = connection.create_connection(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\connection.py", line 95, in create_connection
raise err
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\connection.py", line 85, in create_connection
sock.connect(sa)
TimeoutError: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 715, in urlopen
httplib_response = self._make_request(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 404, in _make_request
self._validate_conn(conn)
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 1058, in _validate_conn
conn.connect()
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 363, in connect
self.sock = conn = self._new_conn()
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 179, in _new_conn
raise ConnectTimeoutError(
urllib3.exceptions.ConnectTimeoutError: (<urllib3.connection.HTTPSConnection object at 0x0000027E918528C0>, 'Connection to huggingface.co timed out. (connect timeout=None)')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\requests\adapters.py", line 486, in send
resp = conn.urlopen(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 799, in urlopen
retries = retries.increment(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/xtuner/llava-llama-3-8b-v1_1-transformers/revision/main (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000027E918528C0>, 'Connection to huggingface.co timed out. (connect timeout=None)'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\_snapshot_download.py", line 155, in snapshot_download
repo_info = api.repo_info(repo_id=repo_id, repo_type=repo_type, revision=revision, token=token)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\hf_api.py", line 2748, in repo_info
return method(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\hf_api.py", line 2532, in model_info
r = get_session().get(path, headers=headers, timeout=timeout, params=params)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 602, in get
return self.request("GET", url, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_http.py", line 93, in send
return super().send(request, *args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\adapters.py", line 507, in send
raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/xtuner/llava-llama-3-8b-v1_1-transformers/revision/main (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000027E918528C0>, 'Connection to huggingface.co timed out. (connect timeout=None)'))"), '(Request ID: a5841598-6b00-430c-8aea-c44accaa4209)')
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "D:\ComfyUI-aki\execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 174, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\ComfyUI-aki\execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-HunyuanVideoWrapper\nodes.py", line 679, in loadmodel
snapshot_download(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\_snapshot_download.py", line 235, in snapshot_download
raise LocalEntryNotFoundError(
huggingface_hub.errors.LocalEntryNotFoundError: An error happened while trying to locate the files on the Hub and we cannot find the appropriate snapshot folder for the specified revision on the local disk. Please check your internet connection and try again.
2025-01-18T16:37:59.050349 - end_vram - start_vram: 279676008 - 33554432 = 2461215762025-01-18T16:37:59.051345 -
2025-01-18T16:37:59.052341 - #71 [DownloadAndLoadHyVideoTextEncoder]: 42.41s - vram 246121576b2025-01-18T16:37:59.052341 -
2025-01-18T16:37:59.052341 - Prompt executed in 42.42 seconds
2025-01-18T16:38:11.813446 - got prompt
2025-01-18T16:38:11.822416 - Loading text encoder model (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:38:11.949989 - Text encoder to dtype: torch.float16
2025-01-18T16:38:12.105468 - Loading tokenizer (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:38:12.151315 - Downloading model to: D:\ComfyUI-aki\models\LLM\llava-llama-3-8b-v1_1-transformers
2025-01-18T16:38:54.211796 - !!! Exception during processing !!! An error happened while trying to locate the files on the Hub and we cannot find the appropriate snapshot folder for the specified revision on the local disk. Please check your internet connection and try again.
2025-01-18T16:38:54.215776 - Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 174, in _new_conn
conn = connection.create_connection(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\connection.py", line 95, in create_connection
raise err
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\connection.py", line 85, in create_connection
sock.connect(sa)
TimeoutError: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 715, in urlopen
httplib_response = self._make_request(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 404, in _make_request
self._validate_conn(conn)
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 1058, in _validate_conn
conn.connect()
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 363, in connect
self.sock = conn = self._new_conn()
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 179, in _new_conn
raise ConnectTimeoutError(
urllib3.exceptions.ConnectTimeoutError: (<urllib3.connection.HTTPSConnection object at 0x0000027E91853E50>, 'Connection to huggingface.co timed out. (connect timeout=None)')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\requests\adapters.py", line 486, in send
resp = conn.urlopen(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 799, in urlopen
retries = retries.increment(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/xtuner/llava-llama-3-8b-v1_1-transformers/revision/main (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000027E91853E50>, 'Connection to huggingface.co timed out. (connect timeout=None)'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\_snapshot_download.py", line 155, in snapshot_download
repo_info = api.repo_info(repo_id=repo_id, repo_type=repo_type, revision=revision, token=token)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\hf_api.py", line 2748, in repo_info
return method(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\hf_api.py", line 2532, in model_info
r = get_session().get(path, headers=headers, timeout=timeout, params=params)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 602, in get
return self.request("GET", url, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_http.py", line 93, in send
return super().send(request, *args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\adapters.py", line 507, in send
raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/xtuner/llava-llama-3-8b-v1_1-transformers/revision/main (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000027E91853E50>, 'Connection to huggingface.co timed out. (connect timeout=None)'))"), '(Request ID: 2baa9458-dcf0-4c05-96b5-d1ed2a6cee8f)')
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "D:\ComfyUI-aki\execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 174, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\ComfyUI-aki\execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-HunyuanVideoWrapper\nodes.py", line 679, in loadmodel
snapshot_download(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\_snapshot_download.py", line 235, in snapshot_download
raise LocalEntryNotFoundError(
huggingface_hub.errors.LocalEntryNotFoundError: An error happened while trying to locate the files on the Hub and we cannot find the appropriate snapshot folder for the specified revision on the local disk. Please check your internet connection and try again.
2025-01-18T16:38:54.218767 - end_vram - start_vram: 279676008 - 33554432 = 2461215762025-01-18T16:38:54.218767 -
2025-01-18T16:38:54.219763 - #71 [DownloadAndLoadHyVideoTextEncoder]: 42.40s - vram 246121576b2025-01-18T16:38:54.220759 -
2025-01-18T16:38:54.221756 - Prompt executed in 42.40 seconds
2025-01-18T16:38:58.080121 - got prompt
2025-01-18T16:38:58.097064 - Loading text encoder model (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:38:58.231786 - Text encoder to dtype: torch.float16
2025-01-18T16:38:58.396297 - Loading tokenizer (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:38:58.443141 - Downloading model to: D:\ComfyUI-aki\models\LLM\llava-llama-3-8b-v1_1-transformers
2025-01-18T16:39:40.548241 - !!! Exception during processing !!! An error happened while trying to locate the files on the Hub and we cannot find the appropriate snapshot folder for the specified revision on the local disk. Please check your internet connection and try again.
2025-01-18T16:39:40.550234 - Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 174, in _new_conn
conn = connection.create_connection(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\connection.py", line 95, in create_connection
raise err
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\connection.py", line 85, in create_connection
sock.connect(sa)
TimeoutError: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 715, in urlopen
httplib_response = self._make_request(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 404, in _make_request
self._validate_conn(conn)
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 1058, in _validate_conn
conn.connect()
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 363, in connect
self.sock = conn = self._new_conn()
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 179, in _new_conn
raise ConnectTimeoutError(
urllib3.exceptions.ConnectTimeoutError: (<urllib3.connection.HTTPSConnection object at 0x0000027C8F2333A0>, 'Connection to huggingface.co timed out. (connect timeout=None)')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\requests\adapters.py", line 486, in send
resp = conn.urlopen(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 799, in urlopen
retries = retries.increment(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/xtuner/llava-llama-3-8b-v1_1-transformers/revision/main (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000027C8F2333A0>, 'Connection to huggingface.co timed out. (connect timeout=None)'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\_snapshot_download.py", line 155, in snapshot_download
repo_info = api.repo_info(repo_id=repo_id, repo_type=repo_type, revision=revision, token=token)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\hf_api.py", line 2748, in repo_info
return method(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\hf_api.py", line 2532, in model_info
r = get_session().get(path, headers=headers, timeout=timeout, params=params)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 602, in get
return self.request("GET", url, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_http.py", line 93, in send
return super().send(request, *args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\adapters.py", line 507, in send
raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/xtuner/llava-llama-3-8b-v1_1-transformers/revision/main (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000027C8F2333A0>, 'Connection to huggingface.co timed out. (connect timeout=None)'))"), '(Request ID: aabd8cb1-82c3-4f59-bbb7-6886cb566f4b)')
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "D:\ComfyUI-aki\execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 174, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\ComfyUI-aki\execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-HunyuanVideoWrapper\nodes.py", line 679, in loadmodel
snapshot_download(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\_snapshot_download.py", line 235, in snapshot_download
raise LocalEntryNotFoundError(
huggingface_hub.errors.LocalEntryNotFoundError: An error happened while trying to locate the files on the Hub and we cannot find the appropriate snapshot folder for the specified revision on the local disk. Please check your internet connection and try again.
2025-01-18T16:39:40.551232 - end_vram - start_vram: 279676008 - 33554432 = 2461215762025-01-18T16:39:40.551232 -
2025-01-18T16:39:40.553224 - #71 [DownloadAndLoadHyVideoTextEncoder]: 42.45s - vram 246121576b2025-01-18T16:39:40.553224 -
2025-01-18T16:39:40.554220 - Prompt executed in 42.46 seconds
2025-01-18T16:40:07.413078 - got prompt
2025-01-18T16:40:07.430022 - Loading text encoder model (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:40:07.561263 - Text encoder to dtype: torch.float16
2025-01-18T16:40:07.713306 - Loading tokenizer (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:40:07.760039 - Downloading model to: D:\ComfyUI-aki\models\LLM\llava-llama-3-8b-v1_1-transformers
2025-01-18T16:40:49.849375 - !!! Exception during processing !!! An error happened while trying to locate the files on the Hub and we cannot find the appropriate snapshot folder for the specified revision on the local disk. Please check your internet connection and try again.
2025-01-18T16:40:49.853359 - Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 174, in _new_conn
conn = connection.create_connection(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\connection.py", line 95, in create_connection
raise err
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\connection.py", line 85, in create_connection
sock.connect(sa)
TimeoutError: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 715, in urlopen
httplib_response = self._make_request(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 404, in _make_request
self._validate_conn(conn)
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 1058, in _validate_conn
conn.connect()
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 363, in connect
self.sock = conn = self._new_conn()
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 179, in _new_conn
raise ConnectTimeoutError(
urllib3.exceptions.ConnectTimeoutError: (<urllib3.connection.HTTPSConnection object at 0x0000027C8D7C1990>, 'Connection to huggingface.co timed out. (connect timeout=None)')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\requests\adapters.py", line 486, in send
resp = conn.urlopen(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 799, in urlopen
retries = retries.increment(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/xtuner/llava-llama-3-8b-v1_1-transformers/revision/main (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000027C8D7C1990>, 'Connection to huggingface.co timed out. (connect timeout=None)'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\_snapshot_download.py", line 155, in snapshot_download
repo_info = api.repo_info(repo_id=repo_id, repo_type=repo_type, revision=revision, token=token)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\hf_api.py", line 2748, in repo_info
return method(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\hf_api.py", line 2532, in model_info
r = get_session().get(path, headers=headers, timeout=timeout, params=params)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 602, in get
return self.request("GET", url, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_http.py", line 93, in send
return super().send(request, *args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\adapters.py", line 507, in send
raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/xtuner/llava-llama-3-8b-v1_1-transformers/revision/main (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000027C8D7C1990>, 'Connection to huggingface.co timed out. (connect timeout=None)'))"), '(Request ID: 69af78a5-b4e8-4dfc-9423-99f85d6251d6)')
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "D:\ComfyUI-aki\execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 174, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\ComfyUI-aki\execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-HunyuanVideoWrapper\nodes.py", line 679, in loadmodel
snapshot_download(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\_snapshot_download.py", line 235, in snapshot_download
raise LocalEntryNotFoundError(
huggingface_hub.errors.LocalEntryNotFoundError: An error happened while trying to locate the files on the Hub and we cannot find the appropriate snapshot folder for the specified revision on the local disk. Please check your internet connection and try again.
2025-01-18T16:40:49.855351 - end_vram - start_vram: 279676008 - 33554432 = 2461215762025-01-18T16:40:49.855351 -
2025-01-18T16:40:49.856346 - #71 [DownloadAndLoadHyVideoTextEncoder]: 42.43s - vram 246121576b2025-01-18T16:40:49.856346 -
2025-01-18T16:40:49.856346 - Prompt executed in 42.43 seconds
2025-01-18T16:41:28.057132 - got prompt
2025-01-18T16:41:28.073079 - Loading text encoder model (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:41:28.195669 - Text encoder to dtype: torch.float16
2025-01-18T16:41:28.347196 - Loading tokenizer (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:41:28.392046 - Downloading model to: D:\ComfyUI-aki\models\LLM\llava-llama-3-8b-v1_1-transformers
2025-01-18T16:42:10.501331 - !!! Exception during processing !!! An error happened while trying to locate the files on the Hub and we cannot find the appropriate snapshot folder for the specified revision on the local disk. Please check your internet connection and try again.
2025-01-18T16:42:10.501331 - Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 174, in _new_conn
conn = connection.create_connection(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\connection.py", line 95, in create_connection
raise err
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\connection.py", line 85, in create_connection
sock.connect(sa)
TimeoutError: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 715, in urlopen
httplib_response = self._make_request(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 404, in _make_request
self._validate_conn(conn)
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 1058, in _validate_conn
conn.connect()
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 363, in connect
self.sock = conn = self._new_conn()
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 179, in _new_conn
raise ConnectTimeoutError(
urllib3.exceptions.ConnectTimeoutError: (<urllib3.connection.HTTPSConnection object at 0x0000027E918506A0>, 'Connection to huggingface.co timed out. (connect timeout=None)')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\requests\adapters.py", line 486, in send
resp = conn.urlopen(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 799, in urlopen
retries = retries.increment(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/xtuner/llava-llama-3-8b-v1_1-transformers/revision/main (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000027E918506A0>, 'Connection to huggingface.co timed out. (connect timeout=None)'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\_snapshot_download.py", line 155, in snapshot_download
repo_info = api.repo_info(repo_id=repo_id, repo_type=repo_type, revision=revision, token=token)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\hf_api.py", line 2748, in repo_info
return method(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\hf_api.py", line 2532, in model_info
r = get_session().get(path, headers=headers, timeout=timeout, params=params)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 602, in get
return self.request("GET", url, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_http.py", line 93, in send
return super().send(request, *args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\adapters.py", line 507, in send
raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/xtuner/llava-llama-3-8b-v1_1-transformers/revision/main (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000027E918506A0>, 'Connection to huggingface.co timed out. (connect timeout=None)'))"), '(Request ID: 72e07388-bf38-4e55-a831-e86d4e7f22cf)')
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "D:\ComfyUI-aki\execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 174, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\ComfyUI-aki\execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-HunyuanVideoWrapper\nodes.py", line 679, in loadmodel
snapshot_download(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\_snapshot_download.py", line 235, in snapshot_download
raise LocalEntryNotFoundError(
huggingface_hub.errors.LocalEntryNotFoundError: An error happened while trying to locate the files on the Hub and we cannot find the appropriate snapshot folder for the specified revision on the local disk. Please check your internet connection and try again.
2025-01-18T16:42:10.501331 - end_vram - start_vram: 279676008 - 33554432 = 2461215762025-01-18T16:42:10.501331 -
2025-01-18T16:42:10.501331 - #71 [DownloadAndLoadHyVideoTextEncoder]: 42.43s - vram 246121576b2025-01-18T16:42:10.501331 -
2025-01-18T16:42:10.509621 - Prompt executed in 42.45 seconds
2025-01-18T16:44:38.864193 - got prompt
2025-01-18T16:44:40.104673 - end_vram - start_vram: 526512038 - 33554432 = 4929576062025-01-18T16:44:40.104673 -
2025-01-18T16:44:40.104673 - #7 [HyVideoVAELoader]: 1.20s - vram 492957606b2025-01-18T16:44:40.105174 -
2025-01-18T16:44:40.105174 - Loading text encoder model (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:44:40.203992 - Text encoder to dtype: torch.float16
2025-01-18T16:44:40.253147 - Loading tokenizer (clipL) from: D:\ComfyUI-aki\models\clip\clip-vit-large-patch14
2025-01-18T16:44:40.299812 - Loading text encoder model (llm) from: D:\ComfyUI-aki\models\LLM\llava-llama-3-8b-text-encoder-tokenizer
2025-01-18T16:45:03.669140 - Text encoder to dtype: torch.bfloat16
2025-01-18T16:45:07.559301 - Loading tokenizer (llm) from: D:\ComfyUI-aki\models\LLM\llava-llama-3-8b-text-encoder-tokenizer
2025-01-18T16:45:07.961666 - end_vram - start_vram: 15783007374 - 526512038 = 152564953362025-01-18T16:45:07.962665 -
2025-01-18T16:45:07.962665 - #71 [DownloadAndLoadHyVideoTextEncoder]: 27.86s - vram 15256495336b2025-01-18T16:45:07.962665 -
2025-01-18T16:45:07.994952 - !!! Exception during processing !!! Only vision_languague models support image input
2025-01-18T16:45:07.996380 - Traceback (most recent call last):
File "D:\ComfyUI-aki\execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 174, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\ComfyUI-aki\execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-HunyuanVideoWrapper\nodes.py", line 884, in process
prompt_embeds, negative_prompt_embeds, attention_mask, negative_attention_mask = encode_prompt(self,
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-HunyuanVideoWrapper\nodes.py", line 809, in encode_prompt
text_inputs = text_encoder.text2tokens(prompt,
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-HunyuanVideoWrapper\hyvideo\text_encoder\__init__.py", line 214, in text2tokens
raise ValueError("Only vision_languague models support image input")
ValueError: Only vision_languague models support image input
2025-01-18T16:45:07.996882 - end_vram - start_vram: 15783007374 - 15783007374 = 02025-01-18T16:45:07.996882 -
2025-01-18T16:45:07.996882 - #73 [HyVideoTextImageEncode]: 0.03s - vram 0b2025-01-18T16:45:07.996882 -
2025-01-18T16:45:07.998725 - Prompt executed in 29.12 seconds
2025-01-18T16:47:59.855423 - []2025-01-18T16:47:59.855423 -
2025-01-18T16:47:59.855423 - []2025-01-18T16:47:59.855423 -
2025-01-18T16:48:03.887783 - FETCH DATA from: D:\ComfyUI-aki\custom_nodes\ComfyUI-Manager\extension-node-map.json2025-01-18T16:48:03.887783 - 2025-01-18T16:48:03.890773 - [DONE]2025-01-18T16:48:03.890773 -
2025-01-18T16:48:03.900740 - [LogConsole] client [aff6b8253b3f46dca0732b5fb250de4f], console [84dcd3cb-ab67-4acb-a3d6-c98d279a2440], disconnected2025-01-18T16:48:03.900740 -
2025-01-18T16:48:04.795427 - []2025-01-18T16:48:04.795427 -
2025-01-18T16:48:04.795427 - []2025-01-18T16:48:04.795427 -
2025-01-18T16:48:06.882374 - [LogConsole] client [aff6b8253b3f46dca0732b5fb250de4f], console [c3e0101a-9d12-4785-9fb3-1b95178049b7], connected2025-01-18T16:48:06.882374 -
2025-01-18T16:49:01.899793 - got prompt
2025-01-18T16:49:01.980523 - Using pytorch attention in VAE
2025-01-18T16:49:01.985506 - Using pytorch attention in VAE
2025-01-18T16:49:02.191815 - VAE load device: cuda:0, offload device: cpu, dtype: torch.bfloat16
2025-01-18T16:49:02.209755 - end_vram - start_vram: 33554432 - 33554432 = 02025-01-18T16:49:02.209755 -
2025-01-18T16:49:02.209755 - #10 [VAELoader]: 0.25s - vram 0b2025-01-18T16:49:02.209755 -
2025-01-18T16:49:02.211749 - end_vram - start_vram: 33554432 - 33554432 = 02025-01-18T16:49:02.211749 -
2025-01-18T16:49:02.211749 - #69 [EmptyLatentImage]: 0.00s - vram 0b2025-01-18T16:49:02.211749 -
2025-01-18T16:49:02.394138 - Requested to load FluxClipModel_
2025-01-18T16:49:02.526747 - loaded completely 9.5367431640625e+25 4777.53759765625 True
2025-01-18T16:49:02.536713 - CLIP/text encoder model load device: cuda:0, offload device: cpu, current: cuda:0, dtype: torch.float16
2025-01-18T16:49:05.906339 - clip missing: ['text_projection.weight']
2025-01-18T16:49:06.068794 - end_vram - start_vram: 5043165704 - 33554432 = 50096112722025-01-18T16:49:06.068794 -
2025-01-18T16:49:06.068794 - #64 [DualCLIPLoader]: 3.86s - vram 5009611272b2025-01-18T16:49:06.068794 -
2025-01-18T16:49:06.736643 - end_vram - start_vram: 5573747208 - 5043165704 = 5305815042025-01-18T16:49:06.736643 -
2025-01-18T16:49:06.737639 - #6 [CLIPTextEncode]: 0.67s - vram 530581504b2025-01-18T16:49:06.737639 -
2025-01-18T16:49:06.738636 - end_vram - start_vram: 5043165704 - 5043165704 = 02025-01-18T16:49:06.738636 -
2025-01-18T16:49:06.738636 - #68 [ConditioningZeroOut]: 0.00s - vram 0b2025-01-18T16:49:06.738636 -
2025-01-18T16:49:06.739633 - end_vram - start_vram: 5043165704 - 5043165704 = 02025-01-18T16:49:06.739633 -
2025-01-18T16:49:06.739633 - #26 [FluxGuidance]: 0.00s - vram 0b2025-01-18T16:49:06.739633 -
2025-01-18T16:49:06.750597 - end_vram - start_vram: 5043165704 - 5043165704 = 02025-01-18T16:49:06.750597 -
2025-01-18T16:49:06.750597 - #71 [LoadImage]: 0.01s - vram 0b2025-01-18T16:49:06.750597 -
2025-01-18T16:49:07.198097 - Applied providers: ['CUDAExecutionProvider', 'CPUExecutionProvider'], with options: {'CUDAExecutionProvider': {'device_id': '0', 'has_user_compute_stream': '0', 'cudnn_conv1d_pad_to_nc1d': '0', 'user_compute_stream': '0', 'gpu_external_alloc': '0', 'gpu_mem_limit': '18446744073709551615', 'enable_cuda_graph': '0', 'gpu_external_free': '0', 'gpu_external_empty_cache': '0', 'arena_extend_strategy': 'kNextPowerOfTwo', 'cudnn_conv_algo_search': 'EXHAUSTIVE', 'do_copy_in_default_stream': '1', 'cudnn_conv_use_max_workspace': '1', 'tunable_op_enable': '0', 'tunable_op_tuning_enable': '0', 'tunable_op_max_tuning_duration_ms': '0', 'enable_skip_layer_norm_strict_mode': '0', 'prefer_nhwc': '0', 'use_ep_level_unified_stream': '0', 'use_tf32': '1', 'sdpa_kernel': '0', 'fuse_conv_bias': '0'}, 'CPUExecutionProvider': {}}
(The identical "Applied providers" line is logged again before each of the remaining antelopev2 models below.)
2025-01-18T16:49:07.270854 - find model: D:\ComfyUI-aki\models\insightface\models\antelopev2\1k3d68.onnx landmark_3d_68 ['None', 3, 192, 192] 0.0 1.0
2025-01-18T16:49:07.305737 - find model: D:\ComfyUI-aki\models\insightface\models\antelopev2\2d106det.onnx landmark_2d_106 ['None', 3, 192, 192] 0.0 1.0
2025-01-18T16:49:07.355570 - find model: D:\ComfyUI-aki\models\insightface\models\antelopev2\genderage.onnx genderage ['None', 3, 96, 96] 0.0 1.0
2025-01-18T16:49:07.704455 - find model: D:\ComfyUI-aki\models\insightface\models\antelopev2\glintr100.onnx recognition ['None', 3, 112, 112] 127.5 127.5
2025-01-18T16:49:07.753290 - find model: D:\ComfyUI-aki\models\insightface\models\antelopev2\scrfd_10g_bnkps.onnx detection [1, 3, '?', '?'] 127.5 128.0
2025-01-18T16:49:07.754287 - set det-size: (640, 640)
2025-01-18T16:49:07.754287 - end_vram - start_vram: 5043165704 - 5043165704 = 0
2025-01-18T16:49:07.754287 - #53 [PulidFluxInsightFaceLoader]: 1.00s - vram 0b
2025-01-18T16:49:07.755285 - Loaded EVA02-CLIP-L-14-336 model config.
2025-01-18T16:49:07.767244 - Shape of rope freq: torch.Size([576, 64])
2025-01-18T16:49:31.469635 - !!! Exception during processing !!! An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
2025-01-18T16:49:31.474127 - Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 174, in _new_conn
conn = connection.create_connection(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\connection.py", line 95, in create_connection
raise err
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\connection.py", line 85, in create_connection
sock.connect(sa)
TimeoutError: timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 715, in urlopen
httplib_response = self._make_request(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 404, in _make_request
self._validate_conn(conn)
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 1058, in _validate_conn
conn.connect()
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 363, in connect
self.sock = conn = self._new_conn()
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connection.py", line 179, in _new_conn
raise ConnectTimeoutError(
urllib3.exceptions.ConnectTimeoutError: (<urllib3.connection.HTTPSConnection object at 0x0000027C9058BD30>, 'Connection to huggingface.co timed out. (connect timeout=10)')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\requests\adapters.py", line 486, in send
resp = conn.urlopen(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\connectionpool.py", line 799, in urlopen
retries = retries.increment(
File "D:\ComfyUI-aki\python\lib\site-packages\urllib3\util\retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /QuanSun/EVA-CLIP/resolve/main/EVA02_CLIP_L_336_psz14_s6B.pt (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000027C9058BD30>, 'Connection to huggingface.co timed out. (connect timeout=10)'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\file_download.py", line 1374, in _get_metadata_or_catch_error
metadata = get_hf_file_metadata(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\file_download.py", line 1294, in get_hf_file_metadata
r = _request_wrapper(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\file_download.py", line 278, in _request_wrapper
response = _request_wrapper(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\file_download.py", line 301, in _request_wrapper
response = get_session().request(method=method, url=url, **params)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_http.py", line 93, in send
return super().send(request, *args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\requests\adapters.py", line 507, in send
raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /QuanSun/EVA-CLIP/resolve/main/EVA02_CLIP_L_336_psz14_s6B.pt (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x0000027C9058BD30>, 'Connection to huggingface.co timed out. (connect timeout=10)'))"), '(Request ID: b5a6d900-3630-40a5-ada4-5aadf915a12f)')
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "D:\ComfyUI-aki\execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "D:\ComfyUI-aki\execution.py", line 170, in _map_node_over_list
process_inputs({})
File "D:\ComfyUI-aki\execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-PuLID-Flux-Enhanced\pulidflux.py", line 298, in load_eva_clip
model, _, _ = create_model_and_transforms('EVA02-CLIP-L-14-336', 'eva_clip', force_custom_clip=True)
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-PuLID-Flux-Enhanced\eva_clip\factory.py", line 377, in create_model_and_transforms
model = create_model(
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-PuLID-Flux-Enhanced\eva_clip\factory.py", line 279, in create_model
checkpoint_path = download_pretrained(pretrained_cfg, cache_dir=cache_dir)
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-PuLID-Flux-Enhanced\eva_clip\pretrained.py", line 328, in download_pretrained
target = download_pretrained_from_hf(model_id, filename=filename, cache_dir=cache_dir)
File "D:\ComfyUI-aki\custom_nodes\ComfyUI-PuLID-Flux-Enhanced\eva_clip\pretrained.py", line 300, in download_pretrained_from_hf
cached_file = hf_hub_download(model_id, filename, revision=revision, cache_dir=cache_dir)
File "<enhanced_experience patches.hfmirror.huggingface_hub>", line 47, in hf_hub_download_wrapper_inner
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
return fn(*args, **kwargs)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\file_download.py", line 860, in hf_hub_download
return _hf_hub_download_to_cache_dir(
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\file_download.py", line 967, in _hf_hub_download_to_cache_dir
_raise_on_head_call_error(head_call_error, force_download, local_files_only)
File "D:\ComfyUI-aki\python\lib\site-packages\huggingface_hub\file_download.py", line 1485, in _raise_on_head_call_error
raise LocalEntryNotFoundError(
huggingface_hub.errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
2025-01-18T16:49:31.474127 - end_vram - start_vram: 5043165704 - 5043165704 = 0
2025-01-18T16:49:31.474127 - #51 [PulidFluxEvaClipLoader]: 23.72s - vram 0b
2025-01-18T16:49:31.475124 - Prompt executed in 29.57 seconds
2025-01-18T16:50:54.268442 - got prompt
2025-01-18T16:51:18.151482 - Prompt executed in 23.88 seconds
2025-01-18T16:51:58.620672 - got prompt
2025-01-18T16:52:21.734822 - Prompt executed in 23.11 seconds
(Both retries failed in #51 PulidFluxEvaClipLoader with the same LocalEntryNotFoundError traceback as the first run above.)
Attached Workflow
Please make sure that workflow does not contain any sensitive information such as API keys or passwords.
{"last_node_id":77,"last_link_id":150,"nodes":[{"id":68,"type":"ConditioningZeroOut","pos":[1335.692626953125,460.7344055175781],"size":[210,26],"flags":{},"order":11,"mode":0,"inputs":[{"name":"conditioning","type":"CONDITIONING","link":137,"label":"条件"}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[140],"slot_index":0,"label":"条件"}],"properties":{"Node name for S&R":"ConditioningZeroOut"},"widgets_values":[]},{"id":49,"type":"VAEDecode","pos":[2001.45556640625,11.220564842224121],"size":[210,46],"flags":{},"order":14,"mode":0,"inputs":[{"name":"samples","type":"LATENT","link":139,"label":"Latent"},{"name":"vae","type":"VAE","link":88,"label":"VAE"}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[89],"slot_index":0,"shape":3,"label":"图像"}],"properties":{"Node name for S&R":"VAEDecode"},"widgets_values":[]},{"id":50,"type":"PreviewImage","pos":[2263.08447265625,3.3247292041778564],"size":[268.9175109863281,295.81024169921875],"flags":{},"order":15,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":89,"label":"图像"}],"outputs":[],"properties":{"Node name for S&R":"PreviewImage"},"widgets_values":[]},{"id":26,"type":"FluxGuidance","pos":[1259.2752685546875,336.5177917480469],"size":[317.4000244140625,58],"flags":{"collapsed":false},"order":10,"mode":0,"inputs":[{"name":"conditioning","type":"CONDITIONING","link":41,"label":"条件"}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[136],"slot_index":0,"shape":3,"label":"条件"}],"properties":{"Node name for S&R":"FluxGuidance"},"widgets_values":[4]},{"id":69,"type":"EmptyLatentImage","pos":[1258.3369140625,560.9191284179688],"size":[315,106],"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"LATENT","type":"LATENT","links":[141],"label":"Latent"}],"properties":{"Node name for S&R":"EmptyLatentImage"},"widgets_values":[1024,1024,1]},{"id":71,"type":"LoadImage","pos":[550.8306274414062,-119.77510833740234],"size":[296.17181396484375,314],"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[144],"slot_index":0,"label":"图像"},{"name":"MASK","type":"MASK","links":null,"label":"遮罩"}],"properties":{"Node name for S&R":"LoadImage"},"widgets_values":["einstein.jpg","image"]},{"id":45,"type":"PulidFluxModelLoader","pos":[915.3274536132812,-117.59740447998047],"size":[315,58],"flags":{},"order":2,"mode":0,"inputs":[],"outputs":[{"name":"PULIDFLUX","type":"PULIDFLUX","links":[125],"slot_index":0,"shape":3,"label":"PULIDFLUX"}],"properties":{"Node name for S&R":"PulidFluxModelLoader"},"widgets_values":["pulid_flux_v0.9.1.safetensors"]},{"id":6,"type":"CLIPTextEncode","pos":[903.1098022460938,407.4521484375],"size":[330.1904602050781,105.4501724243164],"flags":{},"order":8,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":132,"label":"CLIP"}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[41,137],"slot_index":0,"label":"条件"}],"properties":{"Node name for S&R":"CLIPTextEncode"},"widgets_values":["portrait, color, cinematic",[false,true]]},{"id":64,"type":"DualCLIPLoader","pos":[547.044921875,402.2887268066406],"size":[315,106],"flags":{},"order":3,"mode":0,"inputs":[],"outputs":[{"name":"CLIP","type":"CLIP","links":[132],"slot_index":0,"shape":3,"label":"CLIP"}],"properties":{"Node name for 
S&R":"DualCLIPLoader"},"widgets_values":["t5xxl_fp8_e4m3fn.safetensors","clip_l.safetensors","flux","default"]},{"id":70,"type":"ApplyFBCacheOnModel","pos":[911.3635864257812,239.325927734375],"size":[315,82],"flags":{},"order":9,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":142,"label":"model"}],"outputs":[{"name":"MODEL","type":"MODEL","links":[143],"slot_index":0,"label":"MODEL"}],"properties":{"Node name for S&R":"ApplyFBCacheOnModel"},"widgets_values":["diffusion_model",0.1]},{"id":10,"type":"VAELoader","pos":[1653.165771484375,486.8099670410156],"size":[311.81634521484375,60.429901123046875],"flags":{},"order":4,"mode":0,"inputs":[],"outputs":[{"name":"VAE","type":"VAE","links":[88],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"VAELoader"},"widgets_values":["ae.safetensors"]},{"id":62,"type":"ApplyPulidFlux","pos":[1268.993408203125,-110.4439697265625],"size":[315,346],"flags":{},"order":12,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":143,"label":"model"},{"name":"pulid_flux","type":"PULIDFLUX","link":125,"label":"pulid_flux"},{"name":"eva_clip","type":"EVA_CLIP","link":123,"label":"eva_clip"},{"name":"face_analysis","type":"FACEANALYSIS","link":124,"label":"face_analysis"},{"name":"image","type":"IMAGE","link":144,"label":"image"},{"name":"attn_mask","type":"MASK","link":null,"shape":7,"label":"attn_mask"},{"name":"prior_image","type":"IMAGE","link":null,"shape":7,"label":"prior_image"}],"outputs":[{"name":"MODEL","type":"MODEL","links":[135],"slot_index":0,"shape":3,"label":"MODEL"}],"properties":{"Node name for S&R":"ApplyPulidFlux"},"widgets_values":[1,0,1,"mean",1,0,1000,true]},{"id":63,"type":"UNETLoader","pos":[539.7626953125,237.74021911621094],"size":[315,82],"flags":{},"order":5,"mode":0,"inputs":[],"outputs":[{"name":"MODEL","type":"MODEL","links":[142],"slot_index":0,"shape":3,"label":"模型"}],"properties":{"Node name for S&R":"UNETLoader"},"widgets_values":["Kijai-flux1-dev-fp8.safetensors","fp8_e4m3fn_fast"]},{"id":67,"type":"KSampler","pos":[1640.312255859375,4.239409446716309],"size":[315,262],"flags":{},"order":13,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":135,"label":"模型"},{"name":"positive","type":"CONDITIONING","link":136,"label":"正面条件"},{"name":"negative","type":"CONDITIONING","link":140,"label":"负面条件"},{"name":"latent_image","type":"LATENT","link":141,"label":"Latent"}],"outputs":[{"name":"LATENT","type":"LATENT","links":[139],"slot_index":0,"label":"Latent"}],"properties":{"Node name for S&R":"KSampler"},"widgets_values":[594971898004859,"randomize",25,1,"euler","beta",1]},{"id":53,"type":"PulidFluxInsightFaceLoader","pos":[910.3726196289062,104.22013092041016],"size":[315,58],"flags":{},"order":6,"mode":0,"inputs":[],"outputs":[{"name":"FACEANALYSIS","type":"FACEANALYSIS","links":[124],"slot_index":0,"shape":3,"label":"FACEANALYSIS"}],"properties":{"Node name for S&R":"PulidFluxInsightFaceLoader"},"widgets_values":["CUDA"]},{"id":51,"type":"PulidFluxEvaClipLoader","pos":[917.8700561523438,-6.382608413696289],"size":[315,26],"flags":{},"order":7,"mode":0,"inputs":[],"outputs":[{"name":"EVA_CLIP","type":"EVA_CLIP","links":[123],"slot_index":0,"shape":3,"label":"EVA_CLIP"}],"properties":{"Node name for 
S&R":"PulidFluxEvaClipLoader"},"widgets_values":[]}],"links":[[41,6,0,26,0,"CONDITIONING"],[88,10,0,49,1,"VAE"],[89,49,0,50,0,"IMAGE"],[123,51,0,62,2,"EVA_CLIP"],[124,53,0,62,3,"FACEANALYSIS"],[125,45,0,62,1,"PULIDFLUX"],[132,64,0,6,0,"CLIP"],[135,62,0,67,0,"MODEL"],[136,26,0,67,1,"CONDITIONING"],[137,6,0,68,0,"CONDITIONING"],[139,67,0,49,0,"LATENT"],[140,68,0,67,2,"CONDITIONING"],[141,69,0,67,3,"LATENT"],[142,63,0,70,0,"MODEL"],[143,70,0,62,0,"MODEL"],[144,71,0,62,4,"IMAGE"]],"groups":[],"config":{},"extra":{"ds":{"scale":0.9849732675807809,"offset":[-657.8599530480627,212.35522245757159]},"node_versions":{"comfy-core":"0.3.12","ComfyUI-PuLID-Flux-Enhanced":"afc9705e11ec9095bfbca96e6c1c9b06835cec82","wavespeed":"1.0.9"},"ue_links":[]},"version":0.4}
Additional Context
(Please add any additional context or steps to reproduce the error here)
Thanks for the support. In my case the model was placed in the wrong location and my connection to the external network (VPN) was not turned on.
Add a single command set HF_ENDPOINT=https://hf-mirror.com to the startup batch file. It seems to work in my case, but unfortunately I don't have a screenshot. You can try this to redirect all the default "huggingface.co" downloads to "hf-mirror.com".
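If you go the batch-file route, a quick way to confirm the variable actually took effect is to run ComfyUI's bundled Python in the same console after the set command and print the endpoint that huggingface_hub resolved. This is only a sketch, assuming huggingface_hub is installed in that environment; the variable has to be set before huggingface_hub is imported, because the endpoint is read once at import time.

import os
print(os.environ.get("HF_ENDPOINT"))   # expect: https://hf-mirror.com

from huggingface_hub import constants
print(constants.ENDPOINT)              # expect: https://hf-mirror.com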
https://github.com/YanWenKun/ComfyUI-Windows-Portable/releases After switching to this portable package, I copied everything from the original HuggingFaceHub folder, plus llava-llama-3-8b-text-encoder-tokenizer from ComfyUI_Windows_portable\ComfyUI\models\LLM, into the corresponding locations, and the problem was solved. I am on Windows, so I don't know whether this will work for you.
Forgot to mention the most important point: this package can force node/plugin updates to go through hf-mirror.
In my case the node packages simply failed to update because I had forgotten to turn on my VPN; once it was on, everything worked.
Could you explain that in more detail?
I have my VPN on and still cannot connect. I don't know why.
Find the main.py file in the comfyui folder, edit it as shown below by adding one line, and restart; that solves the problem.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"
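For reference, this is roughly what the edit looks like near the top of main.py. It is only a sketch: the surrounding lines differ between ComfyUI versions, and the important part is simply that the assignment runs before anything imports huggingface_hub.

# main.py (near the top, before other imports that may pull in huggingface_hub)
import os
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"
# ... the rest of ComfyUI's original main.py continues unchanged ...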
Rather than modifying the code, it is more appropriate to modify the environment variable.
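If you do end up editing main.py anyway, a slightly gentler variant (my suggestion, not something posted in this thread) is to use setdefault, so an HF_ENDPOINT value set in the system or startup environment still takes priority over the hard-coded mirror:

import os
# Only fall back to the mirror when HF_ENDPOINT was not already set outside ComfyUI
os.environ.setdefault("HF_ENDPOINT", "https://hf-mirror.com")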
An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on. What is causing this problem?
This works.
My environment variable has always been set, but I still got the error. His method solved the problem.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com" works!
PulidFluxEvaClipLoader An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
I still haven't solved it.
Thank you! But it still doesn't seem to work for me; the main problem is that GitHub cannot be accessed.
It works!