can not found table name: breast_hetero_host namespace: experiment
In HDFS + Spark + RabbitMQ mode, submitting a job fails with:

```
[ERROR] [2023-09-14 03:46:00,152] [202309140345549528510] [38577:140051551999808] - [task_executor.run] [line:266]: can not found table name: breast_hetero_host namespace: experiment
Traceback (most recent call last):
  File "/data/projects/fate/fateflow/python/fate_flow/worker/task_executor.py", line 210, in run
    cpn_output = run_object.run(cpn_input)
  File "/data/projects/fate/fateflow/python/fate_flow/components/_base.py", line 156, in run
    self._run(cpn_input=cpn_input)
  File "/data/projects/fate/fateflow/python/fate_flow/components/reader.py", line 86, in _run
    ) = self.convert_check(
  File "/data/projects/fate/fateflow/python/fate_flow/components/reader.py", line 202, in convert_check
    return data_utils.convert_output(input_name, input_namespace, output_name, output_namespace, computing_engine,
  File "/data/projects/fate/fateflow/python/fate_flow/utils/data_utils.py", line 71, in convert_output
    raise RuntimeError(
RuntimeError: can not found table name: breast_hetero_host namespace: experiment
```
In other words, in fate_arch -> storage -> _table.py, the `StorageTableMeta` class's `__new__` method returns `None`, even though both `name` and `namespace` were supplied when the job was submitted. I suspected a Hadoop problem, but the nodes are healthy and Spark is running normally.
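For context, the failure boils down to a metadata lookup that comes back empty: `StorageTableMeta.__new__` returns `None` when no metadata row exists for the `(name, namespace)` pair, and the Reader then raises. A minimal sketch of that pattern (not FATE's actual code; the in-memory `_meta_store` dict is a stand-in for FATE's metadata database):

```python
# Illustration of the failing lookup pattern: a table-meta registry whose
# constructor yields None when (name, namespace) has no committed metadata.
# The class dict below is a stand-in for FATE's metadata database.
class StorageTableMeta:
    _meta_store = {}  # (namespace, name) -> meta dict

    def __new__(cls, name, namespace):
        # Like FATE's StorageTableMeta: no metadata row -> return None.
        if (namespace, name) not in cls._meta_store:
            return None
        obj = super().__new__(cls)
        obj.name, obj.namespace = name, namespace
        return obj


def get_table_or_raise(name, namespace):
    meta = StorageTableMeta(name, namespace)
    if meta is None:
        # Mirrors the RuntimeError raised in data_utils.convert_output.
        raise RuntimeError(
            f"can not found table name: {name} namespace: {namespace}"
        )
    return meta
```

So the error does not necessarily mean HDFS is broken; it means the metadata database has no entry for that table yet.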
Has this data already been uploaded with the `upload` command?
Yes, it has.
The uploaded data is also visible in Hadoop.
Additionally, if `dependent_distribution` in `service_conf.yaml` is set to `True`, the following errors occur:

```
[ERROR] [2023-09-18 00:58:34,084] [202309180058318736560] [81749:140152811226944] - [base_worker.run] [line:144]: 'NoneType' object has no attribute 'f_path'
Traceback (most recent call last):
  File "/data/projects/fate/fateflow/python/fate_flow/worker/base_worker.py", line 142, in run
    result = self._run()
  File "/data/projects/fate/fateflow/python/fate_flow/worker/dependence_upload.py", line 60, in _run
    self.upload_dependencies_to_hadoop(provider=provider, dependence_type=dependence_type)
  File "/data/projects/fate/fateflow/python/fate_flow/worker/dependence_upload.py", line 52, in _wrapper
    raise e
  File "/data/projects/fate/fateflow/python/fate_flow/worker/dependence_upload.py", line 40, in _wrapper
    return func(*args, **kwargs)
  File "/data/projects/fate/fateflow/python/fate_flow/worker/dependence_upload.py", line 94, in upload_dependencies_to_hadoop
    source_path = ComponentProviderInfo.get_or_none(
AttributeError: 'NoneType' object has no attribute 'f_path'
```

```
[ERROR] [2023-09-18 00:58:36,039] [202309180058318736560] [81764:140346086664000] - [base_worker.run] [line:144]: Destination path '/data/projects/fate/fateflow/version_dependencies/1.11.1/fate_code/conf' already exists
Traceback (most recent call last):
  File "/data/projects/fate/fateflow/python/fate_flow/worker/base_worker.py", line 142, in run
    result = self._run()
  File "/data/projects/fate/fateflow/python/fate_flow/worker/dependence_upload.py", line 60, in _run
    self.upload_dependencies_to_hadoop(provider=provider, dependence_type=dependence_type)
  File "/data/projects/fate/fateflow/python/fate_flow/worker/dependence_upload.py", line 52, in _wrapper
    raise e
  File "/data/projects/fate/fateflow/python/fate_flow/worker/dependence_upload.py", line 40, in _wrapper
    return func(*args, **kwargs)
  File "/data/projects/fate/fateflow/python/fate_flow/worker/dependence_upload.py", line 90, in upload_dependencies_to_hadoop
    cls.move_dir(os.path.join(python_base_dir, key), os.path.dirname(fate_code_base_dir))
  File "/data/projects/fate/fateflow/python/fate_flow/worker/dependence_upload.py", line 157, in move_dir
    shutil.move(source_path, target_path)
  File "/root/.pyenv/versions/3.8.11/lib/python3.8/shutil.py", line 789, in move
    raise Error("Destination path '%s' already exists" % real_dst)
shutil.Error: Destination path '/data/projects/fate/fateflow/version_dependencies/1.11.1/fate_code/conf' already exists
```
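The second traceback is standard `shutil` behavior rather than anything FATE-specific: `shutil.move` raises `shutil.Error` when a same-named directory already exists under the destination, which happens here because `version_dependencies/.../fate_code/conf` was left behind by a previous run. A common workaround (a sketch, not FATE's official fix) is to clear the stale target before moving:

```python
import os
import shutil


def move_dir_overwrite(source_path, target_path):
    """Move source_path into target_path, replacing any stale copy.

    shutil.move raises shutil.Error when moving a directory into a
    destination that already contains a same-named directory, which is
    exactly the "Destination path ... already exists" failure above.
    """
    final_dst = os.path.join(
        target_path, os.path.basename(source_path.rstrip(os.sep))
    )
    if os.path.isdir(final_dst):
        shutil.rmtree(final_dst)  # drop the leftover from a previous run
    shutil.move(source_path, target_path)
```

Manually deleting the leftover `version_dependencies/1.11.1/fate_code` directory before retrying should have the same effect.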
The error "RuntimeError: can not found table name: breast_hetero_host namespace: experiment" may occur because `upload` is asynchronous: the upload job may not have actually finished before this job was submitted. You can check whether the table exists with `flow table info -t breast_hetero_host -n experiment` (if the upload succeeded, it returns the table's information).
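The check above can be automated so a job is only submitted once the table is queryable. A minimal sketch, assuming the usual FATE Flow CLI response shape (a JSON body with `retcode` 0 and a `data` section on success; verify against your version):

```python
import json
import subprocess


def parse_table_info(raw_json):
    """Decide from a `flow table info` JSON response whether the table exists.

    Assumes the response carries retcode == 0 plus a non-empty `data`
    section once the upload job has committed the table's metadata.
    """
    try:
        resp = json.loads(raw_json)
    except json.JSONDecodeError:
        return False
    return resp.get("retcode") == 0 and bool(resp.get("data"))


def table_exists(name, namespace):
    # Shell out to the FATE Flow CLI and parse its stdout.
    out = subprocess.run(
        ["flow", "table", "info", "-t", name, "-n", namespace],
        capture_output=True, text=True,
    ).stdout
    return parse_table_info(out)
```

Polling `table_exists("breast_hetero_host", "experiment")` until it returns `True` before submitting the training job avoids the race with the asynchronous upload.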