Hi,
I am getting the following error when I run ppotrain.py, and I don't know how to resolve it:
/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/autoscaler/_private/cli_logger.py:57: FutureWarning: Not all Ray CLI dependencies were found. In Ray 1.4+, the Ray CLI, autoscaler, and dashboard will only be usable via pip install 'ray[default]'. Please update your install command.
warnings.warn(
2024-06-29 05:25:32,568 ERROR services.py:1276 -- Failed to start the dashboard: Failed to read dashbord log: [Errno 2] No such file or directory: '/tmp/ray/session_2024-06-29_05-25-31_980597_122872/logs/dashboard.log'
2024-06-29 05:25:34,120 INFO logger.py:179 -- pip install 'ray[tune]' to see TensorBoard files.
2024-06-29 05:25:34,120 WARNING logger.py:316 -- Could not instantiate TBXLogger: cannot import name 'builder' from 'google.protobuf.internal' (/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/google/protobuf/internal/__init__.py).
2024-06-29 05:25:34,122 WARNING deprecation.py:33 -- DeprecationWarning: simple_optimizer has been deprecated. This will raise an error in the future!
2024-06-29 05:25:34,122 INFO trainer.py:694 -- Current log_level is WARN. For more information, set 'log_level': 'INFO' / 'DEBUG' or use the -v and -vv flags.
(raylet) /home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/autoscaler/_private/cli_logger.py:57: FutureWarning: Not all Ray CLI dependencies were found. In Ray 1.4+, the Ray CLI, autoscaler, and dashboard will only be usable via pip install 'ray[default]'. Please update your install command.
(raylet) warnings.warn(
(raylet) Traceback (most recent call last):
(raylet) File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/new_dashboard/agent.py", line 22, in <module>
(raylet) import ray.new_dashboard.utils as dashboard_utils
(raylet) File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/new_dashboard/utils.py", line 20, in <module>
(raylet) import aiohttp.signals
(raylet) ModuleNotFoundError: No module named 'aiohttp.signals'
Traceback (most recent call last):
File "ppotrain.py", line 109, in <module>
_main()
File "ppotrain.py", line 95, in _main
trainer = ppo.PPOTrainer(env='test_env',
File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/rllib/agents/trainer_template.py", line 121, in __init__
Trainer.__init__(self, config, env, logger_creator)
File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/rllib/agents/trainer.py", line 516, in __init__
super().__init__(config, logger_creator)
File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/tune/trainable.py", line 98, in __init__
self.setup(copy.deepcopy(self.config))
File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/rllib/agents/trainer.py", line 707, in setup
self._init(self.config, self.env_creator)
File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/rllib/agents/trainer_template.py", line 148, in _init
self.workers = self._make_workers(
File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/rllib/agents/trainer.py", line 783, in _make_workers
return WorkerSet(
File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/rllib/evaluation/worker_set.py", line 79, in __init__
remote_spaces = ray.get(self.remote_workers(
File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/_private/client_mode_hook.py", line 47, in wrapper
return func(*args, **kwargs)
File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/worker.py", line 1483, in get
raise value
ray.exceptions.RayActorError: The actor died because of an error raised in its creation task, ray::RolloutWorker.__init__() (pid=122986, ip=192.168.15.93)
File "python/ray/_raylet.pyx", line 505, in ray._raylet.execute_task
File "python/ray/_raylet.pyx", line 449, in ray._raylet.execute_task.function_executor
File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/_private/function_manager.py", line 556, in actor_method_executor
return method(__ray_actor, *args, **kwargs)
File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/rllib/evaluation/rollout_worker.py", line 501, in __init__
raise ImportError("Could not import tensorflow")
ImportError: Could not import tensorflow
(pid=122995) 2024-06-29 05:25:34,862 ERROR worker.py:382 -- Exception raised in creation task: The actor died because of an error raised in its creation task, ray::RolloutWorker.__init__() (pid=122995, ip=192.168.15.93)
(pid=122995) File "python/ray/_raylet.pyx", line 505, in ray._raylet.execute_task
(pid=122995) File "python/ray/_raylet.pyx", line 449, in ray._raylet.execute_task.function_executor
(pid=122995) File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/_private/function_manager.py", line 556, in actor_method_executor
(pid=122995) return method(__ray_actor, *args, **kwargs)
(pid=122995) File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/rllib/evaluation/rollout_worker.py", line 501, in __init__
(pid=122995) raise ImportError("Could not import tensorflow")
(pid=122995) ImportError: Could not import tensorflow
(pid=122986) 2024-06-29 05:25:34,860 ERROR worker.py:382 -- Exception raised in creation task: The actor died because of an error raised in its creation task, ray::RolloutWorker.__init__() (pid=122986, ip=192.168.15.93)
(pid=122986) File "python/ray/_raylet.pyx", line 505, in ray._raylet.execute_task
(pid=122986) File "python/ray/_raylet.pyx", line 449, in ray._raylet.execute_task.function_executor
(pid=122986) File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/_private/function_manager.py", line 556, in actor_method_executor
(pid=122986) return method(__ray_actor, *args, **kwargs)
(pid=122986) File "/home/rllibsumoutils-master/example/lcodeca/lib/python3.8/site-packages/ray/rllib/evaluation/rollout_worker.py", line 501, in __init__
(pid=122986) raise ImportError("Could not import tensorflow")
(pid=122986) ImportError: Could not import tensorflow
INFO:marlenvironment:Environment destruction: SUMOTestMultiAgentEnv
(lcodeca) ryzen@ryzen:~/rllibsumoutils-master/example$
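The underlying failures in the log are all import errors: `ImportError: Could not import tensorflow` in the rollout workers, `No module named 'aiohttp.signals'` in the dashboard agent, and the protobuf `builder` import failure in the logger. To narrow this down, here is a minimal check (a sketch, not part of ppotrain.py) that could be run inside the same lcodeca virtualenv; it only probes the module names that appear in the traceback above:

```python
import importlib

def check_imports(names):
    """Try each import independently so one failure does not mask the others;
    report the installed version, or the error message on failure."""
    results = {}
    for name in names:
        try:
            mod = importlib.import_module(name)
            results[name] = getattr(mod, "__version__", "unknown")
        except ImportError as exc:
            results[name] = "FAILED: " + str(exc)
    return results

# Modules implicated by the traceback above.
print(check_imports(["tensorflow", "aiohttp", "google.protobuf"]))
```

If `tensorflow` reports FAILED here, the RolloutWorker error is just that TensorFlow is not installed (or not importable) in this virtualenv; the version strings for `aiohttp` and `google.protobuf` would also show whether they are newer than what this Ray release expects.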
How can I resolve this?