PaddleDetection
Module is not registered
Search before asking
- [X] I have searched the question and found no related answer.
Please ask your question
I've started a container from the prebuilt PaddleDetection Docker image:
docker run --name dev --runtime=nvidia -v $PWD:/mnt -p 8888:8888 -it paddlecloud/paddledetection:2.4-gpu-cuda10.2-cudnn7-e9a542 /bin/bash
I've copied the relevant config files and weights into the /home/PaddleDetection directory inside the container, and I'm now attempting to run the simplest single-image example following the PaddleDetection inference guide:
CUDA_VISIBLE_DEVICES=0 python tools/infer.py \
-c configs/ppyoloe/ppyoloe_plus_crn_t_auxhead_320_300e_coco.yml \
-o use_gpu=true \
-o weights=configs/ppyoloe/weights/ppyoloe_plus_crn_t_auxhead_320_300e_coco.pdparams \
--infer_img=demo/000000014439_640x640.jpg
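One hedged way to narrow this down before running inference: the Docker image tag above pins PaddleDetection 2.4, and the config asks for the `PPYOLOEWithAuxHead` architecture, which may simply not exist in that checkout. The sketch below (the path is taken from the commands above; the version mismatch is an assumption, not confirmed) scans the architectures package for the class definition:

```python
# Hedged diagnostic: check whether the PaddleDetection checkout in the
# container actually defines the architecture the config file names.
# The path below is assumed from the docker commands in this issue.
import os

root = "/home/PaddleDetection/ppdet/modeling/architectures"
found = False
if os.path.isdir(root):
    for fname in os.listdir(root):
        if fname.endswith(".py"):
            with open(os.path.join(root, fname)) as f:
                if "class PPYOLOEWithAuxHead" in f.read():
                    found = True
                    print("PPYOLOEWithAuxHead defined in", fname)
if not found:
    print("PPYOLOEWithAuxHead not found: the installed checkout may "
          "predate this architecture; try updating the PaddleDetection code.")
```

If the class is missing, updating the /home/PaddleDetection source to a release that ships this config (rather than only copying the config and weights in) should make the name resolvable.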
However, I receive an unexpected error saying the module is not registered:
λ 2f965d600917 /home/PaddleDetection >> ./infer_single.sh
grep: warning: GREP_OPTIONS is deprecated; please use an alias or script
/usr/local/python3.7.0/lib/python3.7/site-packages/paddle/tensor/creation.py:130: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
if data.dtype == np.object:
Traceback (most recent call last):
File "tools/infer.py", line 177, in <module>
main()
File "tools/infer.py", line 173, in main
run(FLAGS, cfg)
File "tools/infer.py", line 121, in run
trainer = Trainer(cfg, mode='test')
File "/home/PaddleDetection/ppdet/engine/trainer.py", line 99, in __init__
self.model = create(cfg.architecture)
File "/home/PaddleDetection/ppdet/core/workspace.py", line 215, in create
"the module {} is not registered".format(name)
AssertionError: the module PPYOLOEWithAuxHead is not registered
Why is this error occurring and how can I fix it in order to run inference?
I've met the same problem. Does anyone have a solution?