NumPy-related Error Running with Docker
Dear author, I'm a Ph.D. student supervised by Yibo Lin. I've encountered some issues while building the Docker image and running AutoDMP with Docker.
- Dockerfile error: the download path for Boost in the Dockerfile has become invalid and needs to be replaced with https://www.boost.org/users/history/version_1_66_0.html
- NumPy-related error (I really need help with this :( )
First, with NumPy 1.20.0 the following error is reported:
```
Traceback (most recent call last):
  File "./tuner/tuner_train.py", line 28, in <module>
    from hpbandster.optimizers import BOHB as BOHB
  File "/AutoDMP/hpbandster/optimizers/__init__.py", line 1, in <module>
    from hpbandster.optimizers.randomsearch import RandomSearch
  File "/AutoDMP/hpbandster/optimizers/randomsearch.py", line 10, in <module>
    import ConfigSpace as CS
  File "/opt/conda/lib/python3.8/site-packages/ConfigSpace/__init__.py", line 31, in <module>
    from ConfigSpace.api import (
  File "/opt/conda/lib/python3.8/site-packages/ConfigSpace/api/__init__.py", line 1, in <module>
    from ConfigSpace.api import distributions, types
  File "/opt/conda/lib/python3.8/site-packages/ConfigSpace/api/types/__init__.py", line 1, in <module>
    from ConfigSpace.api.types.categorical import Categorical
  File "/opt/conda/lib/python3.8/site-packages/ConfigSpace/api/types/categorical.py", line 6, in <module>
    from ConfigSpace.hyperparameters import CategoricalHyperparameter, OrdinalHyperparameter
  File "/opt/conda/lib/python3.8/site-packages/ConfigSpace/hyperparameters/__init__.py", line 1, in <module>
    from ConfigSpace.hyperparameters.beta_float import BetaFloatHyperparameter
  File "/opt/conda/lib/python3.8/site-packages/ConfigSpace/hyperparameters/beta_float.py", line 10, in <module>
    from ConfigSpace.hyperparameters.distributions import (
  File "/opt/conda/lib/python3.8/site-packages/ConfigSpace/hyperparameters/distributions.py", line 9, in <module>
    from ConfigSpace.functional import (
  File "/opt/conda/lib/python3.8/site-packages/ConfigSpace/functional.py", line 10, in <module>
    from ConfigSpace.types import Number, f64, i64
  File "/opt/conda/lib/python3.8/site-packages/ConfigSpace/types.py", line 9, in <module>
    Mask: TypeAlias = npt.NDArray[np.bool_]
AttributeError: module 'numpy.typing' has no attribute 'NDArray'
```
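For reference, `numpy.typing.NDArray` was only added in NumPy 1.21, so ConfigSpace's import fails on 1.20.0. Below is a minimal standalone reproduction of just that line (an illustrative snippet, not AutoDMP code):

```python
# Mirrors the attribute access that ConfigSpace/types.py performs at import time.
import numpy as np
import numpy.typing as npt

print("NumPy version:", np.__version__)

# numpy.typing.NDArray exists only in NumPy >= 1.21; on 1.20.0 this line raises
# AttributeError: module 'numpy.typing' has no attribute 'NDArray'
Mask = npt.NDArray[np.bool_]
print("NDArray alias is available:", Mask)
```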
Second, I searched for solutions online and upgraded NumPy to 1.21.0, but then a different error is reported:
```
Traceback (most recent call last):
  File "./tuner/tuner_train.py", line 186, in <module>
    res = bohb.run(n_iterations=args.n_iterations, min_n_workers=args.n_workers)
  File "/AutoDMP/hpbandster/core/master.py", line 206, in run
    next_run = self.iterations[i].get_next_run()
  File "/AutoDMP/hpbandster/core/base_iteration.py", line 170, in get_next_run
    self.add_configuration()
  File "/AutoDMP/hpbandster/core/base_iteration.py", line 103, in add_configuration
    self.result_logger.new_config(config_id, config, config_info)
  File "/AutoDMP/hpbandster/core/result.py", line 150, in new_config
    fh.write(json.dumps([config_id, config, config_info]))
  File "/opt/conda/lib/python3.8/json/__init__.py", line 231, in dumps
    return _default_encoder.encode(obj)
  File "/opt/conda/lib/python3.8/json/encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/opt/conda/lib/python3.8/json/encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
  File "/opt/conda/lib/python3.8/json/encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type int64 is not JSON serializable
```
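For context, this second failure happens because the standard json module cannot serialize NumPy scalar types such as np.int64. Below is a minimal sketch of one common workaround (a custom encoder that casts NumPy types to native Python types; this is not the authors' official fix):

```python
import json
import numpy as np

class NumpyJSONEncoder(json.JSONEncoder):
    """Fallback encoder that converts NumPy scalars/arrays to plain Python types."""
    def default(self, obj):
        if isinstance(obj, np.integer):
            return int(obj)
        if isinstance(obj, np.floating):
            return float(obj)
        if isinstance(obj, np.bool_):
            return bool(obj)
        if isinstance(obj, np.ndarray):
            return obj.tolist()
        return super().default(obj)

# Hypothetical data mirroring the failing json.dumps call in hpbandster's result.py:
config_info = {"budget": np.int64(64)}
print(json.dumps(config_info, cls=NumpyJSONEncoder))  # plain json.dumps(config_info) raises TypeError
```

An alternative that avoids touching hpbandster's logging code is to cast the sampled hyperparameter values to plain int/float before they reach new_config.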
I'm looking forward to your response :)
@loujc Hello, I am also experiencing this problem. How did you solve it? Looking forward to hearing from you.
Sorry, I haven't figured it out. I've moved on to other work now.
Hi @loujc @nineight908, was anyone able to resolve this in the end? I am facing this issue, and it looks like when I update NumPy, some SciPy functions start causing problems.
I reworked the requirements.txt file (and made the same change in the Dockerfile) as follows, so that all the versions work together without conflicts; this resolved the NumPy errors described above.
```
pyunpack==0.2.2
patool==1.12
matplotlib==3.3.4
cairocffi==1.2.0
pkgconfig==1.5.1
setuptools==59.5.0
scipy==1.7.3
numpy==1.21.0
torch==1.10.0
shapely==1.8.5
pygmo==2.16.1
pyDOE2==1.3.0
shap==0.41.0
Pyro4==4.82
ConfigSpace==0.6.0
statsmodels==0.13.2
xgboost==1.5.1
```
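After installing these pinned versions inside the container, a quick sanity check like the following (an illustrative snippet, not part of AutoDMP) can confirm the environment is consistent and that the original AttributeError is gone:

```python
from importlib.metadata import version

# Print the installed versions of some of the packages pinned above.
for pkg in ["numpy", "scipy", "torch", "ConfigSpace", "Pyro4"]:
    print(pkg, version(pkg))

# The first failure mode from this thread requires numpy >= 1.21:
import numpy as np
import numpy.typing as npt
_ = npt.NDArray[np.bool_]
print("numpy.typing.NDArray is available")
```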
Also, I made two edits to the Dockerfile:
- changed the Boost download URL to https://sourceforge.net/projects/boost/files/boost/1.66.0/boost_1_66_0.tar.gz/download
- changed the clone of the AutoDMP code from SSH to HTTPS (this was more of a requirement on my system).