Import error due to missing symbol
When installing the latest dpctl with pip, I get this error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/dcortes/mambaforge/envs/py311/lib/python3.11/site-packages/dpctl/__init__.py", line 30, in <module>
from ._device_selection import select_device_with_aspects
File "/home/dcortes/mambaforge/envs/py311/lib/python3.11/site-packages/dpctl/_device_selection.py", line 20, in <module>
from ._sycl_device import SyclDevice, SyclDeviceCreationError
ImportError: /home/dcortes/mambaforge/envs/py311/lib/python3.11/site-packages/dpctl/libDPCTLSyclInterface.so: undefined symbol: _ZNK4sycl3_V18platform23khr_get_default_contextEv
which demangles to:
sycl::_V1::platform::khr_get_default_context() const
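As an aside, undefined C++ symbols like this can be demangled from the command line with `c++filt` (part of binutils), which is how the readable name above was obtained:

```shell
# Demangle the missing symbol reported by the dynamic loader
echo '_ZNK4sycl3_V18platform23khr_get_default_contextEv' | c++filt
# prints: sycl::_V1::platform::khr_get_default_context() const
```

Running `nm -D libDPCTLSyclInterface.so | c++filt` would likewise list every dynamic symbol the library expects its dependencies to provide, which helps confirm a runtime-version mismatch like this one.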
I would guess that this is an issue of not specifying sufficiently high versions for all the sycl-related dependencies.
The rest of the environment is as follows:
# Name Version Build Channel
_libgcc_mutex 0.1 conda_forge conda-forge
_openmp_mutex 4.5 2_gnu conda-forge
anyio 4.6.2.post1 pyhd8ed1ab_0 conda-forge
argon2-cffi 23.1.0 pyhd8ed1ab_0 conda-forge
argon2-cffi-bindings 21.2.0 py311h9ecbd09_5 conda-forge
arrow 1.3.0 pyhd8ed1ab_0 conda-forge
asttokens 2.4.1 pyhd8ed1ab_0 conda-forge
async-lru 2.0.4 pyhd8ed1ab_0 conda-forge
attrs 24.2.0 pyh71513ae_0 conda-forge
babel 2.14.0 pyhd8ed1ab_0 conda-forge
beautifulsoup4 4.12.3 pyha770c72_0 conda-forge
bleach 6.1.0 pyhd8ed1ab_0 conda-forge
brotli-python 1.1.0 py311hfdbb021_2 conda-forge
bzip2 1.0.8 h4bc722e_7 conda-forge
ca-certificates 2024.8.30 hbcca054_0 conda-forge
cached-property 1.5.2 hd8ed1ab_1 conda-forge
cached_property 1.5.2 pyha770c72_1 conda-forge
certifi 2024.8.30 pyhd8ed1ab_0 conda-forge
cffi 1.17.1 py311hf29c0ef_0 conda-forge
charset-normalizer 3.4.0 pyhd8ed1ab_0 conda-forge
comm 0.2.2 pyhd8ed1ab_0 conda-forge
contourpy 1.3.0 pypi_0 pypi
cycler 0.12.1 pypi_0 pypi
debugpy 1.8.5 py311hfdbb021_1 conda-forge
decorator 5.1.1 pyhd8ed1ab_0 conda-forge
defusedxml 0.7.1 pyhd8ed1ab_0 conda-forge
dill 0.3.9 pypi_0 pypi
dpctl 0.20.1 pypi_0 pypi
duckdb 1.1.2 pypi_0 pypi
entrypoints 0.4 pyhd8ed1ab_0 conda-forge
exceptiongroup 1.2.2 pyhd8ed1ab_0 conda-forge
executing 2.1.0 pyhd8ed1ab_0 conda-forge
fonttools 4.54.1 pypi_0 pypi
fqdn 1.5.1 pyhd8ed1ab_0 conda-forge
h11 0.14.0 pyhd8ed1ab_0 conda-forge
h2 4.1.0 pyhd8ed1ab_0 conda-forge
hpack 4.0.0 pyh9f0ad1d_0 conda-forge
httpcore 1.0.6 pyhd8ed1ab_0 conda-forge
httpx 0.27.2 pyhd8ed1ab_0 conda-forge
hyperframe 6.0.1 pyhd8ed1ab_0 conda-forge
idna 3.10 pyhd8ed1ab_0 conda-forge
importlib-metadata 8.5.0 pyha770c72_0 conda-forge
importlib_metadata 8.5.0 hd8ed1ab_0 conda-forge
importlib_resources 6.4.5 pyhd8ed1ab_0 conda-forge
intel-cmplr-lib-rt 2025.1.1 pypi_0 pypi
intel-cmplr-lib-ur 2025.1.1 pypi_0 pypi
intel-cmplr-lic-rt 2025.1.1 pypi_0 pypi
intel-sycl-rt 2025.1.1 pypi_0 pypi
ipykernel 6.29.5 pyh3099207_0 conda-forge
ipython 8.27.0 pyh707e725_0 conda-forge
isoduration 20.11.0 pyhd8ed1ab_0 conda-forge
jedi 0.19.1 pyhd8ed1ab_0 conda-forge
jinja2 3.1.4 pyhd8ed1ab_0 conda-forge
joblib 1.5.1 pypi_0 pypi
json5 0.9.25 pyhd8ed1ab_0 conda-forge
jsonpointer 3.0.0 py311h38be061_1 conda-forge
jsonschema 4.23.0 pyhd8ed1ab_0 conda-forge
jsonschema-specifications 2024.10.1 pyhd8ed1ab_0 conda-forge
jsonschema-with-format-nongpl 4.23.0 hd8ed1ab_0 conda-forge
jupyter-lsp 2.2.5 pyhd8ed1ab_0 conda-forge
jupyter_client 8.6.2 pyhd8ed1ab_0 conda-forge
jupyter_core 5.7.2 py311h38be061_0 conda-forge
jupyter_events 0.10.0 pyhd8ed1ab_0 conda-forge
jupyter_server 2.14.2 pyhd8ed1ab_0 conda-forge
jupyter_server_terminals 0.5.3 pyhd8ed1ab_0 conda-forge
jupyterlab 4.2.5 pyhd8ed1ab_0 conda-forge
jupyterlab_pygments 0.3.0 pyhd8ed1ab_1 conda-forge
jupyterlab_server 2.27.3 pyhd8ed1ab_0 conda-forge
keyutils 1.6.1 h166bdaf_0 conda-forge
kiwisolver 1.4.7 pypi_0 pypi
krb5 1.21.3 h659f571_0 conda-forge
ld_impl_linux-64 2.40 hf3520f5_7 conda-forge
libedit 3.1.20191231 he28a2e2_2 conda-forge
libexpat 2.6.3 h5888daf_0 conda-forge
libffi 3.4.2 h7f98852_5 conda-forge
libgcc 14.1.0 h77fa898_1 conda-forge
libgcc-ng 14.1.0 h69a702a_1 conda-forge
libgomp 14.1.0 h77fa898_1 conda-forge
libnsl 2.0.1 hd590300_0 conda-forge
libsodium 1.0.20 h4ab18f5_0 conda-forge
libsqlite 3.46.1 hadc24fc_0 conda-forge
libstdcxx 14.1.0 hc0a3c3a_1 conda-forge
libstdcxx-ng 14.1.0 h4852527_1 conda-forge
libuuid 2.38.1 h0b41bf4_0 conda-forge
libxcrypt 4.4.36 hd590300_1 conda-forge
libzlib 1.3.1 h4ab18f5_1 conda-forge
markupsafe 3.0.2 py311h2dc5d0c_0 conda-forge
matplotlib 3.9.2 pypi_0 pypi
matplotlib-inline 0.1.7 pyhd8ed1ab_0 conda-forge
mistune 3.0.2 pyhd8ed1ab_0 conda-forge
mizani 0.11.4 pypi_0 pypi
nb_conda_kernels 2.5.1 pyh707e725_2 conda-forge
nbclient 0.10.0 pyhd8ed1ab_0 conda-forge
nbconvert 7.16.4 hd8ed1ab_1 conda-forge
nbconvert-core 7.16.4 pyhd8ed1ab_1 conda-forge
nbconvert-pandoc 7.16.4 hd8ed1ab_1 conda-forge
nbformat 5.10.4 pyhd8ed1ab_0 conda-forge
ncurses 6.5 he02047a_1 conda-forge
nest-asyncio 1.6.0 pyhd8ed1ab_0 conda-forge
notebook 7.2.2 pyhd8ed1ab_0 conda-forge
notebook-shim 0.2.4 pyhd8ed1ab_0 conda-forge
numpy 1.26.4 pypi_0 pypi
openssl 3.3.2 hb9d3cd8_0 conda-forge
overrides 7.7.0 pyhd8ed1ab_0 conda-forge
packaging 24.1 pyhd8ed1ab_0 conda-forge
pandas 2.2.3 pypi_0 pypi
pandoc 3.5 ha770c72_0 conda-forge
pandocfilters 1.5.0 pyhd8ed1ab_0 conda-forge
parso 0.8.4 pyhd8ed1ab_0 conda-forge
patchworklib 0.6.5 pypi_0 pypi
patsy 0.5.6 pypi_0 pypi
pexpect 4.9.0 pyhd8ed1ab_0 conda-forge
pickleshare 0.7.5 py_1003 conda-forge
pillow 11.0.0 pypi_0 pypi
pip 24.2 pyh8b19718_1 conda-forge
pkgutil-resolve-name 1.3.10 pyhd8ed1ab_1 conda-forge
platformdirs 4.3.2 pyhd8ed1ab_0 conda-forge
plotnine 0.13.6 pypi_0 pypi
polars 1.12.0 pypi_0 pypi
prometheus_client 0.21.0 pyhd8ed1ab_0 conda-forge
prompt-toolkit 3.0.47 pyha770c72_0 conda-forge
psutil 6.0.0 py311h9ecbd09_1 conda-forge
ptyprocess 0.7.0 pyhd3deb0d_0 conda-forge
pure_eval 0.2.3 pyhd8ed1ab_0 conda-forge
pyarrow 18.0.0 pypi_0 pypi
pycparser 2.22 pyhd8ed1ab_0 conda-forge
pygments 2.18.0 pyhd8ed1ab_0 conda-forge
pyparsing 3.2.0 pypi_0 pypi
pysocks 1.7.1 pyha2e5f31_6 conda-forge
python 3.11.10 hc5c86c4_0_cpython conda-forge
python-dateutil 2.9.0 pyhd8ed1ab_0 conda-forge
python-fastjsonschema 2.20.0 pyhd8ed1ab_0 conda-forge
python-json-logger 2.0.7 pyhd8ed1ab_0 conda-forge
python_abi 3.11 5_cp311 conda-forge
pytz 2024.2 pyhd8ed1ab_0 conda-forge
pyyaml 6.0.2 py311h9ecbd09_1 conda-forge
pyzmq 26.2.0 py311h7deb3e3_2 conda-forge
readline 8.2 h8228510_1 conda-forge
referencing 0.35.1 pyhd8ed1ab_0 conda-forge
requests 2.32.3 pyhd8ed1ab_0 conda-forge
rfc3339-validator 0.1.4 pyhd8ed1ab_0 conda-forge
rfc3986-validator 0.1.1 pyh9f0ad1d_0 conda-forge
rpds-py 0.20.0 py311h9e33e62_1 conda-forge
scikit-learn 1.7.0 pypi_0 pypi
scipy 1.14.1 pypi_0 pypi
seaborn 0.13.2 pypi_0 pypi
send2trash 1.8.3 pyh0d859eb_0 conda-forge
setuptools 73.0.1 pyhd8ed1ab_0 conda-forge
six 1.16.0 pyh6c4a22f_0 conda-forge
sniffio 1.3.1 pyhd8ed1ab_0 conda-forge
soupsieve 2.5 pyhd8ed1ab_1 conda-forge
stack_data 0.6.2 pyhd8ed1ab_0 conda-forge
statsmodels 0.14.4 pypi_0 pypi
tcmlib 1.3.0 pypi_0 pypi
terminado 0.18.1 pyh0d859eb_0 conda-forge
threadpoolctl 3.6.0 pypi_0 pypi
tinycss2 1.4.0 pyhd8ed1ab_0 conda-forge
tk 8.6.13 noxft_h4845f30_101 conda-forge
tomli 2.0.2 pyhd8ed1ab_0 conda-forge
tornado 6.4.1 py311h9ecbd09_1 conda-forge
traitlets 5.14.3 pyhd8ed1ab_0 conda-forge
types-python-dateutil 2.9.0.20241003 pyhff2d567_0 conda-forge
typing-extensions 4.12.2 hd8ed1ab_0 conda-forge
typing_extensions 4.12.2 pyha770c72_0 conda-forge
typing_utils 0.1.0 pyhd8ed1ab_0 conda-forge
tzdata 2024.2 pypi_0 pypi
umf 0.10.0 pypi_0 pypi
uri-template 1.3.0 pyhd8ed1ab_0 conda-forge
urllib3 2.2.3 pyhd8ed1ab_0 conda-forge
wcwidth 0.2.13 pyhd8ed1ab_0 conda-forge
webcolors 24.8.0 pyhd8ed1ab_0 conda-forge
webencodings 0.5.1 pyhd8ed1ab_2 conda-forge
websocket-client 1.8.0 pyhd8ed1ab_0 conda-forge
wheel 0.44.0 pyhd8ed1ab_0 conda-forge
xz 5.2.6 h166bdaf_0 conda-forge
yaml 0.2.5 h7f98852_2 conda-forge
zeromq 4.3.5 ha4adb4c_5 conda-forge
zipp 3.20.1 pyhd8ed1ab_0 conda-forge
zstandard 0.23.0 py311hbc35293_1 conda-forge
zstd 1.5.6 ha6fb4c9_0 conda-forge
@antonwolfy @ndgrigorian I have run into this issue as well. Any comment on this?
@icfaust, @david-cortes-intel The issue is that you have the latest dpctl installed alongside the 2025.1 DPC++ RT package. You need to update the environment to install the latest 2025.2 components.
The reason is that there is no "greater than" constraint for the DPC++ RT package in the METADATA:
$ cat ./pypi_venv/lib/python3.12/site-packages/dpctl-0.20.1.dist-info/METADATA | grep Requires-Dist
Requires-Dist: numpy>=1.23.0
Requires-Dist: intel-sycl-rt
Requires-Dist: intel-cmplr-lib-rt
but expected intel-sycl-rt>=2025.2.
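To illustrate why the missing bound matters: the environment above has intel-sycl-rt 2025.1.1, which would fail a hypothetical `>=2025.2` constraint. A minimal sketch of the comparison (real resolvers follow PEP 440 semantics, which this naive tuple comparison ignores):

```python
def parse(version):
    # Naive dotted-version parser; real tools use PEP 440 (packaging.version).
    return tuple(int(part) for part in version.split("."))

installed = "2025.1.1"   # intel-sycl-rt version from the environment above
required_min = "2025.2"  # the expected lower bound

print(parse(installed) >= parse(required_min))  # False: constraint unmet
```

Without the bound in METADATA, pip happily keeps the already-installed 2025.1.1 runtime, and the import fails at load time instead of install time.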
It looks like the wheel does not take its requirements from requirements.txt:
https://github.com/IntelPython/dpctl/blob/master/requirements.txt
.. and does not take them from pyproject.toml either:
https://github.com/IntelPython/dpctl/blob/master/pyproject.toml
.. so I guess they would need to be updated in some internal build script.
@david-cortes-intel, as mentioned in gh-1910, it is intended that they are present in neither requirements.txt nor pyproject.toml, assuming the required packages might be provided by the user (for example through installing the oneAPI BaseKit).
But it's different for the wheel packages on the https://software.repos.intel.com/python/pypi channel. The internal workflow extends dpctl's METADATA to add dependencies on intel-sycl-rt and intel-cmplr-lib-rt. I've filed a JIRA ticket to update the script to add lower and upper bounds for these dependencies (similar to the ones the dpctl conda package has).
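The internal script itself is not public, but the METADATA rewrite it performs could look roughly like this sketch (the function name and the exact insertion point are assumptions, not the real implementation):

```python
def add_runtime_deps(metadata_text, deps):
    """Append extra Requires-Dist entries after the last existing one."""
    lines = metadata_text.splitlines()
    # Find the last existing Requires-Dist line so new entries stay grouped.
    last = max(i for i, line in enumerate(lines)
               if line.startswith("Requires-Dist:"))
    for dep in reversed(deps):
        lines.insert(last + 1, f"Requires-Dist: {dep}")
    return "\n".join(lines)

meta = "Name: dpctl\nRequires-Dist: numpy>=1.23.0"
patched = add_runtime_deps(meta, ["intel-sycl-rt>=2025.2,<2026.0",
                                  "intel-cmplr-lib-rt>=2025.2,<2026.0"])
print(patched)
```

The key point is that once the bounds land in the wheel's METADATA, pip itself will refuse to pair dpctl 0.20.1 with a 2025.1 runtime, turning the load-time ImportError into an install-time resolution error.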
I see it mentions:
The dependencies list must specify Python only dependencies, assuming required libraries are provided by user's OS.
.. but the packages intel-sycl-rt and intel-cmplr-lib-rt are distributed as PyPI packages even though they aren't Python libraries, and those PyPI/conda packages are what other libraries using dpctl would be loading at runtime.
For the main installation flow you are right. If you are installing dpctl from PyPI or conda-forge, then the dpctl package is required to have a correct dependency on intel-sycl-rt and intel-cmplr-lib-rt.
But alternatively, you might want to create a dev environment including the DPC++ compiler (there is no wheel package for it) and dpctl, in order to build dpnp from the source code. Or to test dpctl and dpnp with the latest nightly LLVM compiler.
In all those cases you would like more freedom in how you control the dependencies. So, when you are building the dpctl wheel from source, or installing a dev dpctl wheel package from the dppy/label/dev channel, there will be no dependency on the DPC++ RT, assuming you are resolving it yourself.
But in that case, you'd either be building from an sdist, or through the setup.py file. Or are you also building a binary wheel for that use-case?
Or are you also building a binary wheel for that use-case?
Yes, if I need to install that dpctl into the dev dpnp environment.
Then it sounds like building with system-managed vs. pip/conda-managed dpc++ runtime could be a configurable option.
As mentioned here, we can add an optional dependency on the DPC++ RT package:
[project.optional-dependencies]
coverage = [<SKIPPED>]
docs = [<SKIPPED>]
dpcpp_rt = ["intel-cmplr-lib-rt", "intel-sycl-rt"]
I think (but not 100% sure) that still wouldn't work for the use-case of pip-installing requirements where those are transitive dependencies of other packages.
For example, other packages with DPC++ capabilities might specify them without version constraints, and if they are optional for the case of dpctl, the pip resolver wouldn't take it into account and might still end up pulling incompatible versions.
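For reference, extras in a built wheel show up as environment markers on the Requires-Dist lines, and a default `pip install dpctl` skips them entirely; a small sketch of that filtering (the dpcpp_rt extra name and the bounds come from the proposal above):

```python
# How the proposed extra would appear among the wheel's METADATA entries
requires_dist = [
    'numpy>=1.23.0',
    'intel-cmplr-lib-rt>=2025.2,<2026.0; extra == "dpcpp_rt"',
    'intel-sycl-rt>=2025.2,<2026.0; extra == "dpcpp_rt"',
]

# A plain `pip install dpctl` only considers entries without an extra marker
default_deps = [r for r in requires_dist if "extra ==" not in r]
print(default_deps)  # ['numpy>=1.23.0']
```

This is the crux of the concern: a dependent package that requests plain `dpctl` (rather than `dpctl[dpcpp_rt]`) never activates the marker, so the bounded constraints are invisible to the resolver.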
I'm a bit confused. If we are talking about the default installation path: the DPC++ version constraints will be added to the dpctl wheel; we are working on that. No optional dependencies will be added in the scope of that work.
If we are considering the use case of installing a dpctl dev package, i.e. from either dppy/label/dev or built from source, then the new optional dependency is added to dpctl as dpcpp_rt (as proposed above).
In that case the new dpcpp_rt optional dependency is not transitive. If a package depends on dpctl and would like to ensure the proper DPC++ RT version is installed, it would need to specify dpctl[dpcpp_rt] as a dependency.
If you are concerned about the missing version constraints in the snippet above: it was posted only as an example; the implemented one will certainly have version constraints, like:
[project.optional-dependencies]
dpcpp_rt = ["intel-cmplr-lib-rt>=2025.2,<2026.0", "intel-sycl-rt>=2025.2,<2026.0"]
The DPC++ version constraints will be added to the dpctl wheel, we are working on that.
Thanks, that part wasn't clear to me.
But the snippet that you posted makes it an optional dependency. I think if you pip install that from a requirements.txt where other packages have intel-sycl-rt as a mandatory dependency, it will end up ignoring the version constraints from that optional dependency.
I checked that briefly and it seems it would work; the version constraints will be met.
In fact, in the future, I suggest avoiding the internal logic for dependency correction. All dependencies should be listed in setup.py, since the chain of calls is as follows:
- run pip install
- read pyproject.toml
- check [build-system]; in dpctl/dpnp this is setuptools.build_meta https://github.com/IntelPython/dpctl/blob/master/pyproject.toml#L2
- pip calls setuptools.build_meta for the build
- if setup.py exists, setuptools will use it (pyproject.toml defines the backend and build deps, not runtime deps)
So my suggestion is to update setup.py. The necessary dependencies should be listed there, and they will automatically appear in the Requires-Dist section of the METADATA file after the build.
Example:
install_requires=[
"numpy",
"intel-cmplr-lib-rt >=2025.3,<2026.0a0",
"intel-sycl-rt >=2025.3,<2026.0a0"
],
In fact, in the future, I suggest avoiding the internal logic for dependency correction. All dependencies should be listed in setup.py, since the chain of calls is as follows:
We chose not to do this because we want to support two separate use cases, one with Python packages in the environment and one with the BaseKit, and we don't want to install redundant packages.
CuPy, for example, is similar, noting that for the wheel to work correctly, the toolkit must be installed.
It would also be helpful to offer an automated way of installing it with pip with the necessary dependencies, like pip install dpctl[all] or similar. Otherwise, it's very confusing for someone not familiar with the Intel ecosystem to have some packages like torch working one way (dependencies pulled in by pip install) and dpctl working another way (dependencies expected to be satisfied externally).
Would also be helpful to offer an automated way of installing it with pip with necessary dependencies too, like pip install dpctl[all] or similar
I agree; I think we should work out a preferred approach. Preferably, we can have both options (allowing the BaseKit or automatic dependency resolution) when installing from PyPI, the Intel channel, or conda-forge (in the future).