PyTorch 2.6 does not allow model loading
Any quick fixes? The version on PyPI works (it downloads the model). The version on GitHub does not even download the model (the file structure of the two installations is different). I tried to apply this patch, but it does not work either.
>>> from supar import Parser
>>> parser = Parser.load('biaffine-dep-en')
Downloading: https://github.com/yzhangcs/parser/releases/download/v1.1.0/ptb.biaffine.dep.lstm.char.zip to .cache/supar\ptb.biaffine.dep.lstm.char.zip
100%|███████████████████████████████████████████████████████████████████████████████| 331M/331M [01:47<00:00, 3.22MB/s]
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "\lib\site-packages\supar\parsers\parser.py", line 194, in load
state = torch.load(path if os.path.exists(path) else download(supar.MODEL[src].get(path, path), reload=reload))
File "\lib\site-packages\torch\serialization.py", line 1529, in load
raise pickle.UnpicklingError(_get_wo_message(str(e))) from None
_pickle.UnpicklingError: Weights only load failed. This file can still be loaded, to do so you have two options, do those steps only if you trust the source of the checkpoint.
(1) In PyTorch 2.6, we changed the default value of the `weights_only` argument in `torch.load` from `False` to `True`. Re-running `torch.load` with `weights_only` set to `False` will likely succeed, but it can result in arbitrary code execution. Do it only if you got the file from a trusted source.
(2) Alternatively, to load with `weights_only=True` please check the recommended steps in the following error message.
WeightsUnpickler error: Unsupported global: GLOBAL supar.utils.config.Config was not an allowed global by default. Please use `torch.serialization.add_safe_globals([supar.utils.config.Config])` or the `torch.serialization.safe_globals([supar.utils.config.Config])` context manager to allowlist this global if you trust this class/function.
Check the documentation of torch.load to learn more about types accepted by default with weights_only https://pytorch.org/docs/stable/generated/torch.load.html.
>>>
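For anyone hitting this before a fixed release lands, both options from the error message can be applied from the calling script without touching the installed package. This is a minimal sketch based on the error text above; the allowlist may need further supar classes if the unpickler reports more unsupported globals, and the weights_only=False fallback should only be used for checkpoints you trust.

import torch
from supar import Parser
from supar.utils.config import Config

# Option 1: allowlist the class named in the error (repeat for any further
# "Unsupported global" classes that the weights-only unpickler reports).
torch.serialization.add_safe_globals([Config])

# Option 2 (trusted checkpoints only): restore the pre-2.6 default by forcing
# weights_only=False on supar's internal torch.load call.
# import functools
# torch.load = functools.partial(torch.load, weights_only=False)

parser = Parser.load('biaffine-dep-en')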
@mshakirDr The parser.py line number (194) you showed looks odd. If you clone the repo to your file system and install from that clone (pip install -e <path-to-your-clone>), the line to be changed by the patch is line 565. I just tested this on Debian 13.
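I have not verified the exact diff, but the change at that line is presumably just adding weights_only=False to supar's internal torch.load call, roughly like this (only do this if you trust the downloaded checkpoint):

# supar/parsers/parser.py -- illustrative; the exact line number depends on the installed version
state = torch.load(path if os.path.exists(path) else download(supar.MODEL[src].get(path, path), reload=reload),
                   weights_only=False)  # restores the pre-2.6 loading behaviour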
I remember I was not able to download the models for parsing with the patched version. So I created a separate Python environment, installed an older version of PyTorch (the original version, via pip), and then installed this package; that worked.
Have you tested non-English languages via parser = Parser.load('dep-biaffine-xlmr')?
I was stuck in Python and Rust version hell, it seems. OK, I found a fork that contains some modernization (i.e. adjustments to newer versions) and has a published release (1.1.5). That helped me.
@svenha Can you let us know what this fork is?
The fork I meant is https://github.com/Yu-val-weiss/supar-parser. Hope it helps in your case.
Thank you so much, it works indeed!
@bbunzeck If you find adjustments needed for training, please let us know.