Dan Jones
@sgugger I am iterating through. 👍 thanks for the heads up, though!
Thanks, I was about to ask... any thoughts? I'm not well-versed in torch's symbolic tracer (or FX generally). I'm happy to do the work if you can point me...
The current offending line is line 985 in `src/transformers/utils/fx.py` (`HFTracer.trace()`):

```python
self.graph = super().trace(root, concrete_args=concrete_args)
```

where `root` is:

```
PLBartModel(
  (shared): Embedding(99, 16, padding_idx=1)
  (encoder): PLBartEncoder(
    (embed_tokens): Embedding(99, 16, padding_idx=1)
    (embed_positions): PLBartLearnedPositionalEmbedding(102,...
```
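For anyone less familiar with the machinery here: `HFTracer` builds on `torch.fx`'s symbolic tracing, which records a module's forward pass into a graph. A minimal sketch of plain `torch.fx` tracing on a toy module (a hypothetical stand-in, not PLBart itself) looks like:

```python
import torch
from torch import nn
from torch.fx import symbolic_trace

# Toy module standing in for the real model (hypothetical stand-in).
class Toy(nn.Module):
    def __init__(self):
        super().__init__()
        self.lin = nn.Linear(4, 4)

    def forward(self, x):
        # Ops executed here are recorded as graph nodes during tracing.
        return self.lin(x).relu()

# symbolic_trace runs forward with proxy inputs and returns a GraphModule.
traced = symbolic_trace(Toy())
print(traced.graph)  # human-readable listing of the recorded ops
```

`HFTracer.trace()` does the same kind of recording, but with extra handling (e.g. `concrete_args`) so that transformers models with data-dependent control flow can be traced.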
Hey @michaelbenayoun, let me know if you have any thoughts to resolve the tracer issue :)
@sgugger all resolved now. Would you mind giving the PR another look?
No problem, thank you for your support 👍
I'd be keen to explore this Open Issue! 🙏