kddubey
There is at least one hacky way:

```python
import torch
from transformers import AutoTokenizer
from ctransformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("marella/gpt-2-ggml", hf=True)
tokenizer = AutoTokenizer.from_pretrained("gpt2")
text = "a b c...
```
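(The listing truncates the snippet there. A minimal sketch of how it might continue, assuming the `hf=True` wrapper returns standard HF causal-LM outputs with `.logits`; the `text` value and the scoring step below are illustrative guesses, not the original code.)

```python
text = "a b c"  # placeholder; the original value is cut off above
input_ids = tokenizer(text, return_tensors="pt").input_ids
with torch.no_grad():
    logits = model(input_ids).logits  # assumes HF-style CausalLM outputs
# Log-probabilities over the vocabulary for the next token:
next_token_log_probs = torch.log_softmax(logits[0, -1], dim=-1)
```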
@ArthurZucker what if the suffixes (`[" d", " z"]` in the example) have a different number of tokens? I changed the suffixes to `[" d e", " z"]` and don't...
A general solution (general meaning: prefixes can have different numbers of tokens, and so can suffixes) is to create and supply `position_ids` as @IanMagnusson...
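A minimal sketch of that idea, assuming left padding (the prompts here are hypothetical): derive `position_ids` from the attention mask so padding tokens don't shift the positions of the real tokens.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2", padding_side="left")
tokenizer.pad_token = tokenizer.eos_token  # gpt2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompts = ["a b c d e", "a b c z"]  # different numbers of tokens per prompt
inputs = tokenizer(prompts, return_tensors="pt", padding=True)

# A real token's position = number of real tokens before it, so cumsum the
# attention mask; zero out the (arbitrary) positions of padding tokens.
position_ids = inputs["attention_mask"].cumsum(dim=1) - 1
position_ids.masked_fill_(inputs["attention_mask"] == 0, 0)

with torch.no_grad():
    out = model(**inputs, position_ids=position_ids)
```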
Hello, it's KD_A from Reddit. I purged my account recently, so the linked Reddit comment is no longer available. Posting it and the next reply here for posterity: First reply...
@martinjm97 thank you for the feedback about `convert_to_tap_class`, and the typing fixes from a few days ago. I updated the PR. Current tests should be passing. Remaining todos for me...
The PR is ready for review, @martinjm97. (Vocab note: "data model" refers to built-in dataclasses, Pydantic models, and Pydantic dataclasses. If something is a data model, it means it has...
No worries, take your time :-)
@martinjm97 see [this comment](https://github.com/swansonk14/typed-argument-parser/pull/128#discussion_r1460237907) and [this comment](https://github.com/swansonk14/typed-argument-parser/pull/128/files#r1469054683). Note that, by default, a Pydantic `BaseModel` (surprisingly, to me at least) ignores extra fields that are passed to it; it doesn't raise...
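For reference, a tiny demonstration of that default (Pydantic v2; the class and fields here are made up):

```python
from pydantic import BaseModel

class User(BaseModel):
    name: str

# The extra `age` field is silently ignored: no ValidationError, no attribute.
user = User(name="kd", age=30)
print(user)  # name='kd'
```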
No worries. That comment from a week ago wasn't really clear, now that I'm re-reading it. And I appreciate that you're testing the code independently!
Hi @martinjm97, the last commit changes the Pydantic implementation to not allow extra arguments for Pydantic `BaseModel`s by default. That way it's backwards compatible and sensible. Sorry for causing confusion....
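Presumably the change amounts to opting Pydantic `BaseModel`s into the `"forbid"` setting for extras; a sketch of that mechanism in isolation (Pydantic v2 API, illustrative class):

```python
from pydantic import BaseModel, ConfigDict, ValidationError

class User(BaseModel):
    model_config = ConfigDict(extra="forbid")
    name: str

try:
    User(name="kd", age=30)
except ValidationError as error:
    print(error)  # the extra `age` field is now rejected instead of ignored
```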