Yiximail
> FWIW Meta changed [the official config](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct/blob/1448453bdb895762499deb4176c1dd83b145fac1/generation_config.json#L4) to reflect this a few hours ago but I don't think the webui respects it.

Ah, I have not yet had access to...
> ```diff
> diff --git a/generation_config.json b/generation_config.json
> index 4358365..aecb1b8 100644
> --- a/generation_config.json
> +++ b/generation_config.json
> @@ -1,6 +1,6 @@
>  {
>    "_from_model_config": true,
>    "bos_token_id": 128000,
> ...
> ```
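For reference, my understanding of the net effect of that commit (a paraphrase, not the verbatim file) is that `eos_token_id` becomes a list containing both the original `<|end_of_text|>` id and the `<|eot_id|>` id, so generation can stop on either token:

```json
{
  "_from_model_config": true,
  "bos_token_id": 128000,
  "eos_token_id": [128001, 128009]
}
```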
> > # Quick fix for llama3 doesn't stop correctly
>
> You need to also mention that this will break it for everything other than Llama-3, otherwise some people...
> > For the API I had to manually insert in completions.py the fields: `'skip_special_tokens': False, 'custom_stopping_strings': '""'`
> >
> > as the other side doesn't insert those...
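To illustrate the workaround above from the client side: instead of patching `completions.py`, the same two fields can be sent in the request body. This is a sketch, assuming the webui's OpenAI-compatible completions endpoint is running at its default address; the prompt and `max_tokens` values are placeholders.

```python
import json

# Extra fields the quoted comment inserts manually in completions.py;
# here they are passed by the client instead. The '""' value for
# custom_stopping_strings is taken verbatim from that comment.
payload = {
    "prompt": "Hello",
    "max_tokens": 64,
    "skip_special_tokens": False,
    "custom_stopping_strings": '""',
}

print(json.dumps(payload, indent=2))

# To actually send it (needs the `requests` package and a running server):
# requests.post("http://127.0.0.1:5000/v1/completions", json=payload)
```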
The same issue happens with Chinese.
> I think that the problem here is that the `eot_id` cannot be obtained from the `shared.tokenizer` object loaded through `AutoTokenizer.from_pretrained`. This can be verified by checking the attributes under...
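A quick way to verify that claim is to ask the tokenizer for the id of `<|eot_id|>` and see whether it falls back to the unknown-token id. This is a minimal sketch: in the webui, `tokenizer` would be `shared.tokenizer` from `AutoTokenizer.from_pretrained(...)`; the `StubTokenizer` below is a hypothetical stand-in so the example is self-contained.

```python
def find_eot_id(tokenizer):
    """Return the id of <|eot_id|> if the tokenizer knows it, else None."""
    eot = tokenizer.convert_tokens_to_ids("<|eot_id|>")
    # Unknown tokens map to unk_token_id (or None), so treat that as "not found".
    if eot is None or eot == getattr(tokenizer, "unk_token_id", None):
        return None
    return eot

class StubTokenizer:
    """Hypothetical stand-in mimicking a Llama-3 tokenizer's vocab lookup."""
    unk_token_id = 0
    _vocab = {"<|end_of_text|>": 128001, "<|eot_id|>": 128009}

    def convert_tokens_to_ids(self, token):
        return self._vocab.get(token, self.unk_token_id)

print(find_eot_id(StubTokenizer()))  # 128009 with the stub above
```

If `find_eot_id` returns `None` for the real `shared.tokenizer`, that would confirm the token is missing from the loaded vocabulary.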
I encounter this error when installing dependencies with `bun`, but not with `pnpm`.
I'm not sure if they changed the interface name but forgot to update the docs. I changed it to `CustomAppConfig` and it works well.

```typescript
// index.d.ts
declare module "nuxt/schema"...
```
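For anyone hitting the same thing, the full augmentation looks roughly like this (a sketch assuming a Nuxt 3 project; the `theme` fields are just illustrative placeholders for your own `app.config` keys):

```typescript
// index.d.ts — type your app.config via the CustomAppConfig interface
declare module "nuxt/schema" {
  interface CustomAppConfig {
    // hypothetical example fields; replace with your real app.config shape
    theme?: {
      primaryColor?: string
    }
  }
}

// An import/export statement is needed so this file is treated as a module
// and the declaration augments, rather than replaces, the original types.
export {}
```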