PhenixZhang
> Have you tried the `OpenAI compatible API` model provider?

Yes, this is the config screenshot. Calling the OpenAI server directly, without Dify, works fine.
> Can you try changing the `max_tokens` parameter to 4K or larger and see if it works?

Of course~ I set the maximum number of tokens to 8K at the...
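For reference, this is the kind of direct call I mean, with `max_tokens` already raised to 8K. It is only a minimal sketch; the base URL, API key, and model name below are placeholders for my deployment, not anything from Dify:

```python
# Minimal sketch: call an OpenAI-compatible server directly (no Dify in between)
# with max_tokens set to 8K. BASE_URL, API_KEY, and "my-model" are placeholders.
import requests

BASE_URL = "http://localhost:8000/v1"   # hypothetical OpenAI-compatible endpoint
API_KEY = "sk-anything"                 # many local servers accept any key

payload = {
    "model": "my-model",                              # placeholder model name
    "messages": [{"role": "user", "content": "Hello"}],
    "max_tokens": 8192,                               # the 8K limit mentioned above
}

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```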
> Also note that `torch.distributed.launch` is deprecated, and `torchrun` is preferred as of PyTorch 2.0.

Thanks for this tip.
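For anyone following along, a minimal sketch of an entry script written to be launched with `torchrun` instead of the deprecated launcher. The script name, process count, and backend are placeholders, not part of my actual setup:

```python
# Launch with the new launcher (placeholder script name / process count):
#   torchrun --nproc_per_node=2 train_sketch.py
# instead of the deprecated:
#   python -m torch.distributed.launch --nproc_per_node=2 train_sketch.py
import os

import torch.distributed as dist


def main():
    # torchrun exports RANK, LOCAL_RANK, WORLD_SIZE, MASTER_ADDR, MASTER_PORT,
    # so init_process_group can read them via the default "env://" method.
    dist.init_process_group(backend="gloo")  # use "nccl" on GPU nodes
    local_rank = int(os.environ["LOCAL_RANK"])
    print(f"rank {dist.get_rank()} of {dist.get_world_size()}, local_rank {local_rank}")
    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```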