Somesh

Results: 59 comments of Somesh

@adam-simple it's not working. I've tried the fix you mentioned. Can you please share what you set for TOGETHER_API_BASE?

I tried setting api_base to `https://api.together.xyz/v1/chat` and to `https://api.together.xyz/v1`. I also tried setting use_chat_api = True. Something is wrong with it, I guess.

I resolved this issue by using the OpenAI compatibility that Together provides. I created another handler for Together; adapting the existing one was too much hassle for me. ### If anyone needed...
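For reference, a minimal sketch of how such a handler could call Together through its OpenAI-compatible endpoint using the stock openai Python client; the model name is a placeholder, and the base URL is the one from the attempts above:

```python
import os
from openai import OpenAI

# Together exposes an OpenAI-compatible API, so the standard OpenAI client works.
client = OpenAI(
    api_key=os.environ["TOGETHER_API_KEY"],
    base_url="https://api.together.xyz/v1",
)

# Placeholder model name; use any chat model Together serves.
response = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    messages=[{"role": "user", "content": "Hello from the Together handler test."}],
)
print(response.choices[0].message.content)
```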

> Hi @someshfengde , did you want to add your changes for Together to this PR? maybe the ones that were [removed from the other PR](https://github.com/stanfordnlp/dspy/pull/637#issuecomment-2021032131)? I'd have to open...

Hey @thusinh1969, I've previously tried to finetune falcon7b. For `lora.py`, with some modifications, everything works fine. But for `adapter.py` and `adapter-v2.py` I'm getting out-of-memory errors; does...

Hi @carmocca, I've tried the recommended changes. It works fine for `adapter.py` for the first epoch, but after the first epoch completes, the loss turns to NaN. Also I...

I used device=2 and strategy='deepspeed'. It was training with 2x A100 but causing an OOM error.

Yes, I saw memory usage on both GPUs; the strategy='deepspeed' did the trick, I guess.
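For context, a rough sketch of what that multi-GPU setup might look like with Lightning Fabric; the precision value and the surrounding finetuning loop are assumptions, not the exact lit-llama script:

```python
import lightning as L

# Rough sketch: the same devices/strategy settings passed to Fabric.
# Precision and the finetuning loop itself are placeholders.
fabric = L.Fabric(
    accelerator="cuda",
    devices=2,              # spread training across both A100s
    strategy="deepspeed",   # shards optimizer/model state to reduce per-GPU memory
    precision="bf16-mixed",
)
fabric.launch()

fabric.print(f"Running on {fabric.world_size} processes")
# model, optimizer = fabric.setup(model, optimizer)  # then finetune as usual
```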