bug: request JSON not the same as the Python API

Open jamesvren opened this issue 11 months ago • 4 comments

  • [x] I have looked for existing issues (including closed) about this

Bug Report

The request JSON is not the same as what the Python openai client sends. This causes my local DeepSeek deployment to think some items are missing, so it won't chat.

Reproduction

    use rig::providers::openai;

    let openai_client = openai::Client::from_url(&key, &url);

    // Build an extractor that asks the model to return a SentimentClassification.
    let sentiment_classifier = openai_client
        .extractor::<SentimentClassification>("xxxx")
        .build();

    let text = "I like this house, it is awesome!";

    match sentiment_classifier.extract(text).await {
        Ok(result) => pretty_print_result(text, &result),
        Err(e) => eprintln!("Got wrong answer: {}", e),
    }
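
The snippet references a SentimentClassification type and a pretty_print_result helper that the issue doesn't show. A hypothetical stand-in for the type, assuming rig's extractor derives its JSON schema from the type via schemars (the field names here are guesses):

    use schemars::JsonSchema;
    use serde::{Deserialize, Serialize};

    // Hypothetical extractor target; the real definition is not shown in the issue.
    #[derive(Debug, Serialize, Deserialize, JsonSchema)]
    struct SentimentClassification {
        /// e.g. "positive", "negative" or "neutral"
        sentiment: String,
        /// Model-reported confidence in the classification, between 0.0 and 1.0.
        confidence: f64,
    }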

Expected behavior

I expect it to send the same request as the Python openai client.

Screenshots

This is what rig sends (note that the message `content` is not a plain string):

[screenshot: request JSON sent by rig]

It will give me the following error:

[screenshot: error message]

This is what the Python openai client sends (which is what I expect):

[screenshot: request JSON sent by the Python openai client]
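
For readers without the screenshots, a rough illustration of the difference being described, built here with serde_json rather than copied from the screenshots: rig serializes the user message content as an array of typed parts, while the Python openai client sends it as a plain string.

    use serde_json::json;

    // Roughly the shape rig sends: `content` is an array of typed parts.
    let rig_style = json!({
        "messages": [
            { "role": "user", "content": [ { "type": "text", "text": "I like this house, it is awesome!" } ] }
        ]
    });

    // Roughly the shape the Python openai client sends: `content` is a plain string.
    let python_style = json!({
        "messages": [
            { "role": "user", "content": "I like this house, it is awesome!" }
        ]
    });

    println!("{rig_style}\n{python_style}");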

Additional context

jamesvren avatar Feb 27 '25 16:02 jamesvren

Hey! Thanks for this issue. We actually have proper deepseek support via our rig-core/providers/deepseek.rs module. Using that will ensure the proper format that deepseek accepts.

    use rig::providers::deepseek;

    let deepseek_client = deepseek::Client::from_env();

    let sentiment_classifier = deepseek_client
        .extractor::<SentimentClassification>("xxxx")
        .build();

    let text = "I like this house, it is awesome!";

    match sentiment_classifier.extract(text).await {
        Ok(result) => pretty_print_result(text, &result),
        Err(e) => eprintln!("Got wrong answer: {}", e),
    }

Even though deepseek and other providers say they accept the OpenAI-compatible format, we've now learned that they only accept a subset of the actual format. We might adjust our openai client to serialize in a similar manner, but it's kind of an odd situation.

0xMochan avatar Feb 27 '25 17:02 0xMochan

I deploy deepseek with mistral; it only responds with the OpenAI format. I tried the deepseek provider, but it failed to parse the response. I also looked at the OpenAI documentation: the examples for the default and streaming cases also use a string for content, and only use a dict with a type field for images. So I think it's better to support the same :)

jamesvren avatar Feb 28 '25 03:02 jamesvren

By the way, it would be better to have debug output of the request JSON before it is sent and of the response text before it is deserialized, since these can fail in some cases. That would save us time when debugging.
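
A minimal sketch of the kind of logging being asked for, using reqwest, serde_json and tracing directly rather than rig's internals (the function, model name and log fields here are illustrative):

    use serde_json::json;

    // Sketch: log the exact JSON before it goes over the wire, and the raw
    // response text before attempting to deserialize it, so parse failures
    // still leave evidence in the logs.
    async fn send_chat(
        client: &reqwest::Client,
        url: &str,
    ) -> Result<serde_json::Value, Box<dyn std::error::Error>> {
        let body = json!({
            "model": "deepseek-chat", // illustrative model name
            "messages": [{ "role": "user", "content": "Hello" }],
        });

        tracing::debug!(request = %body, "sending chat completion request");

        let text = client.post(url).json(&body).send().await?.text().await?;
        tracing::debug!(response = %text, "received chat completion response");

        Ok(serde_json::from_str(&text)?)
    }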

jamesvren avatar Feb 28 '25 04:02 jamesvren

That's a good point. I think we can really rework our debugging and tracing calls to be much more useful in these situations. Very useful feedback!


0xMochan avatar Feb 28 '25 05:02 0xMochan