
Question: Why does OpenAIChatDecoder do sequential generation?

Open · floatingbigcat opened this issue 6 months ago • 1 comment

In bigcodebench/provider/openai.py, the model processes prompts sequentially: the function _codegen_batch_via_concurrency generates the n samples for each prompt in parallel, but it still walks through the list of prompts one at a time.

This looks odd and significantly slows down code generation.
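For illustration, here is a minimal sketch of what cross-prompt concurrency could look like with a thread pool. This is not the project's actual implementation; generate_one, generate_all, max_workers, and the fake completion strings are hypothetical stand-ins for whatever the decoder uses internally.

```python
# Sketch only: parallelize across prompts instead of looping over them one by one.
from concurrent.futures import ThreadPoolExecutor

def generate_one(prompt: str, n: int) -> list[str]:
    # Placeholder for the real per-prompt call that requests n samples
    # from the backend (e.g. an OpenAI chat completion with n=...).
    return [f"completion {i} for: {prompt}" for i in range(n)]

def generate_all(prompts: list[str], n: int, max_workers: int = 8) -> list[list[str]]:
    # Dispatch every prompt concurrently; executor.map preserves input order,
    # so results line up with the original prompt list.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(lambda p: generate_one(p, n), prompts))

if __name__ == "__main__":
    outputs = generate_all(["prompt A", "prompt B", "prompt C"], n=2)
    print(outputs)
```

With something like this, total wall-clock time is bounded by the slowest prompt in each batch of max_workers rather than by the sum over all prompts.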

floatingbigcat avatar Aug 05 '25 12:08 floatingbigcat

Same question here; I had to rebuild a new pipeline to work around it.

bitkira avatar Sep 30 '25 13:09 bitkira