Add stream_options support according to the OpenAI API
This PR adds the stream_options and include_usage fields, as defined in the OpenAI API reference, to the server.
include_usage defaults to false; when enabled, an additional final chunk is returned during streaming that carries the usage information.
This addresses the question raised in #1461 and the request in #1498.
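For illustration, a streaming request that opts into usage reporting could look roughly like this (a minimal sketch assuming the openai Python client against a locally running server; base URL, API key, and model name are placeholders):

```python
# Minimal sketch: stream a chat completion and read the usage from the extra
# final chunk. Base URL, api_key, and model name below are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

stream = client.chat.completions.create(
    model="local-model",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
    stream_options={"include_usage": True},
)

for chunk in stream:
    if chunk.choices:
        print(chunk.choices[0].delta.content or "", end="")
    if chunk.usage is not None:
        # Only the additional last chunk carries the token counts.
        print("\n", chunk.usage)
```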
I decided to put the chunk generation code into its own function, since not doing so would have led to bloated, duplicated code.
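Just to sketch the idea (names and exact fields here are illustrative, not copied verbatim from the diff), the helper builds the extra usage-only chunk along these lines:

```python
# Illustrative sketch of a usage-only final chunk, following the OpenAI
# chat.completion.chunk schema; the actual helper in the diff may differ.
def make_usage_chunk(
    id_: str, model: str, created: int,
    prompt_tokens: int, completion_tokens: int,
) -> dict:
    return {
        "id": id_,
        "object": "chat.completion.chunk",
        "created": created,
        "model": model,
        "choices": [],  # the usage-only chunk carries no delta content
        "usage": {
            "prompt_tokens": prompt_tokens,
            "completion_tokens": completion_tokens,
            "total_tokens": prompt_tokens + completion_tokens,
        },
    }
```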
I did not update the places where stream is clearly set to false.
I'm happy to update the PR with any necessary/requested modifications with respect to code style, or if you think all chunk generation places should be changed to use functions (or more options added / the chunk generation function updated). Just let me know.
Closes #1498
@tpfau thank you for starting on this. I'll review in more depth, but my initial request would be that instead of stream_include_usage we just add stream_options directly to the methods in the Llama class, as they should also mirror the OpenAI API.
@abetlen I hope this change is what you had in mind.
I'm admittedly unsure how to properly add the description of the include_usage field to the API docs (for now I tried to stick to how it was done for the completion request specs).
I also wanted to avoid having to add the stream-options check in every single function, so I kept it in the "main" functionalities and left the addition of the stream_include_usage parameter to the conversion functions.
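To show the intent, usage through the Llama class would then look roughly like this (a sketch only; the exact signature/typing may differ from the diff, and the model path is a placeholder):

```python
# Sketch of the high-level API with stream_options passed through; the model
# path is a placeholder and the exact parameter typing may differ from the PR.
from llama_cpp import Llama

llm = Llama(model_path="./model.gguf")

for chunk in llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hi"}],
    stream=True,
    stream_options={"include_usage": True},
):
    if chunk.get("usage") is not None:
        print(chunk["usage"])  # the extra final chunk with the token counts
```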
I would really like to see this in production, as I think this feature can be quite important for applications that need some kind of usage control or simply want to inform their users about usage. People can of course work around it, but that requires extra code in their own code base that re-does tokenization, and thus computation, which is completely unnecessary since the counts are already calculated by the model.
@abetlen Any news on this?
Updated again to fix conflicts.
Would be great to see this feature in the codebase. If there is anything missing, please let me know.
Ran some tests again and it seems to be working. Please let me know if there are any further changes you would like included.