Fix bug: `usage.prompt_tokens_details=None`
- [x] I understand that this repository is auto-generated and my pull request may not be merged
## Changes being requested

### Problem
When Gemini API responses omit `prompt_tokens_details` or `completion_tokens_details` from the `usage` object, the OpenAI Python library sets those fields to `None` instead of nested objects whose fields default to `None`:

```python
'prompt_tokens_details': None  # ❗️Wrong
```
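For context, here is a minimal repro of the old behaviour. This snippet is illustrative and not part of the PR; it simply validates a Gemini-style usage payload that omits the nested detail objects:

```python
from openai.types.completion_usage import CompletionUsage

# A Gemini-style usage payload with no nested detail objects.
payload = {"prompt_tokens": 10, "completion_tokens": 5, "total_tokens": 15}

usage = CompletionUsage.model_validate(payload)
print(usage.prompt_tokens_details)  # None before this fix
# Accessing nested fields then fails:
#   usage.prompt_tokens_details.cached_tokens  ->  AttributeError
```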
### Solution
Added a preprocessing step in `src/openai/_models.py` so that missing fields are initialized as empty dicts before Pydantic validation. The new `_preprocess_completion_usage()` function:
- Detects the `CompletionUsage` type
- Ensures `prompt_tokens_details` and `completion_tokens_details` are `{}` if missing or `None`
- Lets the empty dicts cause Pydantic to construct the nested models with default `None` values
```python
'prompt_tokens_details': {'audio_tokens': None, 'cached_tokens': None}  # ✅ Correct
```
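For reference, a minimal sketch of what such a preprocessing hook could look like. The function name `_preprocess_completion_usage()` comes from this PR, but the body below is only an illustration and may differ from the actual change in `src/openai/_models.py`:

```python
from typing import Any

# Sketch only; the real implementation in src/openai/_models.py may differ.
_USAGE_DETAIL_FIELDS = ("prompt_tokens_details", "completion_tokens_details")


def _preprocess_completion_usage(data: Any) -> Any:
    """Ensure nested usage detail fields exist before Pydantic validation.

    If the response omits the detail objects (or sends them as null),
    replace them with empty dicts so Pydantic builds the nested models
    with their default None-valued fields.
    """
    if not isinstance(data, dict):
        return data

    for field in _USAGE_DETAIL_FIELDS:
        if data.get(field) is None:
            data[field] = {}

    return data
```

With the empty-dict defaults in place, validating the same Gemini-style payload yields `prompt_tokens_details` populated with `audio_tokens=None` and `cached_tokens=None`, matching the output shown above.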
## Additional context & links
After these code changes, the demo below shows `audio_tokens` and `cached_tokens` becoming available (as `None` values) instead of `"prompt_tokens_details": None`:
https://github.com/user-attachments/assets/4ae76e25-3eec-4230-9398-cbdd74b24c32
To test out these changes, use this file.
Fixes #2544