Gunand3043

35 comments by Gunand3043

Hey, please note that in the Gemini API, context caching is now enabled by default for batch requests; it is not something you can configure manually. For more details, please...

Hey @iakolzin, the issue has been escalated to the engineering team. Just curious: do you still hear the static noise even when using noise-canceling headphones?

I suggest trying our advanced 2.5-pro model, as it appears to resolve the issue. I just tested it on my end, and it worked as expected.
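If it helps, here is a minimal sketch of the switch, assuming the `google-genai` SDK and an API key available in the environment:

```python
from google import genai

client = genai.Client()  # picks up the API key from the environment

# Same request as before; only the model name changes to 2.5 Pro.
response = client.models.generate_content(
    model="gemini-2.5-pro",
    contents="Your original prompt here",
)
print(response.text)
```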

@hedihadi For a tuned model, JSON mode is not supported, and you also cannot pass a system instruction. Please refer to [this link](https://ai.google.dev/gemini-api/docs/model-tuning/tutorial?lang=python) for the current limitations of tuned models.
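For illustration, a plain-text call is the supported path for a tuned model. This is only a sketch, and the `tunedModels/...` name below is a placeholder, not a real model:

```python
from google import genai

client = genai.Client()

# Tuned models currently take plain prompts only: no JSON mode
# (response_mime_type="application/json") and no system_instruction.
response = client.models.generate_content(
    model="tunedModels/example-tuned-model",  # placeholder tuned-model name
    contents="Classify the sentiment of: 'Great battery life!'",
)
print(response.text)
```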

Hey @mhyeonsoo, I just checked, and it looks like the problem has been fixed. We had a brief internal issue that seems to be resolved. Let me know if...

Which version of the genai SDK are you using? Could you try updating to the latest `google-genai` version and see if that resolves the issue?
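For example, you can check the installed version like this (a quick sketch; upgrade with `pip install -U google-genai` if it is behind the latest release):

```python
from importlib.metadata import version

# Prints the installed google-genai distribution version, e.g. "1.x.y".
print(version("google-genai"))
```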

A similar error was discussed in the [AI forum](https://discuss.ai.google.dev/t/new-live-api-features-dont-work-through-the-api-proactive-audio-and-affective-dialong/84326/10). Could you try updating your API version to v1alpha and see if that resolves it?

```python
from google import genai

client = genai.Client(http_options={"api_version": "v1alpha"})
```

Hey, are you still facing the issue? We’re not able to reproduce it on our end. Do you mind sharing the full code?

@MarkDaoust, there is an issue with passing a text prompt schema and a model configuration schema at the same time. Please find the [gist](https://colab.sandbox.google.com/gist/Gunand3043/29aee8bcc292b2e67d6f0a822ff5c774/structured_output.ipynb#scrollTo=EZryq6R-gl9a) here for reference. Thanks!
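For context, the config-side schema on its own looks roughly like this (a sketch assuming the `google-genai` SDK; the issue appears when a schema is also embedded in the text prompt):

```python
from pydantic import BaseModel
from google import genai

class Recipe(BaseModel):
    name: str
    ingredients: list[str]

client = genai.Client()

# Schema passed only through the model configuration, not in the prompt text.
response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents="Give me a simple cookie recipe.",
    config={
        "response_mime_type": "application/json",
        "response_schema": Recipe,
    },
)
print(response.text)
```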

Hi, I just tested it using the 2.5 model, and it seems to be working fine. For your reference, I am attaching a [Colab gist](https://colab.sandbox.google.com/gist/Gunand3043/d137cea649d71805f6c0b890951fcc3c/inline_video-_processing.ipynb). Let me know if...
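Roughly, the gist does inline video processing along these lines (a sketch assuming the `google-genai` SDK; the filename and model name below are just examples):

```python
from google import genai
from google.genai import types

client = genai.Client()

# Read a short clip and pass it inline (suitable for small files, roughly under 20 MB).
with open("clip.mp4", "rb") as f:  # placeholder filename
    video_bytes = f.read()

response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents=[
        types.Part.from_bytes(data=video_bytes, mime_type="video/mp4"),
        "Summarize this video.",
    ],
)
print(response.text)
```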