Bokai Xu
Don't use fp16; with bf16 the problem goes away. @lukeewin
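For anyone hitting the same issue, here is a minimal sketch of loading in bf16 with Hugging Face `transformers`; the model id is a placeholder, not necessarily the checkpoint discussed here:

```python
import torch
from transformers import AutoModel

# bf16 keeps the fp32 exponent range, so it avoids the overflow/NaN
# issues that fp16 can hit during inference or training.
model = AutoModel.from_pretrained(
    "your-org/your-model",        # placeholder model id
    torch_dtype=torch.bfloat16,   # use bf16 instead of fp16
    trust_remote_code=True,
)
```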
@Oxi84 Thanks! That is cool. Mine is still 0.8 after 1k steps; my batch size=128, lora-r=8, base_model=llama-7b, and I used int8 training. :D
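For context, a minimal sketch of this kind of setup (int8 base model, LoRA r=8) using `peft` and `bitsandbytes`; the target modules and other hyperparameters below are assumptions, not necessarily the ones used in the run above:

```python
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# Load the 7B base model with 8-bit weights via bitsandbytes.
model = AutoModelForCausalLM.from_pretrained(
    "huggyllama/llama-7b",        # placeholder llama-7b checkpoint
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
    device_map="auto",
)
# Prepare the quantized model for training (casts norms, enables input grads).
model = prepare_model_for_kbit_training(model)

# LoRA with rank 8; attention-projection target modules are an assumption.
lora_cfg = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()
```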
@1c7 Exactly!
Also interested in the training data composition, as it was not mentioned in the technical report :)
Hi, you can use `http://39.101.77.220/query/search/?start=10&size=10&keyword=hello` to fetch the desired data.
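For example, the query above can be issued like this (assuming the endpoint returns JSON, which is an assumption on my side):

```python
import requests

# start/size control pagination, keyword is the search query.
resp = requests.get(
    "http://39.101.77.220/query/search/",
    params={"start": 10, "size": 10, "keyword": "hello"},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # assumes a JSON response body
```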
Hi! Thank you for using MiniCPM-o 2.6. I checked the code on Hugging Face; if I understand your question correctly, `streaming_generate=True` means audio will be generated, and `streaming_generate=False` will not generate audio....
Hello biraj! Thank you for your feedback, which is very important. The audio generation latency involves two concepts:
- initial latency, the delay between the end of the user question and the...
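For reference, a rough way to measure the initial latency on the client side is to timestamp the end of the user question and the arrival of the first audio chunk. The streaming iterator below is a placeholder, not the MiniCPM-o API:

```python
import time

def measure_initial_latency(audio_chunks):
    """audio_chunks: any iterator yielding streamed audio chunks (placeholder).

    Returns the delay between submitting the question and the first audio chunk.
    """
    t_question_end = time.perf_counter()   # end of the user question
    for _ in audio_chunks:                 # blocks until the first chunk arrives
        return time.perf_counter() - t_question_end
    return None                            # the stream produced no audio
```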
Yes absolutely you can, but you should consider using `omni` mode. You can refer to this code: https://github.com/OpenBMB/MiniCPM-o?tab=readme-ov-file#multimodal-live-streaming

```python
import math
import numpy as np
from PIL import Image
from ...
```
> hey [@bokesyo](https://github.com/bokesyo), i've a question wrt to your [previous comment](https://github.com/OpenBMB/MiniCPM-o/issues/848#issuecomment-2671552404).
>
> _note: we're not using vLLM. we're using transformers._
>
> 1. do we have to use ``...
Hello, may I ask what inference code was used for the seedtts test-zh SIM-o 42 result here? Thanks!