HyacinthJingjing

Results: 11 comments of HyacinthJingjing

Hello, I have a question about the pretrained DCCRN model: https://huggingface.co/Johnson-Lsx/Shaoxiong_Lin_dns_ins20_enh_enh_train_enh_dccrn_raw. When I use the pretrained model to enhance my test wav, I noticed that the magnitude of the enhanced wav...

Thank you for your reply. I would like to try this method. Can I train the model with multiple criteria, SI-SNR and MSE loss? Can this method mitigate the issue?...
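For reference, a minimal sketch of combining the two criteria as a weighted sum, assuming a plain PyTorch training loop; the weight `alpha` and the function names are illustrative, not ESPnet config keys.

```python
import torch


def si_snr_loss(est: torch.Tensor, ref: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Negative scale-invariant SNR, averaged over the batch (shapes: [B, T])."""
    # Zero-mean both signals before the projection.
    est = est - est.mean(dim=-1, keepdim=True)
    ref = ref - ref.mean(dim=-1, keepdim=True)
    # Project the estimate onto the reference signal.
    proj = (torch.sum(est * ref, dim=-1, keepdim=True) * ref
            / (torch.sum(ref ** 2, dim=-1, keepdim=True) + eps))
    noise = est - proj
    si_snr = 10 * torch.log10(
        (torch.sum(proj ** 2, dim=-1) + eps) / (torch.sum(noise ** 2, dim=-1) + eps)
    )
    return -si_snr.mean()


def combined_loss(est: torch.Tensor, ref: torch.Tensor, alpha: float = 0.5) -> torch.Tensor:
    """Weighted sum of SI-SNR and time-domain MSE; alpha is a tunable weight."""
    mse = torch.nn.functional.mse_loss(est, ref)
    return alpha * si_snr_loss(est, ref) + (1 - alpha) * mse
```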

Hi @Emrys365, I have tried retraining the DCCRN model with the SNR loss, but the results did not meet expectations. Here is the config file. ![Noise-reduction evaluation](https://github.com/espnet/espnet/assets/17040205/83deaf9e-c469-4458-8dca-da22b5232e35) My expectation was...

Yes, I have already tried that. Thank you. What situation is this scale-normalization designed for? @Emrys365
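For context: because SI-SNR is invariant to the output scale, the raw enhanced waveform from such a model can come out at an arbitrary magnitude, and a post-hoc rescaling step maps it back to a sensible level. Below is a minimal sketch of that idea, rescaling to the mixture's RMS; the function name and the choice of reference signal are assumptions, not ESPnet's exact implementation.

```python
import torch


def rescale_to_input(enhanced: torch.Tensor, mixture: torch.Tensor,
                     eps: float = 1e-8) -> torch.Tensor:
    """Rescale the enhanced signal so its RMS matches the input mixture's RMS.

    Useful when the training loss (e.g. SI-SNR) ignores output scale,
    so the network output magnitude is otherwise arbitrary.
    """
    enh_rms = enhanced.pow(2).mean(dim=-1, keepdim=True).sqrt()
    mix_rms = mixture.pow(2).mean(dim=-1, keepdim=True).sqrt()
    return enhanced * (mix_rms / (enh_rms + eps))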

Thank you for your reply. When I increased warmup_step to 4000, training seems to be normal. I have another question: in order to get the probabilities of each word in...
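The question is truncated, but a common way to read per-token probabilities from decoder logits is sketched below; the shapes and names are illustrative only.

```python
import torch


def token_probabilities(logits: torch.Tensor, token_ids: torch.Tensor) -> torch.Tensor:
    """Return the probability the model assigns to each target token.

    logits:    [batch, length, vocab] raw decoder outputs
    token_ids: [batch, length] target token indices
    """
    log_probs = torch.log_softmax(logits, dim=-1)
    # Pick the log-probability of each reference token, then exponentiate.
    picked = log_probs.gather(-1, token_ids.unsqueeze(-1)).squeeze(-1)
    return picked.exp()
```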

When fine-tuning Qwen2-audio, is it possible to use LoRA training that trains only the audio-encoder part? How should it be configured to achieve this? @Jintao-Huang
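A hedged sketch of one way to do this with HuggingFace PEFT (ms-swift wraps similar machinery); the module names in the regex (`audio_tower`, `q_proj`, ...) are assumptions that should be checked against `model.named_modules()` for the checkpoint in use.

```python
import torch
from peft import LoraConfig, get_peft_model
from transformers import Qwen2AudioForConditionalGeneration

model = Qwen2AudioForConditionalGeneration.from_pretrained(
    "Qwen/Qwen2-Audio-7B-Instruct", torch_dtype=torch.bfloat16
)

# Restrict LoRA to linear layers inside the audio encoder. "audio_tower" and the
# projection names below are assumed from the published modeling code; adjust
# the regex if your transformers version names the modules differently.
lora_cfg = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=r"audio_tower\..*\.(q_proj|k_proj|v_proj|out_proj|fc1|fc2)",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # should list only audio-encoder LoRA weights
```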

> Thanks, I will look into this part soon.
>
> There is a standard way of handling this on the LLM MoE side; the rough idea is to use one-hot instead of a for loop.

@Mddct Regarding how to convert an MoE model to ONNX, could you provide some reference articles? Looking forward to your reply.
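On the one-hot idea from the quote: a data-dependent Python loop over experts traces poorly, so the export-friendly form runs every expert densely and selects outputs with a one-hot routing mask. A minimal top-1 sketch, not any particular model's export code:

```python
import torch
import torch.nn as nn


class DenseMoE(nn.Module):
    """Top-1 MoE layer written without a Python for-loop over experts,
    so torch.onnx.export can trace it as static tensor ops."""

    def __init__(self, dim: int, num_experts: int, hidden: int):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)
        self.w_in = nn.Parameter(torch.randn(num_experts, dim, hidden) * 0.02)
        self.w_out = nn.Parameter(torch.randn(num_experts, hidden, dim) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: [tokens, dim]
        logits = self.router(x)                           # [tokens, E]
        top1 = logits.softmax(dim=-1).argmax(dim=-1)      # [tokens]
        # One-hot mask replaces "for expert in experts: ..." routing.
        mask = torch.nn.functional.one_hot(top1, logits.size(-1)).to(x.dtype)
        # Run every expert on every token (dense), then select with the mask.
        h = torch.einsum("td,edh->teh", x, self.w_in).relu()   # [tokens, E, hidden]
        y = torch.einsum("teh,ehd->ted", h, self.w_out)        # [tokens, E, dim]
        return (mask.unsqueeze(-1) * y).sum(dim=1)             # [tokens, dim]


# Export sketch:
# torch.onnx.export(DenseMoE(256, 4, 512).eval(), torch.randn(8, 256), "moe.onnx")
```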

I am also using this version of the image, with the service started by run_server_2pass.sh. The launch command is as follows:
/workspace/FunASR/runtime/websocket/build/bin/funasr-wss-server-2pass --download-model-dir /workspace/models --model-dir damo/speech_paraformer-large-vad-punc_asr_nat-zh-cn-16k-common-vocab8404-onnx --online-model-dir damo/speech_paraformer-large_asr_nat-zh-cn-16k-common-vocab8404-online-onnx --vad-dir damo/speech_fsmn_vad_zh-cn-16k-common-onnx --punc-dir damo/punc_ct-transformer_zh-cn-common-vad_realtime-vocab272727-onnx --itn-dir thuduj12/fst_itn_zh --lm-dir damo/speech_ngram_lm_zh-cn-ai-wesp-fst --decoder-thread-num 48 --model-thread-num 1 --io-thread-num 3 --port 10097 --certfile /workspace/FunASR/runtime/ssl_key/server.crt --keyfile /workspace/FunASR/runtime/ssl_key/server.key --hotword /workspace/FunASR/runtime/websocket/hotwords.txt...
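A minimal connectivity check for the server launched above, assuming the port 10097 from the command and a self-signed certificate; it only exercises the TLS/WebSocket handshake, not the FunASR decoding protocol.

```python
# Quick check that the 2-pass server accepts WSS connections; use the official
# FunASR client scripts for real decoding.
import asyncio
import ssl

import websockets


async def main() -> None:
    ctx = ssl.create_default_context()
    # The server uses a self-signed certificate, so skip verification here.
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    async with websockets.connect("wss://127.0.0.1:10097", ssl=ctx):
        print("handshake ok")


asyncio.run(main())
```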

Have you resolved this issue? I am encountering a similar problem.