Mohammad Sharara
The issue with the preprocessed data goes further: if we want to adopt this dataset for training a model, and then want to predict on a new...
 As you may notice, all numerical features are between 0 and 1. To my knowledge, this is called normalization. A standardized distribution does not necessarily yield values between 0...
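To illustrate the point, here is a minimal sketch of min-max normalization and its inverse; the feature values and the min/max statistics below are made up for illustration, not taken from the dataset:

```python
import numpy as np

# Hypothetical feature column, already min-max normalized to [0, 1].
x_norm = np.array([0.0, 0.25, 0.5, 1.0])

# Assumed per-feature statistics the dataset authors would need to share.
x_min, x_max = 10.0, 50.0

# Min-max normalization maps x to (x - min) / (max - min).
# Knowing min and max lets any user invert it and recover raw values:
x_original = x_norm * (x_max - x_min) + x_min
print(x_original)  # [10. 20. 30. 50.]
```

Without the stored min and max, the inverse transform is impossible, which is exactly why sharing them matters.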
Or at least share the minimum and maximum of each feature in the dataset, for complete reproducibility.
Sorry for being persistent, but when you publicly release a dataset, your aim is surely for people to benefit from it. If your released dataset is normalized (using...
Thanks for sharing. But there is a win-win solution for both of us: please just share the minimum and the maximum for each numerical feature. That way, no user...
This only worked when using an offline model with llama.cpp. The config.json file became: ```json { "models": [ { "title": "Qwen 2.5 Coder 7b", "model": "qwen-2.5-coder-instruct-7b", "provider": "llama.cpp", "apiBase": "http://192.168.120.243:8078"...
I tried it with and without /v1; same result. No, the issue is not in llama.cpp: the llama.cpp provider worked, but Ollama didn't. I have Ollama installed on a server,...