UltraChat
Any plans to support local deployment on MacBook?
Thanks for your work! I find that the inference script at UltraChat/UltraLM/inference_cli.py is still a vanilla one. Do you plan to provide deployment scripts for low-resource devices such as a MacBook?
Hi, thanks for your question. Yes, supporting local deployment on low-resource devices is definitely on our to-do list. We are currently focusing on training and releasing larger and better models, though. Later, we will explore techniques for low-resource LLM deployment.
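In the meantime, here is a minimal, unofficial sketch of one common workaround: since UltraLM is LLaMA-based, you can try loading it in fp16 on Apple Silicon via PyTorch's MPS backend. The checkpoint path and prompt format below are placeholders, not the repo's official ones; adjust them to your local setup.

```python
# Unofficial sketch: fp16 inference on Apple Silicon via PyTorch MPS.
# "path/to/ultralm" is a placeholder for a local checkpoint directory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Fall back to CPU if the MPS backend is unavailable.
device = "mps" if torch.backends.mps.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained("path/to/ultralm")
model = AutoModelForCausalLM.from_pretrained(
    "path/to/ultralm",
    torch_dtype=torch.float16,   # halves memory vs. fp32
    low_cpu_mem_usage=True,      # avoid a full fp32 copy while loading
).to(device)

# Placeholder prompt format; check the repo for the expected template.
prompt = "User: What is UltraChat?\nAssistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

If fp16 still exceeds your Mac's memory, converting the weights to a quantized GGUF file and running them with llama.cpp is another widely used route for LLaMA-family models.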