Can I use this with Colab?
Got the same question :)
I tried this, but the involvement of conda makes it hard to use on Colab, and the limited runtime means you have to rerun all the setup every time Colab disconnects, which is a pain.
I used the standard mode, and the models take up to 200 GB of storage. I guess it is not possible.
I think the answer is yes: you can use lite.yaml as your config, which runs the models through online inference endpoints, so you do not need to deploy or download any model. Copy everything in awesome_chat.py into a code cell and edit lite.yaml yourself. Then embed your question in the code, or put the question in a file. Finally, simply adapt the CLI function and use it as your main code.
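For what it's worth, here is a rough sketch of those steps as a Colab cell. This assumes the microsoft/JARVIS repository layout and that the `--config`/`--mode` flags in your checkout match the README; treat the paths and flags as assumptions and check them against your version of the repo:

```shell
# Sketch of a Colab setup (assumptions: repo URL, server/ layout, and the
# --config / --mode flags match the version of the repository you clone).
git clone https://github.com/microsoft/JARVIS.git
cd JARVIS/server
pip install -r requirements.txt

# Put your OpenAI and Hugging Face tokens into the lite config before running.
# Lite mode uses hosted inference endpoints, so no local model downloads.
python awesome_chat.py --config configs/config.lite.yaml --mode cli
```

Since Colab wipes the VM on disconnect, keeping these lines in a single setup cell at least makes the re-setup a one-click affair.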