Replace the default adapter with an adapter that uses node-llama-cpp
Unfortunately, llama-node hasn't seen updates in quite some time, while node-llama-cpp appears to be under active development. It would be nice to update the default adapter to use node-llama-cpp instead of llama-node.
That should be relatively straightforward, but I may need some help, since I have limited availability for bigger changes to this project at the moment.
It seems like node-llama-cpp doesn't support embeddings yet. How should we handle this?
Yes, and that's okay. It will eventually support them:
https://github.com/withcatai/node-llama-cpp/issues/123
It looks like it will land in the short term:
https://github.com/withcatai/node-llama-cpp/pull/128#issuecomment-1867641654
So for now the adapter's embedding method could just return an empty array or a randomized array.
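A minimal sketch of such a stub, assuming the adapter just needs to return a `number[]` of a fixed size (the `placeholderEmbedding` name and the `dimensions` parameter are made up here; use whatever vector size the rest of the pipeline expects):

```typescript
// Temporary stand-in until node-llama-cpp ships embedding support
// (see the linked issue/PR above). Returns a randomized vector, as
// suggested; swap in the real implementation once support lands.
function placeholderEmbedding(dimensions: number): number[] {
  return Array.from({ length: dimensions }, () => Math.random());
}
```

Returning a randomized vector rather than an empty array keeps downstream code that assumes a fixed embedding size from breaking, though any similarity scores computed from it will of course be meaningless.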