PriNova

Results: 40 comments of PriNova

Thank you for your suggestions and review. I will have a look at this issue and commit changes ASAP.

I fixed the leak with the help of LeakCanary 2.8.1, and it reported no leaks anymore. I chose lambdas in Kotlin to avoid interfaces and extra object overhead...
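To illustrate that trade-off, here is a minimal Kotlin sketch (class and callback names are hypothetical, not taken from the actual change): a lambda-typed parameter can stand in for a single-method listener interface, so callers do not need to declare or instantiate an extra listener type.

```kotlin
// Interface-based callback: each call site typically allocates an anonymous object.
interface OnResultListener {
    fun onResult(value: Int)
}

class InterfaceBasedLoader {
    fun load(listener: OnResultListener) {
        listener.onResult(42) // placeholder result
    }
}

// Lambda-based callback: the parameter is just a function type.
class LambdaBasedLoader {
    fun load(onResult: (Int) -> Unit) {
        onResult(42) // placeholder result
    }
}

fun main() {
    InterfaceBasedLoader().load(object : OnResultListener {
        override fun onResult(value: Int) = println("interface: $value")
    })

    // No listener declaration or anonymous-object syntax needed at the call site.
    LambdaBasedLoader().load { value -> println("lambda: $value") }
}
```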

BTW: for events, you can use kotlinx.coroutines BroadcastChannels and Flows and make them lifecycle- and LiveData-aware with viewModelScope as the coroutine context. A separate Event API would then be obsolete.
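A minimal sketch of that pattern, assuming a standard AndroidX/kotlinx.coroutines setup (all class and event names here are hypothetical; it uses a Channel exposed as a Flow, a close relative of the BroadcastChannel mentioned above, and asLiveData() to make it LiveData-aware):

```kotlin
import androidx.lifecycle.ViewModel
import androidx.lifecycle.asLiveData
import androidx.lifecycle.viewModelScope
import kotlinx.coroutines.channels.Channel
import kotlinx.coroutines.flow.receiveAsFlow
import kotlinx.coroutines.launch

// One-shot UI events (names are illustrative only).
sealed class UiEvent {
    data class ShowMessage(val text: String) : UiEvent()
}

class EventsViewModel : ViewModel() {
    // The channel buffers events until a collector is active.
    private val eventChannel = Channel<UiEvent>(Channel.BUFFERED)

    // Expose the channel as a Flow; asLiveData() makes it observable
    // as lifecycle-aware LiveData from the UI layer.
    val events = eventChannel.receiveAsFlow().asLiveData()

    fun onSomethingHappened() {
        // viewModelScope ties the send to the ViewModel's lifetime.
        viewModelScope.launch {
            eventChannel.send(UiEvent.ShowMessage("Done"))
        }
    }
}
```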

Integrated a ViewModel to handle configuration changes, added a Snackbar to notify users of actions, and refactored other logic into a module.
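A rough sketch of how a ViewModel retains state across a configuration change and surfaces a Snackbar, assuming a standard AndroidX/Material project (class names are hypothetical and not the ones from this PR):

```kotlin
import android.os.Bundle
import androidx.activity.viewModels
import androidx.appcompat.app.AppCompatActivity
import androidx.lifecycle.MutableLiveData
import androidx.lifecycle.ViewModel
import com.google.android.material.snackbar.Snackbar

// Hypothetical state holder: survives rotation because the instance
// lives in the Activity's ViewModelStore, not in the Activity itself.
class CounterViewModel : ViewModel() {
    val count = MutableLiveData(0)

    fun increment() {
        count.value = (count.value ?: 0) + 1
    }
}

class MainActivity : AppCompatActivity() {
    // Re-delivers the same ViewModel instance after a configuration change.
    private val viewModel: CounterViewModel by viewModels()

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Layout setup omitted for brevity.

        viewModel.count.observe(this) { count ->
            // Snackbar to surface the user action, as described in the PR summary.
            Snackbar.make(
                findViewById(android.R.id.content),
                "Count: $count",
                Snackbar.LENGTH_SHORT
            ).show()
        }
    }
}
```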

Thank you for approving this PR, @Kryslynn23. There have been so many changes in the nearly one year since opening the PR that there are a lot of breaking changes which need...

I am closing this PR because the project does not seem to be maintained anymore.

> If you don't have access to the original LLaMA files I think someone uploaded it here https://huggingface.co/decapoda-research/llama-7b-hf/blob/main/tokenizer.model

This works like a charm. It would be amazing to create a PR for...

> @eiz It seems there is a problem with the alpaca 13B: after conversion, when loading, it complains about the embedding size:
>
> ```
> main: seed = 1679320340...
> ```

Since the PR, the model (13B-Q4, SentencePiece) behaves strangely. With the `--ins` flag I got this conversation:

![Screenshot 2023-03-20 181011](https://user-images.githubusercontent.com/31413214/226429546-832a3d29-08f9-4b22-a640-474a9f7a4ada.png)

And with the `-i` flag it behaves like this: ![Screenshot...

I extracted the vocabulary from a (pre-PR, reformatted) model as JSON and from the tokenizer.model file from the original LLaMA source for comparison. My observation is that both are equal...
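For illustration, a minimal sketch of such a comparison, assuming both vocabularies have already been dumped to plain-text files with one token per line (the file names below are placeholders; the actual comparison above was between a JSON dump and the SentencePiece tokenizer.model):

```kotlin
import java.io.File

// Hypothetical helper: reports size differences and the first few
// token mismatches between two vocabulary dumps.
fun compareVocabs(pathA: String, pathB: String) {
    val a = File(pathA).readLines()
    val b = File(pathB).readLines()

    if (a.size != b.size) {
        println("Vocab sizes differ: ${a.size} vs ${b.size}")
    }

    val mismatches = a.zip(b)
        .withIndex()
        .filter { (_, pair) -> pair.first != pair.second }

    if (mismatches.isEmpty()) {
        println("Vocabularies are equal (${a.size} tokens).")
    } else {
        mismatches.take(10).forEach { (i, pair) ->
            println("Token $i differs: '${pair.first}' vs '${pair.second}'")
        }
    }
}

fun main() {
    // Placeholder paths for the two dumped vocabularies.
    compareVocabs("vocab_from_model.txt", "vocab_from_tokenizer.txt")
}
```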