2 comments of qmpham
But LLaMA has a maximum input length of only 2048 tokens
Yes, I understand, but why are you interested in a long input when the model's capacity is only 2048 tokens? You risk truncating the question that the target answer addresses.
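The truncation risk described above can be sketched as follows. This is a minimal, hypothetical helper (not code from the thread): when the combined prompt exceeds the model's context window (e.g. 2048 tokens for LLaMA), tokens are dropped from the left so the question at the end of the prompt survives. Token ids here are plain integers for illustration; a real pipeline would come from a tokenizer.

```python
MAX_LEN = 2048  # assumed context window, matching LLaMA's 2048-token limit

def truncate_keep_question(context_ids, question_ids, max_len=MAX_LEN):
    """Drop the oldest context tokens so context + question fits in max_len.

    Truncating from the LEFT keeps the question intact; naive right
    truncation would cut off exactly the part the target addresses.
    """
    budget = max_len - len(question_ids)
    if budget < 0:
        raise ValueError("question alone exceeds the context window")
    if budget == 0:
        return list(question_ids)
    return context_ids[-budget:] + list(question_ids)

# Toy example: 3000 context tokens plus a 3-token question.
context = list(range(3000))
question = [9001, 9002, 9003]
ids = truncate_keep_question(context, question)
assert len(ids) == MAX_LEN       # fits the window
assert ids[-3:] == question      # question survives truncation
```

The design choice is simply which end to cut: keeping the tail is the safer default for question-answering prompts, since the question is usually placed last.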