adirtha
@jerryjliu Thanks for the reply. I was following the [InsertDemo.ipynb](https://github.com/jerryjliu/gpt_index/blob/main/examples/paul_graham_essay/InsertDemo.ipynb) notebook which used the text_splitter before creating Document objects from the generated chunks. If wrapping the Document around the fetched...
Hi. Did you decide on the final values? Correct me if I am wrong, but won't setting a very high `memmap_threshold` value cause issues? Until a segment reaches 1,000,000,000 KB, qdrant...
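To make the unit concern concrete, here is a quick back-of-the-envelope check (the KB unit for the threshold is taken from the comment above; the arithmetic is just illustrative):

```python
# memmap_threshold is given in KB. A value of 1,000,000,000 KB works out
# to roughly a terabyte per segment, so with such a threshold qdrant would
# effectively never switch a segment over to memmap storage.
threshold_kb = 1_000_000_000
threshold_bytes = threshold_kb * 1024          # 1.024e12 bytes
threshold_tib = threshold_bytes / 1024 ** 4    # ~0.93 TiB
print(round(threshold_tib, 2))
```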
I figured out the solution to this and #10. To run the fine-tuned models, import `TransformerLanguageModelPrompt` from the `src.transformer_lm_prompt` script instead of using `TransformerLanguageModel` from `fairseq.models.transformer_lm`.
You will probably need to run the `preprocess_large.sh` script before running the `infer_large.sh` script.
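A minimal sketch of the import swap described above, assuming you run it from the BioGPT repo root so that `src` is importable (the `try`/`except` guard is only there so the snippet is safe to run outside the repo):

```python
# Fine-tuned checkpoints need the prompt-aware model class that ships in
# the repo's src/ folder, not the stock class from fairseq.
try:
    from src.transformer_lm_prompt import TransformerLanguageModelPrompt
except ImportError:
    # Not inside the BioGPT repo (or src/ is not on sys.path).
    TransformerLanguageModelPrompt = None
```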
Did you download and extract the trained checkpoint tgz file into the required directory? If not, you need to do these steps:
```
mkdir checkpoints
cd checkpoints
wget https://msramllasc.blob.core.windows.net/modelrelease/BioGPT/checkpoints/QA-PubMedQA-BioGPT-Large.tgz
tar...
```
Did you put the checkpoints directory inside the BioGPT directory? Because the paths it uses are relative, all the necessary directories have to be inside the BioGPT folder. From...
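Since the scripts resolve paths relative to the current working directory, the practical rule is to `cd` into the BioGPT folder before running them. A small stdlib illustration of why the location matters (the checkpoint directory name is the one from the download step above):

```python
from pathlib import Path

# A relative path like this only resolves to the right place when the
# current working directory is the BioGPT repo root.
rel = Path("checkpoints") / "QA-PubMedQA-BioGPT-Large"
resolved = Path.cwd() / rel  # what the scripts will actually try to open
print(resolved)
```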
Alright, thanks a lot. Is there a reason why this happens? And is there any way to prevent it from happening again?
Sure, I'll try it out and see if I get the same error again.
> Proposing a workaround in #1595 > > @AdirthaBorgohain are you still running heavy search or inserts after decreasing `indexing_threshold`? When I faced this issue, I wasn't even running any...