Search results: 25 issues opened by Xiao

### Question Validation - [X] I have searched both the documentation and discord for an answer. ### Question I run the following code ``` import nest_asyncio nest_asyncio.apply() import logging import...

question

### Question I run the [llm reranker](https://github.com/run-llama/llama_index/blob/v0.10.24/docs/docs/examples/node_postprocessor/LLMReranker-Lyft-10k.ipynb) and my error is ``` ValidationError Traceback...

question

### Question Currently, my code handles a single query. If I want to do batch...

question

### Question I use the llama2 paper dataset. My code snippet is below ```...

question

### Question I run the query rewrite with the following code. ``` """## Setup...

question

### Question Below is my code. ``` import torch # from transformers import...

question

### Question According to the [async doc](https://docs.llamaindex.ai/en/stable/examples/pipeline/query_pipeline_async/), async can give a 2x speedup. My...

question

### Question I deploy llamaindex as a web server and it will handle requests from...

question

I cloned it into my project and wanted to use it directly on Ubuntu. The first error was `msgpack.hpp: No such file or directory`; after `sudo apt install libmsgpack-dev`, the error became /rest_rpc/include/rest_rpc/codec.h:38:42: error: no matching function for call to ‘unpack(msgpack::v1::unpacked*, const char*&, size_t&)’ 38 | msgpack::unpack(&msg_, data, length); | ^

### Describe the issue I use the [Local-LLMs/](https://microsoft.github.io/autogen/blog/2023/07/14/Local-LLMs/) guide to deploy my local model, but the result returned by the LLM is weird ### Steps to reproduce ## Launch the local model In...

alt-models