llava-cpp-server
API call with python requests
Hello, I was able to successfully run the server using the repo. However, I am wondering how to perform an API call with the Python requests library. I was trying to do it like this:
```python
import requests

def _query():
    API_URL = 'http://localhost:8080/completion'
    image_path = ...
    user_prompt = ...
    with open(image_path, 'rb') as image_file:
        files = {'image_file': image_file}
        data = {'user_prompt': user_prompt}
        response = requests.post(API_URL, files=files, data=data)
    if response.status_code == 200:
        response_data = response.json()
        text = response_data['content']
        return text
    else:
        return f"Error: {response.status_code}"

response = _query()
print("Response:", response)
```
I always get a 404 error.
I haven't tested it in a while, but this repo should have examples for both this llava-cpp-server and llama.cpp (which I recommend switching to, since it is still being actively developed): https://github.com/trzy/multimodal-cloud-tests/blob/main/llava-cpp-test.py
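For reference, a 404 usually means the path in the URL does not match any route the server registers, so the first thing to check is which endpoint the server actually exposes. As a hedged sketch: older multimodal builds of the llama.cpp server accepted a JSON body on `/completion` with a `prompt` string and an `image_data` list of base64-encoded images. The exact field names and the `id` value below are assumptions based on that API, so verify them against the server version you run (the linked `llava-cpp-test.py` shows working request formats):

```python
import base64


def build_llava_payload(image_path, user_prompt):
    """Build a JSON payload for a llama.cpp-style multimodal /completion call.

    Assumption: the server accepts {"prompt": ..., "image_data": [...]}
    with base64-encoded image bytes; check your server's docs for the
    exact schema it expects.
    """
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")
    return {
        "prompt": user_prompt,
        # "id" lets the prompt reference the image (e.g. "[img-10]") in
        # some server versions; 10 here is an arbitrary example value.
        "image_data": [{"data": image_b64, "id": 10}],
    }


def query(api_url, image_path, user_prompt):
    # requests is imported lazily so payload building stays dependency-free.
    import requests

    response = requests.post(api_url, json=build_llava_payload(image_path, user_prompt))
    response.raise_for_status()  # surfaces 404s instead of silently continuing
    return response.json().get("content", "")
```

Sending JSON (`json=payload`) rather than a multipart upload (`files=...`) is the key difference from the snippet above; if the endpoint path is also wrong, adjust `api_url` to whatever route the server logs or README advertise.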