
docs: add proper docs about the change in how v0.1 fixes the class of issues with passing your own LLMs and Embeddings

hteeyeoh opened this issue 2 years ago · 8 comments

**Describe the bug**

I was trying to follow the steps here: https://docs.ragas.io/en/v0.0.22/howtos/customisations/llms.html to bring my own LLM (AzureOpenAI) into the ragas evaluation. I was interested in the following three metrics:

```python
result = evaluate(
    dataset=dataset,
    metrics=[
        # context_precision,
        # faithfulness,
        answer_relevancy,
    ],
)
```

I followed the steps to use the LangchainLLM wrapper to bring in my Azure endpoint:

```python
ragas_azure_model = LangchainLLM(llm)
faithfulness.llm = ragas_azure_model
answer_relevancy.llm = ragas_azure_model
context_precision.llm = ragas_azure_model
```

During evaluate, faithfulness and context_precision look fine and return results, but answer_relevancy keeps complaining that the OpenAI API key is not found. I am using the same endpoint as above, so why does only this specific metric run into the error? Are extra steps needed for the answer_relevancy metric?

Ragas version: 0.0.22
Python version: 3.10.13


hteeyeoh · Jan 24 '24

Seems to be related to the pip version; if you install from source, it works.

AugustDS · Jan 24 '24

> Seems to be related to the pip version; if you install from source, it works.

Thanks @AugustDS for your response. Does that mean I cannot simply pip install ragas, and instead have to build from source and install it myself?

hteeyeoh · Jan 24 '24

For me this worked:

```bash
pip uninstall ragas
```

Then, in the Python site-packages dir:

```bash
git clone https://github.com/explodinggradients/ragas && cd ragas
pip install -e .
```

AugustDS · Jan 24 '24
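Note: instead of cloning into site-packages, a simpler route may be a standard pip VCS install, e.g. `pip install git+https://github.com/explodinggradients/ragas.git` (not tried in this thread, but a standard pip feature).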

```dockerfile
RUN cd /usr/local/lib/python3.10/site-packages/ && git clone https://github.com/explodinggradients/ragas && cd ragas && pip install -e .
```

But when I start it, it complains:

```
from ragas.llms import LangchainLLM
ragas | ImportError: cannot import name 'LangchainLLM' from 'ragas.llms' (/usr/local/lib/python3.10/site-packages/ragas/src/ragas/llms/__init__.py)
```

hteeyeoh · Jan 24 '24

> RUN cd /usr/local/lib/python3.10/site-packages/ && git clone https://github.com/explodinggradients/ragas && cd ragas && pip install -e .
>
> ImportError: cannot import name 'LangchainLLM' from 'ragas.llms' (/usr/local/lib/python3.10/site-packages/ragas/src/ragas/llms/__init__.py)

@hteeyeoh It is now called `LangchainLLMWrapper`:

https://github.com/explodinggradients/ragas/blob/ff449fccfb7d594c582071694b00f4c2192c5465/src/ragas/llms/base.py#L66-L74

And the usage also differs from the pip version, since LangChain changed dramatically in version 0.1.0: it deprecated `langchain.chat_models`, which the ragas pip version uses, and replaced it with `from langchain_core.language_models import BaseLanguageModel`.

So I think the docs might need an update to catch up with these changes?

Lanture1064 · Jan 25 '24
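For reference, a minimal sketch of the renamed wrapper in use on the main branch; the Azure deployment values below are placeholders, and the exact constructor arguments should be checked against the linked source:

```python
from langchain.chat_models import AzureChatOpenAI
from ragas.llms import LangchainLLMWrapper
from ragas.metrics import answer_relevancy

# Placeholder Azure settings; substitute your own deployment values.
azure_llm = AzureChatOpenAI(
    deployment_name="your-deployment-name",
    openai_api_base="https://your-endpoint.openai.azure.com/",
    openai_api_version="2023-05-15",
    openai_api_key="your-key",
    openai_api_type="azure",
)

# Wrap the LangChain model so ragas metrics can consume it.
answer_relevancy.llm = LangchainLLMWrapper(azure_llm)
```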

OK, my mistake. I recompiled with the v0.0.22 tag, yet the issue still occurs:

```
ERROR: OpenAI API key not found! Seems like your trying to use Ragas metrics with OpenAI endpoints. Please set 'OPENAI_API_KEY' environment variable
```

I will try compiling the latest main branch code instead.

hteeyeoh · Jan 26 '24

From https://docs.ragas.io/en/v0.0.22/howtos/customisations/azure-openai.html, I observed the following:

```python
# init and change the embeddings
# only for answer_relevancy
azure_embeddings = AzureOpenAIEmbeddings(
    deployment="your-embeddings-deployment-name",
    model="your-embeddings-model-name",
    openai_api_base="https://your-endpoint.openai.azure.com/",
    openai_api_type="azure",
)

# embeddings can be used as it is
answer_relevancy.embeddings = azure_embeddings
```

Is it mandatory to provide embeddings when I want to measure the answer_relevancy metric? So far, without them, I am able to get the other metrics like context_precision and faithfulness; only answer_relevancy hits the error mentioned in the title.

hteeyeoh · Jan 26 '24
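For context: answer_relevancy does need an embeddings model in addition to an LLM, because it has the LLM generate questions from the answer and then scores by cosine similarity between the embedding of the original question and the generated ones; without an override it falls back to OpenAI embeddings, which explains the missing-key error. A rough sketch of that scoring step (not the exact ragas implementation):

```python
import numpy as np

def answer_relevancy_score(question_vec, generated_vecs):
    """Mean cosine similarity between the original question's embedding
    and embeddings of questions the LLM generated from the answer."""
    q = question_vec / np.linalg.norm(question_vec)
    sims = [float(np.dot(q, g / np.linalg.norm(g))) for g in generated_vecs]
    return float(np.mean(sims))
```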

Keeping this open, but will use this to track the documentation change required so that you folks can actually use the fix as easily as possible.

jjmachan · Feb 05 '24
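For completeness, the v0.1 change the title refers to lets you pass your own LLM and embeddings straight into `evaluate()`; a minimal sketch, assuming the v0.1 API and the placeholder Azure objects from the snippets above:

```python
from ragas import evaluate
from ragas.metrics import answer_relevancy, context_precision, faithfulness

# azure_llm and azure_embeddings are the LangChain objects built earlier;
# v0.1 wraps them internally, so no per-metric assignment is needed.
result = evaluate(
    dataset=dataset,
    metrics=[context_precision, faithfulness, answer_relevancy],
    llm=azure_llm,
    embeddings=azure_embeddings,
)
```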