
Cannot access the Azure OpenAI Service

julianhu1979 opened this issue 1 year ago • 6 comments

I updated the .env file with my own AOAI key and endpoint, and I changed the config to mode: online.

When I run the demo, I get the following error, even though my AOAI key and endpoint are correct:

Mutating Task Description.... Iterations completed: 0%| | 0/3 [00:00<?, ?it/s]
Error code: 401 - {'error': {'code': 'PermissionDenied', 'message': 'Principal does not have access to API/Operation.'}}
(the same 401 error is repeated six times)

julianhu1979 avatar Dec 26 '24 04:12 julianhu1979

Hi, you can check the following function for debugging https://github.com/microsoft/PromptWizard/blob/0d22cd25133b4a42db1a8a37b3c2ffa7598d5207/promptwizard/glue/common/llm/llm_mgr.py#L17
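As a quick sanity check before digging into that function, a minimal standalone script can confirm whether the key and endpoint work outside PromptWizard at all. This is only a sketch, assuming the `openai` Python package (>= 1.x), key-based auth, and env variable names matching the repo's .env; the api_version and deployment-name variable are illustrative:

```python
import os

def ping_azure_openai():
    # Key-based auth against Azure OpenAI; a 401 here means the problem
    # is with the key/endpoint/role assignment, not with PromptWizard.
    from openai import AzureOpenAI  # assumes openai >= 1.x

    client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-02-01",  # assumption: use a version your resource supports
    )
    response = client.chat.completions.create(
        # Azure expects the *deployment* name here, not the base model name.
        model=os.environ["AZURE_OPENAI_DEPLOYMENT_NAME"],
        messages=[{"role": "user", "content": "ping"}],
    )
    return response.choices[0].message.content
```

If this script also returns 401 PermissionDenied, the issue is on the Azure side (key, endpoint, or role assignment) rather than in the repo's code.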

raghav-2002-os avatar Dec 26 '24 07:12 raghav-2002-os

I find that there are only two types of LLM configurations:

Pure OpenAI: you cannot configure the API base URL; you can only provide the key and model.
Azure: you can only use Azure CLI authentication, and the key cannot be configured.

langcaiye avatar Dec 26 '24 08:12 langcaiye

How can I use a local OpenAI-compatible vLLM server?

datalee avatar Dec 26 '24 11:12 datalee

How can I config the Azure API KEY?

lizhiling01 avatar Dec 27 '24 03:12 lizhiling01

Hi, thanks for your comments. By changing the following function, any LLM resource can be called: https://github.com/microsoft/PromptWizard/blob/60504476f8231a7cb0fc3182101a86014e3b73ee/promptwizard/glue/common/llm/llm_mgr.py#L17
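For example, to answer the Azure-key question above: the `openai` package's AzureOpenAI client accepts an api_key directly, while the Azure-CLI path goes through a bearer-token provider from `azure-identity`. A sketch of both (illustrative, assuming `openai` >= 1.x and `azure-identity`; not the repo's actual code):

```python
import os

def azure_client_with_key():
    # Key-based auth: no `az login` required.
    from openai import AzureOpenAI
    return AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-02-01",  # assumption: adjust to your resource
    )

def azure_client_with_cli():
    # Azure CLI auth: tokens come from the identity used with `az login`.
    # A 401 PermissionDenied usually means that identity lacks a role such
    # as "Cognitive Services OpenAI User" on the Azure OpenAI resource.
    from openai import AzureOpenAI
    from azure.identity import AzureCliCredential, get_bearer_token_provider
    token_provider = get_bearer_token_provider(
        AzureCliCredential(), "https://cognitiveservices.azure.com/.default"
    )
    return AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        azure_ad_token_provider=token_provider,
        api_version="2024-02-01",
    )
```

Either client can then be used inside call_api in llm_mgr.py in place of the default one.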

raghav-2002-os avatar Dec 27 '24 03:12 raghav-2002-os

Here is a working version of call_api that uses the OpenAI client to connect to a model hosted by vLLM, started with:

python -m vllm.entrypoints.openai.api_server \
        --model hugging-quants/Meta-Llama-3.1-8B-Instruct-AWQ-INT4 \
        --quantization awq \
        --api-key token-abc123

You also need to set OPENAI_MODEL_NAME=hugging-quants/Meta-Llama-3.1-8B-Instruct-AWQ-INT4

def call_api(messages):
    import os
    from openai import OpenAI

    # Falls back to the local vLLM defaults from the command above.
    client = OpenAI(
        base_url=os.environ.get("AZURE_OPENAI_ENDPOINT", "http://localhost:8000/v1"),
        api_key=os.environ.get("AZURE_OPENAI_API_KEY", "token-abc123"),
    )
    response = client.chat.completions.create(
        model=os.environ["OPENAI_MODEL_NAME"],
        messages=messages,
        temperature=0.0,
    )
    prediction = response.choices[0].message.content
    return prediction

owi1972 avatar Dec 30 '24 21:12 owi1972