
How to call the chat API to start a continuous conversation instead of a single isolated Q/A

Open ericliu0408 opened this issue 2 years ago • 2 comments

I currently use the code below to call the chat API:

    var results = await api.Chat.CreateChatCompletionAsync(new ChatRequest()
    {
        Model = Model.ChatGPTTurbo,
        Temperature = 0.1,
        MaxTokens = max_Tokens,
        Messages = new ChatMessage[]
        {
            new ChatMessage(ChatMessageRole.User, prompt)
        }
    });
    answer = results.Choices[0].Message.Content.Trim();

But how do I call the chat API to start a continuous conversation, just like the official ChatGPT web interface, instead of a single isolated Q/A like the above?
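For reference, a minimal sketch of the usual approach, assuming the same `api`, `ChatRequest`, and `ChatMessage` types from the snippet above: keep a running list of messages and resend the whole history with every request. `AskAsync` and `history` are illustrative names, not part of the library.

    // Sketch: maintain the conversation history yourself and resend it
    // with every request (assumes the types from the snippet above).
    var history = new List<ChatMessage>();

    async Task<string> AskAsync(string prompt)
    {
        history.Add(new ChatMessage(ChatMessageRole.User, prompt));
        var results = await api.Chat.CreateChatCompletionAsync(new ChatRequest()
        {
            Model = Model.ChatGPTTurbo,
            Temperature = 0.1,
            Messages = history.ToArray()
        });
        string answer = results.Choices[0].Message.Content.Trim();
        // append the assistant's reply so the next call sees it too
        history.Add(new ChatMessage(ChatMessageRole.Assistant, answer));
        return answer;
    }

Because the API itself is stateless, this history list is what gives the model context; each call pays for all the tokens in it.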

Some update: I just found the code below to create a conversation:

    var api = new OpenAI_API.OpenAIAPI(apiKey);

    var chat = api.Chat.CreateConversation();
    chat.AppendUserInput(prompt);

Many thanks; I am new to C#. Any example code is highly appreciated.

ericliu0408 avatar Mar 15 '23 15:03 ericliu0408

"var chat = api.Chat.CreateConversation();

/// give instruction as System chat.AppendSystemMessage("You are a teacher who helps children understand if things are animals or not. If the user tells you an animal, you say "yes". If the user tells you something that is not an animal, you say "no". You only ever respond with "yes" or "no". You do not say anything else.");

// give a few examples as user and assistant chat.AppendUserInput("Is this an animal? Cat"); chat.AppendExampleChatbotOutput("Yes"); chat.AppendUserInput("Is this an animal? House"); chat.AppendExampleChatbotOutput("No");

// now let's ask it a question' chat.AppendUserInput("Is this an animal? Dog"); // and get the response string response = await chat.GetResponseFromChatbot(); Console.WriteLine(response); // "Yes"

// and continue the conversation by asking another chat.AppendUserInput("Is this an animal? Chair"); // and get another response response = await chat.GetResponseFromChatbot(); Console.WriteLine(response); // "No"

// the entire chat history is available in chat.Messages foreach (ChatMessage msg in chat.Messages) { Console.WriteLine($"{msg.Role}: {msg.Content}"); }"

This is on the front page of the GitHub README; maybe it was added later on, so you didn't see it. It keeps the entire chat history in chat.Messages.

MathiasEllegaardRitter avatar Mar 17 '23 03:03 MathiasEllegaardRitter

Many thanks. Actually my issue is in a chat form application, and I achieved my desired behaviour by storing chat messages in a database; each time a new prompt comes in, all previous messages are retrieved from the database and combined into a contextual prompt.

This way of conversation really consumes tokens like hell.
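One common way to keep that cost down, sketched below under assumptions (roughly 4 characters per token; `TrimHistory` is a hypothetical helper, not a library method): only resend the most recent messages, dropping the oldest ones once a rough token budget is exceeded.

    // Sketch: cap the history sent with each request using a crude
    // character-based token estimate (~4 chars per token).
    List<ChatMessage> TrimHistory(List<ChatMessage> history, int maxTokens)
    {
        var trimmed = new List<ChatMessage>();
        int estimatedTokens = 0;
        // walk from newest to oldest, keeping messages until the budget is hit
        for (int i = history.Count - 1; i >= 0; i--)
        {
            estimatedTokens += history[i].Content.Length / 4;
            if (estimatedTokens > maxTokens)
                break;
            trimmed.Insert(0, history[i]);
        }
        return trimmed;
    }

You would call this just before building the `ChatRequest`, e.g. `Messages = TrimHistory(history, 2000).ToArray()`. For accurate budgeting you would need a real tokenizer rather than a character count, but even this rough cut bounds the per-request cost.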

ericliu0408 avatar Mar 17 '23 08:03 ericliu0408