
Chat completion endpoint cannot remember its previous messages #1097

Closed as not planned
@HuskyDanny

Description


Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

[Screenshot: request carrying the full message list]
With this request, the LLM should answer my last question, but it says it doesn't know. Is ChatPromptTemplate with the chat completion endpoint supposed to remember the previous messages, or should I manually add them to the latest user prompt?
[Screenshot: chat completion takes a list of messages]
[Screenshot: response saying it doesn't know the previous message]
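For context on why this happens: the chat completion endpoint is stateless, so every request must carry the entire conversation history in the `messages` list; nothing is remembered server-side between calls. A minimal sketch of assembling such a list, assuming the custom `bot` role used in the stored history (the helper name `build_messages` is illustrative, not part of openai or Semantic Kernel):

```python
def build_messages(system_prompt: str, history: list, new_input: str) -> list:
    """Assemble the full, stateless message list for one request.

    The endpoint only sees what is in this list, so prior turns must be
    resent on every call.
    """
    messages = [{"role": "system", "content": system_prompt}]
    for turn in history:
        # Map the custom "bot" role onto the API's "assistant" role.
        role = "assistant" if turn["role"] == "bot" else turn["role"]
        messages.append({"role": role, "content": turn["content"]})
    messages.append({"role": "user", "content": new_input})
    return messages
```

Note also that the OpenAI chat API expects the text under a `"content"` key, not `"message"`, when messages are sent to it directly.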

To Reproduce

This is my setup code using Semantic Kernel, which uses the openai package under the hood to send the request; basically it adds the previous messages into the ChatPromptTemplate.

system_message = """
You are a technical supporter that help users' questions, you should give 1-2 sentences to explain the answer and code examples for the question based on only the following contexts.
You may find the contexts provide information for multiple potential answers, you can give up to 3 most relevant answers separated by numbers like 1. 2. 3.

Follow this pattern to answer the question:
Contexts:
- Document path_title1 : page_content1
&&&
- Document path_title2 : page_content2

Question: question

Answer: answer
"""

prompt_template = """
Contexts:
{{$context}}

Question: {{$input}}
"""

def preprocess_messages(messages: list):
    transformed_messages = [{"role": "system", "message": system_message}]
    for message in messages:
        role = "assistant" if message["role"] == "bot" else message["role"]
        if role != "system":
            transformed_messages.append({"role": role, "message": message["content"]})
    return transformed_messages

def create_chat_prompt_template_instance(kernel: Kernel, messages: list = None):
    req_settings = sk_oai.AzureChatRequestSettings(
        max_tokens=2000,
        temperature=0,
        extension_data={"chat_system_prompt": system_message},
    )
    req_settings.unpack_extension_data()
    config = PromptTemplateConfig(completion=req_settings)
    template = ChatPromptTemplate(prompt_template, kernel.prompt_template_engine, config)
    if messages:
        for message in preprocess_messages(messages):
            template.add_message(message["role"], message["message"])
    function_config = SemanticFunctionConfig(config, template)
    return kernel.register_semantic_function("ChatBot", "rag_chat", function_config)

# simplified calling
messages = [
    {"content": "Q1", "role": "user"},
    {"content": "A1", "role": "bot"},
    {"content": "Q2", "role": "user"},
]
context["context"] = combined_documents
context["input"] = messages[-1]["content"]

previous_messages = messages[:-1]
chat_function = create_chat_prompt_template_instance(kernel=kernel, messages=previous_messages)

response = await kernel.run_async(chat_function, input_context=context)
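On the second half of the question above (manually adding history to the latest user prompt): if the prompt template can only accept one input string rather than structured chat messages, the prior turns can be flattened into the prompt text. A rough sketch, assuming the same `bot`/`user` role convention as above (the helper name `inline_history` is made up):

```python
def inline_history(history: list, question: str) -> str:
    """Flatten prior turns into plain text for a single user prompt.

    Crude but workable when the template only carries one input string
    instead of a structured message list.
    """
    lines = [f"{turn['role']}: {turn['content']}" for turn in history]
    lines.append(f"user: {question}")
    return "\n".join(lines)
```

Passing separate chat messages (as `ChatPromptTemplate.add_message` does) is generally preferable, since the model distinguishes roles more reliably than inlined text.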

Code snippets

No response

OS

Ubuntu

Python version

Python 3.9

Library version

openai 1.0

Labels: bug (Something isn't working)