
temperature is not supported with this model (o3-mini) #2104

Closed as not planned
@gautamjajoo

Description

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

The `temperature` parameter is not supported with the o3-mini model. A similar issue was reported earlier (#2072) and was supposed to be fixed in the 1.61.1 release (#2078).

openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_parameter'}}
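For reference, the 400 body follows the API's standard error envelope, so the offending parameter can be read programmatically. A minimal sketch using the payload quoted above:

```python
import json

# Raw JSON body from the 400 response quoted in this report.
body = (
    '{"error": {"message": "Unsupported parameter: \'temperature\' is not '
    'supported with this model.", "type": "invalid_request_error", '
    '"param": "temperature", "code": "unsupported_parameter"}}'
)

# The envelope names the rejected parameter explicitly, which makes it
# possible to react to this error in code rather than by string matching.
err = json.loads(body)["error"]
print(err["param"], err["code"])  # → temperature unsupported_parameter
```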

To Reproduce

from openai import OpenAI

messages = [
    {"role": "system", "content": "You are an expert"},
    {"role": "user", "content": "What is the capital of France"}
]

client = OpenAI(api_key=api_key)

response = client.chat.completions.create(
    model="o3-mini",
    messages=messages,
    temperature=0  # rejected by o3-mini with a 400
)

print(response.choices[0].message.content)

Running this code raises the error shown above.
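Until the library handles this, one workaround is to strip `temperature` before the request reaches the API when calling a reasoning model. The helper below is a hypothetical sketch, not part of the library; the prefix list and function name are assumptions for illustration:

```python
# Assumption: o-series reasoning models reject sampling parameters such as
# temperature; a simple model-name prefix check is used here for illustration.
REASONING_MODEL_PREFIXES = ("o1", "o3")

def create_chat(client, model, messages, **params):
    """Call client.chat.completions.create, dropping params the model rejects."""
    if model.startswith(REASONING_MODEL_PREFIXES):
        params.pop("temperature", None)  # avoid the 400 shown above
    return client.chat.completions.create(model=model, messages=messages, **params)
```

With an `OpenAI` client, `create_chat(client, "o3-mini", messages, temperature=0)` succeeds because the unsupported parameter is never sent, while non-reasoning models still receive `temperature` unchanged.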

OS

macOS

Python version

Python 3.13.1

Library version

openai 1.61.1

Metadata

Labels: question (Further information is requested)