CLI erroneously sends unsupported parameters (temperature/top_p) to the o3-mini model #2072

Closed
@MrJarnould

Description

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

Using openai api chat.completions.create with the newly released o3-mini-2025-01-31 model triggers errors about unsupported parameters even when those parameters are not explicitly set in the CLI command. Specifically, temperature and top_p appear to be sent to the API, causing 400 errors.

However, if you set --temperature "1" and --top_p "1", then no error is produced and a chat response is obtained.

Note: openai was installed using pipx v1.7.1.
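The errors suggest the CLI merges default values for temperature and top_p into every request, even when the flags are not passed. The sketch below is a hypothetical reconstruction of that failure mode, not the actual CLI internals; the default values shown are illustrative only (they are evidently not 1, since passing 1 explicitly succeeds):

```python
# Hypothetical reconstruction of the suspected failure mode; the real
# openai CLI internals and default values may differ.
DEFAULTS = {"temperature": 0.7, "top_p": 0.9}  # illustrative values only

def build_cli_request(model: str, message: str, **flags) -> dict:
    body = {
        "model": model,
        "messages": [{"role": "user", "content": message}],
    }
    # Merging defaults unconditionally means every request carries
    # 'temperature' and 'top_p', which o3-mini rejects as unsupported.
    body.update({**DEFAULTS, **flags})
    return body

request = build_cli_request("o3-mini-2025-01-31", "What's the capital of France?")
# 'temperature' and 'top_p' are present even though no flag was passed
```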

To Reproduce

  1. Use the CLI without specifying either --temperature or --top_p:
❯ openai api chat.completions.create --message "user" "What's the capital of France?" -m "o3-mini-2025-01-31"
Error: Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_parameter'}}
  2. Confirm the same request works using cURL:
❯ curl \
  https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "o3-mini-2025-01-31",
    "messages": [
      {
        "role": "user",
        "content": [
          {
            "type": "text",
            "text": "What'\''s the capital of France?"
          }
        ]
      }
    ],
    "response_format": {
      "type": "text"
    },
    "reasoning_effort": "low"
  }'
{
  "id": "chatcmpl-AwNe73vdAtaqbpdecVpG3XNRwfLbt",
  "object": "chat.completion",
  "created": 1738477283,
  "model": "o3-mini-2025-01-31",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "The capital of France is Paris.",
        "refusal": null
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 17,
    "total_tokens": 29,
    "prompt_tokens_details": {
      "cached_tokens": 0,
      "audio_tokens": 0
    },
    "completion_tokens_details": {
      "reasoning_tokens": 0,
      "audio_tokens": 0,
      "accepted_prediction_tokens": 0,
      "rejected_prediction_tokens": 0
    }
  },
  "service_tier": "default",
  "system_fingerprint": "fp_8bcaa0ca21"
}
  3. Observe that the cURL command fails when setting "temperature": 0.5:
❯ curl \
  https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "o3-mini-2025-01-31",
    "messages": [
      {
        "role": "user",
        "content": [
          {
            "type": "text",
            "text": "What'\''s the capital of France?"
          }
        ]
      }
    ],
    "response_format": {
      "type": "text"
    },
    "temperature": 0.5,
    "reasoning_effort": "low"
  }'
{
  "error": {
    "message": "Unsupported parameter: 'temperature' is not supported with this model.",
    "type": "invalid_request_error",
    "param": "temperature",
    "code": "unsupported_parameter"
  }
}
  4. Observe that the cURL command works when setting "temperature": 1:
❯ curl \
  https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "o3-mini-2025-01-31",
    "messages": [
      {
        "role": "user",
        "content": [
          {
            "type": "text",
            "text": "What'\''s the capital of France?"
          }
        ]
      }
    ],
    "response_format": {
      "type": "text"
    },
    "temperature": 1,
    "reasoning_effort": "low"
  }'
{
  "id": "chatcmpl-AwNYOeW1akjt6dMLkH3UNSrdXk5tV",
  "object": "chat.completion",
  "created": 1738476928,
  "model": "o3-mini-2025-01-31",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "The capital of France is Paris.",
        "refusal": null
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 17,
    "total_tokens": 29,
    "prompt_tokens_details": {
      "cached_tokens": 0,
      "audio_tokens": 0
    },
    "completion_tokens_details": {
      "reasoning_tokens": 0,
      "audio_tokens": 0,
      "accepted_prediction_tokens": 0,
      "rejected_prediction_tokens": 0
    }
  },
  "service_tier": "default",
  "system_fingerprint": "fp_8bcaa0ca21"
}
  5. Observe that the CLI still fails when setting --temperature "1", now with a new error, "Unsupported parameter: 'top_p' is not supported with this model.":
❯ openai api chat.completions.create --message "user" "What's the capital of France?" -m "o3-mini-2025-01-31" --temperature "1"
Error: Error code: 400 - {'error': {'message': "Unsupported parameter: 'top_p' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'top_p', 'code': 'unsupported_parameter'}}
  6. Observe that the CLI request works when setting --temperature "1" --top_p "1":
❯ openai api chat.completions.create --message "user" "What's the capital of France?" -m "o3-mini-2025-01-31" --temperature "1" --top_p "1"
The capital of France is Paris.
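A possible fix on the CLI side, sketched below with a hypothetical helper name (the actual openai CLI code may organize this differently), is to forward only the parameters the user actually set, so unset flags never reach the API:

```python
def filter_unset(params: dict) -> dict:
    """Drop entries whose value is None so they are never serialized."""
    return {k: v for k, v in params.items() if v is not None}

# With only --message and -m given, sampling parameters stay out of
# the JSON body entirely, matching the working cURL request above.
request = filter_unset({
    "model": "o3-mini-2025-01-31",
    "messages": [{"role": "user", "content": "What's the capital of France?"}],
    "temperature": None,  # --temperature was not passed
    "top_p": None,        # --top_p was not passed
})
# request now contains only "model" and "messages"
```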

Code snippets

OS

macOS

Python version

Python v3.12.5

Library version

openai 1.61.0

Metadata

    Labels

    bug (Something isn't working)
