Closed
### Confirm this is an issue with the Python library and not an underlying OpenAI API

- [x] This is an issue with the Python library
### Describe the bug

Using `with_options()` to configure options per request introduces a memory leak.

How to determine it's a memory leak? The following code calls the completion API 10 times. On each iteration, it takes a snapshot of the traced memory blocks using `tracemalloc`, compares it to the snapshot from the previous iteration, and prints the top 2 differences. The memory held by the `openai` library grows on every iteration. In particular, the following lines show the largest increases (using the sync client):
```
.../env/lib/python3.11/site-packages/openai/_response.py:227: size=321 KiB (+26.0 KiB), count=2681 (+175), average=123 B
.../env/lib/python3.11/site-packages/openai/_response.py:226: size=214 KiB (+19.4 KiB), count=1830 (+165), average=120 B
```
```python
import os
import tracemalloc

from openai import OpenAI

tracemalloc.start()

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

snapshot1 = tracemalloc.take_snapshot()
for _ in range(10):
    client.with_options(max_retries=5).chat.completions.create(
        messages=[
            {
                "role": "user",
                "content": "How can I get the name of the current day in Node.js?",
            }
        ],
        model="gpt-3.5-turbo",
    )

    snapshot2 = tracemalloc.take_snapshot()
    top_stats = snapshot2.compare_to(snapshot1, "lineno")

    print("[ Top 2 differences ]")
    for stat in top_stats[:2]:
        print(stat)

    snapshot1 = snapshot2
```
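The same `tracemalloc` comparison pattern can be sanity-checked without the `openai` library. Here is a minimal, self-contained sketch; the ever-growing `cache` list and `fake_request()` are hypothetical stand-ins for whatever state the client retains:

```python
import tracemalloc

cache = []  # hypothetical ever-growing state, standing in for the leak


def fake_request():
    # Each "request" retains ~100 KB that is never released.
    cache.append(bytearray(100_000))


tracemalloc.start()

snapshot1 = tracemalloc.take_snapshot()
for _ in range(3):
    fake_request()

    snapshot2 = tracemalloc.take_snapshot()
    top_stats = snapshot2.compare_to(snapshot1, "lineno")

    # The allocation inside fake_request() dominates the per-iteration diff.
    print(top_stats[0])

    snapshot1 = snapshot2
```

If the per-iteration diff keeps growing for the same source line, as it does for the `_response.py` lines above, that line is retaining memory across calls.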
### Potential cause

Looking at the library code, when using `with_options()` a new client is created on every request (in both the sync client and the async client).
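To illustrate the pattern (a toy sketch with assumed names, not the library's actual internals): a client whose `with_options()` builds a fresh client object on every call looks roughly like this.

```python
class ToyClient:
    """Toy stand-in for a configurable API client (hypothetical)."""

    _instances = 0  # count how many clients were ever constructed

    def __init__(self, max_retries: int = 2) -> None:
        self.max_retries = max_retries
        ToyClient._instances += 1

    def with_options(self, **options) -> "ToyClient":
        # A brand-new client is built for every per-request override.
        return ToyClient(**options)


client = ToyClient()
for _ in range(10):
    client.with_options(max_retries=5)  # 10 extra client objects

# 1 original client + 10 per-request copies were constructed.
print(ToyClient._instances)
```

If each such client holds resources (connection pools, caches) that are not promptly released, repeated `with_options()` calls accumulate them; configuring the options once at client construction avoids the per-request copies.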
### To Reproduce

Use `with_options()` and make multiple API calls.
### Code snippets

No response
### OS

macOS

### Python version

Python v3.11.5

### Library version

openai v1.3.3