# docs/models/index.md

PydanticAI is model-agnostic and has built-in support for multiple model providers:

* [OpenAI](openai.md)
* [Anthropic](anthropic.md)
* [Gemini](gemini.md) (via two different APIs: Generative Language API and VertexAI API)
* [Groq](groq.md)
* [Mistral](mistral.md)
* [Cohere](cohere.md)
* [Bedrock](bedrock.md)

## OpenAI-compatible Providers

In addition, many providers are compatible with the OpenAI API, and can be used with `OpenAIModel` in PydanticAI:

* [DeepSeek](openai.md#deepseek)
* [Grok (xAI)](openai.md#grok-xai)
* [Ollama](openai.md#ollama)
* [OpenRouter](openai.md#openrouter)
* [Perplexity](openai.md#perplexity)
* [Fireworks AI](openai.md#fireworks-ai)
* [Together AI](openai.md#together-ai)

PydanticAI uses a few key terms to describe how it interacts with different LLMs:

* **Model**: This refers to PydanticAI classes used to make requests following a specific LLM API,
  roughly in the format `<VendorSdk>Model`; for example, we have `OpenAIModel`, `AnthropicModel`, `GeminiModel`,
  etc. When using a Model class, you specify the actual LLM model name (e.g., `gpt-4o`,
  `claude-3-5-sonnet-latest`, `gemini-1.5-flash`) as a parameter.
* **Provider**: This refers to provider-specific classes which handle the authentication and connections
  to an LLM vendor. Passing a non-default _Provider_ as a parameter to a Model is how you can ensure
  that your agent will make requests to a specific endpoint, or make use of a specific approach to
  authentication (e.g., you can use Vertex-specific auth with the `GeminiModel` by way of the `VertexProvider`).
  In particular, this is how you can make use of an AI gateway, or an LLM vendor that offers API compatibility
  with the vendor SDK used by an existing Model (such as `OpenAIModel`).
* **Profile**: This refers to a description of how requests to a specific model or family of models need to be
  constructed to get the best results, independent of the model and provider classes used.
  For example, different models have different restrictions on the JSON schemas that can be used for tools,
  and the same schema transformer needs to be used for Gemini models whether you're using `GoogleModel`
  with model name `gemini-2.5-pro-preview`, or `OpenAIModel` with `OpenRouterProvider` and model name `google/gemini-2.5-pro-preview`.

When you instantiate an [`Agent`][pydantic_ai.Agent] with just a name formatted as `<provider>:<model>`, e.g. `openai:gpt-4o` or `openrouter:google/gemini-2.5-pro-preview`,
PydanticAI will automatically select the appropriate model class, provider, and profile.
If you want to use a different provider or profile, you can instantiate a model class directly and pass in `provider` and/or `profile` arguments.

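The `<provider>:<model>` naming rule can be illustrated with a small standalone sketch. This is not PydanticAI's internal implementation, just the idea: everything before the first colon names the provider, and the rest (which may itself contain slashes) is the model name.

```python
def split_model_string(name: str, default_provider: str = "openai") -> tuple[str, str]:
    """Split a '<provider>:<model>' string into (provider, model_name).

    Sketch only: a name without a colon falls back to an assumed default provider.
    """
    provider, sep, model = name.partition(":")
    if not sep:
        # No colon present: treat the whole string as the model name.
        return default_provider, name
    return provider, model
```

For example, `split_model_string("openrouter:google/gemini-2.5-pro-preview")` yields `("openrouter", "google/gemini-2.5-pro-preview")`, since only the first colon is significant.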
## Custom Models

To implement support for a model API that's not already supported, you will need to subclass the [`Model`][pydantic_ai.models.Model] abstract base class.

For streaming, you'll also need to implement the [`StreamedResponse`][pydantic_ai.models.StreamedResponse] abstract base class.

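To make the shape of such a subclass concrete, here is a toy, self-contained analogue. The method names and signatures below are simplified assumptions for illustration, not PydanticAI's real interface — take the actual interface from the source code.

```python
from abc import ABC, abstractmethod


class Model(ABC):
    """Toy analogue of an LLM model ABC (simplified; not the real pydantic_ai interface)."""

    @property
    @abstractmethod
    def model_name(self) -> str:
        """Name of the underlying LLM model."""

    @abstractmethod
    def request(self, messages: list[str]) -> str:
        """Send the conversation to the vendor API and return the response text."""


class EchoModel(Model):
    """Minimal concrete subclass, e.g. for exercising an agent offline."""

    @property
    def model_name(self) -> str:
        return "echo"

    def request(self, messages: list[str]) -> str:
        # A real implementation would call the vendor SDK here.
        return f"echo: {messages[-1]}"
```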

The best place to start is to review the source code for existing implementations, e.g. [`OpenAIModel`](https://github.com/pydantic/pydantic-ai/blob/main/pydantic_ai_slim/pydantic_ai/models/openai.py).

For details on when we'll accept contributions adding new models to PydanticAI, see the [contributing guidelines](../contributing.md#new-model-rules).

If a model API is compatible with the OpenAI API, you do not need a custom model class and can provide your own [custom provider](openai.md#openai-compatible-models) instead.

<!-- TODO(Marcelo): We need to create a section in the docs about reliability. -->

# docs/models/openai.md

## Install

To use OpenAI models or OpenAI-compatible APIs, you need to either install `pydantic-ai`, or install `pydantic-ai-slim` with the `openai` optional group:

```bash
pip/uv-add "pydantic-ai-slim[openai]"
```

## Configuration

To use `OpenAIModel` with the OpenAI API, go to [platform.openai.com](https://platform.openai.com/) and follow your nose until you find the place to generate an API key.

## Environment variable
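The body of this section is not captured here; a typical setup, assuming the standard `OPENAI_API_KEY` variable read by the OpenAI SDK, would be:

```bash
# Assumes the standard OPENAI_API_KEY variable used by the OpenAI SDK
export OPENAI_API_KEY='your-api-key'
```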

…

You can learn more about the differences between the Responses API and Chat Completions…

## OpenAI-compatible Models

Many providers and models are compatible with the OpenAI API, and can be used with `OpenAIModel` in PydanticAI.
Before getting started, check the [installation and configuration](#install) instructions above.

To use another OpenAI-compatible API, you can make use of the `base_url` and `api_key` arguments from `OpenAIProvider`:

```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider

model = OpenAIModel(
    'model_name',
    provider=OpenAIProvider(
        base_url='https://<openai-compatible-api-endpoint>.com', api_key='your-api-key'
    ),
)
agent = Agent(model)
...
```

Various providers also have their own provider classes, so you don't need to specify the base URL yourself and can use the standard `<PROVIDER>_API_KEY` environment variable to set the API key.

When a provider has its own provider class, you can use the `Agent("<provider>:<model>")` shorthand, e.g. `Agent("deepseek:deepseek-chat")` or `Agent("openrouter:google/gemini-2.5-pro-preview")`, instead of building the `OpenAIModel` explicitly. Similarly, you can pass the provider name as a string to the `provider` argument on `OpenAIModel` instead of instantiating the provider class explicitly.

#### Model Profile

Sometimes, the provider or model you're using will have slightly different requirements than OpenAI's API or models, like having different restrictions on JSON schemas for tool definitions, or not supporting tool definitions marked as strict.

When using an alternative provider class provided by PydanticAI, an appropriate model profile is typically selected automatically based on the model name.
If the model you're using is not working correctly out of the box, you can tweak various aspects of how model requests are constructed by providing your own [`ModelProfile`][pydantic_ai.profiles.ModelProfile] (for behaviors shared among all model classes) or [`OpenAIModelProfile`][pydantic_ai.profiles.openai.OpenAIModelProfile] (for behaviors specific to `OpenAIModel`):

```py
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.profiles._json_schema import InlineDefsJsonSchemaTransformer
from pydantic_ai.profiles.openai import OpenAIModelProfile
from pydantic_ai.providers.openai import OpenAIProvider

model = OpenAIModel(
    'model_name',
    provider=OpenAIProvider(
        base_url='https://<openai-compatible-api-endpoint>.com', api_key='your-api-key'
    ),
    profile=OpenAIModelProfile(
        json_schema_transformer=InlineDefsJsonSchemaTransformer,
        openai_supports_strict_tool_definition=False,
    ),
)
agent = Agent(model)
```
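To make concrete what a schema transformer is for, here is a self-contained sketch of one common transformation: inlining local `$defs` references, which some model APIs require. This illustrates the idea behind a transformer like `InlineDefsJsonSchemaTransformer`; it is not PydanticAI's implementation, and it assumes non-recursive definitions.

```python
def inline_defs(schema: dict) -> dict:
    """Return a copy of a JSON schema with local '#/$defs/...' references inlined.

    Sketch only: assumes non-recursive definitions and ignores sibling keys on $ref nodes.
    """
    defs = schema.get("$defs", {})

    def resolve(node):
        if isinstance(node, dict):
            ref = node.get("$ref", "")
            if ref.startswith("#/$defs/"):
                # Replace the reference with the (recursively resolved) definition body.
                return resolve(defs[ref.split("/")[-1]])
            # Drop the $defs section itself; everything it defined gets inlined.
            return {k: resolve(v) for k, v in node.items() if k != "$defs"}
        if isinstance(node, list):
            return [resolve(v) for v in node]
        return node

    return resolve(schema)
```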