Imagine the following case:
```python
from collections.abc import AsyncGenerator

from pydantic_ai import Agent, RunContext

qa_agent = Agent(
    'openai:gpt-4o',
    system_prompt=(
        "Use the `qa` function to reply to users' questions about pydantic ai."
    ),
)

@qa_agent.tool
async def qa(ctx: RunContext[int], query: str) -> AsyncGenerator[str | Reference, None]:
    """Answer users' questions about pydantic ai by inspecting the documentation."""
    docs = get_context(...)  # retrieve relevant documentation (placeholder helper)
    async for token in prompt(docs, ...):  # stream the model's answer (placeholder helper)
        yield token  # can be a Reference to a document, or just text
```
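For comparison, streaming the agent's final output already seems to be supported via `run_stream` (if I'm reading the docs right); a minimal sketch reusing `qa_agent` from above, just to show the kind of incremental behaviour I'd like to originate from inside the tool instead:

```python
import asyncio


async def main() -> None:
    # Top-level streaming works: tokens arrive as the model produces them.
    # What I can't do is have the `qa` tool itself push tokens/references out
    # to the caller while it is still running.
    async with qa_agent.run_stream('How do I register a tool?') as result:
        async for chunk in result.stream_text(delta=True):
            print(chunk, end='', flush=True)


asyncio.run(main())
```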
Making this work would also require the agent to be able to end its run early, which is a known issue that other people have already asked about.
I know this could be worked around with the graph implementation, but I also don't see a way to stream results from inside a node there (sketch below).
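To be concrete, the graph workaround I have in mind would look roughly like this, assuming the documented `pydantic_graph` node API (`get_context` and `prompt` are the same placeholder helpers as in the example above):

```python
from dataclasses import dataclass

from pydantic_graph import BaseNode, End, Graph, GraphRunContext


@dataclass
class QAState:
    query: str


@dataclass
class AnswerQuestion(BaseNode[QAState, None, str]):
    async def run(self, ctx: GraphRunContext[QAState]) -> End[str]:
        docs = get_context(ctx.state.query)  # placeholder retrieval helper
        # A node can only return a final value, so the streamed tokens have to
        # be buffered here; I don't see where they could be yielded to the
        # caller while run() is still executing.
        answer = ''
        async for token in prompt(docs, ctx.state.query):  # placeholder LLM helper
            answer += str(token)
        return End(answer)


qa_graph = Graph(nodes=[AnswerQuestion])
```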
Is this a pattern that is already accounted for, or one that is planned for the future?
Can we do this right now?