feat(instrumentation): Set peer.service as the traced object name for client spans to mark the destination services #2967
base: main
Conversation
Caution

Changes requested ❌

Reviewed everything up to 8256dd5 in 1 minute and 37 seconds.
- Reviewed 303 lines of code in 8 files
- Skipped 0 files when reviewing
- Skipped posting 3 draft comments. View those below.
- Modify your settings and rules to customize what types of comments Ellipsis leaves. And don't forget to react with 👍 or 👎 to teach Ellipsis.
1. packages/opentelemetry-semantic-conventions-ai/opentelemetry/semconv_ai/__init__.py:49
   - Draft comment: Consider handling the commented-out LLM_RESPONSE_ID. Either implement it or remove the commented code if not needed.
   - Reason this comment was not posted: Comment was not on a location in the diff, so it can't be submitted as a review comment.
2. packages/opentelemetry-semantic-conventions-ai/opentelemetry/semconv_ai/__init__.py:35
   - Draft comment: Check naming consistency: some attributes use the 'gen_ai.' prefix while others use 'llm.' (e.g., LLM_USAGE_PROMPT_TOKENS vs LLM_USAGE_TOTAL_TOKENS). Consider standardizing the naming.
   - Reason this comment was not posted: Comment was not on a location in the diff, so it can't be submitted as a review comment.
3. packages/opentelemetry-semantic-conventions-ai/opentelemetry/semconv_ai/__init__.py:24
   - Draft comment: Review the value for LLM_COMPLETIONS_EXCEPTIONS, which references OpenAI conventions. Ensure it aligns with overall semantic conventions for all supported LLMs.
   - Reason this comment was not posted: Comment was not on a location in the diff, so it can't be submitted as a review comment.
Workflow ID: wflow_nUIVNKvyONunEykS
Hey @LakshmiPriyaSujith thanks for this - can you explain this change? I'm not sure why the object is the peer service that you want to set
Hi @nirga! This is my thought. Since LLM traces have only client spans, the destinations of LLM calls are unspecified. I thought it would be better to use the instrumented object as the remote service rather than using a constant value like Watsonx or Bedrock. For example, in the above screenshot, the span bedrock.completion represents a call from the client application to Bedrock implemented via ClientCreator. Do you think there's a better way to mark the destination service?
Can you give an example of when it won't be bedrock?
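The trade-off being discussed can be sketched as follows. This is an illustrative stand-in, not the PR's actual code: the `Span` class mimics an OpenTelemetry span, and `ClientCreator` stands in for the object a Bedrock call goes through.

```python
# Illustrative sketch of the design question: on a client span, should
# peer.service be a constant per instrumentation, or the traced object's
# name? All classes below are minimal stand-ins.
class Span:
    """Minimal stand-in for an OpenTelemetry span."""

    def __init__(self, name):
        self.name = name
        self.attributes = {}

    def set_attribute(self, key, value):
        self.attributes[key] = value


class ClientCreator:  # stand-in for the object behind a Bedrock call
    pass


span = Span("bedrock.completion")

# Option A (constant): every span from this instrumentation gets one value.
span.set_attribute("peer.service", "Bedrock")

# Option B (the PR's proposal): derive the value from the traced object,
# so the destination reflects what was actually instrumented.
span.set_attribute("peer.service", type(ClientCreator()).__name__)

print(span.attributes["peer.service"])  # ClientCreator
```

Option B is what the screenshot shows: the client span's destination is labeled with the instrumented object (`ClientCreator`) rather than a fixed provider name.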
feat(instrumentation): ...
or fix(instrumentation): ...
Important

Adds `peer.service` attribute to client spans across multiple instrumentation packages to mark destination services.
- Adds `SpanAttributes.PEER_SERVICE` to client spans in `_wrap()` and `_awrap()` functions in `anthropic/__init__.py`, `groq/__init__.py`, and `watsonx/__init__.py`.
- Adds `SpanAttributes.PEER_SERVICE` to `with_instrumentation()` in `bedrock/__init__.py` and `sagemaker/__init__.py`.
- Adds `SpanAttributes.PEER_SERVICE` to `wrap_agent_execute_task()`, `wrap_task_execute()`, and `wrap_llm_call()` in `crewai/instrumentation.py`.
- Adds `SpanAttributes.PEER_SYSTEM` to `_create_llm_span()` in `langchain/callback_handler.py`.
- Adds `PEER_SERVICE` to `SpanAttributes` in `semconv_ai/__init__.py`.

This description was created by Ellipsis for 8256dd5. You can customize this summary. It will automatically update as commits are pushed.