
feat(instrumentation): Set peer.service as the traced object name for client spans to mark the destination services #2967

Open · wants to merge 4 commits into base: main

Conversation


@LakshmiPriyaSujith LakshmiPriyaSujith commented May 29, 2025

  • I have added tests that cover my changes.
  • If adding a new instrumentation or changing an existing one, I've added screenshots from some observability platform showing the change.
  • PR name follows conventional commits format: feat(instrumentation): ... or fix(instrumentation): ....
  • (If applicable) I have updated the documentation accordingly.

[screenshot]


Important

Adds peer.service attribute to client spans across multiple instrumentation packages to mark destination services.

  • Behavior:
    • Adds SpanAttributes.PEER_SERVICE to client spans in _wrap() and _awrap() functions in anthropic/__init__.py, groq/__init__.py, and watsonx/__init__.py.
    • Adds SpanAttributes.PEER_SERVICE to with_instrumentation() in bedrock/__init__.py and sagemaker/__init__.py.
    • Adds SpanAttributes.PEER_SERVICE to wrap_agent_execute_task(), wrap_task_execute(), and wrap_llm_call() in crewai/instrumentation.py.
    • Adds SpanAttributes.PEER_SERVICE to _create_llm_span() in langchain/callback_handler.py.
  • Misc:
    • Adds PEER_SERVICE to SpanAttributes in semconv_ai/__init__.py.
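The pattern the summary describes can be sketched without any OpenTelemetry dependency. Note this is an illustrative stand-in, not the PR's actual code: `FakeSpan` and `set_peer_service` are hypothetical names, and the real changes happen inside each package's existing wrappers.

```python
# Dependency-free sketch of the pattern described above: each wrapper sets
# peer.service on the client span, using the traced object's name as the
# destination service. FakeSpan and set_peer_service are illustrative
# stand-ins, not the PR's actual helpers.
PEER_SERVICE = "peer.service"  # mirrors the PEER_SERVICE key added to SpanAttributes

class FakeSpan:
    """Stand-in for an OpenTelemetry span, just enough for set_attribute()."""
    def __init__(self):
        self.attributes = {}

    def set_attribute(self, key, value):
        self.attributes[key] = value

def set_peer_service(span, traced_object_name):
    # In the PR this call is made inside _wrap()/_awrap() (anthropic, groq,
    # watsonx), with_instrumentation() (bedrock, sagemaker), and the crewai
    # wrappers, on the client span being created.
    span.set_attribute(PEER_SERVICE, traced_object_name)

span = FakeSpan()
set_peer_service(span, "ClientCreator")  # e.g. the traced Bedrock client object
print(span.attributes[PEER_SERVICE])  # prints "ClientCreator"
```

With a real OpenTelemetry span the call shape is the same: `span.set_attribute(SpanAttributes.PEER_SERVICE, name)` on the client span at creation time.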

This description was created by Ellipsis for 8256dd5.

@CLAassistant

CLAassistant commented May 29, 2025

CLA assistant check
All committers have signed the CLA.

@LakshmiPriyaSujith LakshmiPriyaSujith changed the title Set peer.service as the traced object name for client spans to mark t… Set peer.service as the traced object name for client spans to mark the destination services May 29, 2025
@LakshmiPriyaSujith LakshmiPriyaSujith marked this pull request as ready for review May 29, 2025 10:49
Contributor

@ellipsis-dev ellipsis-dev bot left a comment


Caution

Changes requested ❌

Reviewed everything up to 8256dd5 in 1 minute and 37 seconds. Click for details.
  • Reviewed 303 lines of code in 8 files
  • Skipped 0 files when reviewing.
  • Skipped posting 3 draft comments. View those below.
1. packages/opentelemetry-semantic-conventions-ai/opentelemetry/semconv_ai/__init__.py:49
  • Draft comment:
    Consider handling the commented-out LLM_RESPONSE_ID. Either implement it or remove the commented code if not needed.
  • Reason this comment was not posted:
    Comment was not on a location in the diff, so it can't be submitted as a review comment.
2. packages/opentelemetry-semantic-conventions-ai/opentelemetry/semconv_ai/__init__.py:35
  • Draft comment:
    Check naming consistency: some attributes use 'gen_ai.' while others use 'llm.' prefixes (e.g., LLM_USAGE_PROMPT_TOKENS vs LLM_USAGE_TOTAL_TOKENS). Consider standardizing the naming.
  • Reason this comment was not posted:
    Comment was not on a location in the diff, so it can't be submitted as a review comment.
3. packages/opentelemetry-semantic-conventions-ai/opentelemetry/semconv_ai/__init__.py:24
  • Draft comment:
    Review the value for LLM_COMPLETIONS_EXCEPTIONS which references OpenAI conventions. Ensure it aligns with overall semantic conventions for all supported LLMs.
  • Reason this comment was not posted:
    Comment was not on a location in the diff, so it can't be submitted as a review comment.

Workflow ID: wflow_nUIVNKvyONunEykS


@LakshmiPriyaSujith LakshmiPriyaSujith changed the title Set peer.service as the traced object name for client spans to mark the destination services feat(instrumentation): Set peer.service as the traced object name for client spans to mark the destination services May 30, 2025
Member

@nirga nirga left a comment


Hey @LakshmiPriyaSujith thanks for this - can you explain this change? I'm not sure why the object is the peer service that you want to set

@LakshmiPriyaSujith
Author

Hey @LakshmiPriyaSujith thanks for this - can you explain this change? I'm not sure why the object is the peer service that you want to set

Hi @nirga! Here's my reasoning. Since LLM traces contain only client spans, the destinations of LLM calls are left unspecified. I thought it would be better to use the instrumented object as the remote service rather than a constant value like Watsonx or Bedrock. For example, in the screenshot above, the span bedrock.completion represents a call from the client application to Bedrock, implemented via ClientCreator. Do you think there's a better way to mark the destination service?

@nirga
Member

nirga commented May 30, 2025

Can you give an example of when it won't be bedrock?

@LakshmiPriyaSujith
Author

Can you give an example of when it won't be bedrock?

Here's an example using Langchain instrumentation.
[screenshot]

3 participants