diff --git a/README-zh.md b/README-zh.md
index 7ee32e57a..80d3cce04 100644
--- a/README-zh.md
+++ b/README-zh.md
@@ -32,6 +32,7 @@ LoongSuite Python Agent 同时也是上游 [OTel Python Agent](https://github.co
|--------|------|---------|
| [AgentScope](https://github.com/agentscope-ai/agentscope) | [GUIDE](instrumentation-loongsuite/loongsuite-instrumentation-agentscope/README.md) | [PyPI](https://pypi.org/project/loongsuite-instrumentation-agentscope/) |
| [Agno](https://github.com/agno-agi/agno) | [GUIDE](instrumentation-loongsuite/loongsuite-instrumentation-agno/README.md) | in dev |
+| [Anthropic Python SDK](https://github.com/anthropics/anthropic-sdk-python) | [GUIDE](instrumentation-loongsuite/loongsuite-instrumentation-anthropic/README.rst) | in dev |
| [Claude Agent SDK](https://github.com/anthropics/claude-agent-sdk-python) | [GUIDE](instrumentation-loongsuite/loongsuite-instrumentation-claude-agent-sdk/README.md) | [PyPI](https://pypi.org/project/loongsuite-instrumentation-claude-agent-sdk/) |
| [QwenPaw](https://github.com/agentscope-ai/QwenPaw) | [GUIDE](instrumentation-loongsuite/loongsuite-instrumentation-qwenpaw/README.md) | [PyPI](https://pypi.org/project/loongsuite-instrumentation-qwenpaw/) |
| [CrewAI](https://github.com/crewAIInc/crewAI) | [GUIDE](instrumentation-loongsuite/loongsuite-instrumentation-crewai/README.md) | [PyPI](https://pypi.org/project/loongsuite-instrumentation-crewai/) |
diff --git a/README.md b/README.md
index 287efabcc..fba52a8b1 100644
--- a/README.md
+++ b/README.md
@@ -32,6 +32,7 @@ Source tree: [`instrumentation-loongsuite/`](instrumentation-loongsuite).
|--------|------|---------|
| [AgentScope](https://github.com/agentscope-ai/agentscope) | [GUIDE](instrumentation-loongsuite/loongsuite-instrumentation-agentscope/README.md) | [PyPI](https://pypi.org/project/loongsuite-instrumentation-agentscope/) |
| [Agno](https://github.com/agno-agi/agno) | [GUIDE](instrumentation-loongsuite/loongsuite-instrumentation-agno/README.md) | in dev |
+| [Anthropic Python SDK](https://github.com/anthropics/anthropic-sdk-python) | [GUIDE](instrumentation-loongsuite/loongsuite-instrumentation-anthropic/README.rst) | in dev |
| [Claude Agent SDK](https://github.com/anthropics/claude-agent-sdk-python) | [GUIDE](instrumentation-loongsuite/loongsuite-instrumentation-claude-agent-sdk/README.md) | [PyPI](https://pypi.org/project/loongsuite-instrumentation-claude-agent-sdk/) |
| [QwenPaw](https://github.com/agentscope-ai/QwenPaw) | [GUIDE](instrumentation-loongsuite/loongsuite-instrumentation-qwenpaw/README.md) | [PyPI](https://pypi.org/project/loongsuite-instrumentation-qwenpaw/) |
| [CrewAI](https://github.com/crewAIInc/crewAI) | [GUIDE](instrumentation-loongsuite/loongsuite-instrumentation-crewai/README.md) | [PyPI](https://pypi.org/project/loongsuite-instrumentation-crewai/) |
diff --git a/instrumentation-loongsuite/README.md b/instrumentation-loongsuite/README.md
index 0816ea65c..a938964f5 100644
--- a/instrumentation-loongsuite/README.md
+++ b/instrumentation-loongsuite/README.md
@@ -3,6 +3,7 @@
| --------------- | ------------------ | --------------- | -------------- |
| [loongsuite-instrumentation-agentscope](./loongsuite-instrumentation-agentscope) | agentscope >= 1.0.0 | No | development
| [loongsuite-instrumentation-agno](./loongsuite-instrumentation-agno) | agno | No | development
+| [loongsuite-instrumentation-anthropic](./loongsuite-instrumentation-anthropic) | anthropic >= 0.16.0 | No | development
| [loongsuite-instrumentation-claude-agent-sdk](./loongsuite-instrumentation-claude-agent-sdk) | claude-agent-sdk >= 0.1.0 | No | development
| [loongsuite-instrumentation-crewai](./loongsuite-instrumentation-crewai) | crewai >= 0.80.0 | No | development
| [loongsuite-instrumentation-dashscope](./loongsuite-instrumentation-dashscope) | dashscope >= 1.0.0 | No | development
@@ -15,4 +16,4 @@
| [loongsuite-instrumentation-mcp](./loongsuite-instrumentation-mcp) | mcp >= 1.3.0, <= 1.25.0 | No | development
| [loongsuite-instrumentation-mem0](./loongsuite-instrumentation-mem0) | mem0ai >= 1.0.0, < 2.0.0 | No | development
| [loongsuite-instrumentation-qwen-agent](./loongsuite-instrumentation-qwen-agent) | qwen-agent >= 0.0.20 | No | development
-| [loongsuite-instrumentation-qwenpaw](./loongsuite-instrumentation-qwenpaw) | qwenpaw >= 1.1.0; copaw >= 0.1.0, <= 1.0.2 (legacy) | No | development
\ No newline at end of file
+| [loongsuite-instrumentation-qwenpaw](./loongsuite-instrumentation-qwenpaw) | qwenpaw >= 1.1.0; copaw >= 0.1.0, <= 1.0.2 (legacy) | No | development
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/CHANGELOG.md b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/CHANGELOG.md
new file mode 100644
index 000000000..ba164c0eb
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/CHANGELOG.md
@@ -0,0 +1,20 @@
+# Changelog
+
+All notable changes to this project will be documented in this file.
+
+The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
+and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
+
+## Unreleased
+
+### Added
+
+- Initial implementation of Anthropic instrumentation
+ ([#3978](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/3978))
+- Implement sync `Messages.create` instrumentation with GenAI semantic convention attributes
+ ([#4034](https://github.com/open-telemetry/opentelemetry-python-contrib/pull/4034))
+ - Captures request attributes: `gen_ai.request.model`, `gen_ai.request.max_tokens`, `gen_ai.request.temperature`, `gen_ai.request.top_p`, `gen_ai.request.top_k`, `gen_ai.request.stop_sequences`
+ - Captures response attributes: `gen_ai.response.id`, `gen_ai.response.model`, `gen_ai.response.finish_reasons`, `gen_ai.usage.input_tokens`, `gen_ai.usage.output_tokens`
+ - Error handling with `error.type` attribute
+  - Minimum supported anthropic version is 0.16.0 (this version introduced the `anthropic.resources.messages` module structure that the instrumentation patches)
+
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/LICENSE b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/LICENSE
new file mode 100644
index 000000000..e294301d4
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/LICENSE
@@ -0,0 +1,202 @@
+Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Support. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright The OpenTelemetry Authors
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/README.rst b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/README.rst
new file mode 100644
index 000000000..1a8b083c1
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/README.rst
@@ -0,0 +1,69 @@
+LoongSuite Anthropic Instrumentation
+====================================
+
+|pypi|
+
+.. |pypi| image:: https://badge.fury.io/py/loongsuite-instrumentation-anthropic.svg
+   :target: https://pypi.org/project/loongsuite-instrumentation-anthropic/
+
+This library allows tracing LLM requests made with the
+`Anthropic Python SDK <https://github.com/anthropics/anthropic-sdk-python>`_ using the
+LoongSuite distribution of the Anthropic instrumentation.
+
+Installation
+------------
+
+::
+
+ pip install ./instrumentation-loongsuite/loongsuite-instrumentation-anthropic
+ pip install ./util/opentelemetry-util-genai
+
+If you don't have an Anthropic application yet, try our `examples <./examples>`_
+which only need a valid Anthropic API key.
+
+Check out the `zero-code example <./examples/zero-code>`_ for a quick start.
+
+Usage
+-----
+
+This section describes how to set up Anthropic instrumentation if you're setting OpenTelemetry up manually.
+Check out the `manual example <./examples/manual>`_ for more details.
+
+.. code-block:: python
+
+ from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
+ import anthropic
+
+ # Instrument Anthropic
+ AnthropicInstrumentor().instrument()
+
+ # Use Anthropic client as normal
+ client = anthropic.Anthropic()
+ response = client.messages.create(
+ model="claude-3-5-sonnet-20241022",
+ max_tokens=1024,
+ messages=[
+ {"role": "user", "content": "Hello, Claude!"}
+ ]
+ )
+
+
+Configuration
+-------------
+
+Capture Message Content
+***********************
+
+By default, prompts and completions are not captured. To enable message content capture,
+set the environment variable:
+
+::
+
+ export OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
+
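The capture flag above is a plain boolean environment variable. As a minimal sketch — the helper name here is hypothetical, and the real parsing lives in ``opentelemetry-util-genai`` — such a flag can be read like this:

```python
import os


def capture_message_content() -> bool:
    # Hypothetical helper: treat the env var as a case-insensitive boolean.
    # The actual check is implemented in opentelemetry-util-genai.
    raw = os.environ.get(
        "OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT", "false"
    )
    return raw.strip().lower() == "true"


os.environ["OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"] = "True"
print(capture_message_content())  # True
```

Any value other than a case-insensitive ``true`` leaves content capture disabled, which is the safe default for telemetry that may contain prompts.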
+
+References
+----------
+
+* `OpenTelemetry Project <https://opentelemetry.io/>`_
+* `Anthropic Documentation <https://docs.anthropic.com/>`_
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/manual/README.rst b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/manual/README.rst
new file mode 100644
index 000000000..1486c7bb8
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/manual/README.rst
@@ -0,0 +1,50 @@
+OpenTelemetry Anthropic Instrumentation Example
+===============================================
+
+This is an example of how to instrument Anthropic calls when configuring OpenTelemetry SDK and Instrumentations manually.
+
+When `main.py <main.py>`_ is run, it exports traces and logs to an OTLP
+compatible endpoint. Traces include details such as the model used and the
+duration of the chat request. Logs capture the chat request and the generated
+response, providing a comprehensive view of the performance and behavior of
+your Anthropic requests.
+
+Note: `.env <.env>`_ file configures additional environment variables:
+
+- `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true` configures
+ Anthropic instrumentation to capture prompt and completion contents on
+ events.
+
+Setup
+-----
+
+An OTLP compatible endpoint should be listening for traces and logs on
+http://localhost:4317. If not, update ``OTEL_EXPORTER_OTLP_ENDPOINT`` accordingly.
+
+Next, set up a virtual environment like this:
+
+::
+
+ python3 -m venv .venv
+ source .venv/bin/activate
+ pip install "python-dotenv[cli]"
+ pip install -r requirements.txt
+
+You will also need an Anthropic API key. Set it as an environment variable:
+
+::
+
+ export ANTHROPIC_API_KEY=your_api_key_here
+
+Run
+---
+
+Run the example like this:
+
+::
+
+ dotenv run -- python main.py
+
+You should see a poem generated by Claude while traces and logs are exported
+to your configured observability tool.
+
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/manual/main.py b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/manual/main.py
new file mode 100644
index 000000000..fd3f4f13b
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/manual/main.py
@@ -0,0 +1,47 @@
+# pylint: skip-file
+import anthropic
+
+# NOTE: OpenTelemetry Python Logs and Events APIs are in beta
+from opentelemetry import _logs, trace
+from opentelemetry.exporter.otlp.proto.grpc._log_exporter import (
+ OTLPLogExporter,
+)
+from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import (
+ OTLPSpanExporter,
+)
+from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
+from opentelemetry.sdk._logs import LoggerProvider
+from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
+from opentelemetry.sdk.trace import TracerProvider
+from opentelemetry.sdk.trace.export import BatchSpanProcessor
+
+# configure tracing
+trace.set_tracer_provider(TracerProvider())
+trace.get_tracer_provider().add_span_processor(
+ BatchSpanProcessor(OTLPSpanExporter())
+)
+
+# configure logging and events
+_logs.set_logger_provider(LoggerProvider())
+_logs.get_logger_provider().add_log_record_processor(
+ BatchLogRecordProcessor(OTLPLogExporter())
+)
+
+# instrument Anthropic
+AnthropicInstrumentor().instrument()
+
+
+def main():
+ client = anthropic.Anthropic()
+ message = client.messages.create(
+ model="claude-3-5-sonnet-20241022",
+ max_tokens=1024,
+ messages=[
+ {"role": "user", "content": "Write a short poem on OpenTelemetry."}
+ ],
+ )
+ print(message.content[0].text)
+
+
+if __name__ == "__main__":
+ main()
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/manual/requirements.txt b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/manual/requirements.txt
new file mode 100644
index 000000000..117ef3720
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/manual/requirements.txt
@@ -0,0 +1,4 @@
+anthropic>=0.16.0
+opentelemetry-sdk~=1.37.0
+loongsuite-instrumentation-anthropic # TODO: update to the released version when available
+opentelemetry-exporter-otlp-proto-grpc~=1.37.0
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/zero-code/README.rst b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/zero-code/README.rst
new file mode 100644
index 000000000..6ead71bde
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/zero-code/README.rst
@@ -0,0 +1,56 @@
+OpenTelemetry Anthropic Zero-Code Instrumentation Example
+=========================================================
+
+This is an example of how to use OpenTelemetry's automatic instrumentation
+(zero-code) capabilities with the Anthropic SDK.
+
+The `opentelemetry-instrument` CLI automatically instruments your Python
+application without requiring code changes. When `main.py <main.py>`_ is run
+with the CLI, it exports traces and logs to an OTLP compatible endpoint.
+
+Note: `.env <.env>`_ file configures additional environment variables:
+
+- `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true` configures
+ Anthropic instrumentation to capture prompt and completion contents on
+ events.
+
+Setup
+-----
+
+An OTLP compatible endpoint should be listening for traces and logs on
+http://localhost:4317. If not, update ``OTEL_EXPORTER_OTLP_ENDPOINT`` accordingly.
+
+Next, set up a virtual environment like this:
+
+::
+
+ python3 -m venv .venv
+ source .venv/bin/activate
+ pip install "python-dotenv[cli]"
+ pip install -r requirements.txt
+
+You will also need an Anthropic API key. Set it as an environment variable:
+
+::
+
+ export ANTHROPIC_API_KEY=your_api_key_here
+
+Run
+---
+
+Run the example with zero-code instrumentation like this:
+
+::
+
+ dotenv run -- opentelemetry-instrument python main.py
+
+You should see a poem generated by Claude while traces and logs are exported
+to your configured observability tool. No changes to `main.py` were required!
+
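The ``opentelemetry-instrument`` CLI finds this instrumentation through the ``opentelemetry_instrumentor`` entry-point group declared in the package's ``pyproject.toml``. A small, self-contained sketch of that discovery (output depends entirely on what is installed in the current environment):

```python
from importlib.metadata import entry_points


def instrumentor_entry_points():
    """List entry points in the group that zero-code instrumentation scans."""
    try:
        # Python >= 3.10: entry_points accepts a group keyword.
        return list(entry_points(group="opentelemetry_instrumentor"))
    except TypeError:
        # Python 3.9: entry_points() returns a mapping of group -> entries.
        return list(entry_points().get("opentelemetry_instrumentor", []))


for ep in instrumentor_entry_points():
    print(ep.name, "->", ep.value)
```

In an environment with this package installed, the listing would include an ``anthropic`` entry pointing at ``AnthropicInstrumentor``; in a bare environment the list is simply empty.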
+Learn More
+----------
+
+See the `OpenTelemetry Python automatic instrumentation docs
+`_ for more
+information about zero-code instrumentation.
+
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/zero-code/main.py b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/zero-code/main.py
new file mode 100644
index 000000000..f37be09cb
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/zero-code/main.py
@@ -0,0 +1,18 @@
+# pylint: skip-file
+import anthropic
+
+
+def main():
+ client = anthropic.Anthropic()
+ message = client.messages.create(
+ model="claude-3-5-sonnet-20241022",
+ max_tokens=1024,
+ messages=[
+ {"role": "user", "content": "Write a short poem on OpenTelemetry."}
+ ],
+ )
+ print(message.content[0].text)
+
+
+if __name__ == "__main__":
+ main()
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/zero-code/requirements.txt b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/zero-code/requirements.txt
new file mode 100644
index 000000000..88b9cd8bf
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/examples/zero-code/requirements.txt
@@ -0,0 +1,5 @@
+anthropic>=0.16.0
+opentelemetry-sdk~=1.37.0
+opentelemetry-distro~=0.58b0
+loongsuite-instrumentation-anthropic # TODO: update to the released version when available
+opentelemetry-exporter-otlp-proto-grpc~=1.37.0
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/pyproject.toml b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/pyproject.toml
new file mode 100644
index 000000000..89bd4693f
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/pyproject.toml
@@ -0,0 +1,61 @@
+[build-system]
+requires = ["hatchling"]
+build-backend = "hatchling.build"
+
+[project]
+name = "loongsuite-instrumentation-anthropic"
+dynamic = ["version"]
+description = "LoongSuite Anthropic instrumentation"
+readme = "README.rst"
+license = "Apache-2.0"
+requires-python = ">=3.9"
+authors = [
+ { name = "OpenTelemetry Authors", email = "cncf-opentelemetry-contributors@lists.cncf.io" },
+]
+classifiers = [
+ "Development Status :: 4 - Beta",
+ "Intended Audience :: Developers",
+ "License :: OSI Approved :: Apache Software License",
+ "Programming Language :: Python",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3.9",
+ "Programming Language :: Python :: 3.10",
+ "Programming Language :: Python :: 3.11",
+ "Programming Language :: Python :: 3.12",
+ "Programming Language :: Python :: 3.13",
+ "Programming Language :: Python :: 3.14",
+]
+dependencies = [
+ "opentelemetry-api ~= 1.37",
+ "opentelemetry-instrumentation ~= 0.58b0",
+ "opentelemetry-semantic-conventions ~= 0.58b0",
+ "opentelemetry-util-genai >= 0.2b0, <0.4b0",
+]
+
+[project.optional-dependencies]
+instruments = [
+ "anthropic >= 0.16.0",
+]
+
+[project.entry-points.opentelemetry_instrumentor]
+anthropic = "opentelemetry.instrumentation.anthropic:AnthropicInstrumentor"
+
+[project.urls]
+Homepage = "https://github.com/alibaba/loongsuite-python-agent/tree/main/instrumentation-loongsuite/loongsuite-instrumentation-anthropic"
+Repository = "https://github.com/alibaba/loongsuite-python-agent"
+
+[tool.hatch.version]
+path = "src/opentelemetry/instrumentation/anthropic/version.py"
+
+[tool.hatch.build.targets.sdist]
+include = [
+ "/src",
+ "/tests",
+ "/examples",
+]
+
+[tool.hatch.build.targets.wheel]
+packages = ["src/opentelemetry"]
+
+[tool.pytest.ini_options]
+testpaths = ["tests"]
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/src/opentelemetry/instrumentation/anthropic/__init__.py b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/src/opentelemetry/instrumentation/anthropic/__init__.py
new file mode 100644
index 000000000..91562f572
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/src/opentelemetry/instrumentation/anthropic/__init__.py
@@ -0,0 +1,149 @@
+# Copyright The OpenTelemetry Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""
+OpenTelemetry Anthropic Instrumentation
+========================================
+
+Instrumentation for the Anthropic Python SDK.
+
+Usage
+-----
+
+.. code-block:: python
+
+ from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
+ import anthropic
+
+ # Enable instrumentation
+ AnthropicInstrumentor().instrument()
+
+ # Use Anthropic client normally
+ client = anthropic.Anthropic()
+ response = client.messages.create(
+ model="claude-sonnet-4-20250514",
+ max_tokens=1024,
+ messages=[{"role": "user", "content": "Hello!"}]
+ )
+
+Configuration
+-------------
+
+Message content capture can be enabled by setting the environment variable:
+``OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true``
+
+Or via the ``OTEL_INSTRUMENTATION_GENAI_EXPERIMENTAL_CONTENT_CAPTURING_MODE``
+environment variable (values: ``none``, ``span``, ``event``, ``all``).
+
+API
+---
+"""
+
+from typing import Any, Collection
+
+from wrapt import (
+ wrap_function_wrapper, # pyright: ignore[reportUnknownVariableType]
+)
+
+from opentelemetry.instrumentation.anthropic.package import _instruments
+from opentelemetry.instrumentation.anthropic.patch import (
+ async_messages_create,
+ messages_create,
+)
+from opentelemetry.instrumentation.instrumentor import BaseInstrumentor
+from opentelemetry.instrumentation.utils import unwrap
+from opentelemetry.util.genai.handler import TelemetryHandler
+from opentelemetry.util.genai.types import ContentCapturingMode
+from opentelemetry.util.genai.utils import (
+ get_content_capturing_mode,
+ is_experimental_mode,
+)
+
+
+class AnthropicInstrumentor(BaseInstrumentor):
+ """An instrumentor for the Anthropic Python SDK.
+
+ This instrumentor will automatically trace Anthropic API calls and
+ optionally capture message content as events.
+
+ Supported features:
+ - Sync and async Messages.create
+ - Streaming responses (stream=True)
+ - Content capture (input/output messages, system instructions)
+ - Tool calling (tool definitions, tool use blocks, tool results)
+ - Token usage metrics (including cache tokens)
+ - Error handling and exception recording
+ """
+
+ def __init__(self) -> None:
+ super().__init__()
+
+ # pylint: disable=no-self-use
+ def instrumentation_dependencies(self) -> Collection[str]:
+ return _instruments
+
+ def _instrument(self, **kwargs: Any) -> None:
+ """Enable Anthropic instrumentation.
+
+ Args:
+ **kwargs: Optional arguments
+ - tracer_provider: TracerProvider instance
+ - meter_provider: MeterProvider instance
+ - logger_provider: LoggerProvider instance
+ """
+ tracer_provider = kwargs.get("tracer_provider")
+ meter_provider = kwargs.get("meter_provider")
+ logger_provider = kwargs.get("logger_provider")
+
+ handler = TelemetryHandler(
+ tracer_provider=tracer_provider,
+ meter_provider=meter_provider,
+ logger_provider=logger_provider,
+ )
+
+ content_mode = (
+ get_content_capturing_mode()
+ if is_experimental_mode()
+ else ContentCapturingMode.NO_CONTENT
+ )
+
+ # Patch sync Messages.create
+ wrap_function_wrapper(
+ "anthropic.resources.messages",
+ "Messages.create",
+ messages_create(handler, content_mode),
+ )
+
+ # Patch async AsyncMessages.create
+ wrap_function_wrapper(
+ "anthropic.resources.messages",
+ "AsyncMessages.create",
+ async_messages_create(handler, content_mode),
+ )
+
+ def _uninstrument(self, **kwargs: Any) -> None:
+ """Disable Anthropic instrumentation.
+
+ This removes all patches applied during instrumentation.
+ """
+ import anthropic # pylint: disable=import-outside-toplevel # noqa: PLC0415
+
+ unwrap(
+ anthropic.resources.messages.Messages, # pyright: ignore[reportAttributeAccessIssue,reportUnknownMemberType,reportUnknownArgumentType]
+ "create",
+ )
+ unwrap(
+ anthropic.resources.messages.AsyncMessages, # pyright: ignore[reportAttributeAccessIssue,reportUnknownMemberType,reportUnknownArgumentType]
+ "create",
+ )
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/src/opentelemetry/instrumentation/anthropic/package.py b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/src/opentelemetry/instrumentation/anthropic/package.py
new file mode 100644
index 000000000..576638a5f
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/src/opentelemetry/instrumentation/anthropic/package.py
@@ -0,0 +1,15 @@
+# Copyright The OpenTelemetry Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+_instruments = ("anthropic >= 0.16.0",)
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/src/opentelemetry/instrumentation/anthropic/patch.py b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/src/opentelemetry/instrumentation/anthropic/patch.py
new file mode 100644
index 000000000..1abb17f5b
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/src/opentelemetry/instrumentation/anthropic/patch.py
@@ -0,0 +1,489 @@
+# Copyright The OpenTelemetry Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Patching functions for Anthropic instrumentation."""
+
+from __future__ import annotations
+
+import json
+import timeit
+from typing import TYPE_CHECKING, Any, Callable, Optional
+
+from opentelemetry.util.genai.handler import TelemetryHandler
+from opentelemetry.util.genai.types import (
+ ContentCapturingMode,
+ Error,
+ LLMInvocation,
+ MessagePart,
+ OutputMessage,
+ Text,
+ ToolCall,
+)
+
+from .utils import (
+ create_anthropic_invocation,
+ is_streaming,
+ populate_response,
+)
+
+if TYPE_CHECKING:
+ from anthropic.resources.messages import AsyncMessages, Messages
+ from anthropic.types import Message
+
+
+def messages_create(
+ handler: TelemetryHandler,
+ content_capturing_mode: ContentCapturingMode,
+) -> Callable[..., "Message"]:
+ """Wrap the `create` method of the `Messages` class to trace it."""
+ capture_content = content_capturing_mode != ContentCapturingMode.NO_CONTENT
+
+ def traced_method(
+ wrapped: Callable[..., "Message"],
+ instance: "Messages",
+ args: tuple[Any, ...],
+ kwargs: dict[str, Any],
+ ) -> "Message":
+ invocation = handler.start_llm(
+ create_anthropic_invocation(kwargs, instance, capture_content)
+ )
+
+ try:
+ result = wrapped(*args, **kwargs)
+
+ if is_streaming(kwargs):
+ return AnthropicStreamWrapper(
+ result, handler, invocation, capture_content
+ )
+
+ populate_response(invocation, result, capture_content)
+ handler.stop_llm(invocation)
+ return result
+
+ except Exception as error:
+ handler.fail_llm(
+ invocation, Error(type=type(error), message=str(error))
+ )
+ raise
+
+ return traced_method
+
+
+def async_messages_create(
+ handler: TelemetryHandler,
+ content_capturing_mode: ContentCapturingMode,
+) -> Callable[..., "Message"]:
+ """Wrap the `create` method of the `AsyncMessages` class to trace it."""
+ capture_content = content_capturing_mode != ContentCapturingMode.NO_CONTENT
+
+ async def traced_method(
+ wrapped: Callable[..., "Message"],
+ instance: "AsyncMessages",
+ args: tuple[Any, ...],
+ kwargs: dict[str, Any],
+ ) -> "Message":
+ invocation = handler.start_llm(
+ create_anthropic_invocation(kwargs, instance, capture_content)
+ )
+
+ try:
+ result = await wrapped(*args, **kwargs)
+
+ if is_streaming(kwargs):
+ return AsyncAnthropicStreamWrapper(
+ result, handler, invocation, capture_content
+ )
+
+ populate_response(invocation, result, capture_content)
+ handler.stop_llm(invocation)
+ return result
+
+ except Exception as error:
+ handler.fail_llm(
+ invocation, Error(type=type(error), message=str(error))
+ )
+ raise
+
+ return traced_method
+
+
+class _ContentBlockAccumulator:
+ """Accumulates content blocks from streaming events."""
+
+ def __init__(self):
+ self.text_parts: list[str] = []
+ self.tool_calls: list[dict[str, Any]] = []
+ self._current_block_type: str | None = None
+ self._current_block_data: dict[str, Any] = {}
+
+ def on_content_block_start(self, content_block: Any) -> None:
+ """Handle a content_block_start event."""
+ block_type = getattr(content_block, "type", None)
+ self._current_block_type = block_type
+ self._current_block_data = {}
+
+ if block_type == "tool_use":
+ self._current_block_data = {
+ "name": getattr(content_block, "name", ""),
+ "id": getattr(content_block, "id", None),
+ "input_json": "",
+ }
+
+ def on_content_block_delta(self, delta: Any) -> None:
+ """Handle a content_block_delta event."""
+ delta_type = getattr(delta, "type", None)
+
+ if delta_type == "text_delta":
+ text = getattr(delta, "text", "")
+ self.text_parts.append(text)
+ elif delta_type == "input_json_delta":
+ partial_json = getattr(delta, "partial_json", "")
+ if self._current_block_data:
+ self._current_block_data["input_json"] += partial_json
+
+ def on_content_block_stop(self) -> None:
+ """Handle a content_block_stop event."""
+ if self._current_block_type == "tool_use" and self._current_block_data:
+ input_json_str = self._current_block_data.get("input_json", "")
+ try:
+ arguments = json.loads(input_json_str) if input_json_str else None
+ except (json.JSONDecodeError, ValueError):
+ arguments = input_json_str
+
+ self.tool_calls.append({
+ "name": self._current_block_data.get("name", ""),
+ "id": self._current_block_data.get("id"),
+ "arguments": arguments,
+ })
+
+ self._current_block_type = None
+ self._current_block_data = {}
+
+ def build_output_messages(
+ self, stop_reason: str | None
+ ) -> list[OutputMessage]:
+ """Build OutputMessage list from accumulated data."""
+ parts: list[MessagePart] = []
+
+ text = "".join(self.text_parts)
+ if text:
+ parts.append(Text(content=text))
+
+ for tc in self.tool_calls:
+ parts.append(
+ ToolCall(
+ name=tc["name"],
+ id=tc.get("id"),
+ arguments=tc.get("arguments"),
+ )
+ )
+
+ if not parts:
+ return []
+
+ return [
+ OutputMessage(
+ role="assistant",
+ parts=parts,
+ finish_reason=stop_reason or "error",
+ )
+ ]
+
+
+class AnthropicStreamWrapper:
+ """Wraps an Anthropic streaming response to capture telemetry."""
+
+ def __init__(
+ self,
+ stream: Any,
+ handler: TelemetryHandler,
+ invocation: LLMInvocation,
+ capture_content: bool,
+ ):
+ self.stream = stream
+ self.handler = handler
+ self.invocation = invocation
+ self.capture_content = capture_content
+ self._accumulator = _ContentBlockAccumulator()
+ self._started = True
+ self._response_model: str | None = None
+ self._response_id: str | None = None
+ self._stop_reason: str | None = None
+ self._input_tokens: int | None = None
+ self._output_tokens: int | None = None
+ self._cache_creation_input_tokens: int | None = None
+ self._cache_read_input_tokens: int | None = None
+
+ def __enter__(self):
+ return self
+
+ def __exit__(self, exc_type, exc_val, exc_tb):
+ error = exc_val if exc_type else None
+ self._cleanup(error)
+ return False
+
+ def __iter__(self):
+ return self
+
+ def __next__(self):
+ try:
+ event = next(self.stream)
+ self._process_event(event)
+ return event
+ except StopIteration:
+ self._cleanup()
+ raise
+ except Exception as error:
+ self._cleanup(error)
+ raise
+
+ def close(self):
+ if hasattr(self.stream, "close"):
+ self.stream.close()
+ self._cleanup()
+
+ def _process_event(self, event: Any) -> None:
+ """Process a single streaming event."""
+ event_type = getattr(event, "type", None)
+
+ # Record time to first token
+ if (
+ self.invocation.monotonic_first_token_s is None
+ and event_type in ("content_block_delta",)
+ ):
+ self.invocation.monotonic_first_token_s = timeit.default_timer()
+
+ if event_type == "message_start":
+ message = getattr(event, "message", None)
+ if message:
+ self._response_model = getattr(message, "model", None)
+ self._response_id = getattr(message, "id", None)
+ usage = getattr(message, "usage", None)
+ if usage:
+ self._input_tokens = getattr(usage, "input_tokens", None)
+ if hasattr(usage, "cache_creation_input_tokens"):
+ self._cache_creation_input_tokens = getattr(
+ usage, "cache_creation_input_tokens", None
+ )
+ if hasattr(usage, "cache_read_input_tokens"):
+ self._cache_read_input_tokens = getattr(
+ usage, "cache_read_input_tokens", None
+ )
+
+ elif event_type == "content_block_start":
+ content_block = getattr(event, "content_block", None)
+ if content_block and self.capture_content:
+ self._accumulator.on_content_block_start(content_block)
+
+ elif event_type == "content_block_delta":
+ delta = getattr(event, "delta", None)
+ if delta and self.capture_content:
+ self._accumulator.on_content_block_delta(delta)
+
+ elif event_type == "content_block_stop":
+ if self.capture_content:
+ self._accumulator.on_content_block_stop()
+
+ elif event_type == "message_delta":
+ delta = getattr(event, "delta", None)
+ if delta:
+ self._stop_reason = getattr(delta, "stop_reason", None)
+ usage = getattr(event, "usage", None)
+ if usage:
+ self._output_tokens = getattr(usage, "output_tokens", None)
+
+ elif event_type == "message_stop":
+ pass # Cleanup happens in __next__ StopIteration or __exit__
+
+ def _cleanup(self, error: Optional[BaseException] = None) -> None:
+ """Finalize the invocation with accumulated data."""
+ if not self._started:
+ return
+ self._started = False
+
+ self.invocation.response_model_name = self._response_model
+ self.invocation.response_id = self._response_id
+ self.invocation.input_tokens = self._input_tokens
+ self.invocation.output_tokens = self._output_tokens
+
+ if self._cache_creation_input_tokens:
+ self.invocation.usage_cache_creation_input_tokens = (
+ self._cache_creation_input_tokens
+ )
+ if self._cache_read_input_tokens:
+ self.invocation.usage_cache_read_input_tokens = (
+ self._cache_read_input_tokens
+ )
+
+ if self._stop_reason:
+ self.invocation.finish_reasons = [self._stop_reason]
+
+ if self.capture_content:
+ self.invocation.output_messages = (
+ self._accumulator.build_output_messages(self._stop_reason)
+ )
+
+ if error:
+ self.handler.fail_llm(
+ self.invocation,
+ Error(type=type(error), message=str(error)),
+ )
+ else:
+ self.handler.stop_llm(self.invocation)
+
+ def __getattr__(self, name):
+ """Proxy attribute access to the underlying stream."""
+ return getattr(self.stream, name)
+
+
+class AsyncAnthropicStreamWrapper:
+ """Wraps an Anthropic async streaming response to capture telemetry."""
+
+ def __init__(
+ self,
+ stream: Any,
+ handler: TelemetryHandler,
+ invocation: LLMInvocation,
+ capture_content: bool,
+ ):
+ self.stream = stream
+ self.handler = handler
+ self.invocation = invocation
+ self.capture_content = capture_content
+ self._accumulator = _ContentBlockAccumulator()
+ self._started = True
+ self._response_model: str | None = None
+ self._response_id: str | None = None
+ self._stop_reason: str | None = None
+ self._input_tokens: int | None = None
+ self._output_tokens: int | None = None
+ self._cache_creation_input_tokens: int | None = None
+ self._cache_read_input_tokens: int | None = None
+
+ async def __aenter__(self):
+ return self
+
+ async def __aexit__(self, exc_type, exc_val, exc_tb):
+ error = exc_val if exc_type else None
+ self._cleanup(error)
+ return False
+
+ def __aiter__(self):
+ return self
+
+ async def __anext__(self):
+ try:
+ event = await self.stream.__anext__()
+ self._process_event(event)
+ return event
+ except StopAsyncIteration:
+ self._cleanup()
+ raise
+ except Exception as error:
+ self._cleanup(error)
+ raise
+
+ async def close(self):
+ if hasattr(self.stream, "close"):
+ await self.stream.close()
+ self._cleanup()
+
+ def _process_event(self, event: Any) -> None:
+ """Process a single streaming event - same logic as sync."""
+ event_type = getattr(event, "type", None)
+
+        # Record time to first token
+ if (
+ self.invocation.monotonic_first_token_s is None
+ and event_type in ("content_block_delta",)
+ ):
+ self.invocation.monotonic_first_token_s = timeit.default_timer()
+
+ if event_type == "message_start":
+ message = getattr(event, "message", None)
+ if message:
+ self._response_model = getattr(message, "model", None)
+ self._response_id = getattr(message, "id", None)
+ usage = getattr(message, "usage", None)
+ if usage:
+ self._input_tokens = getattr(usage, "input_tokens", None)
+ if hasattr(usage, "cache_creation_input_tokens"):
+ self._cache_creation_input_tokens = getattr(
+ usage, "cache_creation_input_tokens", None
+ )
+ if hasattr(usage, "cache_read_input_tokens"):
+ self._cache_read_input_tokens = getattr(
+ usage, "cache_read_input_tokens", None
+ )
+
+ elif event_type == "content_block_start":
+ content_block = getattr(event, "content_block", None)
+ if content_block and self.capture_content:
+ self._accumulator.on_content_block_start(content_block)
+
+ elif event_type == "content_block_delta":
+ delta = getattr(event, "delta", None)
+ if delta and self.capture_content:
+ self._accumulator.on_content_block_delta(delta)
+
+ elif event_type == "content_block_stop":
+ if self.capture_content:
+ self._accumulator.on_content_block_stop()
+
+ elif event_type == "message_delta":
+ delta = getattr(event, "delta", None)
+ if delta:
+ self._stop_reason = getattr(delta, "stop_reason", None)
+ usage = getattr(event, "usage", None)
+ if usage:
+ self._output_tokens = getattr(usage, "output_tokens", None)
+
+ def _cleanup(self, error: Optional[BaseException] = None) -> None:
+ """Finalize the invocation with accumulated data."""
+ if not self._started:
+ return
+ self._started = False
+
+ self.invocation.response_model_name = self._response_model
+ self.invocation.response_id = self._response_id
+ self.invocation.input_tokens = self._input_tokens
+ self.invocation.output_tokens = self._output_tokens
+
+ if self._cache_creation_input_tokens:
+ self.invocation.usage_cache_creation_input_tokens = (
+ self._cache_creation_input_tokens
+ )
+ if self._cache_read_input_tokens:
+ self.invocation.usage_cache_read_input_tokens = (
+ self._cache_read_input_tokens
+ )
+
+ if self._stop_reason:
+ self.invocation.finish_reasons = [self._stop_reason]
+
+ if self.capture_content:
+ self.invocation.output_messages = (
+ self._accumulator.build_output_messages(self._stop_reason)
+ )
+
+ if error:
+ self.handler.fail_llm(
+ self.invocation,
+ Error(type=type(error), message=str(error)),
+ )
+ else:
+ self.handler.stop_llm(self.invocation)
+
+ def __getattr__(self, name):
+ """Proxy attribute access to the underlying stream."""
+ return getattr(self.stream, name)
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/src/opentelemetry/instrumentation/anthropic/utils.py b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/src/opentelemetry/instrumentation/anthropic/utils.py
new file mode 100644
index 000000000..a81b653e2
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/src/opentelemetry/instrumentation/anthropic/utils.py
@@ -0,0 +1,304 @@
+# Copyright The OpenTelemetry Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Utility functions for Anthropic instrumentation."""
+
+from __future__ import annotations
+
+from typing import TYPE_CHECKING, Any
+from urllib.parse import urlparse
+
+from opentelemetry.semconv._incubating.attributes import (
+ gen_ai_attributes as GenAIAttributes,
+)
+from opentelemetry.util.genai.types import (
+ FunctionToolDefinition,
+ InputMessage,
+ LLMInvocation,
+ MessagePart,
+ OutputMessage,
+ Reasoning,
+ Text,
+ ToolCall,
+ ToolCallResponse,
+)
+
+if TYPE_CHECKING:
+ from anthropic.resources.messages import Messages
+
+
+def is_streaming(kwargs: dict[str, Any]) -> bool:
+ """Check if the request is a streaming request."""
+ return bool(kwargs.get("stream"))
+
+
+def get_server_address_and_port(
+ client_instance: "Messages",
+) -> tuple[str | None, int | None]:
+ """Extract server address and port from the Anthropic client instance."""
+ base_client = getattr(client_instance, "_client", None)
+ base_url = getattr(base_client, "base_url", None)
+ if not base_url:
+ return None, None
+
+ address: str | None = None
+ port: int | None = None
+
+ if hasattr(base_url, "host"):
+ # httpx.URL object
+ address = base_url.host
+ port = getattr(base_url, "port", None)
+ elif isinstance(base_url, str):
+ url = urlparse(base_url)
+ address = url.hostname
+ port = url.port
+
+ if port == 443:
+ port = None
+
+ return address, port
+
+
+def create_anthropic_invocation(
+ kwargs: dict[str, Any],
+ client_instance: "Messages",
+ capture_content: bool,
+) -> LLMInvocation:
+ """Create an LLMInvocation from Anthropic Messages.create() parameters.
+
+ This populates the LLMInvocation with all semantic convention attributes
+ from the request parameters, including content capture if enabled.
+ """
+ invocation = LLMInvocation(
+ request_model=kwargs.get("model", ""),
+ )
+ invocation.provider = (
+ GenAIAttributes.GenAiSystemValues.ANTHROPIC.value # pyright: ignore[reportDeprecated]
+ )
+
+ # Request parameters
+ invocation.max_tokens = kwargs.get("max_tokens")
+ invocation.temperature = kwargs.get("temperature")
+ invocation.top_p = kwargs.get("top_p")
+ invocation.top_k = kwargs.get("top_k")
+
+ stop_sequences = kwargs.get("stop_sequences")
+ if stop_sequences is not None:
+ invocation.stop_sequences = list(stop_sequences)
+
+ # Server address
+ address, port = get_server_address_and_port(client_instance)
+ if address:
+ invocation.server_address = address
+ if port:
+ invocation.server_port = port
+
+ # Content capture
+ if capture_content:
+ # Input messages
+ messages = kwargs.get("messages", [])
+ invocation.input_messages = _convert_messages_to_input(messages)
+
+ # System instruction
+ system = kwargs.get("system")
+ if system:
+ invocation.system_instruction = _convert_system_to_parts(system)
+
+ # Tool definitions
+ tools = kwargs.get("tools")
+ if tools:
+ invocation.tool_definitions = _convert_tools_to_definitions(tools)
+
+ return invocation
+
+
+def populate_response(
+ invocation: LLMInvocation,
+ result: Any,
+ capture_content: bool,
+) -> None:
+ """Populate an LLMInvocation with response data from an Anthropic Message."""
+ if result is None:
+ return
+
+ if getattr(result, "model", None):
+ invocation.response_model_name = result.model
+
+ if getattr(result, "id", None):
+ invocation.response_id = result.id
+
+ if getattr(result, "stop_reason", None):
+ invocation.finish_reasons = [result.stop_reason]
+
+ # Token usage
+ usage = getattr(result, "usage", None)
+ if usage:
+ if hasattr(usage, "input_tokens"):
+ invocation.input_tokens = usage.input_tokens
+ if hasattr(usage, "output_tokens"):
+ invocation.output_tokens = usage.output_tokens
+        # Cache token usage (Anthropic-specific)
+        cache_creation = getattr(usage, "cache_creation_input_tokens", None)
+        if cache_creation:
+            invocation.usage_cache_creation_input_tokens = cache_creation
+        cache_read = getattr(usage, "cache_read_input_tokens", None)
+        if cache_read:
+            invocation.usage_cache_read_input_tokens = cache_read
+
+ # Output messages (content blocks)
+ if capture_content:
+ content = getattr(result, "content", None)
+ if content:
+ invocation.output_messages = _convert_content_blocks_to_output(
+ content, getattr(result, "stop_reason", None)
+ )
+
+
+def _convert_messages_to_input(
+ messages: list[dict[str, Any]],
+) -> list[InputMessage]:
+ """Convert Anthropic message format to InputMessage list."""
+ input_messages: list[InputMessage] = []
+ for msg in messages:
+ role = msg.get("role", "user")
+ parts: list[MessagePart] = []
+ content = msg.get("content")
+
+ if isinstance(content, str):
+ parts.append(Text(content=content))
+ elif isinstance(content, list):
+ for block in content:
+ if isinstance(block, dict):
+ block_type = block.get("type", "")
+ if block_type == "text":
+ parts.append(Text(content=block.get("text", "")))
+ elif block_type == "tool_use":
+ parts.append(
+ ToolCall(
+ name=block.get("name", ""),
+ id=block.get("id"),
+ arguments=block.get("input"),
+ )
+ )
+                    elif block_type == "tool_result":
+                        tool_content = block.get("content", "")
+                        if isinstance(tool_content, list):
+                            # Extract text from nested content blocks
+                            text_parts = []
+                            for sub_block in tool_content:
+                                if (
+                                    isinstance(sub_block, dict)
+                                    and sub_block.get("type") == "text"
+                                ):
+                                    text_parts.append(
+                                        sub_block.get("text", "")
+                                    )
+                            tool_content = "\n".join(text_parts)
+ parts.append(
+ ToolCallResponse(
+ id=block.get("tool_use_id"),
+ response=tool_content,
+ )
+ )
+ elif block_type == "image":
+ # Skip image blocks for now (binary data)
+ pass
+ elif isinstance(block, str):
+ parts.append(Text(content=block))
+
+ input_messages.append(InputMessage(role=role, parts=parts))
+ return input_messages
+
+
+def _convert_system_to_parts(
+ system: Any,
+) -> list[MessagePart]:
+ """Convert Anthropic system prompt to MessagePart list."""
+ parts: list[MessagePart] = []
+ if isinstance(system, str):
+ parts.append(Text(content=system))
+ elif isinstance(system, list):
+ for block in system:
+ if isinstance(block, dict):
+ if block.get("type") == "text":
+ parts.append(Text(content=block.get("text", "")))
+ elif isinstance(block, str):
+ parts.append(Text(content=block))
+ return parts
+
+
+def _convert_tools_to_definitions(
+ tools: list[Any],
+) -> list[FunctionToolDefinition]:
+ """Convert Anthropic tool definitions to FunctionToolDefinition list."""
+ definitions: list[FunctionToolDefinition] = []
+ for tool in tools:
+ if isinstance(tool, dict):
+ name = tool.get("name", "")
+ description = tool.get("description")
+ input_schema = tool.get("input_schema")
+ definitions.append(
+ FunctionToolDefinition(
+ name=name,
+ description=description,
+ parameters=input_schema,
+ )
+ )
+ else:
+ # Pydantic model or other object
+ name = getattr(tool, "name", "")
+ description = getattr(tool, "description", None)
+ input_schema = getattr(tool, "input_schema", None)
+ tool_type = getattr(tool, "type", "custom")
+ if tool_type == "custom" or name:
+ definitions.append(
+ FunctionToolDefinition(
+ name=name,
+ description=description,
+ parameters=input_schema,
+ )
+ )
+ return definitions
+
+
+def _convert_content_blocks_to_output(
+ content: list[Any],
+ stop_reason: str | None,
+) -> list[OutputMessage]:
+ """Convert Anthropic content blocks to OutputMessage list."""
+ parts: list[MessagePart] = []
+ for block in content:
+ block_type = getattr(block, "type", None)
+ if block_type == "text":
+ text = getattr(block, "text", "")
+ parts.append(Text(content=text))
+ elif block_type == "tool_use":
+ name = getattr(block, "name", "")
+ tool_id = getattr(block, "id", None)
+ tool_input = getattr(block, "input", None)
+ parts.append(
+ ToolCall(
+ name=name,
+ id=tool_id,
+ arguments=tool_input,
+ )
+ )
+ elif block_type == "thinking":
+            # Extended thinking block - captured as a Reasoning part
+ thinking_text = getattr(block, "thinking", "")
+ if thinking_text:
+ parts.append(Reasoning(content=thinking_text))
+
+ if parts:
+ return [
+ OutputMessage(
+ role="assistant",
+ parts=parts,
+ finish_reason=stop_reason or "error",
+ )
+ ]
+ return []
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/src/opentelemetry/instrumentation/anthropic/version.py b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/src/opentelemetry/instrumentation/anthropic/version.py
new file mode 100644
index 000000000..61ae9b7c2
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/src/opentelemetry/instrumentation/anthropic/version.py
@@ -0,0 +1,15 @@
+# Copyright The OpenTelemetry Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+__version__ = "2.0b0.dev"
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/__init__.py b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/__init__.py
new file mode 100644
index 000000000..b0a6f4284
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/__init__.py
@@ -0,0 +1,13 @@
+# Copyright The OpenTelemetry Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/cassettes/test_sync_messages_create_api_error.yaml b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/cassettes/test_sync_messages_create_api_error.yaml
new file mode 100644
index 000000000..394058219
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/cassettes/test_sync_messages_create_api_error.yaml
@@ -0,0 +1,437 @@
+interactions:
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Hello"
+ }
+ ],
+ "model": "invalid-model-name"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '110'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python
+ x-api-key:
+ - test_anthropic_api_key
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "type": "error",
+ "error": {
+ "type": "not_found_error",
+ "message": "model: invalid-model-name"
+ }
+ }
+ headers:
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Mon, 15 Dec 2024 10:00:04 GMT
+ Server:
+ - cloudflare
+ content-length:
+ - '105'
+ status:
+ code: 404
+ message: Not Found
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Hello"
+ }
+ ],
+ "model": "invalid-model-name"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '94'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "type": "error",
+ "error": {
+ "type": "authentication_error",
+ "message": "invalid x-api-key"
+ },
+ "request_id": "req_011CX88XGWSm82bN96ZkDWcr"
+ }
+ headers:
+ CF-RAY:
+ - 9be0e03eff016e28-EWR
+ Connection:
+ - keep-alive
+ Content-Length:
+ - '130'
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:22:32 GMT
+ Server:
+ - cloudflare
+ X-Robots-Tag:
+ - none
+ cf-cache-status:
+ - DYNAMIC
+ request-id:
+ - req_011CX88XGWSm82bN96ZkDWcr
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '13'
+ x-should-retry:
+ - 'false'
+ status:
+ code: 401
+ message: Unauthorized
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Hello"
+ }
+ ],
+ "model": "invalid-model-name"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '94'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "type": "error",
+ "error": {
+ "type": "invalid_request_error",
+ "message": "Your credit balance is too low to access the Anthropic API. Please go to Plans & Billing to upgrade or purchase credits."
+ },
+ "request_id": "req_011CX88ZmBumGkvj7aK6Gqzx"
+ }
+ headers:
+ CF-RAY:
+ - 9be0e1127fd1b1bc-EWR
+ Connection:
+ - keep-alive
+ Content-Length:
+ - '234'
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:23:06 GMT
+ Server:
+ - cloudflare
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ cf-cache-status:
+ - DYNAMIC
+ request-id:
+ - req_011CX88ZmBumGkvj7aK6Gqzx
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '59'
+ x-should-retry:
+ - 'false'
+ status:
+ code: 400
+ message: Bad Request
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Hello"
+ }
+ ],
+ "model": "invalid-model-name"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '94'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "type": "error",
+ "error": {
+ "type": "not_found_error",
+ "message": "model: invalid-model-name"
+ },
+ "request_id": "req_011CX89f9XqPgyRJC1ASfAab"
+ }
+ headers:
+ CF-RAY:
+ - 9be0f610add5b89f-EWR
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:37:25 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ cf-cache-status:
+ - DYNAMIC
+ content-length:
+ - '133'
+ request-id:
+ - req_011CX89f9XqPgyRJC1ASfAab
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '27'
+ x-should-retry:
+ - 'false'
+ status:
+ code: 404
+ message: Not Found
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Hello"
+ }
+ ],
+ "model": "invalid-model-name"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '94'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "type": "error",
+ "error": {
+ "type": "not_found_error",
+ "message": "model: invalid-model-name"
+ },
+ "request_id": "req_011CX89kiWcGNsjEPrPGA42x"
+ }
+ headers:
+ CF-RAY:
+ - 9be0f7e8dbf70ab9-EWR
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:38:41 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ cf-cache-status:
+ - DYNAMIC
+ content-length:
+ - '133'
+ request-id:
+ - req_011CX89kiWcGNsjEPrPGA42x
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '78'
+ x-should-retry:
+ - 'false'
+ status:
+ code: 404
+ message: Not Found
+version: 1
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/cassettes/test_sync_messages_create_basic.yaml b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/cassettes/test_sync_messages_create_basic.yaml
new file mode 100644
index 000000000..205b1c2e4
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/cassettes/test_sync_messages_create_basic.yaml
@@ -0,0 +1,853 @@
+interactions:
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hello in one word."
+ }
+ ],
+ "model": "claude-3-5-sonnet-20241022"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '128'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python
+ x-api-key:
+ - test_anthropic_api_key
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "id": "msg_01XFDUDYJgAACzvnptvVoYEL",
+ "type": "message",
+ "role": "assistant",
+ "model": "claude-3-5-sonnet-20241022",
+ "content": [
+ {
+ "type": "text",
+ "text": "Hello!"
+ }
+ ],
+ "stop_reason": "end_turn",
+ "stop_sequence": null,
+ "usage": {
+ "input_tokens": 14,
+ "output_tokens": 4
+ }
+ }
+ headers:
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Mon, 15 Dec 2024 10:00:00 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ content-length:
+ - '350'
+ status:
+ code: 200
+ message: OK
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hello in one word."
+ }
+ ],
+ "model": "claude-3-5-sonnet-20241022"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '119'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "type": "error",
+ "error": {
+ "type": "authentication_error",
+ "message": "invalid x-api-key"
+ },
+ "request_id": "req_011CX88X7DM3SrsuuERgZeYJ"
+ }
+ headers:
+ CF-RAY:
+ - 9be0e0315fa2a02c-EWR
+ Connection:
+ - keep-alive
+ Content-Length:
+ - '130'
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:22:30 GMT
+ Server:
+ - cloudflare
+ X-Robots-Tag:
+ - none
+ cf-cache-status:
+ - DYNAMIC
+ request-id:
+ - req_011CX88X7DM3SrsuuERgZeYJ
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '13'
+ x-should-retry:
+ - 'false'
+ status:
+ code: 401
+ message: Unauthorized
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hello in one word."
+ }
+ ],
+ "model": "claude-3-5-sonnet-20241022"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '119'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "type": "error",
+ "error": {
+ "type": "invalid_request_error",
+ "message": "Your credit balance is too low to access the Anthropic API. Please go to Plans & Billing to upgrade or purchase credits."
+ },
+ "request_id": "req_011CX88Zakc7NS5rDAkMMK45"
+ }
+ headers:
+ CF-RAY:
+ - 9be0e1032e73ace5-EWR
+ Connection:
+ - keep-alive
+ Content-Length:
+ - '234'
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:23:03 GMT
+ Server:
+ - cloudflare
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ cf-cache-status:
+ - DYNAMIC
+ request-id:
+ - req_011CX88Zakc7NS5rDAkMMK45
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '30'
+ x-should-retry:
+ - 'false'
+ status:
+ code: 400
+ message: Bad Request
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hello in one word."
+ }
+ ],
+ "model": "claude-3-5-sonnet-20241022"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '119'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "type": "error",
+ "error": {
+ "type": "invalid_request_error",
+ "message": "Your credit balance is too low to access the Anthropic API. Please go to Plans & Billing to upgrade or purchase credits."
+ },
+ "request_id": "req_011CX88qYh2YvHnk7Hp8vCR7"
+ }
+ headers:
+ CF-RAY:
+ - 9be0e64cc92eb4c6-EWR
+ Connection:
+ - keep-alive
+ Content-Length:
+ - '234'
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:26:40 GMT
+ Server:
+ - cloudflare
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ cf-cache-status:
+ - DYNAMIC
+ request-id:
+ - req_011CX88qYh2YvHnk7Hp8vCR7
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '20'
+ x-should-retry:
+ - 'false'
+ status:
+ code: 400
+ message: Bad Request
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hello in one word."
+ }
+ ],
+ "model": "claude-3-5-sonnet-20241022"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '119'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "type": "error",
+ "error": {
+ "type": "not_found_error",
+ "message": "model: claude-3-5-sonnet-20241022"
+ },
+ "request_id": "req_011CX89Wba1H8DoNZMAq5M9M"
+ }
+ headers:
+ CF-RAY:
+ - 9be0f33baa01cdf0-EWR
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:35:29 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ cf-cache-status:
+ - DYNAMIC
+ content-length:
+ - '141'
+ request-id:
+ - req_011CX89Wba1H8DoNZMAq5M9M
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '28'
+ x-should-retry:
+ - 'false'
+ status:
+ code: 404
+ message: Not Found
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hello in one word."
+ }
+ ],
+ "model": "claude-sonnet-4-20250514"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '117'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "model": "claude-sonnet-4-20250514",
+ "id": "msg_01ChhLEFb4TSQWHpQzFqEQsj",
+ "type": "message",
+ "role": "assistant",
+ "content": [
+ {
+ "type": "text",
+ "text": "Hello!"
+ }
+ ],
+ "stop_reason": "end_turn",
+ "stop_sequence": null,
+ "usage": {
+ "input_tokens": 13,
+ "cache_creation_input_tokens": 0,
+ "cache_read_input_tokens": 0,
+ "cache_creation": {
+ "ephemeral_5m_input_tokens": 0,
+ "ephemeral_1h_input_tokens": 0
+ },
+ "output_tokens": 5,
+ "service_tier": "standard"
+ }
+ }
+ headers:
+ CF-RAY:
+ - 9be0f55b2b560866-EWR
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:36:58 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ anthropic-ratelimit-input-tokens-limit:
+ - '30000'
+ anthropic-ratelimit-input-tokens-remaining:
+ - '30000'
+ anthropic-ratelimit-input-tokens-reset:
+ - '2026-01-14T23:36:58Z'
+ anthropic-ratelimit-output-tokens-limit:
+ - '8000'
+ anthropic-ratelimit-output-tokens-remaining:
+ - '8000'
+ anthropic-ratelimit-output-tokens-reset:
+ - '2026-01-14T23:36:58Z'
+ anthropic-ratelimit-requests-limit:
+ - '50'
+ anthropic-ratelimit-requests-remaining:
+ - '49'
+ anthropic-ratelimit-requests-reset:
+ - '2026-01-14T23:36:58Z'
+ anthropic-ratelimit-tokens-limit:
+ - '38000'
+ anthropic-ratelimit-tokens-remaining:
+ - '38000'
+ anthropic-ratelimit-tokens-reset:
+ - '2026-01-14T23:36:58Z'
+ cf-cache-status:
+ - DYNAMIC
+ content-length:
+ - '409'
+ request-id:
+ - req_011CX89d1Mu8qapBc5y9KdXf
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '1596'
+ status:
+ code: 200
+ message: OK
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hello in one word."
+ }
+ ],
+ "model": "claude-sonnet-4-20250514"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '117'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "model": "claude-sonnet-4-20250514",
+ "id": "msg_01W7B22k9o9QrCqiEWmU1v9G",
+ "type": "message",
+ "role": "assistant",
+ "content": [
+ {
+ "type": "text",
+ "text": "Hello!"
+ }
+ ],
+ "stop_reason": "end_turn",
+ "stop_sequence": null,
+ "usage": {
+ "input_tokens": 13,
+ "cache_creation_input_tokens": 0,
+ "cache_read_input_tokens": 0,
+ "cache_creation": {
+ "ephemeral_5m_input_tokens": 0,
+ "ephemeral_1h_input_tokens": 0
+ },
+ "output_tokens": 5,
+ "service_tier": "standard"
+ }
+ }
+ headers:
+ CF-RAY:
+ - 9be0f5d0cffcdcde-EWR
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:37:17 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ anthropic-ratelimit-input-tokens-limit:
+ - '30000'
+ anthropic-ratelimit-input-tokens-remaining:
+ - '30000'
+ anthropic-ratelimit-input-tokens-reset:
+ - '2026-01-14T23:37:16Z'
+ anthropic-ratelimit-output-tokens-limit:
+ - '8000'
+ anthropic-ratelimit-output-tokens-remaining:
+ - '8000'
+ anthropic-ratelimit-output-tokens-reset:
+ - '2026-01-14T23:37:17Z'
+ anthropic-ratelimit-requests-limit:
+ - '50'
+ anthropic-ratelimit-requests-remaining:
+ - '49'
+ anthropic-ratelimit-requests-reset:
+ - '2026-01-14T23:37:16Z'
+ anthropic-ratelimit-tokens-limit:
+ - '38000'
+ anthropic-ratelimit-tokens-remaining:
+ - '38000'
+ anthropic-ratelimit-tokens-reset:
+ - '2026-01-14T23:37:16Z'
+ cf-cache-status:
+ - DYNAMIC
+ content-length:
+ - '409'
+ request-id:
+ - req_011CX89ePqXjam4ByzovDWC6
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '1516'
+ status:
+ code: 200
+ message: OK
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hello in one word."
+ }
+ ],
+ "model": "claude-sonnet-4-20250514"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '117'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "model": "claude-sonnet-4-20250514",
+ "id": "msg_01MpMMdwiz43MhCffJBjWRZP",
+ "type": "message",
+ "role": "assistant",
+ "content": [
+ {
+ "type": "text",
+ "text": "Hello!"
+ }
+ ],
+ "stop_reason": "end_turn",
+ "stop_sequence": null,
+ "usage": {
+ "input_tokens": 13,
+ "cache_creation_input_tokens": 0,
+ "cache_read_input_tokens": 0,
+ "cache_creation": {
+ "ephemeral_5m_input_tokens": 0,
+ "ephemeral_1h_input_tokens": 0
+ },
+ "output_tokens": 5,
+ "service_tier": "standard"
+ }
+ }
+ headers:
+ CF-RAY:
+ - 9be0f7a98b01b734-EWR
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:38:33 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ anthropic-ratelimit-input-tokens-limit:
+ - '30000'
+ anthropic-ratelimit-input-tokens-remaining:
+ - '30000'
+ anthropic-ratelimit-input-tokens-reset:
+ - '2026-01-14T23:38:33Z'
+ anthropic-ratelimit-output-tokens-limit:
+ - '8000'
+ anthropic-ratelimit-output-tokens-remaining:
+ - '8000'
+ anthropic-ratelimit-output-tokens-reset:
+ - '2026-01-14T23:38:33Z'
+ anthropic-ratelimit-requests-limit:
+ - '50'
+ anthropic-ratelimit-requests-remaining:
+ - '49'
+ anthropic-ratelimit-requests-reset:
+ - '2026-01-14T23:38:32Z'
+ anthropic-ratelimit-tokens-limit:
+ - '38000'
+ anthropic-ratelimit-tokens-remaining:
+ - '38000'
+ anthropic-ratelimit-tokens-reset:
+ - '2026-01-14T23:38:33Z'
+ cf-cache-status:
+ - DYNAMIC
+ content-length:
+ - '409'
+ request-id:
+ - req_011CX89jyLYwphKuArFEcRij
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '1970'
+ status:
+ code: 200
+ message: OK
+version: 1
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/cassettes/test_sync_messages_create_stop_reason.yaml b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/cassettes/test_sync_messages_create_stop_reason.yaml
new file mode 100644
index 000000000..17e0fb871
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/cassettes/test_sync_messages_create_stop_reason.yaml
@@ -0,0 +1,528 @@
+interactions:
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hi."
+ }
+ ],
+ "model": "claude-3-5-sonnet-20241022"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '114'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python
+ x-api-key:
+ - test_anthropic_api_key
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "id": "msg_04AGDUDYJgAACzvnptvVoYEL",
+ "type": "message",
+ "role": "assistant",
+ "model": "claude-3-5-sonnet-20241022",
+ "content": [
+ {
+ "type": "text",
+ "text": "Hi!"
+ }
+ ],
+ "stop_reason": "end_turn",
+ "stop_sequence": null,
+ "usage": {
+ "input_tokens": 10,
+ "output_tokens": 3
+ }
+ }
+ headers:
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Mon, 15 Dec 2024 10:00:03 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ content-length:
+ - '340'
+ status:
+ code: 200
+ message: OK
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hi."
+ }
+ ],
+ "model": "claude-3-5-sonnet-20241022"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '104'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "type": "error",
+ "error": {
+ "type": "authentication_error",
+ "message": "invalid x-api-key"
+ },
+ "request_id": "req_011CX88XA9TbQ5AUJfKb95H1"
+ }
+ headers:
+ CF-RAY:
+ - 9be0e035a86cc60f-EWR
+ Connection:
+ - keep-alive
+ Content-Length:
+ - '130'
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:22:30 GMT
+ Server:
+ - cloudflare
+ X-Robots-Tag:
+ - none
+ cf-cache-status:
+ - DYNAMIC
+ request-id:
+ - req_011CX88XA9TbQ5AUJfKb95H1
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '14'
+ x-should-retry:
+ - 'false'
+ status:
+ code: 401
+ message: Unauthorized
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hi."
+ }
+ ],
+ "model": "claude-3-5-sonnet-20241022"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '104'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "type": "error",
+ "error": {
+ "type": "invalid_request_error",
+ "message": "Your credit balance is too low to access the Anthropic API. Please go to Plans & Billing to upgrade or purchase credits."
+ },
+ "request_id": "req_011CX88Ze7X2Hzt93cQSJVwx"
+ }
+ headers:
+ CF-RAY:
+ - 9be0e1081e29cb6b-EWR
+ Connection:
+ - keep-alive
+ Content-Length:
+ - '234'
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:23:04 GMT
+ Server:
+ - cloudflare
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ cf-cache-status:
+ - DYNAMIC
+ request-id:
+ - req_011CX88Ze7X2Hzt93cQSJVwx
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '32'
+ x-should-retry:
+ - 'false'
+ status:
+ code: 400
+ message: Bad Request
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hi."
+ }
+ ],
+ "model": "claude-sonnet-4-20250514"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '102'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "model": "claude-sonnet-4-20250514",
+ "id": "msg_0179PsyreizP7wUuiZek9cY1",
+ "type": "message",
+ "role": "assistant",
+ "content": [
+ {
+ "type": "text",
+ "text": "Hi! How are you doing today?"
+ }
+ ],
+ "stop_reason": "end_turn",
+ "stop_sequence": null,
+ "usage": {
+ "input_tokens": 10,
+ "cache_creation_input_tokens": 0,
+ "cache_read_input_tokens": 0,
+ "cache_creation": {
+ "ephemeral_5m_input_tokens": 0,
+ "ephemeral_1h_input_tokens": 0
+ },
+ "output_tokens": 11,
+ "service_tier": "standard"
+ }
+ }
+ headers:
+ CF-RAY:
+ - 9be0f5f8dbb9c484-EWR
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:37:24 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ anthropic-ratelimit-input-tokens-limit:
+ - '30000'
+ anthropic-ratelimit-input-tokens-remaining:
+ - '30000'
+ anthropic-ratelimit-input-tokens-reset:
+ - '2026-01-14T23:37:23Z'
+ anthropic-ratelimit-output-tokens-limit:
+ - '8000'
+ anthropic-ratelimit-output-tokens-remaining:
+ - '8000'
+ anthropic-ratelimit-output-tokens-reset:
+ - '2026-01-14T23:37:24Z'
+ anthropic-ratelimit-requests-limit:
+ - '50'
+ anthropic-ratelimit-requests-remaining:
+ - '49'
+ anthropic-ratelimit-requests-reset:
+ - '2026-01-14T23:37:23Z'
+ anthropic-ratelimit-tokens-limit:
+ - '38000'
+ anthropic-ratelimit-tokens-remaining:
+ - '38000'
+ anthropic-ratelimit-tokens-reset:
+ - '2026-01-14T23:37:23Z'
+ cf-cache-status:
+ - DYNAMIC
+ content-length:
+ - '432'
+ request-id:
+ - req_011CX89esEoi7zVpEaWsLZc8
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '2140'
+ status:
+ code: 200
+ message: OK
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hi."
+ }
+ ],
+ "model": "claude-sonnet-4-20250514"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '102'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "model": "claude-sonnet-4-20250514",
+ "id": "msg_015tg8KmiFK3cef86zqT6mbU",
+ "type": "message",
+ "role": "assistant",
+ "content": [
+ {
+ "type": "text",
+ "text": "Hi! How are you doing today?"
+ }
+ ],
+ "stop_reason": "end_turn",
+ "stop_sequence": null,
+ "usage": {
+ "input_tokens": 10,
+ "cache_creation_input_tokens": 0,
+ "cache_read_input_tokens": 0,
+ "cache_creation": {
+ "ephemeral_5m_input_tokens": 0,
+ "ephemeral_1h_input_tokens": 0
+ },
+ "output_tokens": 11,
+ "service_tier": "standard"
+ }
+ }
+ headers:
+ CF-RAY:
+ - 9be0f7d4490ae55d-EWR
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:38:39 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ anthropic-ratelimit-input-tokens-limit:
+ - '30000'
+ anthropic-ratelimit-input-tokens-remaining:
+ - '30000'
+ anthropic-ratelimit-input-tokens-reset:
+ - '2026-01-14T23:38:39Z'
+ anthropic-ratelimit-output-tokens-limit:
+ - '8000'
+ anthropic-ratelimit-output-tokens-remaining:
+ - '8000'
+ anthropic-ratelimit-output-tokens-reset:
+ - '2026-01-14T23:38:39Z'
+ anthropic-ratelimit-requests-limit:
+ - '50'
+ anthropic-ratelimit-requests-remaining:
+ - '49'
+ anthropic-ratelimit-requests-reset:
+ - '2026-01-14T23:38:39Z'
+ anthropic-ratelimit-tokens-limit:
+ - '38000'
+ anthropic-ratelimit-tokens-remaining:
+ - '38000'
+ anthropic-ratelimit-tokens-reset:
+ - '2026-01-14T23:38:39Z'
+ cf-cache-status:
+ - DYNAMIC
+ content-length:
+ - '432'
+ request-id:
+ - req_011CX89kUUFsmSE8auY18PrD
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '1745'
+ status:
+ code: 200
+ message: OK
+version: 1
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/cassettes/test_sync_messages_create_token_usage.yaml b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/cassettes/test_sync_messages_create_token_usage.yaml
new file mode 100644
index 000000000..e6f90a095
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/cassettes/test_sync_messages_create_token_usage.yaml
@@ -0,0 +1,528 @@
+interactions:
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Count to 5."
+ }
+ ],
+ "model": "claude-3-5-sonnet-20241022"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '118'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python
+ x-api-key:
+ - test_anthropic_api_key
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "id": "msg_03ZGDUDYJgAACzvnptvVoYEL",
+ "type": "message",
+ "role": "assistant",
+ "model": "claude-3-5-sonnet-20241022",
+ "content": [
+ {
+ "type": "text",
+ "text": "1, 2, 3, 4, 5"
+ }
+ ],
+ "stop_reason": "end_turn",
+ "stop_sequence": null,
+ "usage": {
+ "input_tokens": 12,
+ "output_tokens": 14
+ }
+ }
+ headers:
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Mon, 15 Dec 2024 10:00:02 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ content-length:
+ - '355'
+ status:
+ code: 200
+ message: OK
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Count to 5."
+ }
+ ],
+ "model": "claude-3-5-sonnet-20241022"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '108'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "type": "error",
+ "error": {
+ "type": "authentication_error",
+ "message": "invalid x-api-key"
+ },
+ "request_id": "req_011CX88X9HN93DBTEXsjo9DZ"
+ }
+ headers:
+ CF-RAY:
+ - 9be0e0346f01a0f4-EWR
+ Connection:
+ - keep-alive
+ Content-Length:
+ - '130'
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:22:30 GMT
+ Server:
+ - cloudflare
+ X-Robots-Tag:
+ - none
+ cf-cache-status:
+ - DYNAMIC
+ request-id:
+ - req_011CX88X9HN93DBTEXsjo9DZ
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '11'
+ x-should-retry:
+ - 'false'
+ status:
+ code: 401
+ message: Unauthorized
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Count to 5."
+ }
+ ],
+ "model": "claude-3-5-sonnet-20241022"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '108'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "type": "error",
+ "error": {
+ "type": "invalid_request_error",
+ "message": "Your credit balance is too low to access the Anthropic API. Please go to Plans & Billing to upgrade or purchase credits."
+ },
+ "request_id": "req_011CX88Zd4WpWvH6NbSKLd4T"
+ }
+ headers:
+ CF-RAY:
+ - 9be0e1068d28f9a9-EWR
+ Connection:
+ - keep-alive
+ Content-Length:
+ - '234'
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:23:04 GMT
+ Server:
+ - cloudflare
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ cf-cache-status:
+ - DYNAMIC
+ request-id:
+ - req_011CX88Zd4WpWvH6NbSKLd4T
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '23'
+ x-should-retry:
+ - 'false'
+ status:
+ code: 400
+ message: Bad Request
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Count to 5."
+ }
+ ],
+ "model": "claude-sonnet-4-20250514"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '106'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "model": "claude-sonnet-4-20250514",
+ "id": "msg_01ASpi9ptEqzyzyCGk3aB1tr",
+ "type": "message",
+ "role": "assistant",
+ "content": [
+ {
+ "type": "text",
+ "text": "1\n2\n3\n4\n5"
+ }
+ ],
+ "stop_reason": "end_turn",
+ "stop_sequence": null,
+ "usage": {
+ "input_tokens": 12,
+ "cache_creation_input_tokens": 0,
+ "cache_read_input_tokens": 0,
+ "cache_creation": {
+ "ephemeral_5m_input_tokens": 0,
+ "ephemeral_1h_input_tokens": 0
+ },
+ "output_tokens": 13,
+ "service_tier": "standard"
+ }
+ }
+ headers:
+ CF-RAY:
+ - 9be0f5e96d19cc98-EWR
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:37:21 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ anthropic-ratelimit-input-tokens-limit:
+ - '30000'
+ anthropic-ratelimit-input-tokens-remaining:
+ - '30000'
+ anthropic-ratelimit-input-tokens-reset:
+ - '2026-01-14T23:37:21Z'
+ anthropic-ratelimit-output-tokens-limit:
+ - '8000'
+ anthropic-ratelimit-output-tokens-remaining:
+ - '8000'
+ anthropic-ratelimit-output-tokens-reset:
+ - '2026-01-14T23:37:21Z'
+ anthropic-ratelimit-requests-limit:
+ - '50'
+ anthropic-ratelimit-requests-remaining:
+ - '49'
+ anthropic-ratelimit-requests-reset:
+ - '2026-01-14T23:37:20Z'
+ anthropic-ratelimit-tokens-limit:
+ - '38000'
+ anthropic-ratelimit-tokens-remaining:
+ - '38000'
+ anthropic-ratelimit-tokens-reset:
+ - '2026-01-14T23:37:21Z'
+ cf-cache-status:
+ - DYNAMIC
+ content-length:
+ - '417'
+ request-id:
+ - req_011CX89egfZS3S8npjMnb4jb
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '2246'
+ status:
+ code: 200
+ message: OK
+- request:
+ body: |-
+ {
+ "max_tokens": 100,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Count to 5."
+ }
+ ],
+ "model": "claude-sonnet-4-20250514"
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '106'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "model": "claude-sonnet-4-20250514",
+ "id": "msg_01JFQS8RhcxvidJNQvmANwSp",
+ "type": "message",
+ "role": "assistant",
+ "content": [
+ {
+ "type": "text",
+ "text": "1, 2, 3, 4, 5"
+ }
+ ],
+ "stop_reason": "end_turn",
+ "stop_sequence": null,
+ "usage": {
+ "input_tokens": 12,
+ "cache_creation_input_tokens": 0,
+ "cache_read_input_tokens": 0,
+ "cache_creation": {
+ "ephemeral_5m_input_tokens": 0,
+ "ephemeral_1h_input_tokens": 0
+ },
+ "output_tokens": 17,
+ "service_tier": "standard"
+ }
+ }
+ headers:
+ CF-RAY:
+ - 9be0f7c51d21c62c-EWR
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:38:37 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ anthropic-ratelimit-input-tokens-limit:
+ - '30000'
+ anthropic-ratelimit-input-tokens-remaining:
+ - '30000'
+ anthropic-ratelimit-input-tokens-reset:
+ - '2026-01-14T23:38:36Z'
+ anthropic-ratelimit-output-tokens-limit:
+ - '8000'
+ anthropic-ratelimit-output-tokens-remaining:
+ - '8000'
+ anthropic-ratelimit-output-tokens-reset:
+ - '2026-01-14T23:38:38Z'
+ anthropic-ratelimit-requests-limit:
+ - '50'
+ anthropic-ratelimit-requests-remaining:
+ - '49'
+ anthropic-ratelimit-requests-reset:
+ - '2026-01-14T23:38:36Z'
+ anthropic-ratelimit-tokens-limit:
+ - '38000'
+ anthropic-ratelimit-tokens-remaining:
+ - '38000'
+ anthropic-ratelimit-tokens-reset:
+ - '2026-01-14T23:38:36Z'
+ cf-cache-status:
+ - DYNAMIC
+ content-length:
+ - '417'
+ request-id:
+ - req_011CX89kJ5gA2k39mva9DeiB
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '2244'
+ status:
+ code: 200
+ message: OK
+version: 1
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/cassettes/test_sync_messages_create_with_all_params.yaml b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/cassettes/test_sync_messages_create_with_all_params.yaml
new file mode 100644
index 000000000..b63b5b56d
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/cassettes/test_sync_messages_create_with_all_params.yaml
@@ -0,0 +1,558 @@
+interactions:
+- request:
+ body: |-
+ {
+ "max_tokens": 50,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hello."
+ }
+ ],
+ "model": "claude-3-5-sonnet-20241022",
+ "stop_sequences": [
+ "STOP"
+ ],
+ "temperature": 0.7,
+ "top_k": 40,
+ "top_p": 0.9
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '200'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python
+ x-api-key:
+ - test_anthropic_api_key
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "id": "msg_02YGDUDYJgAACzvnptvVoYEL",
+ "type": "message",
+ "role": "assistant",
+ "model": "claude-3-5-sonnet-20241022",
+ "content": [
+ {
+ "type": "text",
+ "text": "Hello! How can I help you today?"
+ }
+ ],
+ "stop_reason": "end_turn",
+ "stop_sequence": null,
+ "usage": {
+ "input_tokens": 10,
+ "output_tokens": 10
+ }
+ }
+ headers:
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Mon, 15 Dec 2024 10:00:01 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ content-length:
+ - '380'
+ status:
+ code: 200
+ message: OK
+- request:
+ body: |-
+ {
+ "max_tokens": 50,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hello."
+ }
+ ],
+ "model": "claude-3-5-sonnet-20241022",
+ "stop_sequences": [
+ "STOP"
+ ],
+ "temperature": 0.7,
+ "top_k": 40,
+ "top_p": 0.9
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '173'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "type": "error",
+ "error": {
+ "type": "authentication_error",
+ "message": "invalid x-api-key"
+ },
+ "request_id": "req_011CX88X8R2SNXCvHpf8jdxa"
+ }
+ headers:
+ CF-RAY:
+ - 9be0e0331fe3effa-EWR
+ Connection:
+ - keep-alive
+ Content-Length:
+ - '130'
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:22:30 GMT
+ Server:
+ - cloudflare
+ X-Robots-Tag:
+ - none
+ cf-cache-status:
+ - DYNAMIC
+ request-id:
+ - req_011CX88X8R2SNXCvHpf8jdxa
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '14'
+ x-should-retry:
+ - 'false'
+ status:
+ code: 401
+ message: Unauthorized
+- request:
+ body: |-
+ {
+ "max_tokens": 50,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hello."
+ }
+ ],
+ "model": "claude-3-5-sonnet-20241022",
+ "stop_sequences": [
+ "STOP"
+ ],
+ "temperature": 0.7,
+ "top_k": 40,
+ "top_p": 0.9
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '173'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "type": "error",
+ "error": {
+ "type": "invalid_request_error",
+ "message": "Your credit balance is too low to access the Anthropic API. Please go to Plans & Billing to upgrade or purchase credits."
+ },
+ "request_id": "req_011CX88Zc1kwbPkJkhghetKz"
+ }
+ headers:
+ CF-RAY:
+ - 9be0e10508204ba5-EWR
+ Connection:
+ - keep-alive
+ Content-Length:
+ - '234'
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:23:03 GMT
+ Server:
+ - cloudflare
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ cf-cache-status:
+ - DYNAMIC
+ request-id:
+ - req_011CX88Zc1kwbPkJkhghetKz
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '31'
+ x-should-retry:
+ - 'false'
+ status:
+ code: 400
+ message: Bad Request
+- request:
+ body: |-
+ {
+ "max_tokens": 50,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hello."
+ }
+ ],
+ "model": "claude-sonnet-4-20250514",
+ "stop_sequences": [
+ "STOP"
+ ],
+ "temperature": 0.7,
+ "top_k": 40,
+ "top_p": 0.9
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '171'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "model": "claude-sonnet-4-20250514",
+ "id": "msg_01Nf6xgm48TeELiSicTA83cX",
+ "type": "message",
+ "role": "assistant",
+ "content": [
+ {
+ "type": "text",
+ "text": "Hello! How are you doing today?"
+ }
+ ],
+ "stop_reason": "end_turn",
+ "stop_sequence": null,
+ "usage": {
+ "input_tokens": 10,
+ "cache_creation_input_tokens": 0,
+ "cache_read_input_tokens": 0,
+ "cache_creation": {
+ "ephemeral_5m_input_tokens": 0,
+ "ephemeral_1h_input_tokens": 0
+ },
+ "output_tokens": 11,
+ "service_tier": "standard"
+ }
+ }
+ headers:
+ CF-RAY:
+ - 9be0f5db9b8110f3-EWR
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:37:19 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ anthropic-ratelimit-input-tokens-limit:
+ - '30000'
+ anthropic-ratelimit-input-tokens-remaining:
+ - '30000'
+ anthropic-ratelimit-input-tokens-reset:
+ - '2026-01-14T23:37:18Z'
+ anthropic-ratelimit-output-tokens-limit:
+ - '8000'
+ anthropic-ratelimit-output-tokens-remaining:
+ - '8000'
+ anthropic-ratelimit-output-tokens-reset:
+ - '2026-01-14T23:37:19Z'
+ anthropic-ratelimit-requests-limit:
+ - '50'
+ anthropic-ratelimit-requests-remaining:
+ - '49'
+ anthropic-ratelimit-requests-reset:
+ - '2026-01-14T23:37:18Z'
+ anthropic-ratelimit-tokens-limit:
+ - '38000'
+ anthropic-ratelimit-tokens-remaining:
+ - '38000'
+ anthropic-ratelimit-tokens-reset:
+ - '2026-01-14T23:37:18Z'
+ cf-cache-status:
+ - DYNAMIC
+ content-length:
+ - '435'
+ request-id:
+ - req_011CX89eXCYZvbfUZDzWQK3e
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '1953'
+ status:
+ code: 200
+ message: OK
+- request:
+ body: |-
+ {
+ "max_tokens": 50,
+ "messages": [
+ {
+ "role": "user",
+ "content": "Say hello."
+ }
+ ],
+ "model": "claude-sonnet-4-20250514",
+ "stop_sequences": [
+ "STOP"
+ ],
+ "temperature": 0.7,
+ "top_k": 40,
+ "top_p": 0.9
+ }
+ headers:
+ accept:
+ - application/json
+ accept-encoding:
+ - gzip, deflate
+ anthropic-version:
+ - '2023-06-01'
+ connection:
+ - keep-alive
+ content-length:
+ - '171'
+ content-type:
+ - application/json
+ host:
+ - api.anthropic.com
+ user-agent:
+ - Anthropic/Python 0.75.0
+ x-api-key:
+ - test_anthropic_api_key
+ x-stainless-arch:
+ - arm64
+ x-stainless-async:
+ - 'false'
+ x-stainless-lang:
+ - python
+ x-stainless-os:
+ - MacOS
+ x-stainless-package-version:
+ - 0.75.0
+ x-stainless-read-timeout:
+ - '600'
+ x-stainless-retry-count:
+ - '0'
+ x-stainless-runtime:
+ - CPython
+ x-stainless-runtime-version:
+ - 3.9.6
+ x-stainless-timeout:
+ - '600'
+ method: POST
+ uri: https://api.anthropic.com/v1/messages
+ response:
+ body:
+ string: |-
+ {
+ "model": "claude-sonnet-4-20250514",
+ "id": "msg_015XXNApQAi4ZgazLmEDHne6",
+ "type": "message",
+ "role": "assistant",
+ "content": [
+ {
+ "type": "text",
+ "text": "Hello! How are you doing today?"
+ }
+ ],
+ "stop_reason": "end_turn",
+ "stop_sequence": null,
+ "usage": {
+ "input_tokens": 10,
+ "cache_creation_input_tokens": 0,
+ "cache_read_input_tokens": 0,
+ "cache_creation": {
+ "ephemeral_5m_input_tokens": 0,
+ "ephemeral_1h_input_tokens": 0
+ },
+ "output_tokens": 11,
+ "service_tier": "standard"
+ }
+ }
+ headers:
+ CF-RAY:
+ - 9be0f7b73af7c269-EWR
+ Connection:
+ - keep-alive
+ Content-Type:
+ - application/json
+ Date:
+ - Wed, 14 Jan 2026 23:38:35 GMT
+ Server:
+ - cloudflare
+ Transfer-Encoding:
+ - chunked
+ X-Robots-Tag:
+ - none
+ anthropic-organization-id:
+ - 455ea6be-bd92-4199-83ec-0c6b39c5c169
+ anthropic-ratelimit-input-tokens-limit:
+ - '30000'
+ anthropic-ratelimit-input-tokens-remaining:
+ - '30000'
+ anthropic-ratelimit-input-tokens-reset:
+ - '2026-01-14T23:38:35Z'
+ anthropic-ratelimit-output-tokens-limit:
+ - '8000'
+ anthropic-ratelimit-output-tokens-remaining:
+ - '8000'
+ anthropic-ratelimit-output-tokens-reset:
+ - '2026-01-14T23:38:35Z'
+ anthropic-ratelimit-requests-limit:
+ - '50'
+ anthropic-ratelimit-requests-remaining:
+ - '49'
+ anthropic-ratelimit-requests-reset:
+ - '2026-01-14T23:38:34Z'
+ anthropic-ratelimit-tokens-limit:
+ - '38000'
+ anthropic-ratelimit-tokens-remaining:
+ - '38000'
+ anthropic-ratelimit-tokens-reset:
+ - '2026-01-14T23:38:35Z'
+ cf-cache-status:
+ - DYNAMIC
+ content-length:
+ - '435'
+ request-id:
+ - req_011CX89k8aBRkvFFLPZSBSrX
+ strict-transport-security:
+ - max-age=31536000; includeSubDomains; preload
+ x-envoy-upstream-service-time:
+ - '1955'
+ status:
+ code: 200
+ message: OK
+version: 1
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/conftest.py b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/conftest.py
new file mode 100644
index 000000000..eeea495cb
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/conftest.py
@@ -0,0 +1,237 @@
+# Copyright The OpenTelemetry Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Test configuration and fixtures for Anthropic instrumentation tests."""
+# pylint: disable=redefined-outer-name
+
+import json
+import os
+
+import pytest
+import yaml
+from anthropic import Anthropic
+
+from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
+from opentelemetry.sdk._logs import LoggerProvider
+from opentelemetry.sdk._logs.export import (
+ InMemoryLogExporter,
+ SimpleLogRecordProcessor,
+)
+from opentelemetry.sdk.metrics import MeterProvider
+from opentelemetry.sdk.metrics.export import InMemoryMetricReader
+from opentelemetry.sdk.trace import TracerProvider
+from opentelemetry.sdk.trace.export import SimpleSpanProcessor
+from opentelemetry.sdk.trace.export.in_memory_span_exporter import (
+ InMemorySpanExporter,
+)
+from opentelemetry.util.genai.environment_variables import (
+ OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT,
+)
+
+
+@pytest.fixture
+def span_exporter():
+ """Create and return an in-memory span exporter."""
+ exporter = InMemorySpanExporter()
+ yield exporter
+ exporter.clear()
+
+
+@pytest.fixture
+def log_exporter():
+ """Create and return an in-memory log exporter."""
+ exporter = InMemoryLogExporter()
+ yield exporter
+ exporter.clear()
+
+
+@pytest.fixture
+def metric_reader():
+ """Create and return an in-memory metric reader."""
+ reader = InMemoryMetricReader()
+ yield reader
+
+
+@pytest.fixture
+def tracer_provider(span_exporter):
+ """Create and configure a tracer provider with in-memory export."""
+ provider = TracerProvider()
+ provider.add_span_processor(SimpleSpanProcessor(span_exporter))
+ return provider
+
+
+@pytest.fixture
+def logger_provider(log_exporter):
+ """Create and configure a logger provider with in-memory export."""
+ provider = LoggerProvider()
+ provider.add_log_record_processor(SimpleLogRecordProcessor(log_exporter))
+ yield provider
+
+
+@pytest.fixture
+def meter_provider(metric_reader):
+ """Create and configure a meter provider with in-memory metrics."""
+ provider = MeterProvider(metric_readers=[metric_reader])
+ yield provider
+
+
+@pytest.fixture(autouse=True)
+def environment():
+ """Set up environment variables for testing."""
+ if not os.getenv("ANTHROPIC_API_KEY"):
+ os.environ["ANTHROPIC_API_KEY"] = "test_anthropic_api_key"
+
+
+@pytest.fixture
+def anthropic_client():
+ """Create and return an Anthropic client."""
+ return Anthropic()
+
+
+@pytest.fixture(scope="module")
+def vcr_config():
+ """Configure VCR for recording/replaying HTTP interactions."""
+ return {
+ "filter_headers": [
+ ("x-api-key", "test_anthropic_api_key"),
+ ("authorization", "Bearer test_anthropic_api_key"),
+ ],
+ "decode_compressed_response": True,
+ "before_record_response": scrub_response_headers,
+ "match_on": ["method", "body"],
+ "record_mode": "none",
+ }
+
+
+@pytest.fixture(scope="function")
+def instrument_no_content(tracer_provider, logger_provider, meter_provider):
+ """Instrument Anthropic without content capture."""
+ os.environ.update(
+ {OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT: "False"}
+ )
+
+ instrumentor = AnthropicInstrumentor()
+ instrumentor.instrument(
+ tracer_provider=tracer_provider,
+ logger_provider=logger_provider,
+ meter_provider=meter_provider,
+ )
+
+ yield instrumentor
+ os.environ.pop(OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT, None)
+ instrumentor.uninstrument()
+
+
+@pytest.fixture(scope="function")
+def instrument_with_content(tracer_provider, logger_provider, meter_provider):
+ """Instrument Anthropic with content capture enabled."""
+ os.environ.update(
+ {OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT: "True"}
+ )
+ instrumentor = AnthropicInstrumentor()
+ instrumentor.instrument(
+ tracer_provider=tracer_provider,
+ logger_provider=logger_provider,
+ meter_provider=meter_provider,
+ )
+
+ yield instrumentor
+ os.environ.pop(OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT, None)
+ instrumentor.uninstrument()
+
+
+@pytest.fixture
+def instrument_anthropic(tracer_provider, logger_provider, meter_provider):
+ """Fixture to instrument Anthropic with test providers."""
+ instrumentor = AnthropicInstrumentor()
+ instrumentor.instrument(
+ tracer_provider=tracer_provider,
+ logger_provider=logger_provider,
+ meter_provider=meter_provider,
+ )
+ yield instrumentor
+ instrumentor.uninstrument()
+
+
+@pytest.fixture
+def uninstrument_anthropic():
+ """Fixture to ensure Anthropic is uninstrumented after test."""
+ yield
+ AnthropicInstrumentor().uninstrument()
+
+
+class LiteralBlockScalar(str):
+ """Formats the string as a literal block scalar."""
+
+
+def literal_block_scalar_presenter(dumper, data):
+ """Represents a scalar string as a literal block, via '|' syntax."""
+ return dumper.represent_scalar("tag:yaml.org,2002:str", data, style="|")
+
+
+yaml.add_representer(LiteralBlockScalar, literal_block_scalar_presenter)
+
+
+def process_string_value(string_value):
+ """Pretty-prints JSON or returns long strings as a LiteralBlockScalar."""
+ try:
+ json_data = json.loads(string_value)
+ return LiteralBlockScalar(json.dumps(json_data, indent=2))
+ except (ValueError, TypeError):
+ if len(string_value) > 80:
+ return LiteralBlockScalar(string_value)
+ return string_value
+
+
+def convert_body_to_literal(data):
+ """Searches the data for body strings, attempting to pretty-print JSON."""
+ if isinstance(data, dict):
+ for key, value in data.items():
+ if key == "body" and isinstance(value, dict) and "string" in value:
+ value["string"] = process_string_value(value["string"])
+ elif key == "body" and isinstance(value, str):
+ data[key] = process_string_value(value)
+ else:
+ convert_body_to_literal(value)
+ elif isinstance(data, list):
+ for idx, choice in enumerate(data):
+ data[idx] = convert_body_to_literal(choice)
+ return data
+
+
+class PrettyPrintJSONBody:
+ """This makes request and response body recordings more readable."""
+
+ @staticmethod
+ def serialize(cassette_dict):
+ cassette_dict = convert_body_to_literal(cassette_dict)
+ return yaml.dump(
+ cassette_dict, default_flow_style=False, allow_unicode=True
+ )
+
+ @staticmethod
+ def deserialize(cassette_string):
+ return yaml.load(cassette_string, Loader=yaml.Loader)
+
+
+@pytest.fixture(scope="module", autouse=True)
+def fixture_vcr(vcr):
+ """Register the VCR serializer."""
+ vcr.register_serializer("yaml", PrettyPrintJSONBody)
+ return vcr
+
+
+def scrub_response_headers(response):
+    """Hook for scrubbing sensitive response headers; currently a pass-through."""
+ return response
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/requirements.latest.txt b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/requirements.latest.txt
new file mode 100644
index 000000000..3f51924e5
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/requirements.latest.txt
@@ -0,0 +1,48 @@
+# Copyright The OpenTelemetry Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+# ********************************
+# WARNING: NOT HERMETIC !!!!!!!!!!
+# ********************************
+#
+# This "requirements.txt" is installed in conjunction
+# with multiple other dependencies in the top-level "tox-loongsuite.ini"
+# file. In particular, please see:
+#
+# anthropic-latest: {[testenv]test_deps}
+# anthropic-latest: -r {toxinidir}/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/requirements.latest.txt
+#
+# This provides additional dependencies, namely:
+#
+# opentelemetry-api
+# opentelemetry-sdk
+# opentelemetry-semantic-conventions
+#
+# ... with a "dev" version based on the latest distribution.
+
+
+# This variant of the requirements aims to test the system using
+# the newest supported version of external dependencies.
+
+anthropic
+pytest==7.4.4
+pytest-vcr==1.0.2
+pytest-asyncio==0.21.0
+wrapt==1.16.0
+# test with the latest version of opentelemetry-api, sdk, and semantic conventions
+
+-e opentelemetry-instrumentation
+-e util/opentelemetry-util-genai
+-e instrumentation-loongsuite/loongsuite-instrumentation-anthropic
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/requirements.oldest.txt b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/requirements.oldest.txt
new file mode 100644
index 000000000..0fd5ed27a
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/requirements.oldest.txt
@@ -0,0 +1,29 @@
+# Copyright The OpenTelemetry Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# This variant of the requirements aims to test the system using
+# the oldest supported version of external dependencies.
+
+-e util/opentelemetry-util-genai
+anthropic==0.16.0
+httpx>=0.25.2,<0.28.0 # Pin to version compatible with anthropic 0.16.0 (proxies arg removed in 0.28)
+pytest==7.4.4
+pytest-vcr==1.0.2
+pytest-asyncio==0.21.0
+wrapt==1.16.0
+opentelemetry-api==1.37 # when updating, also update in pyproject.toml
+opentelemetry-sdk==1.37 # when updating, also update in pyproject.toml
+opentelemetry-semantic-conventions==0.58b0 # when updating, also update in pyproject.toml
+
+-e instrumentation-loongsuite/loongsuite-instrumentation-anthropic
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/test_instrumentor.py b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/test_instrumentor.py
new file mode 100644
index 000000000..354e53345
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/test_instrumentor.py
@@ -0,0 +1,128 @@
+# Copyright The OpenTelemetry Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Tests for the AnthropicInstrumentor class."""
+
+from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
+
+
+def test_instrumentor_instantiation():
+ """Test that the instrumentor can be instantiated."""
+ instrumentor = AnthropicInstrumentor()
+ assert instrumentor is not None
+ assert isinstance(instrumentor, AnthropicInstrumentor)
+
+
+def test_instrumentation_dependencies():
+ """Test that instrumentation dependencies are correctly reported."""
+ instrumentor = AnthropicInstrumentor()
+ dependencies = instrumentor.instrumentation_dependencies()
+
+ assert dependencies is not None
+ assert len(dependencies) > 0
+ assert "anthropic >= 0.16.0" in dependencies
+
+
+def test_instrument_uninstrument_cycle(
+ tracer_provider, logger_provider, meter_provider
+):
+ """Test that instrument() and uninstrument() can be called multiple times."""
+ instrumentor = AnthropicInstrumentor()
+
+ # First instrumentation
+ instrumentor.instrument(
+ tracer_provider=tracer_provider,
+ logger_provider=logger_provider,
+ meter_provider=meter_provider,
+ )
+
+ # First uninstrumentation
+ instrumentor.uninstrument()
+
+ # Second instrumentation (should work)
+ instrumentor.instrument(
+ tracer_provider=tracer_provider,
+ logger_provider=logger_provider,
+ meter_provider=meter_provider,
+ )
+
+ # Second uninstrumentation
+ instrumentor.uninstrument()
+
+
+def test_multiple_instrumentation_calls(
+ tracer_provider, logger_provider, meter_provider
+):
+ """Test that multiple instrument() calls don't cause issues."""
+ instrumentor = AnthropicInstrumentor()
+
+ # First call
+ instrumentor.instrument(
+ tracer_provider=tracer_provider,
+ logger_provider=logger_provider,
+ meter_provider=meter_provider,
+ )
+
+ # Second call (should be idempotent or handle gracefully)
+ instrumentor.instrument(
+ tracer_provider=tracer_provider,
+ logger_provider=logger_provider,
+ meter_provider=meter_provider,
+ )
+
+ # Clean up
+ instrumentor.uninstrument()
+
+
+def test_uninstrument_without_instrument():
+ """Test that uninstrument() can be called without prior instrument()."""
+ instrumentor = AnthropicInstrumentor()
+
+ # This should not raise an error
+ instrumentor.uninstrument()
+
+
+def test_instrument_with_explicit_providers(
+    tracer_provider, logger_provider, meter_provider
+):
+    """Test a full instrument()/uninstrument() cycle with explicit providers.
+
+    Providers are passed explicitly to keep the test environment
+    isolated from other tests; in a real deployment without explicit
+    providers, the instrumentor would fall back to the global
+    (by default no-op) providers.
+    """
+ instrumentor = AnthropicInstrumentor()
+
+ # Test that instrument/uninstrument cycle works
+ instrumentor.instrument(
+ tracer_provider=tracer_provider,
+ logger_provider=logger_provider,
+ meter_provider=meter_provider,
+ )
+
+ # Clean up
+ instrumentor.uninstrument()
+
+
+def test_instrumentor_has_required_attributes():
+ """Test that the instrumentor has the required methods."""
+ instrumentor = AnthropicInstrumentor()
+
+ assert hasattr(instrumentor, "instrument")
+ assert hasattr(instrumentor, "uninstrument")
+ assert hasattr(instrumentor, "instrumentation_dependencies")
+ assert callable(instrumentor.instrument)
+ assert callable(instrumentor.uninstrument)
+ assert callable(instrumentor.instrumentation_dependencies)
diff --git a/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/test_sync_messages.py b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/test_sync_messages.py
new file mode 100644
index 000000000..c5af27d0f
--- /dev/null
+++ b/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/test_sync_messages.py
@@ -0,0 +1,307 @@
+# Copyright The OpenTelemetry Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Tests for sync Messages.create instrumentation."""
+
+import pytest
+from anthropic import Anthropic, APIConnectionError, NotFoundError
+
+from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
+from opentelemetry.semconv._incubating.attributes import (
+ error_attributes as ErrorAttributes,
+)
+from opentelemetry.semconv._incubating.attributes import (
+ gen_ai_attributes as GenAIAttributes,
+)
+from opentelemetry.semconv._incubating.attributes import (
+ server_attributes as ServerAttributes,
+)
+
+
+def assert_span_attributes(
+ span,
+ request_model,
+ response_id=None,
+ response_model=None,
+ input_tokens=None,
+ output_tokens=None,
+ finish_reasons=None,
+ operation_name="chat",
+ server_address="api.anthropic.com",
+):
+ """Assert that a span has the expected attributes."""
+ assert span.name == f"{operation_name} {request_model}"
+ assert (
+ operation_name
+ == span.attributes[GenAIAttributes.GEN_AI_OPERATION_NAME]
+ )
+ # The TelemetryHandler uses GEN_AI_PROVIDER_NAME (new semconv) instead of GEN_AI_SYSTEM (deprecated)
+ assert (
+ GenAIAttributes.GenAiSystemValues.ANTHROPIC.value
+ == span.attributes.get(
+ GenAIAttributes.GEN_AI_PROVIDER_NAME,
+ span.attributes.get(GenAIAttributes.GEN_AI_SYSTEM),
+ )
+ )
+ assert (
+ request_model == span.attributes[GenAIAttributes.GEN_AI_REQUEST_MODEL]
+ )
+ if server_address is not None:
+ assert ServerAttributes.SERVER_ADDRESS in span.attributes
+
+ if response_id is not None:
+ assert (
+ response_id == span.attributes[GenAIAttributes.GEN_AI_RESPONSE_ID]
+ )
+
+ if response_model is not None:
+ assert (
+ response_model
+ == span.attributes[GenAIAttributes.GEN_AI_RESPONSE_MODEL]
+ )
+
+ if input_tokens is not None:
+ assert (
+ input_tokens
+ == span.attributes[GenAIAttributes.GEN_AI_USAGE_INPUT_TOKENS]
+ )
+
+ if output_tokens is not None:
+ assert (
+ output_tokens
+ == span.attributes[GenAIAttributes.GEN_AI_USAGE_OUTPUT_TOKENS]
+ )
+
+ if finish_reasons is not None:
+ # OpenTelemetry converts lists to tuples when storing as attributes
+ assert (
+ tuple(finish_reasons)
+ == span.attributes[GenAIAttributes.GEN_AI_RESPONSE_FINISH_REASONS]
+ )
+
+
+@pytest.mark.vcr()
+def test_sync_messages_create_basic(
+ span_exporter, anthropic_client, instrument_no_content
+):
+ """Test basic sync message creation produces correct span."""
+ model = "claude-sonnet-4-20250514"
+ messages = [{"role": "user", "content": "Say hello in one word."}]
+
+ response = anthropic_client.messages.create(
+ model=model,
+ max_tokens=100,
+ messages=messages,
+ )
+
+ spans = span_exporter.get_finished_spans()
+ assert len(spans) == 1
+
+ assert_span_attributes(
+ spans[0],
+ request_model=model,
+ response_id=response.id,
+ response_model=response.model,
+ input_tokens=response.usage.input_tokens,
+ output_tokens=response.usage.output_tokens,
+ finish_reasons=[response.stop_reason],
+ )
+
+
+@pytest.mark.vcr()
+def test_sync_messages_create_with_all_params(
+ span_exporter, anthropic_client, instrument_no_content
+):
+ """Test message creation with all optional parameters."""
+ model = "claude-sonnet-4-20250514"
+ messages = [{"role": "user", "content": "Say hello."}]
+
+ anthropic_client.messages.create(
+ model=model,
+ max_tokens=50,
+ messages=messages,
+ temperature=0.7,
+ top_p=0.9,
+ top_k=40,
+ stop_sequences=["STOP"],
+ )
+
+ spans = span_exporter.get_finished_spans()
+ assert len(spans) == 1
+ span = spans[0]
+ assert span.attributes[GenAIAttributes.GEN_AI_REQUEST_MAX_TOKENS] == 50
+ assert span.attributes[GenAIAttributes.GEN_AI_REQUEST_TEMPERATURE] == 0.7
+ assert span.attributes[GenAIAttributes.GEN_AI_REQUEST_TOP_P] == 0.9
+ assert span.attributes[GenAIAttributes.GEN_AI_REQUEST_TOP_K] == 40
+ # OpenTelemetry converts lists to tuples when storing as attributes
+ assert span.attributes[GenAIAttributes.GEN_AI_REQUEST_STOP_SEQUENCES] == (
+ "STOP",
+ )
+
+
+@pytest.mark.vcr()
+def test_sync_messages_create_token_usage(
+ span_exporter, anthropic_client, instrument_no_content
+):
+ """Test that token usage is captured correctly."""
+ model = "claude-sonnet-4-20250514"
+ messages = [{"role": "user", "content": "Count to 5."}]
+
+ response = anthropic_client.messages.create(
+ model=model,
+ max_tokens=100,
+ messages=messages,
+ )
+
+ spans = span_exporter.get_finished_spans()
+ assert len(spans) == 1
+
+ span = spans[0]
+ assert GenAIAttributes.GEN_AI_USAGE_INPUT_TOKENS in span.attributes
+ assert GenAIAttributes.GEN_AI_USAGE_OUTPUT_TOKENS in span.attributes
+ assert (
+ span.attributes[GenAIAttributes.GEN_AI_USAGE_INPUT_TOKENS]
+ == response.usage.input_tokens
+ )
+ assert (
+ span.attributes[GenAIAttributes.GEN_AI_USAGE_OUTPUT_TOKENS]
+ == response.usage.output_tokens
+ )
+
+
+@pytest.mark.vcr()
+def test_sync_messages_create_stop_reason(
+ span_exporter, anthropic_client, instrument_no_content
+):
+ """Test that stop reason is captured as finish_reasons array."""
+ model = "claude-sonnet-4-20250514"
+ messages = [{"role": "user", "content": "Say hi."}]
+
+ response = anthropic_client.messages.create(
+ model=model,
+ max_tokens=100,
+ messages=messages,
+ )
+
+ spans = span_exporter.get_finished_spans()
+ assert len(spans) == 1
+
+ span = spans[0]
+ # Anthropic's stop_reason should be wrapped in a tuple (OTel converts lists)
+ assert span.attributes[GenAIAttributes.GEN_AI_RESPONSE_FINISH_REASONS] == (
+ response.stop_reason,
+ )
+
+
+def test_sync_messages_create_connection_error(
+ span_exporter, instrument_no_content
+):
+ """Test that connection errors are handled correctly."""
+ model = "claude-sonnet-4-20250514"
+ messages = [{"role": "user", "content": "Hello"}]
+
+ # Create client with invalid endpoint
+ client = Anthropic(base_url="http://localhost:9999")
+
+ with pytest.raises(APIConnectionError):
+ client.messages.create(
+ model=model,
+ max_tokens=100,
+ messages=messages,
+ timeout=0.1,
+ )
+
+ spans = span_exporter.get_finished_spans()
+ assert len(spans) == 1
+
+ span = spans[0]
+ assert span.attributes[GenAIAttributes.GEN_AI_REQUEST_MODEL] == model
+ assert ErrorAttributes.ERROR_TYPE in span.attributes
+ assert "APIConnectionError" in span.attributes[ErrorAttributes.ERROR_TYPE]
+
+
+@pytest.mark.vcr()
+def test_sync_messages_create_api_error(
+ span_exporter, anthropic_client, instrument_no_content
+):
+ """Test that API errors (e.g., invalid model) are handled correctly."""
+ model = "invalid-model-name"
+ messages = [{"role": "user", "content": "Hello"}]
+
+ with pytest.raises(NotFoundError):
+ anthropic_client.messages.create(
+ model=model,
+ max_tokens=100,
+ messages=messages,
+ )
+
+ spans = span_exporter.get_finished_spans()
+ assert len(spans) == 1
+
+ span = spans[0]
+ assert span.attributes[GenAIAttributes.GEN_AI_REQUEST_MODEL] == model
+ assert ErrorAttributes.ERROR_TYPE in span.attributes
+ assert "NotFoundError" in span.attributes[ErrorAttributes.ERROR_TYPE]
+
+
+def test_instrument_uninstrument_no_error(
+    span_exporter, tracer_provider, logger_provider, meter_provider
+):
+    """Test that uninstrument() after instrument() raises no error."""
+ instrumentor = AnthropicInstrumentor()
+ instrumentor.instrument(
+ tracer_provider=tracer_provider,
+ logger_provider=logger_provider,
+ meter_provider=meter_provider,
+ )
+
+ # Uninstrument
+ instrumentor.uninstrument()
+
+    # Verifying that a post-uninstrument API call produces no spans
+    # would require a real API key or a recorded cassette, so this
+    # test only checks that the instrument/uninstrument cycle
+    # completes without raising and emits no spans of its own.
+    assert not span_exporter.get_finished_spans()
+
+
+def test_multiple_instrument_uninstrument_cycles(
+ tracer_provider, logger_provider, meter_provider
+):
+ """Test that multiple instrument/uninstrument cycles work correctly."""
+ instrumentor = AnthropicInstrumentor()
+
+ # First cycle
+ instrumentor.instrument(
+ tracer_provider=tracer_provider,
+ logger_provider=logger_provider,
+ meter_provider=meter_provider,
+ )
+ instrumentor.uninstrument()
+
+ # Second cycle
+ instrumentor.instrument(
+ tracer_provider=tracer_provider,
+ logger_provider=logger_provider,
+ meter_provider=meter_provider,
+ )
+ instrumentor.uninstrument()
+
+ # Third cycle - should still work
+ instrumentor.instrument(
+ tracer_provider=tracer_provider,
+ logger_provider=logger_provider,
+ meter_provider=meter_provider,
+ )
+ instrumentor.uninstrument()
diff --git a/tox-loongsuite.ini b/tox-loongsuite.ini
index 385825aa0..7e02df83f 100644
--- a/tox-loongsuite.ini
+++ b/tox-loongsuite.ini
@@ -24,6 +24,10 @@ envlist =
py3{10,11,12,13}-test-loongsuite-instrumentation-claude-agent-sdk-{oldest,latest}
lint-loongsuite-instrumentation-claude-agent-sdk
+ ; loongsuite-instrumentation-anthropic
+ py3{9,10,11,12,13}-test-loongsuite-instrumentation-anthropic-{oldest,latest}
+ lint-loongsuite-instrumentation-anthropic
+
; loongsuite-instrumentation-google-adk
py3{9,10,11,12,13}-test-loongsuite-instrumentation-google-adk-{oldest,latest}
lint-loongsuite-instrumentation-google-adk
@@ -105,6 +109,11 @@ deps =
claude-agent-sdk-latest: -r {toxinidir}/instrumentation-loongsuite/loongsuite-instrumentation-claude-agent-sdk/tests/requirements.latest.txt
lint-loongsuite-instrumentation-claude-agent-sdk: -r {toxinidir}/instrumentation-loongsuite/loongsuite-instrumentation-claude-agent-sdk/tests/requirements.oldest.txt
+ anthropic-oldest: -r {toxinidir}/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/requirements.oldest.txt
+ anthropic-latest: {[testenv]test_deps}
+ anthropic-latest: -r {toxinidir}/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/requirements.latest.txt
+ lint-loongsuite-instrumentation-anthropic: -r {toxinidir}/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests/requirements.oldest.txt
+
google-adk-oldest: -r {toxinidir}/instrumentation-loongsuite/loongsuite-instrumentation-google-adk/tests/requirements.oldest.txt
google-adk-latest: {[testenv]test_deps}
google-adk-latest: -r {toxinidir}/instrumentation-loongsuite/loongsuite-instrumentation-google-adk/tests/requirements.latest.txt
@@ -180,6 +189,9 @@ commands =
test-loongsuite-instrumentation-claude-agent-sdk: pytest {toxinidir}/instrumentation-loongsuite/loongsuite-instrumentation-claude-agent-sdk/tests {posargs}
lint-loongsuite-instrumentation-claude-agent-sdk: python -m ruff check {toxinidir}/instrumentation-loongsuite/loongsuite-instrumentation-claude-agent-sdk
+ test-loongsuite-instrumentation-anthropic: pytest {toxinidir}/instrumentation-loongsuite/loongsuite-instrumentation-anthropic/tests {posargs}
+ lint-loongsuite-instrumentation-anthropic: python -m ruff check {toxinidir}/instrumentation-loongsuite/loongsuite-instrumentation-anthropic
+
test-loongsuite-instrumentation-google-adk: pytest {toxinidir}/instrumentation-loongsuite/loongsuite-instrumentation-google-adk/tests {posargs}
lint-loongsuite-instrumentation-google-adk: python -m ruff check {toxinidir}/instrumentation-loongsuite/loongsuite-instrumentation-google-adk