Create Custom LiteLLM Callback Handler Maintained by AgentOps #1182
Summary
This PR implements a comprehensive LiteLLM instrumentation system for AgentOps that addresses issue #1180. The implementation provides a hybrid approach combining LiteLLM's callback system with wrapt-based instrumentation for complete telemetry coverage.
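The callback half of the hybrid approach can be sketched as follows. This is a hypothetical, stand-alone illustration, not the actual handler from this PR: in real use the class would subclass litellm.integrations.custom_logger.CustomLogger and be registered with LiteLLM (e.g. via litellm.callbacks), but here a plain class with similarly named hooks is driven manually so the sketch runs on its own.

```python
import time

class AgentOpsCallback:
    """Illustrative telemetry collector mirroring LiteLLM-style callback hooks."""

    def __init__(self):
        self.events = []  # captured telemetry records

    def log_pre_api_call(self, model, messages, kwargs):
        # Record that a request is about to be sent.
        self.events.append({"type": "request", "model": model})

    def log_success_event(self, kwargs, response_obj, start_time, end_time):
        # Record latency and the response metadata on success.
        self.events.append({
            "type": "success",
            "latency_s": end_time - start_time,
        })

    def log_failure_event(self, kwargs, response_obj, start_time, end_time):
        # Record failures so errored calls are still traced.
        self.events.append({"type": "failure", "error": repr(response_obj)})

# Simulate what the LLM client would do around a completion call.
cb = AgentOpsCallback()
t0 = time.monotonic()
cb.log_pre_api_call("gpt-4o", [{"role": "user", "content": "hi"}], {})
cb.log_success_event({}, {"choices": []}, t0, time.monotonic())
print([e["type"] for e in cb.events])  # ['request', 'success']
```

The wrapt side of the hybrid approach would complement this by patching call sites directly, covering code paths that never reach the callback system.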
Key Components:
The implementation supports all major LLM providers (OpenAI, Anthropic, Cohere, etc.) and is compatible with LiteLLM versions from 1.68.0 to the latest 1.74.9.rc.1.
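A compatibility gate for the version range stated above could look like the following sketch. The function names are illustrative (not from this PR); the comparison strips pre-release components such as ".rc.1" before comparing numeric parts.

```python
def parse_version(v: str) -> tuple:
    """Parse leading numeric components of a version string into a tuple."""
    parts = []
    for p in v.split("."):
        if p.isdigit():
            parts.append(int(p))
        else:
            break  # stop at the first non-numeric component (e.g. "rc")
    return tuple(parts)

def litellm_supported(version: str, minimum: str = "1.68.0") -> bool:
    """Return True if the installed LiteLLM version meets the minimum."""
    return parse_version(version) >= parse_version(minimum)

print(litellm_supported("1.74.9.rc.1"))  # True
print(litellm_supported("1.67.2"))       # False
```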
Review & Testing Checklist for Human
Recommended Test Plan:
Run examples/litellm/litellm_example.py with different providers.
Notes
In callback_handler.py, lines 254, 256, and 258 access Exception objects without proper type checking.
Link to Devin run: https://app.devin.ai/sessions/c572fc6b318948c4bc61b0b8841d6ca1
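A defensive pattern for the flagged lines could look like this sketch. The function name and structure are hypothetical, not the actual callback_handler.py code; the point is to guard attribute access with an isinstance check before treating a value as an Exception.

```python
def extract_error_details(err: object) -> dict:
    """Safely summarize an error value that may or may not be an Exception."""
    details = {"message": str(err)}
    if isinstance(err, Exception):
        # Only Exception instances reliably carry these attributes.
        details["type"] = type(err).__name__
        details["args"] = err.args
    else:
        details["type"] = "unknown"
    return details

print(extract_error_details(ValueError("bad input"))["type"])  # ValueError
print(extract_error_details("plain string")["type"])           # unknown
```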
Requested by: Pratyush Shukla ([email protected])
Fixes #1180