
harshil-sanghvi
Contributor

Issue Link / Problem Description

This PR adds direct support for Oracle Cloud Infrastructure (OCI) Generative AI models in Ragas, enabling evaluation without requiring LangChain or LlamaIndex dependencies. Currently, users who want to use OCI Gen AI models must go through LangChain or LlamaIndex wrappers, which adds unnecessary complexity and dependencies.

Problem: No direct OCI Gen AI integration exists in Ragas, forcing users to use indirect approaches through LangChain/LlamaIndex.

Solution: Implement a native OCI Gen AI wrapper that uses the OCI Python SDK directly.

Changes Made

Core Implementation

  • Add OCIGenAIWrapper - New LLM wrapper class extending BaseRagasLLM
  • Direct OCI SDK Integration - Uses oci.generative_ai.GenerativeAiClient directly
  • Factory Function - oci_genai_factory() for easy initialization
  • Async Support - Full async/await implementation with proper error handling
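The wrapper-plus-factory design above can be sketched as follows. This is an illustrative sketch only, not the actual implementation in src/ragas/llms/oci_genai_wrapper.py; the field names are assumptions based on the usage examples later in this PR.

```python
# Hedged sketch of the wrapper + factory pattern described above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class OCIGenAIWrapperSketch:
    # Assumed fields, mirroring the oci_genai_factory() examples below.
    model_id: str
    compartment_id: str
    config: Optional[dict] = None
    endpoint_id: Optional[str] = None

def oci_genai_factory_sketch(model_id: str, compartment_id: str, **kwargs):
    # Validate the required OCI identifiers before constructing the wrapper.
    if not model_id or not compartment_id:
        raise ValueError("model_id and compartment_id are required")
    return OCIGenAIWrapperSketch(model_id, compartment_id, **kwargs)

llm = oci_genai_factory_sketch(
    model_id="cohere.command",
    compartment_id="ocid1.compartment.oc1..example",
)
print(llm.model_id)  # -> cohere.command
```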

Model Support

Supports the pretrained foundation models offered by OCI Generative AI.

Dependencies & Configuration

  • Optional Dependency - Added oci>=2.160.1 as optional dependency
  • Import Safety - Graceful handling when OCI SDK is not installed
  • Configuration Options - Support for OCI CLI config, environment variables, or manual config
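The "import safety" bullet above typically follows the standard optional-import guard pattern. The sketch below is illustrative (the helper name is an assumption, not the wrapper's actual API):

```python
# Sketch of graceful handling when the OCI SDK is not installed:
# import lazily, and raise a clear error only when the SDK is needed.
try:
    import oci  # optional dependency: pip install ragas[oci]
except ImportError:
    oci = None

def require_oci():
    """Return the oci module, or raise with an actionable message."""
    if oci is None:
        raise ImportError(
            "The OCI SDK is required for OCIGenAIWrapper. "
            "Install it with: pip install ragas[oci]"
        )
    return oci
```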

Testing & Quality

  • Comprehensive Test Suite - 15+ test cases with mocking
  • Error Handling Tests - Tests for authentication, model not found, permission errors
  • Async Testing - Full async operation testing
  • Factory Testing - Factory function validation
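The mocked async tests described above generally take the following shape. Names here are hypothetical stand-ins, not the actual suite in tests/unit/test_oci_genai_wrapper.py; the point is that the OCI client is replaced with a MagicMock so no credentials or network access are needed.

```python
import asyncio
from unittest.mock import MagicMock

class WrapperUnderTest:
    """Minimal stand-in for the wrapper's async generation path."""
    def __init__(self, client):
        self.client = client

    async def agenerate_text(self, prompt: str) -> str:
        # The real wrapper would call the OCI Generative AI client here;
        # the mock returns a canned completion instead.
        return self.client.chat(prompt)

mock_client = MagicMock()
mock_client.chat.return_value = "mocked completion"
wrapper = WrapperUnderTest(mock_client)

result = asyncio.run(wrapper.agenerate_text("ping"))
assert result == "mocked completion"
mock_client.chat.assert_called_once_with("ping")
```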

Documentation & Examples

  • Complete Integration Guide - Step-by-step setup and usage
  • Working Example Script - examples/oci_genai_example.py
  • Authentication Guide - Multiple OCI auth methods
  • Troubleshooting Section - Common issues and solutions
  • Updated Integration Index - Added to main integrations page

Analytics & Monitoring

  • Usage Tracking - Built-in analytics with LLMUsageEvent
  • Error Logging - Comprehensive error logging and debugging
  • Performance Monitoring - Request tracking and metrics

Testing

How to Test

  • Automated tests added/updated
  • Manual testing steps:
    1. Install OCI dependency: pip install ragas[oci]
    2. Configure OCI authentication: Set up OCI config file or environment variables
    3. Run example script: python examples/oci_genai_example.py
    4. Test with different models: Try Cohere, Meta, and xAI models
    5. Test async operations: Verify async generation works correctly
    6. Test error handling: Verify proper error messages for auth/model issues

Test Coverage

# Run OCI Gen AI specific tests
pytest tests/unit/test_oci_genai_wrapper.py -v

# Test syntax validation
python -c "import ast; ast.parse(open('src/ragas/llms/oci_genai_wrapper.py').read())"

Screenshots/Examples

Basic Usage

from ragas.llms import oci_genai_factory
from ragas import evaluate

# Initialize OCI Gen AI LLM
llm = oci_genai_factory(
    model_id="cohere.command",
    compartment_id="ocid1.compartment.oc1..example"
)

# Evaluate with OCI Gen AI
result = evaluate(dataset, llm=llm)

Advanced Configuration

# Custom OCI configuration
config = {
    "user": "ocid1.user.oc1..example",
    "key_file": "~/.oci/private_key.pem",
    "fingerprint": "your_fingerprint",
    "tenancy": "ocid1.tenancy.oc1..example",
    "region": "us-ashburn-1"
}

llm = oci_genai_factory(
    model_id="cohere.command",
    compartment_id="ocid1.compartment.oc1..example",
    config=config,
    endpoint_id="ocid1.endpoint.oc1..example"  # Optional
)

Files Changed

  • src/ragas/llms/oci_genai_wrapper.py - Main implementation
  • src/ragas/llms/__init__.py - Export new classes
  • pyproject.toml - Add OCI optional dependency
  • tests/unit/test_oci_genai_wrapper.py - Comprehensive tests
  • docs/howtos/integrations/oci_genai.md - Complete documentation
  • docs/howtos/integrations/index.md - Updated integration index
  • examples/oci_genai_example.py - Working example script

Breaking Changes

None - This is a purely additive feature with no breaking changes.

Dependencies

  • New Optional Dependency: oci>=2.160.1
  • No Breaking Changes: Existing functionality unchanged
  • Backward Compatible: All existing code continues to work

- Add OCIGenAIWrapper for direct OCI Gen AI integration
- Support for Cohere, Meta, Mistral and other OCI models
- Async/await support with proper error handling
- Comprehensive test suite with mocking
- Complete documentation and examples
- Add OCI as optional dependency in pyproject.toml
- Factory function for easy initialization
- Analytics tracking and usage monitoring

Resolves: Direct OCI Gen AI support without LangChain/LlamaIndex dependency
Contributor

@anistark anistark left a comment


Overall great addition. Thanks for the PR @harshil-sanghvi

@anistark
Contributor

@harshil-sanghvi All tests are failing. Could you please fix them?

You can format using: make format
Run: make run-ci locally to check.

@anistark anistark merged commit 19caa7a into explodinggradients:main Sep 30, 2025
9 checks passed