
Conversation

ChenyangLi4288

Issue Link / Problem Description

Problem: Errors during testset generation transforms were being silently caught and suppressed

Changes Made

  • src/ragas/async_utils.py: Removed exception suppression in process_futures() function (lines 93-95)

    • Previously: Caught all exceptions and converted them to results (result = e)
    • Now: Exceptions propagate naturally to the caller (see the sketch after this list)
  • src/ragas/testset/transforms/engine.py: Removed exception suppression in run_coroutines() function (line 61)

    • Previously: Caught all exceptions and only logged them (logger.error(...))
    • Now: Exceptions propagate immediately when transforms fail
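
For context, here is a simplified before/after sketch of the change in process_futures(). It is paraphrased from the description above, not the exact source code; the real function has additional bookkeeping.

# Before (paraphrased): every exception was turned into a "result",
# so a failing transform never surfaced to the caller.
async def process_futures(futures):
    results = []
    for future in futures:
        try:
            result = await future
        except Exception as e:
            result = e  # error silently becomes a result
        results.append(result)
    return results

# After (paraphrased): the await is no longer wrapped, so the first
# failing transform raises immediately and reaches the caller.
async def process_futures(futures):
    results = []
    for future in futures:
        results.append(await future)
    return results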

Why these changes:

  • Allows users to see actual error messages (e.g., "temperature not supported") instead of downstream errors
  • Follows Python best practices: fail fast and provide clear error messages
  • Enables proper debugging of LLM configuration and transform issues

Testing

How to Test

  • Automated tests added/updated
  • Manual testing steps:
    1. Create a testset generator with an LLM that has invalid configuration (e.g., unsupported temperature value)
    2. Run generator.generate_with_langchain_docs() or apply_transforms()
    3. Verify that the actual error is raised immediately (e.g., BadRequestError: temperature not supported)
    4. Previously: Would see ValueError: Node X has no summary_embedding later in the pipeline
    5. Now: Should see the LLM error immediately when the extractor runs

Example test case (reproduces original issue):

from ragas.testset import TestsetGenerator
from ragas.llms import llm_factory

# Create an LLM with a temperature value that some models reject
llm = llm_factory(model="gpt-5-mini", temperature=0.01)  # may not support 0.01

# documents is assumed to be a list of LangChain Document objects loaded elsewhere
generator = TestsetGenerator(llm=llm)
# This should now raise the actual LLM error immediately instead of failing silently
dataset = generator.generate_with_langchain_docs(documents, testset_size=10)
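
For the automated side, a minimal pytest-style sketch (hypothetical test, not taken from the PR) of what the check could look like. It assumes run_async_tasks(coroutines) from ragas.async_utils is the synchronous helper named in the execution path and now lets exceptions propagate.

import pytest
from ragas.async_utils import run_async_tasks


async def failing_transform():
    # Simulates an LLM call rejected by the provider,
    # e.g. "temperature not supported".
    raise ValueError("temperature not supported")


def test_transform_error_propagates():
    # Before the fix the exception was swallowed and re-surfaced later as
    # "Node X has no summary_embedding"; now the original error should raise.
    with pytest.raises(ValueError, match="temperature not supported"):
        run_async_tasks([failing_transform()])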


@dosubot (bot) added the size:S label ("This PR changes 10-29 lines, ignoring generated files") on Oct 14, 2025
@anistark (Contributor) left a comment


Thanks for the attempt @ChenyangLi4288

This, however, doesn't solve the issue.

The Executor class still works as before. Additionally, this creates incompatible exception handling.

run_coroutines is not used on this path anyway.

If you want to take another stab at it, follow the execution path:

apply_transforms() → run_async_tasks() → process_futures() → await future

Also, try not to introduce breaking changes.
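
One way to act on that guidance without breaking existing callers is to make error propagation opt-in along that path. The sketch below is hypothetical; the raise_on_error flag and its plumbing are not part of the current ragas API.

# Hypothetical: thread an opt-in flag through the execution path so that
# apply_transforms() can request fail-fast behaviour while the Executor and
# other callers of process_futures() keep the current semantics.
async def process_futures(futures, raise_on_error: bool = False):
    results = []
    for future in futures:
        try:
            result = await future
        except Exception as e:
            if raise_on_error:
                raise  # surface the original error (e.g. BadRequestError)
            result = e  # legacy behaviour: the error becomes a result
        results.append(result)
    return results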




Development

Successfully merging this pull request may close the following issue:

Please do not mute error during transforms
