
Opik #529

@faberto

Description


Adding

llama_index.core.set_global_handler("opik")

to a workflow causes serialization issues with Llama Deploy. Running the workflow without Llama Deploy works fine.

Tested 7.0, 8.1, and main:
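The failure mode can be reproduced without llama-deploy at all. The serializer tries JSON first and falls back to pickle, and pickle cannot handle the `_thread.RLock` that the Opik handler's client appears to hold. A minimal stdlib sketch (class and function names here are illustrative stand-ins, not llama_index's actual implementation):

```python
import json
import pickle
import threading


class FakeIndex:
    """Stand-in for VectorStoreIndex with an Opik-style client attached."""

    def __init__(self):
        # The handler's client seems to hold a lock like this one.
        self._client_lock = threading.RLock()


def serialize(value):
    """Serialize while prioritizing JSON, falling back to pickle (sketch)."""
    try:
        return json.dumps(value)
    except TypeError:
        # JsonPickleSerializer-style fallback: pickle the raw object.
        return pickle.dumps(value)


serialize({"query": "ok"})  # JSON path succeeds

try:
    serialize(FakeIndex())  # JSON fails, then pickle fails on the RLock
except TypeError as err:
    print(err)  # cannot pickle '_thread.RLock' object
```

This mirrors the chained tracebacks in the log: first the `TypeError` from `json.dumps`, then the pickle `TypeError` on the lock.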

LOG
  INFO:     127.0.0.1:48256 - "POST /messages/llama_deploy.control_plane HTTP/1.1" 200 OK
WARNING:root:Removing unpickleable attribute callback_manager
WARNING:root:Removing unpickleable private attribute _client
WARNING:root:Removing unpickleable private attribute _client
ERROR:llama_deploy.services.workflow - Encountered error in task 5f43fbab-37b9-4773-95ea-c3bdd45fa450! Failed to serialize value for key index: cannot pickle '_thread.RLock' object
Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/llama_index/core/workflow/context_serializers.py", line 44, in serialize
    try:
         
  File "/usr/local/lib/python3.12/json/__init__.py", line 231, in dumps
    return _default_encoder.encode(obj)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/encoder.py", line 200, in encode
    chunks = self.iterencode(o, _one_shot=True)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/encoder.py", line 258, in iterencode
    return _iterencode(o, 0)
           ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/encoder.py", line 180, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type VectorStoreIndex is not JSON serializable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/llama_index/core/workflow/context_serializers.py", line 71, in serialize
    """Serialize while prioritizing JSON, falling back to Pickle."""
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/llama_index/core/workflow/context_serializers.py", line 46, in serialize
    return json.dumps(serialized_value)
ValueError: Failed to serialize value: <class 'llama_index.core.indices.vector_store.base.VectorStoreIndex'>: <llama_index.core.indices.vector_store.base.VectorStoreIndex object at 0x7f20f0785940>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/llama_index/core/workflow/context.py", line 139, in _serialize_globals
    context._streaming_queue = context._deserialize_queue(
                                      ^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/llama_index/core/workflow/context_serializers.py", line 73, in serialize
    return super().serialize(value)
                            ^^^^^^^^
TypeError: cannot pickle '_thread.RLock' object

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/llama_deploy/services/workflow.py", line 278, in process_call
    await self.set_workflow_state(handler.ctx, current_call)
  File "/opt/venv/lib/python3.12/site-packages/llama_deploy/services/workflow.py", line 193, in set_workflow_state
    context_dict = ctx.to_dict(serializer=JsonPickleSerializer())
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/llama_index/core/workflow/context.py", line 167, in to_dict
    key: A unique string to identify the value stored.
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/llama_index/core/workflow/context.py", line 149, in _serialize_globals
    context._waiter_id = data.get("waiter_id", str(uuid.uuid4()))
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ValueError: Failed to serialize value for key index: cannot pickle '_thread.RLock' object
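Until this is fixed, one possible mitigation (a hypothetical sketch, not llama-deploy's actual API) is to keep the index out of the serialized Context entirely, or to drop unpicklable entries before persisting state, mirroring the "Removing unpickleable attribute" warnings above:

```python
import pickle
import threading


def prunable_state(state: dict) -> dict:
    """Return a copy of the state with entries that cannot be pickled removed.

    Hypothetical workaround sketch: a similar pass could run before
    ctx.to_dict(), or the workflow could avoid storing the index in the
    Context and rebuild it inside each step instead.
    """
    kept = {}
    for key, value in state.items():
        try:
            pickle.dumps(value)
        except Exception:
            continue  # e.g. an index whose callback client holds an RLock
        kept[key] = value
    return kept


# The RLock stands in for the unpicklable index from the traceback.
state = {"query": "hello", "index": threading.RLock()}
print(sorted(prunable_state(state)))  # ['query']
```

This silently drops state, so it only makes sense for values (like the index) that each worker can reconstruct locally.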
