LangChain Integration

Python A2A provides bidirectional integration with LangChain, allowing you to:

  1. Convert A2A agents to LangChain agents
  2. Convert LangChain components to A2A servers
  3. Convert MCP tools to LangChain tools
  4. Convert LangChain tools to MCP servers

This integration enables access to thousands of pre-built LangChain tools while maintaining A2A compatibility.
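
Each direction has a dedicated converter in python_a2a.langchain, as the sections below show:

    • A2A agent → LangChain agent: to_langchain_agent()
    • LangChain agent or chain → A2A server: to_a2a_server()
    • MCP tool → LangChain tool: to_langchain_tool()
    • LangChain tools → MCP server: to_mcp_server()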

A2A to LangChain

Converting A2A Agents to LangChain

You can convert any A2A agent to a LangChain agent using to_langchain_agent():

from python_a2a import A2AServer, agent, skill
from python_a2a.langchain import to_langchain_agent

@agent(
    name="Weather Agent",
    description="Provides weather information"
)
class WeatherAgent(A2AServer):
    
    @skill(
        name="Get Weather",
        description="Get current weather for a location"
    )
    def get_weather(self, location):
        return f"It's sunny and 75°F in {location}"
    
    def handle_task(self, task):
        return {"output": self.get_weather(task.input)}

# Create the A2A agent
weather_agent = WeatherAgent()

# Convert to LangChain
langchain_agent = to_langchain_agent(weather_agent)

# Use in LangChain
result = langchain_agent.invoke("What's the weather in New York?")
print(result)
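
Depending on the library version, the converted agent may return a plain string or a dict-like result; the "output" key below is an assumption, so a defensive check keeps the example portable:

# Handle both possible return shapes (assumption: dict results use "output")
text = result.get("output") if isinstance(result, dict) else result
print(text)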

Using A2A in LangChain Agents

You can use converted A2A agents as tools in LangChain agents:

from langchain.agents import initialize_agent, AgentType, Tool
from langchain.llms import OpenAI
from python_a2a import A2AServer, agent, skill
from python_a2a.langchain import to_langchain_agent

# Create the A2A agent
@agent(
    name="Calculator",
    description="Performs simple calculations"
)
class CalculatorAgent(A2AServer):
    
    @skill(
        name="Add",
        description="Add two numbers"
    )
    def add(self, a, b):
        return float(a) + float(b)
    
    def handle_task(self, task):
        parts = task.input.split("+")
        if len(parts) == 2:
            return {"output": self.add(parts[0].strip(), parts[1].strip())}
        return {"output": "Please provide an addition expression like '5 + 3'"}

# Convert to LangChain
calculator_agent = CalculatorAgent()
langchain_calculator = to_langchain_agent(calculator_agent)

# Create a LangChain agent with the A2A tool
llm = OpenAI(temperature=0)
tools = [
    Tool(
        name="Calculator",
        func=lambda q: langchain_calculator.invoke(q),
        description="Useful for performing addition operations"
    )
]

agent = initialize_agent(tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True)

# Run the agent
result = agent.run("What is 25 + 17?")
print(result)

LangChain to A2A

Converting LangChain to A2A Servers

You can convert any LangChain agent or chain to an A2A server:

from langchain.chains import LLMMathChain
from langchain.llms import OpenAI
from python_a2a.langchain import to_a2a_server
from python_a2a import run_server

# Create a LangChain chain
llm = OpenAI(temperature=0)
math_chain = LLMMathChain.from_llm(llm=llm)

# Convert to A2A server
a2a_server = to_a2a_server(
    math_chain,
    name="Math Assistant",
    description="Solves math problems"
)

# Run the server (this call blocks until interrupted)
if __name__ == "__main__":
    run_server(a2a_server, port=5000)

Using the Converted Server

from python_a2a import HTTPClient
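# Assumes the Math Assistant server from the previous example is already running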

client = HTTPClient("http://localhost:5000")
response = client.send_message("What is the square root of 144?")
print(response.content)

MCP to LangChain

Converting MCP Tools to LangChain Tools

You can convert MCP tools to LangChain tools:

from python_a2a.mcp import FastMCP
from python_a2a.langchain import to_langchain_tool
from python_a2a import run_server
from langchain.agents import initialize_agent, AgentType
from langchain.llms import OpenAI

# Create an MCP server with tools
calculator = FastMCP(name="Calculator MCP")

@calculator.tool()
def add(a: float, b: float) -> float:
    """Add two numbers together."""
    return a + b

@calculator.tool()
def subtract(a: float, b: float) -> float:
    """Subtract b from a."""
    return a - b

# Run the MCP server in a background thread
import threading
server_thread = threading.Thread(
    target=run_server,
    args=(calculator,),
    kwargs={"port": 8000},
    daemon=True
)
server_thread.start()

# Convert MCP tools to LangChain tools
add_tool = to_langchain_tool("http://localhost:8000", "add")
subtract_tool = to_langchain_tool("http://localhost:8000", "subtract")

# Use in a LangChain agent
llm = OpenAI(temperature=0)
tools = [add_tool, subtract_tool]

agent = initialize_agent(
    tools, 
    llm, 
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True
)

# Run the agent
result = agent.run("Add 15 and 27, then subtract 5 from the result")
print(result)
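
One caveat: the example converts the tools immediately after starting the server thread, which can race with server startup. A small standard-library helper (a generic sketch, not part of python_a2a) can block until the port accepts connections before you call to_langchain_tool():

import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 10.0) -> None:
    """Block until a TCP port accepts connections, or raise TimeoutError."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # A successful TCP connection means the server is listening
            with socket.create_connection((host, port), timeout=1):
                return
        except OSError:
            time.sleep(0.1)
    raise TimeoutError(f"{host}:{port} did not come up within {timeout}s")

# Call this after server_thread.start() and before converting the tools
wait_for_port("localhost", 8000)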

LangChain to MCP

Converting LangChain Tools to MCP Servers

You can convert LangChain tools to MCP servers:

from langchain.tools import DuckDuckGoSearchRun, WikipediaQueryRun
from langchain.utilities import WikipediaAPIWrapper
from python_a2a.langchain import to_mcp_server
from python_a2a import run_server

# Create LangChain tools
search_tool = DuckDuckGoSearchRun()
wikipedia_tool = WikipediaQueryRun(api_wrapper=WikipediaAPIWrapper())

# Convert to MCP server
mcp_server = to_mcp_server(
    tools=[search_tool, wikipedia_tool],
    name="Research Tools"
)

# Run the server
if __name__ == "__main__":
    run_server(mcp_server, port=8000)

Using the Converted MCP Server

from python_a2a.mcp import MCPClient

client = MCPClient("http://localhost:8000")
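
# Assumes the Research Tools server from the previous example is already running.
# The tool names below are derived from the LangChain tool names; the exact
# names may vary by library version, so treat them as illustrative.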

# Call the DuckDuckGo search tool
search_results = client.call_tool("duckduckgo_search", {"query": "python a2a protocol"})
print(search_results)

# Call the Wikipedia tool
wiki_results = client.call_tool("wikipedia_query", {"query": "Agent to Agent Protocol"})
print(wiki_results)

Complete Integration Example

This example shows a complete workflow that combines A2A, MCP, and LangChain:

import threading
import time
from langchain.llms import OpenAI
from langchain.agents import initialize_agent, Tool, AgentType
from langchain.chains import LLMMathChain

from python_a2a import A2AServer, agent, skill, run_server, HTTPClient
from python_a2a.mcp import FastMCP
from python_a2a.langchain import to_langchain_agent, to_a2a_server, to_langchain_tool

# 1. Create a specialized A2A agent
@agent(
    name="Weather Agent",
    description="Provides weather information"
)
class WeatherAgent(A2AServer):
    
    # Sample weather data
    weather_data = {
        "new york": {"condition": "Sunny", "temperature": 75},
        "london": {"condition": "Rainy", "temperature": 62},
        "tokyo": {"condition": "Cloudy", "temperature": 80},
        "paris": {"condition": "Partly Cloudy", "temperature": 70},
    }
    
    @skill(
        name="Get Weather",
        description="Get current weather for a location"
    )
    def get_weather(self, location):
        location = location.lower()
        if location in self.weather_data:
            weather = self.weather_data[location]
            return f"The weather in {location.title()} is {weather['condition']} with a temperature of {weather['temperature']}°F."
        else:
            return f"Sorry, I don't have weather data for {location}."
    
    def handle_task(self, task):
        return {"output": self.get_weather(task.input)}

# 2. Create an MCP server with tools
calculator = FastMCP(name="Calculator MCP")

@calculator.tool()
def add(a: float, b: float) -> float:
    """Add two numbers together."""
    return a + b

@calculator.tool()
def subtract(a: float, b: float) -> float:
    """Subtract b from a."""
    return a - b

# 3. Create a LangChain component
llm = OpenAI(temperature=0)
math_chain = LLMMathChain.from_llm(llm=llm)

# 4. Convert to respective components
weather_agent_instance = WeatherAgent()
langchain_weather = to_langchain_agent(weather_agent_instance)

math_a2a_server = to_a2a_server(
    math_chain,
    name="Math Assistant",
    description="Solves complex mathematical problems"
)

# 5. Run all servers in background threads
def run_server_thread(server, port):
    run_server(server, port=port)

# Start the MCP calculator server
calculator_thread = threading.Thread(
    target=run_server_thread,
    args=(calculator, 8000),
    daemon=True
)
calculator_thread.start()

# Start the weather agent server
weather_thread = threading.Thread(
    target=run_server_thread,
    args=(weather_agent_instance, 5001),
    daemon=True
)
weather_thread.start()

# Start the math A2A server
math_thread = threading.Thread(
    target=run_server_thread,
    args=(math_a2a_server, 5002),
    daemon=True
)
math_thread.start()

# 6. Convert MCP tools to LangChain tools
# (give the servers a moment to start before the tools are contacted)
time.sleep(2)

add_tool = to_langchain_tool("http://localhost:8000", "add")
subtract_tool = to_langchain_tool("http://localhost:8000", "subtract")

# 7. Create a combined LangChain agent with all tools
tools = [
    Tool(
        name="Weather",
        func=lambda q: langchain_weather.invoke(q),
        description="Get weather information for a specific location"
    ),
    # The converted MCP tools are already LangChain tools, so use them directly
    add_tool,
    subtract_tool,
    Tool(
        name="Math",
        func=lambda q: HTTPClient("http://localhost:5002").send_message(q).content,
        description="Solve complex math problems"
    )
]

agent = initialize_agent(
    tools, 
    llm, 
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True
)

# 8. Run the combined agent
if __name__ == "__main__":
    # Ask questions that require different tools
    questions = [
        "What's the weather like in Tokyo?",
        "What is 125 + 37?",
        "What is 200 - 45?",
        "Calculate the square root of 144"
    ]
    
    for question in questions:
        print(f"\nQuestion: {question}")
        result = agent.run(question)
        print(f"Answer: {result}")

Best Practices for LangChain Integration

  1. Component Selection:

    • Choose the right direction of conversion based on your use case
    • A2A to LangChain: When you want to use A2A agents within LangChain pipelines
    • LangChain to A2A: When you want to expose LangChain functionality as A2A agents
  2. Error Handling:

    • Implement proper error handling for cross-framework communication (see the sketch after this list)
    • LangChain and A2A raise errors in different shapes - map them to messages the calling framework can handle
  3. Performance:

    • Be aware of additional overhead from cross-framework communication
    • Consider running frequently used components natively in their respective frameworks
  4. Tool Design:

    • Keep tools focused on specific, well-defined tasks
    • Provide clear descriptions for tools to enable proper routing
    • Match parameter schemas across frameworks
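
As a minimal illustration of point 2, the sketch below wraps a cross-framework callable so that failures come back as plain strings the agent can observe, rather than exceptions that abort the run. The safe_tool helper is hypothetical, not part of python_a2a; it assumes only the LangChain Tool API used throughout this page.

from langchain.agents import Tool

def safe_tool(name: str, func, description: str) -> Tool:
    """Wrap a callable so errors surface as tool output instead of crashes."""
    def wrapped(query: str) -> str:
        try:
            return str(func(query))
        except Exception as exc:  # map A2A/MCP errors to a plain message
            return f"{name} failed: {exc}"
    return Tool(name=name, func=wrapped, description=description)

# Example: wrap the converted weather agent from the complete example above
weather_tool = safe_tool(
    "Weather",
    lambda q: langchain_weather.invoke(q),
    "Get weather information for a specific location",
)

Returning an error string lets a ReAct-style agent observe the failure and try another tool instead of terminating the run.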

By following these practices, you can effectively combine the power of A2A and LangChain to build sophisticated agent systems that leverage the best of both ecosystems.
