LangChain Integration

Give your LangChain agents the ability to discover and use x402 APIs automatically. Find any tool your AI needs in real time.

What you'll build: A LangChain agent that can search for crypto price APIs, weather data, news feeds, and any other x402-enabled service — then use them autonomously.

Prerequisites

You'll need Python with pip, plus an OpenAI API key exported as OPENAI_API_KEY, since the examples below use ChatOpenAI.

Installation

pip install agentindex langchain langchain-openai

Quick Start

Create a LangChain tool that searches Agent Index:

from agentindex import create_langchain_tool
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_openai_functions_agent
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder

# Create the Agent Index tool
agent_index_tool = create_langchain_tool()

# Create your agent
llm = ChatOpenAI(model="gpt-4", temperature=0)
tools = [agent_index_tool]

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful AI that can find and use x402 APIs. "
               "When you need external data, search Agent Index to find suitable APIs."),
    ("human", "{input}"),
    MessagesPlaceholder(variable_name="agent_scratchpad"),
])

agent = create_openai_functions_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

# Run it
result = agent_executor.invoke({
    "input": "I need to get real-time Bitcoin prices. Find me an API."
})
print(result["output"])  # invoke() returns a dict with "input" and "output" keys

How It Works

The Agent Index tool exposes a single function, agent_index_search, which your LangChain agent can call:

# Tool signature
agent_index_search(
    query: str,        # What capability are you looking for?
    category: str,     # Optional: crypto, defi, ai, weather, news, oracle
    max_price: float   # Optional: Maximum price per call in USD
) -> str              # Returns top 5 matching agents as formatted text

Example Tool Output

- Get real-time crypto prices (https://api.example.com/price) - $0.01/call, Health: healthy
- Historical price data (https://oracle.example.com/history) - $0.02/call, Health: healthy
- Multi-chain token prices (https://defi.example.com/tokens) - $0.005/call, Health: healthy

Advanced: Custom Client

Pass a custom client with different settings:

from agentindex import AgentIndex, create_langchain_tool

# Custom client with longer timeout
custom_client = AgentIndex(timeout=30.0)

# Create tool with custom client
tool = create_langchain_tool(client=custom_client)
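If the index is occasionally slow even with a longer timeout, simple retry logic around your calls can help. A generic backoff helper — our own sketch, not part of the agentindex SDK:

```python
import time

def with_retries(fn, attempts=3, base_delay=0.5):
    """Call fn(), retrying on exception with exponential backoff.

    Usage (assumes the client exposes a search method -- check the SDK docs):
        with_retries(lambda: custom_client.search("btc price"))
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # Out of attempts: surface the last error
            time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...
```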

Example Use Cases

Full Example: Autonomous Data Fetcher

from agentindex import AgentIndex, create_langchain_tool
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_openai_functions_agent
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain.tools import tool
import httpx

# Agent Index tool
agent_index_tool = create_langchain_tool()

# Tool to actually call the found APIs
@tool
def call_api(url: str) -> str:
    """Call an x402 API endpoint and return the result."""
    try:
        response = httpx.get(url, timeout=10)
        # Without a payment header, x402 endpoints respond with 402 Payment Required
        if response.status_code == 402:
            return "Payment required: this endpoint expects an x402 payment."
        return response.text[:1000]  # Return at most the first 1000 characters
    except httpx.HTTPError as e:
        return f"Error: {e}"

# Build agent with both tools
llm = ChatOpenAI(model="gpt-4", temperature=0)
tools = [agent_index_tool, call_api]

prompt = ChatPromptTemplate.from_messages([
    ("system", """You are a data-fetching AI. When asked for data:
1. Use agent_index_search to find a suitable API
2. Use call_api to fetch data from the best result
3. Parse and return the relevant information"""),
    ("human", "{input}"),
    MessagesPlaceholder(variable_name="agent_scratchpad"),
])

agent = create_openai_functions_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

# Now it can find AND use APIs autonomously
result = executor.invoke({
    "input": "What's the current ETH price? Find an API and get the data."
})
print(result["output"])

Recommended Tools

To deploy your LangChain agent in production, you'll need reliable infrastructure:

See our full recommended stack →

Next Steps

Install from PyPI →