LlamaIndex Integration

Build RAG-powered AI agents that can discover, evaluate, and pay for x402 APIs in real-time. Perfect for building data-aware applications that need dynamic external data sources.

🦙 What you'll build: A LlamaIndex agent with a custom tool that searches Agent Index to find the best x402 APIs for any data need — crypto prices, weather, news, DeFi yields, and more.

Why LlamaIndex + Agent Index?

LlamaIndex excels at connecting LLMs with external data sources. By integrating Agent Index, your RAG applications can:

- Discover x402-enabled APIs dynamically at query time instead of hard-coding endpoints
- Evaluate candidates by price, health status, and tier before calling them
- Combine internal document retrieval with real-time external data in a single agent

Prerequisites

- Python 3.10+
- An OpenAI API key available in your environment as OPENAI_API_KEY

Installation

pip install llama-index llama-index-llms-openai httpx

Quick Start

Create a LlamaIndex tool that searches Agent Index:

import httpx
from llama_index.core.tools import FunctionTool
from llama_index.core.agent import ReActAgent
from llama_index.llms.openai import OpenAI

# Agent Index API client
AGENT_INDEX_API = "http://209.38.42.28:4402"

def search_agent_index(query: str, category: str | None = None, max_price: float | None = None) -> str:
    """
    Search Agent Index for x402-enabled API endpoints.
    
    Args:
        query: What capability are you looking for? (e.g., "bitcoin price", "weather data")
        category: Optional filter - crypto, defi, ai, weather, news, oracle
        max_price: Optional maximum price per call in USD
    
    Returns:
        Formatted list of matching API endpoints with URLs, prices, and health status
    """
    params = {"q": query, "limit": 5}
    if category:
        params["category"] = category
    if max_price is not None:
        params["max_price"] = max_price
    
    try:
        response = httpx.get(f"{AGENT_INDEX_API}/search", params=params, timeout=10.0)
        response.raise_for_status()
        data = response.json()
        
        if not data.get("endpoints"):
            return "No matching endpoints found. Try a different query or category."
        
        results = []
        for ep in data["endpoints"]:
            price = f"${ep.get('price', 0):.4f}/call"
            health = ep.get('health', 'unknown')
            tier = ep.get('tier', 'N/A')
            results.append(
                f"- {ep['name']} ({ep['url']}) - {price}, Health: {health}, Tier: {tier}"
            )
        
        return "Found these x402 API endpoints:\n" + "\n".join(results)
    except Exception as e:
        return f"Error searching Agent Index: {str(e)}"

# Create LlamaIndex tool
agent_index_tool = FunctionTool.from_defaults(
    fn=search_agent_index,
    name="search_agent_index",
    description="Search for x402 payment-enabled API endpoints. Use this to find APIs for crypto prices, weather data, news feeds, DeFi analytics, and more."
)

# Create agent
llm = OpenAI(model="gpt-4", temperature=0)
agent = ReActAgent.from_tools(
    [agent_index_tool],
    llm=llm,
    verbose=True,
    system_prompt="""You are a helpful AI assistant that can find and recommend x402 API endpoints.
When users need external data, search Agent Index to find the best API for their needs.
Consider price, health status, and tier when making recommendations."""
)

# Run it
response = agent.chat("I need to get real-time Ethereum prices. What APIs are available?")
print(response)
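For reference, search_agent_index assumes the /search endpoint returns JSON shaped roughly like the sketch below. The field names (endpoints, name, url, price, health, tier) are taken from the parsing code above, but the endpoint name and URL in the sample are invented for illustration; the live schema may differ.

```python
# Hypothetical /search response shape assumed by search_agent_index.
# "CoinPrice Pro" and its URL are made-up illustrative values.
sample_response = {
    "endpoints": [
        {
            "name": "CoinPrice Pro",
            "url": "https://api.example.com/eth-price",
            "price": 0.001,
            "health": "healthy",
            "tier": "verified",
        }
    ]
}

# Format each endpoint the same way the tool does
lines = []
for ep in sample_response["endpoints"]:
    price = f"${ep.get('price', 0):.4f}/call"
    lines.append(
        f"- {ep['name']} ({ep['url']}) - {price}, "
        f"Health: {ep.get('health', 'unknown')}, Tier: {ep.get('tier', 'N/A')}"
    )
print("\n".join(lines))
# → - CoinPrice Pro (https://api.example.com/eth-price) - $0.0010/call, Health: healthy, Tier: verified
```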

Advanced: RAG with Dynamic Data Sources

Combine Agent Index with LlamaIndex's RAG capabilities to build agents that can both search your documents AND fetch real-time external data:

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.tools import QueryEngineTool, ToolMetadata
from llama_index.core.agent import ReActAgent
from llama_index.llms.openai import OpenAI

# Load your documents for RAG
documents = SimpleDirectoryReader("./docs").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

# Document search tool
doc_tool = QueryEngineTool(
    query_engine=query_engine,
    metadata=ToolMetadata(
        name="search_docs",
        description="Search internal documentation and knowledge base"
    )
)

# Agent Index tool (from above)
agent_index_tool = FunctionTool.from_defaults(
    fn=search_agent_index,
    name="search_agent_index",
    description="Search for external x402 APIs when you need real-time data"
)

# Combined agent with both tools
agent = ReActAgent.from_tools(
    [doc_tool, agent_index_tool],
    llm=OpenAI(model="gpt-4"),
    verbose=True,
    system_prompt="""You are an AI assistant with access to:
1. Internal documentation (use search_docs)
2. External x402 APIs (use search_agent_index)

For questions about internal processes, search docs first.
For real-time external data (prices, weather, news), use Agent Index."""
)

# Now the agent can use both internal docs AND external APIs
response = agent.chat(
    "What's our company's policy on API usage, and what's the current BTC price?"
)

Calling the APIs You Discover

Add a second tool to actually call the APIs your agent discovers:

def call_x402_api(url: str) -> str:
    """
    Call an x402 API endpoint and return the response.
    
    Args:
        url: The full URL of the API endpoint to call
    
    Returns:
        The API response data as a string
    """
    try:
        response = httpx.get(url, timeout=15.0, follow_redirects=True)
        if response.status_code == 402:
            # x402 endpoints respond with HTTP 402 Payment Required
            # until a payment is attached to the request
            return f"Payment required by this endpoint: {response.text[:200]}"
        response.raise_for_status()
        return response.text[:2000]  # Limit response size to keep agent context small
    except httpx.HTTPError as e:
        return f"HTTP Error: {str(e)}"
    except Exception as e:
        return f"Error calling API: {str(e)}"

call_api_tool = FunctionTool.from_defaults(
    fn=call_x402_api,
    name="call_x402_api",
    description="Call an x402 API endpoint URL to fetch data. Use after finding an API with search_agent_index."
)

# Agent with both search and call capabilities
agent = ReActAgent.from_tools(
    [agent_index_tool, call_api_tool],
    llm=OpenAI(model="gpt-4"),
    verbose=True,
    system_prompt="""You are a data-fetching AI. To get external data:
1. Use search_agent_index to find a suitable API
2. Use call_x402_api to fetch data from the best result
3. Parse and return the relevant information to the user"""
)
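Because x402 endpoints signal payment via HTTP status 402, it helps to interpret the status code before handing a response to the agent. Below is a minimal sketch of that logic as a standalone helper; the function name, messages, and size caps are illustrative choices, not part of Agent Index or the x402 spec.

```python
def interpret_x402_status(status_code: int, body: str) -> str:
    """Illustrative helper: turn an HTTP response from an x402 endpoint
    into a string an agent can reason about."""
    if status_code == 402:
        # HTTP 402 Payment Required: the endpoint wants payment before
        # serving data; the body typically describes the terms.
        return f"Payment required before data is returned: {body[:200]}"
    if 200 <= status_code < 300:
        return body[:2000]  # Same size cap used by call_x402_api above
    return f"Unexpected status {status_code}: {body[:200]}"

print(interpret_x402_status(402, "Send 0.001 USDC to continue"))
# → Payment required before data is returned: Send 0.001 USDC to continue
```

You could fold this directly into call_x402_api so the agent sees a clear "payment required" message instead of a raw HTTP error.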

Example Use Cases

- Crypto dashboards that fetch live BTC/ETH prices on demand
- Weather-aware assistants that pull current conditions for a location
- News summarizers that discover and query fresh feeds
- DeFi research agents that compare yields across protocols

Production Tips

- Always set timeouts (the examples use 10-15 seconds) so a slow endpoint can't stall the agent loop
- Cap response sizes before passing them to the LLM, as call_x402_api does with its 2000-character limit
- Prefer endpoints with a healthy status and an established tier when several results match
- Cache Agent Index search results for repeated queries to reduce latency and cost
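One production concern worth sketching is caching: repeated Agent Index searches for the same query don't need to re-hit the network. The sketch below uses a simple in-memory dict with a TTL; the 60-second TTL and cache structure are illustrative choices, not Agent Index requirements.

```python
import time

# Illustrative in-memory cache for Agent Index search results.
# A 60-second TTL is an arbitrary default; tune it for your workload.
_CACHE: dict[str, tuple[float, str]] = {}
CACHE_TTL_SECONDS = 60.0

def cached_search(query: str, search_fn) -> str:
    """Return a cached result for `query` if still fresh,
    otherwise call `search_fn` and cache its result."""
    now = time.monotonic()
    hit = _CACHE.get(query)
    if hit is not None and now - hit[0] < CACHE_TTL_SECONDS:
        return hit[1]
    result = search_fn(query)
    _CACHE[query] = (now, result)
    return result

# Demo with a stub search function that records how often it is called
calls = []
def fake_search(q):
    calls.append(q)
    return f"results for {q}"

print(cached_search("bitcoin price", fake_search))  # calls fake_search
print(cached_search("bitcoin price", fake_search))  # served from cache
print(len(calls))  # → 1
```

In production you would wrap search_agent_index this way, e.g. `cached_search(query, search_agent_index)`.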

Related Resources

Get Started with LlamaIndex →