AutoGen Integration
Build multi-agent systems with Microsoft AutoGen that can discover, evaluate, and pay for x402 services autonomously.
⚡ What you'll build: An AutoGen multi-agent team where agents collaborate to find and use x402 APIs, from market data to AI inference to specialized tools.
Prerequisites
- Python 3.9+
- AutoGen installed (pip install pyautogen)
- OpenAI API key (or other LLM provider)
- USDC on Base for payments (optional for testing)
Installation
# Install AutoGen and Agent Index
pip install pyautogen agentindex x402-client
Quick Start
Create a tool that lets AutoGen agents discover x402 services:
import os

from autogen import AssistantAgent, UserProxyAgent
from agentindex import AgentIndexClient
from x402_client import x402_fetch

# Initialize the Agent Index client
client = AgentIndexClient()

def format_results(results) -> str:
    """Format search results as a readable list for the LLM.
    Adjust the field names to match your Agent Index result schema."""
    return "\n".join(
        f"- {r['name']} ({r['endpoint_url']}): {r['description']}" for r in results
    )

# Define the search tool for AutoGen
def search_x402_services(query: str, category: str = None) -> str:
    """Search for x402 services that match the query."""
    results = client.search(query=query, category=category, limit=5)
    return format_results(results)

def call_x402_service(endpoint_url: str, payload: dict) -> str:
    """Call an x402 service with payment handling."""
    response = x402_fetch(
        endpoint_url,
        method="POST",
        json=payload,
        wallet_key=os.environ["WALLET_PRIVATE_KEY"],
    )
    return response.json()
# Create the AutoGen assistant with x402 capabilities
assistant = AssistantAgent(
    name="x402_assistant",
    system_message="""You are a helpful assistant with access to x402 services.
You can search for services using search_x402_services()
and call them using call_x402_service().
When a user needs data or capabilities you don't have,
search for an x402 service that can help.""",
    llm_config={
        "model": "gpt-4-turbo",
        "functions": [
            {
                "name": "search_x402_services",
                "description": "Search Agent Index for x402 services",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "query": {"type": "string", "description": "Search query"},
                        "category": {"type": "string", "description": "Category filter"}
                    },
                    "required": ["query"]
                }
            },
            {
                "name": "call_x402_service",
                "description": "Call an x402 service endpoint",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "endpoint_url": {"type": "string"},
                        "payload": {"type": "object"}
                    },
                    "required": ["endpoint_url", "payload"]
                }
            }
        ]
    }
)
# Create user proxy
user = UserProxyAgent(
    name="user",
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "coding"}
)

# Register the functions for execution
user.register_function(
    function_map={
        "search_x402_services": search_x402_services,
        "call_x402_service": call_x402_service
    }
)

# Start a conversation
user.initiate_chat(
    assistant,
    message="Find me a service that can analyze crypto sentiment"
)
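If your agents repeat similar searches within a conversation, you can wrap the search tool in a small time-based cache so the Agent Index isn't hit on every turn. This is a stdlib-only sketch; the 5-minute TTL is an arbitrary choice, not a recommendation from the Agent Index API:

```python
import time
from functools import wraps

def ttl_cache(ttl_seconds: float = 300.0):
    """Cache a tool's results for a short window so repeated
    agent searches don't hit the Agent Index API every turn."""
    def decorator(fn):
        store = {}

        @wraps(fn)
        def wrapper(*args, **kwargs):
            key = (args, tuple(sorted(kwargs.items())))
            now = time.monotonic()
            hit = store.get(key)
            if hit is not None and now - hit[0] < ttl_seconds:
                return hit[1]  # fresh cached value
            value = fn(*args, **kwargs)
            store[key] = (now, value)
            return value
        return wrapper
    return decorator

# Usage: decorate the search tool before registering it with the agent.
# @ttl_cache(300)
# def search_x402_services(query, category=None): ...
```

Because the decorator preserves the wrapped function's name and docstring via functools.wraps, the cached version can be registered with AutoGen exactly like the original.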
Multi-Agent Research Team
Build a team of agents that collaborate using x402 services:
from autogen import GroupChat, GroupChatManager

# Shared LLM config for the team; include the function schemas from the
# Quick Start so the researcher and analyst can call the x402 tools.
llm_config = {"model": "gpt-4-turbo"}

# Researcher agent - finds services
researcher = AssistantAgent(
    name="researcher",
    system_message="""You find x402 services relevant to the task.
Use search_x402_services() to discover APIs.
Evaluate them by cost, uptime, and capabilities.""",
    llm_config=llm_config
)
# Analyst agent - uses services
analyst = AssistantAgent(
    name="analyst",
    system_message="""You analyze data using x402 services.
Use call_x402_service() to get data.
Synthesize findings into actionable insights.""",
    llm_config=llm_config
)

# Coordinator agent
coordinator = AssistantAgent(
    name="coordinator",
    system_message="""You coordinate the team.
Direct the researcher to find services.
Direct the analyst to use them.
Budget: max $0.50 per task.""",
    llm_config=llm_config
)
# Create group chat
groupchat = GroupChat(
    agents=[user, coordinator, researcher, analyst],
    messages=[],
    max_round=10
)
manager = GroupChatManager(groupchat=groupchat, llm_config=llm_config)

# Run the team
user.initiate_chat(
    manager,
    message="Analyze the top 3 AI tokens by market cap and sentiment"
)
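The researcher's evaluation criteria (cost, uptime, capabilities) can be made concrete with a small scoring helper the agents call as a tool. The field names price_usd and uptime_pct are assumptions about the Agent Index result schema, not documented fields; rename them to match what your search results actually contain:

```python
def rank_services(results, max_price_usd: float = 0.50):
    """Rank candidate x402 services within budget.

    NOTE: `price_usd` and `uptime_pct` are assumed field names;
    adjust them to your Agent Index result schema. Services over
    the per-task budget are dropped, then the rest are sorted by
    uptime (descending) and price (ascending).
    """
    affordable = [r for r in results if r.get("price_usd", 0.0) <= max_price_usd]
    return sorted(
        affordable,
        key=lambda r: (-r.get("uptime_pct", 0.0), r.get("price_usd", 0.0)),
    )
```

The budget cutoff here mirrors the coordinator's "max $0.50 per task" instruction, enforcing it in code rather than trusting the LLM alone.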
Deploy Your AutoGen System
Ready to deploy your multi-agent system? Use production-ready infrastructure:
Railway
Deploy Python apps in seconds. $20 free credit to start.
DigitalOcean
Reliable VPS for long-running agents. $200 free credit.
Railway Deployment
# railway.json
{
  "build": {
    "builder": "NIXPACKS"
  },
  "deploy": {
    "startCommand": "python main.py",
    "healthcheckPath": "/health"
  }
}
# Procfile
web: python main.py
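The railway.json above points healthcheckPath at /health, so main.py must serve that route alongside the agents. A minimal stdlib sketch (the port and JSON response shape are illustrative assumptions, not Railway requirements):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    """Answers Railway's health probe at /health with 200 OK."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep agent logs free of probe noise

def start_health_server(port: int = 8080) -> HTTPServer:
    """Serve health checks on a daemon thread so the agent loop keeps the main thread."""
    server = HTTPServer(("0.0.0.0", port), HealthHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Call start_health_server() once at startup in main.py, before user.initiate_chat(), so the deployment passes health checks while conversations run.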
Docker Deployment (DigitalOcean)
# Dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "main.py"]
Best Practices
- Budget limits: Set spending caps per conversation to control costs
- Service caching: Cache Agent Index results to reduce API calls
- Retry logic: Handle 402 responses gracefully with exponential backoff
- Logging: Track all x402 transactions for debugging and auditing
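The retry best practice can be sketched as a plain wrapper around any x402 call; nothing here is specific to the x402 client, and the delay values are illustrative defaults:

```python
import time

def backoff_delays(max_retries: int, base: float = 0.5, cap: float = 8.0):
    """Exponential backoff schedule: base, 2*base, 4*base, ... capped at `cap` seconds."""
    return [min(base * (2 ** i), cap) for i in range(max_retries)]

def call_with_retry(call_fn, max_retries: int = 3, base: float = 0.5, sleep=time.sleep):
    """Retry a flaky call with exponential backoff.

    `call_fn` is any zero-argument callable, e.g.
    lambda: call_x402_service(url, payload). The `sleep` parameter is
    injectable for testing. In practice, catch your client's specific
    payment/HTTP errors instead of bare Exception.
    """
    last_exc = None
    for delay in [0.0] + backoff_delays(max_retries, base):
        if delay:
            sleep(delay)
        try:
            return call_fn()
        except Exception as exc:
            last_exc = exc
    raise last_exc
```

Wrapping call_x402_service this way before registering it with the user proxy gives every agent in the team the same retry behavior without changing their prompts.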
Example Use Cases
- Research teams that find and use specialized APIs on demand
- Trading bots that discover new data sources automatically
- Content agents that use AI services for generation and analysis
- DevOps agents that find and use infrastructure tools
Next Steps
- Browse available x402 services
- Read the full API documentation
- Submit your own x402 endpoint
- Try the CrewAI integration
- Try the LangChain integration