Semantic Kernel Integration
Build enterprise AI applications with Microsoft Semantic Kernel that can discover, evaluate, and pay for x402 services, in both .NET and Python.
⚡ What you'll build: A Semantic Kernel application with native x402 plugins, enabling your AI to autonomously find and use paid APIs from Agent Index: market data, AI inference, and specialized tools.
Why Semantic Kernel + x402?
- Enterprise-ready: Built by Microsoft for production AI applications
- Multi-language: First-class support for .NET and Python
- Plugin architecture: Clean separation of AI capabilities and tools
- Planner support: Automatic tool selection and chaining
- Azure integration: Native Azure OpenAI and cognitive services
Prerequisites
- .NET 8+ or Python 3.10+
- Semantic Kernel SDK
- OpenAI or Azure OpenAI API key
- USDC on Base for x402 payments (optional for testing)
Installation
.NET
```bash
dotnet add package Microsoft.SemanticKernel
dotnet add package AgentIndex.SDK   # Coming soon
dotnet add package X402.Client
```
Python
```bash
pip install semantic-kernel agentindex x402-client
```
Quick Start (Python)
Create an Agent Index plugin for Semantic Kernel:
```python
import asyncio
import os

import semantic_kernel as sk
from semantic_kernel.connectors.ai import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
from semantic_kernel.connectors.ai.prompt_execution_settings import PromptExecutionSettings
from semantic_kernel.functions import KernelArguments, kernel_function

from agentindex import AgentIndexClient
from x402_client import x402_fetch


class AgentIndexPlugin:
    """Plugin to discover and use x402 services via Agent Index."""

    def __init__(self):
        self.client = AgentIndexClient()

    @kernel_function(
        name="search_services",
        description="Search for x402 API services that match a query",
    )
    def search_services(self, query: str, category: str | None = None) -> str:
        """Search Agent Index for x402 services."""
        results = self.client.search(query=query, category=category, limit=5)
        return self._format_results(results)

    @kernel_function(
        name="get_service_details",
        description="Get detailed information about a specific x402 service",
    )
    def get_service_details(self, endpoint_url: str) -> str:
        """Get details for a specific endpoint."""
        details = self.client.get_endpoint(endpoint_url)
        return (
            f"Service: {details['name']}\n"
            f"Price: ${details['price']} per call\n"
            f"Uptime: {details['uptime']}%\n"
            f"Category: {details['category']}\n"
            f"Description: {details['description']}"
        )

    @kernel_function(
        name="call_service",
        description="Call an x402 service endpoint with payment",
    )
    def call_service(self, endpoint_url: str, payload: dict) -> str:
        """Call an x402 service with automatic payment handling."""
        response = x402_fetch(
            endpoint_url,
            method="POST",
            json=payload,
            wallet_key=os.environ["WALLET_PRIVATE_KEY"],
        )
        return str(response.json())

    def _format_results(self, results) -> str:
        return "\n".join(f"- {r['name']} (${r['price']}) - {r['url']}" for r in results)


# Initialize Semantic Kernel
kernel = sk.Kernel()

# Add Azure OpenAI (or OpenAI)
kernel.add_service(
    AzureChatCompletion(
        deployment_name="gpt-4",
        endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_KEY"],
    )
)

# Add the Agent Index plugin
kernel.add_plugin(AgentIndexPlugin(), plugin_name="x402")


# Create a function using the plugin
async def find_and_use_service():
    settings = PromptExecutionSettings(
        function_choice_behavior=FunctionChoiceBehavior.Auto()  # let SK auto-select functions
    )
    return await kernel.invoke_prompt(
        "Find a crypto sentiment analysis service and analyze BTC",
        arguments=KernelArguments(settings=settings),
    )


# Run it
print(asyncio.run(find_and_use_service()))
```
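Agent Index listings refresh roughly hourly, so repeated discovery calls can be served from a short-lived cache rather than re-querying the index. A minimal sketch of a TTL-cache decorator, where the `search_services` stub is a hypothetical stand-in for the real `AgentIndexClient.search` call:

```python
import time
from typing import Any, Callable


def ttl_cache(ttl_seconds: float = 3600.0) -> Callable:
    """Cache results per argument tuple for ttl_seconds (index data refreshes hourly)."""
    def decorator(fn: Callable) -> Callable:
        store: dict[tuple, tuple[float, Any]] = {}

        def wrapper(*args):
            now = time.monotonic()
            hit = store.get(args)
            if hit is not None and now - hit[0] < ttl_seconds:
                return hit[1]  # fresh enough: skip the underlying lookup
            value = fn(*args)
            store[args] = (now, value)
            return value

        return wrapper
    return decorator


calls = {"n": 0}  # counts how often the underlying lookup actually runs


@ttl_cache(ttl_seconds=3600)
def search_services(query: str) -> list[str]:
    """Hypothetical stand-in for AgentIndexClient.search(); replace with the real call."""
    calls["n"] += 1
    return [f"service-for-{query}"]
```

Decorating the plugin's search method the same way keeps repeated agent queries from hammering the index within one cache window.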
Quick Start (.NET)
```csharp
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;
using AgentIndex.SDK;
using X402.Client;

// Create a plugin class for Agent Index
public class AgentIndexPlugin
{
    private readonly AgentIndexClient _client;

    public AgentIndexPlugin()
    {
        _client = new AgentIndexClient();
    }

    [KernelFunction("search_services")]
    [Description("Search for x402 API services")]
    public async Task<string> SearchServicesAsync(
        [Description("Search query")] string query,
        [Description("Category filter")] string? category = null)
    {
        var results = await _client.SearchAsync(query, category, limit: 5);
        return FormatResults(results);
    }

    [KernelFunction("call_service")]
    [Description("Call an x402 service with payment")]
    public async Task<string> CallServiceAsync(
        [Description("Endpoint URL")] string endpointUrl,
        [Description("Request payload")] string payload)
    {
        var walletKey = Environment.GetEnvironmentVariable("WALLET_PRIVATE_KEY");
        var response = await X402Fetch.PostAsync(endpointUrl, payload, walletKey);
        return await response.Content.ReadAsStringAsync();
    }

    private static string FormatResults<T>(IEnumerable<T> results) =>
        string.Join("\n", results);  // replace with richer per-service formatting as needed
}

// Usage
var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion(
    deploymentName: "gpt-4",
    endpoint: Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT"),
    apiKey: Environment.GetEnvironmentVariable("AZURE_OPENAI_KEY"));

var kernel = builder.Build();
kernel.ImportPluginFromObject(new AgentIndexPlugin(), "x402");

// Use with automatic function calling
var result = await kernel.InvokePromptAsync(
    "Find a service for real-time crypto prices and get the BTC price",
    new KernelArguments(new OpenAIPromptExecutionSettings
    {
        ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
    }));
```
Using the Planner
Let Semantic Kernel's planner automatically orchestrate x402 service discovery and usage:
```python
from semantic_kernel.planners import FunctionCallingStepwisePlanner

# Create a planner (service_id must match the chat service registered on the kernel)
planner = FunctionCallingStepwisePlanner(service_id="default")

# Give it a complex task
result = await planner.invoke(
    kernel,
    "I need to analyze crypto market sentiment for my portfolio. "
    "Find relevant x402 services, check their prices, and use the "
    "most cost-effective one to analyze BTC, ETH, and SOL sentiment. "
    "Budget: max $0.10 total."
)

print(result.final_answer)
```
Deploy Your Application
Production-ready infrastructure for Semantic Kernel applications:
- Railway: Deploy .NET or Python apps instantly. $20 free credit to start.
- DigitalOcean: Scalable VPS for enterprise AI. $200 free credit for new users.
Railway Deployment (Python)
railway.json:
```json
{
  "build": {
    "builder": "NIXPACKS"
  },
  "deploy": {
    "startCommand": "python main.py"
  }
}
```
requirements.txt:
```text
semantic-kernel>=0.9.0
agentindex
x402-client
uvicorn
fastapi
```
Docker Deployment (.NET)
```dockerfile
# Dockerfile
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
WORKDIR /src
COPY . .
RUN dotnet publish -c Release -o /app

FROM mcr.microsoft.com/dotnet/aspnet:8.0
WORKDIR /app
COPY --from=build /app .
ENTRYPOINT ["dotnet", "MyAIApp.dll"]
```
Best Practices
- Plugin organization: Separate x402 capabilities into dedicated plugins
- Cost tracking: Implement a spending tracker in your plugin
- Caching: Cache Agent Index search results (they update hourly)
- Error handling: Wrap x402 calls with retry logic for 402 responses
- Logging: Use SK's built-in telemetry for debugging
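The cost-tracking and retry items above can be sketched as two small helpers. `SpendTracker` and `call_with_retry` are hypothetical names, not part of any SDK here; in real code the per-call price would come from the service's Agent Index listing, and the wrapped function would be the paid x402 request:

```python
import time


class BudgetExceeded(RuntimeError):
    """Raised when a paid call would push total spend past the cap."""


class SpendTracker:
    """Track cumulative x402 spend and refuse calls past a hard cap."""

    def __init__(self, cap_usd: float):
        self.cap_usd = cap_usd
        self.spent_usd = 0.0

    def charge(self, price_usd: float) -> None:
        """Record a charge, or raise BudgetExceeded before overspending."""
        if self.spent_usd + price_usd > self.cap_usd:
            raise BudgetExceeded(
                f"${price_usd:.4f} call would exceed the ${self.cap_usd:.2f} cap "
                f"(already spent ${self.spent_usd:.4f})"
            )
        self.spent_usd += price_usd


def call_with_retry(fn, attempts: int = 3, backoff_s: float = 0.5):
    """Retry a flaky call with exponential backoff; re-raise on final failure."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(backoff_s * 2 ** attempt)
```

A plugin method would check the tracker before dispatching, wrap the paid request in `call_with_retry`, and record the charge once the response confirms payment, so failed calls never count against the budget.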
Enterprise Features
- Azure Key Vault: Store wallet keys securely in Azure
- Managed Identity: Use Azure AD for authentication
- Application Insights: Monitor x402 spending and performance
- Cost Management: Set organizational spending limits
Example Use Cases
- Enterprise chatbots that access premium data sources on demand
- Automated research assistants using specialized APIs
- Trading systems with real-time market data plugins
- Content generation with AI inference services
- Customer service bots with dynamic capability expansion
Next Steps
- Browse available x402 services
- Read the full API documentation
- Submit your own x402 endpoint
- Try the CrewAI integration
- Try the LangChain integration
- Try the AutoGen integration