What happens when AI agents start talking to each other? Let's find out.
A bridge that lets any A2A-compatible agent chat with your self-hosted OpenClaw AI assistant. Think of it as giving your OpenClaw a phone number that other agents can call.
npm install -g openclaw-a2a@beta
openclaw-a2a --openclaw-url http://127.0.0.1:18789
docker pull ghcr.io/freema/openclaw-a2a:beta
docker run -p 3100:3100 \
  -e OPENCLAW_URL=http://host.docker.internal:18789 \
  ghcr.io/freema/openclaw-a2a:beta
PascalCase methods, SCREAMING_SNAKE enums, and field-presence discrimination, all implemented straight from the proto spec.
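To make the conventions above concrete, here is a minimal TypeScript sketch. The type and enum names (`SendMessageResponse`, `TASK_STATE_*`) are illustrative assumptions based on the proto-style naming described, not the bridge's actual API:

```typescript
// Illustrative sketch only; names are assumptions, not the bridge's real types.

// SCREAMING_SNAKE enum values, proto-style.
type TaskState =
  | "TASK_STATE_WORKING"
  | "TASK_STATE_COMPLETED"
  | "TASK_STATE_FAILED";

// A response carries exactly one of `task` or `message`; which optional
// field is present tells you what kind of payload you received.
interface SendMessageResponse {
  task?: { id: string; state: TaskState };
  message?: { role: string; text: string };
}

// Field-presence discrimination: branch on which field is actually set.
function describe(res: SendMessageResponse): string {
  if (res.task !== undefined) return `task ${res.task.id}: ${res.task.state}`;
  if (res.message !== undefined) return `message: ${res.message.text}`;
  throw new Error("response has neither task nor message");
}
```

This is the key difference from a tagged union: there is no explicit `kind` field, so consumers must check field presence rather than switch on a discriminator string.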
SSE streaming from OpenClaw Gateway through the A2A protocol, with a 15s heartbeat keep-alive.
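A sketch of the SSE wire format and the heartbeat trick. The helper names are illustrative, not the bridge's internals; only the frame layout and the 15s interval come from the description above:

```typescript
// One SSE event: optional event name, a data line, and a blank-line terminator.
function formatSseEvent(data: string, event?: string): string {
  const lines: string[] = [];
  if (event) lines.push(`event: ${event}`);
  lines.push(`data: ${data}`);
  return lines.join("\n") + "\n\n";
}

// SSE comment lines (starting with ":") are ignored by clients, which makes
// them a cheap keep-alive so proxies don't drop idle connections.
const HEARTBEAT_FRAME = ": heartbeat\n\n";
const HEARTBEAT_INTERVAL_MS = 15_000;

// In a real handler (sketch, not the bridge's actual code):
//   const timer = setInterval(() => res.write(HEARTBEAT_FRAME), HEARTBEAT_INTERVAL_MS);
//   upstream.on("data", (chunk) => res.write(formatSseEvent(String(chunk))));
//   res.on("close", () => clearInterval(timer));
```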
Route requests to different OpenClaw instances based on message metadata.
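Metadata-based routing could look roughly like this. The metadata key (`"openclaw/instance"`) and the route-table shape are hypothetical, chosen only to illustrate the idea:

```typescript
// Hypothetical routing sketch; key name and shapes are assumptions.

type Routes = Record<string, string>;

// Pick a backend OpenClaw URL from message metadata, falling back to a
// default instance when no route matches or metadata is absent.
function pickBackend(
  metadata: Record<string, unknown> | undefined,
  routes: Routes,
  fallback: string,
): string {
  const key = metadata?.["openclaw/instance"];
  if (typeof key === "string" && key in routes) return routes[key];
  return fallback;
}
```

Keeping the fallback explicit means a message with no metadata still lands on a working instance instead of erroring out.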
Conversation continuity via contextId and taskId on messages.
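In practice, continuity means copying `contextId` (and, when continuing a task, `taskId`) from the previous response onto the next message. A minimal sketch; the `followUp` helper and exact message shape are illustrative:

```typescript
// Illustrative message shape; the real A2A message has more fields.
interface A2AMessage {
  role: "user" | "agent";
  parts: { text: string }[];
  contextId?: string;
  taskId?: string;
}

// Build a follow-up message that stays in the same conversation and task.
function followUp(prev: A2AMessage, text: string): A2AMessage {
  return {
    role: "user",
    parts: [{ text }],
    contextId: prev.contextId, // same conversation thread
    taskId: prev.taskId,       // same task, if one is in progress
  };
}
```

Dropping `contextId` starts a fresh conversation, so a client that wants multi-turn behavior must thread it through every message.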
Native A2A clients: Google ADK, CrewAI, Microsoft Agent Framework, BeeAI, LangGraph, or any HTTP client.
Claude users: via A2A-MCP bridge (Claude speaks MCP, not A2A... yet).