By Qingyun Wu and Vasiliy Radostev, AG2AI Team
Editor’s note: AG2 (formerly AutoGen) is an open-source framework created by the original AutoGen team, now operating under the AG2AI organization with open governance. AG2 continues to serve the community via the PyPI packages ag2, autogen, and pyautogen.
The future of AI isn’t about isolated agents working in silos—it’s about interconnected ecosystems where autonomous agents collaborate seamlessly across platforms to solve complex problems. Today, we’re excited to announce AG2 0.10.0, bringing native support for Google’s Agent2Agent (A2A) protocol to help developers build truly interoperable multi-agent systems.
Breaking down framework silos
With AG2 0.10.0, developers can now create multi-agent systems that work seamlessly with agents from LangChain, CrewAI, and any other A2A-compatible framework. The A2A protocol enables AI agents built on different frameworks to discover, communicate, and collaborate effectively—as agents, not just as tools.
The momentum behind A2A is accelerating rapidly. Over 150 organizations now support the protocol, including every major hyperscaler and many leading technology providers. This growing ecosystem represents a fundamental shift toward collaborative AI systems that can work together regardless of their underlying implementation.
What’s new: Native A2A support in AG2
Our latest release adds native A2A protocol support, making it straightforward to build interoperable agents. Built on top of the A2A SDKs, this integration enables developers to:
- Expose existing AG2 agents as A2A-compatible services that can be discovered and used by any A2A client
- Connect to remote A2A agents built with any framework and use them as if they were native AG2 agents
- Stream real-time updates during long-running tasks with Server-Sent Events (SSE) support
Exposing your agents is simple
Below is a client and server example that demonstrates how easy it is to make AG2 agents available via the A2A protocol. You can find more detailed setup instructions in AG2’s documentation for A2A, and you can view the full source code here.
Creating the server
from autogen import ConversableAgent, LLMConfig
from autogen.a2a import A2aAgentServer

# Create your regular agent
llm_config = LLMConfig({"model": "gpt-4o-mini"})

agent = ConversableAgent(
    name="python_coder",
    system_message="You are an expert Python developer...",
    llm_config=llm_config,
    # Set human_input_mode to "NEVER" so the server never blocks waiting for human input
    human_input_mode="NEVER",
)

# Create the A2A server
server = A2aAgentServer(agent).build()
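To run the server, assuming build() returns a standard ASGI application (the usual pattern for servers built on the A2A Python SDK), a minimal sketch with an ASGI server such as uvicorn would be:

import uvicorn

# Assumption: `server` returned by build() is an ASGI app; adjust host and port as needed.
# The client example in the next section expects the server at http://localhost:8000.
uvicorn.run(server, host="0.0.0.0", port=8000)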
Creating the client
from autogen.a2a import A2aRemoteAgent

remote_coding_agent = A2aRemoteAgent(
    url="http://localhost:8000",  # your server URL
    name="python_coder",
)
Using ConversableAgent with a remote A2A agent
from autogen import ConversableAgent, LLMConfig
from autogen.a2a import A2aRemoteAgent

llm_config = LLMConfig({"model": "gpt-4o-mini"})

# A local AG2 agent that reviews code
review_agent = ConversableAgent(
    name="reviewer",
    system_message="You are a code reviewer...",
    llm_config=llm_config,
)

# The remote A2A coding agent (point the URL at your running A2A server)
remote_coding_agent = A2aRemoteAgent(
    url="http://localhost:9999",
    name="python_coder",
)

# An example coding task for the remote agent
prompt = "Write a Python function that reverses a string."

# Start an asynchronous chat between the local reviewer and the remote coder
await review_agent.a_initiate_chat(
    recipient=remote_coding_agent,
    message={"role": "user", "content": prompt},
)
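Because a_initiate_chat is a coroutine, the call above must run inside an async context (for example, a notebook cell). A minimal sketch for running it as a standalone script, assuming the remote A2A server is already listening on the URL above:

import asyncio

async def main():
    # Start the review conversation with the remote coding agent
    await review_agent.a_initiate_chat(
        recipient=remote_coding_agent,
        message={"role": "user", "content": prompt},
    )

asyncio.run(main())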
Enhanced security with Safe Guards
Security is paramount when building multi-agent systems. AG2 0.10.0 introduces Safe Guards in Group Chat, providing comprehensive security controls for multi-agent interactions:
- Policy-guided safeguards for fine-grained control over agent interactions
- Configurable communication guardrails supporting both regex and LLM-based detection methods (see the conceptual sketch after this list)
- Comprehensive security controls across agent-to-agent and agent-to-environment interactions
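To make the regex-based option concrete, here is a small, framework-agnostic sketch of the idea (illustrative only, not AG2’s Safe Guards API): a guardrail scans message content against blocked patterns and redacts matches before the message reaches another agent.

import re

# Hypothetical patterns a guardrail might block: leaked credentials and destructive shell commands
BLOCKED_PATTERNS = [
    re.compile(r"(?i)api[_-]?key\s*[:=]\s*\S+"),
    re.compile(r"rm\s+-rf\s+/"),
]

def regex_guardrail(message: str) -> str:
    """Return the message unchanged if it is safe, otherwise replace it with a notice."""
    for pattern in BLOCKED_PATTERNS:
        if pattern.search(message):
            return "[message blocked by guardrail]"
    return message

# Example: run each agent-to-agent message through the guardrail before delivery
print(regex_guardrail("Here is my api_key = sk-123"))  # [message blocked by guardrail]
print(regex_guardrail("Looks good, ship it."))         # Looks good, ship it.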
Better visibility with automatic flow diagrams
Understanding complex multi-agent workflows just got easier. AG2 0.10.0 automatically generates flow diagrams for all orchestrations, providing clear insights into agent interactions, message flows, and decision points throughout your agentic systems. These visualizations make it simpler to debug, optimize, and communicate how your agents work together.
A growing ecosystem
Since evolving from AutoGen in November 2024, AG2 has grown to over 3,700 GitHub stars and 400+ contributors. The framework supports:
- Multiple LLM providers: OpenAI, Google Gemini, Anthropic Claude, Azure OpenAI, and more
- Protocol interoperability: A2A for agent collaboration, MCP for tool integration
- Flexible orchestration patterns: Group chats, swarms, nested chats, and custom patterns
- Production deployment: Integration with FastAgency for scalable agent workflows
Additional improvements in 0.10.0
Beyond A2A protocol support, this release includes:
- Enhanced message content support for list[dict] types in two-agent chat and group chat APIs (see the example after this list)
- Improved tool detection for OpenAI o1 models with LLM Tools/Functions merging
- Documentation enhancements including process message hooks and updated examples
- Multiple bug fixes for improved reliability across the framework
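As an illustration of the list[dict] support, the content field of a message can now carry structured content parts instead of a single string. A minimal sketch, assuming the OpenAI-style content-part format and the agents defined in the examples above:

# Hypothetical multi-part message payload; `content` is a list of dicts rather than a string
structured_message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "Please review the following function:"},
        {"type": "text", "text": "def add(a, b):\n    return a + b"},
    ],
}

# It can be passed wherever a plain string message was accepted before, for example:
# await review_agent.a_initiate_chat(recipient=remote_coding_agent, message=structured_message)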
View the full changelog for complete details.
Get started today
Ready to build interoperable AI agents? Here’s how to begin:
Install AG2 0.10.0:
pip install ag2==0.10.0
Join the community:
- GitHub: github.com/ag2ai/ag2
- Discord: discord.gg/pAbnFJrkgZ
- Documentation: docs.ag2.ai
Learn more
- A2A Protocol Specification: a2a.dev
- Release Notes v0.10.0: github.com/ag2ai/ag2/releases
AG2 is licensed under Apache 2.0 and welcomes contributions from developers and organizations worldwide. The project operates under open governance principles to foster collaborative innovation in agentic AI.
We can’t wait to see what you build with AG2 and A2A. Share your projects with us on Discord or GitHub.