MCP vs A2A - Do You Need to Learn Both?


Hello Reader,

The last couple of weeks have been action-packed for Gen AI! Two specific announcements were at the forefront - MCP (Model Context Protocol) and A2A (Agent to Agent). In today's edition, we will cover the similarities and differences between the two, and answer which parts YOU need to know for the job and for interviews.

MCP

MCP was released by Anthropic. Before we dive into MCP, let's understand the existing challenges. Say you send a prompt to an app: "What's the weather in Tokyo?". The LLM in the app doesn't know the current weather, so it invokes an agent that runs some code. This code reaches out to an external weather tool, which sends weather data in JSON back to the agent. The agent passes it to the LLM, and the LLM formats the data into nice natural language and sends it to the user.

The question is - how does the agent code interact with the weather tool? Via API. And to do that, as shown below, the agentic code needs to know the API URL, required header information, and payload.
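Concretely, the "before MCP" agent code might look something like this minimal sketch. The URL, header names, and payload fields here are hypothetical - the point is that every one of them is hardcoded by the agent developer:

```python
# Sketch of the pre-MCP integration, with hypothetical names.
# The agent code hardcodes the weather tool's URL, headers, and payload shape.

WEATHER_API_URL = "https://api.example-weather.com/v1/current"  # hypothetical

def build_weather_request(city: str, api_key: str) -> dict:
    """Everything here is baked into the agent's code: if the tool's
    developers rename a field, this integration silently breaks."""
    return {
        "url": WEATHER_API_URL,
        "headers": {"Authorization": f"Bearer {api_key}"},
        "payload": {"city": city, "units": "metric"},
    }

req = build_weather_request("Tokyo", "demo-key")
print(req["payload"])  # {'city': 'Tokyo', 'units': 'metric'}
```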

This works, but there are some challenges too:

  • If the developers of the weather app change the API schema (adding or removing input/output fields) and the agentic code doesn't know about it, the connection breaks
  • Imagine, in the real world, agents talk to numerous such tools. Each tool requires a separate integration with custom code
  • It's a tedious task for the app developers to instrument all these separate tools inside the app code

Hence, MCP was born!

MCP standardizes the communication between the agentic code and tools (and local data sources, though tools are the most common use case). What does this mean?

  • An MCP client (think of it as a piece of code running inside the agent) connects to the MCP server, instead of connecting to the tool directly with a predefined API URL
  • The developers of each tool expose this MCP server
  • The MCP client asks the server, "What can you do?". In response, the MCP server replies with the tool's capabilities, descriptions, and input/output schemas
  • IMPORTANT - this discovery is dynamic and happens at runtime. If an input/output field changes, this discovery call will reveal all the current fields at runtime
  • The MCP client registers all of these, and can then invoke the tool via the MCP server
  • The MCP server handles the connection to the tool. As a result, the agentic code does NOT need to hardcode the API URLs like before
  • The developers of the tools give you the code for the MCP server. You still need to pass authn/z parameters (e.g., a token and team ID for the Slack tool, an Access Key and Secret Access Key for the AWS tool, etc.) to the MCP server. BUT you can run this MCP server locally or in the cloud. It's up to you
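The flow above can be simulated in a toy sketch. This is NOT the real MCP SDK - the real protocol uses JSON-RPC messages (discovery is a `tools/list` call), but plain Python classes are enough to show the key idea: the client learns the tool's name and schema at runtime instead of hardcoding them:

```python
# Toy simulation of MCP-style dynamic discovery (not the real SDK).

class ToyMCPServer:
    """Stands in for the MCP server that the tool's developers ship."""

    def list_tools(self) -> list[dict]:
        # In real MCP this is a JSON-RPC "tools/list" call; schemas are
        # returned at runtime, so the client never hardcodes them.
        return [{
            "name": "get_weather",
            "description": "Current weather for a city",
            "input_schema": {"city": "string"},
        }]

    def call_tool(self, name: str, args: dict) -> dict:
        if name == "get_weather":
            return {"city": args["city"], "temp_c": 21}  # canned response
        raise ValueError(f"unknown tool: {name}")

class ToyMCPClient:
    """Stands in for the MCP client code running inside the agent."""

    def __init__(self, server: ToyMCPServer):
        self.server = server
        # Discovery at connect time: register whatever the server exposes.
        self.tools = {t["name"]: t for t in server.list_tools()}

    def invoke(self, name: str, args: dict) -> dict:
        return self.server.call_tool(name, args)

client = ToyMCPClient(ToyMCPServer())
print(list(client.tools))                         # ['get_weather']
print(client.invoke("get_weather", {"city": "Tokyo"}))
```

If the tool's developers add a field, the next `list_tools()` call reveals it - the agent code itself never changes.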

Okay, so MCP standardizes the interaction between the Agent and the underlying tools. But what is this new A2A then? Let's find out.

A2A

MCP handles the communication between the agent and tools (and local data sources). But what about agent-to-agent communication? Let's look at the diagram below.

Agent B has the logic to get things done with tools/data sources using MCP - this part we understood from above.

Now, Agent A needs to call Agent B. How did this happen BEFORE A2A was in the picture?

Similar to any other API call, Agent A invokes the API URL of Agent B, passing AuthN/Z parameters and a payload. This brings challenges similar to the ones above:

  • Each agent needs to know beforehand what the other agent does
  • The connection URL and authn/z parameters have to be hardcoded
  • Each agent has to know how to submit prompts/tasks to the other agent
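The pre-A2A situation mirrors the pre-MCP weather example. In this hypothetical sketch (URL, token, and payload shape are all made up), notice that Agent A's developer had to know every detail about Agent B in advance:

```python
# "Before A2A" sketch: Agent A hardcodes everything about Agent B.
# The URL, auth token, and payload shape below are hypothetical.

AGENT_B_URL = "https://agent-b.example.com/tasks"   # hardcoded beforehand
AGENT_B_TOKEN = "demo-token"                        # authn/z known in advance

def build_task_request(prompt: str) -> dict:
    # Agent A's developer also had to know, beforehand, that Agent B
    # accepts a {"task": ...} payload - nothing here is discovered.
    return {
        "url": AGENT_B_URL,
        "headers": {"Authorization": f"Bearer {AGENT_B_TOKEN}"},
        "payload": {"task": prompt},
    }

print(build_task_request("Summarize today's weather report")["payload"])
```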

A2A (or Agent2Agent) standardizes the communication between agents. What does this mean?

A2A + MCP Flow

  • Agent A runs an A2A client, which connects to Agent B's A2A server
  • The A2A server returns an Agent Card to Agent A
  • This Agent Card includes the capabilities of Agent B, its endpoint URL, how to submit tasks/prompts, how to get notifications on task status, and the authn/z mechanism
  • IMPORTANT - note how dynamic this is (similar to MCP). Using this, any agent can discover capabilities, the connection URL, and other info at runtime, and doesn't need to hardcode them beforehand
  • A2A does NOT include payload schemas at this time (MCP returns payload schemas for tools/data sources)
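The Agent Card step can be sketched like this. Again a toy simulation, not the real A2A SDK: a real client would HTTP GET the card as JSON (the spec publishes it at a well-known path), and the card fields shown here are a simplified, partly hypothetical subset:

```python
# Toy simulation of A2A-style discovery via an Agent Card (not the real SDK).

AGENT_B_CARD = {  # what Agent B's A2A server would serve
    "name": "WeatherAgent",
    "description": "Answers weather questions using its own tools via MCP",
    "url": "https://agent-b.example.com/a2a",    # endpoint discovered, not hardcoded
    "capabilities": {"streaming": True},
    "authentication": {"schemes": ["bearer"]},   # how to authenticate
}

def discover(card_endpoint: str) -> dict:
    """Stand-in for fetching the Agent Card over HTTP and parsing the JSON."""
    return AGENT_B_CARD

card = discover("https://agent-b.example.com/.well-known/agent.json")
print(card["name"], card["url"])   # WeatherAgent https://agent-b.example.com/a2a
```

Everything Agent A needs - endpoint, capabilities, auth scheme - arrives at runtime in the card, which is exactly what the bullets above describe.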

In summary, MCP standardizes the connection between LLM agents and tools, and A2A standardizes the connection between two agents. They work hand in hand because they complement, rather than compete with, each other.

This is a pretty detailed subject, and if you want a more detailed explanation with code snippets, check out the video below:

video preview

If you have found this newsletter helpful, and want to support me 🙏:

Check out my bestselling courses on AWS, System Design, Kubernetes, DevOps, and more: Max discounted links

AWS SA Bootcamp with Live Classes, Mock Interviews, Hands-On, Resume Improvement and more: https://www.sabootcamp.com/

Keep learning and keep rocking 🚀,

Raj

Fast Track To Cloud

Free Cloud Interview Guide to crush your next interview. Plus, real-world answers for cloud interviews, and system design from a top Solutions Architect at AWS.
