Best Way to Build Agents on AWS (This will be the future)


Hello Reader,

Another week, another AI announcement. But this one is worth studying, because it will become the de facto standard for running agents on AWS. I am talking about the newly released Amazon Bedrock AgentCore. Let's dive in.

🧩 The Big Picture: Why Agents Exist

Let’s break it down using a practical example: What happens when a user asks an LLM app:

  • What’s the time in New York?
  • What’s the weather there?
  • List my S3 buckets

The LLM doesn't have this information, so it needs to invoke tools for time, weather, and AWS info. Perhaps it calls Tool 1 to get the latitude and longitude of NYC, uses that to call Tool 2 for the time and Tool 3 for the weather, and finally calls Tool 4 to list the S3 buckets.

That orchestration logic - "call this tool first, then that one if it fails" - used to be something you had to code in workflows. Now, agents handle it automatically. An agent can call one tool multiple times, sequence the calls correctly, and more, without you coding the logic.
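To see why that matters, here is a tiny sketch of what the hand-coded version of this flow could look like. Every helper in it (get_coordinates, get_time, get_weather, list_s3_buckets) is a hypothetical placeholder, just to show the orchestration logic you would otherwise own:

```python
# Hypothetical hand-coded orchestration: the call order, data passing, and
# error handling are all logic YOU have to write and maintain.
# Every helper below is a placeholder stub, not a real library.

def get_coordinates(city: str):
    # In real life: call a geocoding API here
    return 40.71, -74.01  # placeholder coordinates for New York

def get_time(lat: float, lon: float) -> str:
    # In real life: call a time API with the coordinates
    return "10:30 AM EST"  # placeholder

def get_weather(lat: float, lon: float) -> str:
    # In real life: call a weather API with the coordinates
    return "72F and sunny"  # placeholder

def list_s3_buckets() -> list:
    # In real life: boto3.client("s3").list_buckets()
    return ["my-bucket-1", "my-bucket-2"]  # placeholder

def answer_user_question() -> str:
    lat, lon = get_coordinates("New York")  # Tool 1
    local_time = get_time(lat, lon)         # Tool 2 depends on Tool 1
    weather = get_weather(lat, lon)         # Tool 3 depends on Tool 1
    buckets = list_s3_buckets()             # Tool 4 is independent
    return f"Time: {local_time}, Weather: {weather}, Buckets: {buckets}"

print(answer_user_question())
```

With an agent, that whole answer_user_question() orchestration (and its error handling) is something the LLM figures out on the fly.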

To understand the value proposition of this new release, we have to understand the previous iteration - Amazon Bedrock Agents.

☁️ Bedrock Agents: Fully Managed, but Verbose

The flagship platform to run agents on AWS "was" Amazon Bedrock Agents. Below is how it works:

  • You use Bedrock-hosted LLMs
  • You define an Action Group for the agent
  • In that Action Group, each tool (e.g., get time, fetch weather, list S3) must be its own Lambda function (a sketch of one such Lambda follows this list)
  • You define each function’s description and parameters
  • The agent decides the execution order based on those descriptions and parameters
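To make that concrete, here is a rough sketch of what one such tool Lambda could look like. I am using the function-details style of action groups; treat the exact event and response field names as an assumption and verify them against the current Bedrock Agents docs:

```python
# Sketch: one "get current time" tool as a Bedrock Agents action group Lambda.
# Assumes the function-details style of action groups; the field names are my
# understanding of the format, so verify against the current docs.
from datetime import datetime, timezone

def lambda_handler(event, context):
    # Bedrock tells the Lambda which action group / function it wants
    action_group = event.get("actionGroup", "")
    function_name = event.get("function", "")

    # The actual work for this single tool
    now_utc = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S UTC")

    # Return the result in the shape the agent expects
    return {
        "messageVersion": "1.0",
        "response": {
            "actionGroup": action_group,
            "function": function_name,
            "functionResponse": {
                "responseBody": {"TEXT": {"body": f"The current time is {now_utc}"}}
            },
        },
    }
```

Now multiply this by every tool (weather, S3 listing, and so on) and you can see why the cons below call it a major pain.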

Here are the pros and cons

Pros:

  • Managed infrastructure - both the LLM and the underlying Lambda infrastructure are managed by AWS
  • Easy Bedrock Guardrails integration
  • Since this is all an AWS service, you can use AWS Support for issues (except your Lambda code; that's your part of the Shared Responsibility Model ;) )

Cons:

  • You must code and manage multiple Lambdas. This is a major pain
  • Descriptions and input parameters must be well written so that the LLM can determine which function to call
  • No external LLM support (e.g., Claude hosted outside Bedrock)

Now, let's take a look at the next iteration, AWS Strands.

🔧 AWS Strands: Developer Freedom with Built-in Power

Think of AWS Strands as a way to build powerful agentic workflows in a single Python program with a minimal amount of code. Why?

  • Supports both Bedrock and external LLMs (e.g., Anthropic Claude)
  • This one is awesome: it comes with 20+ prebuilt tools that you do NOT have to code. For example:
    • current_time (gives current time)
    • use_aws (uses boto3 under the hood to list AWS resources)
    • http_request (figures out API calls automatically)

Here is the Strands code to implement the above, in fewer than 30 lines (I am not joking). As you can see, most of it is plain English. The powerful part is the tools list, where I simply specify the prebuilt tools: current_time (to get the time!), http_request (calls popular API endpoints automatically!), and use_aws (runs boto3 commands from your natural language!).
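Here is a minimal sketch of that Strands program (the full version is in the GitHub link at the end). It assumes the strands-agents and strands-agents-tools packages and AWS credentials with access to the default Bedrock model:

```python
# pip install strands-agents strands-agents-tools
# Assumes AWS credentials with access to the default Bedrock model.
from strands import Agent
from strands_tools import current_time, http_request, use_aws

# Plain-English instructions for the agent
SYSTEM_PROMPT = """
You are a helpful assistant. When asked about a city, find its latitude and
longitude, then report the current local time and weather there. You can also
answer questions about the user's AWS account.
"""

# The powerful part: hand the agent prebuilt tools instead of coding them
agent = Agent(
    system_prompt=SYSTEM_PROMPT,
    tools=[current_time, http_request, use_aws],
)

# The agent decides which tools to call and in what order
agent("What's the time and weather in New York? Also list my S3 buckets.")
```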

You no longer need to write a separate Lambda for each tool. Great, you have your Strands code running on your laptop. Now what? The final goal is to run it on AWS. You could upload the code and manage it yourself: run the Strands code on Lambda or EKS, front it with an ALB or API Gateway, instrument security and observability, keep up with upgrades, etc. That's still a lot of pain.

Hence Bedrock AgentCore was born!

🚀 Bedrock AgentCore

Bedrock AgentCore lets you run your agents on AWS with a few simple commands. Let's find out how.

Step 1: Change your code to run on AgentCore

Don't fret, you just need to add a couple of lines; the majority of your code stays the same. If you take the Strands code above, you need to make the following changes to make it AgentCore compatible.

The changes, shown in the sketch that follows this list, are:

  • Import the necessary library (pretty standard), then instantiate a Bedrock AgentCore app
  • Encapsulate your code in a function (in this case invoke()) and add @app.entrypoint above it. This simply tells AgentCore what the entry point is when your program is invoked
  • Add the standard app.run() call to execute the code
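Here is a minimal sketch of what the wrapped code looks like. The import path and the "prompt" payload key reflect my reading of the bedrock-agentcore Python SDK, so double-check them against the current AgentCore docs:

```python
# pip install bedrock-agentcore strands-agents strands-agents-tools
from bedrock_agentcore.runtime import BedrockAgentCoreApp  # import path per my reading of the SDK
from strands import Agent
from strands_tools import current_time, http_request, use_aws

# 1. Instantiate a Bedrock AgentCore app
app = BedrockAgentCoreApp()

agent = Agent(tools=[current_time, http_request, use_aws])

# 2. Encapsulate the agent logic in a function and mark it as the entry point
@app.entrypoint
def invoke(payload):
    # AgentCore passes the request payload to your function
    # (the "prompt" key is an assumption; match it to how you invoke the agent)
    user_message = payload.get("prompt", "Hello")
    result = agent(user_message)
    return {"result": str(result)}

# 3. Standard app.run() to start serving requests
if __name__ == "__main__":
    app.run()
```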

Step 2: Containerize your code (you don't have to write a Dockerfile!)

  • You simply run the command "agentcore configure" and it creates a Dockerfile for you!
    • I hate writing Dockerfiles myself because I always forget to include something. AgentCore creates a Dockerfile that packages the code and the requirements.txt file
  • This all happens on your local laptop

Step 3: Deploy and run your agent - sounds like magic?

This part of AgentCore is what makes it so great!

You run the command "agentcore launch" and a lot of things happen. Let's go through them one by one:

  • AgentCore copies the Python code, requirements.txt, and Dockerfile from your local machine to an S3 bucket (it creates the bucket for you)
  • It runs a build in AWS CodeBuild using that Dockerfile, which creates a container image for the agent code
  • It saves the image in an ECR repo (it creates the repo for you)
  • It starts running the container in the AgentCore runtime, ready to be invoked (a boto3 invocation sketch follows this list)
  • It sets up logging and tracing for you. AWS even created a new Gen AI Observability experience, which is super slick
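Once the launch completes, you can invoke the deployed agent programmatically. Below is a rough boto3 sketch; the "bedrock-agentcore" client name, the invoke_agent_runtime operation, and the response handling are my understanding of the new data-plane API, so verify them (and swap in your own runtime ARN) before relying on this:

```python
# Rough sketch: invoking the deployed AgentCore runtime with boto3.
# The "bedrock-agentcore" client, invoke_agent_runtime operation, and response
# handling are assumptions based on my reading of the new data-plane API;
# check the latest boto3 docs before using this.
import json
import boto3

client = boto3.client("bedrock-agentcore", region_name="us-east-1")

response = client.invoke_agent_runtime(
    # Placeholder ARN: use the runtime ARN printed by "agentcore launch"
    agentRuntimeArn="arn:aws:bedrock-agentcore:us-east-1:123456789012:runtime/EXAMPLE",
    payload=json.dumps({"prompt": "What's the time and weather in New York?"}),
)

# The payload comes back as a stream; read and print the agent's answer
print(response["response"].read().decode("utf-8"))
```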

Below are the key characteristics of AgentCore:

  • Agents run on serverless microVMs managed by AWS
  • Out-of-the-box Gen AI logging and tracing
  • Convert any Lambda function or API into an MCP tool
  • Supports third-party agent frameworks (CrewAI, LangGraph, etc.); if you have written your agents in these frameworks, you can still use AgentCore to run them on AWS
  • Easy to add AuthN/Z
  • And as you have probably noticed, there is no need to write and manage an individual Lambda for each tool, unlike Amazon Bedrock Agents

🤖 Summary

  • Strands simplified coding the agent, but the problem of running and managing it yourself on AWS persisted. For example, you still had to run that code on EKS or Lambda and manage the orchestration yourself
  • AgentCore takes your existing agentic code, containerizes it, and runs it on managed infrastructure, with security, logging, and tracing built in
  • I predict AgentCore will become the primary platform for running agents on AWS
  • Deprioritize learning Bedrock Agents

Below is the code for AgentCore I used: https://github.com/saha-rajdeep/Strands-agents-demo/blob/main/agentcore_weather_time_strands.py

If you want to see the demo in action, check out the video:

[Video preview]

The future is agentic, and now you are ready for it!

If you have found this newsletter helpful, and want to support me 🙏:

Check out my bestselling courses on AWS, System Design, Kubernetes, DevOps, and more: Max discounted links

AWS SA Bootcamp with Live Classes, Mock Interviews, Hands-On, Resume Improvement and more: https://www.sabootcamp.com/

Keep learning and keep rocking 🚀,

Raj

Fast Track To Cloud
