
AI in Production: Gen AI and Agentic AI at scale
Deploy AI to AWS, GCP, Azure, and Vercel with MLOps, Bedrock, SageMaker, RAG, Agents, and MCP: scalable, secure, and observable.
What you'll learn
- Deploy SaaS LLM apps to production on Vercel, AWS, Azure, and GCP, using Clerk
- Design cloud architectures with Lambda, S3, CloudFront, SQS, Route 53, App Runner and API Gateway
- Integrate with Amazon Bedrock and SageMaker, and build with GPT-5, Claude 4, open-source (OSS) models, Amazon Nova and HuggingFace
- Roll out to Dev, Test and Prod automatically with Terraform, and ship continuously via GitHub Actions
- Deliver enterprise-grade AI solutions that are scalable, secure, monitored, explainable, observable, and controlled with guardrails
- Create Multi-Agent systems and Agentic Loops with Amazon Bedrock AgentCore and Strands Agents
Requirements
- While it’s ideal if you can code in Python and have some experience working with LLMs, this course is designed for a very wide audience, regardless of background. I’ve included a whole folder of self-study labs that cover foundational technical and programming skills. If you’re new to coding, there’s only one requirement: plenty of patience!
- The course runs best if you have a small budget of a few dollars for APIs and cloud providers. But we monitor expenses at every point, and spending is always a personal choice.
About this course
This is the course that my students have asked for more than all my other courses put together.
One student called it:
“The missing course in AI.”
This course is for:
- Entrepreneurs
- Enterprise engineers
- …and everyone in between.
It’s not just about RAG — although we’ll work with RAG.
It’s not just about Agents — but there will be many Agents.
It’s not just about MCP — but yes, there will be plenty of MCP too.
This course is about:
RAG, Agents, MCP, and so much more… deployed to production.
Live.
Enterprise-grade.
Scalable, resilient, secure, monitored — and explained.
You’ll ship real-world, production-grade AI with LLMs and agents across Vercel, AWS, GCP, and Azure, going deepest on AWS.
Across four weeks you’ll take four products to production:
Week 1
You’ll launch a Next.js SaaS product on Vercel and AWS, with AWS App Runner and Clerk for user management and subscriptions.
Week 2
You’ll become an AI platform engineer on AWS:
- Deploy serverless infrastructure using Lambda, Bedrock, API Gateway, S3, CloudFront and Route 53 (see the sketch below)
- Write Infrastructure as Code with Terraform
- Set up CI/CD pipelines with GitHub Actions, for hands-free deployments and one-click promotions
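To give a flavour of the serverless pattern in Week 2, here is a minimal sketch of a Lambda handler calling a model on Amazon Bedrock with boto3. The model ID, region, and event shape are illustrative assumptions, not the course's exact code:

```python
import boto3

# Bedrock runtime client; the region is an assumption for this sketch.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def lambda_handler(event, context):
    # "prompt" is a hypothetical field on the incoming event.
    prompt = event.get("prompt", "Hello")
    response = bedrock.converse(
        modelId="amazon.nova-lite-v1:0",  # placeholder model ID
        messages=[{"role": "user", "content": [{"text": prompt}]}],
    )
    reply = response["output"]["message"]["content"][0]["text"]
    return {"statusCode": 200, "body": reply}
```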
Week 3
You’ll gain broad industry skills for GenAI in production:
- Deploy a Cyber Security Analyst agent with MCP to Azure and GCP
- Stand up SageMaker inference (see the sketch below)
- Build a data ingest pipeline to S3 Vectors
- Deploy a Researcher Agent using OpenAI OSS models on Bedrock + MCP
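As a taste of the SageMaker work in Week 3, this is a minimal sketch of calling an already-deployed SageMaker inference endpoint with boto3. The endpoint name and payload format are assumptions; they depend on the model container you deploy:

```python
import json
import boto3

# SageMaker runtime client; the region is an assumption for this sketch.
runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

response = runtime.invoke_endpoint(
    EndpointName="research-llm-endpoint",  # hypothetical endpoint name
    ContentType="application/json",
    Body=json.dumps({"inputs": "Summarise the latest CVE advisories."}),
)
print(response["Body"].read().decode("utf-8"))
```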
Week 4
You’ll go fully agentic in production, architecting multi-agent systems with:
- Aurora Serverless, Lambda and SQS (see the sketch below)
- JWT-authenticated CloudFront frontends
- LangFuse observability
- An overview of Amazon Bedrock AgentCore
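To illustrate the queue-driven side of the Week 4 multi-agent system, here is a minimal sketch of an agent worker long-polling an SQS queue with boto3. The queue name and the handling step are illustrative assumptions:

```python
import boto3

# SQS client; the region and queue name are assumptions for this sketch.
sqs = boto3.client("sqs", region_name="us-east-1")
queue_url = sqs.get_queue_url(QueueName="agent-tasks")["QueueUrl"]

while True:
    batch = sqs.receive_message(
        QueueUrl=queue_url,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,  # long polling reduces empty receives
    )
    for message in batch.get("Messages", []):
        # Hand the task body to an agent here; printing stands in for that step.
        print("Handling task:", message["Body"])
        sqs.delete_message(
            QueueUrl=queue_url,
            ReceiptHandle=message["ReceiptHandle"],
        )
```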
By the end, you’ll know how to:
- Pick the right architecture
- Lock down security
- Monitor costs
- Deliver continuous updates
…and everything else you need to run scalable, reliable AI apps in production.
Course sections (Weeks & Projects)
Week 1
SaaS App Live in Production with Vercel, AWS, Next.js, Clerk, App Runner
Project: SaaS Healthcare App
Week 2
AI Platform Engineering on AWS with Bedrock, Lambda, API Gateway, Terraform, CI/CD
Project: Digital Twin Mk II
Week 3
Gen AI in Production with Azure, GCP, AWS SageMaker, S3 Vectors, MCP
Project: Cybersecurity Analyst
Week 4
Agentic AI in Production: Build and deploy a Multi-Agent System on AWS (Aurora Serverless, Lambda, SQS) with LangFuse and Bedrock AgentCore
Capstone Project: SaaS Financial Planner