# NOTE: This is a work-in-progress document.
Author: Jay Sivadasan
Last Updated: 11/22/2025
# 🎓 Akamai Learning Curriculum (Beginner → Architect)
Duration: 8–10 weeks (self-paced)
Outcome: Able to design, implement, secure, optimize, and govern Akamai solutions for enterprise-scale systems.
## 📘 Phase 1 — Foundations (Week 1–2)
1. CDN Basics
- What is a CDN? [WIKI](https://en.wikipedia.org/wiki/Content_delivery_network)
- **Edge vs origin**: An origin server is the primary source of your content and holds the original, authoritative files. An edge server is one of many servers, distributed globally as part of a Content Delivery Network (CDN), that caches a copy of this content so it can be served from locations closer to users. The origin remains essential for dynamic content and server-side logic, while edge servers handle static content delivery to reduce latency and load on the origin.
- How caching works: CDN caching works by storing copies of content on a network of edge servers around the world, so content can be delivered from a server geographically closer to the user. When a user requests content, the CDN serves it from the nearest edge server instead of the original, or "origin," server. This process significantly reduces latency and improves website performance by lessening the load on the origin server and decreasing the distance the data has to travel.
- How the caching process works
- User request: A user requests a file (e.g., an image, CSS file, or video) from a website.
- Request routing: The CDN directs the request to the edge server closest to the user's location.
- Cache check: The edge server checks if it has a valid, cached copy of the requested file.
- Cache hit: If a valid copy is found (a "cache hit"), the server immediately sends the file to the user. This is a very fast process.
- Cache miss: If a valid copy is not found (a "cache miss"), the edge server requests the file from the origin server.
- Content delivery and caching: The origin server sends the file to the edge server. The edge server then delivers the file to the user and stores a copy of it for future requests.
    - Cache expiration: The CDN's caching is controlled by settings, most often the s-maxage Cache-Control directive, which tells the edge server how long it may keep the file before it must be refreshed from the origin (see the origin sketch below).
- Benefits of CDN caching
- Reduced latency: Serving content from a nearby edge server drastically cuts down the time it takes for data to travel to the user.
- Lower origin server load: By handling many requests from its cache, the CDN significantly reduces the number of requests that reach the origin server, which helps prevent server overload.
- Improved user experience: Faster load times lead to a better experience for the end-user.
- CDN benefits in short: lower latency, better performance, higher availability, lower cost
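
To make the expiration step concrete, here is a minimal origin sketch in TypeScript (Node.js) that sets the Cache-Control header the CDN reads. The route, TTL values, and port are illustrative assumptions, not Akamai-specific settings.

```typescript
// Minimal origin server sketch: s-maxage targets shared caches (CDN edge
// servers), while max-age applies to the browser. The route, TTLs, and
// port below are illustrative assumptions.
import { createServer } from "node:http";

const server = createServer((req, res) => {
  if (req.url === "/styles.css") {
    res.writeHead(200, {
      "Content-Type": "text/css",
      // Edge servers may cache this for 1 hour; browsers for 5 minutes.
      "Cache-Control": "public, max-age=300, s-maxage=3600",
    });
    res.end("body { margin: 0; }");
    return;
  }
  // Dynamic responses can opt out of shared caching entirely.
  res.writeHead(200, { "Cache-Control": "private, no-store" });
  res.end("dynamic content");
});

server.listen(8080);
```

On a cache miss, the edge fetches from an origin like this, stores the copy for up to s-maxage seconds, and serves later requests from cache until that TTL expires.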
- Reverse proxy vs forward proxy: A forward proxy acts as an intermediary for clients, protecting their identities and controlling their access to external servers, while a reverse proxy acts as an intermediary for servers, protecting them and managing incoming requests from clients. Forward proxies are used by clients for privacy, content filtering, and bypassing geo-restrictions, whereas reverse proxies are used by servers for load balancing, caching, and security against external attacks.
- Reverse proxy (the model most relevant to us, since CDN edge servers act as reverse proxies)
- Purpose: Protects and manages one or more backend servers from external clients.
- Position: Sits in front of one or more servers.
- How it works: Clients connect to the reverse proxy, which then forwards the request to the appropriate backend server. It hides the IP addresses of the backend servers and acts as a single point of contact.
- Primary use cases:
- Load balancing: Distributes incoming traffic across multiple servers to prevent any single server from being overloaded.
- Security: Acts as a firewall to protect servers from attacks.
- SSL encryption/decryption: Handles SSL/TLS encryption and decryption to reduce the load on the backend servers.
- Caching: Caches static content to improve performance.
- TLS offloading: A reverse proxy commonly handles SSL/TLS encryption and decryption on behalf of the backend servers. This frees the backends from the heavy computational load of encryption, improves performance, and simplifies certificate management by centralizing it on the proxy itself (see the sketch below).
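
A minimal TLS-offloading reverse proxy sketch in TypeScript (Node.js), assuming a certificate and key on disk and a single backend on 127.0.0.1:3000 (all hypothetical values): the proxy terminates HTTPS and forwards plain HTTP to the backend.

```typescript
// Reverse proxy sketch with TLS offloading: clients speak HTTPS to the
// proxy; the backend only ever sees plain HTTP on a private network.
// Certificate paths, ports, and the backend address are assumptions.
import { createServer } from "node:https";
import { request } from "node:http";
import { readFileSync } from "node:fs";

const tls = {
  key: readFileSync("proxy-key.pem"),
  cert: readFileSync("proxy-cert.pem"),
};

createServer(tls, (clientReq, clientRes) => {
  // Forward the decrypted request to the backend origin over plain HTTP.
  const upstream = request(
    {
      host: "127.0.0.1",
      port: 3000,
      path: clientReq.url,
      method: clientReq.method,
      headers: { ...clientReq.headers, "x-forwarded-proto": "https" },
    },
    (upstreamRes) => {
      // Relay the backend's response; its IP is never exposed to clients.
      clientRes.writeHead(upstreamRes.statusCode ?? 502, upstreamRes.headers);
      upstreamRes.pipe(clientRes);
    }
  );
  clientReq.pipe(upstream);
}).listen(8443);
```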
2. Akamai Basics
- High-level Akamai architecture
- Akamai edge server network
- What Ion, Kona, Bot Manager, EdgeWorkers, Image Manager, and mPulse are
- Akamai terminology:
- Edge nodes
- Property
- CP code: Think of it as a tracking tag that follows your content through the Akamai network. Akamai serves content from tens of thousands of edge servers; to know which content belongs to which customer, product, and environment, it tags every request with a CP code. Using a separate CP code per app lets you report on and bill each app separately.
- Purge
- Cache keys
- Property Manager rules
3. Tools & Accounts
- Akamai Control Center (UI)
- Property Manager
- Akamai Sandbox
- Akamai Logs
- APIs (Akamai Open APIs)
- Hands-on:
- Create a trial/mocked property in Sandbox
- Deploy a simple static site behind Akamai
## 📗 Phase 2 — Intermediate: Akamai Property Management (Week 3–4)
1. Property Manager Deep Dive
- Rule hierarchy
- Behaviors
- Match conditions
- Versioning & activation
- Staging vs production
- Best practices (small rules, grouping, naming conventions)
2. Caching in Detail
- TTL
- Cache keys (query params, cookies, headers)
- Ignoring vs including query parameters
- Stale-if-error
- Stale-while-revalidate
- No-store vs no-cache (see the header sketch below)
- Edge Side Includes (ESI) basics
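
The sketch below contrasts the freshness directives named above; the TTL numbers are illustrative assumptions, and how the edge honors each directive also depends on the property's caching behaviors.

```typescript
// Illustrative Cache-Control values for the directives above (the TTLs are
// made-up numbers, not recommendations).

// Cache at the edge for 10 minutes; if the origin is erroring, keep serving
// the stale copy for up to 1 day after it expires.
const resilient = "public, s-maxage=600, stale-if-error=86400";

// Serve a stale copy immediately for up to 60s past expiry while the cache
// revalidates with the origin in the background.
const fastButFresh = "public, max-age=300, stale-while-revalidate=60";

// no-cache: may be stored, but must be revalidated with the origin before
// every reuse. no-store: must not be written to any cache at all.
const alwaysRevalidate = "no-cache";
const neverCached = "no-store";

console.log({ resilient, fastButFresh, alwaysRevalidate, neverCached });
```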
3. Origins
- Defining origins
- HTTPS, certificates
- Forwarding rules
- Failover behavior
- Multiple origins (GCP, AWS, On-prem)
4. Purging
- Fast Purge
- Content tagging
- CP code purge
- Soft purge vs hard purge
- Pre-warming the cache
- Hands-on:
- Configure an Akamai property with custom cache rules
- Set up 2 origins with failover
- Test caching behaviors via curl + debug headers (see the request sketch below)
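
For the last hands-on item, the sketch below (TypeScript, Node 18+ ESM) sends the commonly documented Akamai debug Pragma values and prints the cache-related response headers. Whether these headers come back depends on how the property is configured (they are typically easiest to see against the staging network), and the URL is a placeholder.

```typescript
// Fetch an asset through Akamai with debug Pragma values and print the
// cache-related response headers. The target URL is a placeholder; the
// debug headers are only returned when the property allows them.
const url = "https://www.example.com/styles.css";

const res = await fetch(url, {
  headers: {
    Pragma:
      "akamai-x-cache-on, akamai-x-cache-remote-on, " +
      "akamai-x-check-cacheable, akamai-x-get-cache-key",
  },
});

// Expect values like "TCP_HIT from ..." on a hit or "TCP_MISS" on a miss.
for (const name of ["x-cache", "x-cache-key", "x-check-cacheable"]) {
  console.log(`${name}: ${res.headers.get(name)}`);
}
```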
## 📙 Phase 3 — Performance Optimization (Week 5)
1. Ion / DSA
- Dynamic site acceleration
- Route optimization
- Prefetching
- Pre-connect
- TCP optimizations
- Brotli and gzip
- HTTP/2 & HTTP/3 QUIC support
2. Image & Video Optimization
- Akamai Image Manager
- Responsive image policies
- WebP / AVIF transformations
3. mPulse
- RUM
- Core web vitals
- Synthetic monitoring vs RUM
- Third-party script impact
- Hands-on:
- Enable Brotli
- Convert all images automatically to WebP
- Add mPulse tracking to a test site
## 📕 Phase 4 — Security (Week 6)
1. WAF (Kona Site Defender)
- Managed vs custom rules
- OWASP Top 10 protections
- Rate controls
- IP/Geo blocking
- False positive management
- Tuning
- Testing attacks safely (Akamai Sandbox)
2. Bot Manager
- Good vs bad bots
- Bot categories
- Hard vs soft mitigation
- CAPTCHA, challenges, tarpits
- SEO bot allowlist management
3. Edge Authentication
- Akamai EdgeAuth tokens
- JWT validation at edge
- Signed URLs (see the HMAC sketch after this list)
- Token replay protection
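
To show the underlying idea behind signed URLs, here is a generic HMAC sketch in TypeScript (Node.js): sign the path and an expiry with a shared secret, append them as query parameters, and have the verifying side recompute and compare before serving. This is not Akamai's exact EdgeAuth token format; the parameter names, secret, and asset path are assumptions.

```typescript
// Generic signed-URL sketch (not the Akamai EdgeAuth wire format): one side
// generates the signed URL, the verifying side recomputes the HMAC before
// serving the protected asset. Secret and parameter names are assumptions.
import { createHmac, timingSafeEqual } from "node:crypto";

const SECRET = "replace-with-a-shared-secret";

function sign(path: string, expiresAt: number): string {
  const mac = createHmac("sha256", SECRET)
    .update(`${path}|${expiresAt}`)
    .digest("hex");
  return `${path}?exp=${expiresAt}&sig=${mac}`;
}

function verify(path: string, expiresAt: number, sig: string): boolean {
  if (Date.now() / 1000 > expiresAt) return false; // link has expired
  const expected = createHmac("sha256", SECRET)
    .update(`${path}|${expiresAt}`)
    .digest("hex");
  // Constant-time comparison avoids leaking signature bytes via timing.
  return (
    expected.length === sig.length &&
    timingSafeEqual(Buffer.from(expected), Buffer.from(sig))
  );
}

// Grant access to a protected asset for 5 minutes.
const exp = Math.floor(Date.now() / 1000) + 300;
const signedUrl = sign("/protected/report.pdf", exp);
const sig = new URL(signedUrl, "https://example.com").searchParams.get("sig") ?? "";
console.log(signedUrl, verify("/protected/report.pdf", exp, sig)); // ... true
```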
4. DDoS
- Layer 3/4 DDoS
- Layer 7 protection
- Traffic scrubbing
- Endpoint hardening
- Hands-on:
- Create a custom WAF rule
- Configure bot mitigation with challenges
- Implement signed URLs for a protected asset
## 📘 Phase 5 — Edge Compute (Week 7)
1. EdgeWorkers Basics
- JavaScript at the edge
- Routes and bundle structure
- Sandbox testing
- Logging
- Event model (onClientRequest, onOriginRequest, etc.; see the sketch below)
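
As a first feel for the event model, here is a minimal EdgeWorkers-style handler sketch in TypeScript, built around the documented onClientRequest hook and request.respondWith(). The matched path prefix and redirect target are assumptions, and a real bundle would also need its bundle.json manifest and an activation.

```typescript
// main.ts (EdgeWorkers sketch): onClientRequest runs before the edge cache
// lookup, so respondWith() can answer directly from the edge without
// contacting the origin. Paths and the target host are assumptions.

// Minimal structural type for the parts of the EdgeWorkers request object
// used here; the platform supplies the real object at runtime.
type EdgeRequest = {
  path: string;
  respondWith(status: number, headers: Record<string, string[]>, body: string): void;
};

export function onClientRequest(request: EdgeRequest) {
  if (request.path.startsWith("/old-blog/")) {
    const target =
      "https://www.example.com/blog/" + request.path.slice("/old-blog/".length);
    // Header values are arrays of strings in the EdgeWorkers request API.
    request.respondWith(301, { Location: [target] }, "");
  }
}
```

The same shape covers the Phase 5 hands-on redirect worker; other hooks such as onOriginRequest and onClientResponse follow the same pattern at different points in the request lifecycle.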
2. Common Use Cases
- Authentication and authorization
- Request/response manipulation
- A/B testing
- Header injection
- API token validation
- Redirects
- Dynamic rewriting
- Personalization at the edge
3. EdgeKV
- Data storage at the edge
- Key-value store structure
- TTL, namespaces
- Usage limits, cost (see the access sketch below)
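
A read-path sketch for EdgeKV inside an EdgeWorker, assuming the edgekv.js helper library and access token that Akamai's docs have you bundle with the worker. The namespace, group, and key names are assumptions, and the exact helper method signatures should be checked against the current EdgeKV documentation.

```typescript
// EdgeKV access sketch inside an EdgeWorker. Assumes the edgekv.js helper
// and its access token are bundled with the worker, per Akamai's docs.
// Namespace, group, and key names below are illustrative assumptions.
import { EdgeKV } from "./edgekv.js";

// Minimal structural type for the parts of the EdgeWorkers request used here.
type EdgeRequest = {
  getHeader(name: string): string[] | null;
  respondWith(status: number, headers: Record<string, string[]>, body: string): void;
};

const store = new EdgeKV({ namespace: "sessions", group: "default" });

export async function onClientRequest(request: EdgeRequest) {
  const sessionId = request.getHeader("x-session-id")?.[0];
  if (!sessionId) {
    request.respondWith(401, {}, "missing session");
    return;
  }
  // getText returns the stored string for the key, or the default value.
  const session = await store.getText({ item: sessionId, default_value: null });
  if (!session) {
    request.respondWith(401, {}, "unknown session");
  }
}
```

Writes follow the same pattern (a put-style call with a key and value), and every item is subject to the namespace's TTL and size limits.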
- Hands-on:
- Build a simple redirect worker
- Validate JWT at the edge
- Store session info using EdgeKV
## 📗 Phase 6 — Advanced Architecture (Week 8–9)
1. Multi-Origin & Traffic Routing
- Region-based routing
- Geo, device, A/B testing rules
- Blue/green rollouts using traffic throttling
- Failover patterns (active-active vs active-passive)
2. Enterprise Architecture Patterns
- Serving static assets entirely from the edge
- Micro frontends with edge-based routing
- Progressive migration from on-prem to cloud
- Using Akamai to protect backend APIs
- Multi-CDN architecture and traffic steering
- Cache resilience when the backend fails
3. Debugging & Observability
- Using debug headers (x-cache, x-serial, x-akam-shield)
- Log streaming (ELK, Splunk, Datadog)
- Diagnosing cache poisoning
- Troubleshooting edge failures
- Comparing origin vs edge behavior
4. Governance
- Naming conventions
- Version control
- Approval workflows
- API-first configuration model
- Managing hundreds of properties across the enterprise
- Hands-on:
- Design a blue/green deployment pipeline using Akamai
- Set up log streaming to Splunk
- Build a fallback edge page for when the origin is down
## 📕 Phase 7 — DevOps + Automation (Week 10)
1. Akamai APIs
- Authentication (EdgeGrid)
- PAPI (Property Manager API)
- Fast Purge API (see the purge sketch below)
- EdgeWorkers APIs
- Certificate management API
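
A sketch of calling the Fast Purge v3 endpoint from Node.js, assuming the akamai-edgegrid client and EdgeGrid credentials in an .edgerc file. The purged URLs, the .edgerc section, and the choice of the production network are assumptions, and the request shape should be verified against the current Fast Purge API documentation.

```typescript
// Fast Purge sketch: invalidate a set of URLs on the production network.
// Assumes EdgeGrid API credentials in an .edgerc file and the akamai-edgegrid
// Node client (default import requires esModuleInterop, or use require()).
import EdgeGrid from "akamai-edgegrid";

const eg = new EdgeGrid({ path: "/home/ci/.edgerc", section: "default" });

eg.auth({
  path: "/ccu/v3/invalidate/url/production",
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: {
    objects: [
      "https://www.example.com/styles.css",
      "https://www.example.com/index.html",
    ],
  },
});

eg.send((error, response, body) => {
  if (error) {
    console.error("purge failed", error);
    return;
  }
  // The request is accepted asynchronously; the purge then completes across
  // the edge network within seconds.
  console.log("purge submitted", body);
});
```

The same client and credential setup works for PAPI calls, which is what a CI/CD pipeline would use to create property versions and trigger activations.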
2. CI/CD Automation
- GitHub Actions
- Jenkins
- GitLab pipelines
- Automated rollouts to staging + production
- Policy-as-code using Terraform
3. Terraform for Akamai
- Providers
- Managing properties
- Activations
- EdgeWorkers automation
- Multi-environment setup
- Hands-on:
- Build a CI/CD pipeline to deploy Akamai properties using PAPI
- Trigger Fast Purge via the pipeline
- Deploy EdgeWorkers to staging + prod using automation
## 📘 Phase 8 — Architect-Level Case Studies (Optional)
In this phase you solve real-world scenarios:
- Case Study 1: Migrate an on-prem website to GCP using Akamai with 2% → 100% traffic throttling.
- Case Study 2: Design a zero-downtime global rollout using Akamai + multi-region cloud.
- Case Study 3: Secure an API backend using Akamai WAF, token-based auth, and EdgeWorkers.
- Case Study 4: Design high availability and automatic failover between AWS and on-prem.
- Case Study 5: Implement micro-frontend routing using Akamai edge logic.