Kgateway Lab: Gateway API inference extensions with kgateway
In this free, hands-on technical lab, dive into the Gateway API Inference Extension and learn how to route AI model requests using Kubernetes-native resources by:
- Installing and enabling the Gateway API Inference Extension with kgateway
- Deploying a lightweight, fake LLM workload to simulate inference traffic
- Creating and configuring `InferencePool` and `InferenceModel` resources
- Setting up a gateway and route to dynamically direct requests to model variants
- Testing real-world behavior with weighted routing and model switching
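As a rough sketch of what the `InferencePool` and `InferenceModel` steps involve, the resources might look like the following. This is an illustration only: it assumes the `inference.networking.x-k8s.io/v1alpha2` API version, and the names (`vllm-llama-pool`, `food-review`, the variant names and weights) are hypothetical. Field names follow the draft Inference Extension spec and may differ in the release the lab uses.

```yaml
# Sketch only: API group/version and field names follow a draft of the
# Gateway API Inference Extension and may differ in newer releases.
# All resource and model names below are hypothetical examples.
apiVersion: inference.networking.x-k8s.io/v1alpha2
kind: InferencePool
metadata:
  name: vllm-llama-pool
spec:
  # Select the (fake) LLM pods that serve inference traffic
  selector:
    app: vllm-llama
  targetPortNumber: 8000
  # Endpoint picker extension that chooses which pod gets each request
  extensionRef:
    name: vllm-llama-pool-epp
---
apiVersion: inference.networking.x-k8s.io/v1alpha2
kind: InferenceModel
metadata:
  name: food-review
spec:
  modelName: food-review
  criticality: Standard
  poolRef:
    name: vllm-llama-pool
  # Weighted split across model variants, e.g. a base model vs. a newer one
  targetModels:
    - name: food-review-v1
      weight: 80
    - name: food-review-v2
      weight: 20
```

With resources like these in place, a gateway route that targets the pool lets the extension split traffic 80/20 across the two variants, which is the weighted-routing behavior the lab exercises.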
Gain practical experience with the Inference Extension and see how it brings intelligent request routing for AI workloads to the Kubernetes Gateway API through the open source kgateway project.
