Speaker
Description
This hands-on workshop demonstrates how to run AI workloads on Kubernetes using MicroK8s on Ubuntu. Participants will build a lightweight AI inference service, containerize it, deploy it to a local MicroK8s cluster, and implement scaling and observability using native Kubernetes tooling.
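To make the scope concrete, the "lightweight AI inference service" described above could be sketched as a stdlib-only HTTP server wrapping a trivial scoring function. Everything here is an illustrative assumption, not the workshop's actual code: the `predict` function, the hard-coded weights, the `/predict` route, and port 8080 are all placeholders a participant would replace with a real model.

```python
import json
import math
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative "model": a fixed logistic-regression scorer.
# A real workshop build would load a trained model artifact instead.
WEIGHTS = [0.8, -0.4]
BIAS = 0.1

def predict(features):
    """Return a probability in (0, 1) for a numeric feature vector."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return 1.0 / (1.0 + math.exp(-z))

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/predict":
            self.send_error(404)
            return
        # Read the JSON request body and score it.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        score = predict(payload["features"])
        body = json.dumps({"score": score}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

def serve(port=8080):
    """Start the blocking HTTP server; a container entrypoint would call this."""
    HTTPServer(("0.0.0.0", port), Handler).serve_forever()
```

Once containerized, the service could be exercised with something like `curl -X POST localhost:8080/predict -d '{"features": [0.2, 0.5]}'` before being pushed to the cluster.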
The session focuses on production discipline: resource isolation, image optimization, autoscaling, secrets management, and upgrade considerations. Although the workshop runs on CPU-only systems, it includes architectural guidance for integrating GPU scheduling in production environments.
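The production concerns listed above (resource isolation, autoscaling, secrets) typically surface as fields in a Deployment plus a HorizontalPodAutoscaler. The manifest below is a hedged sketch only: the names, image tag, and thresholds are assumptions, not the workshop's published configuration, though the `localhost:32000` registry address matches the MicroK8s `registry` addon default.

```yaml
# Illustrative manifest; names, image tag, and thresholds are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: inference
spec:
  replicas: 1
  selector:
    matchLabels: {app: inference}
  template:
    metadata:
      labels: {app: inference}
    spec:
      containers:
        - name: inference
          image: localhost:32000/inference:0.1   # MicroK8s built-in registry
          resources:
            requests: {cpu: 250m, memory: 256Mi} # resource isolation
            limits:   {cpu: "1",  memory: 512Mi}
          envFrom:
            - secretRef: {name: inference-secrets}  # secrets management
---
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: inference
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: inference
  minReplicas: 1
  maxReplicas: 4
  metrics:
    - type: Resource
      resource:
        name: cpu
        target: {type: Utilization, averageUtilization: 70}
```

Note that CPU-based autoscaling only works once metrics collection is available, e.g. after `microk8s enable metrics-server`.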
Attendees will leave with a fully functional Kubernetes-based AI deployment and a clear roadmap for scaling it using Ubuntu infrastructure.
Justification
This workshop will be valuable to UbuCon attendees because it provides a practical, Ubuntu-native pathway to running AI workloads on Kubernetes using MicroK8s. Rather than discussing AI or cloud infrastructure in abstract terms, participants will gain hands-on experience deploying, scaling, and observing a containerized AI service on a real cluster built on Ubuntu. The session reinforces the role of Ubuntu and Canonical’s Kubernetes tooling in modern AI infrastructure while equipping developers, students, and engineers with immediately applicable skills. By focusing on reproducibility, resource efficiency, and production discipline within the hardware constraints common in African environments, it aligns strongly with UbuCon’s mission of empowering communities through open-source technology.
| Submission type | Workshop |
|---|---|
| Technical level | Intermediate |
| Where are you based? | Nigeria |