Scaling AI with Open Source: Running Models on CPUs and GPUs in VMs on OpenStack

Audience:
Data scientists and infrastructure managers

Unlock the power of open-source AI in a scalable and flexible environment by leveraging the capabilities of OpenStack. In this talk, Todd Robinson will explore how to run cutting-edge open-source AI models using both CPU and GPU resources within VMs on OpenStack. Discover the advantages of open-source tooling, learn how to optimize resource allocation for diverse AI workloads, and pick up best practices for scaling your AI infrastructure. We will also share insights from real-world implementations and discuss how a hosted OpenStack private cloud can enhance performance and security while controlling costs. Whether you're a data scientist or an infrastructure manager, you'll learn how to create a robust, AI-ready environment with OpenStack.
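
To give a taste of what launching such a workload looks like in practice, below is a minimal sketch using the openstacksdk Python library to boot a GPU-backed VM. It is illustrative only and not taken from the talk; the cloud name "mycloud", the flavor "gpu.a100.1", the image "ubuntu-22.04", and the network "private" are assumptions for the example, and a GPU flavor like this would have to be defined by the cloud operator with PCI passthrough or vGPU support configured.

import openstack

# Connect using credentials from clouds.yaml (the cloud name is an assumption)
conn = openstack.connect(cloud="mycloud")

# Look up a GPU-enabled flavor, an image, and a network by name
# (all names below are hypothetical examples)
flavor = conn.compute.find_flavor("gpu.a100.1")
image = conn.image.find_image("ubuntu-22.04")
network = conn.network.find_network("private")

# Boot the VM and wait until it reaches ACTIVE
server = conn.compute.create_server(
    name="ai-inference-01",
    flavor_id=flavor.id,
    image_id=image.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.status)

The same pattern works for CPU-only instances; only the flavor name changes, which is one reason flavors are a convenient knob for matching resources to different AI workloads on OpenStack.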

Time:
Tuesday, December 17, 2024 - 11:00