This is part two of our series on Tackling AI Together. We’re looking at this new challenge/opportunity in the cloud native world and asking how we can work together – and bring in new voices – to get the most out of artificial intelligence. Read part one.
As organizations around the world consider how best to leverage artificial intelligence, we believe that cloud native is the bedrock of the AI movement: it is the epicenter of innovation and the glue that holds the AI stack together. Six years ago, OpenAI spoke at KubeCon Berlin about Building the Infrastructure that Powers the Future of AI, and since then we have heard more and more about how foundational Kubernetes and cloud native are to AI.
Priyanka Sharma, executive director of CNCF, shared her thoughts at KubeCon in Chicago, including:
- Cloud native applications optimize resource usage, which is critical for keeping AI workloads efficient while remaining highly performant. Kubeflow, an incubation-level project that supports ML pipelines and MLOps, is a perfect example.
- AI needs a reliable technology stack, and cloud native applications are built to support the freedom and flexibility of deploying workloads anywhere, even at the event-driven level with projects like KEDA.
- Cloud native application development demands rapid testing and iteration to enable even more rapid deployments – practices that are equally important for AI development.
- Transparency is a core cloud native/open source value and practice that is critically important when adopting new technologies like artificial intelligence. Organizations need to clearly communicate values, responsibilities and priorities in order to create AI workflows that are safe, ethical and open.
- The transition to AI can feel risky, but a reliance on the cloud native ecosystem can help. Cloud native lowers the barriers to entry through open interfaces and standards and offers a hefty ecosystem of tools, runtimes, and community wisdom. Replace vendor lock-in with a smorgasbord of choices.
- Cloud native is synonymous with cutting-edge technologies, so it’s the obvious place to turn for the latest ML/AI support and features.
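To make the event-driven scaling point above concrete: KEDA-style autoscalers translate an event-source metric (such as queue depth) into a replica count, scaling down to zero when there is no work. The sketch below is a simplified illustration of that idea, not KEDA’s actual implementation; the function name and default values are hypothetical.

```python
import math

def desired_replicas(queue_length: int, target_per_replica: int,
                     min_replicas: int = 0, max_replicas: int = 10) -> int:
    """Illustrative sketch of event-driven scaling in the spirit of KEDA.

    queue_length: current depth of the event source (e.g. a message queue)
    target_per_replica: how many queued events one replica should handle
    """
    if queue_length == 0:
        # No pending events: scale to zero when min_replicas allows it.
        return min_replicas
    # One replica per target_per_replica events, rounded up...
    raw = math.ceil(queue_length / target_per_replica)
    # ...clamped to the configured bounds.
    return max(min_replicas, min(raw, max_replicas))
```

For example, with a target of 5 events per replica, a queue of 12 events yields 3 replicas, an empty queue scales to 0, and a burst of 1,000 events is capped at the 10-replica maximum.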
And if all that’s not convincing enough, just look at the cloud native community. We stepped up and moved fast through the pandemic and past economic downturns, and all of that has translated into even more momentum going forward.
In fact, the cloud native open source community is larger than it’s ever been, with more than 170 projects and over 220,000 contributors to CNCF. It may have started with Kubernetes almost 10 years ago, but it clearly hasn’t ended there.
Cloud native built the model for how everyone can contribute over these last ten years, and now we’re helping to build the infrastructure for the next chapter of the web and the future of AI.
Next up: Joanna Lee, CNCF VP of strategic programs and legal, on best practices and legal risks of using generative AI in software development.
Want to know more about cloud native and AI? Join us on 20 March 2024 in Paris at KubeCon + CloudNativeCon Europe 2024 for more cloud native and AI focused content, including Cloud Native AI Day and our second-ever AI Hub!