
Cloud-Native Computing Is Poised To Explode

by BeauHD, from Slashdot on (#71JNH)
An anonymous reader quotes a report from ZDNet: At KubeCon North America 2025 in Atlanta, leaders of the Cloud Native Computing Foundation (CNCF) predicted an enormous surge in cloud-native computing, driven by the explosive growth of AI inference workloads. How much growth? They're predicting hundreds of billions of dollars in spending over the next 18 months. [...]

Cloud-native computing and AI inference come together when AI is no longer a separate track from cloud-native computing. Instead, AI workloads, particularly inference tasks, are fueling a new era in which intelligent applications require scalable, reliable infrastructure. That era is unfolding because, said [CNCF Executive Director Jonathan Bryce], "AI is moving from a few 'Training supercomputers' to widespread 'Enterprise Inference.' This is fundamentally a cloud-native problem. You, the platform engineers, are the ones who will build the open-source platforms that unlock enterprise AI."

"Cloud native and AI-native development are merging, and it's really an incredible place we're in right now," said CNCF CTO Chris Aniszczyk. The data backs up this view: Google, for example, has reported that its internal inference jobs recently processed 1.33 quadrillion tokens per month, up from 980 trillion just months before. [...]

Aniszczyk added that cloud-native projects, especially Kubernetes, are adapting to serve inference workloads at scale: "Kubernetes is obviously one of the leading examples; as of the last release, the dynamic resource allocation feature enables GPU and TPU hardware abstraction in a Kubernetes context."

To better meet the demand, the CNCF announced the Certified Kubernetes AI Conformance Program, which aims to make AI workloads as portable and reliable as traditional cloud-native applications. "As AI moves into production, teams need a consistent infrastructure they can rely on," Aniszczyk stated during his keynote. "This initiative will create shared guardrails to ensure AI workloads behave predictably across environments. It builds on the same community-driven standards process we've used with Kubernetes to help bring consistency as AI adoption scales."

What all this means for business is that AI inference spending on cloud-native infrastructure and services will reach into the hundreds of billions within the next 18 months, as CNCF leaders predict enterprises will race to stand up reliable, cost-effective AI services.
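The dynamic resource allocation (DRA) feature Aniszczyk mentions lets workloads request accelerators through a vendor-neutral claims API rather than node-specific device counts. As a hedged sketch only: the `gpu.example.com` device class and the container image below are hypothetical placeholders (in practice the class is supplied by a vendor's DRA driver), and the `resource.k8s.io` API group has evolved across recent Kubernetes releases, so field names may differ by version. A ResourceClaim and a pod consuming it might look roughly like:

```yaml
# Hypothetical ResourceClaim: asks for one device from a vendor-provided
# DeviceClass. The class name is an assumption, not from the article.
apiVersion: resource.k8s.io/v1beta1
kind: ResourceClaim
metadata:
  name: inference-gpu
spec:
  devices:
    requests:
    - name: gpu
      deviceClassName: gpu.example.com
---
# Pod that consumes the claim; the container receives whatever GPU/TPU
# the driver allocates, with no hardware specifics in the pod spec.
apiVersion: v1
kind: Pod
metadata:
  name: inference-worker
spec:
  resourceClaims:
  - name: gpu
    resourceClaimName: inference-gpu
  containers:
  - name: worker
    image: example.com/inference-server:latest  # hypothetical image
    resources:
      claims:
      - name: gpu
```

The point of the indirection is the "hardware abstraction" the quote describes: the same manifest can schedule onto GPUs or TPUs, with the vendor driver resolving the claim on whichever node satisfies it.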


