Some Cloud-based AI Systems are Returning to On-premises Data Centers

by hubie (#62903)

upstart writes:

Some cloud-based AI systems are returning to on-premises data centers:

Cloud computing revolutionized AI and machine learning, not because the hyperscalers invented it but because they made it affordable. Nevertheless, I and others are seeing a shift in thinking about where to host AI/ML processing and the data coupled to it. For the past few years, using the public cloud providers was pretty much a no-brainer. These days, the value of hosting AI/ML and the data it needs on public cloud providers is being called into question. Why?

Cost, of course. Many businesses have built game-changing AI/ML systems in the cloud, and when they get the cloud bills at the end of the month, they quickly understand that hosting AI/ML systems, including terabytes or petabytes of data, is pricey. Moreover, data egress and ingress costs (what you pay to move data out of your cloud provider to your own data center or another cloud provider, and to bring it back in) will run up that bill significantly.
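For a sense of scale, here is a back-of-the-envelope egress calculation; the per-GB rate and data volumes below are illustrative assumptions, not figures from the article:

```python
# Hypothetical egress-cost estimate; the rate and volumes are illustrative assumptions.
EGRESS_RATE_PER_GB = 0.09   # assumed $/GB, in line with typical published on-demand tiers
TERABYTE_GB = 1024

def monthly_egress_cost(tb_moved_per_month: float) -> float:
    """Estimated monthly charge for moving data out of a cloud provider."""
    return tb_moved_per_month * TERABYTE_GB * EGRESS_RATE_PER_GB

if __name__ == "__main__":
    for tb in (1, 10, 100):
        print(f"{tb:>4} TB/month egress ~= ${monthly_egress_cost(tb):,.0f}")
```

At an assumed $0.09/GB, moving 100 TB out every month works out to roughly $9,200 in egress fees alone, before any compute or storage charges.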

[...] First, the cost of traditional compute and storage equipment has fallen a great deal in the past five years or so. If you've never used anything but cloud-based systems, let me explain. We used to go into rooms called data centers where we could physically touch our computing equipment, equipment that we had to purchase outright before we could use it. I'm only half kidding.

When it comes down to renting versus buying, many are finding that traditional approaches, even with the burden of maintaining your own hardware and software, are actually much cheaper than the ever-increasing cloud bills.
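As a rough illustration of that rent-versus-buy trade-off, the sketch below compares a steady monthly cloud spend against an up-front hardware purchase plus ongoing operating costs; every figure is a placeholder assumption, not data from the article:

```python
# Rough rent-vs-buy break-even sketch; all figures are placeholder assumptions.
CLOUD_MONTHLY = 40_000      # assumed steady cloud bill ($/month)
HARDWARE_CAPEX = 600_000    # assumed up-front GPU server and storage purchase ($)
ONPREM_MONTHLY = 12_000     # assumed power, space, and staff overhead ($/month)

def cumulative_cloud(months: int) -> float:
    return CLOUD_MONTHLY * months

def cumulative_onprem(months: int) -> float:
    return HARDWARE_CAPEX + ONPREM_MONTHLY * months

# Find the first month at which owning becomes cheaper than renting.
month = 1
while cumulative_onprem(month) > cumulative_cloud(month):
    month += 1
print(f"Break-even at month {month}: "
      f"cloud ${cumulative_cloud(month):,.0f} vs on-prem ${cumulative_onprem(month):,.0f}")
```

With these placeholder numbers the purchase pays for itself in under two years; the actual break-even point depends entirely on workload, utilization, and staffing costs.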

Read more of this story at SoylentNews.
