Need GPUs? Take a look at microclouds

This post was originally published on InfoWorld.

As most IT people know, GPUs are in high demand and are critical for running and training generative AI models. The alternative cloud sector, also known as microclouds, is experiencing a significant surge. Businesses such as CoreWeave, Lambda Labs, Voltage Park, and Together AI are at the forefront of this movement. CoreWeave, which started as a cryptocurrency mining venture, has become a major provider of GPU infrastructure.

This shift illustrates a broader trend: companies increasingly rely on cloud-hosted GPU services, largely because of the high cost and technical demands of installing and maintaining the necessary hardware on-site. Since the major public cloud providers are not discounting these compute services, microclouds offer a more affordable path for many enterprises.

Why don’t we stick with “traditional” cloud services provided by AWS, Google Cloud, and Microsoft Azure, which also offer a range of GPU resources? The answer, as usual, is money. Microclouds are often a more cost-effective solution for AI projects that require GPUs. The cost of renting popular GPUs such as Nvidia’s A100 40GB can be significantly lower on CoreWeave or another microcloud platform than on Azure or Google Cloud (check current pricing; this is a very general observation).
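The cost argument is ultimately back-of-the-envelope arithmetic: hourly rate times GPU count times hours used. Here is a minimal Python sketch of that math. The hourly rates, GPU count, and utilization figure are hypothetical placeholders, not quoted prices from any provider, so plug in current published pricing before drawing conclusions.

```python
# Back-of-the-envelope GPU rental cost comparison.
# All rates below are HYPOTHETICAL placeholders, not published prices.
# Check each provider's current pricing before relying on this.

HOURS_PER_MONTH = 730  # average hours in a calendar month

# Hypothetical on-demand rates per A100 40GB GPU-hour (USD)
hypothetical_rates = {
    "microcloud_example": 2.00,
    "hyperscaler_example": 3.50,
}

gpus = 8           # e.g., one 8-GPU training node
utilization = 0.6  # fraction of the month the node is actually busy

for provider, rate in hypothetical_rates.items():
    monthly_cost = rate * gpus * HOURS_PER_MONTH * utilization
    print(f"{provider}: ~${monthly_cost:,.0f} per month")
```

Even a difference of a dollar or so per GPU-hour compounds quickly across multi-GPU nodes and long-running training jobs, which is why these per-hour gaps matter at the project level.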

Enterprises, be wary

Despite this sector’s vibrancy,

Read the rest of this post, which was originally published on InfoWorld.
