There’s more to cloud architecture than GPUs

This post was originally published on InfoWorld

Talk to anybody about generative AI in the cloud, and the conversation quickly turns to GPUs (graphics processing units). But that focus may be misplaced. GPUs do not matter as much as people think they do, and in a few years, the conversation will likely shift to what is far more critical to the development and deployment of generative AI systems in the cloud.

The current assumption is that GPUs are indispensable for performing the complex computations required by generative AI models. While GPUs have been pivotal in advancing AI, overemphasizing them could detract from exploring and leveraging equally effective and potentially more sustainable alternatives. Indeed, GPUs could quickly become commodities like the other resources AI systems need, such as storage and compute. The focus should be on designing and deploying these systems, not just the hardware they run on. Call me crazy.

GPU gold rush

The rise of GPUs has worked out well for Nvidia, a company most people paid little attention to until now. In its most recent quarter, Nvidia posted record-high data center revenue of $14.5 billion, up 41% from the prior quarter and 279% from the year-ago quarter. Its GPUs are now the standard in…

Read the rest of this post, which was originally published on InfoWorld.
