This post was originally published on Light Reading
RA’ANANA, Israel – DriveNets – a leader in cloud-native networking solutions – today announced the introduction of DriveNets Network Cloud-AI, an innovative artificial intelligence (AI) networking solution designed to maximize the utilization of AI infrastructures and improve the performance of large-scale AI workloads. Built on DriveNets’ Network Cloud – which is deployed in the world’s largest networks – Network Cloud-AI has been validated by leading hyperscalers in recent trials as the most cost-effective Ethernet solution for AI networking. With this new offering, DriveNets is well-positioned to address the growing AI networking segment – a $10B market opportunity.
With the rapid growth of AI workloads, the networking solutions used in the fabric of AI clusters must evolve to maximize the utilization of costly AI compute resources. Simply put, AI workloads perform best when the network can operate at 100% utilization.
Until now, AI networks have been based either on a traditional Ethernet leaf-and-spine architecture, which was not designed to support high-performance AI workloads at scale, or on proprietary solutions such as Nvidia’s InfiniBand, which do not support network interoperability and offer little flexibility for hyperscalers looking to avoid “vendor lock-in.” DriveNets Network Cloud-AI
— Read the rest of this post, which was originally published on Light Reading.