Cisco Systems unveiled networking chips for AI supercomputers on Tuesday, putting it in direct competition with offerings from Broadcom and Marvell Technology.
Cisco said five of the six major cloud providers are testing the chips, without specifying which companies are involved. According to BofA Global Research, Amazon Web Services, Microsoft Azure, and Google Cloud are the top three cloud providers in the market.
The speed at which individual chips communicate with one another has become a crucial factor as AI applications like ChatGPT, which are powered by networks of specialized chips called graphics processing units (GPUs), grow in popularity.
Cisco is a prominent provider of networking equipment whose Ethernet switches connect computers, laptops, routers, servers, and printers to local area networks.
The company said its G200 and G202 Ethernet switching chips deliver twice the performance of the previous generation and can connect up to 32,000 GPUs together.
Cisco said the G200 and G202 will be the most powerful networking chips on the market, enabling a power-efficient network for AI/ML workloads.
According to Cisco, the chips can perform AI and machine learning tasks with 40 percent fewer switches and lower latency, while also being more energy efficient.