[Image caption: Nvidia co-founder and CEO Jensen Huang delivers the first keynote speech of Computex 2025.]
Nvidia just reclaimed its title as the world’s most valuable company. Whether it retains this top position and for how long depends on its success in defining and developing a worldwide network of AI processing units.
Nvidia is pursuing a vision of a future where “part of the application runs in the data center, another part in a data center at the edge, and another part in an autonomous machine roaming around the world.” This is how Jensen Huang, Nvidia’s co-founder and CEO, described the future of computer applications in a conversation with Bob Metcalfe, the inventor of Ethernet. Huang and Metcalfe are prime examples of the remarkable marriage of engineering ingenuity and marketing creativity that has made many American entrepreneurs successful.
As early as five years ago, Huang saw the data center as “a composable disaggregated infrastructure,” where the critical path is the interaction of one “computing node” with another “computing node” over the Ethernet network. In response, Metcalfe asked, “Is this why you bought Mellanox?” and Huang answered, “It is exactly the reason why I bought Mellanox,” adding a great insight: “Understanding the direction of software inspires you about what’s the best way to design and evolve hardware.” In other words, anticipating how applications will be developed and run in the future, Nvidia has added to its portfolio (developing in-house or acquiring) new hardware elements so that it can offer its customers faster, more efficient, more resilient, and less expensive shuttling of data inside and outside the data center.
Founded in Israel in 1999, Mellanox initially focused on developing computer networking products based on the then-new InfiniBand standard. These products featured high throughput and low latency, ensuring fast data movement between one “computing node” and another. Mellanox later added networking products based on the Ethernet standard and was acquired by Nvidia for $6.9 billion in 2019.
Kevin Deierling, the first Mellanox employee in the U.S., is now Nvidia’s senior vice president of networking. Nvidia’s networking division develops and sells the Spectrum-X networking platform, which the company calls “the world’s first Ethernet networking platform for AI.”
Deierling explains that the unique nature of data processing for AI makes the capabilities of the network critical. Cloud computing serves millions of users, each transferring a small amount of data, and that data is completely unsynchronized. In contrast, AI—and Nvidia’s processing units, or GPUs—do things in parallel. “With AI workloads,” says Deierling, “we have enormous, what we call elephant [data] flows, that are synchronized.” Each of the vast number of AI computing nodes operates on its part of the data and then shares all the data it’s processed with the other nodes. “That ends up being extremely bursty traffic,” observes Deierling.
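The synchronized burst Deierling describes can be illustrated with a toy simulation: each "node" computes on its own shard of data, waits at a barrier until every peer has finished, and then all nodes begin transmitting their results at once. This is a minimal sketch (the node count, timings, and barrier-based synchronization are illustrative assumptions, not Nvidia's implementation):

```python
import threading
import time

NUM_NODES = 4          # hypothetical number of GPU "computing nodes"
barrier = threading.Barrier(NUM_NODES)
send_times = []        # when each node starts transmitting its results
lock = threading.Lock()

def node(rank: int) -> None:
    # Compute phase: each node works on its own shard of the data,
    # finishing at a slightly different time.
    time.sleep(0.01 * (rank + 1))
    # Synchronization point: no node exchanges results until all are ready.
    barrier.wait()
    # Exchange phase: every node sends its partial results to every other
    # node at essentially the same instant -- the "elephant flow" burst.
    with lock:
        send_times.append(time.monotonic())

threads = [threading.Thread(target=node, args=(r,)) for r in range(NUM_NODES)]
for t in threads:
    t.start()
for t in threads:
    t.join()

spread = max(send_times) - min(send_times)
print(f"all {NUM_NODES} nodes began sending within {spread * 1000:.2f} ms of each other")
```

Because every node clears the barrier together, the send timestamps cluster within a few milliseconds. That clustering is why AI traffic arrives at the network in massive synchronized bursts rather than the smooth, unsynchronized trickle of ordinary cloud workloads.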
The second trend driving the need for Spectrum-X’s capabilities is the shift in the focus of AI projects. Until recently, AI work mainly involved “training,” feeding an AI model vast amounts of data to learn patterns and relationships. Enterprises are now moving to “inference,” or using the trained model to process new data, make predictions, or take action.
With inferencing, many customers share the same network infrastructure, increasing performance expectations and requirements. The Spectrum-X platform answers these demands, bringing InfiniBand-class bandwidth and latency to Ethernet. The significant benefit of using Ethernet to connect all the components of the AI infrastructure (the data storage unit, the network moving the data, and the data processing units, or GPUs) is that it is a widely deployed standard familiar to the many customers now investing in AI. Spectrum-X “uses standard Ethernet protocols,” says Deierling, “but it does things under the hood that make it extremely high performance. The largest AI supercomputer in the world today is based on our Spectrum-X platform.”
The faster and more efficient data movement in the data center implies increased profits for the service provider. “If you’re offering an AI service, you’re extremely interested in the performance per dollar and the performance per watt of the data center,” says Deierling. In addition, Spectrum-X allows the data center to offer a customized service, adjusting the network’s performance based on the varying needs of different end-users and, of course, on what they pay.
Deierling reports that enterprises are rapidly adopting AI agents, adapting them by adding their proprietary data to a model trained on what’s found on the internet. Especially in the context of AI research agents, that’s a sure way to reduce AI “hallucinations” and comply with regulations. “The next wave we’re starting to see is physical AI, edge applications, and robotics,” says Deierling, with the Ethernet connecting everything from the cloud to enterprise data centers to mobile and stationary sensors.
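The grounding step Deierling alludes to is commonly implemented as retrieval: before the model answers, the agent fetches the most relevant proprietary documents and constrains its answer to them. A minimal sketch, assuming a tiny in-memory document store and naive keyword-overlap scoring (production systems use vector embeddings; all names and documents below are invented for illustration):

```python
# Illustrative proprietary documents an enterprise might add on top of a
# general-purpose model; these strings are invented placeholders.
PROPRIETARY_DOCS = [
    "Q3 revenue grew 12% driven by the enterprise networking segment.",
    "Internal policy: all customer data must stay in EU data centers.",
    "The Spectrum-X rollout finished ahead of schedule in March.",
]

def score(query: str, doc: str) -> int:
    # Naive relevance metric: count words the query and document share.
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, k: int = 1) -> list[str]:
    # Return the k documents most relevant to the query.
    ranked = sorted(PROPRIETARY_DOCS, key=lambda d: score(query, d), reverse=True)
    return ranked[:k]

context = retrieve("Where must customer data be stored?")[0]
# The retrieved document is prepended to the model's prompt, so the answer
# is grounded in the enterprise's own data rather than the model's guesses.
prompt = f"Answer using only this context:\n{context}\nQuestion: ..."
print(context)
```

Because the model is told to answer only from retrieved internal documents, it is far less likely to invent facts, which is the hallucination-reduction effect Deierling points to.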
“The Network is the Computer” was the 1984 tagline of Sun Microsystems, a maker of “workstations,” or networked desktop computers. Nvidia’s founders played flight-simulator games together and “theorized that the killer app would be virtual reality, video games, and 3D games,” Huang told Metcalfe, and that “everybody would want to be a gamer.”
Four decades later, with AI constituting “a new way of writing software,” everybody would want to be a coder, writing applications for the composable disaggregated infrastructure developed and maintained by Nvidia and its partners. “We found ourselves at the right place at the right time. Part insight, part strategy, part serendipity,” said Huang.

