InfiniBand More Than Doubles Usage on TOP500, Taking Share from Ethernet and Proprietary Interconnects

InfiniBand-based supercomputer clusters up 105% since June 2006 and 173% since November 2005

SUPERCOMPUTING 2006, TAMPA, FLORIDA, NOVEMBER 13, 2006 – Mellanox™ Technologies Ltd., a leading supplier of semiconductor-based high-performance interconnect products, announced today that InfiniBand continues to be the fastest-growing cluster interconnect, according to the 28th edition of the TOP500 list of the world’s most powerful computers. Eighty-two supercomputers on the list are powered by InfiniBand interconnects – more than double the number of less than six months ago (40 supercomputers reported using InfiniBand in June 2006).

Published twice a year and publicly available online, the TOP500 list ranks the most powerful computer systems according to the Linpack benchmark and is an industry-respected report that indicates usage trends in computing and interconnect solutions. Highlights of InfiniBand interconnect usage on the November 2006 TOP500 list include:

  • Three of the top ten and five of the top twelve systems use InfiniBand – all based on Mellanox adapter and switch silicon.
  • InfiniBand is the most used high-speed, low-latency interconnect and has the greatest increase in the rate of usage.
  • InfiniBand interconnect usage has surpassed that of the proprietary Myrinet interconnect, whose presence on the list has declined steadily since June 2005.
  • The average efficiency of all reported InfiniBand-based supercomputers is 71%, significantly higher than the 51% average efficiency of Gigabit Ethernet-connected clusters.
  • New InfiniBand entries are replacing technologies such as proprietary Myrinet (down 10%) and lower-performing Gigabit Ethernet (down 16%).
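The efficiency figures above follow the standard TOP500 convention: a system's efficiency is its measured Linpack performance (Rmax) divided by its theoretical peak performance (Rpeak). A minimal sketch, using illustrative numbers rather than figures from the November 2006 list:

```python
def linpack_efficiency(rmax_tflops: float, rpeak_tflops: float) -> float:
    """Return Linpack efficiency (Rmax / Rpeak) as a percentage."""
    return 100.0 * rmax_tflops / rpeak_tflops

# Hypothetical system: 8.5 TFlops measured (Rmax) on 12.0 TFlops peak (Rpeak)
print(round(linpack_efficiency(8.5, 12.0), 1))  # → 70.8
```

A higher percentage means the interconnect and software stack let the cluster deliver more of its theoretical compute capability on the benchmark.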

“To see InfiniBand’s usage more than double on the TOP500 in six months is a leading indicator of the demand for Mellanox’s industry-standard, high-performance, low-latency interconnect in place of proprietary and lower performance solutions,” said Eyal Waldman, president and CEO of Mellanox Technologies. “We believe these same interconnect trends flow into enterprise data centers that face similar challenges with various applications including financial, clustered database, CAE, EDA, digital content creation, utility computing, retail, and supply and demand management. The increasing usage of Mellanox’s products is based on industry-leading price/performance as an interconnect for server and storage systems, proven scalability on multi-thousand node clusters, and compatibility with a mature ecosystem of software support.”

About Mellanox

Mellanox Technologies is a leading supplier of semiconductor-based, high-performance interconnect products that facilitate data transmission between servers and storage systems through communications infrastructure equipment. Our products are an integral part of a total solution focused on computing, storage and communication applications used in enterprise data centers, high-performance computing and embedded systems. Based on InfiniBand technology, our field-proven adapter and switch integrated circuits deliver industry-leading performance and capabilities, and serve as the building blocks for creating reliable and scalable interconnect solutions.

Founded in 1999, Mellanox Technologies is headquartered in Santa Clara, California and Yokneam, Israel. For more information on Mellanox’s solutions, please visit the company’s website.

Mellanox is a registered trademark of Mellanox Technologies, Inc. and ConnectX, InfiniBlast, InfiniBridge, InfiniHost, InfiniRISC, InfiniScale, and InfiniPCI are trademarks of Mellanox Technologies, Inc. All other trademarks are property of their respective owners.

For more information:
Mellanox Technologies, Inc.
Brian Sparks
