Mellanox InfiniBand Provides the Networking Foundation for the University of Cambridge's HPC Cloud

40Gb/s InfiniBand-Accelerated Solution Centre Provides Academic and Commercial Researchers with Readily Accessible Performance for Multiple Scientific Applications

Sunnyvale, Calif. and Yokneam, Israel – Aug. 2, 2010 – Mellanox® Technologies, Ltd. (NASDAQ: MLNX; TASE: MLNX), a leading supplier of high-performance, end-to-end connectivity solutions for data center servers and storage systems, today announced that its industry-leading end-to-end 40Gb/s InfiniBand connectivity products, including ConnectX®-2 adapter cards, the BridgeX gateway system, 40Gb/s InfiniBand switches and cables, provide the high-performance server and storage networking for the new Dell | Cambridge High-Performance Computing Solution Centre. On the University campus, the HPC cluster provides on-demand access for more than 400 internal users spread across 70 research groups, ranging from traditional sciences such as chemistry, physics and biology to rapidly growing areas of HPC-based research such as biomedicine, clinical medicine and the social sciences.

The University of Cambridge is a world-leading teaching and research institution, consistently ranked among the top three universities worldwide. The University also forms the central hub of Europe’s largest technology cluster, with more than 1,200 technology companies located in science parks surrounding the city, and is home to Europe’s largest bio-technology centre.

“The Solution Centre provides accessible research computing services and technology to academic and private-sector organizations that would otherwise not have the money or expertise,” said Dr. Paul Calleja, Director of the HPC Service, University of Cambridge. “Incorporating Mellanox 40Gb/s InfiniBand, with its GPU efficiency and offloading capabilities, as the clustering backbone of the Solution Centre was essential to our cloud provision requirements and provides our research community with leading application performance to enhance and accelerate the wide range of their research.”

“InfiniBand connectivity enables HPC applications to achieve significant performance improvements, enabling researchers to get faster results,” said Marc Sultzbaugh, vice president of worldwide sales at Mellanox Technologies. “By integrating Mellanox 40Gb/s InfiniBand interconnects, the University of Cambridge can deliver on-demand high-performance cluster access and industry-leading application performance to its large pool of academic and commercial researchers.”

Mellanox’s end-to-end InfiniBand connectivity, consisting of the ConnectX®-2 line of I/O adapter products, gateways, cables and comprehensive IS5000 family of fixed and modular switches, delivers industry-leading bandwidth, efficiency and economics for the best return-on-investment for performance interconnects. Mellanox provides its worldwide customers with the broadest, most advanced and highest performing end-to-end networking solutions for the world’s most compute-demanding applications.

About Mellanox
Mellanox Technologies is a leading supplier of end-to-end connectivity solutions for servers and storage that optimize data center performance. Mellanox products deliver market-leading bandwidth, performance, scalability, power conservation and cost-effectiveness while converging multiple legacy network technologies into one future-proof solution. For the best in performance and scalability, Mellanox connectivity solutions are a preferred choice for Fortune 500 data centers and the world’s most powerful supercomputers. Founded in 1999, Mellanox Technologies is headquartered in Sunnyvale, California and Yokneam, Israel. For more information, visit the Mellanox website.

Mellanox, BridgeX, ConnectX, InfiniBlast, InfiniBridge, InfiniHost, InfiniRISC, InfiniScale, InfiniPCI, PhyX, and Virtual Protocol Interconnect are registered trademarks of Mellanox Technologies, Ltd. CORE-Direct and FabricIT are trademarks of Mellanox Technologies, Ltd. All other trademarks are property of their respective owners.