Gilad Shainer, VP of Marketing
Dec 2013
Interconnect Your Future
Leading Supplier of End-to-End Interconnect Solutions
• MXM – Mellanox Messaging Acceleration
• FCA – Fabric Collectives Acceleration
Management
• UFM – Unified Fabric Management
Storage and Data
• VSA – Storage Accelerator (iSCSI)
• UDA – Unstructured Data Accelerator
Comprehensive End-to-End Software Accelerators and Management
ICs | Adapter Cards | Switches/Gateways | Cables/Modules | Host/Fabric Software
Comprehensive End-to-End InfiniBand and Ethernet Portfolio
Metro / WAN
Mellanox InfiniBand Paves the Road to Exascale Computing
Accelerating Half of the World’s Petascale Systems
Mellanox Connected Petascale System Examples
TOP500 InfiniBand-Accelerated Petascale-Capable Machines
• Mellanox FDR InfiniBand systems tripled from Nov ’12 to Nov ’13
• Accelerating 63% of the Petaflop-capable systems (12 of 19 systems)
FDR InfiniBand Delivers Highest Return on Investment
[Charts: four application benchmarks; higher is better in each. Source: HPC Advisory Council]
Architectural Foundation for Exascale Computing
Connect-IB
Mellanox Connect-IB: The World’s Fastest Adapter
• The 7th generation of Mellanox interconnect adapters
• World’s first 100Gb/s interconnect adapter (dual-port FDR 56Gb/s InfiniBand)
• Delivers 137 million messages per second – 4X higher than the competition
• Supports the new innovative InfiniBand scalable transport – Dynamically Connected
Connect-IB Provides Highest Interconnect Throughput
[Charts: message rate and throughput for Connect-IB FDR (dual port), ConnectX-3 FDR, ConnectX-2 QDR, and a competing InfiniBand adapter; higher is better. Source: Prof. DK Panda]
Gain Your Performance Leadership With Connect-IB Adapters
Connect-IB Delivers Highest Application Performance
200% Higher Performance Versus Competition, with Only 32 Nodes
Performance Gap Increases with Cluster Size
Dynamically Connected Transport Advantages
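In brief, the Dynamically Connected (DC) transport reaches any peer through a small reusable pool of hardware endpoints instead of one Reliable Connected (RC) queue pair per peer process, so per-node connection state stays nearly flat as the cluster grows. A back-of-the-envelope sketch in C (every constant below is an illustrative assumption, not a Mellanox figure):

#include <stdio.h>

/* Back-of-the-envelope comparison of per-node connection state.
 * RC (Reliable Connected) needs one QP per remote peer process;
 * DC (Dynamically Connected) keeps a small pool of initiator
 * endpoints that are retargeted on the fly. All constants are
 * illustrative assumptions, not Mellanox figures. */
int main(void) {
    const long long nodes = 10000, ppn = 16;  /* hypothetical cluster */
    const long long qp_ctx_bytes = 1024;      /* assumed QP context size */
    const long long dc_pool = 8;              /* assumed DC endpoints/process */

    long long procs = nodes * ppn;
    long long rc_qps = ppn * (procs - 1);     /* full connectivity, per node */
    long long dc_eps = ppn * dc_pool;         /* per node */

    printf("RC: %lld QPs/node (~%lld MB of context)\n",
           rc_qps, rc_qps * qp_ctx_bytes >> 20);
    printf("DC: %lld endpoints/node (~%lld KB of context)\n",
           dc_eps, dc_eps * qp_ctx_bytes >> 10);
    return 0;
}

Under these assumptions, RC context runs to gigabytes per node on a large cluster while DC stays in the kilobytes, which is the scalability argument behind DC.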
Accelerations for Parallel Programs
Mellanox ScalableHPC
Mellanox ScalableHPC Accelerates Parallel Applications
InfiniBand Verbs API
MXM
• Reliable Messaging Optimized for Mellanox HCA
• Hybrid Transport Mechanism
• Efficient Memory Registration
• Receive Side Tag Matching
FCA
• Topology Aware Collective Optimization
• Hardware Multicast
• Separate Virtual Fabric for Collectives
• CORE-Direct Hardware Offload
[Diagram: programming models layered over the verbs API – MPI, with private memory per process (P1, P2, P3); SHMEM and PGAS, with a logical shared memory spanning the processes]
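The SHMEM/PGAS half of the diagram is a one-sided model in which every process exposes symmetric memory that remote processes can read and write directly. A minimal sketch using the OpenSHMEM 1.0-era C API (the counter example itself is hypothetical):

#include <stdio.h>
#include <shmem.h>

int main(void) {
    start_pes(0);                 /* OpenSHMEM 1.0-era initialization */
    int me = _my_pe();
    int npes = _num_pes();

    /* Symmetric allocation: the same object exists on every PE */
    int *counter = (int *)shmalloc(sizeof(int));
    *counter = me;
    shmem_barrier_all();

    /* One-sided put: PE 0 writes directly into PE 1's memory */
    if (me == 0 && npes > 1) {
        int val = 42;
        shmem_int_put(counter, &val, 1, 1);
    }
    shmem_barrier_all();

    printf("PE %d of %d: counter = %d\n", me, npes, *counter);
    shfree(counter);
    return 0;
}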
Mellanox FCA Collective Scalability
[Charts: Barrier Collective latency (µs), Reduce Collective latency (µs), and 8-Byte Broadcast bandwidth (KB × processes) vs. number of processes (PPN=8, up to ~2,500 processes), with and without FCA]
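As a rough illustration of how such curves are measured, here is a minimal barrier-latency loop in MPI C; the FCA on/off switch shown in the comments assumes an Open MPI deployment with the coll_fca component, so treat the exact knob name as an assumption:

#include <mpi.h>
#include <stdio.h>

/* Minimal barrier-latency microbenchmark in the spirit of the charts
 * above. Run it twice, with FCA disabled and enabled, e.g. (Open MPI,
 * knob name assumed):
 *   mpirun -mca coll_fca_enable 0 ./barrier_bench
 *   mpirun -mca coll_fca_enable 1 ./barrier_bench */
int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    const int warmup = 100, iters = 10000;
    for (int i = 0; i < warmup; i++)
        MPI_Barrier(MPI_COMM_WORLD);

    double t0 = MPI_Wtime();
    for (int i = 0; i < iters; i++)
        MPI_Barrier(MPI_COMM_WORLD);
    double t1 = MPI_Wtime();

    if (rank == 0)
        printf("avg barrier latency: %.2f us\n", (t1 - t0) / iters * 1e6);
    MPI_Finalize();
    return 0;
}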
Nonblocking Alltoall (Overlap-Wait) Benchmark
CORE-Direct offload lets the nonblocking Alltoall benchmark retain almost 100% of compute time while the collective progresses in hardware
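A minimal sketch of the overlap/wait pattern behind this benchmark, using the standard MPI-3 nonblocking collective MPI_Ialltoall (the compute kernel is a stand-in):

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

/* Overlap/wait pattern: post a nonblocking Alltoall, compute while the
 * network progresses the collective, then wait for completion. */
static void compute(double *a, int n) {
    for (int i = 0; i < n; i++)       /* stand-in compute kernel */
        a[i] = a[i] * 1.000001 + 0.5;
}

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    const int count = 1024;           /* elements per peer */
    double *sendbuf = malloc((size_t)size * count * sizeof *sendbuf);
    double *recvbuf = malloc((size_t)size * count * sizeof *recvbuf);
    double *work = calloc(1 << 20, sizeof *work);
    for (int i = 0; i < size * count; i++) sendbuf[i] = rank;

    MPI_Request req;
    MPI_Ialltoall(sendbuf, count, MPI_DOUBLE,
                  recvbuf, count, MPI_DOUBLE, MPI_COMM_WORLD, &req);
    compute(work, 1 << 20);           /* overlapped with the exchange */
    MPI_Wait(&req, MPI_STATUS_IGNORE);

    if (rank == 0) printf("alltoall + compute done\n");
    free(sendbuf); free(recvbuf); free(work);
    MPI_Finalize();
    return 0;
}

With CORE-Direct, progression of the collective is offloaded to the HCA, so the time spent in compute() genuinely overlaps the exchange rather than competing with a software progress thread.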
Accelerator and GPU Offloads
GPUDirect 1.0
[Diagrams: transmit and receive data paths – without GPUDirect, data moves between GPU memory and the InfiniBand HCA in two steps through separate system-memory buffers; with GPUDirect 1.0, the GPU and HCA share a single pinned system-memory buffer]
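To make the staging concrete, here is a minimal sketch of the staged transfer pattern of this era, in C with MPI and the CUDA runtime (buffer size and tag are arbitrary, error handling omitted):

#include <mpi.h>
#include <cuda_runtime.h>

/* Staged GPU-to-GPU transfer (pre-GPUDirect RDMA): device -> pinned
 * host buffer -> MPI -> pinned host buffer -> device. GPUDirect 1.0
 * let the HCA and GPU share the same pinned region, removing an extra
 * host-side copy, but the staging copies below remain. */
int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    const int n = 1 << 20;
    double *d_buf, *h_buf;
    cudaMalloc((void **)&d_buf, n * sizeof(double));
    cudaMallocHost((void **)&h_buf, n * sizeof(double)); /* pinned */

    if (rank == 0) {
        cudaMemset(d_buf, 0, n * sizeof(double));
        cudaMemcpy(h_buf, d_buf, n * sizeof(double),
                   cudaMemcpyDeviceToHost);              /* stage out */
        MPI_Send(h_buf, n, MPI_DOUBLE, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        MPI_Recv(h_buf, n, MPI_DOUBLE, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        cudaMemcpy(d_buf, h_buf, n * sizeof(double),
                   cudaMemcpyHostToDevice);              /* stage in */
    }

    cudaFreeHost(h_buf);
    cudaFree(d_buf);
    MPI_Finalize();
    return 0;
}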
GPUDirect RDMA
[Diagrams: with GPUDirect 1.0, transfers are still staged through system memory; with GPUDirect RDMA, the InfiniBand HCA reads and writes GPU memory directly over PCIe, bypassing system memory]
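With GPUDirect RDMA and a CUDA-aware MPI, the device pointer is handed to MPI directly and no staging copies appear in user code. A minimal sketch (MVAPICH2 is one such MPI; it expects to be run with MV2_USE_CUDA=1, and the GPUDirect RDMA fast path additionally depends on the build and driver stack):

#include <mpi.h>
#include <cuda_runtime.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    const int n = 1024;
    double *d_buf;
    cudaMalloc((void **)&d_buf, n * sizeof(double));

    /* CUDA-aware MPI accepts device pointers directly; with GPUDirect
     * RDMA the HCA DMAs straight from/to GPU memory. */
    if (rank == 0) {
        cudaMemset(d_buf, 0, n * sizeof(double));
        MPI_Send(d_buf, n, MPI_DOUBLE, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        MPI_Recv(d_buf, n, MPI_DOUBLE, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        printf("rank 1 received %d doubles into GPU memory\n", n);
    }

    cudaFree(d_buf);
    MPI_Finalize();
    return 0;
}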
Performance of MVAPICH2 with GPUDirect RDMA
• GPU-GPU internode MPI latency: 67% lower latency (5.49 µs); lower is better
• GPU-GPU internode MPI bandwidth: 5X increase in throughput; higher is better
[Charts: latency (µs) and bandwidth (MB/s) vs. message size, 1 byte to 4 KB]
Source: Prof. DK Panda
Execution Time of HSG (Heisenberg Spin Glass) Application with 2 GPU Nodes
Performance of MVAPICH2 with GPUDirect RDMA
[Chart: execution time vs. problem size]
Source: Prof. DK Panda
Roadmap
Technology Roadmap – One-Generation Lead over the Competition
[Timeline, 2000–2020: 20Gb/s → 40Gb/s → 56Gb/s → 100Gb/s, with 200Gb/s marked at 2015 and beyond; milestones include the TOP500 2003 Virginia Tech (Apple) system (3rd) and the “Roadrunner” Mellanox Connected system (1st); scale progression from Terascale to Petascale to Exascale (“Mega Supercomputers”)]
Paving The Road for 100Gb/s and Beyond
Recent Acquisitions are Part of Mellanox’s Strategy
to Make 100Gb/s Deployments as Easy as 10Gb/s
Copper (Passive, Active) | Optical Cables (VCSEL) | Silicon Photonics
The Only Provider of End-to-End 40/56Gb/s Solutions
From Data Center to Metro and WAN
X86, ARM and Power-based Compute and Storage Platforms
The Interconnect Provider For 10Gb/s and Beyond
ICs | Adapter Cards | Switches/Gateways | Cables/Modules | Host/Fabric Software
Comprehensive End-to-End InfiniBand and Ethernet Portfolio
Metro / WAN
Thank You