A Third Variant in the Works

Team Green introduced 4th-gen NVLink technology with Hopper, allowing for a total bandwidth of 900 GB/s. If NVLink is not an option, PCIe can be used instead, though it is a whopping 7x slower. NVIDIA has announced two variants of the Hopper-based H100 GPU thus far: the SXM5 board and the PCIe variant. The main difference lies in the memory configuration, where the SXM5 ships with newer, faster HBM3 memory whereas the PCIe variant makes use of HBM2e (both carry 80GB of memory). The third variant, as highlighted by Zed__Wang, ships with 120GB of memory, 1.5x more than the other versions. Do bear in mind, this is another PCIe variant (no NVLink) using HBM2e memory. Another fun fact about this leak: the picture allegedly features a Lovelace engineering sample. Now we know where those leaks were coming from.
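The "7x slower" figure checks out against the usual PCIe Gen 5 x16 number. A quick sketch of the arithmetic, assuming roughly 128 GB/s for a PCIe Gen 5 x16 link (the exact PCIe figure is our assumption, not from the leak):

```python
# Rough bandwidth comparison: 4th-gen NVLink vs. PCIe Gen 5 x16.
nvlink_gbps = 900          # total NVLink bandwidth quoted for Hopper (GB/s)
pcie_gen5_x16_gbps = 128   # approximate PCIe Gen 5 x16 bandwidth (GB/s), an assumption

ratio = nvlink_gbps / pcie_gen5_x16_gbps
print(f"NVLink is roughly {ratio:.1f}x faster than PCIe")  # roughly 7x
```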

Performance

The H100 120GB variant will reportedly use the same GPU configuration as the SXM5 variant, powered by 16896 CUDA cores and 3TB/s of memory bandwidth, as per s-ss. (The full-fat H100 chip actually packs 18432 CUDA cores.) As for performance, its single-precision floating point compute is nearly identical to its SXM5 counterpart, standing at around 60 TFLOPs.
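That 60 TFLOPs figure lines up with a back-of-the-envelope FP32 estimate of 2 FLOPs per CUDA core per clock. A minimal sketch, assuming a boost clock of roughly 1.78 GHz (the clock is our assumption; NVIDIA has not confirmed it for this variant):

```python
# Back-of-the-envelope FP32 estimate: 2 FLOPs (one FMA) per CUDA core per clock.
cuda_cores = 16896          # reported core count for the 120GB variant
boost_clock_ghz = 1.78      # assumed clock, not confirmed by NVIDIA

tflops = 2 * cuda_cores * boost_clock_ghz * 1e9 / 1e12
print(f"~{tflops:.0f} TFLOPs FP32")  # ~60 TFLOPs
```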

Release Date

While we do not have an exact release date for the H100 120GB variant, Jensen did say that Hopper is in full production, with GPUs arriving as soon as October. Perhaps NVIDIA is saving this variant for the upcoming CES 2023 event, which is the talk of the town. We expect all three giants, NVIDIA, AMD, and Intel, to announce their budget-friendly counterparts at this event, whether from the CPU or the GPU department.
