

ASUS, the leading IT company in server systems, server motherboards and workstations, today announced its presence at NVIDIA GTC, a developer conference for the era of AI and the metaverse. MLPerf benchmark results help advance machine-learning performance and efficiency, allowing researchers to evaluate the efficacy of AI training and inference on specific server configurations. ASUS will focus on three demonstrations outlining its strategic developments in AI: the methodology behind the ASUS MLPerf Training v2.0 results that achieved multiple breakthrough records; a success story exploring the building of an academic AI data center at King Abdullah University of Science and Technology (KAUST) in Saudi Arabia; and a research AI data center created in conjunction with the National Health Research Institute in Taiwan.

During its GTC 2022 session, NVIDIA introduced its new generation of gaming graphics cards based on the novel Ada Lovelace architecture. Dubbed the NVIDIA GeForce RTX 40 series, it brings various updates, including more CUDA cores, a new DLSS 3 version, 4th-generation Tensor cores, 3rd-generation Ray Tracing cores, and much more, which you can read about here.

However, today we also got a new Ada Lovelace card intended for the data center. Called the L40, it updates NVIDIA's previous Ampere-based A40 design. While the NVIDIA website provides sparse details, the new L40 GPU uses 48 GB of GDDR6 memory with ECC error correction. Paired with an unknown SKU, we assume that it uses AD102 with adjusted frequencies to lower the TDP and allow for passive cooling. NVIDIA is calling this its Omniverse GPU, as it is part of the push to separate the GPUs used for graphics from its AI/HPC models. The "L" models in the current product stack accelerate graphics, with display outputs installed on the GPU, while the "H" models (H100) accelerate HPC/AI installations where visual elements are a secondary task. This is a further separation of the entire GPU market, where the HPC/AI SKUs get their own architecture, and the GPUs for graphics processing are built on a new architecture as well.
