AMAX Deep Learning Solutions Upgraded With NVIDIA Tesla V100 GPU Accelerators

FREMONT, Calif., Sept. 28, 2017 /PRNewswire/ – AMAX, a leading provider of Deep Learning, HPC, Cloud/IaaS servers and appliances, today announced that its GPU solutions, including Deep Learning platforms, are now available with the latest NVIDIA® Tesla® V100 GPU accelerator. Solutions featuring the V100 GPUs are expected to begin shipping in Q4 2017.

Powered by the new NVIDIA Volta architecture, AMAX’s V100-based computing solutions are the most powerful GPU solutions on the market for accelerating HPC, Deep Learning, and data analytics workloads. The solutions combine the latest Intel® Xeon® Scalable processors with Tesla V100 GPUs to deliver 6x the Tensor FLOPS for DL inference compared to the previous-generation NVIDIA Pascal™ GPUs.

“We are thrilled about the biggest breakthrough we’ve ever seen in data center GPUs,” said James Huang, Product Marketing Manager, AMAX. “We cannot wait to see the dramatic performance gains and cost-savings opportunities this will deliver for HPC and the AI industry.”

NVIDIA Tesla V100 GPU accelerators are the most advanced data center GPUs ever built to accelerate AI, HPC and graphics applications. Equipped with 640 Tensor Cores, a single V100 GPU offers the performance of up to 100 CPUs, enabling data scientists, researchers, and engineers to tackle challenges that were once thought to be impossible. The V100 features six major technology breakthroughs:

  • New Volta Architecture: By pairing CUDA® cores and Tensor Cores within a unified architecture, a single server with Tesla V100 GPUs can replace hundreds of commodity CPU servers for traditional HPC and Deep Learning.
  • Tensor Core: Equipped with 640 Tensor Cores, Tesla V100 delivers 125 TeraFLOPS of deep learning performance. That’s 12X Tensor FLOPS for Deep Learning training, and 6X Tensor FLOPS for DL inference when compared to NVIDIA Pascal™ GPUs.
  • Next-Generation NVIDIA NVLink™ Interconnect Technology: NVLink in Tesla V100…
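The 125 TeraFLOPS figure cited above can be sanity-checked with back-of-envelope arithmetic: each of the V100’s 640 Tensor Cores performs a 4×4×4 mixed-precision matrix fused multiply-add per clock (64 multiply-adds, i.e. 128 floating-point operations). Assuming the published boost clock of roughly 1530 MHz, the peak works out to about 125 TFLOPS:

```python
# Back-of-envelope check of the 125 TFLOPS Tensor Core figure.
# The ~1530 MHz boost clock is NVIDIA's published spec for Tesla V100.
tensor_cores = 640               # Tensor Cores per V100
flops_per_core_per_clock = 128   # 4x4x4 matrix FMA = 64 multiply-adds = 128 FLOPs
boost_clock_hz = 1.53e9          # ~1530 MHz boost clock

peak_tflops = tensor_cores * flops_per_core_per_clock * boost_clock_hz / 1e12
print(round(peak_tflops, 1))     # ~125.3 TFLOPS
```

This is a rough peak-throughput estimate only; sustained deep learning performance depends on memory bandwidth, kernel efficiency, and workload shape.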

