
NVIDIA Ampere GPUs Now Available on Google Cloud

The NVIDIA A100 Tensor Core GPU is now available on Google Cloud. As per the release, the A100, built on the newly introduced NVIDIA Ampere architecture, delivers NVIDIA's greatest generational leap ever, boosting training and inference performance by 20x over its predecessors and providing tremendous speedups for AI workloads to power the AI revolution.

"Google Cloud customers often look to us to provide the latest hardware and software services to help them drive innovation on AI and scientific computing workloads," said Manish Sainani, Director of Product Management, Google Cloud. "With our new A2 VM family, we are proud to be the first major cloud provider to market NVIDIA A100 GPUs, just as we were with NVIDIA's T4 GPUs. We are excited to see what our customers will do with these new capabilities."

In cloud data centers, A100 can power a broad range of compute-intensive applications, including AI training and inference, data analytics, scientific computing, genomics, edge video analytics, 5G services, and much more. 

Fast-growing, critical industries will be able to accelerate their discoveries with the breakthrough performance of A100 on Google Compute Engine. From scaling-up AI training and scientific computing, to scaling-out inference applications, to enabling real-time conversational AI, A100 accelerates complex and unpredictable workloads of all sizes running in the cloud.

Breakthrough A100 Performance in the Cloud for Every Size Workload

The new A2 VM instances deliver different levels of performance to efficiently accelerate workloads across CUDA-enabled machine learning training and inference, data analytics, and high-performance computing.

For customers with large, demanding workloads, Google Compute Engine provides access to up to 16 A100 GPUs in a single VM. The a2-megagpu-16g instance comes with 16 A100 GPUs, offering a total of 640GB of GPU memory and 1.3TB of system memory, all connected through NVSwitch with up to 9.6TB/s of aggregate bandwidth.
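A minimal sketch of how such an instance might be provisioned programmatically, assuming the google-cloud-compute Python client library; the project ID, zone, boot image, and network values below are placeholder assumptions rather than details from the announcement:

```python
# Illustrative sketch only: creating an a2-megagpu-16g VM with the
# google-cloud-compute Python client. Project, zone, image, and network
# are placeholder assumptions, not values from the announcement.
from google.cloud import compute_v1

PROJECT = "my-project"   # hypothetical project ID
ZONE = "us-central1-a"   # assumes A2 machine types are offered in this zone
VM_NAME = "a100-demo-vm"


def create_a2_vm() -> None:
    # Boot disk built from a public image family (placeholder choice).
    boot_disk = compute_v1.AttachedDisk(
        boot=True,
        auto_delete=True,
        initialize_params=compute_v1.AttachedDiskInitializeParams(
            source_image="projects/debian-cloud/global/images/family/debian-11",
            disk_size_gb=200,
        ),
    )

    instance = compute_v1.Instance(
        name=VM_NAME,
        # The A2 machine type bundles the 16 A100 GPUs described above.
        machine_type=f"zones/{ZONE}/machineTypes/a2-megagpu-16g",
        disks=[boot_disk],
        network_interfaces=[
            compute_v1.NetworkInterface(network="global/networks/default")
        ],
        # GPU VMs cannot live-migrate, so host maintenance terminates the VM.
        scheduling=compute_v1.Scheduling(on_host_maintenance="TERMINATE"),
    )

    client = compute_v1.InstancesClient()
    operation = client.insert(
        project=PROJECT, zone=ZONE, instance_resource=instance
    )
    operation.result()  # block until the create operation completes
    print(f"Created {VM_NAME} in {ZONE}")


if __name__ == "__main__":
    create_a2_vm()
```

Because the GPUs are attached as part of the A2 machine type itself, no separate accelerator configuration is specified in this sketch; the smaller A2 shapes mentioned below would simply use a different machine type string.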

For those with smaller workloads, Google Compute Engine also offers A2 VMs in smaller configurations to match specific applications' needs.

Google Cloud announced that additional NVIDIA A100 support is coming soon to Google Kubernetes Engine, Cloud AI Platform, and other Google Cloud services. 
