Are Google’s AI supercomputers faster than Nvidia’s?

Image Source: techzine.eu

Google published details about its AI supercomputers, claiming they are faster and more efficient than competing Nvidia systems.

Tensor Processing Units (TPUs) are custom artificial intelligence (AI) chips that Google has been developing and using since 2016.

Nvidia currently holds roughly a 90% share of the market for AI model training and deployment.

Google has produced several of the most significant developments in AI over the past decade.

Training large AI models requires many processors working simultaneously, often spread across entire supercomputers.

Google announced that it has built a system of more than 4,000 TPUs linked together with custom components designed to run and train AI models.
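The article does not show how such multi-chip training is actually programmed, so here is a minimal, hedged sketch using JAX (Google's own framework for TPUs) to run a toy loss computation in data-parallel fashion across whatever accelerator cores are visible. The function and parameter names are illustrative only and are not taken from Google's systems.

```python
# Illustrative sketch only: assumes JAX is installed and TPU (or other
# accelerator) devices are available; names here are hypothetical.
import jax
import jax.numpy as jnp

# List the accelerator cores JAX can see (e.g. the TPU cores on one host).
devices = jax.devices()
print(f"Visible devices: {len(devices)}")

# A toy "training step": a linear model with a squared-error loss.
def loss_fn(params, x, y):
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

# pmap replicates the function across devices so each core processes
# one shard of the batch in parallel (simple data parallelism).
@jax.pmap
def parallel_loss(params, x, y):
    return loss_fn(params, x, y)

n = len(devices)
params = {"w": jnp.ones((8, 1)), "b": jnp.zeros((1,))}

# Replicate parameters onto every device and shard a toy batch across them.
replicated = jax.device_put_replicated(params, devices)
x = jnp.ones((n, 4, 8))   # leading axis = one shard per device
y = jnp.ones((n, 4, 1))

print(parallel_loss(replicated, x, y))  # one loss value per device
```

At the scale Google describes, the same idea is extended across thousands of chips connected by high-speed interconnects rather than the cores of a single host.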

According to Google, the TPU v4 is “1.2x-1.7x faster and uses 1.3x-1.9x less power than the Nvidia A100.”

Google claimed that its TPU chips were used to train the AI image generator Midjourney.
