
Nvidia and Google on Wednesday each announced that they had aced a series of tests called MLPerf, staking claims to having the biggest and best hardware and software for crunching common artificial intelligence tasks.

The devil's in the details, but both companies' achievements show that the trend in AI continues to be one of bigger and bigger machine learning endeavors, backed by more-brawny computers.

Benchmark tests are never without controversy, and some upstart competitors of Nvidia and Google, notably Cerebras Systems and Graphcore, continued to avoid the benchmark competition.

In the results announced Wednesday by the MLPerf organization, an industry consortium that administers the tests, Nvidia took top marks across the board for a variety of machine learning "training" tasks, meaning the computing operations required to develop a machine learning neural network from scratch.

The second phase of machine learning, when those trained networks are used to make predictions in real time, known as inference, is covered in a separate competition that will be published later this year.

The full roster of results can be seen in spreadsheet form.

The current results were submitted by vendors in June and represent the first time this year the benchmark has been published. Historically, the contest results have been unveiled once every quarter, but the COVID-19 pandemic has played havoc with the usual schedule.

A distinction has to be made with respect to Nvidia's top marks: they were for commercially available systems. Another category of submission is for systems that have the status of research projects, meaning they are not available for use by customers. Google's cloud computing service's home-grown chip, the Tensor Processing Unit, or TPU, is one such research project. If one considers both commercial and research achievements, the TPU blew past Nvidia's results on most tasks.

The results were the first time that Nvidia has reported metrics by which to assess the relative performance advantage of its newest chip, the A100, which was unveiled in May. Nvidia's senior director of product management, Paresh Kharya, emphasized in prepared remarks that the commercial availability of Nvidia's chip was significant, saying it showed a rapid move from chip debut to shipping systems.
