Meta Prepares AI Supercomputer, Claimed to be the Fastest in the World

Social media giant Meta is developing an "AI supercomputer": a high-speed computer designed specifically to train machine learning systems.

The company says the AI (artificial intelligence) Research SuperCluster, or RSC for short, is already one of the fastest machines of its kind, and that when completed in mid-2022 it will be the fastest in the world.

Meta CEO Mark Zuckerberg has stated that Meta has developed what it believes will be the world's fastest AI supercomputer. He calls it RSC, for AI Research SuperCluster, and says it will be completed later this year.

The RSC will be used to train a range of systems across Meta's business, from the content moderation algorithms that detect hate speech on Facebook and Instagram to the augmented reality features that will one day run on the company's AR hardware.

Meta also says that RSC will be used to design experiences for its new virtual world business, the metaverse.

"RSC will help AI Meta researchers build new and improved AI models that can learn from trillions of examples, operate in hundreds of different languages, analyze text, images, and videos simultaneously, develop new augmented reality tools and much more," the executive wrote. Meta, Kevin Lee and Shubho Sengupta, in a blog post outlining the news.

Work on RSC began a year and a half before this article was written, with Meta's engineers designing the machine's various systems, from cooling and power to networking and cabling, entirely from scratch.

The first phase of RSC is already up and running. It consists of 760 Nvidia DGX A100 systems containing a total of 6,080 connected GPUs, a class of high-end processor well suited to machine learning workloads.
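As a quick sanity check on those figures (assuming the standard configuration of 8 A100 GPUs per Nvidia DGX A100 system, which the article itself does not state), the numbers line up:

```python
# Each Nvidia DGX A100 system houses 8 A100 GPUs (standard configuration;
# an assumption here, not stated in the article).
systems_phase1 = 760
gpus_per_system = 8

total_gpus_phase1 = systems_phase1 * gpus_per_system
print(total_gpus_phase1)  # 6080, matching the 6,080 GPUs cited for phase one
```

The planned phase two, at roughly 16,000 GPUs, would be about 2.6 times the size of this first deployment.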

Meta says the cluster already delivers up to a 20-fold increase in performance on its standard machine vision research tasks.

Before the end of 2022, phase two of RSC will be completed. By then it will contain around 16,000 GPUs in total and will be able to train AI systems "with over a trillion parameters on an exabyte of data."

Photo: Pixabay
