Cerebras Systems has unveiled the Condor Galaxy 1 (CG-1), the first of nine planned networked AI supercomputers, designed to deliver 4 exaflops of AI compute across 54 million cores. Cerebras says the CG-1 greatly accelerates AI model training, completing its first run on a large language model for Abu Dhabi-based G42 in only 10 days. Cerebras and G42 have partnered to offer the Santa Clara, California-based CG-1 as a cloud service, positioning it as an alternative to Nvidia’s DGX GH200 cloud supercomputer. The companies plan to release CG-2 and CG-3 in early 2024.
IEEE Spectrum reports the CG-1 came online powered by 32 Cerebras CS-2 systems, capable of 2 exaflops of compute (2 billion billion operations per second), and is expected to double to its full 64-system configuration within 12 weeks. The CG-2 and CG-3 will also each house 64 CS-2s, deploying in Austin, Texas, and Asheville, North Carolina.
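For scale, here is a quick back-of-the-envelope check of those figures; this is only a sketch, and the per-system number is simply the reported total divided by the system count:

```python
# Back-of-the-envelope check of the reported CG-1 figures.
EXAFLOP = 10**18  # floating-point operations per second

initial_systems = 32             # CS-2 systems online at launch
initial_compute = 2 * EXAFLOP    # reported 2 exaflops

per_system = initial_compute / initial_systems
print(f"~{per_system / 10**15:.1f} petaflops per CS-2")   # ~62.5 petaflops

full_systems = 64                # planned full configuration
print(f"Full CG-1: {full_systems * per_system / EXAFLOP:.0f} exaflops")  # 4 exaflops
```

The doubled configuration lines up with the 4 exaflops cited for the full CG-1.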
As if anticipating the Biden administration’s AI security concerns, Cerebras’ press release notes that the Condor Galaxy line is “operated by Cerebras under U.S. laws, ensuring state of the art AI systems are not used by adversary states.” G42 Cloud is the largest cloud computing company in the UAE.
“CG-1 offers native support for training with long sequence lengths, up to 50,000 tokens out of the box, without any special software libraries,” Cerebras explains. “Programming CG-1 is done entirely without complex distributed programming languages, meaning even the largest models can be run without weeks or months spent distributing work over thousands of GPUs.”
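For context, below is a minimal sketch of the kind of per-GPU distributed-training boilerplate Cerebras says CG-1 users can skip. It uses PyTorch’s DistributedDataParallel; the tiny linear model and random data are stand-ins for a real LLM workload, and real multi-node training would add model and pipeline parallelism on top of this.

```python
# Minimal PyTorch DistributedDataParallel boilerplate -- the sort of
# per-GPU orchestration the quoted claim says CG-1 programming avoids.
# The linear model and random tensors are placeholders, not a real LLM.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset

def main():
    # One process per GPU; rank and world size come from the launcher (torchrun).
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model = DDP(torch.nn.Linear(1024, 1024).cuda(local_rank),
                device_ids=[local_rank])
    dataset = TensorDataset(torch.randn(4096, 1024))   # placeholder data
    sampler = DistributedSampler(dataset)               # shard data across ranks
    loader = DataLoader(dataset, sampler=sampler, batch_size=8)
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for epoch in range(3):
        sampler.set_epoch(epoch)                         # reshuffle shards each epoch
        for (batch,) in loader:
            loss = model(batch.cuda(local_rank)).pow(2).mean()
            optimizer.zero_grad()
            loss.backward()                              # gradients all-reduced across GPUs
            optimizer.step()

    dist.destroy_process_group()

# Launched as, e.g.: torchrun --nproc_per_node=8 train_ddp.py
if __name__ == "__main__":
    main()
```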
CG-1 marks “the first time Cerebras has partnered not only to build a dedicated AI supercomputer but also to manage and operate it.”
The Condor Galaxy will face an increasingly competitive market. IEEE Spectrum writes that Nvidia GPUs dominate AI, with much of the field's programming languages and software optimized for Nvidia chips. Meta Platforms built its AI Research SuperCluster using Nvidia GPUs, and Elon Musk purchased 10,000 of them to power xAI.
Companies including Google and Amazon have developed their own AI silicon (TPU and Trainium, respectively), and Spectrum lists startups including “Habana (now part of Intel), Graphcore, and SambaNova.”
The New York Times writes that “chips are set to play such a key role in AI that they could change the balance of power among tech companies and even nations.”