By Paula Parisi, July 24, 2023
Cerebras Systems has unveiled the Condor Galaxy 1 (CG-1), powered by nine networked supercomputers designed to deliver a total of 4 exaflops of AI compute across 54 million cores. Cerebras says the CG-1 greatly accelerates AI model training, completing its first run on a large language model trained for Abu Dhabi-based G42 in only 10 days. Cerebras and G42 have partnered to offer the Santa Clara, California-based CG-1 as a cloud service, positioning it as an alternative to Nvidia’s DGX GH200 cloud supercomputer. The companies plan to release CG-2 and CG-3 in early 2024.
Continue reading: Cerebras, G42 Partner on a Supercomputer for Generative AI
By Paula Parisi, September 19, 2022
Nvidia, Intel and ARM have published a draft specification for a common AI interchange format aimed at faster and more efficient system development. The proposed 8-bit floating point standard, known as FP8, could accelerate both training and inference by reducing memory usage and optimizing interconnect bandwidth. The lower-precision number format is a key factor in driving efficiency. Transformer networks in particular benefit from 8-bit floating point precision, and a common interchange format should facilitate interoperability advances for both hardware and software platforms.
Continue reading: Nvidia, Intel and ARM Publish New FP8 AI Interchange Format
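To make the memory argument concrete, here is a minimal sketch of decoding one byte of the E4M3 variant described in the draft spec (1 sign bit, 4 exponent bits, 3 mantissa bits, exponent bias 7); the function name and structure are illustrative, not part of any vendor API.

```python
def decode_fp8_e4m3(byte: int) -> float:
    """Decode one FP8 E4M3 byte: 1 sign, 4 exponent, 3 mantissa bits, bias 7."""
    sign = -1.0 if (byte >> 7) & 1 else 1.0
    exp = (byte >> 3) & 0xF   # biased exponent
    mant = byte & 0x7         # 3-bit mantissa
    if exp == 0xF and mant == 0x7:
        return float("nan")   # E4M3 reserves only this pattern for NaN (no infinities)
    if exp == 0:
        return sign * (mant / 8) * 2 ** (1 - 7)       # subnormal
    return sign * (1 + mant / 8) * 2 ** (exp - 7)     # normal

# 0x38 encodes 1.0; 0x7E is the E4M3 maximum, (1 + 6/8) * 2^8 = 448.0
print(decode_fp8_e4m3(0x38), decode_fp8_e4m3(0x7E))
```

Each value occupies one byte instead of the four used by FP32, which is where the memory and interconnect-bandwidth savings come from; the narrower dynamic range is what makes a shared interchange format worth standardizing.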
By Paula Parisi, March 24, 2022
Nvidia CEO Jensen Huang announced a host of new AI technology geared toward data centers at the GTC 2022 conference this week. Available in Q3, the H100 Tensor Core GPUs are built on the company’s new Hopper GPU architecture. Huang described the H100 as the next “engine of the world’s AI infrastructures.” Hopper debuts in Nvidia DGX H100 systems designed for enterprise. With data centers, “companies are manufacturing intelligence and operating giant AI factories,” Huang said, speaking from a real-time virtual environment in the firm’s Omniverse 3D simulation platform.
Continue reading: Nvidia Introduces New Architecture to Power AI Data Centers