MIT Spinoff Liquid Eschews GPTs for Its Fluid Approach to AI

AI startup Liquid, founded by alums of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), has released its first models. Called Liquid Foundation Models, or LFMs, the multimodal family approaches “intelligence” differently from the pre-trained transformer models that dominate the field. Instead, the LFMs are built from “first principles,” which MIT describes as “the same way engineers build engines, cars, and airplanes,” explaining that the models are large neural networks with computational units “steeped in theories of dynamic systems, signal processing and numeric linear algebra.”
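
Liquid has not published LFM internals, but the founders’ earlier CSAIL research on “liquid” time-constant networks gives a sense of what a computational unit grounded in dynamical systems can look like. The PyTorch cell below is a toy sketch in that spirit only, not Liquid’s actual architecture: the sizes, the Euler integration step, and the exact update rule are all illustrative assumptions.

```python
# Toy cell inspired by liquid time-constant (LTC) networks (Hasani et al.).
# Illustrative only -- LFM internals are not public, and all sizes are made up.
import torch
import torch.nn as nn

class LTCCell(nn.Module):
    def __init__(self, n_inputs=3, n_units=8, tau=1.0, dt=0.1):
        super().__init__()
        self.tau, self.dt = tau, dt
        # f() is a small network whose (positive) output modulates each
        # unit's effective time constant -- the "liquid" part of the dynamics.
        self.f = nn.Sequential(
            nn.Linear(n_inputs + n_units, n_units), nn.Sigmoid()
        )
        self.A = nn.Parameter(torch.randn(n_units))  # per-unit attractor level

    def forward(self, x, inputs):
        # One Euler step of: dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A
        f = self.f(torch.cat([inputs, x], dim=-1))
        dxdt = -(1.0 / self.tau + f) * x + f * self.A
        return x + self.dt * dxdt

cell = LTCCell()
state = torch.zeros(1, 8)
for t in range(5):                       # unroll over a short input sequence
    state = cell(state, torch.randn(1, 3))
print(state.shape)  # torch.Size([1, 8])
```

The defining trait of this family is that each unit’s response speed depends on its input, which is one way the “dynamical systems” framing differs from a fixed-weight transformer block.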

Databricks DBRX Model Offers High Performance at Low Cost

Databricks, a San Francisco-based company focused on cloud data and artificial intelligence, has released a generative AI model called DBRX that it says sets new standards for performance and efficiency among open-source models. The mixture-of-experts (MoE) architecture contains 132 billion parameters and was pre-trained on 12 trillion tokens of text and code. Databricks says DBRX gives the open community and enterprises that want to build their own LLMs capabilities previously limited to closed model APIs, claiming it outperforms open alternatives including Llama 2-70B and Mixtral on certain benchmarks.
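
For readers unfamiliar with the mixture-of-experts pattern, the PyTorch sketch below shows the core routing idea: a learned router sends each token to a few “expert” feed-forward networks, so only a fraction of the total parameter count is active per token, which is where MoE models get their efficiency. The expert count, layer sizes, and top-k value here are arbitrary toy choices, not DBRX’s actual configuration.

```python
# Minimal top-k mixture-of-experts layer -- an illustrative sketch,
# not Databricks' implementation; all hyperparameters are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=64, d_hidden=256, n_experts=4, top_k=2):
        super().__init__()
        self.top_k = top_k
        # The router scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        scores = self.router(x)                        # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)           # normalize chosen experts
        out = torch.zeros_like(x)
        # Only the top-k experts run per token, so active parameters per
        # token are a fraction of the layer's total parameter count.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * expert(x[mask])
        return out

tokens = torch.randn(8, 64)
print(MoELayer()(tokens).shape)  # torch.Size([8, 64])
```

Scaled up to DBRX’s size, this sparsity is what lets a 132-billion-parameter model activate far fewer parameters on any single input, supporting the performance-at-low-cost claim.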