Baidu’s Ernie AI Gets Improved Text-to-Image and App Builder

Ernie, the foundation model behind Baidu’s generative AI, has been updated with iRAG technology to mitigate visual hallucinations and with Miaoda, a no-code tool that creates apps from natural language. The company behind China’s largest search engine says Ernie now handles 1.5 billion user queries daily, up from 50 million around the time of its March 2023 launch, a 30x increase. Baidu also debuted Ernie-powered smart glasses from its Xiaodu Technology hardware unit. The Xiaodu AI Glasses feature built-in voice activation and cameras for taking photos and video. The news was shared at this week’s Baidu World 2024 in Shanghai.

Google Takes Its Bard Search Bot Public, a Rival to ChatGPT

Google has opened a public waitlist for its Bard AI chatbot to users in the U.S. and UK. The technology, which Google intends as a rival to OpenAI’s ChatGPT, will be rolled out to users in stages, the company said, with more countries and languages to come. Bard was announced last month. Powered by a lightweight, optimized version of Google’s LaMDA large language model, Bard is what the company calls an “early experiment” that will eventually be updated with more sophisticated models. The same is true of ChatGPT, which already has more than 100 million users.

Meta Says Its LLaMA AI for Researchers Does More with Less

Meta Platforms has unveiled a new generative artificial intelligence language system called LLaMA, which doesn’t chat, but is designed as a research tool the company hopes will help in “democratizing access in this important, fast-changing field.” LLaMA (Large Language Model Meta AI) ranges in size from 7 billion to 65 billion parameters. Touted as a “smaller, more performant model,” LLaMA enables members of the research community who do not “have access to large amounts of infrastructure” to study these models, Meta explains. Training smaller foundation models requires less computing power and fewer resources for testing and validation.