Meta launches ChatGPT rival: New Llama 4-powered AI app offers voice and a personal touch with a social side
Meta has introduced a new standalone AI app powered by its latest language model, Llama 4, which promises users a more personal, voice-driven digital assistant experience. The app, now available in select countries, marks Meta's first major step towards building an AI that is not only conversational but also more deeply integrated into its family of products, including WhatsApp, Instagram, Facebook and Messenger.

Voice conversations take centre stage

The highlight of the new Meta AI app is its voice-first interface. Designed to make interactions seamless and natural, the app lets users speak directly to the AI rather than type. Meta has introduced full-duplex speech technology in a demo mode, allowing the assistant to respond in real time without waiting for the user to finish speaking. The demo can be toggled on or off and is currently available in the US, Canada, Australia and New Zealand.

Unlike typical voice assistants that read pre-written responses aloud, Meta AI generates voice responses in real time, mimicking natural dialogue. However, since it is an early version, users may experience errors or delays, and Meta encourages feedback to refine the experience.

Smarter and more personal

According to Meta, the app, built on its Llama 4 model, promises more relevant, useful and context-aware answers. Meta AI learns from user interactions and can remember preferences when users have given permission. It also draws on data already shared on Meta platforms, such as Facebook likes or Instagram follows, to provide answers that reflect individual interests. For example, if a user likes travel and language learning, Meta AI can tailor recommendations and ideas around those topics. Users in the US and Canada will be the first to receive personalised answers through this new system.

Integration with the Meta ecosystem

One of the most significant strengths of the Meta AI app is its integration across Meta's range of products. Whether a user is browsing Instagram, chatting on Messenger or catching up with friends on Facebook, the AI assistant is always within reach. It is also accessible via Ray-Ban Meta smart glasses, which now work in tandem with the app.

Meta is merging the new AI app with the existing Meta View app used for managing Ray-Ban Meta smart glasses. Once users update to the new app, their settings, media and paired devices will automatically be transferred to a dedicated devices tab. This integration lets users start a conversation on their smart glasses and continue it later in the app or on the web.

Discover feed and community sharing

The Meta AI app introduces a Discover feed where users can browse popular prompts, share their own AI-generated content and remix ideas shared by others. Nothing is posted publicly unless a user chooses to share it. The feed aims to foster a sense of community around AI use and encourage creativity among users.

Improved web experience

Meta says it has also upgraded the web version of its AI assistant. It now supports voice interactions and features a redesigned image-generation tool with adjustment options for style, lighting and mood. In select regions, Meta is testing a new document editor that lets users create documents, export them as PDFs and even import existing files for AI analysis.

User control remains central

Meta says the new app puts users in charge.
A visible icon will always indicate when the microphone is in use, and settings let users turn on a "ready to talk" function for hands-free communication. Privacy remains an important consideration, with features that allow users to manage what data is used and what Meta AI can remember.

First published: 29 Apr 2025, 23:12 IST