
Meta Launches Llama 4 AI Models to Power Chatbots on WhatsApp, Instagram and More

Meta Llama 4 – Updates: In a bold leap toward the future of AI, Meta has launched its most advanced large language models under the Llama 4 series. The company announced the immediate availability of Llama 4 Scout and Llama 4 Maverick, now powering its in-house chatbot across WhatsApp, Instagram, Messenger, and other Meta-owned platforms.

These cutting-edge models are also available for download through Meta’s official website and AI hubs like Hugging Face.
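For developers who want the weights locally, the most common route is Hugging Face. The snippet below is a minimal sketch of pulling a checkpoint with the huggingface_hub library; the repo ID shown is illustrative (the exact ID may differ), and it assumes you have already accepted Meta's license for the gated repository and logged in with an access token.

```python
# Minimal sketch: downloading Llama 4 weights from Hugging Face for local use.
# Assumptions: the repo ID below is illustrative, access to the gated repo has
# been approved under Meta's Llama license, and you are logged in
# (e.g. via `huggingface-cli login`).
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="meta-llama/Llama-4-Scout-17B-16E-Instruct",  # illustrative repo ID
    local_dir="./llama-4-scout",                          # where the checkpoint files land
)
print(f"Model files downloaded to: {local_path}")
```

From there, the downloaded checkpoint can be loaded with whichever inference stack supports Llama 4 (for example, a sufficiently recent version of the transformers library).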


Meet the Llama 4 Family

In addition to Scout and Maverick, Meta offered a sneak peek at a third model — Llama 4 Behemoth — described as “one of the smartest LLMs in the world and our most powerful yet”. Behemoth is designed to serve as a teacher model, helping shape and guide the development of future AI systems at Meta.

All Llama 4 models are built to be natively multimodal, trained on massive datasets comprising text, images, and video. This means the models can process and generate responses using both visual and textual input, seamlessly blending the two for a richer AI experience.


Llama 4 Maverick: The Workhorse

Designed for everyday usage and enterprise-grade applications, Llama 4 Maverick features:

  • 17 billion active parameters
  • 128 experts in a mixture-of-experts architecture
  • A strong focus on precise image understanding, creative writing, and general chat assistant functions

Meta dubs it the “product workhorse”, meant for broad use across industries, from customer support to AI creative tools.


Llama 4 Scout: The Lightweight Genius

Llama 4 Scout is more compact yet highly capable:

  • 17 billion active parameters
  • 16 experts
  • 109 billion total parameters
  • A 10-million-token context window

Scout is tailored for high-efficiency tasks like document summarization, code reasoning, and technical analysis. According to Meta, it outperforms rival models such as Gemma 3, Gemini 2.0 Flash Lite, and Mistral 3.1 across popular benchmarks.


Powered by “Mixture of Experts” Architecture

Both models utilize a mixture of experts (MoE) approach, popularized by Chinese AI startup DeepSeek. This allows different parts of the model to specialize in specific tasks, improving performance and reducing computational overhead. MoE-based systems activate only the most relevant “experts” during a task, making the model both efficient and task-optimized.
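To make the idea concrete, the toy sketch below shows the routing mechanism in PyTorch: a small gating network scores a bank of feed-forward "experts" for each token, and only the top-scoring few are actually run. This is an illustrative example only, not Meta's production implementation; the class name ToyMoE and the dim, num_experts, and top_k parameters are hypothetical.

```python
# Toy mixture-of-experts layer: a gate picks the top-k experts per token and
# only those experts run, so most parameters stay idle on any given input.
# Illustrative sketch only, not Meta's actual Llama 4 implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoE(nn.Module):
    def __init__(self, dim: int, num_experts: int = 16, top_k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        self.gate = nn.Linear(dim, num_experts)  # router: scores each expert per token
        self.top_k = top_k

    def forward(self, x):                        # x: (tokens, dim)
        scores = self.gate(x)                    # (tokens, num_experts)
        weights, picked = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)     # normalize over the chosen experts only
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = picked[:, slot] == e      # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

# Example: route 8 tokens of width 64 through 16 experts, activating 2 per token.
moe = ToyMoE(dim=64)
print(moe(torch.randn(8, 64)).shape)  # torch.Size([8, 64])
```

Because only a couple of experts fire per token, total parameter count can grow far beyond what is actually computed for any single input, which is how Scout and Maverick keep 17 billion "active" parameters while storing many more overall.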


Not Reasoning Models… Yet

While Llama 4 excels in image understanding and coding tasks, none of the newly released models is a dedicated reasoning model in the vein of OpenAI’s o3-mini or DeepSeek R1. Such reasoning models aim to simulate human-style step-by-step thinking, trading speed for better answers on complex decision-making tasks.

That said, Meta’s roadmap hints at Llama 4 Behemoth being the bridge to such high-level capabilities.


Where to Use Llama 4

Meta has already begun integrating Llama 4 into its flagship apps:

  • WhatsApp
  • Instagram
  • Messenger
  • Meta AI website

Users can chat with Meta AI by opening a conversation with the assistant on any supported platform. The features are currently live in over 40 countries, though the multimodal capabilities (text + image) are limited to English-language users in the U.S. for now.


Future Outlook and Global Rollout

While users outside the U.S. will have to wait for full image generation features, the current release is a major leap in Meta’s AI roadmap. Meta plans to gradually expand the multimodal toolkit, possibly including Ghibli-style visual creation, AI video understanding, and robotic task execution in future updates.

Meta’s long-term goal, underlined by CEO Mark Zuckerberg, is to become a leader in open-source AI infrastructure, positioning Llama models as competitive alternatives to OpenAI, Anthropic, and Google DeepMind.

Meta’s Llama 4 release marks a transformational moment for consumer AI, with smart assistants becoming more visual, more contextual, and more powerful. Whether it’s summarizing a document, writing a script, or interpreting a meme, Meta AI is evolving into a force to watch in 2025.

