
Meta has delayed the release of its largest generative AI model, Llama 4 Behemoth, from its original April launch to an unspecified date in the fall. The model's multiple delays, reported by The Wall Street Journal, come as critics inside and outside of Meta question whether large generative AI models have reached a performance plateau.
What’s Llama 4 Behemoth?
Llama 4 Behemoth is a large language model with 288 billion active parameters. Meta described Behemoth as "one of the smartest LLMs in the world and our most powerful yet to serve as a teacher for our new models."
Initially, Meta planned to debut Behemoth at its AI developer conference in April. The release was first delayed until June and has now been postponed again. Behemoth would be the largest version of Llama 4, Meta's latest flagship model. Meta claimed Behemoth outperformed OpenAI's GPT-4.5, Anthropic's Claude 3.7 Sonnet, and Google's Gemini 2.0 Pro on several STEM benchmarks.
Meta has already used Behemoth to train its smaller Llama 4 models, Scout and Maverick.
Meta competes with OpenAI, Google, Anthropic, xAI, and other companies in the generative AI market.
SEE: Groq's infrastructure enables high-speed output for Meta's Llama API.
Doubts raised about AI performance leaps
According to The Wall Street Journal, some Meta employees are questioning whether Behemoth offers a significant enough improvement over its predecessors to warrant a public release. At the same time, senior executives are blaming the Llama 4 team for not making enough progress.
These internal concerns echo broader doubts within the AI industry about the pace and cost of advancing generative AI. Some experts warn that further gains may come with prohibitively high costs and slower development cycles, making it difficult to sustain the rapid pace of product launches seen in recent years from companies like Meta and OpenAI.
"Right now, the progress is quite small across all the labs, all the models," Ravid Shwartz-Ziv, an assistant professor and faculty fellow at New York University's Center for Data Science, told The Wall Street Journal.