bunch of GPUs that is allegedly "bigger than anything" else reported so far.
Although Llama 4 is still in development, with an initial launch expected early next year, Zuckerberg told investors: "We're training the Llama 4 models on a cluster that is bigger than 100,000 H100s, or bigger than anything that I've seen reported for what others are doing," referring to the Nvidia chips used to train AI systems. He added, "I expect that the smaller Llama 4 models will be ready first."
Scaling up computing power and training data is widely believed to be the key to building more capable AI models. Meta currently appears to be among the leaders on this front, though several of its competitors are also reported to be training their models on compute clusters of more than 100,000 chips.
This March, Meta and Nvidia shared details about clusters of roughly 25,000 H100s used to develop Llama 3. And in July, Elon Musk said his xAI venture had worked with X and Nvidia to set up 100,000 H100s, writing on his social media platform X that "It's the most powerful AI training cluster in the world!"
On Wednesday, however, Zuckerberg declined to offer further details about Llama 4's capabilities and potential, describing it only as "much faster." And although Meta presents its generative AI models under the Llama name as "open source," the license still places some limitations on commercial use.
One of Meta's main concerns around Llama 4 is the amount of power it requires and access to that power. A cluster of 100,000 H100 chips is estimated to need around 150 megawatts. For comparison, El Capitan, the largest supercomputer at a U.S. national lab, requires 30 megawatts.
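The 150-megawatt figure can be roughly sanity-checked with back-of-the-envelope arithmetic. The sketch below assumes a per-GPU board power of about 700 W (the commonly cited H100 SXM TDP) and a ~2x multiplier for host CPUs, networking, and cooling; neither number comes from the article itself.

```python
# Back-of-the-envelope power estimate for a 100,000-GPU H100 cluster.
# Assumptions (not stated in the article): ~700 W per H100 SXM board,
# and a ~2x facility overhead factor (host servers, fabric, cooling/PUE).
NUM_GPUS = 100_000
GPU_TDP_WATTS = 700        # assumed H100 SXM board power
OVERHEAD_FACTOR = 2.0      # assumed overhead multiplier

gpu_power_mw = NUM_GPUS * GPU_TDP_WATTS / 1e6    # GPUs alone: 70 MW
total_power_mw = gpu_power_mw * OVERHEAD_FACTOR  # with overhead: 140 MW

print(f"GPUs alone: {gpu_power_mw:.0f} MW")
print(f"With overhead: {total_power_mw:.0f} MW")
```

Under these assumptions the GPUs alone draw about 70 MW, and the overhead factor brings the total to roughly 140 MW, consistent with the ~150 MW estimate cited in the article.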
Zuckerberg also defended the company's open-source approach: "It seems pretty clear to me that open source will be the most cost-effective, customizable, trustworthy, performant, and easiest to use option that is available to developers," he said Wednesday, adding, "And I am proud that Llama is leading the way on this."
He also noted that Meta AI now reaches more than 500 million people monthly. Susan Li, Meta's CFO, said: "There will be a broadening set of queries that people use it for, and the monetization opportunities will exist over time as we get there."