
Meta uses Llama models extensively within its operations. For instance, Meta AI, the company’s assistant powered by Llama, serves nearly 600 million monthly active users. CEO Mark Zuckerberg has stated that Meta AI is on track to become the most widely used AI assistant globally.
However, the models’ open availability has posed challenges. In one case, Chinese military researchers reportedly used a Llama model to develop a defense chatbot. In response, Meta made its Llama models available to U.S. government agencies and defense contractors.
The company has also faced regulatory concerns in Europe. Provisions of the EU’s AI Act and GDPR prompted Meta to pause training its models on European users’ data while it works to ensure compliance. Meta continues to advocate for updated regulations that balance innovation with privacy protections.
To support the development of future models, Meta is ramping up its computing infrastructure. The company recently announced a $10 billion investment in a new AI data center in Louisiana, its largest to date. Training the next iteration of Llama models, such as Llama 4, will require ten times the computing power used for Llama 3, according to Zuckerberg. To meet this demand, Meta has secured over 100,000 Nvidia GPUs, placing its resources on par with major competitors.
Building and training generative AI models is an expensive endeavor. Meta’s capital expenditures rose 33% year over year in Q2 2024, reaching $8.5 billion, up from $6.4 billion in the same quarter a year earlier. These costs are driven by investments in servers, data centers, and network infrastructure necessary for advancing its AI capabilities.
Llama 3.3 70B represents Meta’s ongoing commitment to innovation in generative AI, balancing performance improvements with cost-efficiency while navigating regulatory and operational challenges.