Meta has taken a page out of Nvidia’s book – the social media company is now testing its first-ever internally developed chip for training AI. As per reports, the company, which owns Facebook, Instagram, Threads, and WhatsApp, has already deployed a small number of its new chips for testing. If the initial trials meet expectations, Meta is likely to ramp up production for broader deployment.

According to a report by Reuters, Meta’s new AI training chip is an accelerator that is designed specifically for AI-related computations. Unlike traditional GPUs, which handle a wide range of computing tasks, dedicated accelerators are optimized solely for AI workloads, making them potentially more power-efficient and cost-effective. The chip has been manufactured in partnership with Taiwan Semiconductor Manufacturing Company (TSMC), one of the world’s leading semiconductor foundries.

Meta’s testing phase began following the successful completion of a “tape-out,” a major milestone in chip development where the finalized design is sent to a fabrication facility for production. This stage typically costs tens of millions of dollars and can take between three and six months to complete, with no guarantee of success. If the chip fails to meet performance expectations, Meta would need to troubleshoot the design and repeat the process.

The new AI training chip is part of Meta’s broader Meta Training and Inference Accelerator (MTIA) program, which has had a mixed track record. Previous efforts to develop custom AI chips have faced setbacks, with at least one prior project being scrapped after failing to meet performance targets. However, Meta successfully deployed an MTIA chip last year for inference tasks—running AI models in real-time as users interact with them—demonstrating some progress in its chip development efforts.

Meta’s AI strategy is initially focused on using its in-house training chips to improve the recommendation algorithms that power content selection on Facebook and Instagram. These recommendation systems are a core component of Meta’s business model, influencing everything from social media engagement to digital advertising. In the long term, Meta plans to expand the use of its proprietary chips beyond recommendation systems to power its generative AI products. One such application is Meta AI, the company’s chatbot designed to compete with offerings from OpenAI and Google.

This development comes at a time when Nvidia dominates AI chip making and Meta faces ever-increasing infrastructure costs. The company has projected total expenses for 2025 to fall between $114 billion and $119 billion, with up to $65 billion allocated to capital expenditures. Whether the current trials will succeed remains to be seen, given the company’s mixed history with in-house AI hardware. In an earlier attempt to produce a custom AI inference chip, Meta abandoned the project after a small-scale deployment failed to deliver the desired performance. Following that failure, the company reversed course and placed substantial orders for Nvidia GPUs in 2022 to meet its AI infrastructure needs. Since then, Meta has remained one of Nvidia’s largest customers, purchasing GPUs to train its AI models.

Content originally published on The Tech Media – Global technology news, latest gadget news and breaking tech news.

