Meta unveiled its roadmap of four new in-house AI chips on Wednesday, as the company pushes to expand its infrastructure to keep pace with surging AI demand.
The chips are part of the Meta Training and Inference Accelerator (MTIA) program. The first chip, the MTIA 300, is already deployed and powers Meta's ranking and recommendation systems across its platforms.
The remaining three chips — the MTIA 400, 450, and 500 — will be released across the rest of 2026 and into 2027. The final two are designed specifically for inference workloads.
Inference is the process by which an AI model responds to user queries — the part users actually experience. It’s a different, and increasingly critical, workload compared to training large models from scratch.
Meta has had some wins with inference chips before. Training chips, though, have proven a tougher challenge: the company has long aimed to build a generative AI training chip but hasn't fully cracked it yet.
Starting with the MTIA 400, Meta has designed an entire server system around the chip, a setup roughly the size of several server racks that includes liquid cooling. That's a step up from designing a processor in isolation.
Custom chips let Meta optimize for its own workloads rather than relying entirely on general-purpose processors. The payoff? Lower energy use and better cost efficiency at scale.
That said, Meta isn’t going fully DIY. The company contracts Broadcom (AVGO) to help design certain elements, and uses Taiwan Semiconductor Manufacturing Co (TSMC) to fabricate the final processors.
In February, Meta also signed large deals with Nvidia (NVDA) and AMD (AMD) to purchase tens of billions of dollars worth of chips — so off-the-shelf hardware remains part of the mix.
Meta said in January that it expects capital expenditure of between $115 billion and $135 billion in 2026. That’s a substantial commitment to infrastructure and underlines why in-house chip design matters — at that spending level, even marginal efficiency gains translate to real money.
The six-month cadence for new chip releases reflects both the pace of Meta’s build-out and the urgency it sees around AI infrastructure. Song confirmed the rollout schedule is tied directly to how fast the company is expanding its data center footprint.
The MTIA 450 and 500 — the final two chips in this current roadmap — are slated for 2027 and are squarely aimed at inference, the workload Meta says is seeing the most rapid growth right now.
Meta stock (META) was up 0.17% on Wednesday as the announcement was made.
The post Meta Stock: Company Reveals Custom AI Chip Plans as Data Center Expansion Accelerates appeared first on CoinCentral.


