Meta has announced four new AI chips, all developed in partnership with Broadcom and scheduled for deployment within the next two years. The MTIA processors are the tech giant's latest attempt to build its own AI hardware, even as it continues to spend billions on gear from industry leaders such as Nvidia.
Custom MTIA silicon remains central to the company's AI infrastructure strategy, Meta said, with four new generations of MTIA chips forthcoming in the next two years.
The timing is awkward, even by Silicon Valley standards. Last week, reports emerged that Meta had quietly killed its most ...
Dubbed the MTIA 300, 400, 450, and 500, the chips are part of the company's Meta Training and Inference Accelerator (MTIA) program, a family of custom-built silicon designed for efficient AI workloads and first developed in 2023. The MTIA 300 is already deployed for ranking and recommendation workloads, while the MTIA 400 has ...
The most advanced chip in the lineup, the MTIA 500, can provide 10 petaflops of performance when processing MX8 data. It also supports a more efficient data format called MX4. The latter technology ...
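MX8 and MX4 are block-scaled number formats: a group of values shares a single scale factor while each element is stored at low precision, which is how a 4-bit format like MX4 can trade accuracy for density against an 8-bit one. As a rough illustration only (the block size, rounding, and scale encoding here are simplified assumptions, not Meta's or the formal spec's exact scheme), a toy version looks like:

```python
import math

def mx_quantize(block, elem_bits):
    """Toy block-scaled ("MX-style") quantization: the whole block shares one
    power-of-two scale, and each element is stored as a small signed integer.
    Illustration only -- a simplified sketch, not the real hardware format."""
    amax = max(abs(x) for x in block)
    if amax == 0.0:
        return [0.0] * len(block)
    qmax = 2 ** (elem_bits - 1) - 1  # 7 for 4-bit elements, 127 for 8-bit
    # Smallest power-of-two scale that fits the largest element in range.
    scale = 2.0 ** math.ceil(math.log2(amax / qmax))
    # Quantize to an integer grid, then dequantize back to floats.
    return [max(-qmax, min(qmax, round(x / scale))) * scale for x in block]

block = [0.11, -0.52, 0.03, 0.97]
q8 = mx_quantize(block, 8)  # 8-bit elements: finer steps, smaller error
q4 = mx_quantize(block, 4)  # 4-bit elements: half the bits, coarser steps
```

The shared scale is the key design choice: storing one scale per block instead of per element keeps most of the dynamic range of wider formats while halving (or quartering) the bits spent on each value.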
Meta’s new generation of MTIA AI chips highlights how hyperscalers are redesigning the infrastructure stack, from silicon and interconnects to rack density, cooling, and ...