AI Firms Shift to Distillation
This is an OpenAI news story, published by Ars Technica, that relates primarily to Meta.
AI firms follow DeepSeek’s lead, create cheaper models with “distillation”

OpenAI, Microsoft, and Meta are turning to a process called "distillation" in the global race to create AI models that are cheaper for consumers and businesses to adopt.
The technique caught widespread attention after China's DeepSeek used it to build powerful and efficient AI models based on open source systems released by competitors Meta and Alibaba.
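Distillation trains a small "student" model to mimic the softened outputs of a large "teacher" model, so the student inherits much of the teacher's capability at a fraction of the cost. A minimal sketch of the core idea in plain Python (function names and example logits are illustrative, not from any particular implementation):

```python
import math

def softmax(logits, temperature=1.0):
    """Convert logits to probabilities; temperature > 1 softens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between the teacher's softened outputs and the student's.

    Minimizing this over training data pushes the small student model to
    reproduce the large teacher's behavior -- the core of distillation.
    """
    p = softmax(teacher_logits, temperature)   # teacher "soft labels"
    q = softmax(student_logits, temperature)   # student predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# A student that already matches the teacher incurs (near-)zero loss;
# a student with a very different distribution incurs a larger one.
teacher = [2.0, 0.5, -1.0]
close_student = [2.1, 0.4, -1.1]
far_student = [-1.0, 0.5, 2.0]
assert distillation_loss(teacher, teacher) < 1e-9
assert distillation_loss(teacher, close_student) < distillation_loss(teacher, far_student)
```

In a real training loop this loss (often mixed with a standard cross-entropy term on ground-truth labels) is backpropagated through the student only; the teacher's weights stay frozen.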