AI distillation could shrink models and cut costs
The AI industry is undergoing a transformative shift: the use of distillation to make AI models smaller and cheaper to run. This trend, spearheaded by companies like DeepSeek and OpenAI, is reshaping the AI ...
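The snippets above describe distillation only at a high level. As a rough illustration of the core idea, a small "student" model is trained to match the softened output distribution of a larger "teacher" model. The sketch below shows a minimal version of that objective (temperature-scaled KL divergence between teacher and student probabilities); the function names and temperature value are illustrative, not taken from any of the systems mentioned above.

```python
import math

def softmax(logits, temperature=1.0):
    # Scale logits by temperature, then normalize to probabilities.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) on temperature-softened distributions:
    # the student is rewarded for matching the teacher's full
    # probability distribution, not just its top prediction.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return temperature ** 2 * kl

# A student that exactly matches the teacher incurs zero loss;
# any mismatch yields a positive loss to minimize during training.
teacher = [2.0, 1.0, 0.1]
print(distillation_loss(teacher, teacher))            # 0.0
print(distillation_loss(teacher, [0.5, 0.5, 0.5]) > 0)  # True
```

Because the student learns from the teacher's soft probabilities rather than hard labels alone, a much smaller network can recover much of the larger model's behavior, which is what makes distilled models cheaper to serve.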
Microsoft researchers have developed On-Policy Context Distillation (OPCD), a training method that permanently embeds enterprise system prompt instructions into model weights, reducing inference ...
The original version of this story appeared in Quanta Magazine. The Chinese AI company DeepSeek released a chatbot earlier this year called R1, which drew a huge amount of attention. Most of it ...
OpenAI and Anthropic allege improper distillation of their models. Investors have pushed Chinese AI valuations sky-high anyway—raising a harder question about pricing power.