Chinese artificial intelligence lab DeepSeek roiled markets in January, setting off a massive tech and semiconductor selloff after unveiling AI models that it said were cheaper and more efficient than those of its US rivals.
David Sacks, U.S. President Donald Trump's AI and crypto czar, says OpenAI has evidence that Chinese company DeepSeek used a technique called "distillation" to build a rival model.
There is a new wrinkle in the saga of DeepSeek’s recent announcement of a highly capable R1 model that combines strong performance with low cost.
Model distillation is one of the technology trends that Gartner’s 2025 Hype Cycle for artificial intelligence (AI) identifies as having reached the maturity stage it calls “the slope of enlightenment”.
Distillation, also known as model or knowledge distillation, is a process where knowledge is transferred from a large, complex AI ‘teacher’ model to a smaller and more efficient ‘student’ model. Doing so produces a compact model that retains much of the teacher’s capability while being cheaper and faster to run.
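To make the idea concrete, the sketch below shows the classic distillation loss in PyTorch: a small student network is trained against both the softened outputs of a frozen teacher and the original labels. The model sizes, temperature and loss weighting here are illustrative assumptions, not a description of any particular lab’s pipeline.

```python
# A minimal sketch of knowledge distillation (in the style of Hinton et al.),
# not any specific company's training code. Model shapes and hyperparameters
# are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend a soft-target loss (match the teacher) with a hard-target loss (match labels)."""
    # Soften both output distributions with a temperature, then measure how far
    # the student's distribution is from the teacher's (KL divergence).
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_loss = F.kl_div(soft_student, soft_targets,
                         reduction="batchmean") * temperature ** 2
    # Ordinary cross-entropy against the true labels keeps the student grounded.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

# Toy usage: a larger "teacher" and a smaller "student" classifier over 10 classes.
teacher = nn.Sequential(nn.Linear(32, 256), nn.ReLU(), nn.Linear(256, 10))
student = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

x = torch.randn(16, 32)               # a batch of dummy inputs
labels = torch.randint(0, 10, (16,))  # dummy ground-truth labels

with torch.no_grad():                 # the teacher is frozen; only the student learns
    teacher_logits = teacher(x)
loss = distillation_loss(student(x), teacher_logits, labels)
loss.backward()
optimizer.step()
```

The key design choice is the temperature: raising it spreads the teacher’s probability mass across more classes, so the student learns from the teacher’s relative preferences rather than only its top answer.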