The science behind the foundation models underpinning generative AI assistants has been around for 15 years, but the recent explosion of providers has been fueled by three interdependent factors:
That last factor has figured prominently in the race between the US and China for AI supremacy, as the previous US administration ramped up restrictions on exports of advanced chips and other technology to China.
In media interviews, DeepSeek’s founder Liang Wenfeng described this “chip ban” as the major challenge to developing his company’s model—but it was a challenge his team claims to have overcome.
Leading AI models in the West are estimated to have been trained on some 16,000 specialized chips, but DeepSeek says its model was trained with about 2,000 high-performance chips plus thousands of lower-grade ones. The training was done without NVIDIA's latest chips (a disclosure that caused turbulence in NVIDIA's stock price), and the company drew on open-source models such as Alibaba's Qwen LLM and Meta's open-source Llama architecture.
So, what is DeepSeek?
Launched in app stores in the UK and the US on January 20, 2025, it quickly became the most downloaded free app in both markets.