Notably, according to South China Morning Post, Alibaba Cloud’s flagship Qwen model family had surpassed 700 million downloads by January 2026 on the developer platform Hugging Face, making it the world’s most widely used open-source AI system. The report adds that Qwen’s momentum stems from Alibaba Cloud’s strategy of open-sourcing a broad lineup of models, from lightweight versions with 600 million parameters to large models with tens of billions of parameters.
Meanwhile, DeepSeek may be preparing a new model. As Commercial Times points out, around the one-year anniversary of the Chinese AI startup’s release of its R1 reasoning model, a new project named “MODEL1” quietly appeared in the open-source community.
Nikkei’s “AI Model Ratings,” which assess the performance of leading models in Japanese, show that DeepSeek’s model released in December ranked ninth out of 92 models. The report points out that DeepSeek placed first among open-source models, followed by Alibaba Group, and outperformed the open-source models from Google and OpenAI.
These Chinese AI models have been gaining traction overseas. Nikkei notes that six of the top 10 models developed by Japanese companies, including those from emerging player ABEJA, are built on DeepSeek and Qwen (Tongyi Qianwen). The report adds that Japan’s National Institute of Informatics (NII) has also adopted Qwen to organize training data for its domestic AI development initiative, LLM-jp.