Alibaba's Latest AI Model Beats OpenAI's o1-mini, On Par With DeepSeek R1

Content Editor (decrypt.co)

- 2025-03-07

Chinese tech giant Alibaba's cloud computing division has released its new AI model, QwQ-32B, which challenges the notion that bigger is always better in AI. Built on Alibaba's Qwen2.5-32B foundation, the model uses 32.5 billion parameters while delivering performance comparable to DeepSeek R1, which has 671 billion parameters.

QwQ-32B excels at mathematical reasoning and coding tasks and has received positive feedback from the AI community. It scored well on various benchmarks, though it has known limitations, such as language mixing and falling into recursive reasoning loops. Unlike many advanced AI systems, QwQ-32B is open source. Alibaba describes the release as a first step in scaling reinforcement learning to enhance reasoning capabilities and move closer to Artificial General Intelligence.
