Alibaba Cloud, the cloud computing arm of China’s Alibaba Group Ltd., today announced the open-source release of more than 100 new artificial intelligence large language models as part of its Qwen 2.5 family.

Revealed at the company’s Apsara Conference, the new model series follows last year’s release of the company’s foundation model Tongyi Qianwen, or Qwen. Since then, the Qwen models have been downloaded more than 40 million times across platforms such as Hugging Face and ModelScope.

The new models range in size from as small as half a billion parameters to as large as 72 billion parameters. In an LLM, parameters are the internal values a model learns during training; they define its behavior and underpin its abilities in areas such as mathematics, coding or expert knowledge.

Smaller, more lightweight models can be trained quickly using far less processing power on more focused training sets and excel at simpler tasks. In contrast, larger models need heavy processing power and longer training times and generally perform better on complex tasks requiring deep language understanding.
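To put those parameter counts in perspective, a rough back-of-the-envelope sketch: just holding a model's weights in memory at half precision (fp16, two bytes per parameter) scales linearly with parameter count. The function below is illustrative arithmetic, not anything from Alibaba's release.

```python
def approx_weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Rough memory needed just to store the weights.

    Assumes fp16 storage (2 bytes per parameter) and ignores activations,
    KV cache and runtime overhead, which add substantially more in practice.
    """
    return num_params * bytes_per_param / 1e9

# The smallest and largest Qwen 2.5 sizes mentioned above.
for params in (0.5e9, 72e9):
    gb = approx_weight_memory_gb(params)
    print(f"{params / 1e9:g}B parameters -> about {gb:g} GB at fp16")
```

By this estimate, the half-billion-parameter model fits in about 1 GB, while the 72-billion-parameter model needs roughly 144 GB for weights alone, which is why the largest models demand multi-GPU servers while the smallest can run on consumer hardware.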

  • Skull giver

Asking it about Taiwan and Tiananmen Square told me everything I need to know about this AI model. Fun for chatbots, but clearly trained on lies and propaganda by the CCP.

    LLMs are already difficult to trust, but this one takes the cake.