- Huawei unveils a new Pangu AI model trained on Ascend chips
Pangu “Pro MoE” 72B is a 72-billion-parameter hybrid expert (Mixture-of-Experts) model. It combines statistical models that analyze data with symbolic AI to provide insights into meaning. The Chinese tech giant trained the Pangu Pro MoE 72B AI model on its Ascend GPU and NPU chips.
- Huawei releases Pangu Pro MoE 72B, a language model trained . . .
Jul 02, 2025 23:00:00 Huawei releases 'Pangu Pro MoE 72B', a language model trained in China's AI ecosystem, and open-sources inference technology
- Huawei defends AI models as home-grown after whistle-blowers . . .
Huawei used an open-sourced artificial intelligence (AI) model called Pangu Pro MoE 72B, which had been trained on Huawei’s home-developed Ascend AI chips. However, an account on the open-source
- Why did Huawei open source the Pangu model? - iMedia
On June 30, Huawei officially announced the open-sourcing of Pangu's 7-billion-parameter dense model, the Pangu Pro MoE 72-billion-parameter hybrid expert model, and Ascend-based model inference technology.
- Huawei open-sources Pangu AI models, optimized for Ascend chips
Huawei has open-sourced its Pangu AI models, including a 7-billion-parameter model and a 72-billion-parameter Pangu Pro MoE (Mixture-of-Experts) model. The release also features model
- Huawei’s AI Lab Fends Off Accusations It Copied Rival Models
The Pangu Pro MoE is the world’s first model of its kind to be trained on Ascend chips — Huawei’s answer to Nvidia Corp.’s AI accelerators — the lab said in a WeChat post over the weekend.
- China’s Huawei goes the open-source route to boost AI models . . .
The firm revealed the open-sourcing of its Pangu dense model with 7 billion parameters, the Pangu Pro MoE (Mixture-of-Experts) model with 72 billion parameters, and its model inference technology based on Ascend, which serves as the platform for its AI infrastructure.