Revolutionizing AI: Yuanxiang Unveils China’s Largest MoE Open Source Large Model, Tops the Hong Kong and Taiwan Charts
Yuanxiang XVERSE Released China’s Largest MoE Open Source Model
Accelerating Low-Cost AI Applications
Yuanxiang XVERSE has released China’s largest MoE open source model, XVERSE-MoE-A36B, accelerating the low-cost deployment of AI applications and raising domestic open source to an internationally leading level. The model has 255B total parameters with 36B activated per token, delivering a “cross-level” performance leap comparable to dense models in the 100B class. At the same time, training time is reduced by 30%, inference performance is doubled, and the cost per token is greatly reduced.
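The cost claim rests on a basic property of MoE models: per-token compute scales with the activated parameters, not the total. A minimal sketch using only the figures stated above (255B total, 36B activated, matching the “A36B” suffix in the model name):

```python
# Per-token compute in a MoE model is driven by activated parameters,
# not total parameters. Figures are those stated in the article.
total_params_b = 255   # total parameters, in billions
active_params_b = 36   # parameters activated per token, in billions

active_fraction = active_params_b / total_params_b
print(f"{active_fraction:.1%} of parameters active per token")  # ~14.1%
```

So each token touches roughly one seventh of the model, which is why a 255B-parameter MoE can serve at a cost closer to a 36B dense model.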
High-Performance Open Source Model Suite
Yuanxiang’s “high-performance model suite” is fully open source and unconditionally free for commercial use, allowing small and medium-sized enterprises, researchers, and developers to choose models according to their needs.
MoE Technology Self-Development and Innovation
MoE is among the most cutting-edge model architectures in the industry. Because the technique is relatively new, it has not yet been widely adopted in domestic open source models or academic research. Yuanxiang has developed its own efficient training and inference framework for MoE and continues to drive technological innovation.
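The core idea of a MoE layer can be sketched briefly: a gating network scores a set of expert sub-networks for each token, and only the top-k experts actually run. This is a minimal, hedged illustration of top-k routing in general, not XVERSE’s actual implementation; the expert networks here are stand-in linear maps.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over expert scores.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def moe_forward(x, gate_w, experts, top_k=2):
    """Route token vector x to the top_k experts by gate score,
    then combine their outputs weighted by the renormalized gate."""
    scores = softmax(x @ gate_w)           # one score per expert
    top = np.argsort(scores)[-top_k:]      # indices of the top_k experts
    weights = scores[top] / scores[top].sum()
    out = np.zeros_like(x)
    for w, i in zip(weights, top):
        out += w * experts[i](x)           # only top_k experts execute
    return out

rng = np.random.default_rng(0)
d, n_experts = 8, 4
gate_w = rng.normal(size=(d, n_experts))
# Each "expert" is just a small linear map for illustration.
expert_ws = [rng.normal(size=(d, d)) for _ in range(n_experts)]
experts = [lambda v, w=w: v @ w for w in expert_ws]

x = rng.normal(size=d)
y = moe_forward(x, gate_w, experts, top_k=2)
print(y.shape)  # (8,)
```

Because only top_k of the n_experts sub-networks run per token, total capacity grows with the expert count while per-token compute stays roughly constant, which is the scaling property the article’s cost figures rely on.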

Leading Entertainment Applications
Leveraging its existing customer base in the AI and 3D fields, Yuanxiang has also moved quickly to commercialize its large models.

Download the Model for Free
