1. In April 2021, Huawei Cloud released the Pangu series of ultra-large-scale pre-trained models, including one of the world's largest visual (CV) pre-trained models with 3 billion parameters, and the world's largest Chinese-language (NLP) pre-trained model, jointly developed with Circular Intelligence and Pengcheng Laboratory, with 100 billion parameters trained on 40TB of data. Huawei Cloud subsequently plans to release pre-trained models for additional domains, such as multimodal and scientific computing.
2. Pangu's design rests on three core principles: first, an ultra-large neural network that can absorb massive amounts of data; second, a powerful network architecture that delivers top performance; third, strong generalization ability that extends coverage to more than ten times as many scenarios, making the model an all-round champion.