China Telecom says AI model with 1 trillion parameters trained with Chinese chips

A Chinese state-owned carrier said it has developed two large language models (LLMs) trained entirely with domestically produced chips, illustrating the progress China has made in its effort to achieve chip autonomy in artificial intelligence (AI).


The Institute of AI at China Telecom, one of the country’s large state-backed telecoms operators, said in a statement on Saturday that its open-source TeleChat2-115B and a second unnamed model were trained on tens of thousands of domestically produced chips, marking a milestone amid tightening US restrictions on China’s access to advanced semiconductors, including Nvidia’s latest AI chips.

The achievement “indicates that China has truly realised total self-sufficiency in domestic LLM training” and marks the start of a new phase for China’s innovation and self-reliance in LLMs, the technology behind OpenAI’s ChatGPT, the AI institute said in a statement published to WeChat.


China Telecom said the unnamed model has 1 trillion parameters – the internal variables an AI system adjusts during training. The sophistication and effectiveness of an AI model depend largely on the number of parameters involved in the training process. TeleChat2-115B has over 100 billion parameters, the company said.

China Telecom is believed to be using chips from Huawei to train AI models. Photo: AFP

