Space and Lemon Innovations
6 min read · Apr 3, 2024


But are they really that far?

DALL-E prompt: an image for an article on the Chinese taken by surprise by ChatGPT.

“And China?” Every time we present the list of Large Language Models (from OpenAI to Mistral), the recurring question comes: “And where are the Chinese in this race?” China has jumped decisively into the field, but it is not as far along as we imagine.

AI is the next big leap in economic efficiency, and a big race altogether. Especially since the launch of ChatGPT, which showed the world the incredible capacity of the transformer model compared to other machine-learning models. The world? Yes, it has become a worldwide competition, one that China is entering massively. With its own ecosystem. Remember? Google, Facebook and of course ChatGPT are not available there.

China decided to be part of that movement and is moving fast into the generative-AI sphere, with a strong ecosystem, large investments, big Chinese tech firms, and restrictions as well. Large Language Models (LLMs) are the basis of what we all call generative AI; GPT is one such model. LLMs from China? A strategic matter. The motivation in China is twofold, geopolitical and geo-economic. We’ll cover only the economic and tech side of it.

China, first taken by surprise, then investing strongly.

In the last 10 years, AI has gathered large investments in China, often represented by two flagships, SenseTime and Face++, both strong in image recognition, from facial identification to object detection for self-driving cars.

The new wave is LLMs, and China is jumping in. It came as a surprise: Chinese language models had concentrated on the earlier paradigm of Natural Language Processing (NLP), understanding Chinese and acting as a personal assistant (like Alexa). In short, they were based on traditional machine-learning models.

In 2023, China boasted about how many LLMs it had: 140 of them. Now the official number is 40, the number approved by the regulator. It takes a lot of resources to build an LLM: hundreds of millions of dollars devoted to servers, training and talent. As always in China, it is difficult to double-check the numbers, but one thing is sure: China is in the race with large investments, above all from big tech companies like Alibaba, Baidu and Tencent. They are the ones with the server and cloud resources to run such models. And historically they have been strong in AI, specifically image recognition and Chinese language understanding. Understanding Chinese has been a challenge, with its ideograms and the dialect variations across mainland China.

When looking into the details, most models are hybrid (Baidu’s Ernie), small LLMs with specific training, or based on open-source models like Llama (Meta). However, none would compete with GPT, Gemini or Claude, the three major LLM powerhouses. Still, Chinese experiments follow what we see in the West: multi-modal generation, conversation, summarisation, copywriting, maths, reasoning, memory and SLMs (Small Language Models). And they have an advantage: a high quantity of user data (demographics).

Limitations ahead, strong regulation.

Very often, we in the West tend to think the Chinese are much further along in AI than they really are. A difficult thing to evaluate, since research and statistics can be unreliable. One thing is sure: LLM development in mainland China faces limitations. First, as for any LLM, the availability of processors. This is made even harder by the US export bans on AI chips (NVIDIA) and chip-making machines (ASML) to China.

The other limitation is regulation: the LLMs must pass a central approval process run by the Cyberspace Administration of China (CAC). Developers are made responsible for “inappropriate content”. The process regulates every political aspect, from Taiwan to Covid. Here, the government controls which cultural and knowledge sources the LLMs have been trained on; they have to be in line with the central line. This explains why neighbouring regions work on their own LLMs and why no other country uses a Chinese LLM.

The high number of LLMs is, paradoxically, evidence of limitation. There are few general-purpose LLMs; most are small models covering specific use cases.

The LLM players in China.

Similar to the US, the larger technology companies have historically invested in AI, mainly Baidu and Alibaba. Let’s remember that Alibaba is a large investor in Face++ and SenseTime, which we mentioned above. But there are new players stemming from AI research. Here we cover Moonshot and the company behind the open-source Yi model.

Alibaba, LLM for enterprise

Alibaba is the enterprise-cloud leader in China and plays the role Microsoft Azure plays in the West. Tongyi Qianwen 2.0 (Oct. 2023) is the LLM available on AliCloud. Qianwen means roughly “a thousand questions”; the model is often called Qwen. It is a general LLM and framework similar to GPT, with variations trained for specific industries.

Alibaba also offers a variation for South-East Asia (SeaLLM) integrating local Chinese dialects, Thai, Malay and Latin script.

Baidu, LLM in the assistant ERNIE Bot (Wenxin)

ERNIE is the Chinese Alexa, over seven years old. In its latest version (Ernie 4.0), an LLM was added: an in-house development called PLATO. Ernie continues to function like a big search engine, though. The LLM does what it does best: understand and generate natural language. Ernie has a reach of 100 million users (says Baidu’s CTO, December 2023). It stems from search giant Baidu, which gathered billions of conversations and built a knowledge graph from the search engine’s crawls.

ERNIE is a “hybrid LLM”, with PLATO serving as a dialogue-enhancement tool. Ernie is a competitor to ChatGPT, not to GPT.

Tencent with Hunyuan LLM

Tencent enters the LLM race with lower ambitions. It introduced a model called Hunyuan in September 2023, with a strong focus on understanding Chinese. It is integrated into 50 Tencent products, mainly for the enterprise: Tencent Cloud and Tencent Meeting (the Chinese Teams).

Moonshot AI, the Chinese OpenAI

Yuezhi Anmian (Moonshot) is China’s LLM shooting star, with a recent investment of 1 billion USD, mainly from Alibaba. It resembles Microsoft investing massively in OpenAI. Moonshot’s LLM differentiates itself above all with long input capacities. Their “ChatGPT” is called Kimi.

Yi, the open-source model

The LLM field is divided between closed models (OpenAI, Anthropic) and open-source ones (Meta, Mistral). Yi is an open-source model, available on Hugging Face and GitHub. Behind it are the Beijing-based venture-capital firm Sinovation Ventures and… Alibaba. A model that can also be used for academic purposes.

Other LLMs (non-exhaustive): ByteDance (Doubao), Zhipu AI (ChatGLM), Meituan (LightYear acquisition), Xiaomi, 4Paradigm (SageGPT), Baichuan (open-source, backed by Alibaba), LangBoat (Mengzi GPT), SenseTime (SenseNova), MiniMax (backed by Alibaba and Tencent), iFlytek (Spark LLM), Vivo Mobile.

High stakes behind LLMs. Especially for China.

Large Language Models hold a high promise of efficiency leaps in office work and manufacturing. They are bound to become formidable personal assistants, even replacing internet search as we know it today. The stakes are high, and that explains the race we are in. China will be part of it. Economic, political and societal implications for the Middle Kingdom, torn between restriction and hope for a magic technology. Are they that far? No. Are they aware of it? Yes.

Hamburg, 03.04.2024. Research 05.03.-03.04.2024. A comprehensive list of sources is posted in the comments.

Author: Laurent Burdin (LinkedIn) is the founder of Space and Lemon Innovations, scouting and analysing digital trends, with a passion for AI. He has been observing China’s tech development since 2015. His approach to China comes first through history, then the Mandarin language and calligraphy.

Header image DALL-E 3. Prompt: please generate an image for an article on the Chinese taken by surprise by ChatGPT. Comment: Here’s the image for your article. It depicts a traditional Chinese teahouse scene where people are amazed and curious about interacting with ChatGPT on their devices, blending ancient culture with modern technology.


