Event Reports
December 15, 2025
Architecture Innovation Drives the Hardware Revolution | RockAI CMO Zou Jiasi at 2025 GAIR Summit: "Breaking Free from the Constraints of Transformer, Letting Intelligence Redefine Hardware"
On December 12, the 8th GAIR Summit (the 8th Global Artificial Intelligence and Robotics Conference) was held in Shenzhen. RockAI CMO Zou Jiasi delivered a keynote speech titled “Breaking Free from the Constraints of Transformer, Letting Intelligence Redefine Hardware,” in which he explored the limitations of cloud-based models, the importance of on-device autonomous learning, and the future direction of the industry.
At the start of his speech, Zou Jiasi pointed out that cloud-based pipelines are highly inefficient. Some voice commands are parsed in the cloud and then transmitted back to the device, wasting at least 50% of the transmission cost. The daily cost of operating cloud-based models worldwide has reached the scale of trillions to quadrillions, yet their effective utilization remains questionable. Meanwhile, signs of an industry shift have already emerged: OpenAI, for example, plans to launch hardware products next year, signaling a strategic transition from cloud-based architecture to on-device intelligence.
Regarding the current industry consensus that "more data, greater computing power, higher talent density, and larger parameters = better models," Zou Jiasi raised his own doubts: computing power requirements are stifling innovation and costing small teams their opportunities. Architectures represented by the Transformer tend to compress intelligence into static functions, which cannot generate new intelligence. A larger parameter count does expand the capacity of the function, but it does not truly generate knowledge.

He believes that the most important capabilities of future intelligent hardware are natively built-in memory and autonomous learning. In his speech, he discussed morphological memory and knowledge memory, arguing that without sufficient memory it is difficult to speak of the personalization and evolution of models.

He emphasized that large-model development should shift from fixed tools to continuous learning, and from updating model knowledge every three or six months to real-time growth. While the self-attention mechanism solves the model's correlation problem, it causes costs to surge. The industry may face limits on parameter scale in the future, so architectural innovation is needed, along with active exploration of combined cloud and edge solutions, in the hope of achieving interconnectivity between devices and ultimately reaching collective intelligence.