Event Reports

December 15, 2025

Architecture Innovation Drives the Hardware Revolution | RockAI CMO Zou Jiasi at the 2025 GAIR Summit: "Breaking Free from the Constraints of the Transformer, Letting Intelligence Redefine Hardware"


On December 12, the 8th GAIR Summit (the 8th GAIR Global Artificial Intelligence and Robotics Conference) was held in Shenzhen. RockAI CMO Zou Jiasi delivered a keynote speech titled "Breaking Free from the Constraints of the Transformer, Letting Intelligence Redefine Hardware," exploring in depth the limitations of cloud-based models, the importance of on-device autonomous learning, and the future direction of the industry.


At the beginning of his speech, Zou Jiasi pointed out that cloud-based pipelines involve significant inefficiencies: some voice commands are parsed in the cloud and then transmitted back to the device, with at least 50% of the transmission cost wasted. He noted that the daily cost of operating cloud-based models worldwide has reached an enormous scale, yet their effective utilization remains questionable. Meanwhile, signs of an industry shift have already emerged. For example, OpenAI plans to launch hardware products next year, signaling a strategic transition from cloud-based architecture to on-device intelligence.


Regarding the current industry consensus that "more data, greater computing power, higher talent density, and larger parameters = better models," Zou Jiasi raised his own doubts: computing power is stifling innovation and causing small teams to lose opportunities. Architectures represented by the Transformer tend to compress existing knowledge into static functions rather than generate new intelligence. A larger parameter count does expand the capacity of such a function, but it does not truly create knowledge.




He believes that the most important capabilities of future intelligent hardware are native built-in memory and autonomous learning. In his speech, he discussed morphological memory and knowledge memory, arguing that without sufficient memory it is difficult to speak of model personalization and evolution.




He emphasized that large models should develop from fixed tools into continuously learning systems, moving from knowledge updates every three or six months to real-time growth. While self-attention mechanisms solve the problem of modeling correlations, they cause costs to surge, and the industry may eventually hit limits on parameter scale. Architectural innovation is therefore needed, along with active exploration of combined cloud and edge-side solutions, in the hope of achieving interconnectivity between devices and ultimately reaching the goal of collective intelligence.

