Towards Intelligent Robotics with Foundation Models, 6G and MELISAC

Robots are becoming increasingly intelligent and capable, moving beyond repetitive, pre-programmed actions towards autonomous systems that can understand, adapt and respond in real time. At the centre of this transformation is the integration of artificial intelligence (AI) foundation models with the emerging capabilities of sixth-generation (6G) mobile networks.

China's Fourteenth Five-Year Plan for Robot Industry Development highlights the need to enhance the intelligence and connectivity of robots by integrating AI, 5G, big data and cloud computing. The strategy seeks to ensure the functionality, networking and data security of robotic systems while driving forward the country's technological and industrial ambitions.

Traditional robotic systems rely heavily on deep learning models designed for specific tasks in structured environments. While effective in well-defined settings, these systems struggle to adapt to new or unpredictable scenarios. Foundation models represent a major shift by providing flexible, general-purpose intelligence. Pretrained on vast and diverse datasets, they can be adapted to many different tasks with minimal additional training. They allow robots to understand natural language instructions, interpret their environment using multiple sensors, and adjust their behaviour based on real-time input.
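
As a rough illustration of this pattern, the sketch below shows a single pretrained multimodal model turning a spoken instruction and the current sensor readings into a high-level action plan. All of the names here (VisionLanguageModel, plan_actions, the sensor fields) are hypothetical placeholders rather than any real robot or model API; the point is only the shape of the loop.

```python
"""Minimal sketch: one foundation model handles language, perception and planning."""
from dataclasses import dataclass
from typing import List, Protocol


@dataclass
class Observation:
    rgb_image: bytes      # latest camera frame (encoded)
    depth_image: bytes    # depth map from a second sensor
    transcript: str       # speech-to-text of the operator's words


class VisionLanguageModel(Protocol):
    """Stands in for any pretrained multimodal foundation model."""

    def plan_actions(self, instruction: str, obs: Observation) -> List[str]:
        ...


def control_step(model: VisionLanguageModel, obs: Observation) -> List[str]:
    # The same pretrained model grounds the instruction, interprets the
    # sensors and plans; no task-specific retraining is assumed here.
    return model.plan_actions(obs.transcript, obs)


if __name__ == "__main__":
    class EchoModel:
        """Toy stand-in so the sketch runs end to end."""

        def plan_actions(self, instruction: str, obs: Observation) -> List[str]:
            return [f"locate object mentioned in: '{instruction}'",
                    "grasp object",
                    "place object at requested location"]

    obs = Observation(rgb_image=b"", depth_image=b"",
                      transcript="put the cup on the tray")
    print(control_step(EchoModel(), obs))
```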

However, realising the full potential of these intelligent systems requires a new kind of network infrastructure, and this is where 6G comes in. Designed to bring together communication, sensing and computing, 6G is expected to deliver native AI capabilities along with integrated sensing and positioning. Robots will be able to access distributed intelligence and high-performance computing resources directly from the network, while low-latency, reliable communication will support real-time decision making and control.
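
To make the latency requirement concrete, here is a back-of-the-envelope budget for a network-assisted control loop. The loop rate and per-stage delays are illustrative assumptions, not figures taken from the 6G specifications or the Huawei paper.

```python
# Illustrative latency budget for network-assisted control (all numbers
# are assumptions): a robot running a 100 Hz control loop has 10 ms per
# cycle, and the radio round trip plus edge inference must fit inside it.
control_rate_hz = 100
cycle_budget_ms = 1000 / control_rate_hz   # 10 ms per control cycle

radio_round_trip_ms = 2.0   # assumed over-the-air latency, both directions
edge_inference_ms = 5.0     # assumed time for the edge-hosted model to respond

remaining_ms = cycle_budget_ms - radio_round_trip_ms - edge_inference_ms
print(f"Budget left for on-robot processing: {remaining_ms:.1f} ms")  # 3.0 ms
```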

A recent Huawei paper, Robots Empowered by AI Foundation Models and the Opportunities for 6G, presents a state-of-the-art analysis of current developments in this area. It describes how 3GPP's SA1 working group has identified a range of use cases for service robots, including real-time cooperative safety protection, smart communication and sensor fusion, and the deployment of autonomous and remotely operated robots in areas such as mining and delivery. Several technical aspects are also discussed, such as tactile and multimodal communication, integrated sensing and communication, the metaverse and high-level control frameworks.

The one6G association is also active in this space. It is working to evolve, test and promote next generation wireless communication technologies, and sees robotics as a key application area across multiple industries and sectors. Its publicly available whitepapers explore how 6G can support robotic systems through communication, artificial intelligence and integrated sensing. These use cases range from collaborative robots and action planning to healthcare, industrial automation and disaster response, and have been explored in our earlier blog posts.

Huawei's vision goes a step further with the introduction of MELISAC, which stands for Machine Learning Integrated Sensing and Communication. MELISAC is a prototype robot that combines a dual-arm collaborative unit with an autonomous mobile platform. It carries a wide array of sensors, including RGB cameras, microphones and sub-terahertz radios for advanced communication and sensing.

The robot connects to AI agents hosted at the network edge. These agents are responsible for understanding human commands, analysing visual inputs and planning robot actions. MELISAC offloads computationally intensive tasks to the edge cloud, reducing the onboard processing burden and improving efficiency. Human operators can also demonstrate tasks remotely, allowing the robot to learn and adapt using real-time training data.
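
The sketch below illustrates this offloading pattern under our own assumptions; the function names are hypothetical and do not describe the actual MELISAC software stack. Heavy perception and planning are delegated to an edge-hosted agent, with a minimal onboard fallback in case the link degrades.

```python
"""Sketch of edge offloading: heavy planning runs off-robot when the link allows."""
import random
from typing import List


def edge_agent_plan(command: str, frame: bytes) -> List[str]:
    # Placeholder for a request to the edge-hosted AI agent, which would
    # run the large vision-language model and return an action plan.
    return ["detect target", "approach", "grasp", "hand over"]


def onboard_fallback(command: str) -> List[str]:
    # Minimal onboard behaviour when the edge is unreachable.
    return ["stop", "hold position", "await reconnection"]


def plan(command: str, frame: bytes, link_ok: bool) -> List[str]:
    return edge_agent_plan(command, frame) if link_ok else onboard_fallback(command)


if __name__ == "__main__":
    link_ok = random.random() > 0.1   # pretend the link is up 90% of the time
    print(plan("bring me the red box", b"<camera frame>", link_ok))
```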

One of the most important ideas presented in the paper is meta-level control. Unlike traditional control systems that focus on predefined tasks or actions, meta-level control allows robots to define their own roles, identify problems and adapt to changing environments. This capability depends on the integration of AI, sensing and communication within the 6G framework, enabling a deeper level of autonomy and intelligence.
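
The toy example below is our own simplification of the idea, not the paper's formal framework: a meta layer observes the situation and decides which role the robot should pursue, and only then hands control to a conventional task-level planner.

```python
"""Toy illustration of meta-level control sitting above ordinary task planning."""
from typing import Dict, List


def meta_level(situation: Dict[str, bool]) -> str:
    # Chooses the robot's current role from what it perceives,
    # rather than executing a single fixed, pre-programmed task.
    if situation.get("human_in_danger"):
        return "assist_human"
    if situation.get("obstacle_blocking_route"):
        return "replan_route"
    return "continue_mission"


def task_level(role: str) -> List[str]:
    # Conventional task planning, conditioned on the role chosen above.
    plans = {
        "assist_human": ["pause mission", "approach human", "offer help"],
        "replan_route": ["map obstacle", "compute detour", "resume"],
        "continue_mission": ["follow waypoint plan"],
    }
    return plans[role]


if __name__ == "__main__":
    situation = {"human_in_danger": False, "obstacle_blocking_route": True}
    role = meta_level(situation)
    print(role, "->", task_level(role))
```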

A video of the Intelligent Robot (MELISAC) Empowered by 6G Native AI and ISAC demonstration from the one6G Summit 2023 is embedded below:

The MELISAC prototype shows how these capabilities can come together in practice. It demonstrates how intelligent robots can operate across different levels of control, supported by a network that does more than simply connect: it helps the robot interpret the world and guide its actions.

The combination of foundation models and 6G will open up new possibilities across a wide range of industries. Robots will be able to collaborate on complex tasks, assist in disaster relief, support healthcare delivery, and adapt to dynamic environments with minimal human intervention.

As 6G research and development progresses, Huawei's MELISAC offers a glimpse into a future where robots are not only connected and capable, but also intelligent and context-aware. This represents a significant step towards the next phase of digital transformation, in which machines act with greater understanding, flexibility and independence.
