Keynote Speakers
Robotics, Physical & Industrial AI – Big Integration for Solving Real World Challenges
Abstract
Global industries face critical pressures, including acute manpower shortages and aging populations, necessitating a transition from manual robotic programming to a holistic system perspective. This keynote explores the integration of robotics and AI through a hierarchical structure, moving from top-level business and supply chain planning down to autonomous, machine-level execution. Central to this framework is Physical AI, which differs from digital-only models in requiring a sophisticated, inseparable bond between hardware and software. Successful real-world deployment relies on "Domain Data"—the specialized expertise and field experience accumulated within specific industries—to ensure robustness in non-standardized environments. We will discuss practical applications in sectors such as food handling, intelligent logistics, and hospital supply and cleaning automation, where AI-driven 3D perception and grasp planning enable robots to handle high-mix tasks previously reserved for human workers. These systems represent a "Hybrid Automation" approach, combining AI's high-level decision-making with deterministic, safety-critical motion control. Looking forward, the next frontier in Embodied AI involves pushing intelligence down to the most granular level. Our future work conceptualizes Embedded AI for robotic joint actuators, introducing a "neuromuscular-inspired" reflex layer. By embedding fast, real-time intelligence directly into the robot's "muscles," we aim to create actuators that are self-aware and proactive, capable of responding to electrical or mechanical shocks and adapting to their own fatigue states independently of a central controller. Drawing on 30 years of R&D and entrepreneurial experience, I will outline how this systematic integration allows robots to complement human skills.
While the journey toward AGI and widely deployed humanoids continues, our immediate focus remains on building a robust, hierarchical intelligence framework to ensure industrial systems are reliable, cost-effective, and sustainable.
Biography
Professor I-Ming Chen received the B.S. degree from National Taiwan University in 1986, and the M.S. and Ph.D. degrees from the California Institute of Technology, Pasadena, CA, in 1989 and 1994, respectively. He is currently a Full Professor in the School of Mechanical and Aerospace Engineering, Director of the Bachelor's Degree Program in Robotics, and Co-Director of the Center for Advanced Robotics Technology and Innovation (CARTIN) at Nanyang Technological University (NTU), Singapore, as well as a Technical Advisor to the National Robotics Program Office in Singapore and a Certified Patent Valuation Analyst (CPVA). He was Editor-in-Chief of the IEEE/ASME Transactions on Mechatronics from 2020 to 2022, and Director of the Robotics Research Centre at NTU from 2013 to 2017. Professor Chen is a Fellow of the Singapore Academy of Engineering (SAEng), a Fellow of IEEE, and a Fellow of ASME, and was the General Chair of the 2017 IEEE International Conference on Robotics and Automation (ICRA 2017) in Singapore. His research interests include robot AI and perception, reconfigurable industrial AI, logistics and construction robots, and human-robot interaction. He founded and managed Transforma Robotics Pte Ltd, a pioneer in construction robotics, and Hand Plus Robotics Pte Ltd, a logistics robot and AI company. He has mentored many robotics-related startups and entrepreneurial teams in Singapore, China, and Taiwan.
Human-Centered Mechatronics: How Wearable Human-Machine Interfaces are Redefining Assistive and Interactive Technologies
Abstract
There is a growing need for robust, intuitive, and bidirectional human–machine interfaces (HMIs) to enable seamless interaction between humans and devices in human-in-the-loop mechatronic systems such as prosthetics, drones, and rehabilitation technologies. Effective HMIs must not only identify user intent and physiological states but also translate them into actionable signals to control and guide external devices. Most wearable HMIs rely on electrodes that noninvasively capture electrical or mechanical biosignals generated by neuromuscular or neuronal activity. These biosignals represent human intent and enable the control of assistive or interactive technologies, including prosthetic limbs, wheelchairs, rehabilitation devices, and other mechatronic systems. This keynote will present an overview of state-of-the-art wearable HMI technologies, highlight the major challenges, and discuss promising research directions aimed at improving robustness, adaptability, and usability in real-world settings including human-centered mechatronics. Applications drawn from the speaker’s research group will be showcased, including the control of a prosthetic hand, a drone, a robot manipulator, and a computer game. The talk will emphasize the essential role of interdisciplinary collaboration in advancing wearable HMIs and helping assistive and interactive robotic systems more closely replicate natural human functions—pushing the frontier of human–machine integration in modern mechatronics.
Biography
Professor Gursel Alici (Senior Member, IEEE) received the Ph.D. degree in robotics from the Department of Engineering Science, Oxford University, Oxford, U.K., in 1994. He is currently a Senior Professor with the University of Wollongong, Wollongong, NSW, Australia, where he is the Executive Dean of the Faculty of Engineering and Information Sciences and Director of the Applied Mechatronics and Biomedical Engineering Research (AMBER) Group. His research interests include soft robotics, system dynamics and control, robotic drug delivery systems, novel actuation concepts for biomechatronic applications, robotic mechanisms and manipulation systems, soft and smart actuators and sensors, wearable human-machine interface (HMI) systems, and medical robotics. He has produced more than 400 refereed publications and delivered numerous invited seminars and keynote/plenary talks in his areas of research. Dr. Alici was a Senior Editor and a Technical Editor of the IEEE/ASME Transactions on Mechatronics during 2020–2024 and 2008–2012, respectively. From 2007 to 2017, he was a member of the Mechatronics National Panel formed by the Institution of Engineers, Australia. He has served on the international program committees of numerous IEEE/ASME international conferences on robotics and mechatronics, and was the General Chair of the 2013 IEEE/ASME International Conference on Advanced Intelligent Mechatronics held in Wollongong, Australia. He received the Outstanding Contributions to Teaching and Learning Award in 2010, the Vice-Chancellor's Interdisciplinary Research Excellence Award in 2013, and the Vice-Chancellor's Award for Research Supervision in 2018, all from the University of Wollongong. He was a Visiting Professor with the Swiss Federal Institute of Technology, Lausanne (2007, 2010), the City University of Hong Kong (2014), the University of Science and Technology of China (2015), and the University of British Columbia, Canada (2019).
Human-Adaptive Cyber-Physical Robotic Systems for Sustainable and Resilient Construction
Abstract
Biography
Professor Hongnian Yu is a Professor in the School of Computing, Engineering and the Built Environment at Edinburgh Napier University, where he served as Head of Research from 2019 to 2023. He received his Ph.D. degree from the University of Exeter, UK. Professor Yu has a multi-disciplinary background in mechatronics and intelligent systems and has established an extensive research network with partners across more than 30 countries. His primary research focus lies in the synergy between robotics and intelligent control, specifically developing adaptive and robust control methods for robot manipulators and mobile inertial robots. He is particularly interested in applied artificial intelligence and data science, exploring how these tools can be integrated into digital healthcare and advanced manufacturing systems to solve real-world industrial challenges. Throughout his career, Professor Yu has pioneered the application of emerging technologies, such as RFID and wireless systems, to enhance the efficiency of healthcare and production environments. He has led major international research consortia, including the EPSRC Human Adaptive Mechatronics network and the EU FUSION project, focusing on how intelligent systems can better serve human needs. He is a Fellow of the IET, a Fellow of the RSA, and a Senior Member of the IEEE.


