My vision is to have physically intelligent robots that augment rather than replace humans in a safe, personalised, and ergonomic manner. Two overarching research questions emerge: 1) how can physical interaction skills be transferred to robots, and 2) how can we ensure that the interaction is personalised and ergonomic? To address these questions, my approach is to include humans in the control loop of robotic systems for real-time control of complex physical interactions in unstructured and unpredictable environments. My group employs user-centred design to create new interfaces for human-in-the-loop physical skill transfer and encodes these skills with statistical machine learning, specifically Gaussian Process Regression (GPR), to capture the stochasticity of human behaviour. For personalised and ergonomic assistance, we incorporate high-fidelity human musculoskeletal and motor control models to inform an interaction controller based on adaptive impedance control.
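To make the skill-encoding step concrete, the sketch below shows how a handful of demonstrated trajectories could be encoded with GPR so that the predictive mean serves as the reference motion and the predictive variance captures the trial-to-trial variability of the human. It is a minimal illustration assuming scikit-learn and synthetic 1-D data, not our actual pipeline.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical demonstrations: five noisy repetitions of the same 1-D reaching motion.
t = np.linspace(0.0, 1.0, 50)
demos = [np.sin(np.pi * t) + 0.05 * rng.standard_normal(t.size) for _ in range(5)]

# Stack all demonstrations as (time, position) training pairs.
X = np.tile(t, len(demos)).reshape(-1, 1)
y = np.concatenate(demos)

# RBF kernel for smooth motions, plus a WhiteKernel to absorb trial-to-trial noise.
gpr = GaussianProcessRegressor(
    kernel=RBF(length_scale=0.1) + WhiteKernel(noise_level=1e-2),
    normalize_y=True,
)
gpr.fit(X, y)

# Query the learned skill: the mean is the nominal trajectory, the standard
# deviation reflects human variability and could, for instance, modulate robot
# stiffness (stiffer where the demonstrations are consistent).
mean, std = gpr.predict(t.reshape(-1, 1), return_std=True)
```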
One of the key topics in this direction is Physical Human-Robot Collaboration (PHRC) since, in real-world tasks, robots are expected to share living and working space with humans and physically interact with them. Such collaboration can either assist healthy humans in industrial, household, and even space exploration tasks, or support injured patients through Biomechanics-aware Robotic Physiotherapy. To facilitate PHRC at a distance, Teleimpedance is a key topic that enables humans to remotely operate, teach, and collaborate with robots on physical tasks (e.g., remote inspection & maintenance or remote robotic physiotherapy). Finally, to design robot control systems for effective PHRC, we need to understand how humans physically behave; thus, complementary research into Human Motor Control adds crucial insights and closes the loop with respect to the overarching research vision.
One of my main areas of expertise is physical human-robot collaboration, where my group focuses on making robots understand human intentions and control their physical actions accordingly to facilitate collaborative task execution. The key element in this direction is to incorporate human behaviour and biomechanical models into the robot control system, enabling real-time insight into human goals, internal states, and ergonomics. Furthermore, we often incorporate machine learning methods that enable robots to gain new skills online while collaborating with humans. The main application areas we explore are collaborative manufacturing (e.g., assembly, polishing, and sawing) and assistance in elderly care through the use of collaborative robots and exoskeletons.
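As a simple illustration of how a robot can physically yield to a collaborator's intended motion, the snippet below sketches a one-dimensional admittance law, a common building block in physical human-robot collaboration. The virtual mass and damping, the force readings, and the loop rate are placeholder assumptions, not parameters from our controllers.

```python
# Minimal admittance-control sketch: the measured human force f_h is turned
# into robot motion through a virtual mass-damper, m * a + d * v = f_h,
# so the robot yields in the direction the human is pushing.
def admittance_step(v, f_h, m=5.0, d=20.0, dt=0.002):
    """Integrate one control step; returns the new reference velocity."""
    a = (f_h - d * v) / m          # virtual dynamics
    return v + a * dt

# Hypothetical usage inside a 500 Hz control loop:
v_ref = 0.0
for f_h in [0.0, 2.0, 5.0, 5.0, 1.0]:   # simulated force/torque readings [N]
    v_ref = admittance_step(v_ref, f_h)
    # v_ref would be sent to the robot's velocity interface here.
```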
Examples:
Additionally, my group explores how to use collaborative robots as a tool for the physical therapy of musculoskeletal injuries, where similar principles can be exploited. The key innovation in this direction is to create an abstraction of a complex biomechanical model that can be embedded in the robot control system and used in real time to facilitate safe and effective physiotherapy. To this end, we developed such an abstraction, called the "strain map".
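To illustrate the idea (though not the actual model), the sketch below treats a strain map as a precomputed lookup from a simplified joint configuration to an estimated tissue strain, which the controller queries in real time to slow the therapy motion near high-strain regions. The joint angles, map values, and safety threshold are hypothetical placeholders.

```python
import numpy as np

# Hypothetical 2-D strain map over shoulder elevation and rotation angles [deg].
elev = np.linspace(0, 150, 31)
rot = np.linspace(-90, 90, 37)
strain_map = 0.01 * np.add.outer((elev / 150.0) ** 2, (rot / 90.0) ** 2)  # placeholder values

STRAIN_LIMIT = 0.015  # hypothetical safety threshold

def safe_velocity_scale(q_elev, q_rot):
    """Scale down the therapy-trajectory velocity as the strain approaches the limit."""
    i = np.abs(elev - q_elev).argmin()     # nearest grid point in elevation
    j = np.abs(rot - q_rot).argmin()       # nearest grid point in rotation
    s = strain_map[i, j]
    return float(np.clip(1.0 - s / STRAIN_LIMIT, 0.0, 1.0))

# e.g. safe_velocity_scale(120, 45) -> a value in [0, 1] used to slow the robot
```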
Examples:
Another important area of expertise is teleimpedance, a form of teleoperation that enables the human operator to remotely control the robot and its physical interaction through real-time changes of impedance. Commanding an appropriate impedance for the task and conditions can greatly simplify complex interactions with the remote environment and make them safer. For example, when interacting with humans or fragile objects, the robot can be made less stiff (i.e., compliant) to ensure no harm is done during the interaction. Conversely, when the robot is perturbed while executing a precise task, its stiffness can be increased so that the disturbances are rejected and the desired accuracy is maintained. In teleimpedance, my group particularly focuses on developing interfaces that enable the human operator to command the impedance of the remote robot in real time. Furthermore, we develop methods in which teleimpedance is used to teach the remote robot complex interaction skills. The main application areas we explore are manufacturing (e.g., assembly, polishing, and sawing), assistance in elderly care, and inspection & maintenance, where remote robot control is needed.
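At the control-law level, the essence of teleimpedance can be sketched as below: a simple one-dimensional impedance controller whose stiffness is commanded by the operator in real time (e.g., through an EMG- or GUI-based interface). The gains, virtual mass, and set-points are illustrative assumptions rather than values from our systems.

```python
# Minimal teleimpedance sketch: the operator-commanded stiffness k_cmd shapes
# the desired interaction force F = K (x_des - x) + D (v_des - v), with the
# damping D derived from k_cmd to keep the virtual dynamics well behaved.
def impedance_force(x, x_des, v, v_des, k_cmd, damping_ratio=0.7, mass=2.0):
    """Return the desired interaction force for the commanded stiffness k_cmd."""
    d = 2.0 * damping_ratio * (mass * k_cmd) ** 0.5
    return k_cmd * (x_des - x) + d * (v_des - v)

# Hypothetical operator commands: soft near a human, stiff for a precise insertion.
soft_force = impedance_force(x=0.30, x_des=0.32, v=0.0, v_des=0.0, k_cmd=100.0)
stiff_force = impedance_force(x=0.30, x_des=0.32, v=0.0, v_des=0.0, k_cmd=2000.0)
```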
Examples:
I am also interested in how humans control their movements and physical interactions, since such insights can directly benefit the development of better robot control and teaching methods. In particular, my group studies how the human neuromechanical system plans, controls, and optimises the movements of the limbs and body during physical interaction.
Examples: