Invited Speakers From Academia

 

Prof. Alexander Schmitz

Waseda University, Japan

Tactile Sensing for Human Collaborative Robots

Prof. Stefan Escaida Navarro

Universidad de O'Higgins, Chile / Inria Lille

Model-Based Sensing for Soft Robots

Prof. Alessandro Roncone


University of Colorado Boulder, USA

Robots working with and around people

 

 

Prof. Gordon Cheng

Technical University of Munich, Germany

Enriching humanoid robot interactions with large-area e-skin

Prof. Oliver Brock

Technical University of Berlin, Germany

Sensing Suitable for Soft Robots

Prof. Ravinder Dahiya

University of Glasgow, UK

Energy Generating Large Area Electronic Skin

 

 

Invited Speakers From Industry

 

Rich Walker

Shadow Robot Company, UK

Perceptual learning for interaction

Mara Stamm

F&P Robotics AG, Switzerland

Interaction with Lio, the friendly Assistant

David Reger

Neura Robotics GmbH, Germany

How cognitive capabilities in robotics enable interactive and safe human-machine collaboration

 

 

Abstracts

 

Speaker Prof. Alexander Schmitz

Invited Talk

Tactile Sensing for Human Collaborative Robots
Abstract

I will provide an overview of the tactile sensors developed in our lab (including uSkin) and how they have been used for applications such as in-hand manipulation and grasp stability assessment. A focus will be the combined force and proximity sensors we have been developing.


Speaker Prof. Stefan Escaida Navarro

Invited Talk

Model-Based Sensing for Soft Robots
Abstract

In this talk, I will report on work on model-based sensing for soft robots, which was the focus of my recently completed postdoc at Inria Lille in the DEFROST team. Model-based sensing addresses the challenge of enabling tactile sensing and proprioception for soft robots in a principled way: using inverse problem solving, the forces and deformations that best explain the observed sensor readings can be found. The first results were obtained with soft pads, which are passive devices. Air chambers are embedded in these devices, and changes in volume or pressure are measured by pneumatic sensors. Force magnitudes and deformations could be estimated using the mechanics model in SOFA; for estimating contact location, however, machine learning had to be employed. As a follow-up, a multi-modal sensing approach was therefore proposed: contact location is obtained using soft capacitive touchpads, so that interactions can be handled entirely by the numerical simulation. Finally, in recent work, we studied how these approaches can be applied to actuated devices. We have found that these results carry over to the development of anatomical soft robots, i.e., novel medical phantoms with advanced functionality, as well as to multi-segment soft manipulators.
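The inverse-problem formulation mentioned in the abstract can be illustrated with a toy example (this sketch is my own, not code from the talk; the linear sensitivity model and all values are assumed, and the real work uses a full mechanics simulation in SOFA rather than a linear map):

```python
import numpy as np

# Toy version of model-based sensing: a calibrated model J maps candidate
# contact forces f at known sites to predicted pneumatic sensor readings
# (e.g., chamber pressure changes). Given observed readings y, we recover
# the forces that best explain them via regularized least squares.

rng = np.random.default_rng(0)

n_sensors, n_contacts = 6, 3
J = rng.normal(size=(n_sensors, n_contacts))        # sensitivity model (assumed)
f_true = np.array([0.0, 1.5, 0.0])                  # only the second site is pressed
y = J @ f_true + 0.01 * rng.normal(size=n_sensors)  # noisy sensor readings

lam = 1e-3  # Tikhonov regularization keeps the inversion well-posed
f_est = np.linalg.solve(J.T @ J + lam * np.eye(n_contacts), J.T @ y)

print(np.round(f_est, 2))  # close to f_true: the pressed site is identified
```

The same "explain the readings" principle applies when the forward model is a finite-element simulation instead of a matrix; only the solver changes.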


Speaker Prof. Alessandro Roncone
Invited Talk

Robots working with and around people

Abstract

Robots have begun to transition from assembly lines, where they are physically separated from humans, to environments where human–robot interaction is inevitable. With this shift, research in physical human–robot interaction (pHRI) has grown to allow robots to work with and around humans on complex tasks. Safe pHRI requires robots to both avoid harmful collisions and continue to work toward their main task whenever possible; it also requires them to reliably sense their surroundings and parse pertinent information in real time. However, as HRI scenarios become commonplace, pure collision avoidance is no longer sufficient: contact is inevitable and, at times, desirable. In my talk, I will review my group's work on close-proximity HRI, i.e., allowing robots to avoid obstacles, to anticipate eventual collisions, and (as a longer-term objective) to purposely seek touch. I will present recent research on a framework capable of nearby-space perception, just-in-time control, and touch-informed motion planning, with the goal of creating "whole-body awareness" of the robot's surroundings. In all, this research will enable robot capabilities that were not previously possible and will impact work domains beyond manufacturing, including construction, logistics, and home care.
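A common building block in close-proximity HRI of the kind described above is speed scaling by separation distance. The following is a minimal sketch of that pattern (my own illustration, not the speaker's framework; the thresholds are assumed values):

```python
# Scale the robot's commanded speed by the distance to the nearest sensed
# obstacle: full speed when the workspace is clear, a linear slow-down as a
# person approaches, and a full stop before contact. d_stop and d_slow are
# assumed example distances in meters.

def speed_scale(d_nearest: float, d_stop: float = 0.1, d_slow: float = 0.5) -> float:
    """Return a factor in [0, 1] to multiply the nominal velocity command."""
    if d_nearest <= d_stop:
        return 0.0                      # inside the stop zone: halt
    if d_nearest >= d_slow:
        return 1.0                      # clear workspace: full speed
    # linear ramp between the stop and slow-down distances
    return (d_nearest - d_stop) / (d_slow - d_stop)

print(speed_scale(0.05))  # 0.0 -- person too close, robot stops
print(speed_scale(0.3))   # 0.5 -- halfway through the slow-down zone
print(speed_scale(1.0))   # 1.0 -- full speed
```

Moving beyond pure avoidance, as the talk proposes, means this scaling becomes one behavior among several, with intentional contact allowed when it is anticipated and desired.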

Speaker Prof. Gordon Cheng
Invited Talk

Enriching humanoid robot interactions with large-area e-skin
Abstract

In this talk, I will present some of our latest results on multi-modal tactile skin on a full-sized autonomous humanoid robot. I will show rich interactions of the humanoid robot performing many tasks that are typically "avoided" by usual robotic methods.

Speaker Prof. Oliver Brock
Invited Talk

Sensing Suitable for Soft Robots
Abstract

The inclusion of softness in robot design can significantly extend a robot's capabilities. But why is this so? This question must be answered before we can speculate about the most suitable sensing for soft robots. The simple transfer of sensing technology from hard robots to soft robots does not seem very promising, because soft robots have too many degrees of freedom and their softness often complicates the embedding of existing technologies. I will propose the concepts of explicit and implicit morphological sensing as an appropriate way to sensorize soft robots. I will also present successful implementations of these concepts in soft robotic hands.

Speaker Mara Stamm
Invited Talk

Interaction with Lio, the friendly Assistant
Abstract

In this short talk, I will introduce you to Lio, the friendly healthcare robot. You will learn how caregivers as well as patients and residents communicate with the robot and how Lio's hardware and sensors support these interactions. As a pioneer in the field of autonomous healthcare robots, F&P Robotics has the privilege of gaining insight into the needs and wishes of healthcare professionals and their patients through customer and research projects. Using practical examples, you will learn what Lio's users value, how they interact with the robot, and where today's technology still has room for improvement to further advance human-machine interaction and relationships.

Lio - A Personal Robot Assistant for Human-Robot Interaction and Care Applications

Speaker David Reger
Invited Talk

How cognitive capabilities in robotics enable interactive and safe human-machine collaboration
Abstract

Automation with robots is one of the greatest achievements of the industrial age. It enables a remarkable level of efficiency and precision, achieving what would have been impossible with traditional human labor. Having robots, even collaborative robots, work side by side with humans has been possible only to a very limited extent because they lacked environmental awareness and intelligence.

So-called cognitive robots are game changers in automation. They can perceive and understand their environment and can safely collaborate and interact with humans. Their ability to continuously learn and understand complex situations, combined with easy configurability and a user-friendly interface, makes them the first choice for all industries.

In this talk, we'll take an in-depth look at the technologies that enable robots to relieve workers of repetitive, tedious, difficult, and dangerous tasks and to work side by side with humans in a safe, collaborative environment. Most importantly, our novel Touchless Safe Human Detection sensor technology allows the robot to detect humans accurately and reliably in its environment and, if a person comes closer than a defined safety distance, to slow down, up to a complete stop, before resuming work once it is safe to do so. Combined with the robot's ability to understand voice commands and respond to gesture control, this enables a whole new era of intuitive and seamless human-machine interaction.

Speaker Prof. Ravinder Dahiya
Invited Talk

Energy Generating Large Area Electronic Skin
Abstract

Tactile or electronic skin (e-skin) is needed to provide critical haptic feedback to robots and amputees, as well as in several interactive systems. Energy autonomy of e-skin is a critical factor in these applications, enabling portability and longer operation times. This talk will present an energy-autonomous e-skin in which solar cells are used both as distributed touch sensors and as energy harvesters. This advanced version of the e-skin also has proximity sensors and does not consume any energy for its sensing operation; instead, it only produces energy. When deployed over large areas, the e-skin could generate sufficient energy to power devices such as the actuators used in robotics and prosthetics.
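The dual use of a solar cell as harvester and touch sensor can be sketched as follows (my own toy illustration, not the speaker's design; the current values and threshold are assumed): a touch or nearby object shades the cell, so its photocurrent drops below the ambient baseline, and detecting that drop requires no extra sensing energy.

```python
# Treat a drop in a solar cell's photocurrent below its ambient baseline as a
# touch/proximity event. baseline_mA and drop_ratio are assumed example values.

def detect_touch(current_mA: float, baseline_mA: float, drop_ratio: float = 0.3) -> bool:
    """Flag a touch when photocurrent falls more than drop_ratio below baseline."""
    return current_mA < baseline_mA * (1.0 - drop_ratio)

baseline = 12.0                      # mA under ambient light, no contact
print(detect_touch(11.5, baseline))  # False -- normal fluctuation
print(detect_touch(4.0, baseline))   # True  -- cell shaded by a touch
```

In practice the baseline would track ambient light over time, but the principle is the same: the sensing signal is a by-product of the energy-harvesting current.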