“The robot of the future is intelligent and can make its own decisions in a dynamic, evolving environment,” says Dr. Joost de Winter, lecturer and researcher in Cognitive Human-Robot Interaction at TU Delft. “On an abstract level, such a robot is a system of sensors and cameras that gather information and, based on that information, make intelligent decisions, which are then implemented by actuators that make the robot move. But only in a fully automated environment, such as the Port of Rotterdam, can you totally remove people from the equation. So there’s almost always interaction between human and machine.”

It is a development that can be seen clearly in De Winter’s research into self-driving vehicles. “In recent years, driving has become increasingly automated. For example, you have adaptive cruise control, which means you don’t have to keep changing speed, and systems that keep the vehicle in its lane,” explains De Winter. As more such features are built into cars, the need for interaction between car and driver increases. Even though the driver has less and less to do, he or she still has an important role: to intervene if something goes wrong. So how do you keep a driver alert when they only need to act once in a while?

Eye movements

One of the challenges here is that the traditional physical contact between driver and car is disappearing, because you don’t have to touch the steering wheel or the pedals as often. A self-driving car can therefore no longer infer the driver’s state from how the controls are handled. De Winter is studying how eye movements can help. “You can use cameras to monitor where the driver is looking. If you glance at your dashboard or at your phone, you are distracted. When the car detects this, it can help you by providing feedback or extra assistance.” Eye movements can also help predict the behaviour of pedestrians. “If there has not been any eye contact, the pedestrian may not have seen the vehicle, and the car can adjust its behaviour or warn the pedestrian via a display on the car.”
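
As a rough illustration of the kind of logic De Winter describes, the sketch below shows how gaze information might be turned into a distraction warning. It is a minimal example in Python; the gaze labels, threshold and class names are hypothetical and are not taken from his research.

```python
import time

# Hypothetical gaze targets that an in-car eye tracker might report.
ON_ROAD = {"road", "mirror"}        # glances that count as attentive
OFF_ROAD = {"dashboard", "phone"}   # glances that count as distracted

# Illustrative threshold: how long gaze may stay off the road (seconds).
MAX_OFF_ROAD_SECONDS = 2.0


class DistractionMonitor:
    """Tracks where the driver is looking and decides when to give feedback."""

    def __init__(self, max_off_road_seconds: float = MAX_OFF_ROAD_SECONDS):
        self.max_off_road_seconds = max_off_road_seconds
        self.off_road_since = None  # timestamp of the first off-road glance

    def update(self, gaze_target: str, timestamp: float) -> bool:
        """Return True if the driver has looked away long enough to warrant feedback."""
        if gaze_target in ON_ROAD:
            self.off_road_since = None
            return False
        if self.off_road_since is None:
            self.off_road_since = timestamp
        return (timestamp - self.off_road_since) > self.max_off_road_seconds


if __name__ == "__main__":
    monitor = DistractionMonitor()
    t = time.time()
    # Simulated gaze samples: a long glance at the phone triggers feedback.
    for offset, target in [(0.0, "road"), (0.5, "phone"), (3.0, "phone")]:
        if monitor.update(target, t + offset):
            print(f"t+{offset}s: driver distracted, provide feedback")
```

In a real vehicle the gaze labels would come from a camera-based eye tracker rather than a hard-coded list, but the decision step stays this simple: detect sustained off-road glances, then assist.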

The posture of the driver, and the direction the driver’s head is facing, can also provide clues. “For example, if you’re leaning forward more, it often means you’re more stressed. Such information is very useful for the car: with a stressed driver, it is probably better to reduce the speed a little,” he says. “Ideally, a kind of symbiosis is created in which information on the current status is exchanged and there is close cooperation between vehicle and driver. Then you have the best of both worlds.”
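
In the same spirit, a driver-state estimate could feed back into the vehicle’s own behaviour. The toy function below (hypothetical field names and arbitrary example numbers, not De Winter’s model) lowers the automation’s target speed a little when posture and gaze suggest a stressed or distracted driver.

```python
from dataclasses import dataclass


@dataclass
class DriverState:
    """Illustrative driver-state estimate from cameras and seat sensors."""
    leaning_forward: bool   # posture cue associated with stress in the article
    eyes_on_road: bool      # gaze cue from the eye tracker


def adjust_target_speed(cruise_speed_kmh: float, state: DriverState) -> float:
    """Return a hypothetical adapted target speed for the automation.

    The reductions are arbitrary example values, only meant to show how
    driver-state information could feed into vehicle behaviour.
    """
    speed = cruise_speed_kmh
    if state.leaning_forward:      # stressed driver: ease off a little
        speed *= 0.9
    if not state.eyes_on_road:     # distracted driver: ease off further
        speed *= 0.9
    return round(speed, 1)


if __name__ == "__main__":
    relaxed = DriverState(leaning_forward=False, eyes_on_road=True)
    stressed = DriverState(leaning_forward=True, eyes_on_road=False)
    print(adjust_target_speed(100.0, relaxed))   # 100.0
    print(adjust_target_speed(100.0, stressed))  # 81.0
```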


Just as a self-driving car has to travel in traffic among other road users and interact with its driver, robots will increasingly operate in environments shared with other robots and with people, such as the distribution centre of a retail chain or an online department store. “In theory, you can completely isolate such processes and have them run on a preprogrammed basis. But then you would have to be able to predict all disruptive factors, and the robot would have to be intelligent enough to adapt to them. In most cases, that won’t work. In recent years, manufacturers such as Boeing and Tesla have even brought humans back into their production processes, replacing robots.”


In the Cloud

No matter how automated things become, people will still be needed on most work floors to solve technical problems, carry out specialised tasks or cope with work pressure at peak times. And just as in a car, the interaction between human and machine will increasingly be contactless, with sensors and cameras collecting information, and with wireless control and hand-gesture control taking the place of control panels and joysticks. “Our vision is that the interaction between human and machine will be taking place in the Cloud in the near future. This is where the knowledge is gathered and information can be shared with the various agents in order to achieve optimum cooperation.”


New Master’s degree programme


The increasing human-machine interaction has consequences for the field of robotics. Whereas in the past the main focus was on the operation and safety of machines, robotics now sits increasingly at the crossroads of disciplines: ethics, legislation and regulation, and social aspects play a role alongside mechanics and artificial intelligence. The robotics engineer of the future will therefore need a broad base, which was the impetus behind the establishment of the new Master’s degree programme in Robotics at TU Delft. Students receive a thorough grounding in all technical aspects of robotics in complex systems, as well as an in-depth study of the social aspects.

Students program robots themselves, individually and in groups. “We buy the components for this, and they add the intelligence,” says De Winter. “All subjects are taught by the most experienced lecturer-researchers. This gives students access to the latest academic insights.” He himself is coordinator of the programme, and teaches about the interaction between people and robots.

Cooperation with businesses also plays a major role in the programme. “Students will be carrying out projects at businesses. This is because they not only have to learn to think about the operation of a robot, but also about its function within a business. To what extent do you place responsibility on the work floor with people or with the robot, for example? That is an important issue.”

Transferable skills

Answering such questions requires more than just hardcore robot knowledge and hands-on programming experience. Unique to this programme is that, in addition to exams and projects, students build up a portfolio of transferable skills: communication, leadership, problem-solving, and so on. “They have to describe in a vision document what kind of engineer they want to become. Do they want to be involved in interaction in that complex environment, or are they more into designing a robotic arm that does its work under controlled conditions? Do they want to do research or go into management? And in the run-up to graduation, we ask them to reflect on their development and experiences.”

The programme is a response to market demand, and has been set up in cooperation with industry. “There are sectors where there is already a lot of potential for robotisation – parcel delivery, for example – while other businesses are trying to get a head start on the competition. They are all keen to make use of the knowledge of (future) robotics engineers.” In this sense, the programme builds on the collaboration within RoboValley, the Delft innovation hub for robotics. Students can also make use of the facilities of the RoboHouse robotics testing centre. All the robotics activity around the campus offers students plenty of opportunities for internships and business projects.

Open Science


De Winter also works closely with the business community in his own research. For him, it is important that this research leads to scientific publications. He is a great advocate of open and transparent research, and recently received an Open Science Award for his efforts in this field. “There is a lot of discussion about reproducibility of experiments. A few years ago, if you got a question about an earlier publication, you had to search through old hard disks for software and data. These days I publish my datasets and code on data.4TU. That is a form of public accountability. Moreover, software development often takes thousands of hours and costs hundreds of thousands of euros. If that is being financed with public money, you have to make it widely accessible.”

He has clear ideas about where those data and ideas should lead in the long term: a future in which robots truly understand humans. “I imagine a world in which human and machine share cognitive processes and adapt to each other. Think back to Knight Rider, for example, that intelligent car that understood exactly what Michael Knight wanted. Although that KITT talked a bit too much for my taste,” he jokes.

Despite all the problems, he sees the current crisis as a catalyst in that direction. “What we’re dealing with right now with Covid-19 isn’t pleasant. I have noticed that people around me are having a difficult time, and student motivation is a big issue. But that philosophy of sharing knowledge in the Cloud fits in well with this. In my opinion, the increased digitisation, no longer having to work from nine to five, etc., are positive social developments.”

About Dr. Joost de Winter

De Winter has been working with data for a very long time. “I was at a high school reunion a while back and told people what I’m doing now. They were surprised at how similar it is to what I was doing back then. I used to play a lot of computer games and analyse how well people performed, based on data from racing games, for example,” he remembers. “I also worked on timekeeping at racetracks, where I collected data as well. What I am doing now is almost a hobby that has become my work.”

Website of Joost de Winter: https://sites.google.com/site/jcfdewinter