Working in space comes with its fair share of challenges, to put it lightly. There’s the lack of gravity, extreme temperatures, intense cosmic radiation, communication delays and clunky space suits, to name just a few of the things astronauts contend with.
This complex environment means that tasks we would consider straightforward back on planet Earth, such as gripping and manipulating objects, are surprisingly difficult and time-consuming to accomplish. As humans continue to ramp up their space exploration endeavours, attempting more daring feats and travelling deeper into space than ever before, scientists need to address these obstacles for future missions to be successful.
One potential helping hand could come in the form of robots. While the likes of R2-D2 and C-3PO remain in the realms of science fiction, robots have been used successfully in space for some time – think the ISS’s Canadarm, and the rovers trundling across Mars. But through project FAIR-SPACE, our researchers are hoping to improve the way that astronauts interact with them.
“We’re working on human-robot interactions in space, specifically how astronauts can operate robots from a distance,” says Dr Fani Deligianni, FAIR-SPACE researcher from IGHI’s Hamlyn Centre. “We’re aiming to enable more automation in space and to facilitate ways for people to be able to work with robots in a more collaborative way, and ultimately apply what we’ve learned to healthcare back on Earth.”
Sensing human capabilities and intentions
FAIR-SPACE, or Future AI and Robotics for Space, is a collaborative hub led by the University of Surrey that’s aiming to accelerate the development of artificial intelligence and robotics for space exploration. Deligianni and the Imperial College London research team are focussing on remotely operated, or tele-operated, tasks in environments where it’s difficult for astronauts to travel safely.
Astronauts orbiting a distant planet from the safety of a station, for example, could control robots on the ground without having to land themselves. Robots could also make lighter work of much simpler operations like routine station maintenance, where the lack of dexterity offered by space suits, coupled with strength limitations due to deteriorating muscle mass from working in low gravity, create difficulties.
To develop better systems for such situations, the FAIR-SPACE work is focused on using virtual/augmented reality to run simulated tasks. In these simulations, the user operates a robotic arm – much like what happens, in reality, on the ISS. The researchers are investigating a number of potential ways to improve performance. One of these is an emerging field called neuroergonomics, which studies the brain in relation to work.
“What we’re doing is monitoring the brain state of an operator to provide feedback that can hopefully enhance their performance, and also minimise performance differences between operators,” Deligianni explains.
In practice, a number of wearable technologies take various measurements from an operator, which are then fed into an AI algorithm. These readings include brain activity, captured through an EEG skull cap, and eye movements, tracked through smart glasses (a HoloLens headset).
“The idea is that we integrate these measurements and use AI to determine the operator’s awareness and mental workload, and when their performance deteriorates,” says Deligianni. “We can then use the augmented reality feature of the HoloLens headset to provide direct feedback to the user as they’re working. This will hopefully improve their attention and make them more aware of the task at hand.”
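As a rough illustration of this sensor-fusion idea, here is a minimal Python sketch. The feature names, weights and thresholds are all illustrative assumptions – the actual FAIR-SPACE system uses trained AI models on real EEG and eye-tracking data, not hand-tuned rules like these.

```python
# Hypothetical sketch: fusing wearable-sensor features into a single
# mental-workload score. All weights and thresholds are made up for
# illustration; a real system would learn these from data.

def estimate_workload(eeg_theta_power, eeg_alpha_power,
                      pupil_diameter_mm, fixation_rate_hz):
    """Blend EEG and eye-tracking features into a 0-1 workload score.

    Rising theta relative to alpha power, larger pupils and fewer
    fixations per second are all commonly associated with higher
    mental workload; here they are combined with illustrative weights.
    """
    theta_alpha_ratio = eeg_theta_power / max(eeg_alpha_power, 1e-9)
    score = (0.5 * min(theta_alpha_ratio / 2.0, 1.0)          # EEG contribution
             + 0.3 * min(pupil_diameter_mm / 8.0, 1.0)        # pupil dilation
             + 0.2 * (1.0 - min(fixation_rate_hz / 4.0, 1.0)))  # sparse fixations
    return min(score, 1.0)

def feedback_message(score, high=0.7):
    """Turn the score into the kind of prompt an AR headset might show."""
    return "Take a breath - workload high" if score > high else "OK"
```

The point of the sketch is the pipeline shape – several physiological signals reduced to one estimate that drives real-time feedback in the headset – rather than any particular formula.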
Stress, sweat, share
The brain isn’t the only part of the body that the researchers are interested in. Dr Bruno Rosa and Panagiotis Kassanos are also developing flexible electronic sensors that can be worn on the body. These wearables would again monitor various signals, including heart rate and a reaction to stress called galvanic skin response, where increased sweating changes the electrical characteristics of the skin.
The idea is that this information would be pooled to determine an individual’s stress levels, again with the ultimate aim of improving performance.
“When people are under pressure it can improve their performance up to a point because it makes the task more interesting,” says Deligianni. “But there’s an upper limit, beyond which performance begins to suffer. We want to be able to detect where this peak is and respond, either by alerting the operator or their team so that they can intervene and share the load.”
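The relationship Deligianni describes – performance improving with pressure up to a peak, then declining – can be sketched with a simple inverted-U model. The quadratic curve and the alert threshold below are illustrative assumptions, not the team’s actual model.

```python
# Hypothetical sketch of the inverted-U relationship between arousal
# (stress/pressure) and performance, plus a simple rule for deciding
# when to alert the operator or their team. Both the curve shape and
# the margin are illustrative assumptions.

def predicted_performance(arousal, peak=0.5):
    """Quadratic inverted-U: performance rises with arousal up to
    `peak`, then falls. Both values are on a 0-1 scale."""
    return max(0.0, 1.0 - ((arousal - peak) / peak) ** 2)

def should_intervene(arousal, peak=0.5, margin=0.2):
    """Alert once arousal moves past the peak by more than `margin`,
    i.e. once extra pressure is predicted to hurt performance."""
    return arousal > peak + margin
```

Detecting where this peak sits for a given operator – and reacting before performance falls off the far side of the curve – is the practical goal.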
Fellow teammates aren’t the only option for divvying up tasks at critical times. This is another opportunity for humans and robots to work together on a job, combining the strength and dexterity of robotics with the judgment and decision-making skills of a human.
“A major part of this work is shared autonomy – operating modes where the machine accomplishes part of the task, but the operator is in control to some degree,” Deligianni explains. “We want to find the right balance between the two for the best performance in a task.”
Already the group is exploring varying degrees of autonomy in their virtual reality simulations, coupling performance measures like gripping accuracy with the physiological signals collected through the wearable devices. “If we can find differences in performance, then this could indicate ways to effectively modulate their workload,” Deligianni says.
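One common way to realise shared autonomy is to blend the human’s command with the robot’s autonomous plan, shifting the weight toward the machine as the operator’s measured workload rises. The linear blending rule below is an illustrative assumption, not the FAIR-SPACE controller.

```python
# Hypothetical sketch of shared autonomy: the executed command is a
# per-axis blend of the human's input and the robot's autonomous plan.
# The blend weight here is simply the measured workload; a real system
# would modulate autonomy in a more sophisticated way.

def blend_command(human_cmd, robot_cmd, workload):
    """Linearly blend two command vectors, weighting the autonomous
    plan by `workload` (0 = fully manual, 1 = fully autonomous)."""
    alpha = min(max(workload, 0.0), 1.0)  # clamp to [0, 1]
    return [(1.0 - alpha) * h + alpha * r
            for h, r in zip(human_cmd, robot_cmd)]
```

For example, with a workload of 0.25 the executed command stays three-quarters under the operator’s control; as workload estimates climb, the robot quietly takes on more of the task.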
Watch this simulation, from Deligianni’s team, of an astronaut’s interaction with robotic systems on the ISS.
From robots in space to the operating theatre
At the moment the work is specifically looking at interactions in space, but the research could also be applied to other extreme environments on Earth where it’s dangerous for humans to work. And while this research journey may seem an unconventional path for a Centre that’s renowned for its work in medical robotics, the project actually has roots in the operating theatre back on Earth.
Like astronauts, surgeons need to work on intricate and highly delicate procedures, under an enormous amount of stress. Space missions and surgery are also safety-critical and require specialist skills and lengthy training. Surgeons and astronauts also need to cope with incredibly high workloads, which is why Deligianni is collaborating with Hamlyn surgeon Mr Daniel Leff, who helped develop the “cancer-sniffing” iKnife, to share learnings between these two different yet parallel environments.
Beyond these medical roots, Deligianni hopes that all of this work could have healthcare applications back on Earth. For example, the group is working on a wearable electronic suit, or exoskeleton, that could not only facilitate task performance, but also tackle the issue of low gravity-induced muscle atrophy. “These exoskeletons could also help people who have musculoskeletal diseases or tremor,” Deligianni says.
“My hopes are that ultimately, we’ll be able to actually see this research used in healthcare and in space soon.”
Dr Deligianni thanks Daniel Freer, Rejin Varghese, Shamas Khan, Anh Nguyen, Stephanie Pau, Yao Guo, Bruno Gil Rosa, Panagiotis Kassanos, Robert Merrifield and Guang-Zhong Yang, who have all contributed to this project.