We all know how exploration by rover works. The rover is directed to a location and told to take a sample. Then it subjects that sample to analysis and sends home the results. It’s been remarkably effective.
But it’s expensive and time-consuming to send all this data home. Will this way of doing things still work? Or can it be automated?
The main thrust of missions to Mars right now is to detect signs of past life. A rover collects a sample, does some initial analysis on it, then sends the data home. The problem is the cost and the time it takes to send all that data back to Earth. What if rovers were smarter, and could optimize the data they send back to Earth? Could they then ease some of the severe limits on sending data back to Earth?
That’s a question a pair of scientists are addressing. They presented their research at the recent Goldschmidt Conference. The lead researcher is Victoria Da Poian from NASA’s Goddard Space Flight Center. Da Poian and her co-researcher Eric Lyness, also from the GSFC, have developed an AI system that will debut on the ESA/Roscosmos ExoMars Rover, which will land on Mars in 2023.
“This is a visionary step in space exploration,” Da Poian said in a press release. “It means that over time we’ll have moved from the idea that humans are involved with nearly everything in space, to the idea that computers are equipped with intelligent systems, and they are trained to make some decisions and are able to transmit in priority the most interesting or time-critical information.”
The question of effective data transmission is real. It’s a bottleneck in mission design. Data has a cost—check your mobile phone plan—and as we send missions further and further out into the Solar System, and as our rovers and orbiters become more and more science-capable, the cost of transmitting all that data will balloon.
“Data from a rover on Mars can cost as much as 100,000 times as much as data on your cell phone, so we need to make those bits as scientifically valuable as possible,” said co-researcher Eric Lyness.
But as Lyness points out, it’s not just the expense. The inability of current rovers to consider what they’re doing with their samples is holding us back scientifically.
“It costs a lot of time and money to send the data back to Earth which means scientists can’t run as many experiments or analyse as many samples as they would like,” said Lyness in a press release. “By using AI to do an initial analysis of the data after it is collected but before it is sent back to Earth, NASA can optimise what we receive, which greatly increases the scientific value of space missions.”
This work is centered on a single instrument on the ExoMars Rover. (The rover has been re-christened the Rosalind Franklin Rover, in honor of the scientist Rosalind Franklin, whose work was critical to our understanding of DNA.) The instrument is MOMA, the Mars Organic Molecule Analyzer. MOMA is the largest instrument on the Rosalind Franklin, and the rover can drill below the Martian surface to collect samples where organic molecules are protected from degradation by the Sun and cosmic rays.
But many of MOMA’s samples—maybe the majority of them—won’t contain any organic molecules of interest. Others will need to be re-tested. The idea behind the new AI is to hand those decisions over to the rover. That will reduce the amount of data that needs to be transmitted, and it will hopefully increase MOMA’s effectiveness.
“What we get from these unmanned missions is data, lots of it; and sending data over hundreds of millions of kilometres can be very challenging in different environments and extremely expensive; in other words, bandwidth is limited,” explained Da Poian. “We need to prioritize the volume of data we send back to Earth, but we also need to ensure that in doing that we don’t throw out vital information. This has led us to begin to develop smart algorithms which can for now help the scientists with their analysis of the sample and their decision-making process regarding subsequent operations, and as a longer-term objective, algorithms that will analyse the data itself, will adjust and tune the instruments to run next operations without the ground-in-the-loop, and will transmit home only the most interesting data.”
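Neither the press release nor the conference presentation spells out the onboard decision logic, but the core idea of spending a limited downlink budget on the most interesting results first can be pictured with a short sketch. Everything in the Python below (the Result fields, the interest scores, the sizes, the budget) is a hypothetical illustration, not the actual MOMA flight software.

```python
# Hypothetical sketch of downlink prioritization: greedily pick the highest-
# scoring analysis results until the bandwidth budget for a relay pass is used
# up. Field names, scores, and sizes are illustrative, not MOMA's real data.
from dataclasses import dataclass


@dataclass
class Result:
    sample_id: str
    interest_score: float  # e.g., model confidence that interesting organics are present
    size_kb: int           # size of the data product to transmit


def select_for_downlink(results: list[Result], budget_kb: int) -> list[Result]:
    """Send the most interesting results first until the pass budget is exhausted."""
    chosen, used = [], 0
    for r in sorted(results, key=lambda r: r.interest_score, reverse=True):
        if used + r.size_kb <= budget_kb:
            chosen.append(r)
            used += r.size_kb
    return chosen


if __name__ == "__main__":
    queue = [
        Result("S-041", 0.93, 800),
        Result("S-042", 0.12, 800),
        Result("S-043", 0.67, 400),
    ]
    for r in select_for_downlink(queue, budget_kb=1200):
        print(f"downlink {r.sample_id} (score {r.interest_score:.2f})")
```

A flight version would have to weigh far more than a single score (instrument health, retest deadlines, requests from the ground), but the shape of the problem is the same: rank the results, then fit them within the pass.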
The Rosalind Franklin communicates with Earth via the Trace Gas Orbiter (TGO). However, the TGO only passes overhead twice per day, so ground controllers won’t be able to directly control the rover. Instead, it’s designed to navigate autonomously across the Martian surface, up to 70 m (230 ft) per Martian day. The intelligent systems for sample analysis will raise the autonomous capability of the rover, and hopefully, future rovers. For the Rosalind Franklin, most sample data will still be sent back to Earth, but for future rovers, that could change.
The team tested their autonomous system with a replica MOMA instrument in their lab. The testing allowed them to “train” the neural network algorithm to recognize familiar compounds. The system will then compare newly-taken samples with its catalog of known samples, and will alert scientists when it finds a match.
When MOMA encounters the spectrum of an unknown compound, the system can categorize it with up to 94% accuracy, and it can match it to previously known compounds with an accuracy of 87%. So the potential data and time savings are already substantial. And the researchers aren’t finished: they’re still improving its accuracy in preparation for the 2023 launch date.
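As a rough illustration of that catalog-matching step, here is a minimal Python sketch that scores a new spectrum against stored reference spectra and only reports a match above a confidence cutoff. The catalog entries, the cosine-similarity measure, and the 0.9 threshold are assumptions made for the example; the team's actual system is a trained neural network, not a lookup like this.

```python
# Hypothetical sketch of the "compare against a catalog of known samples" step.
# The reference spectra, similarity measure, and threshold are illustrative;
# the real system uses a neural network trained on lab data from a MOMA replica.
import numpy as np

CATALOG = {
    "phospholipid-like": np.array([0.1, 0.7, 0.2, 0.9, 0.3]),
    "mineral background": np.array([0.8, 0.1, 0.6, 0.1, 0.2]),
}
MATCH_THRESHOLD = 0.9  # arbitrary cutoff chosen for this sketch


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two spectra treated as vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_spectrum(spectrum: np.ndarray):
    """Return (label, similarity) if the best match clears the threshold,
    otherwise (None, similarity) so scientists can take a closer look."""
    scores = {label: cosine(spectrum, ref) for label, ref in CATALOG.items()}
    best = max(scores, key=scores.get)
    label = best if scores[best] >= MATCH_THRESHOLD else None
    return label, scores[best]


label, score = match_spectrum(np.array([0.15, 0.68, 0.25, 0.85, 0.28]))
print(f"match: {label}, similarity: {score:.2f}")
```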
We’re accustomed to rover missions substantially outlasting their planned lifetimes. For instance, NASA’s Opportunity rover was designed to last 90 sols on Mars, but it lasted 5,352 sols. And MSL Curiosity’s primary mission was planned for 668 sols, but the rover has been working for over 2,800 sols and is still going strong.
But it’s unwise to assume that the Rosalind Franklin rover will exceed its mission length by such large margins. Its mission is based on travelling 4 km (2.5 mi) in seven months. How long the mission will actually last is unclear.
“The mission will face severe time limits. When we will be operating on Mars, samples will only remain in the rover for at most a few weeks before the rover dumps the sample and moves to a new place to drill,” said Lyness. “So, if we need to retest a sample, we need to do it quickly, sometimes within 24 hours.”
When Rosalind Franklin drills down below the surface and collects a sample, it places it in the Analytical Laboratory Drawer (ALD). The samples will then be analyzed with MOMA, and two other instruments: an infrared spectrometer called MicrOmega, and a Raman spectrometer called the Raman Laser Spectrometer (RLS). As Lyness points out, samples can’t be kept in the rover for long. And that time pressure will only be more severe for rovers exploring places like Saturn’s moon Titan, the destination for NASA’s Dragonfly mission in 2026.
“In the future, as we move to explore the moons of Jupiter such as Europa, and of Saturn such as Enceladus and Titan, we will need real-time decisions to be made onsite,” said Lyness. “With these moons it can take 5 to 7 hours for a signal from Earth to reach the instruments, so this will not be like controlling a drone, with an instant response. We need to give the instruments the autonomy to make rapid decisions to reach our science goals on our behalf.”
The data from Rosalind Franklin’s samples is all based on probabilities. The data can be difficult to interpret: there’s no red light that flashes and says “Evidence of Life Found!” The data and its probabilities need to be interpreted, preferably by different researchers in the scientific community who then publish their results.
“These results will largely tell us about the geochemistry that the instruments find,” said Lyness. “We’re aiming for the system to give scientists directions, for example our system might say ‘I’ve got 91% confidence that this sample corresponds to a real world sample and I’m 87% sure it is phospholipids, similar to a sample tested on July 24th, 2018 and here is what that data looked like.’ We’ll still need humans to interpret the findings, but the first filter will be the AI system.”
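That kind of summary lends itself to a small structured report. The sketch below is a hypothetical illustration of the “first filter” idea: package the system’s confidence figures and the closest reference measurement into one human-readable line for the science team. The field names and wording are invented for the example, not the instrument’s actual telemetry format.

```python
# Hypothetical "first filter" report: bundle confidence values and the closest
# reference measurement into a single line for scientists to triage on Earth.
from dataclasses import dataclass


@dataclass
class MatchReport:
    sample_confidence: float    # confidence the spectrum corresponds to a known reference sample
    compound_confidence: float  # confidence in the compound identification
    compound: str
    reference_date: str         # when the closest reference sample was measured

    def summary(self) -> str:
        return (f"{self.sample_confidence:.0%} confidence this corresponds to a "
                f"reference sample; {self.compound_confidence:.0%} sure it is "
                f"{self.compound}, similar to a sample tested on {self.reference_date}.")


print(MatchReport(0.91, 0.87, "phospholipids", "2018-07-24").summary())
```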
Systems like this one are only going to get more sophisticated. There are some potential pitfalls to systems like these, but their potential for space agencies is too enticing to ignore. By the time the system is field-tested on Mars, and then implemented in NASA’s Dragonfly mission to Titan, who knows how powerful and capable it’ll be.