Rovers Will Soon Be Making Their Own Decisions About Where to Search for Life

We all know how exploration by rover works. The rover is directed to a location and told to take a sample. Then it subjects that sample to analysis and sends home the results. It’s been remarkably effective.

But it’s expensive and time-consuming to send all this data home. Will this way of doing things still work? Or can it be automated?

The main thrust of missions to Mars right now is to detect signs of past life. A rover collects a sample, does some initial analysis on it, then sends the data home. The problem is the cost and the time it takes to send all that data back to Earth. What if rovers were smarter, and could prioritize the data they transmit? Could they then overcome some of the severe limitations on sending data back to Earth?

“We need to prioritize the volume of data we send back to Earth, but we also need to ensure that in doing that we don’t throw out vital information.”

Victoria Da Poian, Lead Researcher, NASA Goddard Space Flight Center.

That’s a question a pair of scientists are addressing. They presented their research at the recent Goldschmidt Conference. The lead researcher is Victoria Da Poian from NASA’s Goddard Space Flight Center. Da Poian and her co-researcher Eric Lyness, also from the GSFC, have developed an AI system that will debut on the ESA/Roscosmos ExoMars Rover, which will land on Mars in 2023.

The Rosalind Franklin rover will likely land at Oxia Planum, near the Martian equator. The area offers a smooth landing site and has the potential to hold preserved biosignatures. Image Credit: NASA – http://marsnext.jpl.nasa.gov/workshops/2014_05/14_Oxia_Thollot_webpage.pdf, Public Domain, https://commons.wikimedia.org/w/index.php?curid=44399454

“This is a visionary step in space exploration,” Da Poian said in a press release. “It means that over time we’ll have moved from the idea that humans are involved with nearly everything in space, to the idea that computers are equipped with intelligent systems, and they are trained to make some decisions and are able to transmit in priority the most interesting or time-critical information.”

The question of effective data transmission is real. It’s a bottleneck in mission design. Data has a cost—check your mobile phone plan—and as we send missions further and further out into the Solar System, and as our rovers and orbiters become more and more science-capable, the cost of transmitting all that data will balloon.

“Data from a rover on Mars can cost as much as 100,000 times as much as data on your cell phone, so we need to make those bits as scientifically valuable as possible,” said co-researcher Eric Lyness.

But as Lyness points out, it’s not just the expense. The inability of current rovers to consider what they’re doing with their samples is holding us back scientifically.

The Mars Organic Molecule Analyzer (MOMA) is the largest instrument on the Rosalind Franklin rover. It mills the samples, heats them, and performs mass spectrometry and gas chromatography to identify molecules. The new neural net AI system will be tested with MOMA on Mars. Image Credit: Max Planck Institute for Solar System Research.

“It costs a lot of time and money to send the data back to Earth which means scientists can’t run as many experiments or analyse as many samples as they would like,” said Lyness in a press release. “By using AI to do an initial analysis of the data after it is collected but before it is sent back to Earth, NASA can optimise what we receive, which greatly increases the scientific value of space missions.”

This work is centered on a single instrument on the ExoMars rover. (The rover has been re-christened the Rosalind Franklin rover, in honor of the scientist Rosalind Franklin, whose work was critical to our understanding of DNA.) The instrument is MOMA, the Mars Organic Molecule Analyzer. MOMA is the largest instrument on the Rosalind Franklin, and the rover can drill below the Martian surface to collect samples of organic molecules from depths where they’re safe from degradation by the Sun and cosmic rays.

But many of MOMA’s samples—maybe the majority of them—won’t contain any organic molecules of interest. Others will need to be re-tested. The idea behind the new AI is to hand those decisions over to the rover. That will reduce the amount of data that needs to be transmitted, and it will hopefully increase MOMA’s effectiveness.
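To make that idea concrete, here is a minimal sketch in Python of what an onboard triage step could look like. This is not the team’s flight software; the class, thresholds, and categories are hypothetical, just to illustrate how a rover might decide which sample data earns a full downlink, which gets retested, and which is only summarized.

```python
# Hypothetical sketch of onboard sample triage -- not the actual MOMA/ExoMars software.
from dataclasses import dataclass

@dataclass
class SampleAnalysis:
    sample_id: str
    interest_score: float   # 0.0-1.0, e.g. confidence that organics of interest are present
    data_size_mb: float     # size of the full dataset for this sample

def triage(analysis: SampleAnalysis,
           transmit_threshold: float = 0.8,
           retest_threshold: float = 0.5) -> str:
    """Decide what to do with a sample's data before spending precious bandwidth."""
    if analysis.interest_score >= transmit_threshold:
        return "transmit"   # high-value data: send the full dataset home
    if analysis.interest_score >= retest_threshold:
        return "retest"     # ambiguous: re-run the measurement while the sample is still onboard
    return "summarize"      # low value: send only a small summary, save bandwidth for better samples

# Example: three samples, only one earns a full downlink.
for s in [SampleAnalysis("S-001", 0.91, 42.0),
          SampleAnalysis("S-002", 0.62, 40.5),
          SampleAnalysis("S-003", 0.11, 39.8)]:
    print(s.sample_id, triage(s))
```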

“What we get from these unmanned missions is data, lots of it; and sending data over hundreds of millions of kilometres can be very challenging in different environments and extremely expensive; in other words, bandwidth is limited,” explained Da Poian. “We need to prioritize the volume of data we send back to Earth, but we also need to ensure that in doing that we don’t throw out vital information. This has led us to begin to develop smart algorithms which can for now help the scientists with their analysis of the sample and their decision-making process regarding subsequent operations, and as a longer-term objective, algorithms that will analyse the data itself, will adjust and tune the instruments to run next operations without the ground-in-the-loop, and will transmit home only the most interesting data.”

Artist’s impression of New Horizons’ close encounter with the Pluto–Charon system. Data transmission was a critical issue for the mission. Its fly-by of Pluto was on July 14, 2015, but the last data from that encounter wasn’t received on Earth until October 2016. More capable onboard AI would be a benefit to missions to the outer reaches of the Solar System. Credit: NASA/JHU APL/SwRI/Steve Gribben

The Rosalind Franklin communicates with Earth via the Trace Gas Orbiter (TGO). However, the TGO only passes overhead twice per day, so ground controllers won’t be able to directly control the rover. Instead, it’s designed to navigate autonomously across the Martian surface, up to 70 m (230 ft) per Martian day. The intelligent systems for sample analysis will raise the autonomous capability of the rover and, hopefully, of future rovers. For the Rosalind Franklin, most sample data will still be sent back to Earth, but for future rovers, that could change.

The team tested their autonomous system with a replica MOMA instrument in their lab. The testing allowed them to “train” the neural network algorithm to recognize familiar compounds. The system will then compare newly-taken samples with its catalog of known samples, and will alert scientists when it finds a match.

When the system encounters the spectrum of an unknown compound, it can categorize it with up to 94% accuracy, and it can match it to previously seen compounds with 87% accuracy. So the potential data and time savings are already substantial. And the researchers aren’t finished: they’re still improving its accuracy in preparation for the 2023 launch date.
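The team’s actual system is a neural network trained on spectra from the lab-replica instrument, and its internals aren’t described here. As a simplified stand-in, the sketch below matches a new spectrum against a small catalog of reference spectra using plain cosine similarity and only flags confident matches; the compound names, spectra, and threshold are invented for illustration.

```python
# Simplified illustration of matching a new spectrum against a catalog of known
# compounds. The real MOMA system uses a trained neural network; this sketch uses
# cosine similarity, and all spectra and names here are invented.
import numpy as np

# Catalog of reference spectra measured on a lab-replica instrument
# (each spectrum is intensity binned over the same m/z range).
catalog = {
    "phospholipid_standard": np.array([0.1, 0.7, 0.2, 0.0, 0.9, 0.3]),
    "amino_acid_mix":        np.array([0.8, 0.1, 0.0, 0.6, 0.2, 0.1]),
    "mineral_background":    np.array([0.2, 0.2, 0.3, 0.2, 0.1, 0.2]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(spectrum: np.ndarray, min_confidence: float = 0.85):
    """Return (name, score) of the closest catalog entry, or None if nothing is close enough."""
    name, score = max(((n, cosine_similarity(spectrum, ref)) for n, ref in catalog.items()),
                      key=lambda pair: pair[1])
    return (name, score) if score >= min_confidence else None

new_spectrum = np.array([0.1, 0.6, 0.25, 0.05, 0.85, 0.35])
match = best_match(new_spectrum)
if match:
    print(f"Possible match: {match[0]} (similarity {match[1]:.0%}) -- flag for scientists")
else:
    print("No confident match -- queue full spectrum for downlink or retest")
```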

“The mission will face severe time limits.”

Eric Lyness, Co-Researcher, NASA GSFC

We’re accustomed to rover missions substantially outlasting their initial mission length. For instance, NASA’s Opportunity rover was designed to last 90 sols on Mars, but it lasted 5,352 sols. And MSL Curiosity’s primary mission was planned for 668 sols, but the rover has been working for over 2,800 sols and is still going strong.

But it’s unwise to assume that the Rosalind Franklin rover will exceed its mission length by such large margins. Its mission is based on travelling 4 km (2.5 mi) in seven months. How long the mission will actually last is unclear.

The ESA ExoMars rover launch has been rescheduled to 2023. Credit: ESA

“The mission will face severe time limits. When we will be operating on Mars, samples will only remain in the rover for at most a few weeks before the rover dumps the sample and moves to a new place to drill,” said Lyness. “So, if we need to retest a sample, we need to do it quickly, sometimes within 24 hours.”

“In the future, as we move to explore the moons of Jupiter such as Europa, and of Saturn such as Enceladus and Titan, we will need real-time decisions to be made onsite.”

Eric Lyness, Co-Researcher, NASA GSFC

When Rosalind Franklin drills down below the surface and collects a sample, it places it in the Analytical Laboratory Drawer (ALD). The samples will then be analyzed with MOMA and two other instruments: an infrared spectrometer called MicrOmega, and a Raman spectrometer called the Raman Laser Spectrometer (RLS). As Lyness points out, samples can’t be kept in the rover for long. And that time pressure will only be more severe for rovers exploring places like Saturn’s moon Titan, the destination for NASA’s Dragonfly mission in 2026.

In this illustration, the Dragonfly helicopter drone is descending to the surface of Titan. That mission will face even more severe data transmission problems than Mars rovers. Image: NASA

“In the future, as we move to explore the moons of Jupiter such as Europa, and of Saturn such as Enceladus and Titan, we will need real-time decisions to be made onsite,” said Lyness. “With these moons it can take 5 to 7 hours for a signal from Earth to reach the instruments, so this will not be like controlling a drone, with an instant response. We need to give the instruments the autonomy to make rapid decisions to reach our science goals on our behalf.”

“We’ll still need humans to interpret the findings, but the first filter will be the AI system.”

Eric Lyness, Co-Researcher, NASA GSFC

The data from Rosalind Franklin’s samples is all based on probabilities. The data can be difficult to interpret: there’s no red light that flashes and says “Evidence of Life Found!” The data and its probabilities need to be interpreted, preferably by multiple researchers in the scientific community who then publish their results.

“These results will largely tell us about the geochemistry that the instruments find.” said Lyness. “We’re aiming for the system to give scientists directions, for example our system might say ‘I’ve got 91% confidence that this sample corresponds to a real world sample and I’m 87% sure it is phospholipids, similar to a sample tested on July 24th, 2018 and here is what that data looked like.’ We’ll still need humans to interpret the findings, but the first filter will be the AI system.”
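As a purely hypothetical illustration of that “first filter” output (the field names and format below are assumptions, not the actual MOMA report), the kind of summary Lyness describes could be assembled like this:

```python
# Hypothetical format for the kind of human-readable report Lyness describes.
# None of these fields or values come from the actual MOMA software.
from dataclasses import dataclass
from datetime import date

@dataclass
class MatchReport:
    sample_confidence: float      # confidence the spectrum reflects a real, usable measurement
    compound_confidence: float    # confidence in the compound identification
    compound: str                 # best-guess compound class
    reference_date: date          # when the closest lab reference sample was measured

    def summary(self) -> str:
        return (f"{self.sample_confidence:.0%} confidence this corresponds to a real sample; "
                f"{self.compound_confidence:.0%} sure it is {self.compound}, "
                f"similar to a sample tested on {self.reference_date:%B %d, %Y}.")

print(MatchReport(0.91, 0.87, "phospholipids", date(2018, 7, 24)).summary())
```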

Systems like this one are only going to get more sophisticated. There are potential pitfalls in handing decisions over to machines, but the payoff for space agencies is too enticing to ignore. By the time the system is field-tested on Mars, and then implemented in NASA’s Dragonfly mission to Titan, who knows how powerful and capable it’ll be.

Evan Gough
