When humans start living and working on the Moon during the Artemis missions, they’re going to need good navigational aids. Sure, they’ll have a GPS equivalent to help them find their way around. And there’ll be LunaNet, the Moon’s equivalent of the Internet. But there are places on the lunar surface that are pretty remote. In those cases, explorers could require more than one method for communication and navigation. That prompted NASA Goddard research engineer Alvin Yew to create an AI-driven local map service that uses local landmarks for navigation.
The idea is to use already-gathered surface data from astronaut photographs and mapping missions to provide overlapping navigational aids. “For safety and science geotagging, it’s important for explorers to know exactly where they are as they explore the lunar landscape,” said Yew, who works at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “Equipping an onboard device with a local map would support any mission, whether robotic or human.”
Having a map-based system as a backup would make life a lot easier for explorers in craters, for example, said Yew. “The motivation for me was to enable lunar crater exploration, where the entire horizon would be the crater rim.”
The heart of Yew’s system is data from the Lunar Reconnaissance Orbiter. That spacecraft is mapping the Moon’s surface in the highest possible detail and performing other lunar science and exploration tasks. The onboard Lunar Orbiter Laser Altimeter (LOLA) has provided high-resolution topographic maps of the Moon.
Yew fed LOLA data into an AI program that uses digital elevation models to recreate features on the lunar horizon as they would appear to an explorer standing on the surface. The result is a series of digital panoramas. The AI can correlate them with known surface features, such as large boulders or ridges, to provide accurate location identification for any given area.
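NASA hasn’t published the details of Yew’s software, but the core idea — collapse a digital elevation model into a horizon profile as seen from a candidate location, then match that profile against what the explorer actually observes — can be sketched in a few lines. Everything below (function names, the 360-bin resolution, the brute-force rotation match) is an illustrative assumption, not the actual system:

```python
import numpy as np

def horizon_signature(dem, x0, y0, cell=10.0, n_az=360):
    """Approximate the horizon profile seen from grid cell (x0, y0).

    dem: 2-D array of elevations in meters; cell: grid spacing in meters.
    Returns n_az elevation angles (radians), one per azimuth bin --
    the highest terrain angle visible in that direction.
    """
    h0 = dem[y0, x0]
    ys, xs = np.indices(dem.shape)
    dx = (xs - x0) * cell
    dy = (ys - y0) * cell
    dist = np.hypot(dx, dy)
    dist[y0, x0] = np.inf                        # ignore the observer's own cell
    ang = np.arctan2(dem - h0, dist)             # elevation angle to each cell
    az = np.mod(np.arctan2(dx, dy), 2 * np.pi)   # azimuth of each cell
    bins = (az / (2 * np.pi) * n_az).astype(int) % n_az
    sig = np.full(n_az, -np.pi / 2)              # bins with no terrain stay "below horizon"
    np.maximum.at(sig, bins.ravel(), ang.ravel())
    return sig

def best_rotation_match(sig, ref):
    """Find the circular shift of `ref` that best matches `sig`
    (the explorer's heading is unknown), via brute-force correlation."""
    scores = [np.dot(sig, np.roll(ref, k)) for k in range(len(ref))]
    return int(np.argmax(scores))
```

A localization system would compute such signatures for many candidate positions in the DEM and pick the one whose best rotation score is highest against the observed panorama; the sketch above only shows the per-candidate step.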
“Conceptually, it’s like going outside and trying to figure out where you are by surveying the horizon and surrounding landmarks,” Yew said. “While a ballpark location estimate might be easy for a person, we want to demonstrate accuracy on the ground down to less than 30 feet (9 meters). This accuracy opens the door to a broad range of mission concepts for future exploration.”
Yew’s geolocation system also has roots in the capabilities of GIANT (Goddard Image Analysis and Navigation Tool), developed by Goddard engineer Andrew Liounis. Scientists used GIANT to double-check and verify navigation data for NASA’s OSIRIS-REx mission. That spacecraft went to the asteroid Bennu to collect a sample for analysis here on Earth.
There may soon come a time when a lunar explorer will head out to study various surface features. They’ll be equipped with cameras and communication equipment. That’s similar to Earth geologists heading into the field with a DSLR and a cellphone with GPS and satellite access. They can find their way around by noting landmarks, but it’s always useful to have backup methods. Of course, here on Earth, we have multiple communication networks.
On the Moon, that infrastructure isn’t in place yet, though it should be by the time the Artemis program is fully underway. Still, it won’t be long before those lunar geologists are “in the field” themselves, and they’ll need all the help they can get as they do their work. According to a study published by Goddard researcher Erwan Mazarico, a lunar surface explorer can see at most about 180 miles (300 kilometers) from any unobstructed location on the Moon. That makes long-term surface studies across wide areas a bit more challenging. Ideally, a surface explorer could use the “app” Yew is developing in a handheld device. Like a portable GPS unit, a lunar wayfinding device would help astronauts in regions without good line of sight. Onboard terrain data sets, including elevation data, would be part of its software.
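That 300-kilometer ceiling is consistent with simple spherical geometry: on a smooth sphere of the Moon’s mean radius (about 1,737.4 km), the straight-line distance to the horizon for an observer at height h is roughly √(2Rh). A back-of-the-envelope check (the numbers below are illustrative, not from Mazarico’s study):

```python
import math

MOON_RADIUS_M = 1_737_400  # mean lunar radius in meters

def horizon_distance(h):
    """Straight-line distance (m) to the horizon for an observer
    at height h (m) above a smooth sphere of lunar radius."""
    return math.sqrt(2 * MOON_RADIUS_M * h + h * h)

# A standing astronaut (eye height ~1.7 m) sees only a few kilometers:
print(round(horizon_distance(1.7) / 1000, 1))   # 2.4 (km)

# Two elevated points -- say, mountains or crater rims ~5 km up --
# can see each other over the sum of their horizon distances,
# approaching the study's ~300 km upper bound:
print(round((horizon_distance(5000) * 2) / 1000))  # 264 (km)
```

The takeaway matches the article: from ground level the visible horizon is tiny, so a navigation aid that squeezes position information out of nearby landmarks is far more useful than one that assumes long sight lines.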
Yew’s geolocation system has some likely applications beyond the Moon. Even on Earth, location technology like Yew’s could help explorers in terrain where GPS signals are obstructed or subject to interference. Matching AI-interpreted visual data against known surface models could provide a new generation of navigation tools, not just on Earth and the Moon, but on Mars as well.
NASA Developing AI to Steer Using Landmarks — On the Moon
Lunar Reconnaissance Orbiter
LunaNet: Empowering Artemis with Communications and Navigation Interoperability