
Navigation Could Be Done on the Moon Just by Looking at Nearby Landmarks

When humans start living and working on the Moon during the Artemis missions, they’re going to need good navigational aids. Sure, they’ll have a GPS equivalent to help them find their way around. And, there’ll be LunaNet, the Moon’s equivalent to the Internet. But, there are places on the lunar surface that are pretty remote. In those cases, explorers could require more than one method for communication and navigation. That prompted NASA Goddard research engineer Alvin Yew to create an AI-driven local map service that uses nearby landmarks for navigation.

The idea is to use already-gathered surface data from astronaut photographs and mapping missions to provide overlapping navigational aids. “For safety and science geotagging, it’s important for explorers to know exactly where they are as they explore the lunar landscape,” said Alvin Yew, a research engineer at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “Equipping an onboard device with a local map would support any mission, whether robotic or human.”

Having a map-based system as a backup would make life a lot easier for explorers in craters, for example, said Yew. “The motivation for me was to enable lunar crater exploration, where the entire horizon would be the crater rim.”

The collection of ridges, craters, and boulders that form a lunar horizon can be used by artificial intelligence to accurately locate a lunar traveler. A system being developed by Research Engineer Alvin Yew would provide a backup location service for future explorers, robotic or human. Credits: NASA/MoonTrek/Alvin Yew

Using Moon Mapping Data for Navigational Aid

The heart of Yew’s system is data from the Lunar Reconnaissance Orbiter. That spacecraft is mapping the Moon’s surface in the highest possible detail and performing other lunar science and exploration tasks. The onboard Lunar Orbiter Laser Altimeter (LOLA) has provided high-resolution topographic maps of the Moon.

Yew fed LOLA data into an AI program that uses digital elevation models to recreate features on the lunar horizon as they would appear to an explorer standing on the surface. The result is a series of digital panoramas. The AI can correlate them with known surface features, such as large boulders or ridges, to provide accurate location identification for any given area.
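Yew hasn’t published his implementation, but the underlying idea can be sketched in a few lines. The snippet below is a minimal, illustrative take: it renders a synthetic horizon profile from a gridded digital elevation model (a stand-in for LOLA data), then scores candidate locations by how well that rendered horizon matches one measured by an explorer’s camera or scanner. The function names, the flat-terrain ray-marching, and the brute-force candidate search are assumptions for illustration, not NASA’s actual method.

```python
import numpy as np

def horizon_profile(dem, res_m, row, col, n_az=360, max_range_m=20_000, eye_h=1.7):
    """Apparent horizon elevation angle (radians) in each azimuth direction,
    found by ray-marching outward across the DEM from (row, col).
    Flat-body approximation; a real tool would also model lunar curvature."""
    h0 = dem[row, col] + eye_h
    steps = np.arange(res_m, max_range_m, res_m)        # distances along each ray
    angles = np.empty(n_az)
    for i, az in enumerate(np.linspace(0, 2 * np.pi, n_az, endpoint=False)):
        dr, dc = -np.cos(az), np.sin(az)                 # north = -row, east = +col
        rr = np.clip((row + dr * steps / res_m).astype(int), 0, dem.shape[0] - 1)
        cc = np.clip((col + dc * steps / res_m).astype(int), 0, dem.shape[1] - 1)
        angles[i] = np.max(np.arctan2(dem[rr, cc] - h0, steps))
    return angles

def match_score(observed, synthetic):
    """Best agreement over all headings (the explorer's azimuth is unknown),
    scored as the minimum mean squared difference over circular shifts."""
    return min(np.mean((observed - np.roll(synthetic, s)) ** 2)
               for s in range(len(synthetic)))

def locate(observed, dem, res_m, candidates):
    """Return the candidate (row, col) whose rendered horizon best matches
    the horizon profile measured on the surface."""
    return min(candidates,
               key=lambda rc: match_score(observed, horizon_profile(dem, res_m, *rc)))
```

In practice a search like this would be seeded by a rough position estimate, from dead reckoning or a LunaNet fix, so only a small patch of the map would need to be checked.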

“Conceptually, it’s like going outside and trying to figure out where you are by surveying the horizon and surrounding landmarks,” Yew said. “While a ballpark location estimate might be easy for a person, we want to demonstrate accuracy on the ground down to less than 30 feet (9 meters). This accuracy opens the door to a broad range of mission concepts for future exploration.”

Yew’s geolocation system also has roots in the capabilities of GIANT (Goddard Image Analysis and Navigation Tool), developed by Goddard engineer Andrew Liounis. Scientists used GIANT to double-check and verify navigation data for NASA’s OSIRIS-REx mission. That spacecraft went to the asteroid Bennu to collect a sample for analysis here on Earth.

Moon Maps in Your Device

There may soon come a time when a lunar explorer will head out to study various surface features. They’ll be equipped with cameras and communication equipment. That’s similar to Earth geologists heading into the field with a DSLR and a cellphone with GPS and satellite access. They can find their way around by noting landmarks, but it’s always useful to have backup methods. Of course, here on Earth, we have multiple communication networks.

LunaNet concept graphic for a possible communication and navigation device used on the Moon. Credits: NASA/Reese Patillo

On the Moon, that infrastructure isn’t in place yet. But, it should be there once the Artemis missions are fully underway. Still, it won’t be long before those lunar geologists are “in the field” themselves. And, they’ll need all the help they can get as they do their work. According to a study published by Goddard researcher Erwan Mazarico, a lunar surface explorer can see at most about 180 miles (300 kilometers) from any unobstructed location on the Moon. That makes long-term surface studies across wide areas a bit more challenging. Ideally, a surface explorer could use the “app” that Yew is developing in a handheld device. Like a portable GPS unit, a lunar wayfinding device would help astronauts in regions without good line-of-sight. Onboard terrain data sets, including elevation data, would be part of its software.
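That visibility ceiling follows from basic geometry: the Moon is a small, airless sphere, so the horizon is close unless you climb. Mazarico’s 300-kilometer figure comes from detailed analysis of LOLA elevation data over real terrain; the smooth-sphere back-of-envelope below is my own illustration of why the numbers are so small, not a calculation from the study.

```python
import math

LUNAR_RADIUS_M = 1_737_400  # mean lunar radius

def horizon_distance_m(observer_height_m, target_height_m=0.0):
    """Line-of-sight range on a smooth airless sphere: each party can see
    to its own geometric horizon, sqrt(2*R*h), and the two ranges add."""
    return (math.sqrt(2 * LUNAR_RADIUS_M * observer_height_m)
            + math.sqrt(2 * LUNAR_RADIUS_M * target_height_m))

# An astronaut (eyes ~1.7 m up) sees only ~2.4 km to a flat horizon;
# even two 10-km peaks top out near ~370 km of mutual visibility.
print(horizon_distance_m(1.7) / 1000)              # ≈ 2.4 km
print(horizon_distance_m(10_000, 10_000) / 1000)   # ≈ 373 km
```

Numbers like these are a big part of why a map-based backup matters: for an astronaut standing on flat ground, everything beyond a couple of kilometers is literally over the horizon.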

Yew’s geolocation system has some likely applications beyond the Moon. Even on Earth, location technology like Yew’s could help explorers in terrain where GPS signals are obstructed or subject to interference. Matching AI-interpreted visual data against known terrain models could provide a new generation of navigation tools, not just on Earth and the Moon, but even on Mars.

For More Information

NASA Developing AI to Steer Using Landmarks — On the Moon
Lunar Reconnaissance Orbiter
LunaNet: Empowering Artemis with Communications and Navigation Interoperability

Carolyn Collins Petersen
