
AI Upscales Apollo Lunar Footage to 60 FPS

As thrilling as it is to watch all the historic footage from the Apollo Moon landings, you have to admit the quality is sometimes not all that great. Even though NASA has worked on restoring and enhancing some of the most popular Apollo footage, some of it is still grainy or blurry, a limitation of the video technology available in the 1960s.

But now, new developments in artificial intelligence have come to the rescue, giving viewers a nearly brand-new experience of historic Apollo video.

A photo and film restoration specialist, who goes by the name of DutchSteamMachine, has worked some AI magic to enhance original Apollo film, creating strikingly clear and vivid video clips and images.

“I really wanted to provide an experience on this old footage that has not been seen before,” he told Universe Today.

Take a look at this enhanced footage of an Apollo 16 lunar rover traverse with Charlie Duke and John Young, where footage originally shot at 12 frames per second (FPS) has been increased to 60 FPS:

Stunning, right? And I was blown away by the crisp view of the Moon’s surface in this enhanced view of Apollo 15’s landing site at Hadley Rille:

Or take a look at how clearly Neil Armstrong is visible in this enhanced version of the often-seen “first step” video from Apollo 11, shot by a 16mm film camera mounted inside the Lunar Module:

Wow, just incredible!

The AI that DutchSteamMachine uses is called Depth-Aware video frame INterpolation, or DAIN for short. DAIN is open source, free, and constantly being developed and improved. Motion interpolation, or motion-compensated frame interpolation, is a form of video processing in which intermediate frames are generated between existing ones to make motion appear more fluid and to reduce the choppiness of low-frame-rate footage.
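
To give a rough sense of what "generating intermediate frames" means, here is a minimal Python sketch that inserts in-between frames by simple linear cross-fading of consecutive frames. This is not what DAIN does (DAIN estimates depth and motion to synthesize sharp new frames rather than blends), and the function names and file paths here are purely illustrative.

```python
# Minimal sketch of frame interpolation by linear cross-fading.
# NOTE: this is NOT DAIN -- DAIN uses depth and motion estimation to
# synthesize new frames -- it only illustrates the basic idea of
# creating "fake" in-between frames from pairs of real ones.
import cv2
import numpy as np

def interpolate_linear(frame_a, frame_b, n_new):
    """Generate n_new blended frames between two real frames."""
    frames = []
    for i in range(1, n_new + 1):
        t = i / (n_new + 1)  # blending weight, 0 < t < 1
        frames.append(cv2.addWeighted(frame_a, 1.0 - t, frame_b, t, 0.0))
    return frames

def upsample(frames, factor):
    """Insert (factor - 1) blended frames between each consecutive pair."""
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        out.append(a)
        out.extend(interpolate_linear(a, b, factor - 1))
    out.append(frames[-1])
    return out

# Hypothetical usage: 12 fps source frames upsampled 5x toward 60 fps.
# real_frames = [cv2.imread(f"frames/{i:06d}.png") for i in range(12)]
# smooth = upsample(real_frames, factor=5)
```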

“People have used the same AI programs to bring old film recordings from the 1900s back to life, in high definition and colour,” he said. “This technique seemed like a great thing to apply to much newer footage.”

But you may not be able to try this at home: it takes a powerful, high-end GPU (with special cooling fans!). DutchSteamMachine said that a video of just 5 minutes can take anywhere from 6 to 20 hours to process. But the results speak for themselves.

He explained how he does this work:

“First I set out to find the highest quality source videos, which I thankfully found as high-bitrate 720p video files,” he said. “So the quality problem was solved. It is important to start with the highest possible source and edit from there. However, most of the sequences shot were still very choppy. This is because to spare film and record for long periods of time, most of the rover footage was shot at 12, 6 or even 1 frame(s) per second. While people have previously tried to apply stabilization and/or types of frame-blending to ease this effect, I have never really been satisfied with it.”

DutchSteamMachine first tries to find what framerate the footage was shot at, which can usually be found in NASA documents or, as in the case of the Apollo 16 footage above, heard when the astronauts announce it as they turn the camera on.

“Unfortunately sometimes the framerate seems to be off or fluctuating, not always working as intended,” he said. “So the best way to find the framerate is to listen to landmarks the astronauts are talking about and match the footage to that.”

Want more details of the process? He explains more:

“I split the source file up into individual PNG frames and input them to the AI together with the input framerate (1, 6, 12 or 24) and the desired rate of interpolation (2x, 4x, 8x). The AI starts using my GPU and looks at two real, consecutive frames. Using algorithms, it analyzes movements of objects in the two frames and renders entirely new ones. With an interpolation rate of, for example, 5x, it is able to render 5 ‘fake’ frames from just 2 real frames. If footage was recorded at 12 fps and the interpolation rate is set to 5x, the final framerate will be 60, meaning that with just 12 real frames it made 48 ‘fake’ frames. Both the real and fake frames are then exported back to a video and played back at 60 fps.

“Finally, I apply colour correction, as often the source files have a blue or orange tint to them. I synchronize the footage with audio and, if possible, also with television and photos taken at the same time. Sometimes two 16mm cameras were running at the same time, so I can play those back next to each other.”
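
To make the frame-splitting, the framerate arithmetic, and the reassembly concrete, here is a minimal Python sketch of the bookends of that kind of workflow, driven through ffmpeg. It is only an illustration under stated assumptions, not DutchSteamMachine's actual pipeline: the interpolation step itself (DAIN) is omitted, and all file and directory names are hypothetical.

```python
# Rough sketch of the split/reassemble steps around the AI interpolation,
# using ffmpeg from Python. The DAIN step itself is omitted; it would read
# the PNGs in frames_in/ and write the interpolated set to frames_out/.
# File and directory names are hypothetical.
import subprocess

SOURCE_FPS = 12                         # framerate the film was shot at
INTERP_RATE = 5                         # 5x interpolation
OUTPUT_FPS = SOURCE_FPS * INTERP_RATE   # 12 * 5 = 60 fps

# 1. Split the source video into individual PNG frames.
subprocess.run([
    "ffmpeg", "-i", "apollo16_rover_720p.mp4",
    "frames_in/%06d.png",
], check=True)

# 2. (AI interpolation runs here: at 5x, every second of 12 real frames
#    gains 48 generated ones, for 60 frames per second in total.)

# 3. Reassemble the interpolated PNGs into a 60 fps video.
subprocess.run([
    "ffmpeg", "-framerate", str(OUTPUT_FPS),
    "-i", "frames_out/%06d.png",
    "-c:v", "libx264", "-pix_fmt", "yuv420p",
    "apollo16_rover_60fps.mp4",
], check=True)
```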

Here’s a video he shared of his studio and his specialized equipment:

DutchSteamMachine does this work in his spare time and posts it for free on his YouTube channel. His tagline is “Preserving the past for the future,” and he also uses the same techniques to enhance old home video, images and slides.

“It’s great to read people’s reactions to my footage,” he said. “When people post things like, ‘Wow! This is amazing! I have never seen this before!’, it keeps me going.”

If you’d like to support the amazing restoration and enhancement work that DutchSteamMachine is doing with the Apollo footage, here’s his Patreon page. By supporting his work, you’ll get extras, early access and previews of upcoming work, and a chance to ask questions about the process.

And he’s planning to keep it all coming.

“I plan to improve tons of Apollo footage like this,” he said. “A lot more space and history-related footage is going to be published on my YT channel continuously.” He also has a Flickr page with more enhanced imagery.

Thanks to DutchSteamMachine for sharing the details of his work! You can find more at these links:

Patreon
YouTube
Flickr

Nancy Atkinson

Nancy has been with Universe Today since 2004, and has published over 6,000 articles on space exploration, astronomy, science and technology. She is the author of two books: "Eight Years to the Moon: The History of the Apollo Missions" (2019), which shares the stories of 60 engineers and scientists who worked behind the scenes to make landing on the Moon possible; and "Incredible Stories from Space: A Behind-the-Scenes Look at the Missions Changing Our View of the Cosmos" (2016), which tells the stories of those who work on NASA's robotic missions to explore the Solar System and beyond. Follow Nancy on Twitter at https://twitter.com/Nancy_A and Instagram at https://www.instagram.com/nancyatkinson_ut/
