
Radiation-Resistant Computers

EAFTC computers in a space-ready flight chassis. Image credit: NASA/Honeywell.
Unfortunately, the radiation that pervades space can trigger computer glitches. When high-speed particles, such as cosmic rays, collide with the microscopic circuitry of computer chips, they can cause chips to make errors. If those errors send the spacecraft flying off in the wrong direction or disrupt the life-support system, it could be bad news.

To ensure safety, most space missions use radiation hardened computer chips. “Rad-hard” chips are unlike ordinary chips in many ways. For example, they contain extra transistors that take more energy to switch on and off. Cosmic rays can’t trigger them so easily. Rad-hard chips continue to do accurate calculations when ordinary chips might “glitch.”

NASA relies almost exclusively on these extra-durable chips to make computers space-worthy. But these custom-made chips have some downsides: They’re expensive, power hungry, and slow — as much as 10 times slower than an equivalent CPU in a modern consumer desktop PC.

With NASA sending people back to the moon and on to Mars under the Vision for Space Exploration, mission planners would love to give their spacecraft more computing horsepower.

Having more computing power onboard would help spacecraft conserve one of their most limited resources: bandwidth. The bandwidth available for beaming data back to Earth is often a bottleneck, with transmission speeds even slower than old dial-up modems. If the reams of raw data gathered by the spacecraft’s sensors could be “crunched” onboard, scientists could beam back just the results, which would take much less bandwidth.
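
As a rough, purely illustrative sketch of that idea in Python (the sensor values and the summary statistics here are invented; real onboard processing would be far more involved):

```python
# Toy sketch: instead of downlinking every raw sample, summarize onboard
# and transmit only the result over the slow link.
raw_samples = [0.01 * i for i in range(100_000)]   # stand-in for raw sensor data

summary = {
    "count": len(raw_samples),
    "min": min(raw_samples),
    "max": max(raw_samples),
    "mean": sum(raw_samples) / len(raw_samples),
}

# 100,000 raw values shrink to four numbers -- a far smaller payload
# for a radio link slower than an old dial-up modem.
print(summary)
```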

On the surface of the moon or Mars, explorers could use fast computers to analyze their data right after collecting it, quickly identifying areas of high scientific interest and perhaps gathering more data before a fleeting opportunity passes. Rovers would benefit, too, from the extra intelligence of modern CPUs.

Using the same inexpensive, powerful Pentium and PowerPC chips found in consumer PCs would help tremendously, but to do so, the problem of radiation-induced errors must be solved.

This is where a NASA project called Environmentally Adaptive Fault-Tolerant Computing (EAFTC) comes in. Researchers working on the project are experimenting with ways to use consumer CPUs in space missions. They’re particularly interested in “single event upsets,” the most common kind of glitches caused by single particles of radiation barreling into chips.
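
At its simplest, a single event upset is one bit flipping in a register or memory cell. A toy Python illustration of how much damage a single flipped bit can do to a stored value (the bit position is arbitrary):

```python
value = 42                 # 0b101010
upset = value ^ (1 << 5)   # a cosmic ray flips bit 5
print(value, "->", upset)  # 42 -> 10
```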

Team member Raphael Some of JPL explains: “One way to use faster, consumer CPUs in space is simply to have three times as many CPUs as you need: The three CPUs perform the same calculation and vote on the result. If one of the CPUs makes a radiation-induced error, the other two will still agree, thus winning the vote and giving the correct result.”
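
A minimal sketch of that triple-vote scheme, assuming each CPU hands its result to a simple software voter (this is not the actual EAFTC flight code):

```python
from collections import Counter

def majority_vote(results):
    """Return the value that at least two of the three replicas agree on.

    If a radiation-induced upset corrupts one CPU's result, the other
    two still match, win the vote, and give the correct answer.
    """
    value, count = Counter(results).most_common(1)[0]
    if count < 2:
        raise RuntimeError("no majority: more than one replica disagreed")
    return value

# One of the three CPUs suffers an upset and returns a corrupted value.
print(majority_vote([7281, 7281, 7409]))   # -> 7281
```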

This works, but often it’s overkill, wasting precious electricity and computing power to triple-check calculations that aren’t critical.

“To do this smarter and more efficiently, we’re developing software that weighs the importance of a calculation,” continues Some. “If it’s very important, like navigation, all three CPUs must vote. If it’s less important, like measuring the chemical makeup of a rock, only one or two CPUs might be involved.”
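
A toy sketch of how that weighting might look in software, reusing the majority_vote helper from the sketch above; the criticality labels and the cpu.run interface are invented for illustration:

```python
def run_with_redundancy(task, criticality, cpus):
    """Spend replicas only where the answer really matters."""
    if criticality == "high":                        # e.g. navigation
        results = [cpu.run(task) for cpu in cpus]    # run on all three CPUs
        return majority_vote(results)
    elif criticality == "medium":                    # e.g. housekeeping
        results = [cpu.run(task) for cpu in cpus[:2]]
        if results[0] != results[1]:                 # disagreement: escalate
            return run_with_redundancy(task, "high", cpus)
        return results[0]
    else:                                            # e.g. a rock's chemistry
        return cpus[0].run(task)                     # one CPU, accept the risk
```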

This is just one of dozens of error-correction techniques that EAFTC pulls together into a single package. The result is much better efficiency: Without the EAFTC software, a computer based on consumer CPUs needs 100-200% redundancy to protect against radiation-caused errors. (100% redundancy means 2 CPUs; 200% means 3 CPUs.) With EAFTC, only 15-20% redundancy is needed for the same degree of protection. All of that saved CPU time can be used productively instead.
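
Those redundancy figures translate into usable CPU time roughly as follows (a back-of-the-envelope calculation based only on the percentages quoted above):

```python
def useful_fraction(redundancy_percent):
    """Fraction of total CPU time spent on unique work, where redundancy
    is the duplicated work as a percentage of the original
    (100% = every calculation done twice, 200% = three times)."""
    return 1.0 / (1.0 + redundancy_percent / 100.0)

for r in (200, 100, 20, 15):
    print(f"{r:>3}% redundancy -> {useful_fraction(r):.0%} of CPU time does unique work")
# 200% -> 33%, 100% -> 50%, 20% -> 83%, 15% -> 87%
```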

“EAFTC is not going to replace rad-hard CPUs,” cautions Some. “Some tasks, such as life support, are so important we’ll always want radiation hardened chips to run them.” But, in due course, EAFTC algorithms might take some of the data-processing load off those chips, making vastly greater computer power available to future missions.

EAFTC’s first test will be onboard a satellite called Space Technology 8 (ST-8). Part of NASA’s New Millennium Program, ST-8 will flight-test new, experimental space technologies such as EAFTC, making it possible to use them in future missions with greater confidence.
The satellite, scheduled for a 2009 launch, will skim the Van Allen radiation belts during each of its elliptical orbits, testing EAFTC in this high-radiation environment similar to deep space.

If all goes well, space probes venturing across the solar system may soon be using the exact same chips found in your desktop PC — just without the glitches.

Original Source: NASA News Release

Fraser Cain

Fraser Cain is the publisher of Universe Today. He's also the co-host of Astronomy Cast with Dr. Pamela Gay.
