
Computer to Simulate Exploding Star

Image credit: University of Chicago
University of Chicago scientists are preparing to run the most advanced supercomputer simulation of an exploding star ever attempted.

Tomasz Plewa, Senior Research Associate in the Center for Astrophysical Thermonuclear Flashes and Astronomy & Astrophysics, expects the simulation to reveal the mechanics of exploding stars, called supernovae, in unprecedented detail.

The simulation is made possible by the U.S. Department of Energy's special allocation of an extraordinary 2.7 million hours of supercomputing time to the Flash Center, which typically uses less than 500,000 hours of supercomputer time annually.

"This is beyond imagination," said Plewa, who submitted the Flash Center proposal on behalf of a research team at the University and Argonne National Laboratory.

The Flash Center project was one of three selected to receive supercomputer time allocations under a new competitive program announced last July by Secretary of Energy Spencer Abraham.

The other two winning proposals came from the Georgia Institute of Technology, which received 1.2 million processor hours, and the DOE's Lawrence Berkeley National Laboratory, which received one million processor hours.

The supercomputer time will help the Flash Center more accurately simulate the explosion of a white dwarf star, one that has burned most or all of its nuclear fuel. These supernovae shine so brightly that astronomers use them to measure distance in the universe. Nevertheless, many details about what happens during a supernova remain unknown.

Simulating a supernova is computationally intensive because it involves vast scales of time and space. White dwarf stars gravitationally accumulate material from a companion star for millions of years, but ignite in less than a second. Simulations must also account for physical processes that occur on a scale that ranges from a few hundredths of an inch to the entire surface of the star, which is comparable in size to Earth.
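To get a feel for that dynamic range, here is a rough back-of-the-envelope sketch in Python. The scale values are round-number readings of the figures above (a few hundredths of an inch, an Earth-sized star), and the factor-of-two refinement framing assumes an adaptive-mesh approach of the kind the FLASH code uses; the real simulation's numbers will differ.

```python
import math

# Assumed round-number scales, read loosely from the article
# (illustrative only, not the simulation's actual parameters):
smallest_m = 2.5e-4   # "a few hundredths of an inch" ~ 0.01 in ~ 2.5e-4 m
largest_m = 1.2e7     # an Earth-sized white dwarf, roughly 12,000 km across

dynamic_range = largest_m / smallest_m

# Number of factor-of-two mesh-refinement levels needed to bridge
# the smallest and largest scales on an adaptive mesh
levels = math.ceil(math.log2(dynamic_range))

print(f"dynamic range of scales: {dynamic_range:.1e}")      # ~5e10
print(f"factor-of-2 refinement levels needed: {levels}")    # ~36
```

A spread of roughly ten orders of magnitude is why no uniform grid can cover the problem, and why the time allocation described above is so large.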

Similar computational problems vex the DOE's nuclear weapons Stockpile Stewardship and Management Program. In the wake of the Comprehensive Test Ban Treaty, which President Clinton signed in 1996, the reliability of the nation's nuclear arsenal must now be tested via computer simulations rather than in the field.

"The questions ultimately are how is the nuclear arsenal aging with time, and is your code predicting that aging process correctly?" Plewa said.

Flash Center scientists verify the accuracy of their supernova code by comparing the results of their simulations both to laboratory experiments and to telescopic observations. Spectral observations of supernovae, for example, provide a sort of bar code that reveals which chemical elements are produced in the explosions. Those observations currently conflict with simulations.

"You want to reconcile current simulations with observations regarding chemical composition and the production of elements," Plewa said.

Scientists also wish to see more clearly the sequence of events that occurs immediately before a star goes supernova. It appears that a supernova begins in the core of a white dwarf star and expands toward the surface like an inflating balloon.

According to one theory, the flame front initially expands at a relatively "slow" subsonic speed of 60 miles per second. Then, at some unknown point, the flame front detonates and accelerates to supersonic speeds. In the ultra-dense material of a white dwarf, supersonic speeds exceed 3,100 miles per second.
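For metric-minded readers, a quick conversion sketch of the speeds quoted above (the mile-per-second figures come from the paragraph; only the unit conversion and the ratio are added here):

```python
# Convert the quoted flame-front speeds to km/s and compare them.
MI_TO_KM = 1.609344  # exact miles-to-kilometers factor

subsonic_mi_s = 60      # initial deflagration speed, miles per second
detonation_mi_s = 3_100  # supersonic threshold in white-dwarf matter

print(f"subsonic front:  {subsonic_mi_s * MI_TO_KM:,.0f} km/s")     # ~97 km/s
print(f"detonation:     >{detonation_mi_s * MI_TO_KM:,.0f} km/s")   # ~4,990 km/s
print(f"speed-up factor: ~{detonation_mi_s / subsonic_mi_s:.0f}x")  # ~52x
```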

Another possibility: the initial subsonic wave fizzles when it reaches the outer part of the star, leading to a collapse of the white dwarf, the mixing of unburned nuclear fuel and then detonation.

"It will be very nice if in the simulations we could observe this transition to detonation," Plewa said.

Flash Center scientists already are on the verge of recreating this moment in their simulations. The extra computer time from the DOE should push them across the threshold.

The center will increase the resolution of its simulations to one kilometer (six-tenths of a mile) for a whole-star simulation. Previously, the center could achieve a resolution of five kilometers (3.1 miles) for a whole-star simulation, or 2.5 kilometers (1.5 miles) for a simulation encompassing only one-eighth of a star.

The latter simulations fail to capture perturbations that may take place in other sections of the star, Plewa said. But they may soon become scientific relics.
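A rough scaling argument suggests why the jump from 5-kilometer to 1-kilometer resolution is so expensive. The sketch below assumes a uniform 3-D grid with an explicit, CFL-limited time step, a textbook approximation rather than the Flash Center's actual adaptive-mesh cost model:

```python
# Back-of-the-envelope cost scaling for the resolution jump described above,
# assuming a uniform 3-D grid with an explicit (CFL-limited) time step.
old_dx_km, new_dx_km = 5.0, 1.0
refinement = old_dx_km / new_dx_km          # 5x finer in each dimension

cells_factor = refinement ** 3              # 125x more grid cells
steps_factor = refinement                   # 5x more time steps (CFL limit)
work_factor = cells_factor * steps_factor   # ~625x more total work

print(f"{cells_factor:.0f}x cells, {steps_factor:.0f}x steps, "
      f"~{work_factor:.0f}x total compute")
```

Adaptive meshes concentrate resolution near the flame and bring the real cost well below that naive 625-fold figure, but the scaling helps explain why an allocation of 2.7 million processor hours matters.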

"I hope by summer we'll have all the simulations done and we'll move on to analyze the data," he said.

Original Source: University of Chicago News Release

Fraser Cain

Fraser Cain is the publisher of Universe Today. He's also the co-host of Astronomy Cast with Dr. Pamela Gay.
