
Computer to Simulate Exploding Star

Image credit: University of Chicago
Scientists at the University of Chicago are preparing to run the most advanced supercomputer simulation of an exploding star ever attempted.

Tomasz Plewa, a Senior Research Associate in the University's Center for Astrophysical Thermonuclear Flashes (the Flash Center) and the Department of Astronomy & Astrophysics, expects the simulation to reveal the mechanics of exploding stars, called supernovae, in unprecedented detail.

The simulation is made possible by the U.S. Department of Energy's special allocation of an extraordinary 2.7 million hours of supercomputing time to the Flash Center, which typically uses less than 500,000 hours of supercomputer time annually.

"This is beyond imagination," said Plewa, who submitted the Flash Center proposal on behalf of a research team at the University and Argonne National Laboratory.

The Flash Center project was one of three selected to receive supercomputer time allocations under a new competitive program announced last July by Secretary of Energy Spencer Abraham.

The other two winning proposals came from the Georgia Institute of Technology, which received 1.2 million processor hours, and the DOE's Lawrence Berkeley National Laboratory, which received one million processor hours.

The supercomputer time will help the Flash Center more accurately simulate the explosion of a white dwarf star, one that has burned most or all of its nuclear fuel. These explosions, known as Type Ia supernovae, shine so brightly that astronomers use them to measure distances in the universe. Nevertheless, many details about what happens during a supernova remain unknown.

Simulating a supernova is computationally intensive because it involves vast scales of time and space. White dwarf stars gravitationally accumulate material from a companion star for millions of years, but ignite in less than a second. Simulations must also account for physical processes that occur on a scale that ranges from a few hundredths of an inch to the entire surface of the star, which is comparable in size to Earth.
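To see why, a rough back-of-the-envelope sketch helps (ours, in Python; the centimetre conversion and the Earth-like radius are illustrative assumptions, not figures from the release). The ratio between the smallest and largest length scales sets how many cells a uniform simulation grid would need:

# Illustrative dynamic-range estimate for a whole-star supernova simulation.
smallest_scale_cm = 0.05 * 2.54   # "a few hundredths of an inch", in centimetres
star_radius_cm = 6.4e8            # assumed Earth-like radius (~6,400 km)

dynamic_range = star_radius_cm / smallest_scale_cm
print(f"scale ratio: {dynamic_range:.1e}")   # ~5e9

# A uniform 3-D grid resolving both scales would need roughly
# dynamic_range**3 cells -- on the order of 1e29, far beyond any computer.
print(f"uniform-grid cells: {dynamic_range ** 3:.1e}")

Scaling like that is why codes such as the Flash Center's typically lean on adaptive mesh refinement, concentrating resolution only where the burning actually happens.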

Similar computational problems vex the DOE's nuclear weapons Stockpile Stewardship and Management Program. In the wake of the Comprehensive Test Ban Treaty, which President Clinton signed in 1996, the reliability of the nation's nuclear arsenal must now be tested via computer simulations rather than in the field.

"The questions ultimately are, how is the nuclear arsenal aging with time, and is your code predicting that aging process correctly?" Plewa said.

Flash Center scientists verify the accuracy of their supernova code by comparing the results of their simulations both to laboratory experiments and to telescopic observations. Spectral observations of supernovae, for example, provide a sort of bar code revealing which chemical elements are produced in the explosions. Those observations currently conflict with the simulations.

"You want to reconcile current simulations with observations regarding chemical composition and the production of elements," Plewa said.

Scientists also wish to see more clearly the sequence of events that occurs immediately before a star goes supernova. It appears that a supernova begins in the core of a white dwarf star and expands toward the surface like an inflating balloon.

According to one theory, the flame front initially expands at a relatively "slow" subsonic speed of 60 miles per second. Then, at some unknown point, the flame front detonates and accelerates to supersonic speeds. In the ultra-dense material of a white dwarf, supersonic speeds exceed 3,100 miles per second.
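For scale, a quick conversion of those figures (the kilometre values and the assumed Earth-sized radius below are ours, not the release's):

MILE_KM = 1.609344  # kilometres per mile

subsonic_kms = 60 * MILE_KM       # deflagration: ~97 km/s
detonation_kms = 3100 * MILE_KM   # detonation: ~4,990 km/s

print(f"deflagration: ~{subsonic_kms:.0f} km/s")
print(f"detonation: ~{detonation_kms:.0f} km/s")
print(f"speed-up at detonation: ~{detonation_kms / subsonic_kms:.0f}x")

# At detonation speed, a front would cross an Earth-sized star
# (radius ~6,400 km, an assumption) in about a second:
print(f"crossing time: ~{6400 / detonation_kms:.1f} s")

That crossing time of roughly a second matches the release's point that, after millions of years of buildup, the explosion itself is nearly instantaneous.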

Another possibility: the initial subsonic wave fizzles when it reaches the outer part of the star, leading to a collapse of the white dwarf, the mixing of unburned nuclear fuel, and then detonation.

"It will be very nice if in the simulations we could observe this transition to detonation," Plewa said.

Flash Center scientists already are on the verge of recreating this moment in their simulations. The extra computer time from the DOE should push them across the threshold.

The center will increase the resolution of its simulations to one kilometer (six-tenths of a mile) for a whole-star simulation. Previously, the center could achieve a resolution of five kilometers (3.1 miles) for a whole-star simulation, or 2.5 kilometers (1.5 miles) for a simulation encompassing only one-eighth of a star.
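That jump in resolution costs more than it may sound. Here is a minimal sketch of the scaling argument, assuming a uniform grid and an explicit, timestep-limited (CFL) hydrodynamics scheme, neither of which the release specifies:

# Illustrative cost scaling for refining a 3-D simulation from 5 km to 1 km cells.
old_dx_km, new_dx_km = 5.0, 1.0
refine = old_dx_km / new_dx_km           # 5x finer in each dimension

cell_factor = refine ** 3                # 125x more cells
step_factor = refine                     # ~5x more timesteps (CFL condition)
cost_factor = cell_factor * step_factor  # ~625x the compute, on a uniform grid

print(f"~{cost_factor:.0f}x the cost of the old whole-star run")

Adaptive mesh refinement softens this considerably in practice, but the steep scaling is what a 2.7-million-hour allocation buys.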

The latter simulations fail to capture perturbations that may take place in other sections of the star, Plewa said. But they may soon become scientific relics.

"I hope by summer we'll have all the simulations done and we'll move on to analyze the data," he said.

Original Source: University of Chicago News Release

Fraser Cain

Fraser Cain is the publisher of Universe Today. He's also the co-host of Astronomy Cast with Dr. Pamela Gay.
