New Simulation Models Galaxies Like Never Before

Zooming into an EAGLE galaxy. Credit: EAGLE Project Consortium/Schaye et al.

Astronomy is, by definition, intangible. Traditional laboratory-style experiments that utilize variables and control groups are of little use to the scientists who spend their careers analyzing the intricacies of our Universe. Instead, astronomers rely on simulations – robust, mathematically driven facsimiles of the cosmos – to investigate the long-term evolution of objects like stars, black holes, and galaxies. Now, a team of European researchers has broken new ground with the EAGLE project: a simulation whose close agreement between theory and observation means it can be used to probe the earliest epochs of galaxy formation, over 13 billion years ago.

The EAGLE project, which stands for Evolution and Assembly of GaLaxies and their Environments, owes much of its increased accuracy to the better modeling of galactic winds. Galactic winds are powerful streams of charged particles that “blow” out of galaxies as a result of high-energy processes like star formation, supernova explosions, and the regurgitation of material by active galactic nuclei (the supermassive black holes that lie at the heart of most galaxies). These mighty winds tend to carry gas and dust out of the galaxy, leaving less material for continued star formation and overall growth.

Previous simulations were problematic for researchers because they produced galaxies that were far older and more massive than those that astronomers see today; however, EAGLE’s simulation of strong galactic winds fixes these anomalies. By accounting for characteristic, high-speed ejections of gas and dust over time, researchers found that younger and lighter galaxies naturally emerged.
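
To see why stronger winds naturally produce lighter galaxies, it can help to play with a toy "bathtub" model of a galaxy's gas budget. The sketch below is not the EAGLE code – just a few lines of Python with illustrative, made-up numbers – but it captures the qualitative trend the simulation relies on: the more mass the winds eject per unit of star formation, the smaller the final stellar mass.

```python
# Toy "bathtub" model (NOT the EAGLE code): a gas reservoir forms stars,
# and galactic winds eject gas at a rate proportional to star formation.
# All numbers are illustrative assumptions, chosen only to show the trend.

def evolve_galaxy(gas_mass, wind_loading, inflow_rate=1.0,
                  sfr_efficiency=0.02, dt=0.01, steps=1000):
    """Integrate a simple gas/star budget; returns the final stellar mass."""
    stars = 0.0
    for _ in range(steps):
        sfr = sfr_efficiency * gas_mass      # star formation rate
        outflow = wind_loading * sfr         # wind mass-loss rate
        gas_mass += (inflow_rate - sfr - outflow) * dt
        stars += sfr * dt
    return stars

for eta in (0.0, 1.0, 5.0):                  # wind mass-loading factor
    print(f"wind loading {eta}: final stellar mass = {evolve_galaxy(10.0, eta):.2f}")
```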

After running the simulation on two European supercomputers, the Cosmology Machine at Durham University in England and Curie in France, the researchers concluded that the EAGLE project was a success. Indeed, the galaxies produced by EAGLE look just like those that astronomers expect to see when they look to the night sky. Richard Bower, a member of the team from Durham, raved, “The universe generated by the computer is just like the real thing. There are galaxies everywhere, with all the shapes, sizes and colours I’ve seen with the world’s largest telescopes. It is incredible.”

The upshots of this new work are not limited to scientists alone; you, too, can explore the Universe with EAGLE by downloading the team’s Cosmic Universe app. Videos of the EAGLE project’s simulations are also available on the team’s website.

A paper detailing the team’s work is published in the January 1 issue of Monthly Notices of the Royal Astronomical Society. A preprint of the results is available on the arXiv.

New Signal May Be Evidence of Dark Matter, Say Researchers

Dark Matter Halo and dwarf galaxies
All galaxies are thought to have a dark matter halo. This image shows the distribution of dark matter surrounding our very own Milky Way. Image credit: J. Diemand, M. Kuhlen and P. Madau (UCSC)

Dark matter is the architect of large-scale cosmic structure and the engine behind the observed rotation of galaxies. It’s an indispensable part of the physics of our Universe – and yet scientists still don’t know what it’s made of. The latest data from Planck suggest that the mysterious substance comprises 26.2% of the cosmos, making it nearly five and a half times more prevalent than normal, everyday matter. Now, four European researchers have hinted that they may have a discovery on their hands: a signal in X-ray light that has no known cause and may be evidence of a long sought-after process – namely, the decay of dark matter particles.

When astronomers want to study an object in the night sky, such as a star or galaxy, they begin by analyzing its light across all wavelengths. This allows them to pick out narrow dark lines in the object’s spectrum, called absorption lines, which occur because a star’s or galaxy’s component elements soak up light at certain wavelengths, preventing most photons with those energies from reaching Earth. Similarly, interacting particles can leave emission lines in a star’s or galaxy’s spectrum: bright lines created when excess photons are emitted via subatomic processes such as excitation and decay. By looking closely at these emission lines, scientists can usually paint a robust picture of the physics going on elsewhere in the cosmos.

But sometimes, scientists find an emission line that is more puzzling. Earlier this year, researchers at the Laboratory of Particle Physics and Cosmology (LPPC) in Switzerland and Leiden University in the Netherlands identified an excess bump of energy in X-ray light coming from both the Andromeda galaxy and the Perseus galaxy cluster: an emission line with an energy around 3.5 keV. No known process can account for this line; however, it is consistent with models of the theoretical sterile neutrino – a particle that many scientists believe is a prime candidate for dark matter.
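
To place that energy on the electromagnetic spectrum, a quick back-of-the-envelope conversion using E = hc/λ (standard physical constants only) shows that a 3.5 keV photon has a wavelength of roughly a third of a nanometre, squarely in the X-ray band:

```python
# Back-of-envelope check: what wavelength corresponds to a 3.5 keV photon?
h = 6.626e-34       # Planck constant, J*s
c = 2.998e8         # speed of light, m/s
eV = 1.602e-19      # joules per electron-volt

E = 3.5e3 * eV                       # 3.5 keV in joules
wavelength = h * c / E               # metres
print(f"{wavelength * 1e9:.3f} nm")  # ~0.354 nm
```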

The researchers believe that this strange emission line could result from the decay of these dark matter particles, a process that is thought to release X-ray photons. In fact, the signal appeared to be strongest in the densest regions of Andromeda and Perseus and grew progressively weaker away from their centers, a distribution that is also characteristic of dark matter. Additionally, the signal was absent from the team’s observations of deep, empty space, implying that it is real and not just an instrumental artifact.

In a preprint of their paper, the researchers are careful to stress that the signal itself is weak by scientific standards. That is, they can only be 99.994% sure that it is a true result and not just a rogue statistical fluctuation, a level of confidence known as 4σ. (The gold standard for a discovery in science is 5σ: a result that can be declared “true” with 99.99994% confidence.) Other scientists are not so sure that dark matter is such a good explanation after all. According to predictions based on measurements of the Lyman-alpha forest – that is, the spectral pattern of hydrogen absorption and photon emission within very distant, very old gas clouds – any particle purporting to be dark matter should have an energy above 10 keV, nearly three times the energy of this most recent signal.
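
The quoted percentages follow directly from the normal distribution and are easy to reproduce. A short sketch, assuming the SciPy library is available and using the usual two-sided convention:

```python
# Reproduce the quoted confidence levels from the significance in sigma.
from scipy.stats import norm

for sigma in (4, 5):
    confidence = norm.cdf(sigma) - norm.cdf(-sigma)   # two-sided probability
    print(f"{sigma} sigma -> {confidence * 100:.5f}% confidence")

# 4 sigma -> 99.99367% (often rounded to 99.994%)
# 5 sigma -> 99.99994% (the usual discovery threshold)
```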

As always, the study of cosmology is fraught with mysteries. Whether this particular emission line turns out to be evidence of a sterile neutrino (and thus of dark matter) or not, it does appear to be a signal of some physical process that scientists do not yet understand. If future observations can increase the certainty of this discovery to the 5σ level, astrophysicists will have yet another phenomenon to account for – an exciting prospect, regardless of the final result.

The team’s research has been accepted to Physical Review Letters and will be published in an upcoming issue.

Gamma Ray Bursts Limit The Habitability of Certain Galaxies, Says Study

An artistic image of the explosion of a star leading to a gamma-ray burst. (Source: FUW/Tentaris/Maciej Frołow)

Gamma ray bursts (GRBs) are some of the brightest, most dramatic events in the Universe. These cosmic tempests are characterized by a spectacular explosion of photons with energies 1,000,000 times greater than the most energetic light our eyes can detect. Due to their explosive power, long-lasting GRBs are predicted to have catastrophic consequences for life on any nearby planet. But could this type of event occur in our own stellar neighborhood? In a new paper published in Physical Review Letters, two astrophysicists examine the probability of a deadly GRB occurring in galaxies like the Milky Way, potentially shedding light on the risk for organisms on Earth, both now and in our distant past and future.

There are two main kinds of GRBs: short and long. Short GRBs last less than two seconds and are thought to result from the merger of two compact objects, such as neutron stars or black holes. Conversely, long GRBs last more than two seconds and seem to occur in conjunction with certain kinds of Type Ic supernovae – specifically, those produced when a massive star that has thrown off its hydrogen and helium collapses.

Perhaps unsurprisingly, long GRBs are much more threatening to planetary systems than short GRBs. Since dangerous long GRBs appear to be relatively rare in large, metal-rich galaxies like our own, it has long been thought that planets in the Milky Way would be immune to their fallout. But take into account the inconceivably old age of the Universe, and “relatively rare” no longer seems to cut it.

In fact, according to the authors of the new paper, there is a 90% chance that a GRB powerful enough to destroy Earth’s ozone layer occurred in our stellar neighborhood some time in the last 5 billion years, and a 50% chance that such an event occurred within the last half billion years. These odds indicate a possible trigger for the second worst mass extinction in Earth’s history: the Ordovician Extinction. This great decimation occurred 440-450 million years ago and led to the death of more than 80% of all species.

Today, however, Earth appears to be relatively safe. Galaxies that produce GRBs at a far higher rate than our own, such as the Large Magellanic Cloud, are currently too far from Earth to be any cause for alarm. Additionally, our Solar System’s home address in the sleepy outskirts of the Milky Way places us far away from our own galaxy’s more active, star-forming regions, areas that would be more likely to produce GRBs. Interestingly, the fact that such quiet outer regions exist within spiral galaxies like our own is entirely due to the precise value of the cosmological constant – the term that sets our Universe’s accelerating expansion – that we observe. If the Universe had expanded any faster, such galaxies would not exist; any slower, and spirals would be far more compact and thus far more energetically active.

In a future paper, the authors promise to look into the role long GRBs may play in the Fermi paradox: the open question of why advanced lifeforms appear to be so rare in our Universe. A preprint of their current work can be accessed on the arXiv.

A Universe of 10 Dimensions

Superstrings may exist in 11 dimensions at once. Via National Institute of Technology Tiruchirappalli.

When someone mentions “different dimensions,” we tend to think of things like parallel universes – alternate realities that exist parallel to our own but where things work differently. However, the reality of dimensions and how they play a role in the ordering of our Universe is really quite different from this popular characterization.

To break it down, dimensions are simply the different facets of what we perceive to be reality. We are immediately aware of the three dimensions that surround us – those that define the length, width, and depth of all objects in our Universe (the x, y, and z axes, respectively).

Beyond these three visible dimensions, scientists believe that there may be many more. In fact, the theoretical framework of Superstring Theory posits that the Universe exists in ten different dimensions. These different aspects govern the Universe, the fundamental forces of nature, and all the elementary particles contained within.

The first dimension, as already noted, is the one that gives an object its length (aka. the x-axis). A good description of a one-dimensional object is a straight line, which exists only in terms of length and has no other discernible qualities. Add a second dimension, the y-axis (or width), and you get a 2-dimensional shape (like a square).

The third dimension involves depth (the z-axis) and gives all objects a sense of area and a cross-section. The perfect example of this is a cube, which exists in three dimensions and has a length, width, depth, and hence volume. Beyond these three dimensions reside the seven that are not immediately apparent to us but can still be perceived as having a direct effect on the Universe and reality as we know it.

The timeline of the Universe, beginning with the Big Bang. According to String Theory, this is just one of many possible worlds. Credit: NASA

Scientists believe that the fourth dimension is time, which governs the properties of all known matter at any given point. Along with the three other dimensions, knowing an object’s position in time is essential to plotting its position in the Universe. The other dimensions are where the deeper possibilities come into play, and explaining their interaction with the others is where things get particularly tricky for physicists.

According to Superstring Theory, the fifth and sixth dimensions are where the notion of possible worlds arises. If we could see through to the fifth dimension, we would see a world slightly different from our own, giving us a means of measuring the similarities and differences between our world and other possible ones.

In the sixth, we would see a plane of possible worlds, where we could compare and position all the possible universes that start with the same initial conditions as this one (i.e., the Big Bang). In theory, if you could master the fifth and sixth dimensions, you could travel back in time or go to different futures.

In the seventh dimension, you have access to the possible worlds that start with different initial conditions. Whereas in the fifth and sixth dimensions the initial conditions were the same and only subsequent events differed, here everything differs from the very beginning of time. The eighth dimension again gives us a plane of such possible universe histories, each of which begins with different initial conditions and branches out infinitely (which is why they are sometimes called infinities).

In the ninth dimension, we can compare all the possible universe histories, starting with all the different possible laws of physics and initial conditions. In the tenth and final dimension, we arrive at the point where everything possible and imaginable is covered. Beyond this, nothing can be imagined by us lowly mortals, which makes it the natural limitation of what we can conceive in terms of dimensions.

String space – superstring theory lives in 10 dimensions, which means that six of the dimensions have to be "compactified" in order to explain why we can only perceive four. The best way to do this is to use a complicated 6D geometry called a Calabi-Yau manifold, in which all the intrinsic properties of elementary particles are hidden. Credit: A. Hanson

The existence of these additional six dimensions, which we cannot perceive, is necessary for String Theory to be mathematically consistent. The fact that we perceive only four dimensions of spacetime can be explained by one of two mechanisms: either the extra dimensions are compactified on a very small scale, or else our world lives on a 3-dimensional submanifold corresponding to a brane, to which all known particles other than gravity would be confined (aka. brane theory).

If the extra dimensions are compactified, then the six of them must take the form of a Calabi–Yau manifold (shown above). While imperceptible to our senses, they would have governed the formation of the Universe from the very beginning. This is why scientists believe that, by peering back through time and using telescopes to observe light from the early Universe (i.e., billions of years ago), they might be able to see how the existence of these additional dimensions could have influenced the evolution of the cosmos.

Much like other candidates for a grand unifying theory – aka the Theory of Everything (TOE) – the belief that the Universe is made up of ten dimensions (or more, depending on which model of string theory you use) is an attempt to reconcile the standard model of particle physics with the existence of gravity. In short, it is an attempt to explain how all known forces within our Universe interact and how other possible universes themselves might work.

For additional information, here’s an article on Universe Today about parallel Universes, and another about a parallel Universe that scientists thought they’d found, but which turned out not to exist.

There are also some other good resources online. There is a video that explains the ten dimensions in detail, and the PBS website for the TV show The Elegant Universe has a helpful page on the ten dimensions.

You can also listen to Astronomy Cast. You might find Episode 137: Large Scale Structure of the Universe very interesting.

Source: PBS

New Cosmological Theory Goes Inflation-Free

This image, the most detailed map yet of the oldest light in the Universe, shows the cosmic microwave background – the glow left over from the beginning of the cosmos – with tiny changes in temperature represented by color. Credit: ESA and the Planck Collaboration.

The Cosmic Microwave Background (CMB) radiation is one of the greatest discoveries of modern cosmology. Astrophysicist George Smoot once likened its existence to “seeing the face of God.” In recent years, however, scientists have begun to question some of the attributes of the CMB. Peculiar patterns have emerged in the images taken by satellites such as WMAP and Planck – and they aren’t going away. Now, in a paper published in the December 1 issue of The Astronomical Journal, one scientist argues that the existence of these patterns may not only imply new physics, but also a revolution in our understanding of the entire Universe.

Let’s recap. Thanks to a blistering ambient temperature, the early Universe was blanketed in a haze for its first 380,000 years of life. During this time, photons relentlessly scattered off the protons and electrons created in the Big Bang, preventing them from combining into stable atoms. All of this scattering also caused the photons’ energy to manifest as a diffuse glow. The CMB that cosmologists see today is the relic of this glow, now stretched to longer, microwave wavelengths by the expansion of the Universe.

As any fan of the WMAP and Planck images will tell you, the hallmarks of the CMB are the so-called anisotropies, small regions of overdensity and underdensity that give the picture its characteristic mottled appearance. These hot and cold spots are thought to be the result of tiny quantum fluctuations born at the beginning of the Universe and magnified exponentially during inflation.

Temperature and polarization around hot and cold spots (Credit: NASA / WMAP Science Team)

Given the type of inflation that cosmologists believe occurred in the very early Universe, the distribution of these anisotropies in the CMB should be consistent with a random Gaussian field. But both WMAP and Planck have confirmed the existence of certain oddities in the fog: a large “cold spot,” strange alignments among the largest-scale multipoles (the quadrupole and octopole), and, of course, Stephen Hawking’s initials.

In his new paper, Fulvio Melia of the University of Arizona argues that these types of patterns (Dr. Hawking’s signature notwithstanding) reveal a problem with the standard inflationary picture, or so-called ΛCDM cosmology. According to his calculations, inflation should have left a much more random assortment of anisotropies than the one that scientists see in the WMAP and Planck data. In fact, the probability of these particular anomalies lining up the way they do in the CMB images is only about 0.005% for a ΛCDM Universe.

Melia posits that the anomalous patterns in the CMB can be better explained by a new type of cosmology in which no inflation occurred. He calls this model the R_h = ct Universe, where c is the speed of light, t is the age of the cosmos, and R_h is the Hubble radius – the distance at which objects recede from us at the speed of light. (This equation makes intuitive sense: light traveling at speed c for 13.7 billion years (t) should cover an equivalent number of light-years. In fact, current estimates put the Hubble radius at about 13.4 billion light-years, remarkably close to the distance light would cover over the Universe’s measured age.)
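
The parenthetical arithmetic is easy to check. Using a round value for the Hubble constant – an assumption on our part, since the paper’s exact inputs are not quoted here – the Hubble radius c/H0 does indeed come out near 13.4 billion light-years, essentially the distance light covers in one age of the Universe:

```python
# Quick check: compare c * (age of the Universe) with the Hubble radius c / H0.
LIGHT_YEAR_KM = 9.461e12      # kilometres in one light-year
MPC_KM = 3.086e19             # kilometres in one megaparsec
C_KM_S = 299792.458           # speed of light, km/s

age_yr = 13.7e9               # age of the Universe used in the article
H0 = 73.0                     # km/s/Mpc (assumed round value; estimates span ~67-74)

c_times_age_ly = age_yr       # light travels one light-year per year
hubble_radius_ly = (C_KM_S / H0) * MPC_KM / LIGHT_YEAR_KM

print(f"c * t         ~ {c_times_age_ly / 1e9:.1f} billion light-years")
print(f"Hubble radius ~ {hubble_radius_ly / 1e9:.1f} billion light-years")
```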

R_h = ct holds true for both the standard cosmological scenario and Melia’s model, with one crucial difference: in ΛCDM cosmology, this equation only works for the current age of the Universe. That is, at any time in the distant past or future, the Universe would have obeyed a different law. Scientists explain this odd coincidence by positing that the Universe first underwent inflation, then decelerated, and finally accelerated again to its present rate.

Melia hopes that his model, a Universe that requires no inflation, will provide an alternative explanation that does not rely on such fine-tuning. He calculates that, in an R_h = ct Universe, the probability of seeing the types of strange patterns that have been observed in the CMB by WMAP and Planck is 7–10%, compared with a figure 1,000 times lower for the standard model.

So, could this new way of looking at the cosmos be a death knell for ΛCDM? Probably not. Melia himself cites a few less earth-shattering explanations for the anomalous signals in the CMB, including foreground noise, statistical biases, and instrumental errors. Incidentally, the Planck satellite is scheduled to release its latest image of the CMB this week at a conference in Italy. If these new results show the same anomalous patterns that previous observations did, cosmologists will have to look into each possible explanation, including Melia’s theory, more intensively.

New Analysis Sets a Space & Time Zone for Complex Life

A new research paper reveals more details of the effect gamma ray bursts (GRB) have had on the development of complex life throughout the cosmos. Illustration depicts a beam from a GRB as might have been directed toward early life on Earth during the Cambrian or Ordovician periods, ~500 million years ago. (Illustration Credit: T. Reyes)

If too close to an environment harboring complex life, a gamma ray burst could spell doom for that life. But could GRBs be the reason we haven’t yet found evidence of other civilizations in the cosmos? To help answer the big question of “where is everybody?” physicists from Spain and Israel have narrowed the time period and the regions of space in which complex life could persist with a low risk of extinction by a GRB.

GRBs are some of the most cataclysmic events in the Universe. Astrophysicists are astounded by their intensity: for a brief moment, a single burst can outshine the whole Universe. So far, they have remained incredibly far-off events. But in a new paper, physicists have weighed how GRBs could limit where and when life could persist and evolve, potentially into intelligent life.

In their paper, “On the role of GRBs on life extinction in the Universe”, published in Physical Review Letters, Dr. Piran of the Hebrew University of Jerusalem and Dr. Jimenez of the University of Barcelona first consider what is known about gamma ray bursts. The frequency of GRBs is closely tied to metallicity – the abundance of elements beyond hydrogen and helium in the makeup of stars or whole galaxies. More metals mean fewer GRBs, so galaxies with low metal content are prone to a higher frequency of bursts. The researchers, referencing their previous work, note that observational data show GRBs are not generally related to a galaxy’s star formation rate; the formation of stars, even massive ones, is not the most significant factor governing the frequency of GRBs.

As fate would have it, we live in a high-metal-content galaxy – the Milky Way. Piran and Jimenez show that, based on the latest available data, the frequency of GRBs in the Milky Way is correspondingly low. That is the good news. More significant still is the placement of a solar system within the Milky Way or any other galaxy.

The brightest gamma-ray burst ever seen in X-rays temporarily blinded Swift’s X-ray Telescope on 21 June 2010. This image merges the X-rays (red to yellow) with the same view from Swift’s Ultraviolet/Optical Telescope, which showed nothing extraordinary. Credit: NASA/Swift/Stefan Immler

The paper states that there is a 50% chance that a lethal GRB occurred near Earth within the last 500 million years. For a stellar system within 13,000 light-years (4 kiloparsecs) of the galactic center, the odds rise to 95%. Effectively, this makes the densest regions of all galaxies too prone to GRBs to permit complex life to persist.

The Earth lies 8.3 kiloparsecs (27,000 light-years) from the galactic center, and the astrophysicists’ work also concludes that the chance of a lethal GRB in a 500-million-year span does not drop below 50% until beyond 10 kiloparsecs (32,000 light-years). So Earth’s odds have not been the most favorable, but they were evidently adequate. Star systems farther from the center are safer places for life to progress and evolve; only the outlying, low-star-density regions of large galaxies keep life out of harm’s way from gamma ray bursts.
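
The distances above mix kiloparsecs and light-years; a few lines of Python make the conversions explicit (1 parsec ≈ 3.26 light-years):

```python
# Convert the distances quoted above from kiloparsecs to light-years.
LY_PER_PARSEC = 3.2616

for kpc in (4, 8.3, 10):
    print(f"{kpc:>4} kpc ~ {kpc * 1000 * LY_PER_PARSEC:,.0f} light-years")

#    4 kpc ~ 13,046 light-years  (inside this radius, the lethal-GRB odds reach 95%)
#  8.3 kpc ~ 27,071 light-years  (the Sun's distance from the galactic center)
#   10 kpc ~ 32,616 light-years  (beyond this, the 500-Myr risk drops below 50%)
```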

The paper continues by assessing the effect of GRBs throughout the Universe, concluding that only about 10% of galaxies have environments conducive to life once GRB events are taken into account. Based on previous work and new data, a galaxy’s stars need to reach a metallicity of about 30% of the Sun’s, and the galaxy needs to be at least 4 kiloparsecs (13,000 light-years) across, to lower the risk of lethal GRBs. Simple life could survive repeated GRBs, but evolution toward higher life forms would be repeatedly set back by mass extinctions.

Piran and Jimenez’s work also carries a cosmological implication. Further back in time, the metallicity of stars was lower; only after generations of star formation – billions of years – have heavier elements built up within galaxies. They conclude that complex life such as exists on Earth – from jellyfish to humans – could not have developed at redshifts greater than z ≈ 0.5, that is, earlier than roughly 5 billion years ago. Their analysis also shows a 95% chance that Earth experienced a lethal GRB within the last 5 billion years.
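
The redshift-to-time translation can be checked with a standard cosmology package. Here is a minimal sketch, assuming the astropy library is installed and adopting its built-in Planck 2013 parameters (one reasonable choice, not necessarily the authors’):

```python
# Check the redshift-to-lookback-time figure with a standard cosmology package.
from astropy.cosmology import Planck13

z = 0.5
print(Planck13.lookback_time(z))   # ~5 Gyr, matching the "roughly 5 billion years ago" figure
```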

The question of what effect a nearby GRB could have on life has been raised for decades. In 1974, Dr. Malvin Ruderman of Columbia University considered the consequences of a nearby supernova on the Earth’s ozone layer and on terrestrial life. His work and subsequent studies have determined that cosmic rays would deplete the ozone layer, double the solar ultraviolet radiation reaching the surface, cool the Earth’s climate, and increase the NOx concentrations and rainout that affect biological systems. Not a pretty picture. The loss of the ozone layer would trigger a domino effect of atmospheric changes and radiation exposure leading to the collapse of ecosystems. A GRB has been proposed as a likely cause of the mass extinction at the end of the Ordovician period, 450 million years ago, though there remains considerable debate about the causes of this and several other mass extinction events in Earth’s history.

The paper focuses on what are deemed long GRBs (lGRBs), which last several seconds, in contrast to short GRBs (sGRBs), which last only a second or less. Long GRBs are believed to arise from the collapse of massive stars, as seen in some supernovae, while short GRBs come from the merger of neutron stars or black holes. Uncertainty remains about their exact causes, but long GRBs release far greater amounts of energy and are the more dangerous to ecosystems harboring complex life.

The paper narrows the time and space available for complex life to develop within our Universe. Of the Universe’s roughly 14-billion-year history, only the last 5 billion years have been conducive to the emergence of complex life, and only about 10% of galaxies during that period provided suitable environments. Even within those larger galaxies, only the outlying areas offered the safe distances needed to evade lethal exposure to a gamma ray burst.

This work reveals how well our Solar System fits within the conditions needed for complex life to develop. We stand at a fairly safe distance from the Milky Way’s galactic center, and the age of our Solar System, approximately 4.6 billion years, lies within the 5-billion-year window. For many other stellar systems, however – of the hundreds of billions in the Milky Way and the trillions across the Universe – life probably remains simple because of GRBs. This work indicates that complex life, including intelligent life, is likely less common than we might otherwise expect, even taking only the effect of gamma ray bursts into consideration.

References:

Tsvi Piran and Raul Jimenez, “On the role of GRBs on life extinction in the Universe,” Physical Review Letters, 2014; a preprint is available on the arXiv.

The Search for Dark Energy Just Got Easier

The Victor M. Blanco telescope at Cerro Tololo Interamerican Observatory (CTIO) in the Chilean Andes. Credit: Berkeley Lab

Since the late 1990s, scientists and physicists have been burdened with explaining how and why the Universe appears to be expanding at an accelerating rate. The most widely accepted explanation is that the cosmos is permeated by a mysterious force known as “dark energy”. In addition to being responsible for cosmic acceleration, this energy is also thought to comprise 68.3% of the universe’s total mass-energy content.

Much like dark matter, the existence of this invisible force is inferred from observable phenomena and from how well it fits our current models of cosmology, rather than from direct evidence. Scientists must instead rely on indirect observations, watching how fast cosmic objects (specifically Type Ia supernovae) recede from us as the universe expands.

This process would be extremely tedious for scientists – like those who work for the Dark Energy Survey (DES) – were it not for the new algorithms developed collaboratively by researchers at Lawrence Berkeley National Laboratory and UC Berkeley.

“Our algorithm can classify a detection of a supernova candidate in about 0.01 seconds, whereas an experienced human scanner can take several seconds,” said Danny Goldstein, a UC Berkeley graduate student who developed the code to automate the process of supernova discovery on DES images.

Currently in its second season, the DES takes nightly pictures of the Southern Sky with DECam – a 570-megapixel camera that is mounted on the Victor M. Blanco telescope at Cerro Tololo Interamerican Observatory (CTIO) in the Chilean Andes. Every night, the camera generates between 100 Gigabytes (GB) and 1 Terabyte (TB) of imaging data, which is sent to the National Center for Supercomputing Applications (NCSA) and DOE’s Fermilab in Illinois for initial processing and archiving.

By studying Type Ia supernovae, astronomers can measure dark energy and the expansion of the universe. Credit: NASA/CXC/M. Weiss

Object recognition programs developed at the National Energy Research Scientific Computing Center (NERSC) and implemented at NCSA then comb through the images in search of possible detections of Type Ia supernovae. These powerful explosions occur in binary star systems where one star is a white dwarf, which accretes material from a companion star until it reaches a critical mass and explodes in a Type Ia supernova.

“These explosions are remarkable because they can be used as cosmic distance indicators to within 3-10 percent accuracy,” says Goldstein.

Distance is important because the farther away an object is in space, the further back in time we are seeing it. By tracking Type Ia supernovae at different distances, researchers can measure cosmic expansion throughout the universe’s history. This allows them to put constraints on how fast the universe is expanding and perhaps even provide other clues about the nature of dark energy.

“Scientifically, it’s a really exciting time because several groups around the world are trying to precisely measure Type Ia supernovae in order to constrain and understand the dark energy that is driving the accelerated expansion of the universe,” says Goldstein, who is also a student researcher in Berkeley Lab’s Computational Cosmology Center (C3).

UC Berkeley/Berkeley Lab graduate student Danny Goldstein developed a new code that uses the machine learning technique Random Forest to vet detections of supernova candidates automatically and in real time, optimized for the Dark Energy Survey. Credit: Danny Goldstein, UC Berkeley/Berkeley Lab

The DES begins its search for Type Ia explosions by uncovering changes in the night sky, which is where the image subtraction pipeline developed and implemented by researchers in the DES supernova working group comes in. The pipeline subtracts images that contain known cosmic objects from new images that are exposed nightly at CTIO.

Each night, the pipeline produces between 10,000 and a few hundred thousand detections of supernova candidates that need to be validated.
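
A heavily simplified sketch of the difference-imaging idea is shown below. It is not the DES pipeline – real pipelines align the images, match their point-spread functions, and model the noise far more carefully – but it shows how subtracting a template exposure turns a new transient into an obvious residual:

```python
# Toy difference imaging (NOT the DES pipeline): subtract a template image
# from a new exposure and flag bright residuals as candidate transients.
import numpy as np

rng = np.random.default_rng(0)
template = rng.normal(100.0, 5.0, size=(64, 64))    # "known sky" image
new_image = template + rng.normal(0.0, 5.0, size=(64, 64))
new_image[40, 21] += 60.0                           # inject a fake transient

difference = new_image - template
noise = difference.std()
candidates = np.argwhere(difference > 5 * noise)    # simple 5-sigma threshold

for y, x in candidates:
    print(f"candidate at pixel ({y}, {x}), flux excess {difference[y, x]:.1f}")
```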

“Historically, trained astronomers would sit at the computer for hours, look at these dots, and offer opinions about whether they had the characteristics of a supernova, or whether they were caused by spurious effects that masquerade as supernovae in the data. This process seems straightforward until you realize that the number of candidates that need to be classified each night is prohibitively large and only one in a few hundred is a real supernova of any type,” says Goldstein. “This process is extremely tedious and time-intensive. It also puts a lot of pressure on the supernova working group to process and scan data fast, which is hard work.”

To simplify the task of vetting candidates, Goldstein developed a code that uses the machine learning technique “Random Forest” to vet detections of supernova candidates automatically and in real time, optimized for the DES. The technique employs an ensemble of decision trees to automatically ask the types of questions that astronomers would typically consider when classifying supernova candidates.

Evolution of a Type Ia supernova. Credit: NASA/ESA/A. Feild

At the end of the process, each detection of a candidate is given a score based on the fraction of decision trees that considered it to have the characteristics of a detection of a supernova. The closer the classification score is to one, the stronger the candidate. Goldstein notes that in preliminary tests, the classification pipeline achieved 96 percent overall accuracy.
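
For the curious, here is a minimal sketch of that approach using the scikit-learn library. It is not Goldstein’s actual code, and the feature columns are hypothetical stand-ins for the image measurements the working group really uses, but it shows how an ensemble of decision trees turns each candidate into a score between zero and one:

```python
# Minimal Random Forest sketch (not the actual DES code). The features are
# hypothetical stand-ins for the image-based measurements scanners inspect.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 3))   # e.g. flux ratio, elongation, distance to bad pixel
# Toy labels for the demo: 1 = "real supernova detection", 0 = "artifact"
y = ((X[:, 0] > 0.5) & (X[:, 1] < 0.5)).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# With fully grown trees (the default), predict_proba averages per-tree 0/1
# votes, i.e. the fraction of trees that call the detection real.
candidate = np.array([[1.2, -0.3, 0.1]])
score = clf.predict_proba(candidate)[0, 1]
print(f"candidate score = {score:.2f} (closer to 1 = stronger candidate)")
```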

“When you do subtraction alone you get far too many ‘false-positives’ — instrumental or software artifacts that show up as potential supernova candidates — for humans to sift through,” says Rollin Thomas, of Berkeley Lab’s C3, who was Goldstein’s collaborator.

He notes that with the classifier, researchers can quickly and accurately strain out the artifacts from supernova candidates. “This means that instead of having 20 scientists from the supernova working group continually sift through thousands of candidates every night, you can just appoint one person to look at maybe a few hundred strong candidates,” says Thomas. “This significantly speeds up our workflow and allows us to identify supernovae in real-time, which is crucial for conducting follow up observations.”

“Using about 60 cores on a supercomputer, we can classify 200,000 detections in about 20 minutes, including time for database interaction and feature extraction,” says Goldstein.

Goldstein and Thomas note that the next step in this work is to add a second level of machine learning to the pipeline to improve the classification accuracy. This extra layer would take into account how the object was classified in previous observations as it determines the probability that the candidate is “real.” The researchers and their colleagues are currently working on different approaches to achieve this capability.

Further Reading: Berkeley Lab

Elusive Dark Matter Could Be Detected with GPS Satellites

GPS Satellite
According to a new proposal, GPS satellites may be the key to finding dark matter. Credit: NASA

You know the old saying: “If you want to hide something, put it in plain sight”? Well, according to a new proposal by two professors of physics, this logic may be the reason why scientists have struggled for so long to find the mysterious substance that is believed to make up about 27% of the mass-energy content of the universe.

In short, these two physicists believe that dark matter can be found the same way you can find the fastest route to work: by consulting the Global Positioning System.

Andrei Derevianko, of the University of Nevada, Reno, and Maxim Pospelov, of the University of Victoria and the Perimeter Institute for Theoretical Physics in Canada, proposed this method earlier this year at a series of renowned scientific conferences, where it met with general approval.

Their idea calls for using GPS satellites and other atomic clock networks and comparing their times to look for discrepancies. Derevianko and Pospelov suggest that dark matter could have a disruptive effect on atomic clocks, and that by looking at existing networks of atomic clocks it might be possible to spot pockets of dark matter by their distinctive signature.

The two are starting to test this theory by analyzing clock data from the roughly 30 GPS satellites, which use atomic clocks for everyday navigation. Correlated networks of atomic clocks, such as the GPS constellation and some ground networks already in existence, can be used as a powerful tool to search for topological-defect dark matter: as a defect sweeps past, initially synchronized clocks would become desynchronized.

The Hubble Space Telescope image of gravitational lensing in the galaxy cluster Abell 2218, indicating the presence of a large amount of dark matter. Credit: NASA/Andrew Fruchter/STScI

“Despite solid observational evidence for the existence of dark matter, its nature remains a mystery,” Derevianko, a professor in the College of Science at the University, said. “Some research programs in particle physics assume that dark matter is composed of heavy-particle-like matter. This assumption may not hold true, and significant interest exists for alternatives.”

Their proposal builds on the idea that dark matter could come from cracks in the universe’s quantum fields – defects that would disturb fundamental properties such as the mass of the electron and affect the way we measure time. This represents a break from the more conventional view that dark matter consists of subatomic particles such as WIMPs and axions.

“Our research pursues the idea that dark matter may be organized as a large gas-like collection of topological defects, or energy cracks,” Derevianko said. “We propose to detect the defects, the dark matter, as they sweep through us with a network of sensitive atomic clocks. The idea is, where the clocks go out of synchronization, we would know that dark matter, the topological defect, has passed by. In fact, we envision using the GPS constellation as the largest human-built dark-matter detector.”

Derevianko is collaborating on analyzing GPS data with Geoff Blewitt, director of the Nevada Geodetic Laboratory, also in the College of Science at the University of Nevada, Reno. The Geodetic Lab developed and maintains the largest GPS data processing center in the world, able to process information from about 12,000 stations around the globe continuously, 24/7.

Artist’s rendering of a vacuum tube, one of the main components of an atomic clock. Credit: NASA

Blewitt, also a physicist, explained how an array of atomic clocks could possibly detect dark matter.

“We know the dark matter must be there, for example, because it is seen to bend light around galaxies, but we have no evidence as to what it might be made of,” he said. “If the dark matter were not there, the normal matter that we know about would not be sufficient to bend the light as much as it does. That’s just one of the ways scientists know there is a massive amount of dark matter somewhere out there in the galaxy. One possibility is that the dark matter in this gas might not be made out of particles like normal matter, but of macroscopic imperfections in the fabric of space-time.

“The Earth sweeps through this gas as it orbits the galaxy. So to us, the gas would appear to be like a galactic wind of dark matter blowing through the Earth system and its satellites. As the dark matter blows by, it would occasionally cause clocks of the GPS system to go out of sync with a tell-tale pattern over a period of about 3 minutes. If the dark matter causes the clocks to go out of sync by more than a billionth of a second we should easily be able to detect such events.”
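
A toy simulation helps make the idea concrete. The sketch below is not the researchers’ analysis – the noise level and the size of the jump are assumptions, with only the roughly one-nanosecond, three-minute signature taken from the quote above – but it shows how a sweep-like desynchronization would stand out in a stream of clock comparisons:

```python
# Toy illustration: simulate the time offset between two satellite clocks and
# flag any excursion larger than the ~1 ns, few-minute signature quoted above.
import numpy as np

rng = np.random.default_rng(42)
dt = 30.0                                    # seconds between clock comparisons
n = 2000                                     # ~17 hours of data
noise_ns = 0.05                              # assumed clock-comparison noise (ns)

offset = rng.normal(0.0, noise_ns, size=n)   # clock A minus clock B, in ns
offset[1000:1006] += 1.5                     # assumed defect sweep lasting ~3 minutes

threshold = 1.0                              # look for > 1 ns excursions
hits = np.flatnonzero(np.abs(offset) > threshold)
if hits.size:
    print(f"possible sweep between t = {hits[0] * dt:.0f} s and t = {hits[-1] * dt:.0f} s")
```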

“This type of work can be transformative in science and could completely change how we think about our universe,” Jeff Thompson, a physicist and dean of the University’s College of Science, said. “Andrei is a world class physicist and he has already made seminal contributions to physics. It’s a wonder to watch the amazing work that comes from him and his group.”

Derevianko teaches quantum physics and related subjects at the University of Nevada, Reno. He has authored more than 100 refereed publications in theoretical physics. He is a fellow of the American Physical Society, a Simons fellow in theoretical physics and a Fulbright scholar. Among a variety of research topics, he has contributed to the development of several novel classes of atomic clocks and precision tests of fundamental symmetries with atoms and molecules.

Their research appeared earlier this week in the online version of the scientific journal Nature Physics, ahead of the print version.

Further Reading: University of Nevada

Higgs Boson Threatened The Early Universe, But Gravity Saved The Day

Image Credit: Science/AAAS

All the physical properties of our Universe – indeed, the fact that we even exist within a Universe that we can contemplate and explore – can be traced to events that occurred very early in its history. Cosmologists believe that our Universe looks the way it does thanks to a rapid period of inflation immediately after the Big Bang that smoothed out fluctuations in the vacuum energy of space and flattened the fabric of the cosmos itself.

According to current theories, however, interactions between the famed Higgs boson and the inflationary field should have caused the nascent Universe to collapse. Clearly, this didn’t happen. So what is going on? Scientists have worked out a new theory: It was gravity that (literally) held it all together.

The interaction between the curvature of spacetime (more commonly known as gravity) and the Higgs field has never been well understood. Resolving the apparent problem of our Universe’s stubborn existence, however, provides a good excuse to do some investigating. In a paper published this week in Physical Review Letters, researchers from the University of Copenhagen, the University of Helsinki, and Imperial College London show that even a small interaction between gravity and the Higgs would have been sufficient to stave off a collapse of the early cosmos.

The researchers modified the Higgs equations to include the effect of gravity generated by UV-scale energies. These corrections were found to stabilize the inflationary vacuum at all but a narrow range of energies, allowing expansion to continue and the Universe as we know it to exist… without the need for new physics beyond the Standard Model.

This new theory is based on the controversial evidence of inflation announced by BICEP2 earlier this summer, so its true applicability will depend on whether or not those results turn out to be real. Until then, the researchers are hoping to support their work with additional observational studies that seek out gravitational waves and more deeply examine the cosmic microwave background.

At this juncture, the Higgs-gravity interaction is not a testable hypothesis because the graviton (the hypothetical particle thought to mediate gravity’s interactions) has yet to be detected. Based purely on the mathematics, however, the new theory presents an elegant and efficient solution to the potential conundrum of why we exist at all.

Macro View Makes Dark Matter Look Even Stranger

New research suggests that Dark Matter may exist in clumps distributed throughout our universe. Credit: Max-Planck Institute for Astrophysics

We know dark matter exists. We know this because, together with dark energy, it accounts for 95.4% of the Universe’s mass-energy content, and because without it scientists would be hard pressed to explain the gravitational effects they routinely see at work in the cosmos.

For decades, scientists have sought direct evidence of its existence – most recently by smashing protons together in the Large Hadron Collider. Unfortunately, these efforts have not yet provided any concrete evidence.

Hence, it might be time to rethink dark matter. And physicists David M. Jacobs, Glenn D. Starkman, and Bryan Lynn of Case Western Reserve University have a theory that does just that, even if it does sound a bit strange.

In their new study, they argue that instead of dark matter consisting of elementary particles that are invisible and neither emit nor absorb electromagnetic radiation, it takes the form of chunks of matter that vary widely in mass and size.

As it stands, there are many leading candidates for what dark matter could be, which range from Weakly-Interacting Massive Particles (aka WIMPs) to axions. These candidates are attractive, particularly WIMPs, because the existence of such particles might help confirm supersymmetry theory – which in turn could help lead to a working Theory of Everything (ToE).

According to supersymmetry, dark-matter particles known as neutralinos (aka WIMPs) annihilate each other, creating a cascade of particles and radiation. Credit: Sky & Telescope / Gregg Dinderman.

But so far, no evidence has been obtained that definitively proves the existence of either. Beyond being necessary to make General Relativity agree with what we observe on galactic and cosmic scales, this invisible mass seems content to remain invisible to detection.

According to Jacobs, Starkman, and Lynn, this could indicate that dark matter exists within the realm of normal matter. In particular, they consider the possibility that dark matter consists of macroscopic objects – which they dub “Macros” – whose masses and interaction cross-sections can be characterized in units of grams and square centimeters, respectively.

Macros are not only significantly larger than WIMPs and axions, but could potentially be assembled out of particles already in the Standard Model of particle physics – such as quarks and leptons from the early universe – instead of requiring new physics to explain their existence. WIMPs and axions remain possible candidates for dark matter, but Jacobs and Starkman argue that there’s reason to search elsewhere.

“The possibility that dark matter could be macroscopic and even emerge from the Standard Model is an old but exciting one,” Starkman told Universe Today, via email. “It is the most economical possibility, and in the face of our failure so far to find dark matter candidates in our dark matter detectors, or to make them in our accelerators, it is one that deserves our renewed attention.”

After eliminating most ordinary matter – including failed Jupiters, white dwarfs, neutron stars, stellar black holes, the black holes in centers of galaxies, and neutrinos with a lot of mass – as possible candidates, physicists turned their focus on the exotics.

Ongoing experiments at the Large Hadron Collider have so far failed to produce evidence of WIMPs. Credit: CERN/LHC/GridPP

Nevertheless, matter that was somewhere in between ordinary and exotic – relatives of neutron stars or large nuclei – was left on the table, Starkman said. “We say relatives because they probably have a considerable admixture of strange quarks, which are made in accelerators and ordinarily have extremely short lives,” he said.

Although strange quarks are highly unstable, Starkman points out that free neutrons are also unstable – yet bound with protons inside a helium nucleus, neutrons remain stable.

“That opens the possibility that stable strange nuclear matter was made in the early Universe and dark matter is nothing more than chunks of strange nuclear matter or other bound states of quarks, or of baryons, which are themselves made of quarks,” said Starkman.

Such dark matter would fit the Standard Model.

This is perhaps the most appealing aspect of the Macros theory: the notion that dark matter, which our cosmological model of the Universe depends upon, can be proven without the need for additional particles.

Still, the idea that the universe is filled with a chunky, invisible mass rather than countless invisible particles does make the universe seem a bit stranger, doesn’t it?

Further Reading: Case Western