New Technique Could Track Down Dark Energy

Robert C. Byrd Green Bank Telescope CREDIT: NRAO/AUI/NSF

From an NRAO press release:

Dark energy is the label scientists have given to whatever is causing the Universe to expand at an accelerating rate, and it is believed to make up nearly three-fourths of the mass and energy of the Universe. While the acceleration was discovered in 1998, its cause remains unknown. Physicists have advanced competing theories to explain the acceleration, and believe the best way to test those theories is to precisely measure large-scale cosmic structures. A new technique developed for the Robert C. Byrd Green Bank Telescope (GBT) has given astronomers a new way to map such large cosmic structures and, with them, to study dark energy.

Sound waves in the matter-energy soup of the extremely early Universe are thought to have left detectable imprints on the large-scale distribution of galaxies. The researchers developed a way to measure such imprints by observing the radio emission of hydrogen gas. Their technique, called intensity mapping, could, when applied to larger areas of the Universe, reveal how such large-scale structure has changed over the last few billion years, giving insight into which theory of dark energy is the most accurate.
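
As a rough illustration of the idea, here is a minimal sketch in Python (the sample values and channel grid are hypothetical; only the 21 cm rest frequency is a real physical constant). Intensity mapping bins all the hydrogen emission arriving in coarse frequency channels rather than detecting individual galaxies, and the observed frequency of the line gives the redshift of each slice of the map:

```python
import numpy as np

REST_FREQ_MHZ = 1420.405751  # rest frequency of the hydrogen 21 cm line

def redshift_from_freq(obs_freq_mhz):
    """Redshift at which 21 cm emission arrives at the observed frequency."""
    return REST_FREQ_MHZ / obs_freq_mhz - 1.0

# Hypothetical sky samples: (observed frequency in MHz, brightness)
samples = np.array([
    [700.0, 0.8],
    [701.5, 1.1],
    [699.2, 0.9],
    [702.8, 1.4],
])

# "Intensity mapping": sum everything that falls in each coarse frequency
# channel instead of trying to detect the individual galaxies.
channels = np.arange(698.0, 705.0, 2.0)   # 2 MHz-wide channels
cube, _ = np.histogram(samples[:, 0], bins=channels, weights=samples[:, 1])
print(cube)                        # total brightness per channel
print(redshift_from_freq(700.0))   # ~1.03: the kind of redshift such surveys target
```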

“Our project mapped hydrogen gas to greater cosmic distances than ever before, and shows that the techniques we developed can be used to map huge volumes of the Universe in three dimensions and to test the competing theories of dark energy,” said Tzu-Ching Chang, of the Academia Sinica in Taiwan and the University of Toronto.

To get their results, the researchers used the GBT to study a region of sky that previously had been surveyed in detail in visible light by the Keck II telescope in Hawaii. This optical survey used spectroscopy to map the locations of thousands of galaxies in three dimensions. With the GBT, instead of looking for hydrogen gas in these individual, distant galaxies — a daunting challenge beyond the technical capabilities of current instruments — the team used their intensity-mapping technique to accumulate the radio waves emitted by the hydrogen gas in large volumes of space including many galaxies.

“Since the early part of the 20th Century, astronomers have traced the expansion of the Universe by observing galaxies. Our new technique allows us to skip the galaxy-detection step and gather radio emissions from a thousand galaxies at a time, as well as all the dimly-glowing material between them,” said Jeffrey Peterson, of Carnegie Mellon University.

The astronomers also developed new techniques that removed both man-made radio interference and radio emission from nearer astronomical sources, leaving only the extremely faint radio waves coming from the very distant hydrogen gas. The result was a map of part of the “cosmic web” that correlated neatly with the structure shown by the earlier optical study. The team first proposed their intensity-mapping technique in 2008, and their GBT observations were the first test of the idea.

“These observations detected more hydrogen gas than all the previously-detected hydrogen in the Universe, and at distances ten times farther than any radio wave-emitting hydrogen seen before,” said Ue-Li Pen of the University of Toronto.

“This is a demonstration of an important technique that has great promise for future studies of the evolution of large-scale structure in the Universe,” said National Radio Astronomy Observatory Chief Scientist Chris Carilli, who was not part of the research team.

In addition to Chang, Peterson, and Pen, the research team included Kevin Bandura of Carnegie Mellon University. The scientists reported their work in the July 22 issue of the scientific journal Nature.

Using Gravitational Lensing to Measure Age and Size of Universe

A gravitational lens image of the B1608+656 system. Image courtesy Sherry Suyu of the Argelander Institut für Astronomie in Bonn, Germany.

Handy little tool, this gravitational lensing! Astronomers have used it to measure the shape of stars, look for exoplanets, and measure dark matter in distant galaxies. Now it’s being used to measure the size and age of the Universe. Researchers say this new use of gravitational lensing provides a very precise way to measure how rapidly the universe is expanding. The measurement determines a value for the Hubble constant, which indicates the size of the universe, and confirms the age of the Universe as 13.75 billion years, give or take 170 million years. The results also confirm the strength of dark energy, responsible for accelerating the expansion of the universe.
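
As a rough illustration of how an age follows from a measured Hubble constant, here is a minimal sketch in Python, assuming a flat ΛCDM cosmology with illustrative parameters (H0 = 70.6 km/s/Mpc, Omega_m = 0.27; round numbers, not the study’s exact fit):

```python
# Age of a flat LambdaCDM universe: t0 = (1/H0) * Integral_0^inf dz / [(1+z) E(z)],
# where E(z) = sqrt(Om*(1+z)^3 + OL). Parameter values here are illustrative.
import numpy as np
from scipy.integrate import quad

H0 = 70.6                 # km/s/Mpc, illustrative lensing-based value
OMEGA_M = 0.27
OMEGA_L = 1.0 - OMEGA_M   # flat universe

def integrand(z):
    E = np.sqrt(OMEGA_M * (1 + z)**3 + OMEGA_L)
    return 1.0 / ((1 + z) * E)

factor, _ = quad(integrand, 0, np.inf)   # dimensionless product t0 * H0
hubble_time_gyr = 977.8 / H0             # 1/H0 in Gyr for H0 in km/s/Mpc
print(f"t0 ~ {factor * hubble_time_gyr:.2f} Gyr")   # ~13.75 Gyr
```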

Gravitational lensing occurs when two galaxies happen to be aligned with one another along our line of sight in the sky. The gravitational field of the nearer galaxy distorts the image of the more distant galaxy into multiple arc-shaped images. Sometimes this effect even creates a complete ring, known as an “Einstein Ring.”

Researchers at the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) used gravitational lensing to measure the distances light traveled from a bright, active galaxy to the Earth along different paths. By understanding the time it took to travel along each path and the effective speeds involved, researchers could infer not just how far away the galaxy lies but also the overall scale of the universe and some details of its expansion.
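
A hedged sketch of the geometry behind that inference, in Python: in a flat ΛCDM model the measured delay Δt between images and a modeled “Fermat potential” difference Δφ combine into a time-delay distance D_Δt = cΔt/Δφ, which scales as 1/H0. Only the B1608+656 redshifts below are real; the delay, the potential difference, and the cosmological parameters are stand-in values for illustration:

```python
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458
KM_PER_MPC = 3.0857e19
OMEGA_M, OMEGA_L = 0.27, 0.73          # illustrative flat cosmology

def chi(z):
    """Comoving distance in units of the Hubble distance c/H0."""
    val, _ = quad(lambda zp: 1.0 / np.sqrt(OMEGA_M * (1 + zp)**3 + OMEGA_L), 0, z)
    return val

z_lens, z_src = 0.630, 1.394           # B1608+656 lens and source redshifts
# Time-delay distance D_dt = (1+z_l) D_l D_s / D_ls reduces, in a flat
# universe, to chi_l * chi_s / (chi_s - chi_l) in units of c/H0:
f = chi(z_lens) * chi(z_src) / (chi(z_src) - chi(z_lens))

dt_days = 36.0                         # hypothetical measured image delay
dphi_arcsec2 = 0.25                    # hypothetical Fermat potential difference
dphi = dphi_arcsec2 * (np.pi / 180 / 3600) ** 2   # arcsec^2 -> rad^2

D_dt_mpc = C_KM_S * (dt_days * 86400.0) / dphi / KM_PER_MPC  # D_dt = c*dt/dphi
print(f"H0 ~ {C_KM_S * f / D_dt_mpc:.0f} km/s/Mpc")          # ~70 for these inputs
```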

Distinguishing distances in space is difficult. A bright light far away and a dimmer source lying much closer can look like they are at the same distance. A gravitational lens circumvents this problem by providing multiple clues as to the distance light travels. That extra information allows researchers to determine the size of the universe, often expressed by astrophysicists in terms of a quantity called Hubble’s constant.

“We’ve known for a long time that lensing is capable of making a physical measurement of Hubble’s constant,” KIPAC’s Phil Marshall said. However, gravitational lensing had never before been used in such a precise way. The new measurement of Hubble’s constant is as precise as those from long-established tools such as observations of supernovae and the cosmic microwave background. “Gravitational lensing has come of age as a competitive tool in the astrophysicist’s toolkit,” Marshall said.

When a large nearby object, such as a galaxy, blocks a distant object, such as another galaxy, the light can detour around the blockage. But instead of taking a single path, light can bend around the object along two, or even four, different routes, doubling or quadrupling the amount of information scientists receive. As the brightness of the background galaxy nucleus fluctuates, physicists can measure the ebb and flow of light from the four distinct paths, such as in the B1608+656 system that was the subject of this study. The study’s lead author, Sherry Suyu of the University of Bonn, said, “In our case, there were four copies of the source, which appear as a ring of light around the gravitational lens.”

Though researchers do not know when light left its source, they can still compare arrival times. Marshall likens it to four cars taking four different routes between places on opposite sides of a large city, say from Stanford University to Lick Observatory, through or around San Jose. And like automobiles facing traffic snarls, light can encounter delays, too.

“The traffic density in a big city is like the mass density in a lens galaxy,” Marshall said. “If you take a longer route, it need not lead to a longer delay time. Sometimes the shorter distance is actually slower.”

The gravitational lens equations account for all the variables such as distance and density, and provide a better idea of when light left the background galaxy and how far it traveled.

In the past, this method of distance estimation was plagued by errors, but physicists now believe it is comparable with other measurement methods. With this technique, the researchers have come up with a more accurate lensing-based value for Hubble’s constant, and a better estimation of the uncertainty in that constant. By both reducing and understanding the size of the error in their calculations, they can achieve better estimates of the structure of the lens and the size of the universe.

There are several factors scientists still need to account for in determining distances with lenses. For example, dust in the lens can skew the results. The Hubble Space Telescope has infrared filters useful for eliminating dust effects. The images also contain information about the number of galaxies lying along the line of sight; these contribute to the lensing effect at a level that needs to be taken into account.

Marshall says several groups are working on extending this research, both by finding new systems and further examining known lenses. Researchers are already aware of more than twenty other astronomical systems suitable for analysis with gravitational lensing.

The results of this study were published in the March 1 issue of The Astrophysical Journal. The researchers used data collected by the NASA/ESA Hubble Space Telescope, and showed the improved precision these data provide in combination with results from the Wilkinson Microwave Anisotropy Probe (WMAP).

Source: SLAC

Quintessence

Quintessence is one idea – hypothesis – of what dark energy is (remember that dark energy is the shorthand expression for the apparent acceleration of the expansion of the universe … or the form of mass-energy which causes this observed acceleration, in cosmological models built with Einstein’s theory of general relativity).

The word quintessence means fifth essence, and is kinda cute … remember Earth, Water, Fire, and Air, the ‘four essences’ of the Ancient Greeks? Well, in modern cosmology, there are also four essences: normal matter, radiation (photons), cold dark matter, and neutrinos (which are hot dark matter!).

Quintessence covers a range of hypotheses (or models); the main difference between quintessence as a (possible) explanation for dark energy and the cosmological constant Λ (which harks back to Einstein and the early years of the 20th century) is that quintessence varies with time (albeit slooowly), and can also vary with location (space). One version of quintessence is phantom energy, in which the energy density increases with time, and leads to a Big Rip end of the universe.
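
The distinction is often phrased in terms of the equation of state parameter w = p/(ρc²) of the dark energy component. The following summary (standard textbook scaling for constant w, not specific to any one quintessence model) shows how the density evolves with the cosmic scale factor a:

```latex
% How a dark-energy component's density evolves with the scale factor a,
% for a constant equation of state w = p / (rho c^2):
\[
  \rho(a) \propto a^{-3(1+w)}
  \qquad\Longrightarrow\qquad
  \begin{cases}
    w = -1: & \rho = \text{const (cosmological constant } \Lambda\text{)}\\[2pt]
    w > -1: & \rho \text{ slowly dilutes (typical quintessence)}\\[2pt]
    w < -1: & \rho \text{ grows with time (phantom energy, Big Rip)}
  \end{cases}
\]
```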

Quintessence, as a scalar field, is not the least bit unusual in physics (the Newtonian gravitational potential field is one example of a real scalar field; the Higgs field of the Standard Model of particle physics is an example of a complex scalar field); however, it shares some difficulties with the cosmological constant (in a nutshell: how can it be so small?).

Can quintessence be observed; or, rather, can quintessence be distinguished from a cosmological constant? In astronomy, yes … by finding a way to observe (and measure) the acceleration of the universe at widely different times (quintessence and Λ predict different results). Another way might be to observe variations in the fundamental constants (e.g. the fine structure constant) or violations of Einstein’s equivalence principle.

One project seeking to measure the acceleration of the universe more accurately was ESSENCE (“Equation of State: SupErNovae trace Cosmic Expansion”).

In 1999, a year after the discovery of dark energy, CERN Courier published a nice summary of cosmology as it was understood then, The quintessence of cosmology (it’s well worth a read, though a lot has happened in the past decade).

Universe Today articles? Yep! For example, Will the Universe Expand Forever?, More Evidence for Dark Energy, and Hubble Helps Measure the Pace of Dark Energy.

Astronomy Cast episodes relevant to quintessence include What is the universe expanding into?, and A Universe of Dark Energy.

Source: NASA

New Search for Dark Energy Goes Back in Time

A previous optical image of one of the approximately 200 quasars captured in the Baryon Oscillation Spectroscopic Survey (BOSS) "first light" exposure is shown at top, with the BOSS spectrum of the object at bottom. The spectrum allows astronomers to determine the object's redshift. With millions of such spectra, BOSS will measure the geometry of the Universe. Credit: David Hogg, Vaishali Bhardwaj, and Nic Ross of SDSS-III

Baryon acoustic oscillation (BAO) sounds like it could be technobabble from a Star Trek episode, but BAO is real, and astronomers are searching for these matter fluctuations to do what seems like science fiction: look back in time to find clues about dark energy. The Baryon Oscillation Spectroscopic Survey (BOSS), a part of the Sloan Digital Sky Survey III (SDSS-III), saw “first light” on astronomical data last month, and will map the expansion history of the Universe.

“Baryon oscillation is a fast-maturing method for measuring dark energy in a way that’s complementary to the proven techniques of supernova cosmology,” said David Schlegel from the Lawrence Berkeley National Laboratory (Berkeley Lab), the Principal Investigator of BOSS. “The data from BOSS will be some of the best ever obtained on the large-scale structure of the Universe.”

BOSS uses the same telescope as the original Sloan Digital Sky Survey — a 2.5-meter telescope at Apache Point Observatory in New Mexico — but equipped with new, specially-built spectrographs to measure the spectra.

Senior Operations Engineer Dan Long loads the first cartridge of the night into the Sloan Digital Sky Survey telescope. The cartridge holds a “plug-plate” at the top which then holds a thousand optical fibers shown in red and blue. These cartridges are locked into the base of the telescope and are changed many times during a night. Photo credit: D. Long

Baryon oscillations began as pressure waves traveling through the early universe. The density variations they created left their mark as the Universe evolved, in the periodic clustering of visible matter in galaxies, quasars, and intergalactic gas, as well as in the clumping of invisible dark matter.

Comparing these scales at different eras makes it possible to trace the details of how the Universe has expanded throughout its history – information that can be used to distinguish among competing theories of dark energy.

“Like sound waves passing through air, the waves push some of the matter closer together as they travel,” said Nikhil Padmanabhan, a BOSS researcher who recently moved from Berkeley Lab to Yale University. “In the early universe, these waves were moving at half the speed of light, but when the universe was only a few hundred thousand years old, the universe cooled enough to halt the waves, leaving a signature 500 million light-years in length.”

“We can see these frozen waves in the distribution of galaxies today,” said Daniel Eisenstein of the University of Arizona, the Director of the SDSS-III. “By measuring the length of the baryon oscillations, we can determine how dark energy has affected the expansion history of the universe. That in turn helps us figure out what dark energy could be.”
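
As a back-of-envelope illustration of this “standard ruler,” the Python sketch below (illustrative flat ΛCDM parameters; 150 Mpc stands in for the roughly 500-million-light-year scale quoted above) computes the angle the frozen acoustic scale subtends at a few redshifts. How that angle shrinks with redshift is part of what encodes the expansion history:

```python
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458
H0 = 70.0                  # km/s/Mpc, illustrative
OMEGA_M, OMEGA_L = 0.27, 0.73
R_BAO_MPC = 150.0          # comoving acoustic scale, roughly 500 Mly

def comoving_distance_mpc(z):
    """Comoving distance to redshift z in a flat LambdaCDM model."""
    val, _ = quad(lambda zp: 1.0 / np.sqrt(OMEGA_M * (1 + zp)**3 + OMEGA_L), 0, z)
    return (C_KM_S / H0) * val

for z in (0.35, 0.7, 2.5):
    theta_deg = np.degrees(R_BAO_MPC / comoving_distance_mpc(z))
    # ~6.2, ~3.4 and ~1.4 degrees for these parameters
    print(f"z = {z}: BAO ruler spans ~{theta_deg:.1f} degrees on the sky")
```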

“Studying baryon oscillations is an exciting method for measuring dark energy in a way that’s complementary to techniques in supernova cosmology,” said Kyle Dawson of the University of Utah, who is leading the commissioning of BOSS. “BOSS’s galaxy measurements will be a revolutionary dataset that will provide rich insights into the universe,” added Martin White of Berkeley Lab, BOSS’s survey scientist.

On Sept. 14-15, 2009, astronomers used BOSS to measure the spectra of a thousand galaxies and quasars. The goal of BOSS is to measure 1.4 million luminous red galaxies at redshifts up to 0.7 (when the Universe was roughly seven billion years old) and 160,000 quasars at redshifts between 2.0 and 3.0 (when the Universe was only about three billion years old). BOSS will also measure variations in the density of hydrogen gas between the galaxies. The observation program will take five years.
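
Those quoted ages can be roughly checked with a flat ΛCDM calculation (illustrative parameters; the survey’s actual analysis is far more involved): the age of the Universe at redshift z is t(z) = (1/H0) × the integral from z to infinity of dz′/[(1+z′)E(z′)].

```python
# Rough check of the ages quoted above, in an illustrative flat LambdaCDM model.
import numpy as np
from scipy.integrate import quad

H0, OMEGA_M = 70.6, 0.27
OMEGA_L = 1.0 - OMEGA_M
HUBBLE_TIME_GYR = 977.8 / H0    # 1/H0 in Gyr for H0 in km/s/Mpc

def age_gyr(z):
    """Age of the universe at redshift z."""
    integrand = lambda zp: 1.0 / ((1 + zp) * np.sqrt(OMEGA_M * (1 + zp)**3 + OMEGA_L))
    val, _ = quad(integrand, z, np.inf)
    return HUBBLE_TIME_GYR * val

print(f"z = 0.7: ~{age_gyr(0.7):.1f} Gyr")   # roughly seven billion years
print(f"z = 2.5: ~{age_gyr(2.5):.1f} Gyr")   # roughly three billion years
```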

Source: Sloan Digital Sky Survey

Variability in Type 1A Supernovae Has Implications for Studying Dark Energy

A Hubble Space Telescope-Image of Supernova 1994D (SN1994D) in galaxy NGC 4526 (SN 1994D is the bright spot on the lower left). Image Credit:HST

The discovery of dark energy, a mysterious force that is accelerating the expansion of the universe, was based on observations of type 1a supernovae, and these stellar explosions have long been used as “standard candles” for measuring the expansion. But not all type 1a supernovae are created equal. A new study reveals sources of variability in these supernovae, and to accurately probe the nature of dark energy and determine if it is constant or variable over time, scientists will have to find a way to measure cosmic distances with much greater precision than they have in the past.

“As we begin the next generation of cosmology experiments, we will want to use type 1a supernovae as very sensitive measures of distance,” said Daniel Kasen, lead author of a study published in Nature this week. “We know they are not all the same brightness, and we have ways of correcting for that, but we need to know if there are systematic differences that would bias the distance measurements. So this study explored what causes those differences in brightness.”

Kasen and his coauthors–Fritz Röpke of the Max Planck Institute for Astrophysics in Garching, Germany, and Stan Woosley, professor of astronomy and astrophysics at UC Santa Cruz–used supercomputers to run dozens of simulations of type 1a supernovae. The results indicate that much of the diversity observed in these supernovae is due to the chaotic nature of the processes involved and the resulting asymmetry of the explosions.

For the most part, this variability would not produce systematic errors in measurement studies as long as researchers use large numbers of observations and apply the standard corrections, Kasen said. The study did find a small but potentially worrisome effect that could result from systematic differences in the chemical compositions of stars at different times in the history of the universe. But researchers can use the computer models to further characterize this effect and develop corrections for it.

A type 1a supernova occurs when a white dwarf star acquires additional mass by siphoning matter away from a companion star. When it reaches a critical mass–1.4 times the mass of the Sun, packed into an object the size of the Earth–the heat and pressure in the center of the star spark a runaway nuclear fusion reaction, and the white dwarf explodes. Since the initial conditions are about the same in all cases, these supernovae tend to have the same luminosity, and their “light curves” (how the luminosity changes over time) are predictable.
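
A quick back-of-envelope check on those figures (Python; the solar mass and Earth radius are standard values): 1.4 solar masses in an Earth-sized sphere implies a staggering mean density.

```python
# Mean density of 1.4 solar masses packed into an Earth-sized sphere.
from math import pi

M_SUN_KG = 1.989e30
R_EARTH_M = 6.371e6

mass = 1.4 * M_SUN_KG
volume = 4.0 / 3.0 * pi * R_EARTH_M**3
print(f"mean density ~ {mass / volume:.1e} kg/m^3")   # ~2.6e9 kg/m^3
```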

Some are intrinsically brighter than others, but these flare and fade more slowly, and this correlation between the brightness and the width of the light curve allows astronomers to apply a correction to standardize their observations. So astronomers can measure the light curve of a type 1a supernova, calculate its intrinsic brightness, and then determine how far away it is, since the apparent brightness diminishes with distance (just as a candle appears dimmer at a distance than it does up close).
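
Here is a toy Python version of that standardization step, with made-up coefficients (the real width-luminosity corrections are calibrated empirically): correct the absolute magnitude for light-curve width, then invert the inverse-square law through the distance modulus μ = m − M = 5 log10(d/10 pc).

```python
# Toy light-curve standardization; ALPHA and M_FIDUCIAL are hypothetical.
import numpy as np

ALPHA = 1.5          # hypothetical slope: broader light curves are brighter
M_FIDUCIAL = -19.3   # rough absolute magnitude of a fiducial type 1a

def distance_mpc(m_apparent, width):
    """Width-correct the luminosity, then invert the distance modulus."""
    M = M_FIDUCIAL - ALPHA * (width - 1.0)   # standardized absolute magnitude
    mu = m_apparent - M                      # distance modulus mu = m - M
    return 10 ** (mu / 5.0 + 1.0) / 1e6      # mu = 5 log10(d_pc) - 5

# Two hypothetical supernovae with the same apparent brightness: the one
# with the broader light curve is intrinsically brighter, hence farther.
print(f"{distance_mpc(22.0, width=1.0):.0f} Mpc")   # ~1800 Mpc
print(f"{distance_mpc(22.0, width=1.1):.0f} Mpc")   # ~1950 Mpc
```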

The computer models used to simulate these supernovae in the new study are based on current theoretical understanding of how and where the ignition process begins inside the white dwarf and where it makes the transition from slow-burning combustion to explosive detonation.

The simulations showed that the asymmetry of the explosions is a key factor determining the brightness of type 1a supernovae. “The reason these supernovae are not all the same brightness is closely tied to this breaking of spherical symmetry,” Kasen said.

The dominant source of variability is the synthesis of new elements during the explosions, which is sensitive to differences in the geometry of the first sparks that ignite a thermonuclear runaway in the simmering core of the white dwarf. Nickel-56 is especially important, because the radioactive decay of this unstable isotope creates the afterglow that astronomers are able to observe for months or even years after the explosion.

“The decay of nickel-56 is what powers the light curve. The explosion is over in a matter of seconds, so what we see is the result of how the nickel heats the debris and how the debris radiates light,” Kasen said.
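
A minimal sketch of that energy source in Python (the half-lives are standard values for these isotopes; everything else, including gamma-ray escape and the radiative transfer Kasen models, is ignored): nickel-56 decays quickly, while its daughter cobalt-56 builds up and then powers the slow tail of the light curve.

```python
# Decay rates in the Ni-56 -> Co-56 -> Fe-56 chain (Bateman solution),
# per initial Ni-56 nucleus; a crude proxy for the energy deposition.
import numpy as np

LAM_NI = np.log(2) / 6.08    # Ni-56 decay constant, 1/day (half-life ~6.1 d)
LAM_CO = np.log(2) / 77.2    # Co-56 decay constant, 1/day (half-life ~77 d)

def decays_per_day(t_days, n_ni0=1.0):
    n_ni = n_ni0 * np.exp(-LAM_NI * t_days)
    n_co = n_ni0 * LAM_NI / (LAM_CO - LAM_NI) * (
        np.exp(-LAM_NI * t_days) - np.exp(-LAM_CO * t_days))
    return LAM_NI * n_ni, LAM_CO * n_co

for t in (10, 30, 100):
    ni, co = decays_per_day(t)
    print(f"day {t:3d}: Ni-56 rate {ni:.4f}, Co-56 rate {co:.4f}")
```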

Kasen developed the computer code to simulate this radiative transfer process, using output from the simulated explosions to produce visualizations that can be compared directly to astronomical observations of supernovae.

The good news is that the variability seen in the computer models agrees with observations of type 1a supernovae. “Most importantly, the width and peak luminosity of the light curve are correlated in a way that agrees with what observers have found. So the models are consistent with the observations on which the discovery of dark energy was based,” Woosley said.

Another source of variability is that these asymmetric explosions look different when viewed at different angles. This can account for differences in brightness of as much as 20 percent, Kasen said, but the effect is random and creates scatter in the measurements that can be statistically reduced by observing large numbers of supernovae.
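
That statistical reduction is the usual 1/√N averaging of independent random errors, as a quick sketch shows (assuming the per-object scatter is purely random and independent):

```python
# How a random 20% per-object scatter averages down with sample size.
import numpy as np

for n in (1, 25, 100, 400):
    print(f"N = {n:4d}: scatter on the mean ~ {20.0 / np.sqrt(n):4.1f}%")
```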

The potential for systematic bias comes primarily from variation in the initial chemical composition of the white dwarf star. Heavier elements are synthesized during supernova explosions, and debris from those explosions is incorporated into new stars. As a result, stars formed recently are likely to contain more heavy elements (higher “metallicity,” in astronomers’ terminology) than stars formed in the distant past.

“That’s the kind of thing we expect to evolve over time, so if you look at distant stars corresponding to much earlier times in the history of the universe, they would tend to have lower metallicity,” Kasen said. “When we calculated the effect of this in our models, we found that the resulting errors in distance measurements would be on the order of 2 percent or less.”

Further studies using computer simulations will enable researchers to characterize the effects of such variations in more detail and limit their impact on future dark-energy experiments, which might require a level of precision that would make errors of 2 percent unacceptable.

Source: EurekAlert