New CMB Measurements Support Standard Model

Measurements of polarized light from the early Universe allowed researchers to better map the location of matter (the left image), which later became the stars and galaxies we have today. Image Credit: Sarah Church/Walter Gear

New measurements of the cosmic microwave background (CMB) – the leftover light from the Big Bang – lend further support to the Standard Cosmological Model and the existence of dark matter and dark energy, limiting the possibility of alternative models of the Universe. Researchers from Stanford University and Cardiff University produced a detailed map of the composition and structure of matter as it would have looked shortly after the Big Bang, which shows that the Universe would not look as it does today if it were made up solely of ‘normal matter’.

By measuring the way the light of the CMB is polarized, a team led by Sarah Church of the Kavli Institute for Particle Astrophysics and Cosmology at Stanford University and by Walter Gear, head of the School of Physics and Astronomy at Cardiff University in the United Kingdom, was able to construct a map of the way the Universe would have looked shortly after matter came into existence after the Big Bang. Their findings lend evidence to the predictions of the Standard Model, in which the Universe is composed of 95% dark matter and dark energy, and only 5% ordinary matter.

Polarization is a feature of light in which the oscillation of the light wave lies at right angles to the direction in which the light is traveling. Though most light is unpolarized, light that has interacted with matter can become polarized. The leftover light from the Big Bang – the CMB – has now cooled to a few degrees above absolute zero, but it still retains the polarization it acquired in the early Universe, once the Universe had cooled enough to become transparent to light. By measuring this polarization, the researchers were able to extrapolate the location, structure, and velocity of matter in the early Universe with unprecedented precision. The gravitational collapse of large clumps of matter in the early Universe created certain resonances in the polarization that allowed the researchers to create a map of the matter composition.

Dr. Gear said, “The pattern of oscillations in the power spectra allows us to discriminate, as ‘real’ and ‘dark’ matter affect the position and amplitudes of the peaks in different ways. The results are also consistent with many other pieces of evidence for dark matter, such as the rotation rate of galaxies, and the distribution of galaxies in clusters.”

The QUaD experiment, located at the South Pole, allowed researchers to measure the polarization of the CMB with very high precision. Image Credit: Sarah Church

The measurements made by the QUaD experiment further constrain those made by previous experiments measuring properties of the CMB, such as WMAP and ACBAR. In comparison to these previous experiments, the QUaD measurements come closer to fitting what is predicted by the Standard Cosmological Model by more than an order of magnitude, said Dr. Gear. This is a very important step on the path to verifying whether our model of the Universe is correct.

The researchers used the QUaD experiment at the South Pole to make their observations. The QUaD telescope’s detectors are bolometers, essentially thermometers that measure how certain types of radiation increase the temperature of the metals in the detector. The detector itself has to be kept near 1 Kelvin to eliminate noise radiation from the surrounding environment, which is why it is located at the frigid South Pole and placed inside a cryostat.

Paper co-author Walter Gear said in an email interview:

“The polarization is imprinted at the time the Universe becomes transparent to light, about 400,000 years after the big bang, rather than right after the big bang before matter existed. There are major efforts now to try to find what is called the ‘B-mode’ signal, which is a more complicated polarization pattern that IS imprinted right after the big bang. QUaD places the best current upper limit on this but is still more than an order of magnitude away in sensitivity from even optimistic predictions of what that signal might be. That is the next generation of experiments’ goal.”

The results, published in a paper titled “Improved Measurements of the Temperature and Polarization of the Cosmic Microwave Background from QUaD” in the November 1st issue of the Astrophysical Journal, fit the predictions of the Standard Model remarkably well, providing further evidence for the existence of dark matter and dark energy, and constraining alternative models of the Universe.

Source: SLAC, email interview with Dr. Walter Gear

If We Live in a Multiverse, How Many Are There?

Artist’s concept of the cyclic universe.

Theoretical physics has brought us the notion that our single universe is not necessarily the only game in town. Satellite data from WMAP, along with string theory and its 11-dimensional hyperspace idea, has produced the concept of the multiverse, where the Big Bang could have produced many different universes instead of a single uniform universe. The idea has gained popularity recently, so it was only a matter of time until someone asked how many universes could possibly exist. The number, according to two physicists, could be “humongous.”

Andrei Linde and Vitaly Vanchurin at Stanford University in California did a few back-of-the-envelope calculations, starting with the idea that the Big Bang was essentially a quantum process which generated quantum fluctuations in the state of the early universe. The universe then underwent a period of rapid growth called inflation, during which these perturbations were “frozen,” creating different initial classical conditions in different parts of the cosmos. Since each of these regions would have a different set of laws of low-energy physics, they can be thought of as different universes.

Linde and Vanchurin then estimated how many different universes could have appeared as a result of this effect. Their answer is that this number must be proportional to the effect that caused the perturbations in the first place, a process called slow-roll inflation, the solution Linde came up with previously to address the problem of bubbles of universes colliding in the early inflation period. In this model, inflation occurred as a scalar field rolled down a potential energy hill. When the field rolls very slowly compared to the expansion of the universe, inflation occurs and collisions end up being rare.

Using all of this (and more – see their paper here), Linde and Vanchurin calculate that the number of universes in the multiverse could be at least 10^10^10^7, a number which is definitely “humongous,” as they described it.

The next question, then, is how many universes could we actually see? Linde and Vanchurin say they had to invoke the Bekenstein limit, where the properties of the observer become an important factor: there is a limit to the amount of information that can be contained within any given volume of space, and a further limit set by the human brain itself.

The total amount of information that can be absorbed by one individual during a lifetime is about 10^16 bits, so a typical human brain can have about 10^10^16 configurations, and could therefore never distinguish more than that number of different universes.

The number of multiverses the human brain could distinguish. Credit: Linde and Vanchurin
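To get a sense of how lopsided these two numbers are, here is a minimal sketch in Python (the quantities come from the article; comparing iterated logarithms is just an illustrative trick, since neither number could ever be computed directly):

```python
# Both numbers are towers of exponents, so compare log10(log10(N))
# instead of trying to materialize them:
#   universes in the multiverse: N = 10^(10^(10^7)) -> log10(log10(N)) = 10^7
#   distinguishable by a brain:  M = 10^(10^16)     -> log10(log10(M)) = 16

log_log_universes = 10**7  # from 10^10^10^7
log_log_brain = 16         # from 10^10^16

print(f"log10(log10(multiverse estimate)) = {log_log_universes:,}")
print(f"log10(log10(brain limit))         = {log_log_brain}")
# 10,000,000 vs. 16: even at the level of double logarithms, the observer's
# limit is by far the smaller number.
```

Even after taking the logarithm twice, the multiverse estimate dwarfs the brain’s limit, which is why the observer, not the multiverse, ends up setting the bound.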

“So, the total number of possibilities accessible to any given observer is limited not only by the entropy of perturbations of metric produced by inflation and by the size of the cosmological horizon, but also by the number of degrees of freedom of an observer,” the physicists write.

“We have found that the strongest limit on the number of different locally distinguishable geometries is determined mostly by our abilities to distinguish between different universes and to remember our results,” wrote Linde and Vanchurin. “Potentially it may become very important that when we analyze the probability of existence of a universe of a given type, we should be talking about a consistent pair: the universe and an observer who makes the rest of the universe ‘alive’ and the wave function of the rest of the universe time-dependent.”

So their conclusion is that the limit does not depend on the properties of the multiverse itself, but on the properties of the observer.

They hope to study this concept further to see if this probability is proportional to the observable entropy of inflation.

Sources: ArXiv, Technology Review Blog

What! No Parallel Universe? Cosmic Cold Spot Just Data Artifact

Region in space detected by WMAP cooler than its surroundings. But not really. Image Credit: Rudnick/NRAO/AUI/NSF, NASA

Rats! Another perplexing space mystery solved by science. New analysis of the famous “cold spot” in the cosmic microwave background confirms that the spot is just an artifact of the statistical methods used to find it. That means there is no supervoid lurking in the CMB, and no parallel universe lying just beyond the edge of our own. What fun is that?

Back in 2004, astronomers studying data from the Wilkinson Microwave Anisotropy Probe (WMAP) found a region of the cosmic microwave background in the southern hemisphere in the direction of the constellation of Eridanus that was significantly colder than the rest by about 70 microkelvin. The probability of finding something like that was extremely low. If the Universe really is homogeneous and isotropic, then all points in space ought to experience the same physical development, and appear the same. This just wasn’t supposed to be there.

Some astronomers suggested the spot could be a supervoid, a remnant of an early phase transition in the universe. Others theorized it was a window into a parallel universe.

Well, it turns out, it wasn’t there.

Ray Zhang and Dragan Huterer at the University of Michigan in Ann Arbor say that the cold spot is simply an artifact of the statistical method, called Spherical Mexican Hat Wavelets, used to analyze the WMAP data. Use a different method of analysis and the cold spot disappears (or at least is no colder than expected).

“We trace this apparent discrepancy to the fact that WMAP cold spot’s temperature profile just happens to favor the particular profile given by the wavelet,” the duo says in their paper. “We find no compelling evidence for the anomalously cold spot in WMAP at scales between 2 and 8 degrees.”

This confirms a 2008 paper by Huterer and colleague Kendrick Smith of the University of Cambridge, which showed that the huge void could be considered a statistical fluke because it had stars both in front of and behind it.

And in fact, one of the earlier papers suggesting the cold spot, by Lawrence Rudnick of the University of Minnesota, does indeed acknowledge that statistical uncertainties had not been accounted for.

Oh well. Now, on to the next cosmological mysteries like dark matter and dark energy!

Zhang and Huterer’s paper

Huterer and Smith’s paper (2008)

Rudnick’s paper (2007)

Original paper “finding” the cold spot

Sources: Technology Review Blog, Science

Hubble’s Law

Velocity vs. distance, from Hubble's 1929 paper

“The distance to objects beyond the Local Group is closely related to how fast they seem to be receding from us.” That’s Hubble’s law in a nutshell.

Edwin Hubble, the astronomer the Hubble Space Telescope is named after, first described the relationship which later bore his name in a 1929 paper. Here is one of the ways he described it in that paper: “The data in the table [of ‘nebulae’, i.e. galaxies] indicate a linear correlation between distances and velocities”. In numerical form, v = Hd, where v is the speed at which a distant object is receding from us, d is its distance, and H is the Hubble constant.

Today the Hubble law is usually expressed as a relationship between redshift and distance, partly because redshift is what astronomers can measure directly.

Hubble’s Law is an empirical relationship, and it was the first concrete evidence that Einstein’s theory of General Relativity applies to the universe as a whole, as Georges Lemaître had proposed only two years earlier (interestingly, Lemaître’s paper also includes an estimate of the Hubble constant!). The universal applicability of General Relativity is the heart of the Big Bang theory, and the predicted expansion of space appears to us as Hubble’s Law: the speed at which things seem to be receding is proportional to their distance.

Although other astronomers, such as Vesto Slipher, did much of the work needed to measure the galaxy redshifts, Hubble was the one who developed techniques for estimating the distances to the galaxies, and who pulled it all together to show how distance and speed were related.

Hubble’s Law is not exact; the measured redshift of some galaxies is different from what Hubble’s Law says it should be, given their distances. This is particularly noticeable for galaxy clusters, and is explained as the motion of galaxies within their local groups or clusters, due to their mutual gravitation.

Because the exact value of the Hubble constant, H, is so important in extragalactic astronomy and cosmology – it leads to an estimate of the age of the universe, helps test theories of Dark Matter and Dark Energy, and much more – a great deal of effort has gone into working it out. Today it is estimated to be 71 kilometers per second per megaparsec, plus or minus 7; this is about 21 km/sec per million light-years. What does this mean? An object a million light-years away would be receding from us at 21 km/sec; an object 10 million light-years away, 210 km/sec, etc.
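As a quick check of that arithmetic, here is a minimal sketch in Python (the only inputs are H = 71 km/s/Mpc from the paragraph above and the standard conversion 1 megaparsec ≈ 3.26 million light-years):

```python
# Hubble's law: v = H * d, with H converted from km/s per megaparsec
# to km/s per million light-years.

H = 71.0             # km/s per megaparsec
MPC_IN_MLY = 3.26    # 1 megaparsec is about 3.26 million light-years

h_per_mly = H / MPC_IN_MLY  # about 21.8 km/s per million light-years

for d in (1, 10, 100):      # distance in millions of light-years
    v = h_per_mly * d       # recession speed in km/s
    print(f"{d:>3} million ly -> about {v:,.0f} km/s")
```

This reproduces the figures quoted above, about 21 km/sec per million light-years and roughly 210 km/sec at 10 million light-years, to within rounding.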

Perhaps the most dramatic revision to the Hubble Law came in 1998, when two teams independently announced that they’d discovered that the rate of expansion of the universe is accelerating; the shorthand name for whatever drives this acceleration is Dark Energy.

Harvard University’s Professor of Cosmology John Huchra maintains a webpage on the history of the Hubble constant, and this page from Ned Wright’s Cosmology Tutorial explains how the Hubble law and cosmology are related.

There are several Universe Today stories about the Hubble relationship and the Hubble constant; for example Astronomers Closing in on Dark Energy with Refined Hubble Constant, and Cosmologists Improve on Standard Candles Measurement.

And we have done some Astronomy Cast episodes on it too: How Old is the Universe? and How Big is the Universe?

Sources:
UT-Knoxville
NASA
Cornell Astronomy

What is Entropy?

After some time, this cold glass will reach thermal equilibrium

Perhaps there’s no better way to understand entropy than to grasp the second law of thermodynamics, and vice versa. This law states that the entropy of an isolated system that is not in equilibrium will increase as time progresses until equilibrium is finally achieved.

Let’s try to elaborate a little on this equilibrium thing. Note that in the examples that follow, we’ll assume both systems are isolated.

First example. Imagine putting a hot body and a cold body side by side. What happens after some time? That’s right. They both end up at the same temperature, one that is lower than the original temperature of the hotter body and higher than the original temperature of the colder one.

Second example. Ever heard of a low-pressure area? It’s what weather reporters call a region characterized by strong winds and perhaps some rain. This happens because fluids flow from regions of high pressure to regions of low pressure. Thus, when the fluid, air in this case, comes rushing in, it does so in the form of strong winds. This goes on until the pressures in the adjacent regions even out.

In both cases, the physical quantities that started out uneven between the two bodies or regions even out in the end, i.e., when equilibrium is achieved. Entropy is the measure of how far this evening-out process has progressed.

During the process of attaining equilibrium, it is possible to tap into the system to perform work, as in a heat engine. Notice, however, that work can only be done for as long as there is a difference in temperature. Without one, as when maximum entropy has already been achieved, no work can be performed.
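To make the first example quantitative, here is a minimal sketch in Python (the setup of two identical bodies with constant heat capacity, and the particular mass and temperatures, are illustrative assumptions, not from the article):

```python
import math

# Two identical bodies, equal mass and specific heat, brought into contact
# in an isolated system. For a body with constant heat capacity, the entropy
# change is dS = m * c * ln(T_final / T_initial).
m = 1.0                       # kg (assumed)
c = 4186.0                    # J/(kg*K), water (assumed)
T_hot, T_cold = 350.0, 290.0  # initial temperatures in kelvin (assumed)

T_final = (T_hot + T_cold) / 2               # equilibrium temperature
dS_hot = m * c * math.log(T_final / T_hot)   # negative: the hot body cools
dS_cold = m * c * math.log(T_final / T_cold) # positive: the cold body warms

print(f"T_final  = {T_final:.1f} K")
print(f"dS_hot   = {dS_hot:+.1f} J/K")
print(f"dS_cold  = {dS_cold:+.1f} J/K")
print(f"dS_total = {dS_hot + dS_cold:+.1f} J/K  (always >= 0)")
```

The cold body gains more entropy than the hot body loses, so the total always goes up; and once the two temperatures are equal, no further work can be extracted, which is exactly the second law in action.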

Since the concept of entropy applies to all isolated systems, it has been studied not only in physics but also in information theory, mathematics, as well as other branches of science and applied science.

Because the accepted view is that the universe is finite, it can very well be considered a closed system. As such, it should also be governed by the second law of thermodynamics. Thus, like all isolated systems, the entropy of the universe is expected to be increasing.

So what? Well, just like all isolated systems, the universe is therefore also expected to end up as a useless heap in equilibrium, a.k.a. a heat death, from which energy can no longer be extracted. To give you some relief, though, not everyone involved in the study of cosmology totally agrees with entropy’s so-called role in the grand scheme of things.

You can read more about entropy here in Universe Today. Want to know why time might flow in one direction? Have you ever thought about the time before the Big Bang? The entire entropy concept plays an important role in understanding them.

There’s more about entropy at NASA and Physics World too.

There are two episodes at Astronomy Cast that you might want to check out as well.

Source:
Hyperphysics