ESA’s Tough Choice: Dark Matter, Close Solar Flyby, Exoplanets (Pick Two)

Thales Alenia Space and EADS Astrium concepts for Euclid (ESA)


Key questions relevant to fundamental physics and cosmology: the nature of the mysterious dark energy and dark matter (Euclid); the frequency of exoplanets around other stars, including Earth analogs (PLATO); and the closest-ever look at our Sun, approaching to just 62 solar radii (Solar Orbiter) … but only two can fly! Which would be your picks?

These three mission concepts have been chosen by the European Space Agency’s Science Programme Committee (SPC) as candidates for two medium-class missions to be launched no earlier than 2017. They now enter the definition phase, the next step required before the final decision is taken as to which missions are implemented.

These three missions are the finalists from 52 proposals that were either made or carried forward in 2007. They were whittled down to just six mission proposals in 2008 and sent for industrial assessment. Now that the reports from those studies are in, the missions have been pared down again. “It was a very difficult selection process. All the missions contained very strong science cases,” says Lennart Nordh, Swedish National Space Board and chair of the SPC.

And the tough decisions are not yet over. Only two of the three missions – Euclid, PLATO and Solar Orbiter – can be selected for the M-class launch slots. All three missions present challenges that will have to be resolved at the definition phase. A specific challenge, of which the SPC was conscious, is the ability of these missions to fit within the available budget. The final decision about which missions to implement will be taken after the definition activities are completed, which is foreseen to be in mid-2011.
Euclid is an ESA mission to map the geometry of the dark Universe. The mission would investigate the distance-redshift relationship and the evolution of cosmic structures. It would achieve this by measuring shapes and redshifts of galaxies and clusters of galaxies out to redshifts ~2, or equivalently to a look-back time of 10 billion years. It would therefore cover the entire period over which dark energy played a significant role in accelerating the expansion.
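That redshift-to-look-back-time conversion is easy to check numerically: the look-back time is the integral of dz′/((1+z′)H(z′)). Here is a minimal sketch, assuming illustrative flat ΛCDM parameters (H0 = 70 km/s/Mpc, Ωm = 0.27) rather than any mission-specific fit:

```python
import math

# Illustrative flat LambdaCDM parameters (assumed here, not Euclid's own fit)
H0_KM_S_MPC = 70.0   # Hubble constant, km/s/Mpc
OMEGA_M = 0.27       # matter density
OMEGA_L = 0.73       # dark-energy density

MPC_KM = 3.0857e19   # kilometres per megaparsec
GYR_S = 3.156e16     # seconds per gigayear

def lookback_time_gyr(z, steps=20000):
    """Look-back time to redshift z: integrate dz' / ((1 + z') H(z'))."""
    h0 = H0_KM_S_MPC / MPC_KM  # H0 in 1/s
    dz = z / steps
    total = 0.0
    for i in range(steps):
        zp = (i + 0.5) * dz    # midpoint rule
        ez = math.sqrt(OMEGA_M * (1 + zp) ** 3 + OMEGA_L)
        total += dz / ((1 + zp) * h0 * ez)
    return total / GYR_S

print(round(lookback_time_gyr(2.0), 1))  # about 10.5 Gyr -- the "10 billion years" above
```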

By approaching as close as 62 solar radii, Solar Orbiter would view the solar atmosphere with high spatial resolution and combine this with measurements made in-situ. Over the extended mission periods Solar Orbiter would deliver images and data that would cover the polar regions and the side of the Sun not visible from Earth. Solar Orbiter would coordinate its scientific mission with NASA’s Solar Probe Plus within the joint HELEX program (Heliophysics Explorers) to maximize their combined science return.

Thales Alenia Space concept, from assessment phase (ESA)

PLATO (PLAnetary Transit and Oscillations of stars) would discover and characterize a large number of close-by exoplanetary systems, with a precision in the determination of mass and radius of 1%.

In addition, the SPC has decided to consider at its next meeting in June, whether to also select a European contribution to the SPICA mission.

SPICA would be an infrared space telescope led by the Japanese Space Agency JAXA. It would provide ‘missing-link’ infrared coverage in the region of the spectrum between that seen by the ESA-NASA Webb telescope and the ground-based ALMA telescope. SPICA would focus on the conditions for planet formation and distant young galaxies.

“These missions continue the European commitment to world-class space science,” says David Southwood, ESA Director of Science and Robotic Exploration, “They demonstrate that ESA’s Cosmic Vision programme is still clearly focused on addressing the most important space science.”

Source: ESA chooses three scientific missions for further study

Seven-Year WMAP Results: No, They’re NOT Anomalies

CMB cool fingers, cold spots I and II (red; credit: NASA/WMAP science team)

Since the day the first Wilkinson Microwave Anisotropy Probe (WMAP) data were released, in 2003, all manner of cosmic microwave background (CMB) anomalies have been reported; there’s been the cold spot that might be a window into a parallel universe, the “Axis of Evil”, pawprints of local interstellar neutral hydrogen, and much, much more.

But do the WMAP data really, truly, absolutely contain evidence of anomalies, things that just do not fit within the six-parameters-and-a-model the WMAP team recently reported?

In a word, no.

Seven Year Microwave Sky (Credit: NASA/WMAP Science Team)

Every second year since 2003 the WMAP science team has published a set of papers on their analyses of the cumulative data, and their findings (with the mission due to end later this year, their next set will, sadly, be their last). With time and experience – not to mention inputs from the thousands of other researchers who have picked over the data – the team has not only amassed a lot more data, but has also come to understand how WMAP operates far better. As a consequence, not only are the published results – such as limits on the nature of dark energy, and the number of different kinds of neutrinos – more stringent and robust, but the team has also become very au fait with the various anomalies reported.

For the first time, the team has examined these anomalies, in detail, and has concluded that the answer to the question, in their words, “are there potential deviations from ΛCDM within the context of the allowed parameter ranges of the existing WMAP observations?” is “no”.

The reported anomalies the team examined are many – two prominent cold spots, strength of the quadrupole, lack of large angular scale CMB power, alignment of the quadrupole and octupole components, hemispherical or dipole power asymmetry, to name but a handful – but the reasons for the apparent anomalies are few.

“Human eyes and brains are excellent at detecting visual patterns, but poor at assessing probabilities. Features seen in the WMAP maps, such as the large Cold Spot I near the Galactic center region, can stand out as unusual. However, the likelihood of such features can not be discerned by visual inspection of our particular realization of the universe,” they write, and “Monte Carlo simulations are an invaluable way to determine the expected deviations within the ΛCDM model. Claims of anomalies without Monte Carlo simulations are necessarily weak claims”.

Stephen Hawking’s initials in the CMB (Credit: NASA/WMAP Science Team)

An amusing example: Stephen Hawking’s initials (“SH”) can be clearly seen in the WMAP sky map. “The ‘S’ and ‘H’ are in roughly the same font size and style, and both letters are aligned neatly along a line of fixed Galactic latitude,” the team says; “A calculation would show that the probability of this particular occurrence is vanishingly small. Yet, there is no case to be made for a non-standard cosmology despite this extraordinarily low probability event,” they dryly note.

Many of the reports of WMAP CMB anomalies would likely make for good teaching material, as they illustrate well the many traps that you can so easily fall into when doing after-the-fact (a posteriori) statistical analyses. Or, as the team puts it in regard to the Stephen Hawking initials: “It is clear that the combined selection of looking for initials, these particular initials, and their alignment and location are all a posteriori choices. For a rich data set, as is the case with WMAP, there are a lot of data and a lot of ways of analyzing the data.”

And what happens when you have a lot of data? Low probability events are guaranteed to occur! “For example, it is not unexpected to find a 2σ feature when analyzing a rich data set in a number of different ways. However, to assess whether a particular 2σ feature is interesting, one is often tempted to narrow in on it to isolate its behavior. That process involves a posteriori choices that amplify the apparent significance of the feature.”
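This “look-elsewhere” effect is easy to demonstrate with a toy Monte Carlo of the kind the team recommends. The numbers here are invented for illustration: pure Gaussian noise, examined in 20 independent ways:

```python
import random

random.seed(1)

def most_extreme_sigma(n_ways):
    """Most extreme of n_ways independent standard-normal test statistics."""
    return max(abs(random.gauss(0, 1)) for _ in range(n_ways))

# Fraction of noise-only 'skies' showing at least one >2-sigma feature
# when the data are sliced in 20 different ways.
trials = 2000
frac = sum(most_extreme_sigma(20) > 2.0 for _ in range(trials)) / trials
print(round(frac, 2))  # around 0.6 -- a 2-sigma 'anomaly' is the expectation, not a surprise
```

Analytically, P(|Z| > 2σ) ≈ 4.6% per test, so after 20 looks the chance of at least one apparent anomaly is 1 − 0.954²⁰ ≈ 61%.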

So, does the team conclude that all this anomaly hunting is a waste of effort? Absolutely not! I’ll quote from the team’s own conclusion: “The search for oddities in the data is essential for testing the model. The success of the model makes these searches even more important. A detection of any highly significant a posteriori feature could become a serious challenge for the model. The less significant features discussed in this paper provided the motivation for considering alternative models and developing new analysis of WMAP (and soon Planck) data. The oddities have triggered proposed new observations that can further test the models. It is often difficult to assess the statistical claims. It may well be that an oddity could be found that motivates a new theory, which then could be tested as a hypothesis against ΛCDM. The data support these comparisons. Of course, other cosmological measurements must also play a role in testing new hypotheses. No CMB anomaly reported to date has caused the scientific community to adopt a new standard model of cosmology, but claimed anomalies have been used to provoke thought and to search for improved theories.”

Primary source: Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Are There Cosmic Microwave Background Anomalies? (arXiv:1001.4758). The five other Seven-Year WMAP papers are: Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Cosmological Interpretation (arXiv:1001.4538), Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Planets and Celestial Calibration Sources (arXiv:1001.4731), Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Sky Maps, Systematic Errors, and Basic Results (arXiv:1001.4744), Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Power Spectra and WMAP-Derived Parameters (arXiv:1001.4635), and Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Galactic Foreground Emission (arXiv:1001.4555). Also check out the official WMAP website.

Universe to WMAP: ΛCDM Rules, OK?

Temperature and polarization around hot and cold spots (Credit: NASA / WMAP Science Team)

The Wilkinson Microwave Anisotropy Probe (WMAP) science team has finished analyzing seven full years of data from the little probe that could, and once again it seems we can sum up the universe in six parameters and a model.

Using the seven-year WMAP data, together with recent results on the large-scale distribution of galaxies, and an updated estimate of the Hubble constant, the present-day age of the universe is 13.75 (plus-or-minus 0.11) billion years, dark energy comprises 72.8% (+/- 1.5%) of the universe’s mass-energy, baryons 4.56% (+/- 0.16%), non-baryonic matter (CDM) 22.7% (+/- 1.4%), and the redshift of reionization is 10.4 (+/- 1.2).
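As a quick sanity check, those mass-energy fractions should sum to essentially 100% for a flat universe. A sketch of the arithmetic, treating the quoted one-sigma errors as independent (a simplification, since the fitted parameters are actually correlated):

```python
import math

# Seven-year WMAP mass-energy budget quoted above (percent, 1-sigma errors)
components = {
    "dark energy": (72.8, 1.5),
    "baryons": (4.56, 0.16),
    "cold dark matter": (22.7, 1.4),
}

total = sum(value for value, _ in components.values())
# Combined uncertainty with independent errors added in quadrature
# (an assumption for illustration; the real errors are correlated).
err = math.sqrt(sum(e ** 2 for _, e in components.values()))

print(f"{total:.2f} +/- {err:.1f} %")  # -> 100.06 +/- 2.1 %, consistent with flatness
```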

In addition, the team report several new cosmological constraints – primordial abundance of helium (this rules out various alternative, ‘cold big bang’ models), and an estimate of a parameter which describes a feature of density fluctuations in the very early universe sufficiently precisely to rule out a whole class of inflation models (the Harrison-Zel’dovich-Peebles spectrum), to take just two – as well as tighter limits on many others (number of neutrino species, mass of the neutrino, parity violations, axion dark matter, …).

The best eye-candy from the team’s six papers are the stacked temperature and polarization maps for hot and cold spots; if these spots are due to sound waves in matter frozen in when radiation (photons) and baryons parted company – the cosmic microwave background (CMB) encodes all the details of this separation – then there should be nicely circular rings, of rather exact sizes, around the spots. Further, the polarization directions should switch from radial to tangential, from the center out (for cold spots; vice versa for hot spots).

And that’s just what the team found!

Concerning Dark Energy. Since the Five-Year WMAP results were published, several independent studies with direct relevance to cosmology have been published. The WMAP team took those from observations of the baryon acoustic oscillations (BAO) in the distribution of galaxies; of Cepheids, supernovae, and a water maser in local galaxies; of time-delay in a lensed quasar system; and of high redshift supernovae, and combined them to reduce the nooks and crannies in parameter space in which non-cosmological constant varieties of dark energy could be hiding. At least some alternative kinds of dark energy may still be possible, but for now Λ, the cosmological constant, rules.

Concerning Inflation. Very, very, very early in the life of the universe – so the theory of cosmic inflation goes – there was a period of dramatic expansion, and the tiny quantum fluctuations before inflation became the giant cosmic structures we see today. “Inflation predicts that the statistical distribution of primordial fluctuations is nearly a Gaussian distribution with random phases. Measuring deviations from a Gaussian distribution,” the team reports, “is a powerful test of inflation, as how precisely the distribution is (non-) Gaussian depends on the detailed physics of inflation.” While the limits on non-Gaussianity (as it is called), from analysis of the WMAP data, only weakly constrain various models of inflation, they do leave almost nowhere for cosmological models without inflation to hide.

Concerning ‘cosmic shadows’ (the Sunyaev-Zel’dovich (SZ) effect). While many researchers have looked for cosmic shadows in WMAP data before – perhaps the best known to the general public is the 2006 Lieu, Mittaz, and Zhang paper (the SZ effect: hot electrons in the plasma which pervades rich clusters of galaxies interact with CMB photons, via inverse Compton scattering) – the WMAP team’s recent analysis is their first to investigate this effect. They detect the SZ effect directly in the nearest rich cluster (Coma; Virgo is behind the Milky Way foreground), and also statistically by correlation with the location of some 700 relatively nearby rich clusters. While the WMAP team’s finding is consistent with data from x-ray observations, it is inconsistent with theoretical models. Back to the drawing board for astrophysicists studying galaxy clusters.

Seven Year Microwave Sky (Credit: NASA/WMAP Science Team)

I’ll wrap up by quoting Komatsu et al. “The standard ΛCDM cosmological model continues to be an exquisite fit to the existing data.”

Primary source: Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Cosmological Interpretation (arXiv:1001.4538). The five other Seven-Year WMAP papers are: Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Are There Cosmic Microwave Background Anomalies? (arXiv:1001.4758), Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Planets and Celestial Calibration Sources (arXiv:1001.4731), Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Sky Maps, Systematic Errors, and Basic Results (arXiv:1001.4744), Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Power Spectra and WMAP-Derived Parameters (arXiv:1001.4635), and Seven-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Galactic Foreground Emission (arXiv:1001.4555). Also check out the official WMAP website.

Searching for Life in the Multiverse

Multiverse Theory
Artist concept of the multiverse. Credit: Florida State University

Other intelligent, technologically capable alien civilizations may exist in our Universe, but the problem with finding and communicating with them is that they are simply too far away for any meaningful two-way conversation. But what about the prospect of finding out whether life exists in other universes outside our own?

Theoretical physics has brought us the notion that our single universe is not necessarily all there is. The “multiverse” idea is a hypothetical mega-universe full of numerous smaller universes, including our own.

In this month’s Scientific American, Alejandro Jenkins from Florida State University and Gilad Perez, a theorist at the Weizmann Institute of Science in Israel, discuss how multiple other universes—each with its own laws of physics—may have emerged from the same primordial vacuum that gave rise to ours. Assuming they exist, many of those universes may contain intricate structures and perhaps even some forms of life. But the latest theoretical research suggests that our own universe may not be as “finely tuned” for the emergence of life as previously thought.

Jenkins and Perez write about a provocative hypothesis known as the anthropic principle, which states that the existence of intelligent life (capable of studying physical processes) imposes constraints on the possible form of the laws of physics.

Alejandro Jenkins. Credit: Florida State University

“Our lives here on Earth — in fact, everything we see and know about the universe around us — depend on a precise set of conditions that makes us possible,” Jenkins said. “For example, if the fundamental forces that shape matter in our universe were altered even slightly, it’s conceivable that atoms never would have formed, or that the element carbon, which is considered a basic building block of life as we know it, wouldn’t exist. So how is it that such a perfect balance exists? Some would attribute it to God, but of course, that is outside the realm of physics.”

The theory of “cosmic inflation,” which was developed in the 1980s in order to solve certain puzzles about the structure of our universe, predicts that ours is just one of countless universes to emerge from the same primordial vacuum. We have no way of seeing those other universes, although many of the other predictions of cosmic inflation have recently been corroborated by astrophysical measurements.

Given some of science’s current ideas about high-energy physics, it is plausible that those other universes might each have different physical interactions. So perhaps it’s no mystery that we would happen to occupy the rare universe in which conditions are just right to make life possible. This is analogous to how, out of the many planets in our universe, we occupy the rare one where conditions are right for organic evolution.

“What theorists like Dr. Perez and I do is tweak the calculations of the fundamental forces in order to predict the resulting effects on possible, alternative universes,” Jenkins said. “Some of these results are easy to predict; for example, if there was no electromagnetic force, there would be no atoms and no chemical bonds. And without gravity, matter wouldn’t coalesce into planets, stars and galaxies.

“What is surprising about our results is that we found conditions that, while very different from those of our own universe, nevertheless might allow — again, at least hypothetically — for the existence of life. (What that life would look like is another story entirely.) This actually brings into question the usefulness of the anthropic principle when applied to particle physics, and might force us to think more carefully about what the multiverse would actually contain.”

A brief overview of the article is available for free on Scientific American’s website.

Source: Florida State University

The Extremely Large Telescope

The European Southern Observatory (ESO) is planning to build a massive – and I do mean massive – telescope in the next decade. The European Extremely Large Telescope (E-ELT) is a 42-meter telescope in its final planning stages. Weighing in at 5,000 tonnes, with a primary mirror made up of 984 individual segments, it will be able to image the discs of extrasolar planets and resolve individual stars in galaxies beyond the Local Group! By 2018 ESO hopes to be using this gargantuan scope to stare so deep into space that it can actually see the Universe expanding!

The E-ELT is currently scheduled for completion around 2018 and when built it will be four times larger than anything currently looking at the sky in optical wavelengths and 100 times more powerful than the Hubble Space Telescope – despite being a ground-based observatory.

With advanced adaptive optics systems, the E-ELT will use up to 6 laser guide stars to analyse the twinkling caused by the motion of the atmosphere. Computer systems move the 984 individual mirrored panels up to a thousand times a second to cancel out this blurring effect in real time. The result is an image almost as crisp as if the telescope were in space.
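The idea behind that correction loop can be sketched in a few lines. This is a toy model only: the gain, drift rate, and one-dimensional “wavefront error” are all invented numbers, and nothing here reflects the real E-ELT control system:

```python
import random

random.seed(42)

GAIN = 0.8           # fraction of the measured error removed each cycle (invented)
atmosphere = 0.0     # slowly drifting wavefront error, arbitrary units
correction = 0.0     # shape currently applied by the actuators

raw, corrected = [], []
for _ in range(1000):                      # roughly one second of a 1 kHz loop
    atmosphere += random.gauss(0, 0.02)    # turbulence drifts a little each cycle
    residual = atmosphere - correction     # what the laser-guide-star sensor sees
    raw.append(abs(atmosphere))
    corrected.append(abs(residual))
    correction += GAIN * residual          # command the mirror for the next cycle

ratio = sum(corrected) / sum(raw)
print(round(ratio, 2))  # well below 1: most of the blurring is cancelled
```

The trick is that the atmosphere changes only slightly between cycles, so a loop running a thousand times a second can keep chasing it: the corrected residual stays small even as the uncorrected error wanders.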

This combination of incredible technological power and gigantic size means that the E-ELT will be able not only to detect the presence of planets around other stars but also to begin to make images of them. It could potentially make a direct image of a Super-Earth (a rocky planet just a few times larger than Earth). It would be capable of observing planets around stars within 15-30 light years of the Earth – there are almost 400 stars within that distance!

The E-ELT will be able to resolve stars within distant galaxies and as such begin to understand the history of such galaxies. This method of using the chemical composition, age and mass of stars to unravel the history of the galaxy is sometimes called galactic archaeology and instruments like the E-ELT would lead the way in such research.

Incredibly, by measuring the redshift of distant galaxies over many years with a telescope as sensitive as the E-ELT it should be possible to detect the gradual change in their doppler shift. As such the E-ELT could allow humans to watch the Universe itself expand!
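The expected size of this “redshift drift” signal can be estimated from the standard Sandage-Loeb formula, dz/dt = (1 + z)H0 − H(z). A sketch with assumed flat ΛCDM parameters (illustrative values, not figures from the article):

```python
import math

# Assumed flat LambdaCDM parameters (illustrative)
H0_KM_S_MPC = 70.0
OMEGA_M = 0.3
OMEGA_L = 0.7

MPC_KM = 3.0857e19     # kilometres per megaparsec
SEC_PER_YEAR = 3.156e7

def redshift_drift_per_decade(z):
    """Sandage-Loeb signal over ten years: dz/dt = (1 + z) H0 - H(z)."""
    h0 = H0_KM_S_MPC / MPC_KM  # H0 in 1/s
    hz = h0 * math.sqrt(OMEGA_M * (1 + z) ** 3 + OMEGA_L)
    return ((1 + z) * h0 - hz) * 10 * SEC_PER_YEAR

print(f"{redshift_drift_per_decade(2.0):.1e}")  # a shift of a few parts in 10^11 per decade
```

A change of order 10⁻¹¹ in redshift per decade is why this measurement demands both an enormous light grab and years of exquisitely stable spectroscopy.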

ESO has already spent millions on developing the E-ELT concept. If it is completed as planned, it will eventually cost about €1 billion. The technology required to make the E-ELT happen is being developed right now all over the world – in fact, the project is creating new technologies, jobs and industry as it goes along. The telescope’s enclosure alone presents a huge engineering conundrum – how do you build something the size of a modern sports stadium at high altitude, without any existing roads? Engineers will also need to keep 5,000 tonnes of metal and glass slewing around smoothly and easily once it’s operating – as well as figure out how to mass-produce more than 1,200 1.4-metre hexagonal mirrors.

The E-ELT has the capacity to transform our view not only of the Universe but of telescopes and the technology to build them as well. It will be a huge leap forward in telescope engineering and for European astronomy it will be a massive 42m jewel in the crown.

Early Galaxy Pinpoints Reionization Era

This is a composite of false color images of the galaxies found at the early epoch around 800 million years after the Big Bang. The upper left panel presents the galaxy confirmed in the 787 million year old universe. These galaxies are in the Subaru Deep Field. Credit: M. Ouchi et al.

Astronomers looking to pinpoint when the reionization of the Universe took place have found some of the earliest galaxies, dating to about 800 million years after the Big Bang. Twenty-two early galaxies were found using a method that looks for far-away redshifting sources that disappear, or “drop out,” at a specific wavelength. The age of one galaxy was confirmed by a characteristic neutral-hydrogen signature at 787 million years after the Big Bang. The finding is the first age confirmation of a so-called dropout galaxy at that distant time and pinpoints when the reionization epoch likely began.

The reionization period is about the farthest back in time that astronomers can observe. The Big Bang, 13.7 billion years ago, created a hot, murky universe. Some 400,000 years later, temperatures cooled, electrons and protons joined to form neutral hydrogen, and the murk cleared. Some time before 1 billion years after the Big Bang, neutral hydrogen began to form stars in the first galaxies, which radiated energy and changed the hydrogen back to being ionized. Although not the thick plasma soup of the earlier period just after the Big Bang, this star formation started the reionization epoch.

Astronomers know that this era ended about 1 billion years after the Big Bang, but when it began has eluded them.

“We look for ‘dropout’ galaxies,” said Masami Ouchi, who led a US and Japanese team of astronomers looking back at the reionization epoch. “We use progressively redder filters that reveal increasing wavelengths of light and watch which galaxies disappear from or ‘dropout’ of images made using those filters. Older, more distant galaxies ‘dropout’ of progressively redder filters and the specific wavelengths can tell us the galaxies’ distance and age. What makes this study different is that we surveyed an area that is over 100 times larger than previous ones and, as a result, had a larger sample of early galaxies (22) than past surveys. Plus, we were able to confirm one galaxy’s age,” he continued. “Since all the galaxies were found using the same dropout technique, they are likely to be the same age.”
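The arithmetic behind the dropout technique is simple: redshift stretches a galaxy’s rest-frame Lyman break by a factor (1 + z). A sketch, taking z ≈ 7 as representative of these roughly 800-million-year-old galaxies:

```python
# Rest wavelength of Lyman-alpha in nanometres; light blueward of this is
# absorbed by intervening neutral hydrogen, producing the 'break'.
LYMAN_ALPHA_NM = 121.567

def observed_break_nm(z):
    """The break is redshifted by a factor (1 + z)."""
    return LYMAN_ALPHA_NM * (1 + z)

# A galaxy seen roughly 800 million years after the Big Bang lies near z = 7,
# so its break lands in the near-infrared, past ordinary optical filters --
# which is why very red filters are needed to catch these objects at all.
print(round(observed_break_nm(7.0)))  # -> 973 (nm)
```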

Ouchi’s team was able to conduct such a large survey because they used a custom-made, super-red filter and other unique technological advancements in red sensitivity on the wide-field camera of the 8.2-meter Subaru Telescope. They made their observations from 2006 to 2009 in the Subaru Deep Field and Great Observatories Origins Deep Survey North field. They then compared their observations with data gathered in other studies.

Astronomers have wondered whether the universe underwent reionization instantaneously or gradually over time, but more importantly, they have tried to isolate when the universe began reionization. Galaxy density and brightness measurements are key to calculating star-formation rates, which tell a lot about what happened when. The astronomers looked at star-formation rates and the rate at which hydrogen was ionized.

Using data from their study and others, they determined that star-formation rates were dramatically lower from 800 million years to about one billion years after the Big Bang than thereafter. Accordingly, they calculated that the rate of ionization would have been very slow during this early time, because of the low star-formation rate.

“We were really surprised that the rate of ionization seems so low, which would constitute a contradiction with the claim of NASA’s WMAP satellite. It concluded that reionization started no later than 600 million years after the Big Bang,” remarked Ouchi. “We think this riddle might be explained by more efficient ionizing photon production rates in early galaxies. The formation of massive stars may have been much more vigorous then than in today’s galaxies. Fewer, massive stars produce more ionizing photons than many smaller stars,” he explained.

The research will be published in a December issue of the Astrophysical Journal.

Source: EurekAlert

New CMB Measurements Support Standard Model

The measure of polarized light from the early Universe allowed researchers to better plot the location of matter - the left image - which later became the stars and galaxies we have today. Image Credit: Sarah Church/Walter Gear

New measurements of the cosmic microwave background (CMB) – the leftover light from the Big Bang – lend further support to the Standard Cosmological Model and the existence of dark matter and dark energy, limiting the possibility of alternative models of the Universe. Researchers from Stanford University and Cardiff University produced a detailed map of the composition and structure of matter as it would have looked shortly after the Big Bang, which shows that the Universe would not look as it does today if it were made up solely of ‘normal matter’.

By measuring the way the light of the CMB is polarized, a team led by Sarah Church of the Kavli Institute for Particle Astrophysics and Cosmology at Stanford University and by Walter Gear, head of the School of Physics and Astronomy at Cardiff University in the United Kingdom, were able to construct a map of the way the Universe would have looked shortly after matter came into existence after the Big Bang. Their findings lend evidence to the predictions of the Standard Model, in which the Universe is composed of 95% dark matter and dark energy and only 5% ordinary matter.

Polarization is a property of light in which the oscillation of the light wave is confined to a particular orientation, at right angles to the direction in which the light is traveling. Though most light is unpolarized, light that has interacted with matter can become polarized. The leftover light from the Big Bang – the CMB – has now cooled to a few degrees above absolute zero, but it still retains the same polarization it had in the early Universe, once the Universe had cooled enough to become transparent to light. By measuring this polarization, the researchers were able to extrapolate the location, structure, and velocity of matter in the early Universe with unprecedented precision. The gravitational collapse of large clumps of matter in the early Universe creates certain resonances in the polarization that allowed the researchers to create a map of the matter composition.

Dr. Gear said, “The pattern of oscillations in the power spectra allow us to discriminate, as “real” and “dark” matter affect the position and amplitudes of the peaks in different ways. The results are also consistent with many other pieces of evidence for dark matter, such as the rotation rate of galaxies, and the distribution of galaxies in clusters.”

The QUaD experiment, located at the South Pole, allowed researchers to measure the polarization of the CMB with very high precision. Image Credit: Sarah Church

The measurements made by the QUaD experiment further constrain those made by previous experiments to measure properties of the CMB, such as WMAP and ACBAR. In comparison to these previous experiments, the measurements come more than an order of magnitude closer to fitting what is predicted by the Standard Cosmological Model, said Dr. Gear. This is a very important step on the path to verifying whether our model of the Universe is correct.

The researchers used the QUaD experiment at the South Pole to make their observations. The QUaD telescope’s detectors are bolometers, essentially thermometers that measure how certain types of radiation increase the temperature of the metals in the detector. The detectors have to be kept near 1 kelvin to eliminate noise radiation from the surrounding environment, which is why the telescope is located at the frigid South Pole and the detectors are placed inside a cryostat.

Paper co-author Walter Gear said in an email interview:

“The polarization is imprinted at the time the Universe becomes transparent to light, about 400,000 years after the big bang, rather than right after the big bang before matter existed. There are major efforts now to try to find what is called the ‘B-mode’ signal, which is a more complicated polarization pattern that IS imprinted right after the big bang. QUaD places the best current upper limit on this but is still more than an order of magnitude away in sensitivity from even optimistic predictions of what that signal might be. That is the next generation of experiments’ goal.”

The results, published in a paper titled Improved Measurements of the Temperature and Polarization of the Cosmic Microwave Background from QUaD in the November 1st Astrophysical Journal, fit the predictions of the Standard Model remarkably well, providing further evidence for the existence of dark matter and energy, and constraining alternative models of the Universe.

Source: SLAC, email interview with Dr. Walter Gear

If We Live in a Multiverse, How Many Are There?

Artist concept of the cyclic universe.

Theoretical physics has brought us the notion that our single universe is not necessarily the only game in town. Satellite data from WMAP, along with string theory and its 11-dimensional hyperspace idea, has produced the concept of the multiverse, where the Big Bang could have produced many different universes instead of a single uniform universe. The idea has gained popularity recently, so it was only a matter of time until someone asked how many universes the multiverse could possibly contain. The number, according to two physicists, could be “humongous.”

Andrei Linde and Vitaly Vanchurin at Stanford University in California did a few back-of-the-envelope calculations, starting with the idea that the Big Bang was essentially a quantum process which generated quantum fluctuations in the state of the early universe. The universe then underwent a period of rapid growth called inflation, during which these perturbations were “frozen,” creating different initial classical conditions in different parts of the cosmos. Since each of these regions would have a different set of laws of low-energy physics, they can be thought of as different universes.

Linde and Vanchurin then estimated how many different universes could have appeared as a result of this effect. Their answer is that the number is set by the process that generated the perturbations in the first place, called slow-roll inflation, the solution Linde came up with previously to answer the problem of bubbles of universes colliding in the early inflationary period. In this model, inflation occurs as a scalar field rolls down a potential energy hill; when the field rolls very slowly compared to the expansion of the universe, inflation occurs and collisions end up being rare.

Using all of this (and more – see their paper here) Linde and Vanchurin calculate that the number of universes in the multiverse could be at least 10^10^10^7, a number which is definitely “humongous,” as they described it.

The next question, then, is how many of these universes could we actually see? Linde and Vanchurin say that to answer it they had to invoke the Bekenstein bound, which limits the amount of information that can be contained within any given volume of space; the properties of the observer, including the limits of the human brain, then become an important factor.

The total amount of information that can be absorbed by one individual during a lifetime is about 10^16 bits. So a typical human brain can have 10^10^16 configurations and so could never distinguish more than that number of different universes.
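Comparing these two enormous numbers is easiest in nested logarithms. A quick sanity check (pure arithmetic, no physics) confirms that the observer bound, not the raw multiverse count, is the binding constraint:

```python
import math

# The numbers 10^10^10^7 and 10^10^16 cannot be represented directly,
# so compare log10(log10(N)) for each instead.
loglog_universes = 10**7   # for N_universes = 10^(10^(10^7))
loglog_observer = 16       # for N_observer  = 10^(10^16)

# A memory of 10^16 bits distinguishes 2^(10^16) states:
# log10(2^(10^16)) = 10^16 * log10(2) ~ 3.0e15, i.e. of order 10^(10^16).
states_exponent = 1e16 * math.log10(2)

assert loglog_universes > loglog_observer   # the observer limit binds
```

So even though the multiverse may contain some 10^10^10^7 universes, any single observer could distinguish at most around 10^10^16 of them.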

The number of multiverses the human brain could distinguish. Credit: Linde and Vanchurin

“So, the total number of possibilities accessible to any given observer is limited not only by the entropy of perturbations of metric produced by inflation and by the size of the cosmological horizon, but also by the number of degrees of freedom of an observer,” the physicists write.

“We have found that the strongest limit on the number of different locally distinguishable geometries is determined mostly by our abilities to distinguish between different universes and to remember our results,” wrote Linde and Vanchurin. “Potentially it may become very important that when we analyze the probability of existence of a universe of a given type, we should be talking about a consistent pair: the universe and an observer who makes the rest of the universe ‘alive’ and the wave function of the rest of the universe time-dependent.”

So their conclusion is that the limit does not depend on the properties of the multiverse itself, but on the properties of the observer.

They hope to study this concept further, to see if this probability is proportional to the observable entropy of inflation.

Sources: ArXiv, Technology Review Blog

Planck First Light

Strips of the sky measured by Planck. Credit: ESA

One of the newest telescopes in space, the Planck spacecraft, recently completed its “first light” survey which began on August 13. Astronomers say the initial data, gathered from Planck’s vantage point at the L2 point in space, is excellent. Planck is studying the Cosmic Microwave Background, looking for variations in temperature that are about a million times smaller than one degree. This is comparable to measuring from Earth the body heat of a rabbit sitting on the Moon.
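To put “a million times smaller than one degree” in context, the fluctuations Planck hunts are of order a microkelvin against the CMB's mean temperature of about 2.725 K (a standard value, not stated in this article); the arithmetic below is purely illustrative:

```python
# Illustrative arithmetic: fractional sensitivity of the CMB measurement.
delta_t = 1e-6     # target fluctuation scale: ~1 microkelvin
t_cmb = 2.725      # mean CMB temperature in kelvin (standard value)

ratio = delta_t / t_cmb
print(f"fractional sensitivity ~ {ratio:.1e}")  # ~ 3.7e-07
```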

The initial survey yielded maps of a strip of the sky, one for each of Planck’s nine frequencies. Each map is a ring, about 15° wide, stretching across the full sky.

The differences in color in the strips indicate the magnitude of the deviations of the temperature of the Cosmic Microwave Background from its average value, as measured by Planck at a frequency close to the peak of the CMB spectrum (red is hotter and blue is colder).

The large red strips trace radio emission from the Milky Way, whereas the small bright spots high above the galactic plane correspond to emission from the Cosmic Microwave Background itself.

In order to do its work, Planck’s detectors must be cooled to extremely low temperatures, some of them very close to absolute zero (0 K, i.e. –273.15 °C).

Routine operations are now underway, and surveying will continue for at least 15 months without a break. In approximately 6 months, the first all-sky map will be assembled.

Within its projected operational life of 15 months, Planck will gather data for two complete sky maps. To fully exploit the high sensitivity of Planck, the data will require delicate adjustments and careful analysis. It promises to return a treasure trove that will keep both cosmologists and astrophysicists busy for decades to come.

Source: ESA

What! No Parallel Universe? Cosmic Cold Spot Just Data Artifact

Region in space detected by WMAP cooler than its surroundings. But not really. Rudnick/NRAO/AUI/NSF, NASA.

Rats! Another perplexing space mystery solved by science. New analysis of the famous “cold spot” in the cosmic microwave background reveals, and confirms, actually, that the spot is just an artifact of the statistical methods used to find it. That means there is no supervoid lurking in the CMB, and no parallel universe lying just beyond the edge of our own. What fun is that?

Back in 2004, astronomers studying data from the Wilkinson Microwave Anisotropy Probe (WMAP) found a region of the cosmic microwave background in the southern hemisphere in the direction of the constellation of Eridanus that was significantly colder than the rest by about 70 microkelvin. The probability of finding something like that was extremely low. If the Universe really is homogeneous and isotropic, then all points in space ought to experience the same physical development, and appear the same. This just wasn’t supposed to be there.

Some astronomers suggested the spot could be a supervoid, a remnant of an early phase transition in the universe. Others theorized it was a window into a parallel universe.

Well, it turns out, it wasn’t there.

Ray Zhang and Dragan Huterer at the University of Michigan in Ann Arbor say that the cold spot is simply an artifact of the statistical method (called Spherical Mexican Hat Wavelets) used to analyze the WMAP data. Use a different method of analysis and the cold spot disappears (or at least is no colder than expected).

“We trace this apparent discrepancy to the fact that WMAP cold spot’s temperature profile just happens to favor the particular profile given by the wavelet,” the duo says in their paper. “We find no compelling evidence for the anomalously cold spot in WMAP at scales between 2 and 8 degrees.”
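This is not the SMHW pipeline itself, but a toy one-dimensional analogue (all numbers here are illustrative) shows the basic mechanics: a zero-mean Mexican-hat filter turns a dip of roughly matching size into a strong coefficient, so the apparent significance of a “spot” depends on the filter you chose.

```python
import numpy as np

def mexican_hat(x, scale):
    """1D Mexican-hat (Ricker) wavelet; zero-mean by construction,
    so it responds to bumps and dips of roughly its own scale."""
    u = x / scale
    return (1.0 - u**2) * np.exp(-(u**2) / 2.0)

# Toy "sky strip": flat background plus one cold spot, a Gaussian dip
# of 70 microkelvin (echoing the WMAP figure; the width is arbitrary).
x = np.arange(-200, 201, dtype=float)
sky = -70e-6 * np.exp(-(x**2) / (2 * 8.0**2))

# Filtering the strip with the wavelet concentrates the dip into a
# strong negative coefficient at the spot's location.
wav = mexican_hat(x, scale=10.0)
response = np.convolve(sky, wav, mode="same")
center = int(np.argmin(response))   # most negative coefficient

assert center == len(x) // 2        # the filter recovers the cold spot
assert response[center] < 0.0
```

Zhang and Huterer's point is the flip side of this: a profile that happens to match the wavelet looks anomalously significant, while a different kernel or scale can leave the same patch of sky looking unremarkable.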

This confirms another paper from 2008, also by Huterer, with colleague Kendrick Smith from the University of Cambridge, which showed that the huge void could be considered a statistical fluke because it had stars both in front of and behind it.

And in fact, one of the earlier papers on the cold spot, by Lawrence Rudnick of the University of Minnesota, does indeed acknowledge that statistical uncertainties had not been fully accounted for.

Oh well. Now, on to the next cosmological mysteries like dark matter and dark energy!

Zhang and Huterer’s paper.

Huterer and Smith’s paper (2008)

Rudnick’s paper (2007)

Original paper “finding” the cold spot

Sources: Technology Review Blog, Science