Astronomy Without A Telescope – Our Ageing Universe

Active energy transfer – the thing that distinguishes a young universe from an old universe. Credit: Gemini Observatory.

It all started so full of promise. All at once, our universe burst upon the scene, but much of that initial burst quickly dissipated into background neutrinos and photons – and ever since, pretty much everything our universe has ever done has just dissipated more energy. So, despite the occasional enthusiastic outburst of supernovae and other celestial extravagances, it’s becoming increasingly apparent that our universe is getting on a bit.

The second law of thermodynamics (the one about entropy) demands that everything goes to pot over time – since anything that happens is an opportunity for energy to be dissipated.

The universe is full of energy and should always remain so, but that energy can only make something interesting happen if there is a degree of thermal disequilibrium. For example, if you take an egg out of the refrigerator and drop it in boiling water, it cooks. A useful and worthwhile activity, even if not a very efficient one – since lots of heat from the stove just dissipates into the kitchen, rather than being retained for the cooking of more eggs.

But, on the other hand, if you drop an already cooked, already heated egg into the same boiling water… well, what’s the point? No useful work is done, nothing of note really happens.

This is roughly the idea behind increasing entropy. Everything of note that happens in the universe involves a transfer of energy and at each such transfer some energy is lost from that system. So, following the second law to its logical conclusion, you eventually end up with a universe in thermal equilibrium with itself. At that point, there are no disequilibrium gradients left to drive energy transfer – or to cook eggs. Essentially, nothing else of note will ever happen again – a state known as heat death.
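
To put rough numbers on the egg analogy – a minimal sketch in Python, with invented temperatures and heat flows rather than anything physically rigorous – the second law’s bookkeeping looks like this:

```python
# Entropy bookkeeping when heat Q flows from a hot reservoir to a cooler one.
# Illustrative numbers only: a stove element at 500 K, boiling water at 373 K.
Q = 1000.0      # joules of heat transferred
T_hot = 500.0   # kelvin
T_cold = 373.0  # kelvin

dS_hot = -Q / T_hot    # the hot reservoir loses entropy...
dS_cold = Q / T_cold   # ...the cold one gains more than that
dS_total = dS_hot + dS_cold

print(f"Net entropy change: {dS_total:+.2f} J/K")  # positive, as the second law demands
# As T_hot approaches T_cold (thermal equilibrium), the net change approaches
# zero -- no gradient, no useful work: the already-cooked egg scenario.
```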

It’s true that the early universe was initially in thermal equilibrium, but there was also lots of gravitational potential energy. So, matter (both light and dark) ‘clumped’ – creating lots of thermal disequilibrium – and from there all sorts of interesting things were able to happen. But gravity’s ability to contribute useful work to the universe also has its limits.

In a static universe the end point of all this clumping is a collection of black holes – considered to be objects in a state of high entropy, since whatever they contain no longer engages in energy transfer. It just sits there – and, apart from some whispers of Hawking radiation, will just keep sitting there until eventually (in a googol or so years) the black holes evaporate.
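
That ‘googol or so years’ can be sanity-checked with the standard order-of-magnitude formula for Hawking evaporation, t ≈ 5120πG²M³/(ħc⁴) – a quick sketch (the formula and constants are textbook values; the masses are just illustrative):

```python
import math

# Hawking evaporation time, t ~ 5120 * pi * G^2 * M^3 / (hbar * c^4).
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34   # reduced Planck constant, J s
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg
year = 3.156e7     # seconds per year

def evaporation_time_years(mass_kg):
    return 5120 * math.pi * G**2 * mass_kg**3 / (hbar * c**4) / year

print(f"1 solar mass:       {evaporation_time_years(M_sun):.1e} years")        # ~1e67
print(f"1e11 solar masses:  {evaporation_time_years(1e11 * M_sun):.1e} years") # ~1e100, a googol
```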

The contents of an expanding universe may never achieve a state of maximum entropy since the expansion itself increases the value of maximum entropy for that universe – but you still end up with not much more than a collection of isolated and ageing white dwarfs – which eventually fizzle out and evaporate themselves.

A head count of the contributors to entropy in our universe. Supermassive black holes top the list. Credit: Egan and Lineweaver. (The full paper notes some caveats and recommendations for further work to improve these estimates).

It’s possible to estimate the current entropy of our universe by tallying up its various components – which have varying levels of entropy density. At the top of the scale are black holes – and at the bottom are luminous stars. These stars appear to be locally enthalpic – where, for example, the Sun heats the Earth, enabling all sorts of interesting things to happen here. But it’s a time-limited process and what the Sun mostly does is to radiate energy away into empty space.

Egan and Lineweaver have recently re-calculated the current entropy of the observable universe – and gained a value that is an order of magnitude higher than previous estimates (albeit we are talking 1×10^104 instead of 1×10^103). This is largely the result of incorporating the entropy contributed by recently recognized supermassive black holes – the entropy of a black hole being proportional to the area of its event horizon, and hence to the square of its mass.
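
That scaling comes from the Bekenstein-Hawking formula; a short sketch (standard formula and constants – the Sgr A* mass is a round number) shows why supermassive black holes dominate the tally:

```python
import math

# Bekenstein-Hawking entropy of a Schwarzschild black hole, in units of k_B:
# S / k_B = 4 * pi * G * M^2 / (hbar * c)  -- it scales as mass squared.
G, hbar, c = 6.674e-11, 1.055e-34, 2.998e8
M_sun = 1.989e30  # kg

def entropy_in_kB(mass_kg):
    return 4 * math.pi * G * mass_kg**2 / (hbar * c)

print(f"1 M_sun black hole:       S/k_B ~ {entropy_in_kB(M_sun):.0e}")        # ~1e77
print(f"4e6 M_sun (like Sgr A*):  S/k_B ~ {entropy_in_kB(4e6 * M_sun):.0e}")  # ~1e90
```

A single four-million-solar-mass black hole thus out-entropies ten trillion isolated suns – which is why adding the supermassive black holes moved the total up an order of magnitude.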

So this suggests our universe is a bit further down the track towards heat death than we had previously thought. Enjoy it while you can.

Further reading: Egan, C.A. and Lineweaver, C.H. (2010) A Larger Estimate of the Entropy of the Universe http://arxiv.org/abs/0909.3983

Cosmologists Provide Closest Measure of Elusive Neutrino

Slices through the SDSS 3-dimensional map of the distribution of galaxies. Earth is at the center, and each point represents a galaxy, typically containing about 100 billion stars. Galaxies are colored according to the ages of their stars, with the redder, more strongly clustered points showing galaxies that are made of older stars. The outer circle is at a distance of two billion light years. The region between the wedges was not mapped by the SDSS because dust in our own Galaxy obscures the view of the distant universe in these directions. Both slices contain all galaxies between -1.25 and 1.25 degrees declination. Credit: M. Blanton and the Sloan Digital Sky Survey.

Cosmologists – and not particle physicists – could be the ones who finally measure the mass of the elusive neutrino particle. A group of cosmologists have made their most accurate measurement yet of the mass of these mysterious so-called “ghost particles.” They didn’t use a giant particle detector but used data from the largest survey ever of galaxies, the Sloan Digital Sky Survey. While previous experiments had shown that neutrinos have mass, that mass is so small it has been very hard to measure. But looking at the Sloan data on galaxies, PhD student Shaun Thomas and his advisers at University College London put the mass of a neutrino at no greater than 0.28 electron volts, which is less than a billionth of the mass of a single hydrogen atom. This is one of the most accurate measurements of the mass of a neutrino to date.

Their work is based on the principle that the huge abundance of neutrinos (there are trillions passing through you right now) has a large cumulative effect on the matter of the cosmos, which naturally forms into “clumps” of groups and clusters of galaxies. As neutrinos are extremely light they move across the universe at great speeds which has the effect of smoothing this natural “clumpiness” of matter. By analysing the distribution of galaxies across the universe (i.e. the extent of this “smoothing-out” of galaxies) scientists are able to work out the upper limits of neutrino mass.
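
A rough feel for the method (this is a standard cosmology rule of thumb, not the authors’ actual analysis): the fraction of matter in neutrinos suppresses small-scale clustering power by roughly ΔP/P ≈ −8·fν, where fν = Ων/Ωm and Ωνh² = Σmν/93.14 eV. In Python, with illustrative cosmological parameters:

```python
# Rule-of-thumb suppression of small-scale matter clustering by neutrinos:
# Delta_P / P ~ -8 * f_nu, with f_nu = Omega_nu / Omega_m and
# Omega_nu * h^2 = Sum(m_nu) / 93.14 eV. Parameters below are assumed.

def clustering_suppression(sum_mnu_eV, omega_m=0.3, h=0.7):
    omega_nu = sum_mnu_eV / (93.14 * h**2)
    return -8 * omega_nu / omega_m

for sum_mnu in (0.28, 1.0):  # eV; the UCL limit vs a heavier strawman
    print(f"Sum(m_nu) = {sum_mnu:.2f} eV -> Delta P/P ~ {clustering_suppression(sum_mnu):+.0%}")
# Heavier neutrinos smooth out more structure -- the observed clumpiness
# of the galaxy distribution therefore caps their mass.
```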

A neutrino is capable of passing through a light year –about six trillion miles — of lead without hitting a single atom.

Central to this new calculation is the existence of the largest ever 3D map of galaxies, called Mega Z, which covers over 700,000 galaxies recorded by the Sloan Digital Sky Survey and allows measurements over vast stretches of the known universe.

“Of all the hypothetical candidates for the mysterious Dark Matter, so far neutrinos provide the only example of dark matter that actually exists in nature,” said Ofer Lahav, Head of UCL’s Astrophysics Group. “It is remarkable that the distribution of galaxies on huge scales can tell us about the mass of the tiny neutrinos.”

The cosmologists at UCL were able to estimate distances to galaxies using a new method that measures the colour of each of the galaxies. By combining this enormous galaxy map with information from the temperature fluctuations in the after-glow of the Big Bang, called the Cosmic Microwave Background radiation, they were able to put one of the smallest upper limits yet on the mass of the neutrino.

“Although neutrinos make up less than 1% of all matter they form an important part of the cosmological model,” said Dr. Shaun Thomas. “It’s fascinating that the most elusive and tiny particles can have such an effect on the Universe.”

“This is one of the most effective techniques available for measuring the neutrino masses,” said Dr. Filipe Abdalla. “This gives great hope of finally obtaining a measurement of the mass of the neutrino in years to come.”

The authors are confident that a larger survey of the Universe, such as the one they are working on called the international Dark Energy Survey, will yield an even more accurate weight for the neutrino, potentially at an upper limit of just 0.1 electron volts.
The results are published in the journal Physical Review Letters.

Source: University College London

Astronomy Without A Telescope – Is Time Real?

Time is an illusion caused by the passage of history (Douglas Adams 1952-2001).

The way that we deal with time is central to a major current schism in physics. Under classic Newtonian physics and also quantum mechanics, time is absolute – a universal metronome allowing you to determine whether events occur simultaneously or in sequence. Under Einstein’s physics, time is not absolute – simultaneity and sequence depend on who’s looking. For Einstein, the speed of light (in a vacuum) is constant and time changes in whatever way is required to keep the speed of light constant from all frames of reference.

Under general relativity (GR) you are able to experience living for three score and ten years regardless of where you are or how fast you’re moving, but other folk might measure that duration quite differently. But even under GR, we need to consider whether time only has meaning for sub-light speed consciousnesses such as us. Were a photon to have consciousness, it may not experience time – and, from its perspective, would cross the apparent 100,000 light year diameter of the Milky Way in an instant. Of course, that gets you wondering whether space is real either. Hmm…
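
The photon’s-eye view follows from time dilation: the proper time a traveler experiences is dτ = dt·√(1 − v²/c²), which goes to zero as v approaches c. A quick sketch with illustrative speeds:

```python
import math

# Proper time experienced by a traveler crossing the Milky Way (~100,000
# light years, so ~100,000 years of coordinate time at nearly light speed).
def proper_time_years(coordinate_years, v_over_c):
    return coordinate_years * math.sqrt(1 - v_over_c**2)

trip = 100_000  # years as measured by a 'stationary' observer
for beta in (0.5, 0.99, 0.999999):
    print(f"v = {beta}c -> traveler ages {proper_time_years(trip, beta):,.1f} years")
# 0.5c: ~86,600 yr;  0.99c: ~14,100 yr;  0.999999c: ~141 yr.
# In the limit v -> c, the elapsed proper time vanishes entirely.
```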

Quantum mechanics does (well, sometimes) require absolute time – most obviously in regards to quantum entanglement, where determining the spin of one particle determines the spin of its entangled partner instantaneously and simultaneously. Leaving aside the baffling conundrums imposed by this instantaneous action over a distance – the simultaneous nature of the event implies the existence of absolute time.

In one attempt to reconcile GR and quantum mechanics, time disappears altogether – from the Wheeler-DeWitt equation for quantum gravity – not that many regard this as a 100% successful attempt to reconcile GR and quantum mechanics. Nonetheless, this line of thinking highlights the ‘problem of time’ when trying to develop a Theory of Everything.

The winning entries for a 2008 essay competition on the nature of time run by the Foundational Questions Institute could be roughly grouped into the themes ‘time is real’, ‘no, it isn’t’ and ‘either way, it’s useful so you can cook dinner.’

The ‘time isn’t real’ camp runs the line that time is just a by-product of what the universe does (anything from the Earth rotating to the transition of a Cesium atom – i.e. the things that we calibrate our clocks to).

How a return to equilibrium after a random downward fluctuation in entropy might appear. First there was light, then a whole bunch of stuff happened and then it started getting cold and dark and empty.

Time is the fire in which we burn (Soran, Star Trek bad guy, circa 24th century).

‘Time isn’t real’ proponents also refer to Boltzmann’s attempt to trivialise the arrow of time by proposing that we just live in a local pocket of the universe where there has been a random downward fluctuation of entropy. The perceived forward arrow of time is then just a result of the universe returning to equilibrium – a state of higher entropy where it’s very cold and most of the transient matter that we live our lives upon has evaporated. It is conceivable that a different fluctuation somewhere else might just as easily result in the arrow pointing the other way.

Nearly everyone agrees that time probably doesn’t exist outside our Big Bang universe and the people who just want to get on and cook dinner suggest we might concede that space-time could be an emergent property of quantum mechanics. With that settled, we just need to rejig the math – over coffee maybe.

I was prompted to write this after reading Craig Callender’s article Is Time an Illusion? in the June 2010 issue of Scientific American.

Team Finds Most-Distant Galaxy Cluster Ever Seen

SXDF-XCLJ0218-0510. Max-Planck-Institut für extraterrestrische Physik

Like a location from Star Wars, this galaxy cluster is far, far away, with origins a long, long time ago. With the ungainly name of SXDF-XCLJ0218-0510, this cluster is actually the most distant cluster of galaxies ever seen. It is a whopping 9.6 billion light years away, and X-ray and infrared observations show that the cluster hosts predominantly old, massive galaxies. This means the galaxies formed when the universe was still very young, so finding this cluster and being able to see it is providing new information not only about early galaxy evolution but also about the history of the universe as a whole.

An international team of astronomers from the Max Planck Institute for Extraterrestrial Physics, the University of Tokyo and Kyoto University discovered this cluster using the Subaru telescope along with the XMM-Newton space observatory to look in different wavelengths.

Using the Multi-Object Infrared Camera and Spectrometer (MOIRCS) on the Subaru telescope, the team was able to look in near-infrared wavelengths, where the galaxies are most luminous.

“The MOIRCS instrument has an extremely powerful capability of measuring distances to galaxies. This is what made our challenging observation possible,” said Masayuki Tanaka from the University of Tokyo. “Although we confirmed only several massive galaxies at that distance, there is convincing evidence that the cluster is a real, gravitationally bound cluster.”

In the image above, the arrows indicate galaxies that are likely located at the same distance, clustered around the center of the image, while the contours – like those of a topographic map – trace the X-ray emission of the cluster. Galaxies with confirmed distance measurements of 9.6 billion light years are circled. The combination of the X-ray detection and the collection of massive galaxies unequivocally proves that this is a real, gravitationally bound cluster.

That the individual galaxies are indeed held together by gravity is confirmed by observations in a very different wavelength regime: The matter between the galaxies in clusters is heated to extreme temperatures and emits light at much shorter wavelengths than visible to the human eye. The team therefore used the XMM-Newton space observatory to look for this radiation in X-rays.

“Despite the difficulties in collecting X-ray photons with a small effective telescope size similar to the size of a backyard telescope, we detected a clear signature of hot gas in the cluster,” said Alexis Finoguenov from the Max Planck Institute for Extraterrestrial Physics.

The combination of these different observations at wavelengths invisible to the human eye led to the pioneering discovery of the galaxy cluster at a distance of 9.6 billion light years – some 400 million light years further into the past than the previously most distant cluster known.
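
For readers who want to connect redshifts and light-travel times themselves, astropy’s cosmology module will do the integrals – a sketch, assuming a generic flat ΛCDM model and the cluster’s reported redshift of roughly 1.62:

```python
# Redshift -> light-travel time, using astropy's built-in cosmology machinery.
from astropy.cosmology import FlatLambdaCDM

cosmo = FlatLambdaCDM(H0=70, Om0=0.3)  # assumed parameters, not a fitted model

z = 1.62  # approximate redshift of SXDF-XCLJ0218-0510
print(f"Lookback time:             {cosmo.lookback_time(z):.1f}")  # ~9.6 Gyr
print(f"Age of universe at that z: {cosmo.age(z):.1f}")            # ~4 Gyr, a third of today
```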

An analysis of the data collected about the individual galaxies shows that the cluster already contains an abundance of evolved, massive galaxies that formed some two billion years earlier. Because the dynamical processes of galaxy aging are slow, the presence of these galaxies implies that the cluster was assembled through the merger of massive galaxy groups, each nourishing its own dominant galaxy. The cluster is therefore an ideal laboratory for studying the evolution of galaxies when the universe was only about a third of its present age.

As distant galaxy clusters are also important tracers of the large scale structure and primordial density fluctuations in the universe, similar observations in the future will lead to important information for cosmologists. The results obtained so far demonstrate that current near infrared facilities are capable of providing a detailed analysis of distant galaxy populations and that the combination with X-ray data is a powerful new tool. The team therefore is continuing the search for more distant clusters.

Source: Max Planck Institute for Extraterrestrial Physics

New Images from Planck Reveal Star Formation Processes

An active star-formation region in the Orion Nebula, as seen by Planck. Credits: ESA/LFI & HFI Consortia

While most newborn stars are hidden beneath a blanket of gas and dust, the Planck space observatory – with its microwave eyes – can peer beneath that shroud to provide new insights into star formation. The latest images released by the Planck team bring to light two different star forming regions in the Milky Way, and in stunning detail, reveal the different physical processes at work.

“Seeing” across nine different wavelengths, Planck took a look at star forming regions in the constellations of Orion and Perseus. The top image shows the interstellar medium in a region of the Orion Nebula where stars are actively forming in large numbers. “The power of Planck’s very wide wavelength coverage is immediately apparent in these images,” said Peter Ade of Cardiff University, co-Investigator on Planck. “The red loop seen here is Barnard’s Loop, and the fact that it is visible at longer wavelengths tells us that it is emitted by hot electrons, and not by interstellar dust. The ability to separate the different emission mechanisms is key for Planck’s primary mission.”

A comparable sequence of images, below, showing a region where fewer stars are forming near the constellation of Perseus, illustrates how the structure and distribution of the interstellar medium can be distilled from the images obtained with Planck.

This sequence of images shows a region near the constellation of Perseus where fewer stars are forming. Credit: ESA / HFI and LFI Consortia

At wavelengths where Planck’s sensitive instruments observe, the Milky Way emits strongly over large areas of the sky. This emission arises primarily from four processes, each of which can be isolated using Planck. At the longest wavelengths, of about a centimeter, Planck maps the distribution of synchrotron emission due to high-speed electrons interacting with the magnetic fields of our Galaxy. At intermediate wavelengths of a few millimeters the emission is dominated by ionized gas being heated by newly formed stars. At the shortest wavelengths, of around a millimeter and below, Planck maps the distribution of interstellar dust, including the coldest compact regions in the final stages of collapse towards the formation of new stars.

“The real power of Planck is the combination of the High and Low Frequency Instruments which allow us, for the first time, to disentangle the three foregrounds,” said Professor Richard Davis of the University of Manchester’s Jodrell Bank Centre for Astrophysics. “This is of interest in its own right but also enables us to see the Cosmic Microwave Background far more clearly.”

Once formed, the new stars disperse the surrounding gas and dust, changing their own environment. A delicate balance between star formation and the dispersion of gas and dust regulates the number of stars that any given galaxy makes. Many physical processes influence this balance, including gravity, the heating and cooling of gas and dust, magnetic fields and more. As a result of this interplay, the material rearranges itself into ‘phases’ which coexist side-by-side. Some regions, known as ‘molecular clouds,’ contain dense gas and dust, while others, referred to as ‘cirrus’ (which look like the wispy clouds we have here on Earth), contain more diffuse material.

Location of the Planck images in Orion and Perseus. Credit: ESA / HFI and LFI Consortia; STScI/DSS/IRAS (background image)

Since Planck can look across such a wide range of frequencies, it can, for the first time, provide data simultaneously on all the main emission mechanisms. Planck’s wide wavelength coverage, which is required to study the Cosmic Microwave Background, proves also to be crucial for the study of the interstellar medium.

“The Planck maps are really fantastic to look at,” said Dr. Clive Dickinson, also of the University of Manchester. “These are exciting times.”

Planck maps the sky with its High Frequency Instrument (HFI), which includes the frequency bands 100-857 GHz (wavelengths of 3mm to 0.35mm), and the Low Frequency Instrument (LFI) which includes the frequency bands 30-70 GHz (wavelengths of 10mm to 4mm).
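
Those wavelength figures are just λ = c/ν; a two-line check:

```python
# Frequency-to-wavelength check for the quoted Planck band edges.
c = 2.998e8  # m/s
for label, ghz in [("LFI", 30), ("LFI", 70), ("HFI", 100), ("HFI", 857)]:
    print(f"{label} {ghz:3d} GHz -> {c / (ghz * 1e9) * 1e3:5.2f} mm")
# 30 GHz -> ~10 mm, 70 GHz -> ~4.3 mm, 100 GHz -> ~3 mm, 857 GHz -> ~0.35 mm
```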

The Planck team will complete its first all-sky survey in mid-2010, and the spacecraft will continue to gather data until the end of 2012, during which time it will complete four sky scans. To arrive at the main cosmology results will require about two years of data processing and analysis. The first set of processed data will be made available to the worldwide scientific community towards the end of 2012.

Source: ESA and Cardiff University

GOODS, Under Astronomers’ AEGIS, Produce GEMS

No, not really (but I got all three key words into the title in a way that sorta makes sense).

Astronomers, like most scientists, just love acronyms; unfortunately, like most acronyms, on their own the ones astronomers use make no sense to non-astronomers.

And sometimes not even when written in full:
GOODS = Great Observatories Origins Deep Survey; OK that’s vaguely comprehensible (but what ‘origins’ is it about?)
AEGIS = All-wavelength Extended Groth strip International Survey; hmm, what’s a ‘Groth’?
GEMS = Galaxy Evolution from Morphology and SEDs; is Morphology the study of Morpheus’ behavior? And did you guess that the ‘S’ stood for ‘SEDs’ (not ‘Survey’)?

But, given that these all involve a ginormous amount of the ‘telescope time’ of the world’s truly great observatories, to produce such visually stunning images as the one below (NOT!), why do astronomers do it?

GEMS tile#58 (MPIfA)


Astronomy has made tremendous progress in the last century, when it comes to understanding the nature of the universe in which we live.

As late as the 1920s there was still debate about the (mostly faint) fuzzy patches that seemed to be everywhere in the sky; were the spiral-shaped ones separate ‘island universes’, or just funny blobs of gas and dust like the Orion nebula (‘galaxy’ hadn’t been invented then)?

Today we have a powerful, coherent account of everything we see in the night sky, no matter whether we use x-ray eyes, night vision (infrared), or radio telescopes, an account that incorporates the two fundamental theories of modern physics, general relativity and quantum theory. We say that all the stars, emission and absorption nebulae, planets, galaxies, supermassive black holes (SMBHs), gas and plasma clouds, etc formed, directly or indirectly, from a nearly uniform, tenuous sea of hydrogen and helium gas about 13.4 billion years ago (well, maybe the SMBHs didn’t). This is the ‘concordance LCDM cosmological model’, known popularly as ‘the Big Bang Theory’.

But how? How did the first stars form? How did they come together to form galaxies? Why did some galaxies’ nuclei ‘light up’ to form quasars (and others didn’t)? How did the galaxies come to have the shapes we see? … and a thousand other questions, questions which astronomers hope to answer, with projects like GOODS, AEGIS, and GEMS.

The basic idea is simple: pick a random, representative patch of sky and stare at it, for a very, very long time. And do so with every kind of eye you have (but most especially the very sharp ones).

By staring across as much of the electromagnetic spectrum as possible, you can make a chart (or graph) of the amount of energy coming to us from each part of that spectrum, for each of the separate objects you see; this is called the spectral energy distribution, or SED for short.

By breaking the light of each object into its rainbow of colors – taking a spectrum, using a spectrograph – you can find the tell-tale lines of various elements (and from this work out a great deal about the physical conditions of the material which emitted, or absorbed, the light); “light” here is shorthand for electromagnetic radiation, though mostly ultraviolet, visible light (which astronomers call ‘optical’), and infrared (near, mid, and far).

By taking really, really sharp images of the objects you can classify, categorize, and count them by their shape, morphology in astronomer-speak.

And because the Hubble relationship gives you an object’s distance once you know its redshift, and as distance = time, sorting everything by redshift gives you a picture of how things have changed over time, ‘evolution’ as astronomers say (not to be confused with the evolution Darwin made famous, which is a very different thing).
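
At low redshift the Hubble relationship is simply d ≈ cz/H₀ (at the depths these surveys reach, the full cosmological calculation is needed, but the principle is the same). A minimal sketch with an assumed H₀:

```python
# Low-redshift Hubble law: recession velocity v = c*z ~ H0 * d, so d ~ c*z/H0.
# Only valid for z << 1; deep-survey redshifts need the full cosmology integral.
c = 2.998e5   # km/s
H0 = 70.0     # km/s/Mpc, assumed value

for z in (0.01, 0.05, 0.1):
    d_mpc = c * z / H0
    print(f"z = {z:.2f} -> d ~ {d_mpc:5.0f} Mpc (~{d_mpc * 3.26:4.0f} million light years)")
```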

GOODS

The great observatories are Chandra, XMM-Newton, Hubble, Spitzer, and Herschel (space-based), ESO-VLT (European Southern Observatory Very Large Telescope), Keck, Gemini, Subaru, APEX (Atacama Pathfinder Experiment), JCMT (James Clerk Maxwell Telescope), and the VLA. Some of the observing commitments are impressive, for example over 2 million seconds using the ISAAC instrument (doubly impressive considering that ground-based facilities, unlike space-based ones, can only observe the sky at night, and only when there is no Moon).

There are two GOODS fields, called GOODS-North and GOODS-South. Each is a mere 150 square arcminutes in size, which is tiny, tiny, tiny (you need five fields this size to completely cover the Moon)! Of course, some of the observations extend beyond the two core 150 square arcminutes fields, but every observatory covered every square arcsecond of either field (or, for space-based observatories, both).
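
That Moon comparison is quick to verify (taking the Moon’s angular diameter as roughly 31 arcminutes):

```python
import math

# How many 150-sq-arcmin GOODS fields tile the full Moon?
moon_area = math.pi * (31.0 / 2) ** 2   # square arcminutes, ~755
print(f"Moon: {moon_area:.0f} sq arcmin -> {moon_area / 150.0:.1f} GOODS fields")
```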

GOODS-N ACS fields (GOODS/STScI)

GOODS-N is centered on the Hubble Deep Field (North is understood; this is the first HDF), at 12h 36m 49.4000s +62d 12′ 58.000″ J2000.
GOODS-S ACS fields (GOODS/STScI)

GOODS-S is centered on the Chandra Deep Field-South (CDFS), at 3h 32m 28.0s -27d 48′ 30″ J2000.

The Hubble observations were taken using the ACS (Advanced Camera for Surveys), in four wavebands (bandpasses, filters), which are approximately the astronomers’ B, V, i, and z.

Extended Groth Strip fields (AEGIS)

AEGIS

The ‘Groth’ refers to Edward J. Groth, who is currently at the Physics Department of Princeton University. In 1995 he presented a ‘poster paper’ at the 185th meeting of the American Astronomical Society entitled “A Survey with the HST“. The Groth strip comprises the 28 pointings made with Hubble’s WFPC2 camera in 1994, centered on 14h 17m +52d 30′. The Extended Groth Strip (EGS) is considerably bigger than the GOODS fields combined. The observatories which have covered the EGS include Chandra, GALEX, the Hubble (both NICMOS and ACS, in addition to WFPC2), CFHT, MMT, Subaru, Palomar, Spitzer, JCMT, and the VLA. The total area covered is 0.5 to 1 square degree, though the Hubble observations cover only ~0.2 square degrees (and only 0.0128 for the NICMOS ones). Only two filters were used for the ACS observations (approximately V and I).

I guess you, dear reader, can work out why this is called an ‘All wavelength’ and ‘International Survey’, can’t you?

GEMS' ACS fields (MPIfA)

GEMS

GEMS is centered on the CDFS (Chandra Deep Field-South, remember?), but covers a much bigger area than GOODS-S, 900 square arcminutes (the largest contiguous field so far imaged by the Hubble at the time, circa 2004; the COSMOS field is certainly larger, but most of it is monochromatic – I band only – so the GEMS field is the largest contiguous color one, to date). It is a mosaic of 81 ACS pointings, using two filters (approximately V and z).

Its SEDs component comes largely from the results of a previous large project covering the same area, called COMBO-17 (Classifying Objects by Medium-Band Observations – a spectrophotometric 17-band survey).

Sources: GOODS (STScI), GOODS (ESO), AEGIS, GEMS, ADS
Special thanks to reader nedwright for catching the error re GEMS (and thanks, too, to readers who have emailed me with your comments and suggestions; much appreciated)

Magnetic Fields in Inter-cluster Space: Measured at Last

The strengths of the magnetic fields here on Earth, on the Sun, in inter-planetary space, on stars in our galaxy (the Milky Way; some of them anyway), in the interstellar medium (ISM) in our galaxy, and in the ISM of other spiral galaxies (some of them anyway) have all been measured. But there have been no measurements of the strength of magnetic fields in the space between galaxies and between clusters of galaxies (the IGM and ICM).

Up till now.

But who cares? What scientific importance does the strength of the IGM and ICM magnetic fields have?

The Large Area Telescope (LAT) on Fermi detects gamma rays via the matter (electrons) and antimatter (positrons) they produce after striking layers of tungsten. Credit: NASA/Goddard Space Flight Center Conceptual Image Lab

Estimates of these fields may provide “a clue that there was some fundamental process in the intergalactic medium that made magnetic fields,” says Ellen Zweibel, a theoretical astrophysicist at the University of Wisconsin, Madison. One “top-down” idea is that all of space was somehow left with a slight magnetic field soon after the Big Bang – around the end of inflation, Big Bang Nucleosynthesis, or decoupling of baryonic matter and radiation – and this field grew in strength as stars and galaxies amassed and amplified its intensity. Another, “bottom-up” possibility is that magnetic fields formed initially by the motion of plasma in small objects in the primordial universe, such as stars, and then propagated outward into space.

So how do you estimate the strength of a magnetic field, tens or hundreds of millions of light-years away, in regions of space a looong way from any galaxies (much less clusters of galaxies)? And how do you do this when you expect these fields to be much less than a nanoGauss (nG), perhaps as small as a femtoGauss (fG, which is a millionth of a nanoGauss)? What trick can you use??

A very neat one, one that relies on physics not directly tested in any laboratory, here on Earth, and unlikely to be so tested during the lifetime of anyone reading this today – the production of positron-electron pairs when a high energy gamma ray photon collides with an infrared or microwave one (this can’t be tested in any laboratory, today, because we can’t make gamma rays of sufficiently high energy, and even if we could, they’d collide so rarely with infrared light or microwaves we’d have to wait centuries to see such a pair produced). But blazars produce copious quantities of TeV gamma rays, and in intergalactic space microwave photons are plentiful (that’s what the cosmic microwave background – CMB – is!), and so too are far infrared ones.
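
The kinematics behind that trick: a head-on photon-photon collision can make an electron-positron pair only if E₁·E₂ ≥ (mₑc²)². A sketch of the threshold arithmetic (standard formula; the gamma-ray energies are just examples):

```python
# Pair-production threshold for gamma + background photon -> e+ e- (head-on):
# E_gamma * E_background >= (m_e * c^2)^2, with m_e c^2 = 511 keV.
m_e_c2 = 0.511e6  # eV

def min_background_photon_eV(gamma_eV):
    return m_e_c2**2 / gamma_eV

for E_TeV in (1, 10, 100):
    threshold = min_background_photon_eV(E_TeV * 1e12)
    print(f"{E_TeV:3d} TeV gamma ray: needs background photons above {threshold:.1e} eV")
# 1 TeV gammas pair-produce on ~0.26 eV (far-infrared) photons; typical CMB
# photons (~6e-4 eV) only come into reach at hundreds of TeV.
```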

MAGIC telescope (Credit: Robert Wagner)

Having been produced, the positron and electron will interact with the CMB, local magnetic fields, other electrons and positrons, etc (the details are rather messy, but were basically worked out some time ago), with the net result that observations of distant, bright sources of TeV gamma rays can set lower limits on the strength of the IGM and ICM magnetic fields through which they travel. Several recent papers report results of such observations, using the Fermi Gamma-Ray Space Telescope and the MAGIC telescope.

So how strong are these magnetic fields? The various papers give different numbers, from greater than a few tenths of a femtoGauss to greater than a few femtoGauss.

“The fact that they’ve put a lower bound on magnetic fields far out in intergalactic space, not associated with any galaxy or clusters, suggests that there really was some process that acted on very wide scales throughout the universe,” Zweibel says. And that process would have occurred in the early universe, not long after the Big Bang. “These magnetic fields could not have formed recently and would have to have formed in the primordial universe,” says Ruth Durrer, a theoretical physicist at the University of Geneva.

So, perhaps we have yet one more window into the physics of the early universe; hooray!

Sources: Science News, arXiv:1004.1093, arXiv:1003.3884

Hubble Confirms Cosmic Acceleration with Weak Lensing

This image shows a smoothed reconstruction of the total (mostly dark) matter distribution in the COSMOS field, created from data taken by the NASA/ESA Hubble Space Telescope and ground-based telescopes. Credit: NASA, ESA, P. Simon (University of Bonn) and T. Schrabback (Leiden Observatory)

Need more evidence that the expansion of the Universe is accelerating? Just look to the Hubble Space Telescope. An international team of astronomers has indeed confirmed that the expansion of the universe is accelerating. The team, led by Tim Schrabback of the Leiden Observatory, conducted an intensive study of over 446,000 galaxies within the COSMOS (Cosmological Evolution Survey) field, the result of the largest survey ever conducted with Hubble. In making the COSMOS survey, Hubble photographed 575 slightly overlapping views of the same part of the Universe using the Advanced Camera for Surveys (ACS) onboard the orbiting telescope. It took nearly 1,000 hours of observations.

In addition to the Hubble data, researchers used redshift data from ground-based telescopes to assign distances to 194,000 of the galaxies surveyed (out to a redshift of 5). “The sheer number of galaxies included in this type of analysis is unprecedented, but more important is the wealth of information we could obtain about the invisible structures in the Universe from this exceptional dataset,” said co-author Patrick Simon from Edinburgh University.

In particular, the astronomers could “weigh” the large-scale matter distribution in space over large distances. To do this, they made use of the fact that this information is encoded in the distorted shapes of distant galaxies, a phenomenon referred to as weak gravitational lensing. Using complex algorithms, the team led by Schrabback has improved the standard method and obtained galaxy shape measurements to an unprecedented precision. The results of the study will be published in an upcoming issue of Astronomy and Astrophysics.

The meticulousness and scale of this study enables an independent confirmation that the expansion of the Universe is accelerated by an additional, mysterious component named dark energy. A handful of other such independent confirmations exist. Scientists need to know how the formation of clumps of matter evolved in the history of the Universe to determine how the gravitational force, which holds matter together, and dark energy, which pulls it apart by accelerating the expansion of the Universe, have affected them. “Dark energy affects our measurements for two reasons. First, when it is present, galaxy clusters grow more slowly, and secondly, it changes the way the Universe expands, leading to more distant — and more efficiently lensed — galaxies. Our analysis is sensitive to both effects,” says co-author Benjamin Joachimi from the University of Bonn. “Our study also provides an additional confirmation for Einstein’s theory of general relativity, which predicts how the lensing signal depends on redshift,” adds co-investigator Martin Kilbinger from the Institut d’Astrophysique de Paris and the Excellence Cluster Universe.

The large number of galaxies included in this study, along with information on their redshifts, is leading to a clearer map of how, exactly, part of the Universe is laid out; it helps us see its galactic inhabitants and how they are distributed. “With more accurate information about the distances to the galaxies, we can measure the distribution of the matter between them and us more accurately,” notes co-investigator Jan Hartlap from the University of Bonn. “Before, most of the studies were done in 2D, like taking a chest X-ray. Our study is more like a 3D reconstruction of the skeleton from a CT scan. On top of that, we are able to watch the skeleton of dark matter mature from the Universe’s youth to the present,” comments William High from Harvard University, another co-author.

The astronomers specifically chose the COSMOS survey because it is thought to be a representative sample of the Universe. With thorough studies such as the one led by Schrabback, astronomers will one day be able to apply their technique to wider areas of the sky, forming a clearer picture of what is truly out there.

Source: EurekAlert

Paper: Schrabback et al., ‘Evidence for the accelerated expansion of the Universe from weak lensing tomography with COSMOS’, Astronomy and Astrophysics, March 2010.

This is Getting Boring: General Relativity Passes Yet another Big Test!

Princeton University scientists (from left) Reinabelle Reyes, James Gunn and Rachel Mandelbaum led a team that analyzed more than 70,000 galaxies and demonstrated that the universe - at least up to a distance of 3.5 billion light years from Earth - plays by the rules set out by Einstein in his theory of general relativity. (Photo: Brian Wilson)

Published in 1915, Einstein’s theory of general relativity (GR) passed its first big test just a few years later, when the predicted gravitational deflection of light passing near the Sun was observed during the 1919 solar eclipse.

In 1960, GR passed its first big test in a lab here on Earth: the Pound-Rebka experiment. And over the nine decades since its publication, GR has passed test after test after test, always with flying colors (check out this review for an excellent summary).

But the tests have always been within the solar system, or otherwise indirect.

Now a team led by Princeton University scientists has tested GR to see if it holds true at cosmic scales. And, after two years of analyzing astronomical data, the scientists have concluded that Einstein’s theory works as well over vast distances as in more local regions of space.

A partial map of the distribution of galaxies in the SDSS, going out to a distance of 7 billion light years. The amount of galaxy clustering that we observe today is a signature of how gravity acted over cosmic time, and allows us to test whether general relativity holds over these scales. (M. Blanton, SDSS)

The scientists’ analysis of more than 70,000 galaxies demonstrates that the universe – at least up to a distance of 3.5 billion light years from Earth – plays by the rules set out by Einstein in his famous theory. While GR has been accepted by the scientific community for over nine decades, until now no one had tested the theory so thoroughly and robustly at distances and scales that go way beyond the solar system.

Reinabelle Reyes, a Princeton graduate student in the Department of Astrophysical Sciences, along with co-authors Rachel Mandelbaum, an associate research scholar, and James Gunn, the Eugene Higgins Professor of Astronomy, outlined their assessment in the March 11 edition of Nature.

Other scientists collaborating on the paper include Tobias Baldauf, Lucas Lombriser and Robert Smith of the University of Zurich and Uros Seljak of the University of California-Berkeley.

The results are important, they said, because they shore up current theories explaining the shape and direction of the universe, including ideas about dark energy, and dispel some hints from other recent experiments that general relativity may be wrong.

“All of our ideas in astronomy are based on this really enormous extrapolation, so anything we can do to see whether this is right or not on these scales is just enormously important,” Gunn said. “It adds another brick to the foundation that underlies what we do.”

GR is one of two core theories underlying all of contemporary astrophysics and cosmology (the other is the Standard Model of particle physics, a quantum theory); it explains everything from black holes to the Big Bang.

In recent years, several alternatives to general relativity have been proposed. These modified theories of gravity depart from general relativity on large scales to circumvent the need for dark energy, dark matter, or both. But because these theories were designed to match the predictions of general relativity about the expansion history of the universe, a factor that is central to current cosmological work, it has become crucial to know which theory is correct, or at least represents reality as best as can be approximated.

“We knew we needed to look at the large-scale structure of the universe and the growth of smaller structures composing it over time to find out,” Reyes said. The team used data from the Sloan Digital Sky Survey (SDSS), a long-term, multi-institution telescope project mapping the sky to determine the position and brightness of several hundred million galaxies and quasars.

By calculating the clustering of these galaxies, which stretch nearly one-third of the way to the edge of the universe, and analyzing their velocities and distortion from intervening material – due to weak lensing, primarily by dark matter – the researchers have shown that Einstein’s theory explains the nearby universe better than alternative theories of gravity.

Some of the 70,000 luminous galaxies in SDSS analyzed (Image: SDSS Collaboration)

The Princeton scientists studied the effects of gravity on the SDSS galaxies and clusters of galaxies over long periods of time. They observed how this fundamental force drives galaxies to clump into larger collections of galaxies and how it shapes the expansion of the universe.

Critically, because relativity calls for the curvature of space to be equal to the curvature of time, the researchers could calculate whether light was influenced in equal amounts by both, as it should be if general relativity holds true.

“This is the first time this test was carried out at all, so it’s a proof of concept,” Mandelbaum said. “There are other astronomical surveys planned for the next few years. Now that we know this test works, we will be able to use it with better data that will be available soon to more tightly constrain the theory of gravity.”

Firming up the predictive powers of GR can help scientists better understand whether current models of the universe make sense, the scientists said.

“Any test we can do in building our confidence in applying these very beautiful theoretical things but which have not been tested on these scales is very important,” Gunn said. “It certainly helps when you are trying to do complicated things to understand fundamentals. And this is a very, very, very fundamental thing.”

“The nice thing about going to the cosmological scale is that we can test any full, alternative theory of gravity, because it should predict the things we observe,” said co-author Uros Seljak, a professor of physics and of astronomy at UC Berkeley and a faculty scientist at Lawrence Berkeley National Laboratory who is currently on leave at the Institute of Theoretical Physics at the University of Zurich. “Those alternative theories that do not require dark matter fail these tests.”

Sources: “Princeton scientists say Einstein’s theory applies beyond the solar system” (Princeton University), “Study validates general relativity on cosmic scale, existence of dark matter” (University of California Berkeley), “Confirmation of general relativity on large scales from weak lensing and galaxy velocities” (Nature, arXiv preprint)

Dark Matter in Distant Galaxy Groups Mapped for the First Time

X-ray emission in the COSMOS field (XMM-Newton/ESA)

Galaxy density in the Cosmic Evolution Survey (COSMOS) field, with colors representing the redshift of the galaxies, ranging from redshift of 0.2 (blue) to 1 (red). Pink x-ray contours show the extended x-ray emission as observed by XMM-Newton.

Dark matter (actually cold, dark – non-baryonic – matter) can be detected only by its gravitational influence. In clusters and groups of galaxies, that influence shows up as weak gravitational lensing, which is difficult to nail down. One way to much more accurately estimate the degree of gravitational lensing – and so the distribution of dark matter – is to use the x-ray emission from the hot intra-cluster plasma to locate the center of mass.

And that’s just what a team of astronomers have recently done … and they have, for the first time, given us a handle on how dark matter has evolved over the past several billion years.

COSMOS is an astronomical survey designed to probe the formation and evolution of galaxies as a function of cosmic time (redshift) and large scale structure environment. The survey covers a 2 square degree equatorial field with imaging by most of the major space-based telescopes (including Hubble and XMM-Newton) and a number of ground-based telescopes.

Understanding the nature of dark matter is one of the key open questions in modern cosmology. In one of the approaches used to address this question astronomers use the relationship between mass and luminosity that has been found for clusters of galaxies which links their x-ray emissions, an indication of the mass of the ordinary (“baryonic”) matter alone (of course, baryonic matter includes electrons, which are leptons!), and their total masses (baryonic plus dark matter) as determined by gravitational lensing.

To date the relationship has only been established for nearby clusters. New work by an international collaboration, including the Max Planck Institute for Extraterrestrial Physics (MPE), the Laboratory of Astrophysics of Marseilles (LAM), and Lawrence Berkeley National Laboratory (Berkeley Lab), has made major progress in extending the relationship to more distant and smaller structures than was previously possible.

To establish the link between x-ray emission and underlying dark matter, the team used one of the largest samples of x-ray-selected groups and clusters of galaxies, produced by ESA’s x-ray observatory, XMM-Newton.

Groups and clusters of galaxies can be effectively found using their extended x-ray emission on sub-arcminute scales. As a result of its large effective area, XMM-Newton is the only x-ray telescope that can detect the faint level of emission from distant groups and clusters of galaxies.

“The ability of XMM-Newton to provide large catalogues of galaxy groups in deep fields is astonishing,” said Alexis Finoguenov of the MPE and the University of Maryland, a co-author of the recent Astrophysical Journal (ApJ) paper which reported the team’s results.

Since x-rays are the best way to find and characterize clusters, most follow-up studies have until now been limited to relatively nearby groups and clusters of galaxies.

“Given the unprecedented catalogues provided by XMM-Newton, we have been able to extend measurements of mass to much smaller structures, which existed much earlier in the history of the Universe,” says Alexie Leauthaud of Berkeley Lab’s Physics Division, the first author of the ApJ study.

COSMOS-XCL095951+014049 (Subaru/NAOJ, XMM-Newton/ESA)

Gravitational lensing occurs because mass curves the space around it, bending the path of light: the more mass (and the closer it is to the center of mass), the more space bends, and the more the image of a distant object is displaced and distorted. Thus measuring distortion, or ‘shear’, is key to measuring the mass of the lensing object.

In the case of weak gravitational lensing (as used in this study) the shear is too subtle to be seen directly, but faint additional distortions in a collection of distant galaxies can be calculated statistically, and the average shear due to the lensing of some massive object in front of them can be computed. However, in order to calculate the lens’ mass from average shear, one needs to know its center.
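
As a toy illustration of that statistical trick (invented numbers; real pipelines correct for instrument distortions, noise and much else): intrinsic galaxy shapes average to zero, so a small coherent shear survives the averaging.

```python
import numpy as np

# Toy weak-lensing measurement: estimate a 2% shear from noisy galaxy shapes.
rng = np.random.default_rng(42)

g_true = 0.02 + 0.0j     # assumed coherent shear (complex = two components)
n_gal = 100_000
intrinsic = 0.3 * (rng.standard_normal(n_gal)
                   + 1j * rng.standard_normal(n_gal))  # random orientations

observed = intrinsic + g_true          # weak-lensing limit: shapes add linearly
g_hat = observed.mean()
sigma = 0.3 / np.sqrt(n_gal)           # per-component error on the mean

print(f"Estimated shear: {g_hat.real:.4f} +/- {sigma:.4f} (true: {g_true.real})")
```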

“The problem with high-redshift clusters is that it is difficult to determine exactly which galaxy lies at the centre of the cluster,” says Leauthaud. “That’s where x-rays help. The x-ray luminosity from a galaxy cluster can be used to find its centre very accurately.”

Knowing the centers of mass from the analysis of x-ray emission, Leauthaud and colleagues could then use weak lensing to estimate the total mass of the distant groups and clusters with greater accuracy than ever before.

The final step was to determine the x-ray luminosity of each galaxy cluster and plot it against the mass determined from the weak lensing, with the resulting mass-luminosity relation for the new collection of groups and clusters extending previous studies to lower masses and higher redshifts. Within calculable uncertainty, the relation follows the same straight slope from nearby galaxy clusters to distant ones; a simple consistent scaling factor relates the total mass (baryonic plus dark) of a group or cluster to its x-ray brightness, the latter measuring the baryonic mass alone.
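
‘The same straight slope’ means the relation is a power law, M ∝ L^α, which is a straight line in log-log space. A sketch of that kind of fit on mock data (the slope, scatter and normalisation below are invented, not the paper’s values):

```python
import numpy as np

# Fit a power-law mass-luminosity relation, log M = a * log L + b, to mock data.
rng = np.random.default_rng(1)

alpha_true, scatter = 0.75, 0.1                  # assumed slope and log-scatter
log_L = rng.uniform(42.0, 45.0, size=200)        # mock x-ray luminosities (log10 erg/s)
log_M = 14.0 + alpha_true * (log_L - 43.5) + scatter * rng.standard_normal(200)

slope, intercept = np.polyfit(log_L, log_M, 1)
print(f"Fitted slope: {slope:.3f} (true: {alpha_true})")
# A single slope holding from nearby to distant clusters is what lets x-ray
# brightness stand in for total mass across cosmic time.
```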

“By confirming the mass-luminosity relation and extending it to high redshifts, we have taken a small step in the right direction toward using weak lensing as a powerful tool to measure the evolution of structure,” says Jean-Paul Kneib, a co-author of the ApJ paper from LAM and France’s National Center for Scientific Research (CNRS).

The origin of galaxies can be traced back to slight differences in the density of the hot, early Universe; traces of these differences can still be seen as minute temperature differences in the cosmic microwave background (CMB) – hot and cold spots.

“The variations we observe in the ancient microwave sky represent the imprints that developed over time into the cosmic dark-matter scaffolding for the galaxies we see today,” says George Smoot, director of the Berkeley Center for Cosmological Physics (BCCP), a professor of physics at the University of California at Berkeley, and a member of Berkeley Lab’s Physics Division. Smoot shared the 2006 Nobel Prize in Physics for measuring anisotropies in the CMB and is one of the authors of the ApJ paper. “It is very exciting that we can actually measure with gravitational lensing how the dark matter has collapsed and evolved since the beginning.”

One goal in studying the evolution of structure is to understand dark matter itself, and how it interacts with the ordinary matter we can see. Another goal is to learn more about dark energy, the mysterious phenomenon that is pushing matter apart and causing the Universe to expand at an accelerating rate. Many questions remain unanswered: Is dark energy constant, or is it dynamic? Or is it merely an illusion caused by a limitation in Einstein’s General Theory of Relativity?

The tools provided by the extended mass-luminosity relationship will do much to answer these questions about the opposing roles of gravity and dark energy in shaping the Universe, now and in the future.

Sources: ESA, and a paper published in the 20 January, 2010 issue of the Astrophysical Journal (arXiv:0910.5219 is the preprint)