Dark Energy Survey Will Study 300 Million Galaxies

Image credit: Hubble
University scientists have co-founded an international collaboration that seeks to measure with new precision the mysterious force causing the universe to fly apart. Plans call for the project, named the Dark Energy Survey, to collect data on approximately 300 million galaxies spanning two-thirds of the history of the universe.

The survey could begin making observations as early as the fall of 2009. Although the DES remains more than four years away, more ambitious surveys will take at least a decade to produce results. “I don’t want to wait that long,” said Joshua Frieman, Professor in Astronomy & Astrophysics and the College.

According to physics accounting methods, dark energy makes up 70 percent of the universe. Dark energy might be a manifestation of Albert Einstein’s cosmological constant, a force that acts at all times and in all places throughout the universe. It might also be a breakdown of Einstein’s theory of gravity on vast scales.

“It essentially requires gravity to be repulsive,” said Wayne Hu, Associate Professor in Astronomy & Astrophysics. “That’s possible under our standard theories of gravity, but it’s not expected.” Whatever dark energy is, Frieman said, “it’s likely to have profound implications for fundamental physics.”

The DES collaboration consists of researchers at Chicago, Fermi National Accelerator Laboratory, the University of Illinois at Urbana-Champaign, the Lawrence Berkeley National Laboratory and the Cerro Tololo Inter-American Observatory, as well as groups from the United Kingdom and Barcelona, Spain. Funding for the $20 million project is likely to come primarily from the U.S. Department of Energy, European funding agencies, the member institutions, and other agencies and sources.

Frieman heads the University’s component of the collaboration. Joining him and Hu in the collaboration are John Carlstrom, the S. Chandrasekhar Distinguished Service Professor in Astronomy & Astrophysics and the College; Scott Dodelson, Professor in Astronomy & Astrophysics and the Physical Sciences Collegiate Division; Stephen Kent, Associate Professor in Astronomy & Astrophysics; Erin Sheldon, Fellow in the Kavli Institute for Cosmological Physics; and Risa Wechsler, Hubble Fellow in the Kavli Institute for Cosmological Physics. Frieman and Dodelson also are members of Fermilab’s Theoretical Astrophysics Group, which Dodelson heads, while Kent heads Fermilab’s Experimental Astrophysics Group.

The DES will entail installing a 520-megapixel camera on the existing four-meter Blanco Telescope at the Cerro Tololo Inter-American Observatory in Chile. “This would be larger than any existing optical camera in the world,” Frieman said.

A few hundred megapixels may not sound like much, Frieman said, “but they’re not the same pixels that go into your hand-held. They have much higher sensitivity. They’re high-precision, high-efficiency detectors.” Furthermore, the camera will allow the scientists to survey the sky 10 times faster than they could at any existing U.S. observatory.

“The camera that’s now on the telescope just has too small a field of view. It would take us many decades to do the survey,” Frieman said.

The new camera will enable the DES to employ four techniques in attempting to discriminate between the two broad explanations for dark energy: the cosmological constant or a breakdown of gravity.

“The first method, and the one that really drives the survey design, is to count clusters of galaxies,” Frieman said. In this effort the DES will work in tandem with Carlstrom’s South Pole Telescope, which is scheduled to begin making observations in March 2007.

The SPT will help reveal if dark energy has suppressed the formation of galaxy clusters over the history of the universe. A radio telescope, the SPT will detect galaxy clusters by the way they distort the microwave radiation left over from the big bang. If theorists know how distant and how massive the galaxy clusters are, they can predict how many there should be in the presence of dark energy. The DES will make optical measurements to estimate their distance through the colors of the galaxies and their mass by gravitational lensing, the distortion of light by an intervening galaxy cluster. “That’s a really elegant test,” Hu said.

The second technique measures the spatial clustering of galaxies across the survey, using a characteristic length scale imprinted on the galaxy distribution in the early universe as a standard ruler for tracking the cosmic expansion.

The third technique employs gravitational lensing on a cosmic scale. Theorists can predict the effect of the dark energy on the large-scale distribution of the dark matter. With its large survey area, the DES can measure the tiny distortion of the images of galaxies induced by fluctuations in the dark matter density.

The fourth method involves the same technique that led to the 1998 discovery of dark energy: measuring the distance to a certain type of exploding star to reconstruct the expansion history of the universe. Astronomers studied these exploding stars expecting to find that the expansion of the universe had slowed as time went on. They discovered instead an accelerated expansion.

“These techniques complement each other very well,” Frieman said. “They suffer from different sources of error, so if they agree, that gives you confidence in your result.”

For his part, Hu hopes the tests will reveal some discrepancy between predictions and reality. “To me that would be the most exciting thing.”

Original Source: University of Chicago News Release

Super Star Cluster Discovered in Our Own Milky Way

Super star clusters are groups of hundreds of thousands of very young stars packed into an unbelievably small volume. They represent the most extreme environments in which stars and planets can form.

Until now, super star clusters were only known to exist very far away, mostly in pairs or groups of interacting galaxies. Now, however, a team of European astronomers [1] has used ESO’s telescopes to uncover such a monster object within our own Galaxy, the Milky Way – almost, but not quite, in our own backyard!

The newly found massive structure is hidden behind a large cloud of dust and gas, which is why it took so long to unveil its true nature. Known as “Westerlund 1”, the cluster is a thousand times closer than any other super star cluster found so far – close enough that astronomers can now probe its structure in some detail.

Westerlund 1 contains hundreds of very massive stars, some shining with a brilliance of almost one million suns and some two thousand times larger than the Sun (as large as the orbit of Saturn)! Indeed, if the Sun were located at the heart of this remarkable cluster, our sky would be full of hundreds of stars as bright as the full Moon. Westerlund 1 is a unique natural laboratory for the study of extreme stellar physics, helping astronomers to find out how the most massive stars in our Galaxy live and die.

From their observations, the astronomers conclude that this extreme cluster most probably contains no less than 100,000 times the mass of the Sun, and all of its stars are located within a region less than 6 light-years across. Westerlund 1 thus appears to be the most massive compact young cluster yet identified in the Milky Way Galaxy.

Super Star Clusters
Stars are generally born in small groups, mostly in so-called “open clusters” that typically contain a few hundred stars. From a wide range of observations, astronomers infer that the Sun itself was born in one such cluster, some 4,500 million years ago.

In some active (“starburst”) galaxies, scientists have observed violent episodes of star formation (see, for example, ESO Press Photo 31/04), leading to the development of super star clusters, each containing several million stars.

Such events were obviously common during the Milky Way’s childhood, more than 12,000 million years ago: the many galactic globular clusters – which are nearly as old as our Galaxy (e.g. ESO PR 20/04) – are indeed thought to be the remnants of early super star clusters.

All super star clusters so far observed in starburst galaxies are very distant. It is not possible to distinguish their individual stars, even with the most advanced technology. This dramatically complicates their study and astronomers have therefore long been eager to find such clusters in our neighbourhood in order to probe their structure in much more detail.

Now, a team of European astronomers [1] has finally succeeded in doing so, using several of ESO’s telescopes at the La Silla observatory (Chile).

Westerlund 1
The open cluster Westerlund 1 is located in the Southern constellation Ara (the Altar constellation). It was discovered in 1961 from Australia by Swedish astronomer Bengt Westerlund, who later moved from there to become ESO Director in Chile (1970 – 74). This cluster is behind a huge interstellar cloud of gas and dust, which blocks most of its visible light. The dimming factor is more than 100,000 – and this is why it has taken so long to uncover the true nature of this particular cluster.

In 2001, the team of astronomers identified more than a dozen extremely hot and peculiar massive stars in the cluster, so-called “Wolf-Rayet” stars. They have since studied Westerlund 1 extensively with various ESO telescopes.

They used images from the Wide Field Imager (WFI) attached to the ESO/MPG 2.2-m telescope, as well as from the SUperb Seeing Imager 2 (SuSI2) camera on the ESO 3.5-m New Technology Telescope (NTT). From these observations, they were able to identify about 200 cluster member stars.

To establish the true nature of these stars, the astronomers then performed spectroscopic observations of about one quarter of them. For this, they used the Boller & Chivens spectrograph on the ESO 1.52-m telescope and the ESO Multi-Mode Instrument (EMMI) on the NTT.

An Exotic Zoo
These observations have revealed a large population of very bright, very massive, quite extreme stars. Some would fill the Solar System out to the orbit of Saturn (being about 2,000 times larger than the Sun!), while others shine as brightly as a million Suns.

Westerlund 1 is obviously a fantastic stellar zoo, with a most exotic population and a true astronomical bonanza. All stars identified are evolved and very massive, spanning the full range of stellar oddities, from Wolf-Rayet stars and OB supergiants to Yellow Hypergiants (nearly as bright as a million Suns) and Luminous Blue Variables (similar to the exceptional object Eta Carinae – see ESO PR 31/03).

All stars so far analysed in Westerlund 1 weigh at least 30 to 40 times as much as the Sun. Because such stars have rather short lives – astronomically speaking – Westerlund 1 must be very young. The astronomers determined an age somewhere between 3.5 and 5 million years. Westerlund 1 is clearly a “newborn” cluster in our Galaxy!

The Most Massive Cluster
Westerlund 1 is incredibly rich in monster stars – just as one example, it contains as many Yellow Hypergiants as were hitherto known in the entire Milky Way!

“If the Sun were located at the heart of Westerlund 1, the sky would be full of stars, many of them brighter than the full Moon”, comments Ignacio Negueruela of the Universidad de Alicante in Spain and member of the team.

The large quantity of very massive stars implies that Westerlund 1 must contain a huge number of stars. “In our Galaxy,” explains Simon Clark of University College London (UK) and one of the authors of this study, “there are more than 100 solar-like stars for every star weighing 10 times as much as the Sun. The fact that we see hundreds of massive stars in Westerlund 1 means that it probably contains close to half a million stars, but most of these are not bright enough to peer through the obscuring cloud of gas and dust.” This is ten times more than any other known young cluster in the Milky Way.
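Clark’s extrapolation from hundreds of observed massive stars to roughly half a million total can be sketched with a stellar initial mass function. This is a hypothetical back-of-envelope calculation, assuming a Salpeter mass function with slope 2.35, stellar masses between 0.1 and 120 Suns, and 200 observed stars above 30 solar masses; none of these exact figures are stated in the article:

```python
# Hypothetical sketch: extrapolate a cluster's total star count from its
# massive stars, assuming a Salpeter initial mass function dN/dm ~ m^-2.35.
# The slope, mass limits, and the count of 200 massive stars are all
# assumptions for illustration, not figures from the article.
alpha = 2.35                 # Salpeter IMF slope (assumed)
m_min, m_max = 0.1, 120.0    # assumed stellar mass range (solar masses)

def imf_fraction_above(m):
    """Fraction of all stars more massive than m for the power-law IMF."""
    p = 1.0 - alpha          # exponent of the integrated mass function
    return (m**p - m_max**p) / (m_min**p - m_max**p)

n_massive = 200              # observed stars above 30 solar masses (assumed)
total = n_massive / imf_fraction_above(30.0)
print(f"Estimated total stars: {total:,.0f}")
```

With these assumptions the massive stars make up only a few hundredths of a percent of the population, and the extrapolated total lands on the order of half a million, consistent with the estimate quoted above.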

Westerlund 1 is presumably much more massive than the dense clusters of heavy stars present in the central region of our Galaxy, like the Arches and Quintuplet clusters. Further deep infrared observations will be required to confirm this.

This super star cluster now provides astronomers with a unique perspective on one of the most extreme environments in the Universe. Westerlund 1 will certainly provide new opportunities in the long-standing quest for more and finer details about how stars, and especially massive ones, form.

… and the Most Dense
The large number of stars in Westerlund 1 was not the only surprise awaiting Clark and his colleagues. From their observations, the team members also found that all these stars are packed into an amazingly small volume of space, indeed less than 6 light-years across. In fact, this is more or less comparable to the 4 light-year distance to the star nearest to the Sun, Proxima Centauri!

It is incredible: the concentration in Westerlund 1 is so high that the mean separation between stars is quite similar to the extent of the Solar System.
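That concentration can be checked with simple geometry. A minimal sketch, assuming roughly half a million stars (the article’s estimate) spread uniformly through a sphere 6 light-years across:

```python
import math

# Minimal sketch of the mean star-to-star spacing in Westerlund 1, assuming
# ~500,000 stars spread uniformly through a sphere 6 light-years across
# (both figures are the article's estimates).
LY_IN_AU = 63_241            # astronomical units per light-year
n_stars = 5e5
radius_ly = 3.0              # half of the ~6 light-year extent

volume_ly3 = (4.0 / 3.0) * math.pi * radius_ly**3
mean_sep_ly = (volume_ly3 / n_stars) ** (1.0 / 3.0)
mean_sep_au = mean_sep_ly * LY_IN_AU
print(f"Mean separation: {mean_sep_ly:.3f} ly (~{mean_sep_au:,.0f} AU)")
```

The answer, a few thousand astronomical units, is indeed far smaller than the 4-light-year gap to Proxima Centauri and comparable to the outer reaches of the Solar System.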

“With so many stars in such a small volume, some of them may collide”, envisages Simon Clark. “This could lead to the formation of an intermediate-mass black hole more massive than 100 solar masses. It may well be that such a monster has already formed at the core of Westerlund 1.”

The huge population of massive stars in Westerlund 1 suggests that it will have a very significant impact on its surroundings. The cluster contains so many massive stars that in a time span of less than 40 million years, it will be the site of more than 1,500 supernovae. A gigantic firework that may drive a fountain of galactic material!
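The forecast above translates into a simple average rate:

```python
# The article's supernova forecast, turned into an average rate:
# 1,500 explosions spread over (less than) 40 million years.
n_supernovae = 1500
span_years = 40e6
mean_interval = span_years / n_supernovae
print(f"On average, one supernova every {mean_interval:,.0f} years")
```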

Because Westerlund 1 is at a distance of only about 10,000 light-years, high-resolution cameras such as NAOS/CONICA on ESO’s Very Large Telescope can resolve its individual stars. Such observations are now starting to reveal smaller stars in Westerlund 1, including some that are less massive than the Sun. Astronomers will thus soon be able to study this exotic galactic zoo in great depth.

More information
The research presented in this ESO Press Release will soon appear in the leading research journal Astronomy and Astrophysics (“On the massive stellar population of the Super Star Cluster Westerlund 1” by J.S. Clark and colleagues). The PDF file is available at the A&A web site. A second paper (“Further Wolf-Rayet stars in the starburst cluster Westerlund 1”, by Ignacio Negueruela and Simon Clark) will also soon be published in Astronomy and Astrophysics. It is available as astro-ph/0503303.
A Spanish press release issued by Universidad de Alicante is available on the web site of Ignacio Negueruela.

Note
[1]: The team is composed of Simon Clark (University College London, UK), Ignacio Negueruela (Universidad de Alicante, Spain), Paul Crowther (University of Sheffield, UK), Simon Goodwin (University of Wales, Cardiff, UK), Rens Waters (University of Amsterdam) and Sean Dougherty (Dominion Radio Astrophysical Observatory).

Original Source: ESO News Release

Seeing the Planks in Einstein’s Cross

Image credit: Hubble
Spiral galaxy PGC 69457 is located near the boundary of the fall constellations Pegasus and Aquarius, some 3 degrees south of third magnitude Theta Pegasi – but don’t dig out that 60mm refractor to look for it. The galaxy is actually some 400 million light years away and has an apparent brightness of magnitude 14.5. So next fall may be a good time to hook up with that “astro-nut” friend of yours who is always heading off into the sunset to get well away from city lights sporting a larger, much larger, amateur instrument…
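Those two numbers, 400 million light-years and apparent magnitude 14.5, imply an intrinsically very luminous galaxy. A quick sketch using the standard distance-modulus relation m − M = 5 log10(d / 10 pc):

```python
import math

# Quick sketch: the absolute magnitude implied by the article's distance and
# apparent magnitude, via the distance modulus m - M = 5 log10(d / 10 pc).
LY_PER_PC = 3.2616           # light-years per parsec
d_ly = 400e6                 # distance quoted in the article
m_app = 14.5                 # apparent magnitude quoted in the article

d_pc = d_ly / LY_PER_PC
mu = 5.0 * math.log10(d_pc / 10.0)   # distance modulus
M_abs = m_app - mu                   # absolute magnitude
print(f"Distance modulus {mu:.1f}, absolute magnitude {M_abs:.1f}")
```

An absolute magnitude near −21 is typical of a luminous spiral galaxy, so the two quoted figures hang together.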

But there are plenty of 14th magnitude galaxies in the sky – what makes PGC 69457 so special?

To begin with, most galaxies don’t “block” the view of an even more distant quasar (QSO2237+0305). And among those that might, few have just the right distribution of high-density bodies needed to bend light in a way that makes an otherwise invisible object visible. With PGC 69457 you get not one but four separate 17th magnitude views of the same quasar for the trouble of setting up one 20 inch truss tube dobsonian. Is it worth it? (Can you say “quadruple your observing pleasure”?)

But the phenomenon behind such a view is even more interesting to professional astronomers. What can we learn from such a unique effect?

The theory is already well established: Albert Einstein predicted the effect in his General Theory of Relativity of 1915. Einstein’s core idea was that an observer undergoing uniform acceleration and one at rest in a gravitational field cannot tell the two situations apart. By exploring this idea to its fullest, it became clear that not only matter but light (despite being massless) responds to gravity in the same way. Because of this, light approaching a gravitational field at an angle is “accelerated toward” the source of the gravity – but because the velocity of light is constant, that acceleration affects only light’s path and wavelength, not its actual speed.

Gravitational lensing itself was first detected during the total solar eclipse of 1919, seen as a slight shift in the positions of stars appearing near the Sun as captured on photographic plates. Because of this observation, we now know that you don’t need a glass lens to bend light, any more than you need water to refract the image of those koi swimming in the pond. Light, like matter, follows the gravitational curve of space just as it follows the optical curve of a lens. The light from QSO2237+0305 is only doing what comes naturally, surfing the contours of space-time as it arcs around dense stars lying along the line of sight from a distant source through a nearer galaxy. The really interesting thing about Einstein’s Cross comes down to what it tells us about all the masses involved: those in the galaxy that bends the light, and the Big One in the heart of the quasar that sources it.

In their paper “Reconstruction of the microlensing light curves of the Einstein Cross,” Korean astrophysicist Dong-Wook Lee of Sejong University and colleagues, in association with Belgian astrophysicist J. Surdej of the University of Liege and his collaborators, found evidence of an accretion disk surrounding the black hole in quasar QSO2237+0305. How is such a thing possible at the distances involved?

Lenses in general “collect and focus light,” and the “gravitational lenses” within PGC 69457 (Lee et al posit a minimum of five low-mass but highly condensed bodies) do the same. In this way, light from the quasar that would normally travel well away from our instruments “wraps around” the galaxy to come toward us, letting us “see” 100,000 times finer detail than otherwise possible. But there is a catch: despite that gain, each image remains an unresolved point – we still see only light, not structural detail. And because there are several masses refracting light in the galaxy, we see more than one view of the quasar.

To get useful information from the quasar, you have to collect light over long periods of time (months to years) and use special analytical algorithms to pull the resulting data together. The method used by Lee and associates is called LOHCAM (LOcal HAE CAustic Modeling, where HAE stands for High Amplification Event). Using LOHCAM and data available from OGLE (Optical Gravitational Lensing Experiment) and GLITP (Gravitational Lenses International Time Project), the team determined not only that LOHCAM works as hoped but that QSO2237+0305 may include a detectable accretion disk (from which it draws matter to power its light engine). The team also determined the approximate mass of the quasar’s black hole and the size of the ultraviolet region radiating from it, and estimated the transverse motion of the black hole as it moves relative to the spiral galaxy.

The central black hole in quasar QSO2237+0305 is thought to have a mass of 1.5 billion Suns – a value rivaling those of the largest central black holes ever discovered. That figure equals about 1 percent of the total number of stars in our own Milky Way galaxy. By comparison, QSO2237+0305’s black hole is hundreds of times more massive than the one at the center of our own galaxy.
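For scale, a 1.5-billion-solar-mass black hole has an event horizon of Solar System size. A hypothetical side calculation using the standard Schwarzschild radius formula r_s = 2GM/c² (textbook physics, not a figure from the paper):

```python
# Side calculation (textbook formula, not from the article): the
# Schwarzschild radius r_s = 2 G M / c^2 of a 1.5-billion-solar-mass
# black hole, expressed in astronomical units.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
AU = 1.496e11        # astronomical unit, m

mass_kg = 1.5e9 * M_SUN
r_s_m = 2.0 * G * mass_kg / C**2
print(f"Schwarzschild radius ~ {r_s_m / AU:.0f} AU")
```

About 30 AU: roughly the size of Neptune’s orbit.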

Based on “double-peaks” in luminosity from the quasar, Lee et al also used LOHCAM to determine the size and orientation of QSO2237+0305’s accretion disk and to detect a central obscuration region around the black hole itself. The disk is roughly one-third of a light year in diameter and is turned face-on towards us.

Impressed? Well, let’s also add that the team determined the minimum number of microlenses, and their related masses, in the lensing galaxy. Depending on the transverse velocity assumed in the LOHCAM modeling, estimates of the smallest lens masses range from that of a gas giant – such as the planet Jupiter – up to that of our own Sun.

So how does this “hole” thing work?

The OGLE and GLITP projects monitored changes in the intensity of visible light streaming to us from each of the four 17th magnitude views of the quasar. Most quasars are so distant that no telescope can resolve them, so fluctuations in luminosity are normally recorded as a single data point for the brightness of the entire quasar. QSO2237+0305, however, presents four images of the quasar, and each image highlights luminosity originating from a different perspective on the quasar. By telescopically monitoring all four images simultaneously, slight variations in image intensity can be detected and recorded in terms of magnitude, date, and time. Over several months to years, a considerable number of such “high amplification events” can occur. Patterns emerging from their occurrence (from one 17th magnitude view to the next) can then be analyzed to show motion and intensity. Out of this, a super-high-resolution view of normally unseen structure within the quasar becomes possible.
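The monitoring idea can be illustrated with a toy sketch. This is purely illustrative and is not the LOHCAM method: flag a “high amplification event” whenever one image’s light curve departs from its own median brightness by more than a chosen threshold.

```python
def find_events(magnitudes, threshold=0.2):
    """Toy high-amplification-event detector: flag epochs where one image's
    light curve departs from its median brightness by more than `threshold`
    magnitudes. Purely illustrative; not the LOHCAM method."""
    ordered = sorted(magnitudes)
    n = len(ordered)
    median = (ordered[n // 2] if n % 2 else
              0.5 * (ordered[n // 2 - 1] + ordered[n // 2]))
    return [i for i, m in enumerate(magnitudes)
            if abs(m - median) > threshold]

# Synthetic light curve for one quasar image: steady at magnitude 17.0
# with a brief microlensing brightening around epochs 4-6.
curve = [17.0, 17.0, 17.1, 17.0, 16.6, 16.4, 16.7, 17.0, 17.0]
print(find_events(curve))  # -> [4, 5, 6]
```

In practice the analysis is far subtler, correlating events across all four images, but the principle of hunting for departures from a baseline is the same.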

Could you and your friend with that 20 inch dob-newtonian do this?

Sure – but not without some very expensive equipment and a good handle on some complex mathematical imaging algorithms. A nice place to start, however, might simply be to ogle the galaxy and hang with the cross for a while…

Written by Jeff Barbour

Ripples in Spacetime Could Explain Dark Energy

Why is the universe expanding at an accelerating rate, spreading its contents over ever greater dimensions of space? An original solution to this puzzle, certainly the most fascinating question in modern cosmology, was put forward by four theoretical physicists: Edward W. Kolb of the U.S. Department of Energy’s Fermi National Accelerator Laboratory, Chicago (USA); Sabino Matarrese of the University of Padova; Alessio Notari of the University of Montreal (Canada); and Antonio Riotto of INFN (Istituto Nazionale di Fisica Nucleare) of Padova (Italy). Their study was submitted yesterday to the journal Physical Review Letters.

Over the last hundred years, the expansion of the universe has been a subject of passionate discussion, engaging the most brilliant minds of the century. Like his contemporaries, Albert Einstein initially thought that the universe was static: that it neither expanded nor shrank. When his own Theory of General Relativity clearly showed that the universe should expand or contract, Einstein chose to introduce a new ingredient into his theory. His “cosmological constant” represented a mass density of empty space whose repulsive effect counteracted gravity and held the universe static.

When in 1929 Edwin Hubble proved that the universe is in fact expanding, Einstein repudiated his cosmological constant, calling it “the greatest blunder of my life.” Then, almost a century later, physicists resurrected the cosmological constant in a variant called dark energy. In 1998, observations of very distant supernovae demonstrated that the universe is expanding at an accelerating rate. This accelerating expansion seemed to be explicable only by the presence of a new component of the universe, a “dark energy,” representing some 70 percent of the total mass of the universe. Of the rest, about 25 percent appears to be in the form of another mysterious component, dark matter; while only about 5 percent comprises ordinary matter, those quarks, protons, neutrons and electrons that we and the galaxies are made of.
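Whether that 70/25/5 mix produces acceleration can be checked with the deceleration parameter q0 = Ωm/2 − ΩΛ, a standard textbook relation (not derived in this article); negative q0 means the expansion speeds up:

```python
# Standard textbook check (not derived in the article): the deceleration
# parameter q0 = Omega_m / 2 - Omega_lambda is negative when dark energy
# dominates, which means the expansion accelerates.
omega_lambda = 0.70              # dark energy fraction (article's figure)
omega_matter = 0.25 + 0.05       # dark matter + ordinary matter fractions

q0 = omega_matter / 2.0 - omega_lambda
print(f"q0 = {q0:.2f}")          # negative: accelerating expansion
```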

“The hypothesis of dark energy is extremely fascinating,” explains Padova’s Antonio Riotto, “but on the other hand it represents a serious problem. No theoretical model, not even the most modern, such as supersymmetry or string theory, is able to explain the presence of this mysterious dark energy in the amount that our observations require. If dark energy were the size that theories predict, the universe would have expanded with such a fantastic velocity that it would have prevented the existence of everything we know in our cosmos.”

The requisite amount of dark energy is so difficult to reconcile with the known laws of nature that physicists have proposed all manner of exotic explanations, including new forces, new dimensions of spacetime, and new ultralight elementary particles. However, the new report proposes no new ingredient for the universe, only a realization that the present acceleration of the universe is a consequence of the standard cosmological model for the early universe: inflation.

“Our solution to the paradox posed by the accelerating universe,” Riotto says, “relies on the so-called inflationary theory, born in 1981. According to this theory, within a tiny fraction of a second after the Big Bang, the universe experienced an incredibly rapid expansion. This explains why our universe seems to be very homogeneous. Recently, the Boomerang and WMAP experiments, which measured the small fluctuations in the background radiation originating with the Big Bang, confirmed inflationary theory.”

It is widely believed that during the inflationary expansion early in the history of the universe, very tiny ripples in spacetime were generated, as predicted by Einstein’s theory of General Relativity. These ripples were stretched by the expansion of the universe and extend today far beyond our cosmic horizon, that is over a region much bigger than the observable universe, a distance of about 15 billion light years. In their current paper, the authors propose that it is the evolution of these cosmic ripples that increases the observed expansion of the universe and accounts for its acceleration.

“We realized that you simply need to add this new key ingredient, the ripples of spacetime generated during the epoch of inflation, to Einstein’s General Relativity to explain why the universe is accelerating today,” Riotto says. “It seems that the solution to the puzzle of acceleration involves the universe beyond our cosmic horizon. No mysterious dark energy is required.”

Fermilab’s Kolb called the authors’ proposal the most conservative explanation for the accelerating universe. “It requires only a proper accounting of the physical effects of the ripples beyond our cosmic horizon,” he said.

Data from upcoming experiments will allow cosmologists to test the proposal. “Whether Einstein was right when he first introduced the cosmological constant, or whether he was right when he later refuted the idea will soon be tested by a new round of precision cosmological observations,” Kolb said. “New data will soon allow us to distinguish between our explanation for the accelerated expansion of the universe and the dark energy solution.”

INFN (Istituto Nazionale di Fisica Nucleare), Italy’s national nuclear physics institute, supports, coordinates and carries out scientific research in subnuclear, nuclear and astroparticle physics and is involved in developing relevant technologies.

Fermilab, in Batavia, Illinois, USA, is operated by Universities Research Association, Inc. for the Department of Energy’s Office of Science, which funds advanced research in particle physics and cosmology.

Original Source: Istituto Nazionale di Fisica Nucleare

Dark Energy in our Galactic Neighbourhood

Astrophysicists in recent years have found evidence for a force they call dark energy in observations from the farthest reaches of the universe, billions of light years away.

Now an international team of researchers has used data from powerful computer models, supported by observations from the Hubble Space Telescope, to find evidence of dark energy right in our own cosmic neighborhood.

The data paint a picture of the universe as a virtual sea of dark energy, with billions of galaxies as islands emerging from the sea, said Fabio Governato, a University of Washington research associate professor of astronomy and a researcher with Italy’s National Institute for Astrophysics.

In 1929 astronomer Edwin Hubble demonstrated that galaxies are moving away from each other, which supported the theory that the universe has been expanding since the big bang. In 1999 cosmologists reported evidence that an unusual force, called dark energy, was actually causing the expansion of the universe to accelerate.

However, the expansion is slower than it would be otherwise because of the tug of gravity among galaxies. As the battle between the attraction of gravity and the repellent force of dark energy plays out, cosmologists are left to ponder whether the expansion will continue forever or if the universe will collapse in a “big crunch.”

In 1997, Governato designed a computer model to simulate evolution of the universe from the big bang until the present. His research group found the model could not duplicate the smooth expansion that had been observed among galaxies around the Milky Way, the galaxy in which Earth resides. In fact, the model produced deviations from a purely radial expansion that were three to seven times higher than astronomers had actually observed, Governato said.

“The observed motion was small, and we could not duplicate it without the presence of dark energy,” he said. “When we added the dark energy, we got a perfect match.”

Governato is one of three authors of a paper describing the work, scheduled for publication in the Monthly Notices of the Royal Astronomical Society, an astronomy journal in the United Kingdom. Co-authors are Andrea Maccio of the University of Zurich in Switzerland and Cathy Horellou of Chalmers University of Technology in Sweden. The work was supported by grants from the National Science Foundation and Vetenskapsrådet, the Swedish Research Council.

The authors, part of an international research collaboration called the N-Body Shop that originated at the UW, ran simulations of universe expansion on powerful supercomputers in Italy and Alaska. Their findings provide supporting evidence for a sea of dark energy surrounding galaxies.

“We studied the properties of galaxies close to the Milky Way instead of looking billions of light years away,” Governato said. “It’s like traveling from Seattle to Portland, Ore., rather than from Seattle to New York, to measure the Earth’s curvature.”

Original Source: University of Washington News Release

Robot Finds Life in the Desert

Image credit: CMU
Current Mars expeditions raise the tantalizing possibility that there may be life somewhere on the red planet. But just how will future missions find it? A system being developed by Carnegie Mellon scientists could provide the answer.

At the 36th Lunar and Planetary Science Conference in Houston this week (March 14-18), Carnegie Mellon scientist Alan Waggoner is presenting results of the life detection system’s recent performance in Chile’s Atacama Desert, where it found growing lichens and bacterial colonies. This marks the first time a rover-based automated technology has been used to identify life in this harsh region, which serves as a test bed for technology that could be deployed in future Mars missions.

“Our life detection system worked very well, and something like it ultimately may enable robots to look for life on Mars,” says Waggoner, a member of the “Life in the Atacama” project team and director of the Molecular Biosensor and Imaging Center at Carnegie Mellon’s Mellon College of Science.

The “Life in the Atacama” 2004 field season – from August to mid-October – was the second phase of a three-year program whose goal is to understand how life can be detected by a rover that is being controlled by a remote science team. The project is part of NASA’s Astrobiology Science and Technology Program for Exploring Planets, or ASTEP, which concentrates on pushing the limits of technology in harsh environments.

David Wettergreen, associate research professor in Carnegie Mellon’s Robotics Institute, leads rover development and field investigation. Nathalie Cabrol, a planetary scientist at NASA Ames Research Center and the SETI Institute, leads the science investigation.

Life is barely detectable over most areas of the Atacama, but the rover’s instruments were able to detect lichens and bacterial colonies in two areas: a coastal region with a more humid climate and an interior, very arid region less hospitable to life.

“We saw very clear signals from chlorophyll, DNA and protein. And we were able to visually identify biological materials from a standard image captured by the rover,” says Waggoner.

“Taken together, these four pieces of evidence are strong indicators of life. Now, our findings are being confirmed in the lab. Samples collected in the Atacama were examined, and scientists found that they contained life. The lichens and bacteria in the samples are growing and awaiting analysis.”

Waggoner and his colleagues have designed a life detection system equipped to detect fluorescence signals from sparse life forms, including those that are mere millimeters in size. Their fluorescence imager, which is located underneath the rover, detects signals from chlorophyll-based life, such as cyanobacteria in lichens, and fluorescent signals from a set of dyes designed to light up only when they bind to nucleic acid, protein, lipid or carbohydrate – all molecules of life.

“We don’t know of other remote methods capable both of detecting low levels of micro-organisms and visualizing high levels incorporated as biofilms or colonies,” says Gregory Fisher, project imaging scientist.

“Our fluorescent imager is the first imaging system to work in the daylight while in the shade of the rover. The rover uses solar energy to operate, so it needs to travel during daylight hours. Many times, the images we capture may only reveal a faint signal. Any sunlight that leaks into the camera of a conventional fluorescence imager would obscure the signal,” Waggoner says.

“To avoid this problem, we designed our system to excite dyes with high intensity flashes of light. The camera only opens during those flashes, so we are able to capture a strong fluorescence signal during daytime exploration,” says Shmuel Weinstein, project manager.
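The benefit of gating the camera to the flash can be illustrated with a toy photon-counting model. The rates and durations below are made-up numbers for illustration, not instrument specifications: the point is that gating keeps all of the flash-excited fluorescence while admitting ambient light for only a tiny fraction of the time.

```python
def collected_photons(signal_rate, ambient_rate, flash_s, exposure_s):
    """Photons collected while the shutter is open for exposure_s seconds.

    The dye fluorescence (signal) is emitted only during the flash;
    ambient daylight leaks in for the whole exposure.
    """
    signal = signal_rate * min(flash_s, exposure_s)
    ambient = ambient_rate * exposure_s
    return signal, ambient

# Illustrative (made-up) rates: faint fluorescence vs. bright daylight leakage.
sig_rate, amb_rate = 1e4, 1e6      # photons per second
flash = 1e-3                       # a 1 ms flash

# Conventional imager: shutter open for a full second.
s_long, a_long = collected_photons(sig_rate, amb_rate, flash, 1.0)
# Gated imager: shutter open only during the 1 ms flash.
s_gate, a_gate = collected_photons(sig_rate, amb_rate, flash, flash)
```

With these toy numbers the gated exposure records the same fluorescence signal but a thousand times less ambient contamination, which is why the faint dye signal survives daytime operation.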

During the mission, a remote science team located in Pittsburgh instructed the rover’s operations. A ground team at the site collected samples studied by the rover to bring back for further examination in the lab. On a typical day in the field, the rover followed a path designated the previous day by the remote operations science team. The rover stopped occasionally to perform detailed surface inspection, effectively creating a “macroscopic quilt” of geologic and biological data in selected 10 by 10 centimeter panels. After the rover departed a region, the ground team collected samples examined by the rover.

“Based on the rover findings in the field and our tests in the laboratory, there is not one example of the rover giving a false positive. Every sample we tested had bacteria in it,” says Edwin Minkley, director of the Center for Biotechnology and Environmental Processes in the Department of Biological Sciences.

Minkley is conducting analyses to determine the genetic characteristics of the recovered bacteria to identify the different microbial species present in the samples. He also is testing the bacteria’s sensitivity to ultraviolet (UV) radiation. One hypothesis is that the bacteria may have greater UV resistance because they are exposed to extreme UV radiation in the desert environment. According to Minkley, this characterization also may explain why such a high proportion of the bacteria from the most arid site are pigmented – red, yellow or pink – as they grow in the laboratory.

The first phase of the project began in 2003 when a solar-powered robot named Hyperion, also developed at Carnegie Mellon, was taken to the Atacama as a research test bed. Scientists conducted experiments with Hyperion to determine the optimum design, software and instrumentation for a robot that would be used in more extensive experiments conducted in 2004 and in 2005. Zoë, the rover used in the 2004 field season, is the result of that work. In the final year of the project, plans call for Zoë, equipped with a full array of instruments, to operate autonomously as it travels 50 kilometers over a two-month period.

The science team, led by Cabrol, is made up of geologists and biologists who study both Earth and Mars at institutions including NASA’s Ames Research Center and Johnson Space Center, SETI Institute, Jet Propulsion Laboratory, the University of Tennessee, Carnegie Mellon, Universidad Catolica del Norte (Chile), the University of Arizona, UCLA, the British Antarctic Survey, and the International Research School of Planetary Sciences (Pescara, Italy).

The Life in the Atacama project is funded with a three-year, $3 million grant from NASA to Carnegie Mellon’s Robotics Institute. William “Red” Whittaker is the principal investigator. Waggoner is principal investigator for the companion project in life-detection instruments, which garnered a separate $900,000 grant from NASA.

Original Source: CMU News Release

Helium-Richest Stars Found

On the basis of stellar spectra totalling more than 200 hours of effective exposure time with the 8.2-m VLT Kueyen telescope at Paranal (Chile), a team of astronomers [1] has made a surprising discovery about the stars in the giant southern globular cluster Omega Centauri.

It has been known for some time that, contrary to other clusters of this type, this stellar cluster harbours two different populations of stars that still burn hydrogen in their centres. One population, accounting for one quarter of its stars, is bluer than the other.

Using the FLAMES multi-object spectrograph, which is particularly well suited to this kind of work, the astronomers found that the bluer stars contain more heavy elements than those of the redder population. This was exactly the opposite of what was expected, leading them to conclude that the bluer stars have an overabundance of the light element helium of more than 50%. They are in fact the most helium-rich stars ever found. But why is this so?

The team suggests that this puzzle may be explained in the following way. First, a great burst of star formation took place during which all the stars of the red population were produced. Like other normal stars, these stars transformed their hydrogen into helium by nuclear burning. Some of them, with masses of 10-12 times the mass of the Sun, soon thereafter exploded as supernovae, thereby enriching the interstellar medium in the globular cluster with helium. Next, the blue population stars formed from this helium-rich medium.

This unexpected discovery provides important new insights into the way stars may form in larger stellar systems.

Two Populations
Globular clusters are large stellar clusters some of which contain hundreds of thousands of stars. It is generally believed that all stars belonging to the same globular cluster were born together, from the same interstellar cloud and at the same time. Strangely, however, this seems not to be the case for the large southern globular cluster Omega Centauri.

Omega Centauri is the galactic globular cluster with the most complex stellar population. Its large mass may represent an intermediate type of object, between globular clusters and larger stellar systems such as galaxies. In this sense, Omega Centauri is a very useful “laboratory” for better understanding the history of star formation.

However, it appears that the more information astronomers acquire about the stars in this cluster, the less they seem to understand the origin of these stars. But now, new intriguing results from the ESO Very Large Telescope (VLT) may show a possible way of resolving the present, apparently contradictory results.

Last year, an international team of astronomers [1], using data from the Hubble Space Telescope (HST), showed that Omega Centauri, unlike all other globular clusters, possesses two distinct populations of stars burning hydrogen in their centre. Even more puzzling was the discovery that the bluer population was more rare than the redder one: they accounted for only a quarter of the total number of stars still burning hydrogen in their central core. This is exactly the opposite of what the astronomers had expected, based on the observations of more evolved stars in this cluster.

Over Two Weeks of Total Exposure Time!
The same team of astronomers then went on to observe some of the stars from the two populations in this cluster by means of the FLAMES instrument on the Very Large Telescope at Paranal. They used the MEDUSA mode, allowing them to obtain no less than 130 spectra simultaneously.

Twelve one-hour spectra were obtained for 17 stars of the blue population and for the same number of stars from the red one. These stars have magnitudes between 20 and 21, i.e., they are between 500,000 and 1 million times fainter than what can be seen with the unaided eye.
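Those brightness figures follow from the astronomical magnitude scale, where a difference of dm magnitudes corresponds to a flux ratio of 10^(0.4 * dm). A quick sketch, assuming a naked-eye limit of roughly magnitude 6:

```python
def flux_ratio(m_faint, m_bright):
    """Brightness ratio implied by a magnitude difference:
    each 5 magnitudes is a factor of exactly 100 in flux."""
    return 10 ** (0.4 * (m_faint - m_bright))

# Assumed naked-eye limiting magnitude of about 6.
ratio_20 = flux_ratio(20.0, 6.0)   # roughly 400,000 times fainter
ratio_21 = flux_ratio(21.0, 6.0)   # about 1 million times fainter
```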

The individual spectra of stars from each population were then co-added. This produced a “mean” spectrum of a blue-population star and another of a red-population star. Each of these spectra represents a total of no less than 204 hours of exposure time and accordingly provides information in unrivalled detail about these stars, especially in terms of their chemical composition.
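The gain from co-adding comes from photon statistics: for independent, noise-limited exposures, the signal-to-noise ratio of the combined spectrum grows as the square root of the number of exposures. A minimal sketch of that scaling:

```python
import math

def coadd_snr(snr_single, n_exposures):
    """Signal-to-noise of N co-added independent exposures, each with
    signal-to-noise snr_single, under the usual sqrt(N) scaling."""
    return snr_single * math.sqrt(n_exposures)

# 12 one-hour spectra for each of 17 stars per population:
n = 12 * 17                 # 204 one-hour exposures in total
gain = coadd_snr(1.0, n)    # ~14x the signal-to-noise of a single hour
```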

The scientific outcome matches the technical achievement!
From a careful study of the combined spectra, the astronomers were able to establish that – contrary to all prior expectations – the bluer stars are more “metal-rich” (by a factor of two) than the redder ones. “The latter were found to have an abundance of elements more massive than helium corresponding to about 1/40 the solar abundance [2] “, explains Raffaele Gratton of INAF-Osservatorio Astronomico di Padova in Italy. “This is indeed very puzzling as current models of stars predict that the more metal-rich a star is, the redder it ought to be”.

Giampaolo Piotto (University of Padova, Italy), leader of the team, thinks that there is a solution to this celestial puzzle: “The only way we can explain this discrepancy is by assuming that the two populations of stars have a different abundance of helium. We find that while the red stars have a normal helium abundance, the bluer stars must be enriched in helium by more than 50% with respect to the other population!”

These stars are thus the most helium-rich stars ever found, and not by just a few percent! It took some 8 billion years for the Milky Way Galaxy to increase its helium abundance from the primordial 24% value (created by the Big Bang) to the present solar 28% value, and yet in a globular cluster that formed only 1 or 2 billion years after the Big Bang, stars were produced with 39% of helium!
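The contrast drawn above is easy to verify from the helium mass fractions quoted in the text:

```python
def fractional_increase(y_new, y_old):
    """Relative enrichment of the helium mass fraction."""
    return (y_new - y_old) / y_old

# Helium mass fractions quoted in the text.
primordial, solar, blue_pop = 0.24, 0.28, 0.39

# ~8 billion years of Galactic chemical evolution: a modest ~17% rise...
galactic = fractional_increase(solar, primordial)
# ...versus the blue population, enriched by well over 50%.
blue = fractional_increase(blue_pop, primordial)
```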

Contamination from supernovae
The obvious question is now: “Where does all this helium come from?”

Luigi Bedin (ESO), another member of the team, suggests that the solution might be connected to supernovae: “The scenario we presently favour is one in which the high helium content originates from material ejected during the supernovae explosions of massive stars. It is possible that the total mass of Omega Centauri was just right to allow the material expelled by high-mass supernovae to escape, while the matter from explosions of stars with about 10-12 times the mass of the Sun was retained.”

According to this scenario, Omega Centauri must therefore have seen two generations of stars. The first generation, with primordial helium abundance, produced the redder stars. A few tens of million years later, the most massive stars of this first generation exploded as supernovae. The helium-enriched matter that was expelled during the explosions of stars with 10-12 times the mass of the Sun “polluted” the globular cluster. Then a second population of stars, the bluer ones, formed from this helium-rich gas.

The scientists acknowledge that certain problems still remain and that the last word may not yet have been said about this unusual globular cluster. But the new results constitute an important step towards the solution of the biggest mystery of all: why is Omega Centauri the only one among the galactic globular clusters that was able to produce super helium-rich stars?

More information
The research presented here appeared in the March 10 issue of the Astrophysical Journal, Vol. 621, p. 777 (“Metallicities on the Double Main Sequence of Omega Centauri Imply Large Helium Enhancement” by G. Piotto et al.) and is available for download as astro-ph/0412016.

Notes
[1]: The team is composed of Giampaolo Piotto, Giovanni Carraro, Sandro Villanova, and Yazan Momany (University of Padova, Italy), Luigi R. Bedin (ESO, Garching), Raffaele Gratton and Sara Lucatello (INAF- Osservatorio Astronomico di Padova, Italy), Santi Cassisi (INAF- Osservatorio Astronomico di Teramo, Italy), Alejandra Recio-Blanco (Observatoire de Nice, France), Ivan R. King (University of Washington, USA), and Jay Anderson (Rice University, USA).

[2]: Helium is the second most abundant chemical element in the Universe, after hydrogen. The Sun contains about 70% hydrogen and 28% helium. The rest, about 2%, is made of all elements heavier than helium. They are commonly referred to by astronomers as “metals”.

Original Source: ESO News Release

Probing the Large Scale Structure of the Universe

According to astrophysicist Naoki Seto of the California Institute of Technology, “Large angular CMBR fluctuations contain precious information of the largest spatial scale fluctuations, but they are also contaminated by the (less interesting) small spatial scale power. Therefore, if we can remove the small spatial scale ones, we can get a cleaner picture of the potentially anomalous features of our universe.”

It all comes down to filtering out the distractions. Say someone from another country asks you where you live and you describe the cracks in the front driveway and the angle of the sign perched on the pole at the end of the street. Not very helpful, you say – especially to someone living in an entirely different part of the world. Data from WMAP is like that. Although it reveals slight temperature-related fluctuations in the CMBR across the sky, these fluctuations are mostly associated with scattering of the CMBR by “nearby” matter. As a result they are “contaminated” by the expansionary influence of dark energy associated with galaxies as far off as several billions of light-years. From an astronomical point of view, CMBR fluctuations are caused by nearby cracks in the pavement. Ultimately the goal is to see the “big picture” of the entire universe. It’s all a matter of scale…
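The filtering Seto describes – removing small-spatial-scale power to expose the large-scale signal – can be sketched as a simple low-pass cut on a toy angular power spectrum. On the sky, small angular scales correspond to large multipole numbers l, so zeroing the high-l entries discards the "cracks in the pavement" (the spectrum values below are arbitrary placeholders):

```python
def low_pass(power_spectrum, l_max_keep):
    """Keep only multipoles l <= l_max_keep of an angular power spectrum.

    power_spectrum: list indexed by multipole l (l = 0 is the monopole).
    Large l means small angular scale, so this cut keeps only the
    largest-scale features.
    """
    return [c if l <= l_max_keep else 0.0
            for l, c in enumerate(power_spectrum)]

# Toy spectrum: ten multipoles of arbitrary equal power.
spectrum = [1.0] * 10
cleaned = low_pass(spectrum, l_max_keep=3)   # only l = 0..3 survive
```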

What will we learn about the Universe based on such large scale variations? “You can study interesting behaviors of the inflation that might generate seed perturbations for cosmic structure, like galaxies”, says Naoki.

Early on, a curious form of energy dominated the universe (during the so-called hyper-inflationary phase). In this period the attractive influence of matter was not a factor and the universal balloon expanded incredibly fast. Later as matter dominated, gravitation put the brakes on things, the Universe decelerated and the balloon may have barely managed to keep expanding at all. After deceleration, another engine kicked in – the mysterious force called “dark energy”. The constraining influence of gravity was overcome and the Universe resumed expansion, but at a more leisurely rate. In our current epoch, studies of the light of distant supernovas have shown that the expansion of the universal balloon is accelerating again. We live in an era of universal inflation and questions about inflation, along with the possibility of dark energy driving it, can best be answered by studying previous cycles of slower expansion.
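The switch from a decelerating, matter-dominated era to today's accelerating, dark-energy-dominated one can be sketched with the deceleration parameter of a flat matter-plus-cosmological-constant model. The density parameters 0.3 (matter) and 0.7 (dark energy) below are conventional illustrative values, not results from this article:

```python
def deceleration_q(a, omega_m=0.3, omega_l=0.7):
    """Deceleration parameter q(a) for a flat matter + cosmological-constant
    universe, with scale factor a (a = 1 today):

        q = (0.5 * Om / a**3 - Ol) / (Om / a**3 + Ol)

    q > 0 means the expansion decelerates; q < 0 means it accelerates.
    """
    m = omega_m / a ** 3          # matter density dilutes as a^-3
    return (0.5 * m - omega_l) / (m + omega_l)

# Early times (small a): matter dominates and the expansion decelerates.
early = deceleration_q(0.1)
# Today (a = 1): dark energy dominates and the expansion accelerates.
today = deceleration_q(1.0)
```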

Naoki and Caltech associate Elena Pierpaoli hope to eliminate the effects of dark energy by studying the polarization of microwave radiation arriving at our solar system from the direction of older galaxy clusters. One possibility is to use a future WMAP-like probe capable of higher resolution of detail to collect microwave radiation from regions where the CMBR was once scattered by distant clouds of free electrons in space. Since electron scattering naturally occurs where matter is found, galaxy clusters make ideal candidates. The catch is that such clusters must be far enough away to provide a picture of scattering as it occurred long ago. By focusing on galaxy clusters seven billion light years away, we could see the CMBR as it appeared from clusters when the universe was half its current age. Dark energy at work then would not be as strong as it is now.

The resulting picture could provide important clues related to insights coming out of the WMAP project group. There is a possibility that, at the very largest scales, the universe is quite different from what was originally thought to be true. “Very roughly speaking,” says Naoki, “we expected that there would be no characteristic length in the largest-scale observable universe. This includes the spatial spectrum of the fluctuations and the shape of the universe.”

Other researchers have considered the use of galaxy clusters to probe large scale structure in the universe as well. But these researchers were not convinced the approach would work. Naoki and Elena found two important factors not sufficiently emphasized in earlier studies. First, they linked the obscuring small scale fluctuations in CMBR anisotropy to the influence of dark energy associated with the current accelerating era. Second, they determined that this obscuration could be minimized by exploiting scattering effects projected from galaxy clusters 7 billion light years away. Together these two insights could make it possible to see the largest scale universal structures influencing things today.

According to Elena: “The beauty of what we showed is that the observable quantity we propose to use is a function that varies very slowly on the sky. In order to map it observationally, you don’t need a high-resolution all-sky experiment, but you need to observe targeted objects uniformly spaced on the sky. This is, observationally, a much easier task than mapping the whole sky with that resolution.”

Unfortunately it is not possible for WMAP to achieve the degree of resolution needed to bring out the largest scale structures hinted at in the original data. For this reason, it may be several years before the information needed by Naoki, Elena, and other astrophysicists is collected. The next probe scheduled for launch is ESA’s Planck in 2007. Despite Planck’s increased sensitivity and resolution, the signals needed are so weak that it will be difficult to separate them from other, competing signals not polarized by distant galaxy clusters. However, future high-altitude ground-based instruments, such as ACT, APEX-SZ, and SPT, may provide the aperture needed to resolve the 1-arc-minute-sized regions required to bring out the largest scale structures of the Universe. The Cornell-Caltech Atacama Telescope – a 25-meter submillimeter-wave instrument currently undergoing feasibility study – could be sensitive to these effects. The CCAT is expected to collect first photons in the early part of the next decade. Such an instrument should be able to resolve signals separated by as little as 0.5 arc minutes (1/60th the diameter of the Moon).
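Aperture sets angular resolution through the Rayleigh diffraction limit, theta ~ 1.22 * lambda / D. As a rough sanity check on the figures above, assuming an illustrative submillimeter observing wavelength of 0.85 mm on a 25 m dish (the wavelength is an assumption for this sketch, not a quoted CCAT specification):

```python
import math

ARCMIN_PER_RAD = 180.0 / math.pi * 60.0

def diffraction_limit_arcmin(wavelength_m, aperture_m):
    """Rayleigh diffraction limit theta = 1.22 * lambda / D, in arcminutes."""
    return 1.22 * wavelength_m / aperture_m * ARCMIN_PER_RAD

# Assumed 0.85 mm wavelength on a 25 m aperture.
theta = diffraction_limit_arcmin(0.85e-3, 25.0)
```

The result is around 0.14 arcminutes, comfortably better than the half-arc-minute separation quoted for the instrument.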

Ah, what irony! To map the largest scale structures of 7 billion years past we still need to be able to see a few cracks in the pavement…

About The Author:
Inspired by the early 1900’s masterpiece: “The Sky Through Three, Four, and Five Inch Telescopes”, Jeff Barbour got a start in astronomy and space science at the age of seven. Currently Jeff devotes much of his time observing the heavens and maintaining the website Astro.Geekjoy.

Galaxies in the Early Universe Came in Every Flavour

What did the universe look like when it was only 2 to 3 billion years old? Astronomers used to think it was a pretty simple place containing relatively small, young star-forming galaxies. Researchers now are realizing that the truth is not that simple. Even the early universe was a wildly complex place. Studying the universe at this early stage is important in understanding how the galaxies near us were assembled over time.

Jiasheng Huang (Harvard-Smithsonian Center for Astrophysics) said, “It looks like vegetable soup! We’re detecting galaxies we never expected to find, having a wide range of properties we never expected to see.”

“It’s becoming more and more clear that the young universe was a big zoo with animals of all sorts,” said Ivo Labbé (Observatories of the Carnegie Institution of Washington), lead author on the study announcing this result.

Using the Infrared Array Camera (IRAC) aboard NASA’s Spitzer Space Telescope, the astronomers searched for distant, red galaxies in the Hubble Deep Field South, a region of the southern sky previously observed by the Hubble Space Telescope.

Their search was successful. The IRAC images displayed about a dozen very red galaxies lurking at distances of 10 to 12 billion light-years. Those galaxies existed when the universe was only about 1/5 of its present age of 14 billion years. Analysis showed that the galaxies exhibit a large range of properties.
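The distance and age figures above are consistent, since a light-travel distance in billions of light-years corresponds to a lookback time in billions of years:

```python
def age_at_emission(age_now_gyr, lookback_gyr):
    """Age of the universe when the observed light left the galaxy."""
    return age_now_gyr - lookback_gyr

# Light-travel distances of 10-12 billion light-years, universe now 14 Gyr old:
ages = [age_at_emission(14.0, d) for d in (10.0, 12.0)]
# Midpoint of that range as a fraction of the present age: close to 1/5.
fraction = (sum(ages) / len(ages)) / 14.0
```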

“Overall, we’re seeing young galaxies with lots of dust, young galaxies with no dust, old galaxies with lots of dust, and old galaxies with no dust. There’s as much variety in the early universe as we see around us today,” said Labbé.

The team was particularly surprised to find a curious breed of galaxy never seen before at such an early stage in the universe–old, red galaxies that had stopped forming new stars altogether. Those galaxies had rapidly formed large numbers of stars much earlier in the universe’s history, raising the question of what caused them to “die” so soon.

The unpredicted existence of such “red and dead” galaxies so early in time challenges theorists who model galaxy formation.

“We’re trying to understand how galaxies like the Milky Way assembled and how they got to look the way they appear today,” said Giovanni Fazio (CfA), a co-author on the study. “Spitzer offers capabilities that Hubble and other instruments don’t, giving us a unique way to study very distant galaxies that eventually became the galaxies we see around us now.”

The study will be published in an upcoming issue of The Astrophysical Journal Letters.

This press release is being issued in conjunction with the Observatories of the Carnegie Institution of Washington.

NASA’s Jet Propulsion Laboratory, Pasadena, Calif., manages the Spitzer Space Telescope mission for NASA’s Science Mission Directorate, Washington. Science operations are conducted at the Spitzer Science Center, Pasadena, Calif. JPL is a division of the California Institute of Technology, Pasadena.

Headquartered in Cambridge, Mass., the Harvard-Smithsonian Center for Astrophysics (CfA) is a joint collaboration between the Smithsonian Astrophysical Observatory and the Harvard College Observatory. CfA scientists, organized into six research divisions, study the origin, evolution and ultimate fate of the universe.

Detector Ready to Receive Beam of Neutrinos

Particle physicists from around the world are poised to unravel the secrets of the ethereal neutrino. Operational from this afternoon, March 4th, the Main Injector Neutrino Oscillation Search (MINOS) will produce a beam of neutrinos and fire them through the earth. By comparing neutrinos at the start with those at the finish, some 735 km away, the scientists hope to understand many of their properties, including their most mysterious behaviour: how neutrinos can morph between three different types!

“This strange property of neutrinos was only recently discovered experimentally, because neutrinos interact with their surroundings very rarely – in fact millions are passing through air, earth and even people unnoticed at any given time. Even a specially built detector like the MINOS Far detector is only expected to see 1,500 neutrinos in a year – billions more will pass straight through!” says UK project spokesperson Dr Geoff Pearce of the CCLRC Rutherford Appleton Laboratory.

The MINOS experiment will use a neutrino beam produced just outside Chicago, USA at Fermilab’s Main Injector accelerator to probe the secrets of these elusive subatomic particles: where do they come from, what are their masses and how do they change from one kind to another? There are three types or ‘flavours’ of neutrino: electron, muon and tau, each with different properties. The neutrino beam will be projected straight through the earth from Fermilab to the Soudan Mine in Northern Minnesota – a distance of 735 kilometres. No tunnel is needed because neutrinos interact so rarely with matter that they can pass straight through the earth virtually unhindered. In a ceremony this afternoon, the Speaker of the US House of Representatives, the Honourable J. Dennis Hastert Jr. will activate the neutrino beam, sending the first particles on their journey to the detector in the Soudan mine.

Dr Alfons Weber, University of Oxford explains “This is an exciting time for us. The beam we now generate at Fermilab will contain only one type of neutrino – muon neutrinos. When it arrives at the Far Detector in the Soudan Mine fractions of a second later, some of the muon neutrinos will have changed into the other types – tau and electron neutrinos. We want to understand how they do this.”
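The flavour change Weber describes is usually quantified, in a two-flavour approximation, by the standard muon-neutrino survival probability P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E), with the baseline L in km, the beam energy E in GeV and the mass-squared splitting dm2 in eV^2. The oscillation parameters and the 3 GeV beam energy below are illustrative assumptions for this sketch, not MINOS measurements:

```python
import math

def survival_probability(l_km, e_gev, dm2_ev2=2.4e-3, sin2_2theta=1.0):
    """Two-flavour muon-neutrino survival probability:
    P = 1 - sin^2(2 theta) * sin^2(1.27 * dm2 * L / E),
    with L in km, E in GeV, dm2 in eV^2."""
    return 1.0 - sin2_2theta * math.sin(1.27 * dm2_ev2 * l_km / e_gev) ** 2

# Assumed parameters: MINOS baseline, 3 GeV beam, atmospheric-sector
# dm2 and maximal mixing.
p = survival_probability(735.0, 3.0)
```

With these assumed numbers, roughly half of the muon neutrinos would still be muon-flavoured on arrival at the far detector; at zero baseline the beam is, of course, unchanged.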

Two massive neutrino detectors have been built by MINOS, both of which are complete and ready for the beam. The 1000 ton ‘near’ detector will sample the beam as it leaves Fermilab and provide the control measurements. The 5,500 ton ‘far’ detector, half a mile underground in the Soudan Mine, will measure the neutrinos when they arrive, just 2.5 milliseconds later. The detectors have to be a long distance apart to allow the neutrinos, which travel at close to the speed of light, time to oscillate. “By comparing these two measurements we will be able to study how the neutrinos have oscillated and provide the world’s most precise measurement of this effect with muon-type neutrinos” explains Dr Geoff Pearce.
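The quoted flight time follows directly from the 735 km baseline, since the neutrinos travel at essentially the speed of light:

```python
def travel_time_ms(distance_km, speed_m_s=299_792_458.0):
    """Light-speed flight time over a baseline, in milliseconds."""
    return distance_km * 1000.0 / speed_m_s * 1000.0

t = travel_time_ms(735.0)   # close to the 2.5 ms quoted in the text
```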

Prof. Ian Halliday, CEO of the Particle Physics and Astronomy Research Council which funds UK work on this project, anticipated the revelations from the experiment’s precision measurements.

“The mysteries of the elusive neutrino are about to be unveiled,” Halliday said. “For the very first time we will be able to investigate the changing state of this bizarre particle to an unprecedented accuracy of a few percent in a controlled beam of neutrinos created in the laboratory. I’m extremely proud that UK scientists have played a key role in bringing this experiment to fruition and, in collaboration with their international colleagues, will be amongst the first in the world to study its unique characteristics.”

“Physicists from around the world are trying to understand what these mysterious neutrinos are telling us,” said Fermilab director Michael Witherell. “Today, we are embarking on a journey of exploration using the most powerful neutrino facility in the world. I am extremely proud of what the people of Fermilab have accomplished in completing the NuMI project. I would like to thank the American people and the federal government for making the necessary commitment to support great science.”

Original Source: PPARC News Release