Gemini Goes Silver

Image credit: Gemini
To investors looking for the next sure thing, the silver coating on the Gemini South 8-meter telescope mirror might seem like an insider’s secret tip-off to invest in this valuable metal for a huge profit. However, it turns out that this immense mirror required less than two ounces (50 grams) of silver, not nearly enough to register on the precious metals markets. The real return on Gemini’s shiny investment is the way it provides unprecedented sensitivity from the ground when studying warm objects in space.

The new coating, the first of its kind ever to line the surface of a very large astronomical mirror, is among the final steps in making Gemini the most powerful infrared telescope on our planet. “There is no question that with this coating, the Gemini South telescope will be able to explore regions of star and planet formation, black holes at the centers of galaxies and other objects that have eluded other telescopes until now,” said Charlie Telesco of the University of Florida, who specializes in studying star- and planet-formation regions in the mid-infrared.

Covering the Gemini mirror with silver uses a process developed over several years of testing and experimentation to produce a coating that meets the stringent requirements of astronomical research. Gemini’s lead optical engineer, Maxime Boccas, who oversaw the mirror-coating development, said, “I guess you could say that after several years of hard work to identify and tune the best coating, we have found our silver lining!”

Most astronomical mirrors are coated with aluminum using an evaporation process and require recoating every 12-18 months. Since the twin Gemini mirrors are optimized for viewing objects in both optical and infrared wavelengths, a different coating was specified. Planning and implementing the silver coating process for Gemini began with the design of twin 9-meter-wide coating chambers located at the observatory facilities in Chile and Hawaii. Each coating plant (originally built by the Royal Greenwich Observatory in the UK) incorporates devices called magnetrons to “sputter” a coating onto the mirror. The sputtering process is necessary when applying multi-layered coatings on the Gemini mirrors in order to accurately control the thickness of the various materials deposited on the mirror’s surface. A similar coating process is commonly used for architectural glass to reduce air-conditioning costs and give building glass an attractive reflection and color, but this is the first time it has been applied to a large astronomical telescope mirror.

The coating is built up in a stack of four individual layers to ensure that the silver adheres to the glass base of the mirror and is protected from environmental elements and chemical reactions. As anyone with silverware knows, tarnish on silver reduces the reflection of light, and the degradation of an unprotected coating on a telescope mirror would have a profound impact on its performance. Tests done at Gemini with dozens of small mirror samples over the past few years show that the silver coating applied to the Gemini mirror should remain highly reflective and usable for at least a year between recoatings.

In addition to the large primary mirror, the telescope’s 1-meter secondary mirror and a third mirror that directs light into scientific instruments were also coated with the same protected silver. These three coatings, together with other design considerations, are responsible for the dramatic increase in Gemini’s sensitivity to thermal infrared radiation.

A key measure of a telescope’s performance in the infrared is its emissivity (how much heat it actually emits compared with the total amount it could theoretically emit) in the thermal, or mid-infrared, part of the spectrum. These emissions create a background of noise against which astronomical sources must be measured. Gemini has the lowest total thermal emissivity of any large astronomical telescope on the ground, with values under 4% even before receiving its silver coating; other ground-based telescopes commonly have emissivity values in excess of 10%. With the new coating, Gemini South’s emissivity will drop to about 2%. At some wavelengths this has the same effect on sensitivity as increasing the diameter of the Gemini telescope from 8 to more than 11 meters! The result is a significant increase in the quality and amount of Gemini’s infrared data, allowing detection of objects that would otherwise be lost in the noise generated by heat radiating from the telescope.
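
As a rough illustration of that aperture comparison, here is a back-of-the-envelope sketch assuming background-limited observations, in which signal-to-noise scales as mirror diameter divided by the square root of emissivity (the scaling is an assumption for illustration, not a figure from the release):

```python
import math

D = 8.0            # Gemini primary mirror diameter, meters (from the article)
eps_before = 0.04  # total emissivity before the silver coating (~4%)
eps_after = 0.02   # total emissivity after the silver coating (~2%)

# Diameter a telescope with the old emissivity would need in order to
# match the new signal-to-noise, if S/N ~ D / sqrt(emissivity)
D_equivalent = D * math.sqrt(eps_before / eps_after)
print(f"Equivalent aperture: {D_equivalent:.1f} m")  # ~11.3 m, consistent with "more than 11 meters"
```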

The recoating procedure was successfully performed on May 31, and the newly coated Gemini South mirror has been re-installed and calibrated in the telescope. Engineers are currently testing the systems before returning the telescope to full operations. The Gemini North mirror on Mauna Kea will undergo the same coating process before the end of this year.

Why Silver?
The reason astronomers wish to use silver as the surface of a telescope mirror lies in its ability to reflect some types of infrared radiation more effectively than aluminum. However, it is not just the amount of infrared light that is reflected but also the amount of radiation actually emitted by the mirror (its thermal emissivity) that makes silver so attractive. This is a significant issue when observing in the mid-infrared (thermal) region of the spectrum, which is essentially the study of heat from space. “The main advantage of silver is that it reduces the total thermal emission of the telescope. This in turn increases the sensitivity of the mid-infrared instruments on the telescope and allows us to see warm objects like stellar and planetary nurseries significantly better,” said Scott Fisher, a mid-infrared astronomer at Gemini.

The advantage comes at a price, however. To use silver, the coating must be applied in several layers, each with a very precise and uniform thickness. To do this, devices called magnetrons are used to apply the coating. They work by surrounding an extremely pure metal plate (called the target) with a plasma cloud of gas (argon or nitrogen) that knocks atoms out of the target and deposits them uniformly on the mirror, which rotates slowly under the magnetron. Each layer is extremely thin: the silver layer is only about 0.1 microns thick, or about 1/200 the thickness of a human hair. The total amount of silver deposited on the mirror is approximately 50 grams.
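
Those two figures, a 0.1-micron layer and roughly 50 grams of metal, are easy to check against each other; the sketch below uses the handbook density of silver (an illustrative estimate, not an observatory calculation):

```python
import math

diameter_m = 8.0        # primary mirror diameter, from the article
thickness_m = 0.1e-6    # ~0.1 micron silver layer, from the article
rho_silver = 10490.0    # kg/m^3, handbook density of silver

area_m2 = math.pi * (diameter_m / 2) ** 2      # ~50 m^2 of mirror surface
mass_kg = area_m2 * thickness_m * rho_silver
print(f"Silver mass: {mass_kg * 1000:.0f} g")  # ~53 g, close to the quoted 50 grams
```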

Studying Heat Originating from Space
Some of the most intriguing objects in the universe emit radiation in the infrared part of the spectrum. Often described as “heat radiation,” infrared light is redder than the red light we see with our eyes. Sources that emit at these wavelengths are sought after by astronomers because most of their infrared radiation can pass through clouds of obscuring gas and dust, revealing secrets otherwise shrouded from view. The infrared wavelength regime is split into three main regions: near-, mid- and far-infrared. Near-infrared is just beyond what the human eye can see (redder than red), mid-infrared (often called thermal infrared) represents longer wavelengths usually associated with heat sources in space, and far-infrared represents cooler regions.

Gemini’s silver coating will enable the most significant improvements in the thermal infrared part of the spectrum. Studies in this wavelength range include star- and planet-formation regions, with intense research that seeks to understand how our own solar system formed some five billion years ago.

Original Source: Gemini News Release

Molecular Nitrogen Found Outside our Solar System

Image credit: Orbital Sciences
Using NASA’s Far Ultraviolet Spectroscopic Explorer (FUSE) satellite, researchers have for the first time detected molecular nitrogen in interstellar space, giving them their first detailed look into how the universe’s fifth most-abundant element behaves in an environment outside the Solar System.

This discovery, made by astronomers at The Johns Hopkins University, Baltimore, promises to enhance understanding not only of the dense regions between the stars, but also of the very origins of life on Earth.

“Detecting molecular nitrogen is vital for improved understanding of interstellar chemistry,” said David Knauth, a post-doctoral fellow at Johns Hopkins and first author of a paper in the June 10 issue of Nature. “And because stars and planets form from the interstellar medium, this discovery will lead to an improved understanding of their formation, as well.”

Nitrogen is the most prevalent element of Earth’s atmosphere. Its molecular form, known as N2, consists of two combined nitrogen atoms. A team of researchers led by Knauth and physics and astronomy research scientist and co-author B-G Andersson continued investigations of N2 that began in the 1970s with the Copernicus satellite. At least 10,000 times more sensitive than Copernicus, FUSE – a satellite-telescope designed at and operated by Johns Hopkins for NASA – allowed the astronomers to probe the dense interstellar clouds where molecular nitrogen was expected to be a dominant player.

“Astronomers have been searching for molecular nitrogen in interstellar clouds for decades,” said Dr. George Sonneborn, FUSE Project Scientist at NASA Goddard Space Flight Center, Greenbelt, Md. “Its discovery by FUSE will greatly improve our knowledge of molecular chemistry in space.”

The astronomers faced several challenges along the way, including the fact that they were peering through dusty, dense interstellar clouds which blocked a substantial amount of the star’s light. In addition, the researchers confronted a classic Catch-22: Only the brightest stars emitted enough of a signal to allow FUSE to detect molecular nitrogen’s presence, but many of those stars were so bright they threatened to damage the satellite’s exquisitely-sensitive detectors.

HD 124314, a moderately-reddened star in the southern constellation of Centaurus, ended up being the first sight line where researchers could verify molecular nitrogen’s presence. This discovery is an important step in ascertaining the complicated process of how much molecular nitrogen exists in the interstellar medium and how its presence varies in different environments.

“For nitrogen, most models say that a major part of the element should be in the form of N2, but as we had not been able to measure this molecule, it’s been very hard to test whether those models and theories are right or not. The big deal here is that now we have a way to test and constrain those models,” Andersson said.

Launched on June 24, 1999, FUSE seeks to understand several fundamental questions about the Universe. What were the conditions shortly after the Big Bang? What are the properties of interstellar gas clouds that form stars and planetary systems? How are the chemical elements made and dispersed throughout our galaxy?

FUSE is a NASA Explorer mission. Goddard manages the Explorers Program for the Office of Space Science at NASA Headquarters in Washington, D.C. For more on the FUSE mission, go to the website at: http://fuse.pha.jhu.edu

Original Source: NASA News Release

Europeans Agree to Build Instrument for Webb Telescope

Image credit: ESA
An agreement between ESA and seven Member States to jointly build a major part of the MIRI instrument, which will considerably extend the capability of the James Webb Space Telescope (JWST), was signed on 8 June 2004.

This agreement also marks a new kind of partnership between ESA and its Member States for the funding and implementation of payload for scientific space missions.

MIRI, the Mid-Infrared Instrument, is one of the four instruments on board the JWST, the mission scheduled to carry on the heritage of Hubble in 2011. MIRI will be built in cooperation between Europe and the United States (NASA), with both contributing equally to its funding. MIRI’s optics, the core of the instrument, will be provided by a consortium of European institutes. Under this formal agreement, ESA will manage and co-ordinate the whole development of the European part of MIRI and act as the sole interface with NASA, which is leading the JWST project.

This marks a departure from previous ESA scientific missions. In the past, the funding and development of scientific instruments were agreed by the participating ESA Member States on the basis of purely informal arrangements with ESA. In this case, the Member States involved in MIRI have formally guaranteed the required level of funding through a multi-lateral international agreement, one which still keeps scientists in key roles.

In recent years, missions have become more complex, demanding and costly, within ever tighter budgets. They also require increasingly specialized expertise, which is spread throughout the vast European scientific community. As a result, a new management procedure for co-ordinating payload development has become a necessity to secure the successful and timely completion of scientific space projects. ESA’s co-ordination of the MIRI European consortium is the first use of such an approach, which will also be applied to future missions of the ESA long-term Science Programme, the “Cosmic Vision”. The technology package for LISA (LTP), an ESA/NASA mission to detect gravitational waves, is already being prepared under the same scheme.

Sergio Volonte, ESA Co-ordinator for Astrophysics and Fundamental Physics Missions, comments: “I’m delighted for such an achievement between ESA and its Member States. With MIRI we will start an even more effective co-ordination on developing our scientific instruments, setting a new framework to further enhance their excellence.”

The James Webb Space Telescope (JWST) is a partnership between ESA, NASA and the Canadian Space Agency. Formerly known as the Next Generation Space Telescope (NGST), it is due to be launched in August 2011 and is considered the successor of the NASA/ESA Hubble Space Telescope. It is three times larger and more powerful than its predecessor, and it is expected to shed light on the ‘Dark Ages of the Universe’ by studying the very distant Universe, observing infrared light from the first stars and galaxies that ever emerged.

MIRI (Mid-Infrared Camera-Spectrograph) is essential for the study of old and distant stellar populations; regions of obscured star formation; hydrogen emission from previously unthinkable distances; the physics of protostars; and the sizes of ‘Kuiper belt’ objects and faint comets.

In addition to MIRI, Europe, through ESA, is contributing to JWST the NIRSPEC (Near-Infrared multi-object Spectrograph) instrument, fully funded and managed by ESA, and, as agreed in principle with NASA, the Ariane 5 launcher. The ESA financial contribution to JWST will be about 300 million Euros, including the launcher. The European institutions involved in MIRI will contribute about 70 million Euros overall.

The European institutions that signed the MIRI agreement with ESA are: the Centre National d’Etudes Spatiales (CNES), the Danish Space Research Institute (DSRI), the German Aerospace Centre (DLR), the Spanish Ministerio de Educación y Ciencia (MEC), the Nederlandse Onderzoekschool voor Astronomie (NOVA), the UK Particle Physics and Astronomy Research Council (PPARC) and the Swedish National Space Board (SNSB).

Four European countries, Belgium, Denmark, Ireland and Switzerland, contribute to MIRI through their participation in ESA’s Scientific Experiment Development programme (PRODEX). This is an optional programme, used mainly by smaller countries, through which they delegate to ESA the management of funding to develop scientific instruments.

Delivery of the MIRI instrument to NASA is due in March 2009.

Original Source: ESA News Release

New Estimate for the Mass of Higgs Boson

Image credit: Berkeley Lab
In a case of the plot thickening as the mystery unfolds, the Higgs boson has just gotten heavier, even though the subatomic particle has yet to be found. In a letter to the scientific journal Nature, published in the June 10, 2004 issue, an international collaboration of scientists working at the Tevatron accelerator of the Fermi National Accelerator Laboratory (Fermilab) report the most precise measurements yet of the mass of the top quark, a subatomic particle that has been found, and this requires an upward revision for the long-postulated but still undetected Higgs boson.

“Since the top quark mass we are reporting is a bit higher than previously measured, it means the most likely value of the Higgs mass is also higher,” says Ron Madaras, a physicist with the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), who heads the local participation in the D-Zero experiment at the Tevatron. “The most likely Higgs mass has now been increased from 96 to 117 GeV/c2” (GeV/c2 is a common particle-physics unit of mass; the proton’s mass is about 1 GeV/c2) “which means it’s probably beyond the sensitivity of current experiments, but very likely to be found in future experiments at the Large Hadron Collider being built at CERN.”
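
For readers unfamiliar with the unit, the quoted masses translate into everyday units roughly as follows (a quick conversion using the standard value of 1 GeV/c2, added here for illustration and not taken from the Nature letter):

```python
GEV_C2_TO_KG = 1.783e-27   # kilograms per GeV/c^2 (standard conversion)
m_higgs_gev = 117.0        # most likely Higgs mass quoted above, GeV/c^2
m_proton_gev = 0.938       # proton mass, GeV/c^2

print(f"{m_higgs_gev * GEV_C2_TO_KG:.2e} kg")              # ~2.1e-25 kg
print(f"~{m_higgs_gev / m_proton_gev:.0f} proton masses")  # ~125 proton masses
```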

The Higgs boson has been called the missing link in the Standard Model of Particles and Fields, the theory that’s been used to explain fundamental physics since the 1970s. Prior to 1995 the top quark was also missing, but then the experimental teams working at the Tevatron’s two large detector systems, D-Zero and CDF, were able to discover it independently.

Scientists believe that the Higgs boson, named for British physicist Peter Higgs, who first theorized its existence in 1964, is responsible for particle mass, the amount of matter in a particle. According to the theory, a particle acquires mass through its interaction with the Higgs field, which is believed to pervade all of space and has been compared to molasses that sticks to any particle rolling through it. The Higgs field would be carried by Higgs bosons, just as the electromagnetic field is carried by photons.

“In the Standard Model, the Higgs boson mass is correlated with top quark mass,” says Madaras, “so an improved measurement of the top quark mass gives more information about the possible value of the Higgs boson mass.”

According to the Standard Model, at the beginning of the universe there were six different types of quarks. Top quarks exist only for an instant before decaying into a bottom quark and a W boson, which means those created at the birth of the universe are long gone. However, at Fermilab’s Tevatron, the most powerful collider in the world, collisions between billions of protons and antiprotons yield an occasional top quark. Despite their brief appearances, these top quarks can be detected and characterized by the D-Zero and CDF experiments.

In announcing the D-Zero results, experiment cospokesperson John Womersley said, “An analysis technique that allows us to extract more information from each top quark event that occurred in our detector has yielded a greatly improved precision of plus or minus 5.3 GeV/c2 in the top mass measurement, compared with previous measurements. The new measurement is comparable to the precision of all previous top quark mass measurements put together. When this new result is combined with all other measurements from both the D-Zero and CDF experiments, the new world average for the top mass becomes 178.0 plus or minus 4.3 GeV/c2.”
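
Although the release does not spell out the procedure, a world average of this kind is typically produced by an inverse-variance weighted mean of the individual results; the sketch below shows that standard recipe with made-up placeholder inputs, not the actual D-Zero and CDF measurement list:

```python
def combine(measurements):
    """Inverse-variance weighted mean of (value, uncertainty) pairs."""
    weights = [1.0 / sigma ** 2 for _, sigma in measurements]
    total = sum(weights)
    mean = sum(w * value for (value, _), w in zip(measurements, weights)) / total
    return mean, (1.0 / total) ** 0.5

# Hypothetical example: two top-mass results with 5.3 and 7.0 GeV/c^2 uncertainties
print(combine([(180.1, 5.3), (176.1, 7.0)]))  # combined uncertainty is smaller than either input
```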

The D-Zero detector system consists of a central tracking detector array, a hermetic calorimeter for measuring energy, and a large solid-angle muon detector system. Berkeley Lab designed and built the two electromagnetic end-cap calorimeters and also the initial vertex detector, the innermost component of the tracking system. Tracking detectors supplement calorimeters by measuring particle trajectories. Only when trajectory and energy measurements are combined can scientists identify and characterize particles.

While raising the central value for the top quark mass appears to diminish the possibility that the Higgs boson could be discovered at the Tevatron, it does open a wider door for new discoveries in supersymmetry, also known as SUSY, an extension of the Standard Model that unites particles of force and matter through the existence of superpartners (sometimes referred to as “sparticles”). Supersymmetry seeks to fill gaps left by the Standard Model.

“The current mass limits or bounds that exclude supersymmetric particles are very sensitive to the top quark mass,” says Madaras. “Since the top quark mass is now higher, these limits or bounds are not as severe, which increases the chance of seeing supersymmetric particles at the Tevatron.”

Scientists from nearly 40 US universities and 40 foreign institutions contributed to the data analysis reported in the letter to Nature by the D-Zero experimental group. Berkeley Lab co-authors of the letter in addition to Madaras were Mark Strovink, Al Clark, Tom Trippe, and Daniel Whiteson.

Fermilab Director Michael Witherell said in a statement that these results do not end the story of precision measurements of the top quark mass. “The two collider detectors, D-Zero and CDF, are recording large amounts of data in Run II of the Tevatron. The CDF collaboration has recently reported preliminary new measurements of the top mass based on Run II data. The precision of the world average will improve further when their results are final. Over the next few years, both experiments will make increasingly precise measurements of the top quark mass.”

Fermilab, like Berkeley Lab, is funded by the Department of Energy’s Office of Science. In response to the Nature letter from the D-Zero group, Raymond L. Orbach, Director of the Office of Science, said: “These important results demonstrate how our scientists are applying new techniques to existing data, producing new estimates for the mass of the Higgs boson. We eagerly await the next round of results from the vast quantities of data that are generated today at the Fermilab Tevatron.”

Berkeley Lab is a U.S. Department of Energy national laboratory located in Berkeley, California. It conducts unclassified scientific research and is managed by the University of California. Fermilab is a national laboratory funded by the Office of Science of the U.S. Department of Energy, operated by Universities Research Association, Inc.

Original Source: Berkeley Lab News Release

Phoebe: Cassini’s First Target

Image credit: NASA/JPL/Space Science Institute
The Cassini spacecraft is closing in fast on its first target of observation in the Saturn system: the small, mysterious moon Phoebe, only 220 kilometers (137 miles) across.

The three images shown here, the latest of which is twice as good as any image returned by the Voyager 2 spacecraft in 1981, were captured in the past week on approach to this outer moon of Saturn. Phoebe’s surface is already showing a great deal of contrast, most likely indicative of topography, such as tall sunlit peaks and deep shadowy craters, as well as genuine variation in the reflectivity of its surface materials. Left to right, the three views were captured at a Sun-Saturn-spacecraft, or phase, angle of 87 degrees between June 4 and June 7, from distances ranging from 4.1 million km (2.6 million miles) to 2.5 million km (1.5 million miles). The image scale ranges from 25 to 15 km (16 to 9 miles) per pixel.

The images have been magnified eight times using a linear interpolation scheme; the contrast has been untouched. Phoebe rotates once every 9 hours and 16 minutes; each of these images shows a different region on Phoebe.
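
For reference, “magnified eight times using a linear interpolation scheme” corresponds to the kind of first-order upsampling sketched below; this is only an illustration of the technique named, not the imaging team’s actual processing pipeline, and the random array is a stand-in for a small raw frame:

```python
import numpy as np
from scipy.ndimage import zoom

raw = np.random.rand(32, 32)        # stand-in for a small raw image
magnified = zoom(raw, 8, order=1)   # order=1 selects linear interpolation
print(raw.shape, magnified.shape)   # (32, 32) -> (256, 256); pixel values are interpolated, not rescaled
```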

Cassini’s powerful cameras will provide the best-ever look at this moon on Friday, June 11, when the spacecraft streaks past Phoebe only about 2,000 kilometers (1,240 miles) from the moon’s surface. The current images, and the large craters already visible in them, promise a heavily cratered surface that will come into sharp view over the next few days, when image scales should shrink to a few tens of meters, the size of office buildings.
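
A rough pixel-scale check, using only the numbers quoted in the two paragraphs above, shows why building-sized detail is expected at closest approach (an estimate for illustration, not an official figure):

```python
# Angular scale implied by the approach images quoted above
scale_far = 25.0 / 4.1e6     # km/pixel per km of range, ~6.1e-6 radians per pixel
scale_near = 15.0 / 2.5e6    # ~6.0e-6 radians per pixel, consistent with the above

flyby_range_km = 2000.0      # approximate closest-approach distance
print(f"~{scale_near * flyby_range_km * 1000:.0f} m per pixel at the flyby")  # roughly 12 m/pixel
```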

Because of its small size and retrograde orbit – Phoebe orbits Saturn in a direction opposite to that of the larger inner Saturnian moons – and because of the presence of water ice on its surface, Phoebe is believed to be a body from the distant outer solar system, one of the building blocks of the outer planets that was captured into orbit around Saturn. If true, the little moon will provide a windfall of precious information about a primitive piece of the solar system that has never before been explored up close.

Phoebe, discovered in 1898, was the first moon to be found using photography, and it has a very dark surface. It has long been believed that material coming off Phoebe’s surface and impacting the very dark leading hemisphere of Iapetus may play some role in the latter’s extreme albedo asymmetry, though the precise relationship is unclear. Cassini should help solve this and other mysteries during its exciting encounter with Phoebe.

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Cassini-Huygens mission for NASA’s Office of Space Science, Washington, D.C. The imaging team is based at the Space Science Institute, Boulder, Colorado.

For more information about the Cassini-Huygens mission, visit http://saturn.jpl.nasa.gov and the Cassini imaging team home page, http://ciclops.org.

Original Source: CICLOPS News Release

New Horizons Mission Will Measure the Solar Wind out at Pluto

Image credit: NASA/JHUAPL/SwRI
The Solar Wind Around Pluto (SWAP) instrument aboard the New Horizons spacecraft is designed to measure the interactions of Pluto and Charon with the solar wind, the high-speed stream of charged particles flowing out from the sun. Understanding these interactions will expand researchers’ knowledge of the astrophysical processes affecting these bodies and that part of the solar system.

The space science community understands the extremes (called the bounding states) of solar wind interactions with planets, comets and other bodies, but no one knows what kind of interaction is present at Pluto. Comet Borrelly represents a strong interaction with the solar wind, while Venus represents a weak one.

“We expect solar wind interactions at Pluto to lie somewhere between the strong and weak extremes,” says SWAP Principal Investigator Dr. David J. McComas, a senior executive director at Southwest Research Institute (SwRI).

After taking measurements at Pluto, researchers plan to use the SWAP data to define basic parameters about the system. For example, once researchers know how much material comes off Pluto, they can then estimate the amount of Pluto’s atmosphere that escapes into space. This will reveal insights into the structure and destiny of the atmosphere itself.

SWAP would go on to take similar measurements at Charon and at least one Kuiper belt object; however, the team expects those interactions to be much weaker simply because the atmospheres of these objects are expected to be less extensive and not likely to emit much material.

Another of the many Pluto mysteries is where the interactions of the solar wind will occur around the planet, so science plans call for SWAP to take continuous measurements as it nears and passes Pluto.

“We know when and where to use some of the instruments to take an image or a measurement at Pluto,” says McComas. “Solar wind interactions, however, present quite a challenge because we’re trying to measure this invisible thing surrounding Pluto at an uncertain distance from it.”

“The science we expect SWAP to perform is impossible to accomplish without actually going to Pluto-Charon and directly sampling its environment. That capability is something that NASA pioneered and which, to this day, only the United States can do,” says Dr. Alan Stern, principal investigator of New Horizons and an executive director at SwRI.

The incredible distance of Pluto from the sun required the SWAP team to build the largest-aperture instrument ever used to measure the solar wind. It allows SWAP to make measurements even when the solar wind is very tenuous. The instrument also combines a retarding potential analyzer (RPA) with an electrostatic analyzer (ESA) to enable extremely fine, accurate energy measurements of the solar wind.
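
To give a feel for the energies such an analyzer must cover, here is a rough estimate using the proton mass and typical solar-wind speeds; the speeds are generic assumptions, not SWAP specifications:

```python
M_PROTON = 1.67e-27   # proton mass, kg
EV = 1.602e-19        # joules per electron volt

for v_km_s in (300, 400, 800):   # slow, average and fast solar wind (assumed values)
    energy_ev = 0.5 * M_PROTON * (v_km_s * 1e3) ** 2 / EV
    print(f"{v_km_s} km/s protons carry ~{energy_ev:.0f} eV")  # ~470, ~840, ~3300 eV
```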

“Should the interaction between Pluto and the solar wind turn out to be very small, the RPA and ESA combination will allow us to measure minute changes in solar wind speed,” says Scott Weidner, the SWAP instrument manager and an SwRI principal scientist.

The various instruments aboard New Horizons were designed and are being built independently, yet they are expected to work together to reveal significant new insights about Pluto, Charon and their Kuiper belt neighbors. SWAP measures low energy interactions, such as those caused by the solar wind. Its complement, the Pluto Energetic Particle Spectrometer Science Investigation, or PEPSSI, will look at higher energy particles, such as pickup ions. The top of SWAP’s energy range can measure some pickup ions, and PEPSSI picks up where SWAP leaves off to see the highest energy interactions.

The sun and its solar wind affect the entire solar system and should create interesting science opportunities for SWAP throughout its planned nine-year voyage to Pluto. SWAP will operate for more than a month each year and will sample heliospheric pickup ions, ions that originate in interstellar space and become ionized when they come near the sun. Other pickup ions come from material inside the solar system. Researchers have shown that even collisions between Kuiper belt objects produce tiny grains that drift toward the sun, evaporate and become ionized. The Cassini spacecraft, when it reaches Saturn this July, will allow researchers to observe these so-called “outer source” pickup ions out to 10 astronomical units (AU, the distance from the Earth to the sun), the region where pickup ions from the outer source are believed to begin.

“We’ll be out to 30 AU before New Horizons even reaches Pluto. While we’re targeting a Kuiper belt object, we could be anywhere from 30 to 50 AU, where the influence of heliospheric pickup ions becomes greater and greater in the solar wind,” says McComas. “On the journey out to Pluto, we’ll be able to validate or disprove the outer source theory, which is an exciting warm up to reaching Pluto itself.”

Original Source: SWRI News Release

New Simulation Improves Ideas of Galaxy Formation

Image credit: U of Chicago
Astrophysicists led by the University of Chicago’s Andrey Kravtsov have resolved an embarrassing contradiction between a favored theory of how galaxies form and what astronomers see in their telescopes.

Astrophysicists base their understanding of how galaxies form on an extension of the big bang theory called the cold dark matter theory. In this latter theory, small galaxies collide and merge, inducing bursts of star formation that create the different types of massive and bright galaxies that astronomers see in the sky today. (Dark matter takes its name from the idea that 85 percent of the total mass of the universe is made of unknown matter that is invisible to telescopes, but whose gravitational effects can be measured on luminous galaxies.)

This theory fits some key data that astrophysicists have collected in recent years. Unfortunately, when astrophysicists ran supercomputer simulations several years ago, they ended up with 10 times more dark matter satellites (clumps of dark matter orbiting a large galaxy) than they expected.

“The problem has been that the simulations don’t match the observations of galaxy properties,” said David Spergel, professor of astrophysics at Princeton University. “What Andrey’s work represents is a very plausible solution to this problem.”

Kravtsov and his collaborators found the potential solution in new supercomputer simulations they will describe in a paper appearing in the July 10 issue of the Astrophysical Journal. “The solution to the problem is likely to be in the way the dwarf galaxies evolve,” Kravtsov said, referring to the small galaxies that inhabit the fringes of large galaxies.

In general, astrophysicists believe that the formation of very small dwarf galaxies should be suppressed. This is because the gas required for continued star formation can be heated and expelled by the first generation of supernova explosions. In addition, the ultraviolet radiation from galaxies and quasars that began to fill the universe approximately 12 billion years ago heats the intergalactic gas, shutting down the supply of fresh gas to dwarf galaxies.

In the simulations, Kravtsov, along with Oleg Gnedin of the Space Telescope Science Institute and Anatoly Klypin of New Mexico State University, found that some of the dwarf galaxies that are small today were more massive in the past and could gravitationally collect the gas they needed to form stars and become a galaxy.

“The systems that appear rather feeble and anemic today could, in their glory days, form stars for a relatively brief period,” Kravtsov said. “After a period of rapid mass growth, they lost the bulk of their mass when they experienced strong tidal forces from their host galaxy and other galaxies surrounding them.”

This galactic “cannibalism” persists even today, with many of the “cannibalized” dwarf galaxies becoming satellites orbiting in the gravitational pull of larger galaxies.

“Just like the planets in the solar system surrounding the sun, our Milky Way galaxy and its nearest neighbor, the Andromeda galaxy, are surrounded by about a dozen faint ‘dwarf’ galaxies,” Kravtsov said. “These objects were pulled in by the gravitational attraction of the Milky Way and Andromeda some time ago during their evolution.”

The simulations succeeded where others had failed because Kravtsov’s team analyzed simulations that were closely spaced in time at high resolution. This allowed the team to track the evolution of individual objects in the simulations. “This is rather difficult and is not often done in analyses of cosmological simulations. But in this case it was the key to recognize what was going on and get the result,” Kravtsov said.

The result puts the cold dark matter scenario on more solid ground. Scientists had attempted to modify the main tenets of the scenario and the properties of dark matter particles to eliminate the glaring discrepancy between theory and observation of dwarf galaxies. “It turns out that the proposed modifications introduced more problems than they solved,” Kravtsov said.

The simulations were performed at the National Center for Supercomputer Applications, University of Illinois at Urbana-Champaign, with grants provided by the National Science Foundation and the National Aeronautics and Space Administration.

Original Source: University of Chicago News Release

How Deforestation in Brazil is Affecting Local Climate

Image credit: NASA
NASA satellite data are giving scientists insight into how large-scale deforestation in the Amazon Basin in South America is affecting regional climate. Researchers found that during the Amazon dry season last August there was a distinct pattern of higher rainfall and warmer temperatures over deforested regions.

Researchers analyzed multiple years of data from NASA’s Tropical Rainfall Measuring Mission (TRMM). They also used data from the Department of Defense Special Sensor Microwave Imager and the National Oceanic and Atmospheric Administration’s Geostationary Operational Environmental Satellites.

The study appeared in a recent issue of the American Meteorological Society’s Journal of Climate. Lead authors Andrew Negri and Robert Adler are research meteorologists at NASA’s Goddard Space Flight Center (GSFC), Greenbelt, Md. Other authors include Liming Xu, formerly of the University of Arizona, Tucson, and Jason Surratt of North Carolina State University, Raleigh.

“In deforested areas, the land heats up faster and reaches a higher temperature, leading to localized upward motions that enhance the formation of clouds and ultimately produce more rainfall,” Negri said.

The researchers caution that the rainfall increases were most pronounced in August, during the transition from dry to wet seasons. In this transition period, the effects of land cover, such as evaporation, are not overwhelmed by the large-scale weather disturbances that are common during the rest of the year. While the study, based on satellite data analysis, focused on climate changes in the deforested areas, large increases in cloud cover and rainfall were also observed in the naturally unforested savanna region and around the urban area of Porto Velho, Brazil, particularly in August and September.

Recent studies by Dr. Marshall Shepherd, also a research meteorologist at GSFC, reported similar findings, including an average rain-rate increase of 28 percent downwind of urban areas and associated changes in the daily timing of cloud formation and precipitation.

This research confirmed the Amazon savanna region experienced a shift in the onset of cloudiness and rainfall toward the morning hours. The shift was likely initiated by the contrast in surface heating across the deforested and savanna region.

The varied heights of plants and trees in the region change the aerodynamics of the atmosphere, creating more circulation and rising air. When the rising air reaches the dew point in the cooler, upper atmosphere, it condenses into water droplets and forms clouds.
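
One common rule of thumb for where that condensation happens is the lifting condensation level, roughly 125 meters of ascent for each degree Celsius by which the surface temperature exceeds the dew point; the example below is an illustration added here, not a number from the study:

```python
def lifting_condensation_level_m(temp_c, dewpoint_c):
    """Espy-style approximation: ~125 m of ascent per degree C of dewpoint depression."""
    return 125.0 * (temp_c - dewpoint_c)

# Example: a 32 C afternoon over cleared land with a 22 C dew point
print(f"Cloud base near {lifting_condensation_level_m(32.0, 22.0):.0f} m")  # ~1250 m
```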

Negri acknowledged other factors are involved. The savanna in this study is approximately 100 kilometers (62 miles) wide, the perfect size to influence precipitation, such as rain showers and thunderstorms. Earlier studies hypothesized certain land surfaces, such as bands of vegetation 50 to 100 kilometers (31-62 miles) wide in semiarid regions, could result in enhanced precipitation.

This research is in agreement with the recent and sophisticated computer models developed by the Massachusetts Institute of Technology. The models concluded small-scale circulations, including the mixing and rising of air induced by local land surfaces, could enhance cloudiness and rainfall. Many earlier studies that relied on models developed in the 1990s or earlier concluded widespread deforestation of the Amazon Basin would lead to decreased rainfall.

“The effects here are rather subtle and appear to be limited to the dry season. The overall effect of this deforestation on annual and daily rainfall cycles is probably small and requires more study,” Negri said. Future research will use numerical models for investigating the linkage between deforested land surface and the cloud-precipitation components of the water cycle.

NASA’s Earth Science Enterprise is dedicated to understanding the Earth as an integrated system and applying Earth System Science to improve prediction of climate, weather, and natural hazards using the unique vantage point of space.

Original Source: NASA News Release

Opportunity Checks the Edge of the Crater

Image credit: NASA/JPL
NASA’s Mars Opportunity rover began its latest adventure today inside the martian crater informally called Endurance. Opportunity will roll in with all six wheels, then back out to the rim to check traction by looking at its own track marks.

“We’re going in, but we’re doing it cautiously,” said Jim Erickson, deputy project manager for the Mars Exploration Rovers at NASA’s Jet Propulsion Laboratory, Pasadena, Calif. Barring any surprises, Opportunity will enter the stadium-sized crater Wednesday for two to three weeks of scientific studies.

“NASA has made a careful decision. The potential science benefits of sending Opportunity into the crater are well worth the calculated risk the rover might not be able to climb back out,” said JPL’s Dr. Firouz Naderi, manager of NASA’s Mars Exploration Program. “Inside the Endurance crater waits the possibility for the most compelling science investigations Opportunity could add to what it has already accomplished. We have done the ground testing necessary to evaluate the likelihood of exiting the crater afterwards.”

“Spirit and Opportunity are well into their bonus periods after successfully completing their three-month primary missions in April,” Naderi said. “Both rovers are starting new chapters. Spirit is within a stone’s throw of Columbia Hills, and Opportunity is entering the crater.”

Dr. Steve Squyres of Cornell University, Ithaca, N.Y., the rovers’ principal investigator, said, “We expect the science return of going a short way into Endurance to be very high.” The target for inspection within the crater is an exposure of rock layers beneath a layer that corresponds to rocks Opportunity previously examined in the shallower Eagle crater, where the rover landed in January.

The sulfur-rich layer seen in Eagle yielded evidence that a body of gently flowing water once covered the area. The underlying rock layers come from an earlier period. Opportunity’s observations from the rim of Endurance already have shown their composition differs from the Eagle crater’s layers.

“If there was a change in rock type, there was a change in environment,” Squyres said. “This unit will tell us what came before the salty water environment the Eagle crater unit told us about. We want to get to the contact between the two units to see how the environment changed. Is it gradual? Is it abrupt?” Even if the lower layers formed under dry conditions, they may have been exposed to water later. The water’s effect on them could have left telltale evidence of that interaction.

One section of the target outcrop is only five to seven meters (16 to 23 feet) from the crater rim in an area dubbed Karatepe. The rover team’s plan is to get there, examine the rocks for several days, and then exit the crater. Reaching lower-priority targets, like at the bottom of the crater, would entail driving on sand, with a higher risk of not getting out again.

The strategy for driving on the crater’s inner slope is to keep wheels on rock surfaces instead of sand, said JPL rover-mobility engineer Randy Lindemann. The team ran trials with a test rover on a surface specifically built to simulate Karatepe’s surface conditions. “The tests indicate we have a substantial margin of safety for going up a rocky slope of 25 degrees,” Lindemann said. Opportunity’s observations from the rim at the top of the planned entry route show a slope of less than 20 degrees.

Spirit, launched one year ago Thursday, has driven more than 3.2 kilometers (2 miles) inside the Gusev Crater. A trench it dug in May exposed soil with relatively high levels of sulfur and magnesium, reported Dr. Johannes Brueckner, of Max-Planck-Institut fuer Chemie, Mainz, Germany. Spirit’s alpha particle X-ray spectrometer showed concentrations of these two elements varied in parallel at different locations in the trench, suggesting they may be paired as a magnesium sulfate salt.

Squyres said, “The most likely explanation is water percolated through the subsurface and dissolved out minerals. As the water evaporated near the surface, it left concentrated salts behind. I’m not talking about a standing body of water like we saw signs of at Eagle crater, but we also have an emerging story of subsurface water at Gusev.”

JPL, a division of the California Institute of Technology in Pasadena, manages the Mars Exploration Rover project for NASA’s Office of Space Science, Washington, D.C.

For images and information about the Mars project on the Internet, visit http://marsrovers.jpl.nasa.gov & http://athena.cornell.edu.

Original Source: NASA/JPL News Release

Early Earth was Warm, Despite Less Energy From the Sun

Image credit: Stanford
If a time machine could take us back 4.6 billion years to the Earth’s birth, we’d see our sun shining 20 to 25 percent less brightly than today. Without an earthly greenhouse to trap the sun’s energy and warm the atmosphere, our world would be a spinning ball of ice. Life may never have evolved.
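
A simple energy-balance estimate shows why a 20 to 25 percent fainter Sun matters so much; this is a zero-dimensional sketch with an assumed round-number solar constant and albedo, not a result from the Stanford study:

```python
SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
S_NOW = 1361.0     # present-day solar constant, W m^-2 (assumed)
ALBEDO = 0.3       # assumed planetary albedo

def equilibrium_temp_k(solar_constant):
    """Equilibrium temperature of a planet with no greenhouse effect."""
    return (solar_constant * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25

print(f"Today, no greenhouse:    {equilibrium_temp_k(S_NOW):.0f} K")         # ~255 K, below freezing
print(f"Faint young Sun (75%):   {equilibrium_temp_k(0.75 * S_NOW):.0f} K")  # ~237 K, colder still
```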

But life did evolve, so greenhouse gases must have been around to warm the Earth. Evidence from the geologic record indicates an abundance of the greenhouse gas carbon dioxide. Methane probably was present as well, but that greenhouse gas doesn’t leave enough of a geologic footprint to detect with certainty. Rocks from the era indicate that molecular oxygen wasn’t around: they contain iron carbonate instead of iron oxide. Stone fingerprints of flowing streams, liquid oceans and minerals formed from evaporation confirm that 3 billion years ago, Earth was warm enough for liquid water.

Now, the geologic record revealed in some of Earth’s oldest rocks is telling a surprising tale of collapse of that greenhouse — and its subsequent regeneration. But even more surprising, say the Stanford scientists who report these findings in the May 25 issue of the journal Geology, is the critical role that rocks played in the evolution of the early atmosphere.

“This is really the first time we’ve tried to put together a picture of how the early atmosphere, early climate and early continental evolution went hand in hand,” said Donald R. Lowe, a professor of geological and environmental science who wrote the paper with Michael M. Tice, a graduate student investigating early life. NASA’s Exobiology Program funded their work. “In the geologic past, climate and atmosphere were really profoundly influenced by development of continents.”

The record in the rocks
To piece together geologic clues about what the early atmosphere was like and how it evolved, Lowe, a field geologist, has spent virtually every summer since 1977 in South Africa or Western Australia collecting rocks that are, literally, older than the hills. Some of the Earth’s oldest rocks, they are about 3.2 to 3.5 billion years old.

“The further back you go, generally, the harder it is to find a faithful record, rocks that haven’t been twisted and squeezed and metamorphosed and otherwise altered,” Lowe says. “We’re looking back just about as far as the sedimentary record goes.”

After measuring and mapping rocks, Lowe brings samples back to Stanford to cut into sections so thin that their features can be revealed under a microscope. Collaborators participate in geochemical and isotopic analyses and computer modeling that further reveal the rocks’ histories.

The geologic record tells a story in which continents removed the greenhouse gas carbon dioxide from an early atmosphere that may have been as hot as 70 degrees Celsius (158 F). At this time the Earth was mostly ocean. It was too hot to have any polar ice caps. Lowe hypothesizes that rain combined with atmospheric carbon dioxide to make carbonic acid, which weathered jutting mountains of newly formed continental crust. Carbonic acid dissociated to form hydrogen ions, which found their way into the structures of weathering minerals, and bicarbonate, which was carried down rivers and streams to be deposited as limestone and other minerals in ocean sediments.
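
For reference, the weathering chemistry described above follows the standard carbonate-silicate reactions summarized below (a textbook outline consistent with the description, not equations taken from the paper; CaSiO3 stands in for weathering silicate minerals generally):

    CO2 + H2O -> H2CO3                     (rain takes up carbon dioxide as carbonic acid)
    H2CO3 -> H+ + HCO3-                    (carbonic acid dissociates into hydrogen ions and bicarbonate)
    CaSiO3 + 2 H+ -> Ca2+ + SiO2 + H2O     (hydrogen ions attack silicate minerals in the crust)
    Ca2+ + 2 HCO3- -> CaCO3 + CO2 + H2O    (bicarbonate carried to the sea is deposited as limestone)

The net effect of each cycle is to lock one molecule of atmospheric carbon dioxide into carbonate sediment.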

Over time, great slabs of oceanic crust were pulled down, or subducted, into the Earth’s mantle. The carbon that was locked into this crust was essentially lost, tied up for the 60 million years or so that it took the minerals to get recycled back to the surface or outgassed through volcanoes.

The hot early atmosphere probably contained methane too, Lowe says. As carbon dioxide levels fell due to weathering, at some point, levels of carbon dioxide and methane became about equal, he conjectures. This caused the methane to aerosolize into fine particles, creating a haze akin to that which today is present in the atmosphere of Saturn’s moon Titan. This “Titan Effect” occurred on Earth 2.7 to 2.8 billion years ago.

The Titan Effect removed methane from the atmosphere and the haze filtered out light; both caused further cooling, perhaps a temperature drop of 40 to 50 degrees Celsius. Eventually, about 3 billion years ago, the greenhouse just collapsed, Lowe and Tice theorize, and the Earth’s first glaciation may have occurred 2.9 billion years ago.

The rise after the fall
Here the rocks reveal an odd twist in the story — eventual regeneration of the greenhouse. Recall that 3 billion years ago, Earth was essentially Waterworld. There weren’t any plants or animals to affect the atmosphere. Even algae hadn’t evolved yet. Primitive photosynthetic microbes were around and may have played a role in the generation of methane and minor usage of carbon dioxide.

As long as rapid continental weathering continued, carbonate was deposited on the oceanic crust and subducted into what Lowe calls “a big storage facility … that kept most of the carbon dioxide out of the atmosphere.”

But as carbon dioxide was removed from the atmosphere and incorporated into rock, weathering slowed down — there was less carbonic acid to erode mountains and the mountains were becoming lower. But volcanoes were still spewing into the atmosphere large amounts of carbon from recycled oceanic crust.

“So eventually the carbon dioxide level climbs again,” Lowe says. “It may never return to its full glorious 70 degrees Centigrade level, but it probably climbed to make the Earth warm again.”

This summer, Lowe and Tice will collect samples that will allow them to determine the temperature of this time interval, about 2.6 to 2.7 billion years ago, to get a better idea of how hot Earth got.

New continents formed and weathered, again taking carbon dioxide out of the atmosphere. About 3 billion years ago, maybe 10 or 15 percent of the Earth’s present area in continental crust had formed. By 2.5 billion years ago, an enormous amount of new continental crust had formed — about 50 to 60 percent of the present area of continental crust. During this second cycle, weathering of the larger amount of rock caused even greater atmospheric cooling, spurring a profound glaciation about 2.3 to 2.4 billion years ago.

Over the past few million years we have been oscillating back and forth between glacial and interglacial epochs, Lowe says. We are in an interglacial period right now. It’s a transition — and scientists are still trying to understand the magnitude of global climate change caused by humans in recent history compared to that caused by natural processes over the ages.

“We’re disturbing the system at rates that greatly exceed those that have characterized climatic changes in the past,” Lowe said. “Nonetheless, virtually all of the experiments, virtually all of the variations and all of the climate changes that we’re trying to understand today have happened before. Nature’s done most of these experiments already. If we can analyze ancient climates, atmospheric compositions and the interplay among the crust, atmosphere, life and climate in the geologic past, we can take some first steps at understanding what is happening today and likely to happen tomorrow.”

Original Source: Stanford News Release