Supernova Left No Core Behind

The 1987A supernova remnant doesn’t seem to have a neutron star. Image credit: Hubble. Click to enlarge.
In 1987, earthbound observers saw a star explode in the nearby dwarf galaxy called the Large Magellanic Cloud. Astronomers eagerly studied this supernova, the closest seen in the past 300 years, and have continued to examine its remains. Although its blast wave has lit up surrounding clouds of gas and dust, the supernova appears to have left no core behind. Astronomers now report that even the sharp eyes of the Hubble Space Telescope failed to locate the black hole or ultracompact neutron star they believe was created by the star’s death 18 years ago.

“We think a neutron star was formed. The question is: Why don’t we see it?” said astronomer Genevieve Graves of UC Santa Cruz, first author on the paper announcing these results.

“Therein lies the mystery: where is that missing neutron star?” mused co-author Robert Kirshner of the Harvard-Smithsonian Center for Astrophysics (CfA).

When a massive star explodes, it leaves behind some sort of compact object, either a city-sized ball of subatomic particles called a neutron star, or a black hole. The outcome depends on the mass of the progenitor star. Smaller stars form neutron stars while larger stars form black holes.

The progenitor of supernova (SN) 1987A weighed 20 times as much as the sun, placing it right on the dividing line and leaving astronomers uncertain about what type of compact object it produced. All observations to date have failed to detect a light source in the center of the supernova remnant, leaving the question of the outcome unanswered.

Detecting a black hole or neutron star is challenging. A black hole can be detected only when it swallows matter, because the matter heats up and emits light as it falls into the black hole. A neutron star at the distance of the Large Magellanic Cloud can be detected only when it emits beams of radiation as a pulsar, or when it accretes hot matter like a black hole.

“A neutron star could just be sitting there inside SN 1987A, not accreting matter and not emitting enough light for us to see,” said astronomer Peter Challis (CfA), second author on the study.

Observations have ruled out the possibility of a pulsar within SN 1987A. Even if the pulsar’s beams were not aimed at the earth, they would light the surrounding gas clouds. However, theories predict that it can take anywhere from 100 to 100,000 years for a pulsar to form following a supernova, because the neutron star must gain a sufficiently strong magnetic field to power the pulsar beam. SN 1987A may be too young to hold a pulsar.

As a result, the only way astronomers might detect the central object is to search for evidence of matter accreting onto either a neutron star or a black hole. That accretion could happen in one of two ways: spherical accretion in which matter falls in from all directions, or disk accretion in which matter spirals inward from a disk onto the compact object.

The Hubble data rule out spherical accretion because light from that process would be bright enough to detect. If disk accretion is taking place, the light it generates is very faint, meaning that the disk itself must be quite small in both mass and radial extent. Also, the lack of detectable radiation indicates that the disk accretion rate must be extremely low: less than about one-fifth the mass of the Moon per year.
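To put that upper limit in more familiar astrophysical units, a quick back-of-envelope conversion helps. The constants below are standard textbook values, not figures taken from the paper itself:

```python
# Convert the quoted accretion-rate limit ("less than about one-fifth
# the mass of the Moon per year") into kg/s and solar masses per year.
# Constants are standard values, not from the paper itself.

MOON_MASS_KG = 7.35e22      # mass of the Moon
SUN_MASS_KG = 1.989e30      # mass of the Sun
SECONDS_PER_YEAR = 3.156e7

limit_kg_per_year = MOON_MASS_KG / 5.0
limit_kg_per_second = limit_kg_per_year / SECONDS_PER_YEAR
limit_solar_masses_per_year = limit_kg_per_year / SUN_MASS_KG

print(f"{limit_kg_per_second:.2e} kg/s")             # roughly 5e14 kg/s
print(f"{limit_solar_masses_per_year:.1e} Msun/yr")  # roughly 7e-9 solar masses per year
```

Either way it is expressed, the limit is minuscule by the standards of accreting compact objects, which is why any disk present must be so faint.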

In the absence of a definitive detection, astronomers hope to learn more about the central object by studying the dust clouds surrounding it. That dust absorbs visible and ultraviolet light and re-radiates the energy at infrared wavelengths.

“By studying that reprocessed light, we hope to find out what’s powering the supernova remnant and lighting the dust,” said Graves. Future observations by NASA’s Spitzer Space Telescope should provide new clues to the nature of the hidden object.

Additional observations by Hubble also could help solve the mystery. “Hubble is the only existing facility with the resolution and sensitivity needed to study this problem,” said Kirshner.

The paper describing these findings is online at http://arxiv.org/abs/astro-ph/0505066

Original Source: CfA News Release

Opportunity Rolls Free from the Dune

A view back into the sand dune that had captured Opportunity. Image credit: NASA/JPL. Click to enlarge.
NASA’s Mars Exploration Rover mission engineers and managers cheered when images from the Martian surface confirmed Opportunity successfully escaped from a sand trap.

From about 108 million miles away, the rover team at NASA’s Jet Propulsion Laboratory (JPL), Pasadena, Calif., had worked diligently for nearly five weeks to extricate the rover. The long-distance roadside assistance was a painstaking operation to free the six-wheeled rover, which was mired in the soft sand of a small Martian dune.

“After a nerve-wracking month of hard work, the rover team is both elated and relieved to finally see our wheels sitting on top of the sand instead of half buried in it,” said Jeffrey Biesiadecki, a JPL rover mobility engineer.

Traction was difficult in the ripple-shaped dune of windblown dust and sand that Opportunity drove into on April 26. In the weeks that followed, the rover churned through the equivalent of 629 feet of wheel rotations before gaining enough traction to move forward three feet. The rover team directed the drives in cautious increments from May 13 through last Saturday.
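Those two figures imply just how little grip the wheels had. A trivial calculation based on the numbers in the article:

```python
# Rough traction figures implied by the article: the equivalent of
# 629 feet of commanded wheel rotation produced only about 3 feet of
# actual forward progress through the dune.

commanded_ft = 629.0   # wheel-rotation distance commanded
progress_ft = 3.0      # actual forward progress achieved

efficiency = progress_ft / commanded_ft
slip = 1.0 - efficiency

print(f"drive efficiency: {efficiency:.2%}")  # about 0.48%
print(f"wheel slip:       {slip:.2%}")        # about 99.52%
```

In other words, the wheels were slipping more than 99 percent of the time, which is why the extraction took five weeks of tiny, carefully monitored drives.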

“We did careful testing for how to get Opportunity out of the sand. Then we patiently followed the strategy developed from the testing, monitoring every step of the way,” Biesiadecki said. “We hope to have Opportunity busy with a full schedule of scientific exploration again shortly,” he added.

Opportunity’s next task is to examine the site to provide a better understanding of what makes that ripple different from the dozens of similar ones the rover easily crossed. “After we analyze this area, we’ll be able to plan safer driving in the terrain ahead,” said JPL’s Jim Erickson, rover project manager.

Both Spirit and Opportunity have worked in harsh Martian conditions much longer than anticipated. They have been studying geology on opposite sides of Mars for more than a year of extended missions since successfully completing their three-month primary missions in April 2004.

“The first thing we’re going to do is simply take a hard look at the stuff we were stuck in,” said Dr. Steve Squyres of Cornell University, Ithaca, N.Y. He is the principal investigator for the Mars rovers’ science instruments. “After that, we will begin a cautious set of moves to get us on our way southward again. South is where we think the best science is, so that’s still where we want to go,” he added.

Shortly after landing in January 2004, Opportunity found layered bedrock that bore geological evidence for a shallow ancient sea. Spirit did not find extensive layered bedrock until more than a year later, after driving more than two miles and climbing into a range of hills known as “Columbia Hills.”

Original Source: NASA News Release

Strange Ozone Hole this Year

Changing ozone hole. Image credit: NASA/JPL. Click to enlarge.
Despite near-record levels of chemical ozone destruction in the Arctic this winter, observations from NASA’s Aura spacecraft showed that other atmospheric processes restored ozone amounts to near average and stopped high levels of harmful ultraviolet radiation from reaching Earth’s surface.

Analyses from Aura’s Microwave Limb Sounder indicated Arctic chemical ozone destruction this past winter peaked at near 50 percent in some regions of the stratosphere, a region of Earth’s atmosphere that begins about 8 to 12 kilometers (5 to 7 miles) above Earth’s surface at the poles. This was the second highest level ever recorded, behind the 60 percent level estimated for the 1999-2000 winter. Data from another instrument on Aura, the Ozone Monitoring Instrument, found the total amount of ozone over the Arctic this past March was similar to other recent years when much less chemical ozone destruction occurred. So what tempered the ozone loss? The answer appears to lie in this year’s unusual Arctic atmospheric conditions.

“This was one of the most unusual Arctic winters ever,” said scientist Dr. Gloria Manney of NASA’s Jet Propulsion Laboratory, Pasadena, Calif., who led the Microwave Limb Sounder analyses. “Arctic lower stratospheric temperatures were the lowest on record. But other conditions like wind patterns and air motions were less conducive to ozone loss this year.”

While the Arctic polar ozone was being chemically destroyed toward the end of winter, stratospheric winds shifted and transported ozone-rich air from Earth’s middle latitudes into the Arctic polar region, resulting in little net change in the total amount of ozone. As a result, harmful ultraviolet radiation reaching Earth’s surface remained at near-normal levels.

Imagery and an animation depicting the Microwave Limb Sounder and Ozone Monitoring Instrument 2005 Arctic ozone observations may be viewed at:

http://www.nasa.gov/vision/earth/lookingatearth/ozone-aura.html

Extensive ozone loss occurs each winter over Antarctica (the “ozone hole”) due to the extreme cold there and its strong, long-lived polar vortex (a band of winds that forms each winter at high latitudes). This vortex isolates the region from middle latitudes. In contrast, the Arctic winter is warmer and its vortex is weaker and shorter-lived. As a result, Arctic ozone loss has always been lower, more variable and much more difficult to quantify.

This was the first Arctic winter monitored by Aura, which was launched in July 2004. Aura’s Microwave Limb Sounder is contributing to our understanding of the processes that cause Arctic wind patterns to push ozone-rich air to the Arctic lower stratosphere from higher altitudes and lower latitudes. Through Aura’s findings, scientists can differentiate chemical ozone destruction from ozone level changes caused by air motions, which vary dramatically from year to year.

“Understanding Arctic ozone loss is critical to diagnosing the health of Earth’s ozone layer,” said Dr. Phil DeCola, Aura program scientist at NASA Headquarters, Washington. “Previous attempts to quantify Arctic ozone loss have suffered from a lack of data. With Aura, we now have the most comprehensive, simultaneous, global daily measurements of many of the key atmospheric gases needed to understand and quantify chemical ozone destruction.”

Ozone loss in Earth’s stratosphere is caused primarily by chemical reactions with chlorine from human-produced compounds like chlorofluorocarbons. When stratospheric temperatures drop below minus 78 degrees Celsius (minus 108 degrees Fahrenheit), polar stratospheric clouds form. Chemical reactions on the surfaces of these clouds activate chlorine, converting it into forms that destroy ozone when exposed to sunlight.
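The temperature threshold described above can be sketched as a simple check. This is an illustrative snippet only; the −78 °C figure comes from the article, while the helper names are hypothetical:

```python
# Sketch of the threshold process described above: polar stratospheric
# clouds (PSCs), the surfaces on which chlorine is activated, form only
# when stratospheric temperatures drop below about -78 C (-108 F).

PSC_THRESHOLD_C = -78.0

def celsius_to_fahrenheit(t_c):
    """Standard Celsius-to-Fahrenheit conversion."""
    return t_c * 9.0 / 5.0 + 32.0

def psc_possible(temp_c):
    """True if the stratosphere is cold enough for PSCs to form."""
    return temp_c < PSC_THRESHOLD_C

print(celsius_to_fahrenheit(PSC_THRESHOLD_C))  # about -108.4 F
print(psc_possible(-85.0))   # a record-cold Arctic winter: PSCs can form
print(psc_possible(-70.0))   # a warmer winter: too warm for PSCs
```

This is why the record-cold lower stratosphere of the 2004-2005 winter produced so much chlorine activation, even though air motions ultimately replenished the ozone.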

The data obtained by Aura were independently confirmed by instruments participating in NASA’s Polar Aura Validation Experiment, which flew underneath Aura as it passed over the polar vortex. The experiment, flown on NASA’s DC-8 flying laboratory from NASA’s Dryden Flight Research Center, Edwards, Calif., carried 10 instruments to measure temperatures, aerosols, ozone, nitric acid and other gases. The experiment was carried out in January and February 2005.

Aura is the third and final major Earth Observing System satellite. Aura carries four instruments: the Ozone Monitoring Instrument, built by the Netherlands and Finland in collaboration with NASA; the High Resolution Dynamics Limb Sounder, built by the United Kingdom and the United States; and the Microwave Limb Sounder and Tropospheric Emission Spectrometer, both built by JPL. Aura is managed by NASA’s Goddard Space Flight Center, Greenbelt, Md.

For more information on Aura on the Internet, visit: http://aura.gsfc.nasa.gov/

For more information on the Microwave Limb Sounder on the Internet, visit: http://mls.jpl.nasa.gov/

JPL is managed for NASA by the California Institute of Technology in Pasadena.

Original Source: NASA/JPL News Release

View Through the Rings

Cassini’s beautiful view of Saturn, looking through its rings. Image credit: NASA/JPL/SSI. Click to enlarge.
In this fabulous close-up, Cassini peers directly through regions of the A, B and C rings (from top to bottom here) to glimpse shadows of the very same rings cast upon the planet’s atmosphere. Near the top, shadows cast by ringlets in the Cassini division (center) look almost like a photo negative.

This type of image helps scientists probe the rings’ structure in detail and provides information about the density of their constituent particles.

The image was taken in visible light with the Cassini spacecraft narrow-angle camera on April 26, 2005, at a distance of approximately 2.3 million kilometers (1.4 million miles) from Saturn. The image scale is 14 kilometers (9 miles) per pixel.
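The quoted image scale follows directly from the spacecraft's distance via the small-angle approximation. A quick check of the numbers in the caption:

```python
import math

# The caption quotes 14 km per pixel at a distance of 2.3 million km.
# Under the small-angle approximation, the per-pixel angle is simply
# scale / distance (in radians).

distance_km = 2.3e6
scale_km_per_pixel = 14.0

angle_rad = scale_km_per_pixel / distance_km
angle_arcsec = math.degrees(angle_rad) * 3600.0

print(f"{angle_arcsec:.2f} arcsec per pixel")  # about 1.26 arcsec
```

That sub-arcsecond-to-arcsecond sampling is what lets the narrow-angle camera resolve individual ringlets and their shadows on the cloud tops.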

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA’s Science Mission Directorate, Washington, D.C. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging team is based at the Space Science Institute, Boulder, Colo.

For more information about the Cassini-Huygens mission visit http://saturn.jpl.nasa.gov . For additional images visit the Cassini imaging team homepage http://ciclops.org .

Original Source: NASA/JPL/SSI News Release

Mars Phoenix Mission Prepares for 2007 Launch

A mock up of the Phoenix lander. Image credit: NASA/Lockheed Martin. Click to enlarge.
NASA has given the green light to a project to put a long-armed lander onto the icy ground of the far-northern Martian plains. NASA’s Phoenix lander is designed to examine the site’s shallow water ice as a potential habitat, and to look for possible indicators of life, past or present.

Today’s announcement allows the Phoenix mission to proceed with preparing the spacecraft for launch in August 2007. This major milestone followed a critical review of the project’s planning progress and preliminary design, since its selection in 2003.

Phoenix is the first project in NASA’s Mars Scout Program of competitively selected missions. Scouts are innovative and relatively low-cost complements to the core missions of the agency’s Mars exploration program.

“The Phoenix Mission explores new territory in the northern plains of Mars analogous to the permafrost regions on Earth,” said the project’s principal investigator, Peter Smith of the University of Arizona, Tucson. “NASA’s confirmation supports this project and may eventually lead to discoveries relating to life on our neighboring planet.”

Phoenix is a stationary lander. It has a robotic arm to dig down to the Martian ice layer and deliver samples to sophisticated analytical instruments on the lander’s deck. It is specifically designed to measure volatiles, such as water and organic molecules, in the northern polar region of Mars. In 2002, NASA’s Mars Odyssey orbiter found evidence of ice-rich soil very near the surface in the arctic regions.

Like its namesake, Phoenix rises from ashes, carrying the legacies of two earlier attempts to explore Mars. The 2001 Mars Surveyor lander, administratively mothballed in 2000, is being resurrected for Phoenix. Many of the scientific instruments for Phoenix were built or designed for that mission or flew on the unsuccessful Mars Polar Lander in 1999.

“The Phoenix team’s quick response to the Odyssey discoveries and the cost-saving adaptation of earlier missions’ technology are just the kind of flexibility the Mars Scout Program seeks to elicit,” said NASA’s Mars Exploration Program Director, Doug McCuistion.

“Phoenix revives pieces of past missions in order to take NASA’s Mars exploration into an exciting future,” said NASA’s Director, Solar System Division, Science Mission Directorate, Andrew Dantzler.

The cost of the Phoenix mission is $386 million, which includes the launch. The partnership developing the Phoenix mission includes the University of Arizona; NASA’s Jet Propulsion Laboratory (JPL), Pasadena, Calif.; Lockheed Martin Space Systems, Denver; and the Canadian Space Agency, which is providing weather-monitoring instruments.

“The confirmation review is an important step for all major NASA missions,” said JPL’s Barry Goldstein, project manager for Phoenix. “This approval essentially confirms NASA’s confidence that the spacecraft and science instruments will be successfully built and launched, and that once the lander is on Mars, the science objectives can be successfully achieved.”

Much work lies ahead. Team members will assemble and test every subsystem on the spacecraft and science payload to show they comply with design requirements. Other tasks include selecting a landing site, which should be aided by data provided by the Mars Reconnaissance Orbiter launching in August, and preparing to operate the spacecraft after launch.

JPL, a division of the California Institute of Technology, Pasadena, manages Phoenix for NASA’s Science Mission Directorate.

Original Source: NASA/JPL News Release

Following the Dust Trail

Halley’s Comet. Image credit: MPAE. Click to enlarge.
A Professor Emeritus of the Max Planck Institute, Dr. Kissel has had a life-long devotion to the study of comets. “In the early 20th century, comet tails led to the postulation and later to the detection of the ‘solar wind’, a stream of ionized atoms constantly blown away from the sun. As astronomical observations became more powerful, more and more constituents could be identified, both solid-state particles and gaseous molecules, neutral and ionized.” As our techniques for studying these outer solar system visitors became more refined, so have our theories of what they might be composed of, and what they look like. Says Kissel, “Many models have been proposed to describe the dynamic appearance of a comet, of which Fred Whipple’s was apparently the most promising. It postulated a nucleus made up of water-ice and dust. Under the influence of the sun, the water-ice would sublime and accelerate dust particles along its way.”

Still, they were a mystery, a mystery that science was eager to solve. “Not until Halley was it known that many comets are part of our solar system and orbit the sun just like the planets do, just on other types of orbits and with additional effects due to the emission of materials,” comments Kissel. But only by getting up close and personal with a comet were we able to discover far more. With Halley’s return to our inner solar system, plans were made to catch a comet, and the spacecraft’s name was Giotto.

Giotto’s mission was to obtain color photographs of the nucleus, determine the elemental and isotopic composition of volatile components in the cometary coma, study the parent molecules, and help us understand the physical and chemical processes that occur in the cometary atmosphere and ionosphere. Giotto would be the first to investigate the macroscopic systems of plasma flows resulting from the cometary-solar wind interaction. High on its list of priorities was measuring the gas production rate and determining the elemental and isotopic composition of the dust particles. Critical to the scientific investigation was the dust flux: its size and mass distribution and the crucial dust-to-gas ratio. As the on-board cameras imaged the nucleus from 596 km away, determining its shape and size, the spacecraft also monitored structures in the dust coma and studied the gas with both neutral and ion mass spectrometers. As science suspected, the Giotto mission found the gas to be predominantly water, but it also contained carbon monoxide, carbon dioxide and various hydrocarbons, as well as traces of iron and sodium.

As a research team leader for the Giotto mission, Dr. Kissel recalls, “When the first close-up missions to comet 1P/Halley came along, a nucleus was clearly identified in 1986. It was also the first time that dust particles and the gases released by the comet were analyzed in situ, that is, without man-made interference or transportation back to the ground.” It was an exciting time in cometary research: through Giotto’s instrumentation, researchers like Kissel could now study data like never before. “These first analyses showed that the particles are all an intimate mixture of high-mass organic material and very small dust particles. The biggest surprise was certainly the very dark nucleus (reflecting only 5% of the light shining onto it) and the amount and complexity of the organic material.”

But was a comet truly something more, or just a dirty snowball? “Up until today there is, to my knowledge, no measurement showing the existence of solid water ice exposed on a cometary surface,” says Kissel. “However, we found that water (H2O) as a gas could be released by chemical reactions going on when the comet is increasingly heated by the sun. The reason could be ‘latent heat’, that is, energy stored in the very cold cometary material, which acquired that energy through bond breaking caused by intense cosmic radiation while the dust was traveling through interstellar space. Very close to the model for which the late J. Mayo Greenberg argued for years.”

We now know Comet Halley consisted of the most primitive material known to us in the solar system. With the exception of nitrogen, the light elements were quite similar in abundance to those of our own Sun. Several thousand dust particles were found to contain hydrogen, carbon, nitrogen and oxygen, as well as mineral-forming elements such as sodium, magnesium, silicon, calcium and iron. Because the lighter elements were discovered far from the nucleus, we knew they were not cometary ice particles. From our studies of the chemistry of interstellar gas surrounding stars, we have learned how carbon-chain molecules react with elements such as nitrogen, oxygen and, to a very small extent, hydrogen. In the extreme cold of space, they can polymerize, rearranging the molecular structure of these compounds to form new ones. The new compounds would have the same percentage composition as the originals, but a greater molecular weight and different properties. But what are those properties?

Thanks to some very accurate information from the probe’s close encounter with Comet Halley, Ranjan Gupta of the Inter-University Centre for Astronomy and Astrophysics (IUCAA) and his colleagues have made some very interesting findings about cometary dust composition and scattering properties. Because the first missions to comets were fly-bys, all the material captured was analyzed in situ. This type of analysis showed that cometary materials are generally a mixture of silicates and carbon, in both amorphous and crystalline structures, formed in the matrix. Once the water evaporates, these grains range in size from sub-micron to micron, are highly porous, and take on non-spherical, irregular shapes.

According to Gupta, most of the early models of light scattering from such grains were “based on solid spheres with conventional Mie theory, and only in recent years, when the space missions provided strong evidence against this, have new models been evolved where non-spherical and porous grains have been used to reproduce the observed phenomenon”. In this case, linear polarization is produced by the comet from the incident solar light. Confined to a plane defined by the direction from which the light is scattered, it varies as the comet approaches or recedes from the Sun. As Gupta explains, “An important feature of this polarization curve versus the scattering angle (referred to the sun-earth-comet geometry) is that there is some degree of negative polarization.”

Known as ‘back scattering’, this negative polarization occurs when monitoring a single wavelength of light (monochromatic light). The Mie algorithm models all of the accepted scattering processes caused by a spherical shape, taking into account external reflection, multiple internal reflections, transmission and surface waves. The intensity of scattered light varies as a function of the scattering angle, where 0° implies forward scattering, away from the light’s original direction, while 180° implies back scattering, back toward the source of the light.
According to Gupta, “Back scattering is seen in most of the comets, generally in the visible bands, and for some comets in the near-infrared (NIR) bands.” At present, models attempting to reproduce this negative polarization at high scattering angles have had very limited success.
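The angular dependence of polarization can be illustrated with a deliberately simple case: grains much smaller than the wavelength (the Rayleigh limit), far simpler than the Mie and non-spherical models discussed here, where the degree of linear polarization follows a closed-form curve:

```python
import math

# Degree of linear polarization vs. scattering angle for grains much
# smaller than the wavelength (the Rayleigh limit), a far simpler case
# than the Mie and non-spherical models discussed in the article:
#     P(theta) = (1 - cos^2 theta) / (1 + cos^2 theta)

def rayleigh_polarization(theta_deg):
    c2 = math.cos(math.radians(theta_deg)) ** 2
    return (1.0 - c2) / (1.0 + c2)

for theta in (0, 45, 90, 135, 180):
    print(theta, round(rayleigh_polarization(theta), 3))
# P is zero in the forward (0 deg) and backward (180 deg) directions
# and peaks at 90 deg.
```

Note that this simple curve is never negative: producing the observed negative-polarization branch at large scattering angles requires the non-spherical, porous grain models the researchers describe, which is precisely why those models are needed.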

Their study used a modified DDA (discrete dipole approximation), in which each dust grain is modeled as an array of small dipoles. The concept of a dipole comes from chemistry: a great range of molecules contain bonds that fall between the ionic and covalent extremes. The difference between the electronegativities of the atoms is large enough that the electrons are not shared equally, but small enough that the electrons are not pulled entirely to one atom to form positive and negative ions. Such a bond is called polar, because it has positive and negative ends, or poles, and it gives the molecule a dipole moment.

These dipoles interact with each other to produce light-scattering effects such as extinction (spheres larger than the wavelength of light will block both monochromatic and white light) and polarization (the redirection of the incoming light wave). Using a model of composite grains with a matrix of graphite and silicate spheroids, the team found that a very specific grain size range may be required to explain the observed properties of cometary dust. “However, our model is also unable to reproduce the negative branch of polarization which is observed in some comets. Not all comets show this phenomenon in the NIR band of 2.2 microns.”

The composite grain models developed by Gupta et al. will need to be refined further to explain the negative polarization branch, as well as the amount of polarization at various wavelengths; in this case, it is a color effect, with higher polarization in red than in green light. More extensive laboratory simulations of composite grains are upcoming, and “the study of their light scattering properties will help in refining such models.”

Mankind’s successful beginnings at following this cometary dust trail started with Halley. Vega 1, Vega 2 and Giotto provided the models needed to build better research equipment. In May 2000, Drs. Franz R. Krueger and Jochen Kissel of the Max Planck Institute published their findings as “First Direct Chemical Analysis of Interstellar Dust”. Says Dr. Kissel, “Three of our dust impact mass spectrometers (PIA on board GIOTTO, and PUMA-1 and -2 on board VEGA-1 and -2) encountered Comet Halley. With those we were able to determine the elementary composition of the cometary dust. Molecular information, however, was only marginal.” Deep Space 1’s close encounter with Comet Borrelly returned the best images and other science data received up to that time. On the Borrelly team, Dr. Kissel replies, “The more recent mission to Borrelly (and STARDUST) showed fascinating details of the comet surface, such as steep 200 m high slopes and spires some 20 m wide and 200 m high.”

Despite the mission’s many problems, Deep Space 1 proved to be a total success. According to Dr. Mark Rayman’s December 18, 2001 Mission Log, “The wealth of science and engineering data returned by this mission will be analyzed and used for years to come. The testing of high risk, advanced technologies means that many important future missions that otherwise would have been unaffordable or even impossible now are within our grasp. And as all macroscopic readers know, the rich scientific harvest from comet Borrelly is providing scientists fascinating new insights into these important members of the solar system family.”

Now Stardust has taken our investigations just one step further. Collecting these primitive particles from Comet Wild 2, the dust grains will be stored safely in aerogel for study upon the probe’s return. Stardust principal investigator Donald Brownlee says, “Comet dust will also be studied in real time by a time-of-flight mass spectrometer derived from the PIA instrument carried to comet Halley on the Giotto mission. This instrument will provide data on the organic particle materials that may not survive aerogel capture, and it will provide an invaluable data set that can be used to evaluate the diversity among comets by comparison with Halley dust data recorded with the same technique.”

These very particles might contain an answer, explaining how interstellar dust and comets may have seeded life on Earth by providing the physical and chemical elements crucial to its development. According to Brownlee, “Stardust captured thousands of comet particles that will be returned to Earth for analysis, in intimate detail, by researchers around the world.” These dust samples will allow us to look back some 4.5 billion years, teaching us about the fundamental nature of interstellar grains and other solid materials, the very building blocks of our own solar system. The atoms found on Earth and in our own bodies are the same materials as those released by comets.

And it just keeps getting better. Now en route to Comet 67P/Churyumov-Gerasimenko, ESA’s Rosetta will delve deeper into the mystery of comets as it attempts a landing on the surface. According to ESA, the Grain Impact Analyser and Dust Accumulator (GIADA) “will measure the number, mass, momentum, and velocity distribution of dust grains coming from the comet nucleus and from other directions (reflected by solar radiation pressure)”, while the Micro-Imaging Dust Analysis System (MIDAS) “will study the dust environment around the comet. It will provide information on particle population, size, volume, and shape.”

A single cometary particle could be a composite of millions of individual interstellar dust grains, offering new insight into galactic and nebular processes and increasing our understanding of both comets and stars. Although we have produced amino acids in laboratory conditions that simulate what may occur in a comet, most of our information has been obtained indirectly. By understanding polarization, wavelength absorption, scattering properties and the shape of a silicate feature, we gain valuable knowledge about the physical properties of what we have yet to explore. Rosetta’s goal will be to carry a lander to a comet’s nucleus and deploy it on the surface. The lander science will focus on in-situ study of the composition and structure of the nucleus, an unparalleled study of cometary material, providing researchers like Dr. Jochen Kissel with valuable information.

On July 4, 2005, the Deep Impact mission will arrive at Comet Tempel 1. Buried beneath its surface may be even more answers. In an effort to form a new crater on the comet’s surface, a 370 kg mass will be released to impact Tempel 1’s sunlit side. The result will be a fresh ejection of ice and dust particles, and observing the changes in activity will further our understanding of comets. The fly-by craft will monitor the structure and composition of the crater’s interior, relaying data back to cometary dust experts like Kissel. “Deep Impact will be the first to simulate a natural event, the impact of a solid body onto a comet nucleus. The advantage is that the impact time is well known, and a properly equipped spacecraft is around when the impact occurs. This will definitely provide information on what is below the surfaces of which we have pictures from the previous missions. Many theories have been formulated to describe the thermal behavior of the comet nucleus, requiring crusts thick or thin and/or other features. I’m sure all these models will have to be complemented by new ones after Deep Impact.”

After a lifetime of cometary research, Dr. Kissel is still following the dust trail: “It’s the fascination of comet research that after each new measurement there are new facts which show us how wrong we were. And that is still on a rather global level.” As our methods improve, so does our understanding of these visitors from the Oort Cloud. Says Kissel, “The situation is not simple, as many simple models describe the global cometary activities pretty well, while details still have to be worked out, and models including the chemistry aspects are not yet available.” For a man who has been there since the very beginning, working with Deep Impact continues a distinguished career. “It’s exciting to be part of it,” says Dr. Kissel, “and I am eager to see what happens after the Deep Impact and grateful to be a part of it.”

For the very first time, studies will go well beneath the surface of a comet, revealing its pristine materials – untouched since its formation. What lies beneath the surface? Let’s hope spectroscopy shows carbon, hydrogen, nitrogen and oxygen. These are known to produce organic molecules, starting with the basic hydrocarbons, such as methane. Will these processes have increased in complexity to create polymers? Will we find the basis for carbohydrates, saccharides, lipids, glycerides, proteins and enzymes? Following the dust trail might very well lead to the foundation of the most spectacular of all organic matter – deoxyribonucleic acid, or DNA.

Written by Tammy Plotner

Recent Blast was Probably a Neutron Star Collision

Swift’s X-Ray telescope captured this image of GRB050509b embedded in the diffuse X-ray emission associated with the galaxy cluster. Image credit: NASA. Click to enlarge.
Two billion years and 25 days ago, an event destined to be a watershed in the astronomical community took place in a distant galaxy: a blast of gamma rays lasting a mere thirtieth of a second. The aptly-named Swift observatory ‘saw’ the gammas with its Burst Alert Telescope (BAT) instrument, worked out roughly where they were coming from, and turned its X-ray and UV telescopes toward the spot. The international GCN (GRB Coordinates Network) lit up with notices from observatories all over the world (and out in space), reporting what they found when they looked there. Data came in from Namibia, the Canaries, the continental US, Chile, India, the Netherlands, and above all Hawaii. The world’s leading optical telescopes – the VLT, the Kecks, Gemini, Subaru – all swung into action; the electromagnetic spectrum was covered from extremely high energy gammas to the radio.

And all for what? A few dozen gamma rays plus about a dozen X-rays? Astronomers have known for over a decade that gamma ray bursts (GRBs) come in two different kinds: ‘long-soft’ and ‘short-hard’. GRB050509b was a short-hard one. It lasted about 30 ms, its gamma spectrum had more ‘hard’ gammas than ‘soft’ ones, and it was the first short-hard burst for which an X-ray afterglow was ever detected.

Astronomers have been “desperately seeking afterglows” for years. These are the X-ray, UV, optical, IR, and radio waves streaming from the site of the GRB, after the gamma radiation tails off. Because we can pinpoint the source of these more accurately than the GRBs themselves, finding afterglows is the first step to working out what they are.

Before GRB050509b, astronomers were leaning towards the theory that long-soft GRBs are core-collapse supernovae (collapsars). While there have been dozens of theoretical papers published on what short-hard GRBs might be, only three scenarios seemed to fit the gamma ray data: the merger (or collision) of a neutron star with another (or a black hole), a giant flare from a magnetar (a ‘starquake’ in an intensely magnetic neutron star), or some variation on the collapsar theme.

Now the first of what will likely be hundreds of papers on GRB050509b has been submitted for publication. The 28 authors conclude that “there is now observational support for the hypothesis that short-hard bursts arise during the merger of a compact binary (two neutron stars, or a neutron star and a black hole).”

The key to the researchers’ conclusion is the ‘localization’ of the X-ray afterglow.

Swift’s X-ray telescope detected X-rays coming from the same region of the sky as the gammas. After some sleuthing to tie the apparent X-ray position to the astronomers’ coordinate system (RA and Dec), the Swift XRT team determined that the afterglow came from a circle about 15″ (arc seconds) across, whose centre is about 10″ from the heart of an elliptical galaxy (which now has the memorable name G1), itself a member of a rich cluster of galaxies bathed in X-rays. How did they know it was an afterglow? Because it faded; the diffuse X-ray glow from clusters doesn’t do that.
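To put those angles in perspective, here is a small-angle estimate of the projected physical offset, assuming a round distance of two billion light-years (an illustrative number taken from the article's opening line, not the paper's measured value):

```python
# Back-of-envelope: projected physical offset of the afterglow from G1's
# centre.  Illustrative small-angle estimate only; a proper calculation
# would use an angular-diameter distance from a full cosmology.

ARCSEC_PER_RADIAN = 206265.0

def projected_offset_ly(angle_arcsec: float, distance_ly: float) -> float:
    """Small-angle approximation: offset = distance * angle(radians)."""
    return distance_ly * angle_arcsec / ARCSEC_PER_RADIAN

distance_ly = 2.0e9                               # assumed distance to G1
offset = projected_offset_ly(10.0, distance_ly)   # 10" centre offset
radius = projected_offset_ly(7.5, distance_ly)    # radius of 15"-wide circle

print(f"10 arcsec offset    ~ {offset:,.0f} light-years")
print(f"error-circle radius ~ {radius:,.0f} light-years")
```

Even a 10″ offset at that distance corresponds to roughly 100,000 light-years – about a galaxy-diameter away from G1’s heart, which is why the word “suburbs” below is apt.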

And despite looking very carefully, no other electromagnetic afterglow was detected.

So now our 28 astronomers had to work out whether G1’s suburbs are where the stardeath happened, or somewhere else – in astronomer-speak, to identify the ‘host’.

Modern astronomy makes heavy use of statistics; to be sure they don’t have a fluke, researchers normally want lots and lots of examples. In this case, the only statistics the paper’s authors could do is a calculation: how likely is it that a short-hard GRB (assuming that such are stardeath events) would occur ‘near’ an elliptical galaxy, in a rich cluster, just by chance? Many different ‘how likely’ questions were asked; the answer in all cases was ‘not very likely’. However, no one is ruling out bad luck.
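The flavour of such a calculation can be sketched with a toy Poisson estimate. The galaxy surface density below is invented purely for illustration; the actual paper uses measured galaxy counts as a function of brightness:

```python
import math

# Toy version of the "could this be a chance alignment?" test: the Poisson
# probability that at least one unrelated galaxy happens to fall within a
# given matching radius of a randomly placed burst position.

def p_chance_alignment(density_per_sq_arcmin: float, radius_arcsec: float) -> float:
    """P(>= 1 unrelated galaxy within radius_arcsec of a random point)."""
    radius_arcmin = radius_arcsec / 60.0
    expected = density_per_sq_arcmin * math.pi * radius_arcmin ** 2
    return 1.0 - math.exp(-expected)

# e.g. ~0.05 bright galaxies per square arcminute, 10" matching radius:
p = p_chance_alignment(0.05, 10.0)
print(f"chance-alignment probability ~ {p:.4f}")
```

With numbers of that order the probability comes out well under one percent – ‘not very likely’, though never zero, which is exactly why the authors cannot rule out bad luck.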

Our researchers could now turn to the various theoretical models of short-hard GRBs, and of GRB afterglows, to see how well the observational data fit the theoretical expectations, assuming the GRB went off in G1.

Good news (#1) is that the afterglow data match well: short-hard GRBs release a lot less (gamma) energy than do long-soft ones, so afterglows from short-hard GRBs should be fainter (the gamma energy is an indicator of the energy used to power the afterglow). Better yet, since what the burst debris smashes into determines how bright the afterglow will be, the faint GRB050509b afterglow is just what you’d expect if it happened in the rarefied gas of the interstellar medium of an elliptical galaxy (collapsar afterglows are bright in part because they happen in the messy remnants of the gas-dust clouds from which their progenitors were born a mere few million years earlier).

The second piece of good news is that no trace of recent star formation could be found in G1, pretty much ruling out a collapsar as the progenitor. Why? Because collapsars are very young stars, and so cannot have moved far from their birthplace before their death. Further, the debris of even the wimpiest collapsar supernova would have been visible several days afterwards.

What about a giant flare from a magnetar? This cannot be strongly ruled out for GRB050509b, but a magnetar in a galaxy like G1 is not very likely, and GRB050509b was a thousand times brighter than the strongest magnetar flare we’ve seen to date.

That leaves the merger of a neutron star binary (or a neutron star-black hole binary). Where would we find such a binary, just ready to merge? They certainly could be found in the suburbs of spiral galaxies, or in globular clusters, but mostly they are found in giant elliptical galaxies like G1.

So is it ‘case closed’? Not quite. “Other progenitor models are still viable, and additional rapidly localized bursts from the Swift mission will undoubtedly help to further clarify the progenitor picture.”

Could GRB050509b be a stardeath in a much more distant galaxy? Maybe one of the dozen or so fuzzy blobs in or near the X-ray afterglow (a much more distant galaxy cluster – such chance alignments are very common)? Perhaps this will be discussed in future papers on GRB050509b.

Original Source: http://arxiv.org/abs/astro-ph/0505480

New Jupiter Mission Moves Forward

Galileo’s image of Jupiter. Image credit: NASA/JPL. Click to enlarge.
NASA today announced that a mission to fly to Jupiter will proceed to a preliminary design phase. The mission is called Juno, and it is the second in NASA’s New Frontiers Program.

The mission will conduct an in-depth study of the giant planet. The mission proposes to place a spacecraft in a polar orbit around Jupiter to investigate the existence of an ice-rock core; determine the amount of global water and ammonia present in the atmosphere; study convection and deep wind profiles in the atmosphere; investigate the origin of the jovian magnetic field; and explore the polar magnetosphere.

“We are excited at the prospect of the new scientific understanding and discoveries by Juno in our continued exploration of the outer reaches of our solar system during the next decade,” said Dr. Ghassem Asrar, deputy associate administrator for NASA’s Science Mission Directorate.

At the end of the preliminary design study, the mission must pass a confirmation review that will address significant schedule, technical and cost risks before being confirmed for the development phase.

Dr. Scott Bolton of Southwest Research Institute, Boulder, Colo., is the principal investigator. NASA’s Jet Propulsion Laboratory, Pasadena, Calif., will provide mission project management. Lockheed Martin Space Systems, Denver, will build the spacecraft.

NASA selected two proposed mission concepts for study in July 2004 from seven submitted in February 2004 in response to an agency Announcement of Opportunity. “This was a very tough decision given the exciting and innovative nature of the two missions,” Asrar added.

The selected New Frontiers science mission must be ready for launch no later than June 30, 2010, within a mission cost cap of $700 million.

The New Frontiers Program is designed to provide opportunities to conduct several of the medium-class missions identified as top priority objectives in the Decadal Solar System Exploration Survey, conducted by the Space Studies Board of the National Research Council.

The first NASA New Frontiers mission will fly by the Pluto-Charon system in 2014 and then target another Kuiper Belt object.

For information about NASA’s science programs on the Web, visit: http://science.hq.nasa.gov/. For information about NASA and agency programs on the Web, visit: http://www.nasa.gov/home/index.html.

JPL is managed for NASA by the California Institute of Technology in Pasadena.

Original Source: NASA News Release

A Simulation of the Whole Universe

Simulated image that shows the distribution of matter in the Universe. Image credit: MPG. Click to enlarge.
The Virgo consortium, an international group of astrophysicists from the UK, Germany, Japan, Canada and the USA has today (June 2nd) released first results from the largest and most realistic simulation ever of the growth of cosmic structure and the formation of galaxies and quasars. In a paper published in Nature, the Virgo Consortium shows how comparing such simulated data to large observational surveys can reveal the physical processes underlying the build-up of real galaxies and black holes.

The “Millennium Simulation” employed more than 10 billion particles of matter to trace the evolution of the matter distribution in a cubic region of the Universe over 2 billion light-years on a side. It kept the principal supercomputer at the Max Planck Society’s Supercomputing Centre in Garching, Germany occupied for more than a month. By applying sophisticated modelling techniques to the 25 Terabytes (25 million Megabytes) of stored output, Virgo scientists are able to recreate evolutionary histories for the approximately 20 million galaxies which populate this enormous volume and for the supermassive black holes occasionally seen as quasars at their hearts.
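Those two headline numbers imply how coarse-grained each simulated "particle" is. A rough consistency check (using round assumed values H0 = 70 km/s/Mpc and a 25% matter fraction, not the simulation's exact published parameters) spreads the mean matter density of such a box over 10 billion particles:

```python
import math

# What does one Millennium Simulation "particle" weigh?  Spread the mean
# cosmic matter density over a (2 Gly)^3 box and divide by ~10^10 particles.

G = 6.674e-11                    # gravitational constant, m^3 kg^-1 s^-2
LY_M = 9.461e15                  # metres per light-year
MPC_M = 3.086e22                 # metres per megaparsec
M_SUN = 1.989e30                 # solar mass, kg

H0 = 70e3 / MPC_M                # Hubble constant in s^-1
rho_crit = 3 * H0**2 / (8 * math.pi * G)   # critical density of the Universe
rho_matter = 0.25 * rho_crit     # ~25% of the budget is matter (see below)

box_volume = (2.0e9 * LY_M) ** 3
particle_mass = rho_matter * box_volume / 1.0e10

print(f"~{particle_mass / M_SUN:.1e} solar masses per particle")
```

The answer comes out around a billion solar masses per particle – each simulation particle stands in for a small galaxy’s worth of dark matter, which is why sophisticated modelling is needed to recreate individual galaxy histories.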

Telescopes sensitive to microwaves have been able to image the Universe directly when it was only 400,000 years old. The only structure at that time was weak ripples in an otherwise uniform sea of matter and radiation. Gravitationally driven evolution later turned these ripples into the enormously rich structure we see today. It is this growth which the Millennium Simulation is designed to follow, with the twin goals of checking that this new paradigm for cosmic evolution is indeed consistent with what we see, and of exploring the complex physics which gave rise to galaxies and their central black holes.

Recent advances in cosmology demonstrate that about 70 percent of our Universe currently consists of Dark Energy, a mysterious force field which is causing it to expand ever more rapidly. About one quarter apparently consists of Cold Dark Matter, a new kind of elementary particle not yet directly detected on Earth. Only about 5 percent is made out of the ordinary atomic matter with which we are familiar, most of that consisting of hydrogen and helium. All these components are treated in the Millennium Simulation.

In their Nature article, the Virgo scientists use the Millennium Simulation to study the early growth of black holes. The Sloan Digital Sky Survey (SDSS) has discovered a number of very distant and very bright quasars which appear to host black holes at least a billion times more massive than the Sun at a time when the Universe was less than a tenth its present age.

“Many astronomers thought this impossible to reconcile with the gradual growth of structure predicted by the standard picture”, says Dr Volker Springel (Max Planck Institute for Astrophysics, Garching) the leader of the Millennium project and the first author of the article, “Yet, when we tried out our galaxy and quasar formation modelling we found that a few massive black holes do form early enough to account for these very rare SDSS quasars. Their galaxy hosts first appear in the Millennium data when the Universe is only a few hundred million years old, and by the present day they have become the most massive galaxies at the centres of the biggest galaxy clusters.”

For Prof Carlos Frenk (Institute for Computational Cosmology, University of Durham) the head of Virgo in the UK, the most interesting aspect of the preliminary results is the fact that the Millennium Simulation demonstrates for the first time that the characteristic patterns imprinted on the matter distribution at early epochs and visible directly in the microwave maps, should still be present and should be detectable in the observed distribution of galaxies. “If we can measure the baryon wiggles sufficiently well”, says Prof Frenk, “then they will provide us with a standard measuring rod to characterise the geometry and expansion history of the universe and so to learn about the nature of the Dark Energy.”

“These simulations produce staggering images and represent a significant milestone in our understanding of how the early Universe took shape.” said PPARC’s Chief Executive, Prof Richard Wade. “The Millennium Simulation is a brilliant example of the interaction between theory and experiment in astronomy as the latest observations of astronomical objects can be used to test the predictions of theoretical models of the Universe’s history.”

The most interesting and far-reaching applications of the Millennium Simulation are still to come according to Prof Simon White (Max Planck Institute for Astrophysics) who heads Virgo efforts in Germany. “New observational campaigns are providing us with information of unprecedented precision about the properties of galaxies, black holes and the large-scale structure of our Universe,” he notes. “Our ability to predict the consequences of our theories must reach a matching level of precision if we are to use these surveys effectively to learn about the origin and nature of our world. The Millennium Simulation is a unique tool for this. Our biggest challenge now is to make its power available to astronomers everywhere so that they can insert their own galaxy and quasar formation modelling in order to interpret their own observational surveys.”

Original Source: PPARC News Release

Quasar Image Revises Theories About Their Jets

VLBA image of quasar 3C 273, with its long jet blasting out. Image credit: NRAO. Click to enlarge.
When a pair of researchers aimed the National Science Foundation’s Very Long Baseline Array (VLBA) radio telescope toward a famous quasar, they sought evidence to support a popular theory for why the superfast jets of particles streaming from quasars are confined to narrow streams. Instead, they got a surprise that “may send the theorists back to the drawing boards,” according to one of the astronomers.

“We did find the evidence we were looking for, but we also found an additional piece of evidence that seems to contradict it,” said Robert Zavala, an astronomer at the U.S. Naval Observatory’s Flagstaff, Arizona, station. Zavala and Greg Taylor, of the National Radio Astronomy Observatory and the Kavli Institute of Particle Astrophysics and Cosmology, presented their findings to the American Astronomical Society’s meeting in Minneapolis, Minnesota.

Quasars are generally thought to be supermassive black holes at the cores of galaxies, the black hole surrounded by a spinning disk of material being drawn inexorably into the black hole’s gravitational maw. Through processes still not well understood, powerful jets of particles are propelled outward at speeds nearly that of light. A popular theoretical model says that magnetic-field lines in the spinning disk are twisted tightly together and confine the fast-moving particles into narrow “jets” streaming from the poles of the disk.

In 1993, Stanford University and Kavli Institute astrophysicist Roger Blandford suggested that such a twisted magnetic field would produce a distinct pattern in the alignment, or polarization, of radio waves coming from the jets. Zavala and Taylor used the VLBA, capable of producing the most detailed images of any telescope in astronomy, to seek evidence of Blandford’s predicted pattern in a well-known quasar called 3C 273.

“We saw exactly what Blandford predicted, supporting the idea of a twisted magnetic field. However, we also saw another pattern that is not explained by such a field,” Zavala said.

In technical terms, the twisted magnetic field should cause a steady change, or gradient, in the amount by which the alignment (polarization) of the radio waves is rotated as one looks across the width of the jet. That gradient showed up in the VLBA observations. However, with a twisted magnetic field, the percentage of the waves that are similarly aligned, or polarized, should be at its greatest at the center of the jet and decrease steadily toward the edges. Instead, the observations showed the percentage of polarization increasing toward the edges.
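The relation underlying that gradient is Faraday rotation: the polarization angle rotates by Δχ = RM·λ², where the rotation measure RM tracks the line-of-sight magnetic field. A toy sketch, with invented RM values, shows how a helical field’s sign-changing RM would rotate the angle in opposite directions on either side of the jet:

```python
import math

# Toy Faraday-rotation illustration.  A twisted ("helical") field makes the
# line-of-sight field component -- and hence the rotation measure RM --
# change sign from one edge of the jet to the other, so the observed
# polarization angle rotates by delta_chi = RM * lambda^2 with opposite
# signs at the two edges.  RM values below are invented for illustration.

C = 2.998e8  # speed of light, m/s

def pol_rotation_deg(rm_rad_m2: float, freq_hz: float) -> float:
    """Rotation of the polarization angle (degrees) at a given frequency."""
    wavelength = C / freq_hz
    return math.degrees(rm_rad_m2 * wavelength ** 2)

# RM varying linearly across the jet, edge -> centre -> edge:
for rm in (-500.0, 0.0, +500.0):  # rad/m^2
    print(f"RM = {rm:+6.0f}: rotation {pol_rotation_deg(rm, 22e9):+6.2f} deg at 22 GHz")
```

Because the rotation scales with λ², observing at two or more frequencies lets astronomers separate the intrinsic polarization angle from the Faraday rotation and map the RM gradient across the jet.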

That means, the astronomers say, there either is something wrong with the twisted-magnetic-field model or its effects are washed out by interactions between the jet and the interstellar medium that it is drilling through. “Either way, the theorists have to get to work to figure out how this can happen,” Zavala said.

When notified of the new results, Blandford said, “these observations are good enough to warrant further development of the theory.”

3C 273 is one of the most famous quasars in astronomy, and was the first to be recognized as a very distant object in 1963. Caltech astronomer Maarten Schmidt was working on a brief scientific article about 3C273 on the afternoon of February 5 that year when he suddenly recognized a pattern in the object’s visible-light spectrum that allowed an immediate calculation of its distance. He later wrote that “I was stunned by this development…” Just minutes later, he said, he met his colleague Jesse Greenstein, who was studying another quasar, in a hallway. In a matter of another few minutes, they found that the second one also was quite distant. 3C 273 is about two billion light-years from Earth in the constellation Virgo, and is visible in moderate-sized amateur telescopes.
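The distance figure is easy to sanity-check. At 3C 273’s modest redshift (z ≈ 0.158), the naive Hubble-law distance d = cz/H0 is a fair approximation (H0 = 70 km/s/Mpc assumed here):

```python
# Sanity check on the "two billion light-years" figure for 3C 273 using the
# low-redshift Hubble law d = cz / H0.

C_KM_S = 299792.458        # speed of light, km/s
H0 = 70.0                  # Hubble constant, km/s/Mpc (assumed round value)
LY_PER_MPC = 3.262e6       # light-years per megaparsec

z = 0.158                  # redshift of 3C 273
d_mpc = C_KM_S * z / H0
d_gly = d_mpc * LY_PER_MPC / 1e9

print(f"~{d_mpc:.0f} Mpc, i.e. ~{d_gly:.1f} billion light-years")
```

The estimate lands at about 2.2 billion light-years, consistent with the figure quoted above.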

The VLBA is a system of ten radio-telescope antennas, each with a dish 25 meters (82 feet) in diameter and weighing 240 tons. From Mauna Kea on the Big Island of Hawaii to St. Croix in the U.S. Virgin Islands, the VLBA spans more than 5,000 miles, providing astronomers with the sharpest vision of any telescope on Earth or in space. Dedicated in 1993, the VLBA has an ability to see fine detail equivalent to being able to stand in New York and read a newspaper in Los Angeles.
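Both the sharp-vision claim and the newspaper analogy follow from the diffraction limit θ ≈ λ/B. A quick estimate, with round assumed numbers for the observing wavelength, baseline, and newsprint letter size:

```python
# Diffraction-limited resolution theta ~ lambda / B for the VLBA, and the
# angular size of newsprint seen coast-to-coast.  All inputs are round,
# assumed numbers for illustration.

MAS_PER_RAD = 206265.0 * 1000.0   # milliarcseconds per radian

def resolution_mas(wavelength_m: float, baseline_m: float) -> float:
    """Approximate angular resolution of an interferometer, in mas."""
    return wavelength_m / baseline_m * MAS_PER_RAD

# VLBA at 43 GHz (7 mm wavelength) over its ~8,600 km maximum baseline:
vlba = resolution_mas(0.007, 8.6e6)

# Angular size of ~1 cm newsprint letters seen from ~3,900 km (NY to LA):
newsprint = 0.01 / 3.9e6 * MAS_PER_RAD

print(f"VLBA resolution      ~ {vlba:.2f} mas")
print(f"newsprint from NY-LA ~ {newsprint:.2f} mas")
```

At its highest frequencies the VLBA resolves features a fraction of a milliarcsecond across – finer than the roughly half-milliarcsecond newsprint letters, so the analogy holds up.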

“The extremely sharp radio ‘vision’ of the VLBA was absolutely necessary to do this work,” Zavala explained. “We used the highest radio frequencies at which we could detect 3C273’s jet to maximize the detail we could get, and this effort paid off with great science,” he added.

The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc.

Original Source: NRAO News Release