Antarctica is Melting Faster

Antarctica. Image credit: Ben Holt, Sr.
Researchers have completed the first comprehensive survey of Antarctic ice mass; not surprisingly, ice loss is on the rise, mostly from the West Antarctic ice sheet. From 2002 to 2005, the continent lost enough ice to raise global sea levels by about 1.2 mm (0.05 inches). The measurements were made by the GRACE satellites, which detect slight changes in the Earth’s gravity field over time. This is the most accurate estimate of Antarctic ice loss ever made.

The first-ever gravity survey of the entire Antarctic ice sheet, conducted using data from the NASA/German Aerospace Center Gravity Recovery and Climate Experiment (Grace), concludes the ice sheet’s mass has decreased significantly from 2002 to 2005.

Isabella Velicogna and John Wahr, both from the University of Colorado, Boulder, conducted the study. They demonstrated for the first time that Antarctica’s ice sheet has lost a significant amount of mass since 2002. The estimated mass loss was enough to raise global sea level about 1.2 millimeters (0.05 inches) during the survey period, or about 13 percent of the overall observed sea level rise for the same period. The researchers found Antarctica’s ice sheet decreased by 152 (plus or minus 80) cubic kilometers of ice annually between April 2002 and August 2005.

That is about how much water the United States consumes in three months (a cubic kilometer is one trillion liters, or approximately 264 billion gallons). This represents a change of about 0.4 millimeters (0.016 inches) per year to global sea level rise. Most of the mass loss came from the West Antarctic ice sheet.
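The conversion from lost ice volume to sea level rise is simple arithmetic: multiply the ice volume by the ratio of ice density to water density, then divide by the area of the world’s oceans. The short Python sketch below reproduces the roughly 0.4-millimeter-per-year figure; the ice density and ocean area it uses are standard reference values assumed here, not numbers taken from the release.

```python
# Rough sea-level-equivalent check of the GRACE Antarctic result.
# Assumed constants (not from the release): ice density, water density, ocean area.
ICE_DENSITY = 917.0        # kg per cubic meter
WATER_DENSITY = 1000.0     # kg per cubic meter
OCEAN_AREA_M2 = 3.61e14    # global ocean area in square meters (~3.61e8 km^2)

def sea_level_rise_mm(ice_volume_km3: float) -> float:
    """Convert a volume of melted ice (km^3) to equivalent global sea level rise (mm)."""
    ice_volume_m3 = ice_volume_km3 * 1e9                      # 1 km^3 = 1e9 m^3
    water_volume_m3 = ice_volume_m3 * ICE_DENSITY / WATER_DENSITY
    return water_volume_m3 / OCEAN_AREA_M2 * 1000.0           # meters -> millimeters

annual_mm = sea_level_rise_mm(152.0)                          # reported annual loss
print(f"152 km^3/yr of ice is roughly {annual_mm:.2f} mm/yr of sea level")  # ~0.39 mm/yr
print(f"over the ~3.3-year survey: roughly {annual_mm * 3.3:.1f} mm")       # ~1.3 mm
```

The result lands close to the release’s 0.4 millimeters per year and 1.2 millimeters total, with the small differences attributable to rounding and the assumed constants.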

“Antarctica is Earth’s largest reservoir of fresh water,” Velicogna said. “The Grace mission is unique in its ability to measure mass changes directly for entire ice sheets and can determine how Earth’s mass distribution changes over time. Because ice sheets are a large source of uncertainties in projections of sea level change, this represents a very important step toward more accurate prediction, and has important societal and economic impacts. As more Grace data become available, it will become feasible to search for longer-term changes in the rate of Antarctic mass loss,” she said.

Measuring variations in Antarctica’s ice sheet mass is difficult because of its size and complexity. Grace is able to overcome these issues, surveying the entire ice sheet, and tracking the balance between mass changes in the interior and coastal areas.

Previous estimates have used various techniques, each with limitations and uncertainties and an inherent inability to monitor the entire ice sheet mass as a whole. Even studies that synthesized results from several techniques, such as the assessment by the Intergovernmental Panel on Climate Change, suffered from a lack of data in critical regions.

“Combining Grace data with data from other instruments such as NASA’s Ice, Cloud and Land Elevation Satellite; radar; and altimeters that are more effective for studying individual glaciers is expected to substantially improve our understanding of the processes controlling ice sheet mass variations,” Velicogna said.

The Antarctic mass loss findings were enabled by the ability of the identical twin Grace satellites to track minute changes in Earth’s gravity field resulting from regional changes in the planet’s mass distribution. The movement of mass in ice, air, water and solid earth reflects weather patterns, climate change and even earthquakes. To track these changes, Grace measures micron-scale variations in the 220-kilometer (137-mile) separation between the two satellites, which fly in formation.

Grace is managed for NASA by the Jet Propulsion Laboratory, Pasadena, Calif. The University of Texas Center for Space Research has overall mission responsibility. GeoForschungsZentrum Potsdam (GFZ), Potsdam, Germany, is responsible for German mission elements. Science data processing, distribution, archiving and product verification are managed jointly by JPL, the University of Texas and GFZ. The results will appear in this week’s issue of Science.

For information about NASA and agency programs on the Web, visit:
http://www.nasa.gov/home

For more information about Grace on the Web, visit:
http://www.csr.utexas.edu/grace ; and http://www.gfz-potsdam.de/grace

For University of Colorado information call Jim Scott at: (303) 492-3114.

JPL is managed for NASA by the California Institute of Technology in Pasadena.

Original Source: NASA News Release

Greenland Ice Loss Doubled in the Past Decade

Helheim Glacier, located in southeast Greenland. Image credit: NASA/JPL
The loss of ice from Greenland doubled between 1996 and 2005, as its glaciers flowed faster into the ocean in response to a generally warmer climate, according to a NASA/University of Kansas study.

The study will be published tomorrow in the journal Science. It concludes the changes to Greenland’s glaciers in the past decade are widespread, large and sustained over time. They are progressively affecting the entire ice sheet and increasing its contribution to global sea level rise.

Researchers Eric Rignot of NASA’s Jet Propulsion Laboratory, Pasadena, Calif., and Pannir Kanagaratnam of the University of Kansas Center for Remote Sensing of Ice Sheets, Lawrence, used data from Canadian and European satellites. They conducted a nearly comprehensive survey of Greenland glacial ice discharge rates at different times during the past 10 years.

“The Greenland ice sheet’s contribution to sea level is an issue of considerable societal and scientific importance,” Rignot said. “These findings call into question predictions of the future of Greenland in a warmer climate from computer models that do not include variations in glacier flow as a component of change. Actual changes will likely be much larger than predicted by these models.”

The evolution of Greenland’s ice sheet is being driven by several factors. These include accumulation of snow in its interior, which adds mass and lowers sea level; melting of ice along its edges, which decreases mass and raises sea level; and the flow of ice into the sea from outlet glaciers along its edges, which also decreases mass and raises sea level. This study focuses on the least well known component of change, which is glacial ice flow. Its results are combined with estimates of changes in snow accumulation and ice melt from an independent study to determine the total change in mass of the Greenland ice sheet.

Rignot said this study offers a comprehensive assessment of the role of enhanced glacier flow, whereas prior studies of this nature had significant coverage gaps. Estimates of mass loss from areas without coverage relied upon models that assumed no change in ice flow rates over time. The researchers theorized if glacier acceleration is an important factor in the evolution of the Greenland ice sheet, its contribution to sea level rise was being underestimated.

To test this theory, the scientists measured ice velocity with interferometric synthetic-aperture radar data collected by the European Space Agency’s Earth Remote Sensing Satellites 1 and 2 in 1996; the Canadian Space Agency’s Radarsat-1 in 2000 and 2005; and the European Space Agency’s Envisat Advanced Synthetic Aperture Radar in 2005. They combined the ice velocity data with ice sheet thickness data from airborne measurements made between 1997 and 2005, covering nearly the entire Greenland coast, to calculate the volumes of ice transported to the ocean by glaciers and how these volumes changed over time. The glaciers surveyed with those satellite and airborne data drain a sector encompassing nearly 1.2 million square kilometers (463,000 square miles), or 75 percent of the Greenland ice sheet’s total area.
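In simplified terms, that calculation amounts to summing velocity times thickness across a “flux gate” drawn near each glacier’s terminus. The Python sketch below illustrates the idea with made-up gate segments; it is a cartoon of the general method, not the authors’ actual processing.

```python
# Toy flux-gate calculation: ice discharge = sum over gate segments of
# (velocity x thickness x segment width). All numbers below are placeholders.
from dataclasses import dataclass

@dataclass
class GateSegment:
    velocity_m_per_yr: float   # ice speed from radar interferometry, meters per year
    thickness_m: float         # ice thickness from airborne radar, meters
    width_m: float             # width of this segment along the gate, meters

def discharge_km3_per_yr(segments: list[GateSegment]) -> float:
    """Volume of ice crossing the flux gate each year, in cubic kilometers."""
    total_m3 = sum(s.velocity_m_per_yr * s.thickness_m * s.width_m for s in segments)
    return total_m3 / 1e9      # cubic meters -> cubic kilometers

# Hypothetical gate for a single fast outlet glacier:
gate = [
    GateSegment(velocity_m_per_yr=7000.0, thickness_m=600.0, width_m=2000.0),
    GateSegment(velocity_m_per_yr=5000.0, thickness_m=500.0, width_m=2000.0),
]
print(f"{discharge_km3_per_yr(gate):.1f} km^3 of ice per year")   # ~13.4 km^3/yr
```

Repeating such a calculation for each surveyed glacier in different years is what reveals how discharge, and hence mass loss, changed over time.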

From 1996 to 2000, widespread glacial acceleration was found at latitudes below 66 degrees north. This acceleration extended to 70 degrees north by 2005. The researchers estimated the ice mass loss resulting from enhanced glacier flow increased from 63 cubic kilometers in 1996 to 162 cubic kilometers in 2005. Combined with the increase in ice melt and in snow accumulation over that same time period, they determined the total ice loss from the ice sheet increased from 96 cubic kilometers in 1996 to 220 cubic kilometers in 2005. To put this into perspective, a cubic kilometer is one trillion liters (approximately 264 billion gallons of water), about a quarter more than Los Angeles uses in one year.

Glacier acceleration has been the dominant mode of mass loss of the ice sheet in the last decade. From 1996 to 2000, the largest acceleration and mass loss came from southeast Greenland. From 2000 to 2005, the trend extended to include central east and west Greenland.

“In the future, as warming around Greenland progresses further north, we expect additional losses from northwest Greenland glaciers, which will then increase Greenland’s contribution to sea level rise,” Rignot said.

For information about NASA and agency programs on the Web, visit:
http://www.nasa.gov/home.

For University of Kansas Center for Remote Sensing of Ice Sheets information, visit:
http://www.cresis.ku.edu/flashindex.htm.

JPL is managed for NASA by the California Institute of Technology in Pasadena.

Original Source: NASA News Release

Volcanoes Helped Slow Ocean Warming Trend

The June 12, 1991 eruption column from Mount Pinatubo, Philippines. Image credit: Richard P. Hoblitt/USGS
Ocean temperatures might have risen even higher during the last century if it weren’t for volcanoes that spewed ash and aerosols into the upper atmosphere, researchers have found. The eruptions also offset a large percentage of the sea level rise caused by human activity.

Using 12 new state-of-the-art climate models, the researchers found that ocean warming and sea level rise in the 20th century were substantially reduced by the 1883 eruption of the Krakatoa volcano in Indonesia. Volcanic aerosols blocked sunlight and caused the ocean surface to cool.

“That cooling penetrated into deeper layers of the ocean, where it remained for decades after the event,” said Peter Gleckler, an atmospheric scientist at Lawrence Livermore National Laboratory (LLNL). “We found that volcanic effects on sea level can persist for many decades.”

Gleckler, along with LLNL colleagues Ben Santer, Karl Taylor and Krishna AchutaRao and collaborators from the National Center for Atmospheric Research, the University of Reading and the Hadley Centre, tested the effects of volcanic eruptions on recent climate models. They examined model simulations of the climate from 1880 to 2000, comparing them with available observations.

External “forcings,” such as changes in greenhouse gases, solar irradiance, sulphate and volcanic aerosols, were included in the models.

Seawater expands as it warms and contracts as it cools, so sea level rises when the ocean is warmer and falls when it is cooler.

The volume average temperature of oceans (down to 300 meters) worldwide has warmed by roughly 0.037 degrees Celsius in recent decades due to increasing atmospheric greenhouse gases. While seemingly small, this corresponds to a sea level rise of several centimeters and does not include the effect of other factors such as melting glaciers. That sea level jump, however, would have been even greater if it weren’t for volcanic eruptions over the last century, Gleckler said.

“The ocean warming suddenly drops,” he said. “Volcanoes have a big impact. The ocean warming and sea level would have risen much more if it weren’t for volcanoes.”

Volcanic aerosols scatter sunlight and cause the ocean surface temperature to cool, an anomaly that is gradually subducted into deeper layers, where it remains for decades.

The experiments studied by Gleckler’s team also included the more recent 1991 Mt. Pinatubo eruption in the Philippines, which was comparable to Krakatoa in terms of its size and intensity. While similar ocean surface cooling resulted from both eruptions, the heat-content recovery occurred much more quickly in the case of Pinatubo.

“The heat content effects of Pinatubo and other eruptions in the late 20th century are offset by the observed warming of the upper ocean, which is primarily due to human influences,” Gleckler said.

The research appears in the Feb. 9 issue of the journal Nature.

Founded in 1952, Lawrence Livermore National Laboratory has a mission to ensure national security and apply science and technology to the important issues of our time. Lawrence Livermore National Laboratory is managed by the University of California for the U.S. Department of Energy’s National Nuclear Security Administration.

Original Source: Lawrence Livermore National Laboratory

2005 Was the Hottest Year

2005 was the warmest year since the late 1800s. Image credit: NASA
The year 2005 may have been the warmest year in a century, according to NASA scientists studying temperature data from around the world.

Climatologists at NASA’s Goddard Institute for Space Studies (GISS) in New York City noted that the highest global annual average surface temperature in more than a century was recorded in their analysis for the 2005 calendar year.

Some other research groups that study climate change rank 2005 as the second warmest year, based on comparisons through November. The primary difference among the analyses, according to the NASA scientists, is the inclusion of the Arctic in the NASA analysis. Although there are few weather stations in the Arctic, the available data indicate that 2005 was unusually warm in the Arctic.

In order to figure out whether the Earth is cooling or warming, the scientists use temperature data from weather stations on land, satellite measurements of sea surface temperature since 1982, and data from ships for earlier years.

Previously, the warmest year of the century was 1998, when a strong El Nino, a warm water event in the eastern Pacific Ocean, added warmth to global temperatures. However, what’s significant, regardless of whether 2005 is first or second warmest, is that global warmth has returned to about the level of 1998 without the help of an El Nino.

The result indicates that a strong underlying warming trend is continuing. Global warming since the middle 1970s is now about 0.6 degrees Celsius (C), or about 1 degree Fahrenheit (F). Total warming in the past century is about 0.8 degrees C, or about 1.4 degrees F.

“The five warmest years over the last century occurred in the last eight years,” said James Hansen, director of NASA GISS. They stack up as follows: the warmest was 2005, then 1998, 2002, 2003 and 2004.

Over the past 30 years, the Earth has warmed by 0.6 degrees C (1.08 degrees F). Over the past 100 years, it has warmed by 0.8 degrees C (1.44 degrees F).

Current warmth seems to be occurring nearly everywhere at the same time and is largest at high latitudes in the Northern Hemisphere. Over the last 50 years, the largest annual and seasonal warmings have occurred in Alaska, Siberia and the Antarctic Peninsula. Most ocean areas have warmed. Because these areas are remote and far away from major cities, it is clear to climatologists that the warming is not due to the influence of pollution from urban areas.

Original Source: NASA News Release

Satellites on a Budget – High Altitude Balloons

Balloon photograph taken from 25 km. Image credit: Paul Verhage.
Paul Verhage has some pictures that you’d swear were taken from space. And they were. But Verhage is not an astronaut, nor does he work for NASA or any company that has satellites orbiting Earth. He is a teacher in the Boise, Idaho school district. His hobby, however, is out of this world.

Verhage is one of about 200 people across the United States who launch and recover what has been called a “poor man’s satellite.” Amateur Radio High Altitude Ballooning (ARHAB) allows individuals to launch functioning satellites to “near space” at a fraction of the cost of traditional rocket launch vehicles.

Usually, the cost to launch anything into space on regular rockets is quite high, reaching thousands of dollars per pound. Additionally, the waiting period for payloads to be put on a manifest and then launched can be several years.

Verhage says that the total cost for building, launching and recovering these Near Spacecraft is less than $1,000. “Our launch vehicles and fuel are latex weather balloons and helium,” he said.

Plus, once an individual or small group begins designing a Near Spacecraft, it could be ready for launch within six to twelve months.

Verhage has launched about 50 balloons since 1996. Payloads on his Near Spacecraft include mini-weather stations, Geiger counters and cameras.

Near space begins between 60,000 and 75,000 feet (about 18 to 23 km) and extends to 62.5 miles (100 km), where space begins.

“At these altitudes, air pressure is only 1% of that at ground level, and air temperatures are approximately -60 degrees F,” he said. “These conditions are closer to the surface of Mars than to the surface of Earth.”
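Those figures can be checked against a standard atmosphere model. The sketch below implements the lower layers of the 1976 U.S. Standard Atmosphere, a reference model assumed here rather than anything Verhage uses, and prints the pressure at typical near-space altitudes.

```python
import math

# Lower layers of the 1976 U.S. Standard Atmosphere (an assumed reference model):
# (base altitude m, base temperature K, base pressure Pa, lapse rate K/m)
LAYERS = [
    (0.0,     288.15, 101325.0, -0.0065),   # troposphere
    (11000.0, 216.65, 22632.1,   0.0),      # lower stratosphere, isothermal
    (20000.0, 216.65, 5474.9,    0.001),    # stratosphere, slight warming with height
]
G0_M_OVER_R = 9.80665 * 0.0289644 / 8.31447   # ~0.0342 K/m: gravity * molar mass / gas constant

def pressure_pa(alt_m: float) -> float:
    """Standard-atmosphere pressure at a given altitude (valid up to ~32 km)."""
    for base, t_base, p_base, lapse in reversed(LAYERS):
        if alt_m >= base:
            if lapse == 0.0:                  # isothermal layer: simple exponential decay
                return p_base * math.exp(-G0_M_OVER_R * (alt_m - base) / t_base)
            t = t_base + lapse * (alt_m - base)
            return p_base * (t / t_base) ** (-G0_M_OVER_R / lapse)
    raise ValueError("altitude below model range")

for km in (18, 23, 30):
    frac = pressure_pa(km * 1000.0) / pressure_pa(0.0)
    print(f"{km} km: pressure is about {frac * 100:.1f}% of sea level")
# Prints roughly 7.4%, 3.4% and 1.2% -- consistent with the ~1% figure near 30 km; the
# model's 216.65 K (about -70 degrees F) is in the same range as the -60 F quoted above.
```

The model will not match any particular flight exactly, but it shows why conditions at balloon altitudes get compared to the surface of Mars rather than to anywhere on Earth.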

Verhage also said that because of the low air pressure, the air is too thin to refract or scatter sunlight. Therefore, the sky is black rather than blue. So, what is seen at these altitudes is very close to what the shuttle astronauts see from orbit.

Verhage said his highest flight reached an altitude of 114,600 feet (35 km), and his lowest went only 8 feet (2.4 meters) off the ground.

The main parts of a Near Spacecraft are flight computers, an airframe, and a recovery system. All these components are reusable for multiple flights. “Think of building this Near Spacecraft as building your own reusable Space Shuttle,” said Verhage.

The avionics operate the experiments, collect data and monitor the status of the spacecraft; Verhage builds his own flight computers. The airframe is usually the least expensive part of the spacecraft and can be made from materials such as Styrofoam and Ripstop Nylon, put together with hot glue.

The recovery system consists of a GPS receiver, a radio receiver such as a ham radio, and a laptop with GPS software. Additionally, and probably most important, is the Chase Crew. “It’s like a road rally,” says Verhage, “but no one in the Chase Crew knows quite for sure where they are going to end up!”

The process of launching a Near Spacecraft involves getting the capsule ready, filling the balloon with helium and releasing it. Ascent rates vary from flight to flight but are typically between 1,000 and 1,200 feet per minute, with flights taking two to three hours to reach apogee. A filled balloon is about 7 feet tall and 6 feet wide at launch; it expands as it ascends and can be over 20 feet wide at maximum altitude.
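That expansion is just the ideal gas law at work: as the outside pressure falls, the helium inside spreads out until the latex envelope finally fails. The sketch below is a rough sanity check that assumes the helium stays near ambient pressure and temperature, which is only approximately true.

```python
# Rough ideal-gas check of how much a latex balloon swells by burst altitude.
# Assumptions (not from the article): helium pressure tracks ambient pressure,
# and the ambient temperature aloft is about 217 K versus 288 K at the ground.
GROUND_DIAMETER_FT = 6.0      # article: roughly 6 feet wide when filled at launch
PRESSURE_RATIO = 0.01         # article: air pressure ~1% of ground level up there
TEMP_RATIO = 217.0 / 288.0    # assumed ambient temperature ratio (aloft / ground)

# V1/V0 = (P0/P1) * (T1/T0); diameter scales as the cube root of volume.
volume_ratio = (1.0 / PRESSURE_RATIO) * TEMP_RATIO
diameter_ft = GROUND_DIAMETER_FT * volume_ratio ** (1.0 / 3.0)
print(f"diameter near burst altitude: about {diameter_ft:.0f} feet")   # ~25 feet
```

A roughly fourfold growth in diameter is consistent with the “over 20 feet wide” figure quoted above.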

The flight ends when the balloon, swollen by the low atmospheric pressure, finally bursts. To ensure a safe landing, a parachute is pre-deployed before launch. A Near Spacecraft free falls at speeds of over 6,000 feet per minute until about 50,000 feet, where the air becomes dense enough for the parachute to slow the capsule.

The GPS receiver that Verhage uses reports its position every 60 seconds, so after the spacecraft lands, Verhage and his team usually know where it is; recovering it is mostly a matter of being able to get to where it lies. Verhage has lost only one capsule, when its batteries died during the flight and the GPS stopped working. Another capsule was recovered 815 days after launch, found by the Air National Guard near a bombing range.

Some balloons are recovered only 10 miles from the launch site, while others have traveled over 150 miles away.

“Some of the recoveries are easy,” said Verhage. “In one flight, one of my chase crew, Dan Miller, caught the balloon as it landed. But some recoveries in Idaho are tough. We’ve spent hours climbing a mountain in some cases.”

Other experiments that Verhage has flown include a Visible Light Photometer, Medium Bandwidth Photometers, an Infrared Radiometer, a Glider Drop, Insect Survival, and Bacteria Exposure.

One of Verhage’s most interesting experiments involved using a Geiger counter to measure cosmic radiation. On the ground, a Geiger counter detects about 4 cosmic rays a minute. At 62,000 feet the count climbs to about 800 counts per minute, but Verhage discovered that above that altitude the count goes down. “I learned about primary cosmic rays from that discovery,” he said.

Flying the experiments is a great experience, Verhage said, but launching a camera and getting pictures from Near Space provides an irreplaceable “wow” factor. “To have an image of the Earth showing its curvature is pretty amazing,” Verhage said.

“For cameras,” he continued, “the dumber they are the better. Too many of the newer cameras have a power save feature, so they shut off when they’re not used in so many minutes. When they turn off at 50,000 feet, there’s nothing I can do to turn them back on.”

While digital cameras are easy to interface with the flight computer, Verhage said, they require some inventive wiring to keep the camera from shutting off. He said that so far, his best photos have come from film cameras.

Verhage is writing an e-book that details how to build, launch and recover a Near Spacecraft; the first 8 chapters are available free online. The e-book will have 15 chapters when finished, totaling about 800 pages.
Parallax, a microcontroller manufacturer, is sponsoring the e-book’s publication.

Verhage teaches electronics at the Dehryl A. Dennis Professional Technical Center in Boise. He writes a bimonthly column about his adventures with ARHAB for Nuts and Volts magazine, and also shares his enthusiasm for space exploration through the NASA/JPL Solar System Ambassador program.

Verhage said his hobby incorporates everything he is interested in: GPS, microcontrollers and space exploration, and he encourages anyone to experience the thrill of sending a spacecraft to Near Space.

By Nancy Atkinson

Greenland is Melting Faster

Decreasing ice thickness in Greenland. Image credit: NASA/JPL.
In the first direct, comprehensive mass survey of the entire Greenland ice sheet, scientists using data from the NASA/German Aerospace Center Gravity Recovery and Climate Experiment (Grace) have measured a significant decrease in the mass of the Greenland ice cap. Grace is a satellite mission that measures movement in Earth’s mass.

In an update to findings published in the journal Geophysical Research Letters, a team led by Dr. Isabella Velicogna of the University of Colorado, Boulder, found that Greenland’s ice sheet decreased by 162 (plus or minus 22) cubic kilometers a year between 2002 and 2005. This is higher than all previously published estimates, and it represents a change of about 0.4 millimeters (.016 inches) per year to global sea level rise.

“Greenland hosts the largest reservoir of freshwater in the northern hemisphere, and any substantial changes in the mass of its ice sheet will affect global sea level, ocean circulation and climate,” said Velicogna. “These results demonstrate Grace’s ability to measure monthly mass changes for an entire ice sheet, a breakthrough in our ability to monitor such changes.”

Other recent Grace-related research includes measurements of seasonal changes in the Antarctic Circumpolar Current, Earth’s strongest ocean current system and a very significant force in global climate change. The Grace science team borrowed techniques from meteorologists who use atmospheric pressure to estimate winds. The team used Grace to estimate seasonal differences in ocean bottom pressure in order to estimate the intensity of the deep currents that move dense, cold water away from the Antarctic. This is the first study of seasonal variability along the full length of the Antarctic Circumpolar Current, which links the Atlantic, Pacific and Indian Oceans.

Dr. Victor Zlotnicki, an oceanographer at NASA’s Jet Propulsion Laboratory in Pasadena, Calif., called the technique a first step in global satellite monitoring of deep ocean circulation, which moves heat and salt between ocean basins. This exchange of heat and salt links sea ice, sea surface temperature and other polar ocean properties with weather and climate-related phenomena such as El Ninos. Some scientific studies indicate that deep ocean circulation plays a significant role in global climate change.

The identical twin Grace satellites track minute changes in Earth’s gravity field resulting from regional changes in Earth’s mass. Masses of ice, air, water and solid Earth can be moved by weather patterns, seasonal change, climate change and even tectonic events, such as this past December’s Sumatra earthquake. To track these changes, Grace measures micron-scale changes in the 220-kilometer (137-mile) separation between the two satellites, which fly in formation. To limit degradation of Grace’s satellite antennas due to atomic oxygen exposure and thereby preserve mission life, a series of maneuvers was performed earlier this month to swap the satellites’ relative positions in orbit.

In a demonstration of the satellites’ sensitivity to minute changes in Earth’s mass, the Grace science team reported that the satellites were able to measure the deformation of the Earth’s crust caused by the December 2004 Sumatra earthquake. That quake changed Earth’s gravity by one part in a billion.

Dr. Byron Tapley, Grace principal investigator at the University of Texas at Austin, said that the detection of the Sumatra earthquake gravity signal illustrates Grace’s ability to measure changes on and within Earth’s surface. “Grace’s measurements will add a global perspective to studies of large earthquakes and their impacts,” said Tapley.

Grace is managed for NASA by JPL. The University of Texas Center for Space Research has overall mission responsibility. GeoForschungsZentrum Potsdam, or GFZ, Potsdam, Germany, is responsible for German mission elements. Science data processing, distribution, archiving and product verification are managed jointly by JPL, the University of Texas and GFZ.

Imagery related to these latest Grace findings may be viewed at: http://www.nasa.gov/vision/earth/lookingatearth/grace-images-20051220.html .

For more information on Grace, visit: http://www.csr.utexas.edu/grace or http://www.gfz-potsdam.de/grace .

Original Source: NASA News Release

Northern Lights on the Move

Earth’s northern lights. Image credit: Philippe Moussette.
After some 400 years of relative stability, Earth’s North Magnetic Pole has moved nearly 1,100 kilometers out into the Arctic Ocean during the last century and at its present rate could move from northern Canada to Siberia within the next half-century.

If that happens, Alaska may be in danger of losing one of its most stunning natural phenomena – the Northern Lights.

But the surprisingly rapid movement of the magnetic pole doesn’t necessarily mean that our planet is going through a large-scale change that would result in the reversal of the Earth’s magnetic field, Oregon State University paleomagnetist Joseph Stoner reported at the annual meeting of the American Geophysical Union in San Francisco, Calif.

“This may be part of a normal oscillation and it will eventually migrate back toward Canada,” said Stoner, an assistant professor in OSU’s College of Oceanic and Atmospheric Sciences. “There is a lot of variability in its movement.”

Calculations of the North Magnetic Pole’s location from historical records go back only about 400 years, while direct polar observations trace back to James Clark Ross, who reached the pole in 1831 on the west coast of the Boothia Peninsula. To track its history beyond that, scientists have to dig into the Earth to look for clues.

Stoner and his colleagues have examined the sediment record from several Arctic lakes. These sediments – magnetic particles called magnetite – record the Earth’s magnetic field at the time they were deposited. Using carbon dating and other technologies – including layer counting – the scientists can determine approximately when the sediments were deposited and track changes in the magnetic field.

The Earth last went through a magnetic reversal some 780,000 years ago. These episodic reversals, in which south becomes north and vice versa, take thousands of years and are the result of complex changes in the Earth’s outer core. Liquid iron within the core generates the magnetic field that blankets the planet.

Because of that field, a compass reading of north in Oregon will be approximately 17 degrees east from “true geographic north.” In Florida, farther away and more in line with the poles, the declination is only 4-5 degrees west.

The Northern Lights, which are triggered by the sun and fixed in position by the magnetic field, drift with the movement of the North Magnetic Pole and may soon be visible in more southerly parts of Siberia and Europe – and less so in northern Canada and Alaska.

In their research, funded by the National Science Foundation, Stoner and his colleagues took core samples from several lakes, but focused on Sawtooth Lake and Murray Lake on Ellesmere Island in the Canadian Arctic. These lakes, about 40 to 80 meters deep, are covered by 2-3 meters of ice. The researchers drill through the ice, extend their corer down through the water, and retrieve sediment cores about five meters deep from the bottom of the lakes.

The 5-meter core samples provide sediments deposited up to about 5,000 years ago. Below that is bedrock, scoured clean by ice about 7,000 to 8,000 years ago.

“The conditions there give us nice age control,” Stoner said. “One of the problems with tracking the movement of the North Magnetic Pole has been tying the changes in the magnetic field to time. There just hasn’t been very good time constraint. But these sediments provide a reliable and reasonably tight timeline, having consistently been laid down at the rate of about one millimeter a year in annual layers.

“We’re trying to get the chronology down to a decadal scale or better.”
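Because the sediment accumulates at a nearly constant rate, converting a depth in the core to an approximate age is a one-line calculation. The sketch below uses the roughly one-millimeter-per-year figure quoted above and ignores the tie points from carbon dating and layer counting that the real chronology relies on.

```python
# Simple constant-rate depth-to-age model for the Arctic lake cores.
SEDIMENTATION_MM_PER_YR = 1.0    # from the article; real cores are anchored by dated layers

def age_years(depth_m: float) -> float:
    """Approximate age of sediment at a given depth in the core, in years before present."""
    return depth_m * 1000.0 / SEDIMENTATION_MM_PER_YR

print(age_years(5.0))    # ~5000 years: why a 5-meter core spans roughly five millennia
```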

What their research has told Stoner and his colleagues is that the North Magnetic Pole has moved all over the place over the last few thousand years. In general, it moves back and forth between northern Canada and Siberia. But it also can veer sideways.

“There is a lot of variability in the polar motion,” Stoner pointed out, “but it isn’t something that occurs often. There appears to be a ‘jerk’ of the magnetic field that takes place every 500 years or so. The bottom line is that geomagnetic changes can be a lot more abrupt than we ever thought.”

Shifts in the North Magnetic Pole are of interest beyond the scientific community. Radiation influx is associated with the magnetic field, and charged particles streaming down through the atmosphere can affect airplane flights and telecommunications.

Original Source: NASA Astrobiology

Smaller Ozone Hole This Year

The ozone hole: 2005. Image credit: NASA.
NASA researchers, using data from the agency’s Aura satellite, determined the seasonal ozone hole that developed over Antarctica this year is smaller than in previous years.

NASA’s 2005 assessment of the size and thickness of the ozone layer was the first based on observations from the Ozone Monitoring Instrument on the agency’s Aura spacecraft. Aura was launched in 2004.

This year’s ozone hole measured 9.4 million square miles at its peak between September and mid-October, which was slightly larger than last year’s peak. The size of the ozone hole in 1998, the largest ever recorded, averaged 10.1 million square miles. For 10 of the past 12 years, the Antarctic ozone hole has been larger than 7.7 million square miles. Before 1985, it measured less than 4 million square miles.

The protective ozone layer over Antarctica annually undergoes a seasonal change, but since the first satellite measurements in 1979, the ozone hole has gotten larger. Human-produced chlorine and bromine chemicals can lead to the destruction of ozone in the stratosphere. By international agreement, these damaging chemicals were banned in 1995, and their levels in the atmosphere are decreasing.

Another important factor in how much ozone is destroyed each year is the temperature of the air high in the atmosphere. As with temperatures on the ground, some years are colder than others. When it’s colder in the stratosphere, more ozone is destroyed. The 2005 ozone hole was approximately 386,000 square miles larger than it would have been in a year with normal temperatures, because it was colder than average. Only twice in the last decade has the ozone hole shrunk to the size it typically was in the late 1980s. Those years, 2002 and 2004, were the warmest of the period.

Scientists also monitor how much total ozone there is in the atmosphere from the ground to space. The thickness of the Antarctic ozone layer was the third highest of the last decade, as measured by the lowest reading recorded during the year. That level was 102 Dobson units, the unit used to gauge the total amount of ozone overhead. It is approximately one-half as thick as the layer before 1980 during the same time of year.
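A Dobson unit has a concrete physical meaning: it is the amount of ozone that would form a layer 0.01 millimeters thick if the whole column were brought to standard temperature and pressure. That definition and the conversions below are standard background, assumed here rather than quoted from the release.

```python
# Dobson unit conversions (standard definitions, assumed here rather than quoted).
MM_PER_DU = 0.01                        # 1 DU = 0.01 mm of pure ozone at 0 C and 1 atm
MOLECULES_PER_CM2_PER_DU = 2.69e16      # column density equivalent of 1 DU

def du_to_thickness_mm(dobson_units: float) -> float:
    """Thickness of the ozone column if compressed to standard temperature and pressure."""
    return dobson_units * MM_PER_DU

def du_to_column_density(dobson_units: float) -> float:
    """Number of ozone molecules above each square centimeter of ground."""
    return dobson_units * MOLECULES_PER_CM2_PER_DU

print(du_to_thickness_mm(102))                              # ~1.02 mm of ozone at STP
print(f"{du_to_column_density(102):.2e} molecules/cm^2")    # ~2.7e+18
```

So the 102-Dobson-unit minimum corresponds to an ozone layer about a millimeter thick at the surface, versus roughly two millimeters before 1980.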

The Ozone Monitoring Instrument is the latest in a series of ozone-observing instruments flown by NASA over the last two decades. This instrument provides a more detailed view of ozone and is also able to monitor chemicals involved in ozone destruction. The instrument is a contribution to the mission from the Netherlands’ Agency for Aerospace Programs in collaboration with the Finnish Meteorological Institute. The Royal Netherlands Meteorological Institute is the principal investigator on the instrument.

For more information on NASA’s Aura mission on the Web, visit:
http://www.nasa.gov/aura

Original Source: NASA News Release

New View of Space Weather Cold Fronts

Artist’s impression of Earth’s auroras. Image credit: NASA
Scientists from NASA and the National Science Foundation discovered a way to combine ground and space observations to create an unprecedented view of upper atmosphere disturbances during space storms.

Large, global-scale disturbances resemble weather cold fronts. They form in the Earth’s electrified upper atmosphere during space storms. The disturbances result from plumes of electrified plasma that form in the ionosphere. When the plasma plumes pass overhead, they impede low and high frequency radio communications and delay Global Positioning System navigation signals.

“Previously, they seemed like random events,” said John Foster, associate director of the Massachusetts Institute of Technology’s Haystack Observatory. He is principal investigator of the Foundation-supported Millstone Hill Observatory in Westford, Mass.

“People knew there was a space storm that must have disrupted their system, but they had no idea why,” said Tony Mannucci, group supervisor of Ionospheric and Atmospheric Remote Sensing at NASA’s Jet Propulsion Laboratory, Pasadena, Calif. “Now we know it’s not just chaos; there is cause and effect. We are beginning to put together the full picture, which will ultimately let us predict space storms.”

Predicting space weather is a primary goal of the National Space Weather Program involving NASA, the foundation and several other federal agencies. The view researchers created allowed them to link movement of the plumes to processes that release plasma into space. “Discovering this link is like discovering the movement of cold fronts is responsible for sudden thunderstorms,” said Jerry Goldstein, principal scientist at the Southwest Research Institute, San Antonio.

Because plasma plumes in the ionosphere disrupt GPS signals, GPS receivers provide a continuous monitor of these disturbances. Researchers discovered a link between GPS data and satellite images of the plasmasphere, a cloud of plasma surrounding Earth above the ionosphere that is observed by NASA’s Imager for Magnetopause to Aurora Global Exploration (IMAGE) satellite. They found that the motion of the ionospheric plumes corresponded to the ejection of plasma from the plasmasphere during space storms.

The combined observations allowed construction of an underlying picture of the processes at work during space storms, when the Earth’s magnetic field is buffeted by hot plasma from the sun. As the solar plasma blows by, it generates an electric field that is transmitted to the plasmasphere and ionosphere. This electric field propels the ionospheric and plasmaspheric plasma out into space. For the first time, scientists can directly connect the plasma observed in the ionosphere with the plasmasphere plumes that extend many thousands of kilometers into space.
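The “propelling” described here can be read as the standard E cross B drift of magnetized plasma: particles in crossed electric and magnetic fields move at a velocity of E x B divided by B squared, regardless of their charge or mass. The sketch below evaluates that drift for illustrative storm-time field strengths; the specific numbers are assumptions, not values from the article.

```python
import numpy as np

# Illustrative E x B drift speed (textbook plasma physics, not the authors' model).
# v_drift = (E x B) / |B|^2; the field magnitudes below are placeholder values.
E = np.array([0.0, 0.010, 0.0])     # electric field, volts per meter (10 mV/m assumed)
B = np.array([0.0, 0.0, 5.0e-5])    # magnetic field, tesla (~50,000 nT, mid-latitude scale)

v_drift = np.cross(E, B) / np.dot(B, B)
print(v_drift, "m/s")               # ~[200, 0, 0]: a drift of a few hundred meters per second
```

Sustained over hours and continental distances, drifts of that order can redistribute ionospheric plasma into elongated plumes like those described above.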

“We also know these disturbances occur most often between noon and dusk, and between mid and high latitudes, due to the global structure of the electric and magnetic fields during space storms,” said Anthea Coster of the Haystack Observatory. “Ground-based, space-based and in situ measurements are allowing scientists to understand the ionosphere-thermosphere-magnetosphere as a coupled system.”

The plumes degrade GPS signals in two primary ways. First, they introduce position errors by delaying the propagation of GPS signals. Second, the turbulence they generate causes receivers to lose the signal through an effect known as scintillation, similar to the apparent twinkling of stars caused by atmospheric turbulence.
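The delay in the first effect scales with the total electron content (TEC) along the signal path: to first order, the extra group delay in meters is 40.3 times the TEC divided by the square of the signal frequency, the standard single-frequency ionospheric correction. The TEC value in the sketch below is illustrative, not a number from the article.

```python
# First-order ionospheric group delay on a GPS signal:
#   delay_m = 40.3 * TEC / f**2, with TEC in electrons/m^2 and f in hertz.
L1_HZ = 1575.42e6          # GPS L1 carrier frequency
TECU = 1.0e16              # 1 TEC unit = 1e16 electrons per square meter

def group_delay_m(tec_units: float, freq_hz: float = L1_HZ) -> float:
    """Extra signal path length, in meters, caused by the ionospheric electron content."""
    return 40.3 * tec_units * TECU / freq_hz ** 2

# A storm-enhanced plume can push TEC far above quiet-time values; 100 TECU is an
# assumed, illustrative figure.
print(f"{group_delay_m(100.0):.1f} m of apparent extra range at L1")   # ~16 m
```

An unmodeled range error of that size can translate into position errors of many meters for a single-frequency receiver, which is why the plumes matter for navigation.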

Researchers are presenting the findings today during the American Geophysical Union meeting in San Francisco, Calif. For information about space weather and other research on the Web, visit:
http://www.nasa.gov/vision/universe/solarsystem/cold_front.html

Original Source: NASA News Release

Oxygen Levels on Earth Rose Gradually

Earth. Image credit: NASA
The history of life on Earth is closely linked to the appearance of oxygen in the atmosphere. The current scientific consensus holds that significant amounts of oxygen first appeared in Earth’s atmosphere some 2.4 billion years ago, with a second large increase in atmospheric oxygen occurring much later, perhaps around 600 million years ago.

However, new findings by University of Maryland geologists suggest that the second jump in atmospheric oxygen actually may have begun much earlier and occurred more gradually than previously thought. The findings were made possible using a new tool for tracking microbial life in ancient environments developed at Maryland. Funded by the National Science Foundation and NASA, the work appears in the December 2 issue of Science.

Graduate researcher David Johnston, research scientist Boswell Wing and colleagues in the University of Maryland’s department of geology and Earth System Science Interdisciplinary Center led an international team of researchers that used high-precision measurements of a rare sulfur isotope, 33S, to establish that ancient marine microbes known as sulfur disproportionating prokaryotes were widely active almost 500 million years earlier than previously thought.

The intermediate sulfur compounds used by these sulfur disproportionating bacteria are formed by the exposure of sulfide minerals to oxygen gas. Thus, evidence of widespread activity by this type of bacteria has been interpreted by scientists as evidence of increased atmospheric oxygen content.
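In practice, “high-precision measurements of 33S” are reported using a standard bookkeeping convention: delta values give an isotope ratio’s per-mil deviation from a reference standard, and capital Delta-33S measures how far delta-33S departs from the value expected from ordinary mass-dependent fractionation of delta-34S. The sketch below shows that conventional calculation with the commonly used exponent of 0.515; it is generic notation, not this study’s data-reduction code.

```python
# Conventional sulfur-isotope notation (generic bookkeeping, not this study's code).
# delta values are per-mil deviations of isotope ratios from a reference standard;
# capital Delta-33S is the departure of d33S from mass-dependent behaviour.
def capital_delta_33s(d33s_permil: float, d34s_permil: float, lam: float = 0.515) -> float:
    """D33S = d33S - 1000 * ((1 + d34S/1000)**lam - 1), in per mil."""
    return d33s_permil - 1000.0 * ((1.0 + d34s_permil / 1000.0) ** lam - 1.0)

# Purely mass-dependent fractionation gives a Delta-33S very close to zero:
print(round(capital_delta_33s(d33s_permil=5.14, d34s_permil=10.0), 3))   # ~0.002 per mil
```

Tiny but systematic departures from zero, measured at high precision, are what allow the team to track the presence and character of the microbes that processed the sulfur.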

“These measurements imply that sulfur compound disproportionation was an active part of the sulfur cycle by [1.3 billion years ago], and that progressive Earth surface oxygenation may have characterized the [middle Proterozoic],” the authors write.

The Proterozoic is the period in Earth’s history from about 2.4 billion years ago to 545 million years ago.

“The findings also demonstrate that the new 33S-based research method can be used to uniquely track the presence and character of microbial life in ancient environments and provide a glimpse of evolution in action,” said Johnston. “This approach provides a significant new tool in the astrobiological search for early life on Earth and beyond.”

The Air That We Breathe

When our planet formed some 4.5 billion years ago, virtually all the oxygen on Earth was chemically bound to other elements. It was in solid compounds like quartz and other silicate minerals, in liquid compounds like water, and in gaseous compounds like sulfur dioxide and carbon dioxide. Free oxygen — the gas that allows us to breathe, and which is essential to all advanced life — was practically non-existent.

Scientists have long thought that appearance of oxygen in the atmosphere was marked by two distinct jumps in oxygen levels. In recent years, researchers have used a method developed by University of Maryland geologist James Farquhar and Maryland colleagues to conclusively determine that significant amounts of oxygen first appeared in Earth’s atmosphere some 2.4 billion years ago. Sometimes referred to as the “Great Oxidation Event,” this increase marks the beginning of the Proterozoic period.

A general scientific consensus has also held that the second major rise in atmospheric oxygen occurred some 600 million years ago, with oxygen rising to near modern levels at that time. Evidence of multicellular animals first appears in the geologic record around this time.

“There has been a lot of discussion about whether the second major increase in atmospheric oxygen was quick and stepwise, or slow and progressive,” said Wing. “Our results support the idea that the second rise was progressive and began around 1.3 billion years ago, rather than 0.6 billion years ago.”

In addition to Johnston, Wing’s Maryland co-authors on the Dec. 2 paper are geology colleagues James Farquhar and Jay Kaufman. Their group works to document links between sulfur isotopes and the evolution of Earth’s atmosphere using a combination of field research, laboratory analysis of rock samples, geochemical models, photochemical experiments with sulfur-bearing gases and microbial experiments.

“Active microbial sulfur disproportionation in the Mesoproterozoic” by David T. Johnston, Boswell A. Wing, James Farquhar and Alan J. Kaufman, University of Maryland; Harald Strauss, Universität Münster; Timothy W. Lyons, University of California, Riverside; Linda C. Kah, University of Tennessee; Donald E. Canfield, Southern Denmark University: Science, Dec. 2, 2005.

Original Source: UM News Release