Book Review: President’s Commission on Implementation of U.S. Space Exploration Policy

On January 14, 2004, President Bush announced his new vision for human spaceflight. The space shuttle would fly again to complete the International Space Station. Then the next stage of human space exploration would begin, with humans landing on the Moon by 2015-2020 and missions to Mars to follow. He also announced that a new commission, led by Edward “Pete” Aldridge, would be formed to figure out the best way to implement this vision.

The commissioners conducted five public forums and fact-finding missions. They interviewed 94 witnesses, including NASA employees, astronauts, academics, media, students, labour unions, space advocates, and many of the agency’s biggest critics. Three months after they began, the commissioners delivered their 64-page report to the President and the public.

This report lays out what I think is a realistic strategy for changing NASA so that it’s better equipped to accomplish this vision. And the commissioners went a step further: they got to the heart of what’s wrong with NASA and offered solutions to get the agency back on track.

The commissioners suggest that “the space vision must be managed as a national priority”, and offer ideas: national advisors, representatives at federal agencies, commissions and councils. This could be layers of extra bureaucracy, or effective oversight. I’m not sure which it would be.

It goes on to make a series of recommendations on how private industry can assume a pivotal role in space exploration by providing services to NASA, especially for supplying low-Earth orbit. The commission suggests that NASA should become a customer, purchasing launch services and other products from a healthy private space industry. NASA’s role would be largely limited to science and to the risky research and development where there is “irrefutable demonstration that only government can perform the proposed activity.” I’d like to see how you measure an “irrefutable demonstration”, but that’s good, strong language.

The report goes on to suggest how risky technologies should be identified, developed to maturity, and then transitioned into the private sector. This is key. If business is unwilling to take a risk on nuclear propulsion, then NASA – an innovative and adventurous NASA – can swoop in, figure out if it’s possible, build a prototype, and then hand it off to private industry. This could be done directly by NASA, or through competitions like the X Prize ($1 billion for the first company to put a human on the Moon, for example). It’s one of the most exhilarating visions for NASA I can imagine, and I’m sure the people working there would be inspired too.

“The space industry will become a national treasure”, suggests the report. It encourages NASA to dig deep throughout the nation to find the best ideas, people and technologies and get them working to fulfill the exploration vision. I like the sound of that; it’s a 180-degree departure from the agency’s current reputation for closed-mindedness. If you’re on the outside right now, you have to fight tooth and nail to get your great ideas considered by NASA. This is what created the bad blood between NASA and private industry today. The commissioners set a great example in preparing the report, letting anyone provide ideas through the public forums and via their website – 6,000 written comments were received. Many of these freely offered ideas ended up being quoted word for word in the report.

The commissioners suggest that NASA should embrace the international space community to develop future endeavors in space. That’s fine, but a similar vision created the International Space Station. Perhaps a better direction would be to allow NASA to work with suppliers outside of the US. Competing against Russian rocket builders might just light a fire under Lockheed Martin and Boeing.

The report reminds us that a large part of NASA’s mandate is scientific discovery, and it encourages the agency to connect with the scientific community to hear its priorities. Right now there’s a severe disconnect: although NASA has enabled some terrific science, it has also funneled billions of dollars into research that has more to do with politics than science. If NASA can figure out how to rebalance this, scientists would be much happier.

Finally, the commission recommends that NASA do a better job of connecting with the public, encouraging future generations of scientists, aerospace engineers and software programmers to direct their careers towards space exploration. I’m in the media, and I can tell you that NASA could go a long way toward improving its relations with us… and with you, the public. It feels secretive and controlling, dispensing information carefully and selectively. Why aren’t astronauts making the talk show circuit? Where are the reality shows? I want new episodes of Cosmos, maybe hosted by Dr. Brian Greene and Dr. Michio Kaku. Just look at the success of the television show CSI: it’s entertaining and scientific.

Before I started reading the report, I was worried it would either be too aggressive or just plain boring. Instead, the Aldridge report was realistic; perhaps the best compliment I could heap on it. It was very entertaining to read, and I was constantly nodding my head in agreement.

It’s realistic because it recognizes that NASA already has many assets: equipment, programs, and personnel. These can evolve, improving what works and discarding what doesn’t. Radical space advocates want to see the agency gutted: disband the centres and fire everyone. I cringe to think what kind of assets and goodwill would get flushed down the toilet. Not to mention, it would be political suicide.

This report suggests, no… demands, that NASA and private enterprise sit down at the table and work things out. Get to the bottom of why the agency has resisted industry’s influence in the past, and get the wheels of free enterprise spinning again. Get the burden off the shoulders of the taxpayer and into the grateful hands of business. When people ask, “what’s the point of space exploration, why should we spend $15 billion a year on this when we should be feeding the poor?”, it shows just how badly NASA has failed to create a self-sustaining spacefaring industry.

My main concern with the Aldridge commission’s report is that it doesn’t do enough to define the “critical success factors”. That’s management speak for the things you can point to that indicate you’re on the right track. The report encourages NASA to become sustainable, affordable, and credible, but it doesn’t provide the details about what such an agency would look like. The trick with critical success factors is that they aren’t goals, they’re principles. They guide your organization into a virtuous spiral of improvement. A responsible leader provides followers with the vision, and then backs it up with these principles to help everyone guide their efforts – it prevents an organization from going off the rails in the future.

In recent years NASA has seemed to be in the business of maintaining its own existence. Fill an organization with people regularly under attack from budget cuts, public mistakes, taxpayer displeasure, and a non-existent job market, and it shouldn’t come as a surprise that those people are mainly looking to protect their jobs, or that the thrilling vision and enthusiasm for space exploration has been watered down by politics and bureaucracy.

The easiest time to change someone’s mind in this situation – someone who would otherwise maintain the status quo – is when something disastrous happens to confront their world view. The Columbia disaster was just such an event. It briefly drove a stake deep into the heart of the bureaucracy, and I know it caused every single person in NASA to wonder what went wrong.

And to be open to change.

NASA employees and managers have an open mind right now. Both houses of Congress understand that bad decisions by government contributed to the situation. It affected President Bush, too, and he announced a new direction: an exciting vision to return to the Moon and then head off to Mars.

Although I’m hard pressed to think of something more exciting for space exploration than humans setting foot on Mars, I’m more excited by the possibility that NASA will reinvent itself from an organization that defends itself and restricts free enterprise, to one that embraces entrepreneurs and ensures that mankind returns to space… for good.

NASA needed a plan that would inject free enterprise deep into its bloodstream, while maintaining its value to science and developing the risky technologies that business won’t touch. In my opinion, that’s exactly what it got from Aldridge and the rest of the commissioners. Good job.

Now, let’s see President Bush embrace the plan. Let’s see NASA implement it in a way that respects its employees and takes advantage of their creativity, experience, and infrastructure. Let’s judge their progress by how well they stick to their principles.

Return to space, and never turn back. Failure is not an option.

Read the report for yourself.

NASA Begins its Transformation

In the latest of what will be ongoing briefings, Administrator Sean O’Keefe today announced a transformation of NASA’s organization structure designed to streamline the agency and position it to better implement the Vision for Space Exploration.

In a report released last week, the President’s Commission on Implementation of U.S. Space Exploration Policy found, “NASA needs to transform itself into a leaner, more focused agency by developing an organizational structure that recognizes the need for a more integrated approach to science requirements, management, and implementation of systems development and exploration missions.”

“Our task is to align Headquarters to eliminate the ‘stove pipes,’ promote synergy across the agency, and support the long-term exploration vision in a way that is sustainable and affordable,” said Administrator O’Keefe. “We need to take these critical steps to streamline the organization and create a structure that affixes clear authority and accountability.”

This transformation fundamentally restructures NASA’s Strategic Enterprises into Mission Directorates to better align with the Vision. It also restructures Headquarters support functions and clarifies organizational roles and responsibilities. The Mission Directorate organizational structure includes:

* Aeronautics Research: Research and develop aeronautical technologies for safe, reliable and efficient aviation systems

* Science: Carry out the scientific exploration of the Earth, Moon, Mars and beyond; chart the best route of discovery; and reap the benefits of Earth and space exploration for society. A combined organization is best able to establish an understanding of the Earth, other planets and their evolution, bring the lessons of our study of Earth to the exploration of the Solar System, and to assure the discoveries made here will enhance our work there

* Exploration Systems: Develops capabilities and supporting research and technology that enable sustained and affordable human and robotic exploration; includes the biological and physical research necessary to ensure the health and safety of crew during long duration space flight

* Space Operations: Direct space flight operations, space launches and space communications, as well as the operation of integrated systems in low-Earth orbit and beyond

Two agency-wide priorities will continue with direct responsibility for all related activities across NASA.

* Safety and Mission Assurance Officer: Reports directly to the Administrator and reflects NASA’s commitment to provide a clear and direct line to agency senior leadership for issues regarding safety

* Chief Education Officer: Directs the agency’s important work to improve scientific and technological literacy and inspire a new generation of explorers

NASA functional offices will be restructured as Mission Support Offices. Headquarters and field center offices will be aligned to improve communications and responsibility.

The major Mission Support Offices are:

* Chief Financial Officer (CFO): Conducts all financial matters, including procurement and small and disadvantaged business activities. All field center financial officers report directly to the Headquarters CFO to better address critical financial issues

* Associate Administrator for Institutions and Management: Responsible for providing operational and management support for Headquarters; directs a full range of activities relating to personnel and institutional management across the agency

* Chief Information Officer: Responsible for the development of an integrated focus on information resource management strategies, policies and practices

* Chief Engineer: Ensures that development efforts and mission operations are being planned and conducted on a sound engineering basis; assures independent technical authority within the agency’s engineering, operations and safety organizations

* Chief of Strategic Communications: Directs NASA’s communication efforts in Public Affairs, Legislative Affairs and External Relations; responsible for internal communications management

* General Counsel: Responsible for the legal aspects of all NASA’s activities; manages the agency’s intellectual property and ethics programs

To improve the decision-making process, NASA will create:

* Strategic Planning Council: Chaired by the NASA Administrator, the Council develops multi-year strategic plans, strategic roadmaps, and a multi-year detailed plan that forms the basis for policies and budgets

* Director of Advanced Planning: Responsible for the preparation of options, studies and assessments for the Strategic Planning Council

* Chief Operating Officer Council: Chaired by the Deputy Administrator, implements direction provided by the Strategic Planning Council and develops standard administrative practices to build on the President’s Management Agenda

The Associate Deputy Administrator for Systems Integration is responsible for strategic and systems integration across Mission Directorates and mission support functions.

The agency will also redefine its relationships with the NASA Field Centers by developing clear and straightforward lines of responsibility and accountability. Specific Mission Associate Administrators will be assigned as Headquarters Center Executives. They will have oversight of field center performance in implementing agency policies and programs. The Associate Administrator for Institutions and Management will address field center infrastructure concerns.

The changes outlined today become effective August 1, 2004. They represent the next step in implementing the recommendations of the President’s Commission on Implementation of U.S. Space Exploration Policy and reflect NASA’s ongoing efforts to apply the findings and recommendations of the Columbia Accident Investigation Board across the agency.

Over the next several weeks, the Administrator will engage teams at each NASA location to provide front-line guidance on implementing the early stages of the transformation plan. These discussions will be the precursor for a renewed commitment to mission success and excellence in an employee-centric organization.

“This transformation will be an evolutionary process, exploring new ways to move forward and implement change. We’ll also be engaging other government agencies, industry, academia and the international community to assist us in developing the tools and processes we need to successfully advance the Vision for Space Exploration,” added Administrator O’Keefe. “Doing so will enable us to take the next bold steps into space and rekindle the innovation and entrepreneurial skills that is our legacy to humankind.”

Additional presentation information and a new NASA organization chart are available on the Internet at:

http://www.nasa.gov/formedia

Original Source: NASA News Release

Wallpaper: Phoebe

During its historic close encounter with Phoebe, the Cassini spacecraft captured a series of high resolution images of the small moon, six of which have been mosaicked together to create this detailed view.

Phoebe shows an unusual variation in brightness over its surface due to the existence on some crater slopes and floors of bright material, thought to contain ice, on what is otherwise one of the darkest known bodies in the solar system. Bright streaks on the rim of the large crater in the north (up in this image) may have been revealed by the collapse of overlying darker material from the crater wall. The large crater below right-of-center shows evidence of layered deposits of alternating bright and dark material. A possible mechanism for this apparent layering was discussed in an earlier image release (PIA 06067).

Hints of Phoebe’s irregular topography can be seen peeking out from the shadows near the lower left and upper left parts of the image. These are real features, possibly crater rims or mountain peaks, that are just being hit by the first light of sunrise on Phoebe.

Phoebe’s surface shows many large- and small-scale craters. The emerging view of Phoebe is that it might have been part of an ancestral population of icy, comet-like bodies, some of which now reside in the Kuiper Belt beyond Neptune.

The images in this mosaic were taken in visible light with the narrow-angle camera at distances ranging from 15,974 kilometers (9,926 miles) to 12,422 kilometers (7,719 miles). The image scale is 74 meters (243 feet) per pixel. Contrast in the image has been enhanced slightly to improve visibility.
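
As a rough cross-check, the quoted pixel scale is roughly the spacecraft’s range multiplied by the camera’s angular resolution per pixel. The ~6-microradian per-pixel value used in the sketch below is an assumed figure for illustration, not a number quoted in the release:

```python
# Rough check of the quoted image scale: scale ~ range x per-pixel angle.
# The ~6 microradian IFOV for the narrow-angle camera is an ASSUMED,
# illustrative value, not taken from the release above.
IFOV_RAD = 6.0e-6            # assumed radians per pixel
range_km = 12_422            # closest range quoted in the release

scale_m = range_km * 1000 * IFOV_RAD
print(f"~{scale_m:.0f} m per pixel at {range_km} km")   # ~75 m per pixel
```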

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Cassini-Huygens mission for NASA’s Office of Space Science, Washington, D.C. The imaging team is based at the Space Science Institute, Boulder, Colorado.

For more information about the Cassini-Huygens mission, visit http://saturn.jpl.nasa.gov and the Cassini imaging team home page, http://ciclops.org.

Original Source: CICLOPS News Release

New Instrument Finds its First Supernova

The Nearby Supernova Factory, an international collaboration of astronomers and astrophysicists, has announced that SNIFS, the Supernova Integral Field Spectrograph, achieved “first light” during the early morning hours of Tuesday, June 8, when the new instrument acquired its first astronomical target, a Type Ia supernova designated SN 2004ca. Type Ia supernovae are the kind used by astronomers to measure the expansion of the universe.

Analysis of the initial data, plus a separate observation of the newly discovered supernova SN 2004cr on Sunday, June 20th, confirms that SNIFS, while still in its commissioning phase, is meeting its design goals as a remarkable new tool for observing supernovae.

SNIFS, which was recently mounted on the University of Hawaii’s 2.2-meter telescope atop Mauna Kea on the island of Hawaii, is an innovative instrument designed to track down the idiosyncrasies and precise distances of Type Ia supernovae by simultaneously obtaining over 200 spectra of each target, its home galaxy, and the nearby night sky.

SNIFS is a crucial element in the international Nearby Supernova Factory (SNfactory), initiated at the Department of Energy’s Lawrence Berkeley National Laboratory. The SNfactory’s goal is to find and study over 300 nearby Type Ia supernovae in order to reduce uncertainties about these foremost astronomical “standard candles,” whose measurement led to the discovery that the expansion rate of the universe is increasing.

“Better knowledge of these extraordinarily bright and remarkably uniform objects will make them even better tools for measuring the cosmos,” says astronomer Greg Aldering of Berkeley Lab’s Physics Division, who leads the SNfactory collaboration. “Type Ia supernovae are the key to understanding the mysterious dark energy that’s causing the universe to expand ever faster.”

The body of the SNIFS instrument was built by the SNfactory’s French collaborators, members of the Laboratoire de Physique Nucléaire et de Hautes Énergies (LPNHE) in Paris, the Centre de Recherche Astronomique de Lyon (CRAL), and the Institut de Physique Nucléaire de Lyon (INPL), supported by the Institut National de Physique Nucléaire et de Physique des Particules (CNRS/IN2P3) and the Institut National des Sciences de l’Univers (CNRS/INSU). Berkeley Lab, with help from Yale University, developed the cameras used to detect the light from SNIFS, while the University of Chicago developed instruments to monitor the performance of SNIFS.

The SNIFS instrument produces a spectrum at each position within a six- by six-arc-second region around the target supernova, including its home galaxy and surrounding sky, by using an “integral field unit” consisting of an array of individual lenslets. Light is extracted from the telescope’s field of view by a small prism and directed to either blue-sensitive or red-sensitive, eight-megapixel, astronomical CCD cameras. Together these cameras collect all the optical light from each supernova.
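
For a rough sense of the sampling, here is a back-of-the-envelope sketch, assuming purely for illustration a 15-by-15 lenslet grid (a guess consistent with the “over 200 spectra” figure mentioned earlier, not a specification quoted in this article):

```python
# Back-of-the-envelope spatial sampling, ASSUMING a 15 x 15 lenslet grid
# (an illustrative guess, not a spec quoted in this article).
grid_side = 15
field_arcsec = 6.0                       # field of view is about 6" x 6"

n_spectra = grid_side * grid_side        # 225 simultaneous spectra
spaxel = field_arcsec / grid_side        # ~0.4" of sky per lenslet
print(n_spectra, "spectra,", round(spaxel, 2), "arcsec per lenslet")
```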

A separate photometry camera, running in parallel with the spectrograph under identical observing conditions, allows spectra to be corrected for variables like thin cloud cover. A guide camera keeps the spectrograph precisely aligned on target by measuring the position of a guide star within the telescope’s wider field of view once each second, adjusting the aim if necessary.
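
Conceptually, that correction can be as simple as rescaling a spectrum by however much a reference star was dimmed in the parallel photometry camera. The snippet below is a minimal sketch of that idea; it is not the SNfactory’s actual pipeline, and the numbers are placeholders.

```python
# Minimal sketch of a "grey" cloud correction: if thin cloud dims a
# reference star by some factor in the parallel photometry camera,
# rescale the supernova spectrum by the inverse of that factor.
# Illustrative only; the real pipeline handles wavelength dependence,
# multiple reference stars, and much more.
def correct_for_clouds(spectrum_flux, ref_observed, ref_expected):
    transmission = ref_observed / ref_expected      # e.g. 0.9 for a 10% loss
    return [flux / transmission for flux in spectrum_flux]

corrected = correct_for_clouds([1.2, 1.5, 1.1], ref_observed=0.9, ref_expected=1.0)
print(corrected)
```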

Flown to Hilo in March and assembled in working order at sea level, SNIFS was taken apart, carried to the 4,245-meter (nearly 14,000-foot) summit of Mauna Kea, and reassembled on the University of Hawaii’s 2.2-meter telescope on April 6.

“At sea level we made sure everything was in order and also rehearsed the assembly,” says Aldering. “When you get to 14,000 feet things get tricky. Everybody carries a ‘dumb list’ so they don’t start off to do something and then forget what it was.”

Two months of engineering work to align and calibrate the instrument on the telescope preceded SNIFS’ observation of its first new Type Ia supernova, SN 2004ca, on June 8th, in the constellation Cygnus, the swan. This was followed by the observation of SN 2004cr in the constellation Cepheus, the king, on June 20th. Routine observations of SNfactory-discovered supernovae will begin shortly.

“Now that SNIFS is in regular operation,” Aldering says, “our daily lives have changed dramatically.” After years of planning and long-distance meetings, including monthly videoconferences, “the activity level has escalated: every day we have to react instantly as our new supernova data come pouring in.”

A full schedule ahead
The SNfactory strategy has two “pipelines,” the first being a supernova search using automated wide-field sky surveys. Data are provided by the QUEST-II 160-megapixel camera, built by Yale University and Indiana University and operated at Palomar Observatory by the QUEST-II group, as well as by the Jet Propulsion Laboratory’s Near Earth Asteroid Tracking team and the California Institute of Technology. The data are transmitted by the High-Performance Research and Education Network to the National Energy Research Scientific Computing Center (NERSC) at Berkeley Lab for identification of likely supernova candidates.

The ideal candidate is a recently exploded Type Ia supernova that is near enough for accurate measurement of its spectrum and light curve (its rising and falling brightness) but far enough away to be “in the smooth Hubble flow”, meaning that its redshift is mostly due to the expansion of the universe alone, unaffected by the motion of its home galaxy through space.
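
To see why that matters, here is a rough sketch: a galaxy’s own motion through space (a few hundred kilometres per second is an assumed, typical value here) is a large fraction of the recession velocity for a very nearby supernova, but only a small one for a supernova comfortably in the Hubble flow.

```python
# Why being "in the smooth Hubble flow" matters: a typical peculiar
# velocity (ASSUMED ~300 km/s here) is a big slice of the recession
# velocity for very nearby supernovae, but only a small one farther out.
C_KM_S = 299_792.458
V_PECULIAR = 300.0                       # assumed typical value, km/s

for z in (0.01, 0.03, 0.08):
    v_recession = C_KM_S * z             # low-redshift approximation
    frac_error = V_PECULIAR / v_recession
    print(f"z = {z:.2f}: ~{frac_error:.0%} uncertainty from peculiar motion")
# prints roughly 10%, 3%, and 1% for these three redshifts
```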

The SNfactory’s search phase has been operating for over a year, although not at full capacity. “The search will now be going full steam,” Aldering says. “We’ll be getting a few candidates each night of the year, more than the entire current worldwide rate of discovery.”

The second SNfactory pipeline passes the search candidates on to SNIFS, where the type and redshift of each supernova are determined and the most promising are selected and scheduled for more detailed study. The SNfactory uses the University of Hawaii’s telescope three times a week for half a night (the half beginning at midnight, as a courtesy to local observers), with SNIFS available to other projects at other times.

Eventually SNIFS will operate fully automatically. Remote control of the telescope and spectrograph was first done from Hilo, Hawaii and is now being done from Berkeley Lab and France.

SNIFS can determine a given Type Ia’s specific physical characteristics including, for example, whether or not it is unusually energetic or how much its light may have been dimmed by dust in its home galaxy. Such unparalleled spectrographic and photometric detail makes it possible to take advantage of a unique characteristic of Type Ia supernovae: that “they can be calibrated individually, not simply statistically,” Aldering says. “We’ll be able to measure the luminosity with confidence. Knowing the luminosity, we can tell you the distance with precision.”
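
That last claim is just the inverse-square law at work: knowing a supernova’s true luminosity and measuring its apparent brightness gives its distance. A minimal worked sketch, with placeholder numbers:

```python
# The inverse-square law behind the "standard candle" idea:
#   flux = luminosity / (4 * pi * distance**2)
#   =>  distance = sqrt(luminosity / (4 * pi * flux))
# The numbers below are placeholders chosen only to show the relationship.
import math

def distance_from_flux(luminosity, flux):
    return math.sqrt(luminosity / (4.0 * math.pi * flux))

L_true = 1.0e36      # assumed calibrated luminosity (arbitrary units)
F_seen = 1.0e-15     # assumed measured flux (consistent units)
print(distance_from_flux(L_true, F_seen))   # distance in matching length units
```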

By collecting large numbers of Type Ia supernovae in the Hubble flow, SNfactory scientists will be able to pin down the low-redshift end of the luminosity-redshift diagram upon which measures of the universe’s expansion rate are based. This, plus detailed understanding of the physical factors that cause small variations in Type Ia spectra and light curves, will improve the accuracy of the high-redshift measurements crucial to choosing among the many competing theoretical models of dark energy.

Members of the Nearby Supernova Factory team include Greg Aldering, Peter Nugent, Saul Perlmutter, Lifan Wang, Brian C. Lee, Rollin Thomas, Richard Scalzo, Michael Wood-Vasey, Stewart Loken, and James Siegrist from Berkeley Lab; Jean-Pierre Lemonnier, Arlette Pecontal, Emmanuel Pecontal, Christophe Bonnaud, Lionel Capoani, Dominique Dubet, Francois Heunault, and Blandine Lantz from CRAL; Gerard Smadja, Emmanuel Gangler, Yannick Copin, Sebastien Bongard, and Alain Castera from INPL; Reynald Pain, Pierre Antilogus, Pierre Astier, Etienne Barrelet, Gabriele Garavini, Sebastien Gilles, Luz-Angela Guevara, Didier Imbault, Claire Juramy, and Daniel Vincent from LPNHE; and Rick Kessler and Ben Dilday from the University of Chicago. Recently the astrophysics group at Yale University, under the leadership of Charles Baltay, has joined the Nearby Supernova Factory.

The Berkeley Lab is a U.S. Department of Energy national laboratory located in Berkeley, California. It conducts unclassified scientific research and is managed by the University of California. Visit our website at http://www.lbl.gov.

Original Source: Berkeley Lab News Release

As Requested, Bigger Photos

Thanks for all your feedback; I really appreciate it. I can’t believe how positive everyone was. The common thread was: “you’ve got a good thing going… don’t mess it up”. I get the message. 🙂 One thing people did want to see, however, was a way to enlarge pictures even more (will your insatiable hunger for images ever be fulfilled?). It was easy to do, so I’ve made the change with today’s issue. When you look at the full story, the big picture is clickable and takes you to an even larger version of the image, so you can look at more detail, or turn it into a wallpaper (right-click and then select “Set as Wallpaper”). Then just click back in your browser to get back to the story. Any further expansion of images would probably destroy the Universe, so we’ll just keep it as is.

If anything else springs to mind, please let me know.

Thanks!

Fraser Cain
Publisher
Universe Today

Photo Gallery of SpaceShipOne

In case you weren’t one of the 10,000-plus people who made the trip to the Mojave airport to watch Monday’s launch of SpaceShipOne, here’s a photo gallery of images from that amazing day. A special thanks to correspondent and photographer Tony Hesch for capturing these incredible photos at the event. And a big thanks to Scaled Composites for giving Universe Today backstage access to all the action. I really wish I could have made the trip myself… maybe next time. 🙂

Russia Cancels Olsen’s Trip to Space

Although he was scheduled to become the third privately funded person to fly to the International Space Station, Gregory Olsen appears to have had his wings clipped. Officials from the Russian Space Agency said today that Olsen has been dropped from the program because of health concerns. This is unusual, because Olsen had already been training at Moscow’s Star City for several months and had already undergone many health tests. Olsen, the founder of Sensors Unlimited in New Jersey, had been planning to perform some experiments while on board the station.

Deeper Analysis of Phoebe Flyby

Like a woolly mammoth trapped in Arctic ice, Saturn’s small moon Phoebe may be a frozen artifact of a bygone era, some four billion years ago. The finding is suggested by new data from the Cassini spacecraft.

Cassini scientists reviewed data from the spacecraft’s June 11, 2004, flyby of the diminutive moon. They concluded Phoebe is likely a primordial mixture of ice, rock and carbon-containing compounds similar in many ways to material seen in Pluto and Neptune’s moon Triton. Scientists believe bodies like Phoebe were plentiful in the outer reaches of the solar system about four and a half billion years ago.

These icy planetesimals (small bodies) formed the building blocks of the outer solar system and some were incorporated into the giant planets Jupiter, Saturn, Uranus and Neptune. During this process, gravitational interactions ejected much of this material to distant orbits, joining a native population of similar bodies to form the Kuiper Belt.

“Phoebe apparently stayed behind, trapped in orbit about the young Saturn, waiting eons for its secrets to be revealed during its rendezvous with the Cassini spacecraft,” said Dr. Torrence Johnson, Cassini imaging team member at NASA’s Jet Propulsion Laboratory, Pasadena, Calif.

“All our evidence leads us to conclude, Phoebe’s surface is made of water ice, water-bearing minerals, carbon dioxide, possible clays and primitive organic chemicals in patches at different locations on the surface,” said Dr. Roger N. Clark, team member for the visual and infrared mapping spectrometer, U.S. Geological Survey in Denver. “We also see spectral signatures of materials we have not yet identified.” Cassini’s observations gave scientists the first detailed look at one of these primitive icy planetesimals.

Phoebe’s mass was determined from precise tracking of the spacecraft and optical navigation, combined with an accurate volume estimate from images. The measurements yield a density of about 1.6 grams per cubic centimeter (100 pounds per cubic foot), much lighter than most rocks, but heavier than pure ice at approximately 0.93 grams per cubic centimeter (58 pounds per cubic foot). This suggests a composition of ice and rock similar to Pluto and Triton.
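
Those two densities are enough for a crude ice-to-rock estimate. Treating Phoebe as a simple two-component mixture and assuming a rock density of about 3 grams per cubic centimeter (an assumed value, not one from the release), the 1.6 figure implies a body that is roughly two-thirds ice by volume:

```python
# Crude two-component mix implied by the quoted bulk density.
# The rock density is an ASSUMED illustrative value.
rho_bulk = 1.6    # g/cm^3, from Cassini tracking plus imaging (quoted above)
rho_ice  = 0.93   # g/cm^3, quoted above
rho_rock = 3.0    # g/cm^3, assumed typical rock density

ice_volume_fraction = (rho_rock - rho_bulk) / (rho_rock - rho_ice)
print(f"~{ice_volume_fraction:.0%} ice by volume")   # ~68% with these assumptions
```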

Spectral measurements, light intensity as a function of color or wavelength, confirmed the presence of water ice previously detected by Earth-based telescopes. The measurements provided evidence for hydrated minerals on Phoebe’s surface, and detected carbon dioxide and solid hydrocarbons similar to those found in primitive meteorites.

“One intriguing result is the discovery of possible chemical similarities between the materials on Phoebe and those seen on comets,” said Dr. Robert H. Brown, team leader for the visible and infrared mapping spectrometer, University of Arizona, Tucson. Evidence that Phoebe might be chemically kin to comets strengthens the case that it is similar to Kuiper Belt Objects.

Measurements taken by the composite infrared spectrometer were used to generate temperature maps. The maps show the surface of Phoebe is very cold, only about 110 degrees above absolute zero (minus 163 degrees Celsius, or minus 261 degrees Fahrenheit). Even colder nighttime temperatures suggest a fluffy, porous surface layer.

“One of the first results from this map is the surface of Phoebe has been badly chewed up, probably by meteorite impacts,” said Dr. John Pearl, a Cassini co-investigator for the composite infrared spectrometer, at NASA’s Goddard Space Flight Center, Greenbelt, Md. “We are discovering Phoebe is a very complex object, with large variations in topography.”

Cassini also made radar observations of Phoebe’s enigmatic surface, the first spacecraft radar observations of an outer-planet moon. The results are consistent with the dirty, rocky, icy surface suggested by other observations.

“We have conducted our first analysis of an outer solar system resident akin to Kuiper Belt Objects,” said Dr. Dennis Matson, project scientist of the Cassini-Huygens mission at JPL. “In two short weeks, we have added more to what we know about Phoebe than we had learned about it since it was discovered 100 years ago. We did this by having multiple instruments conducting investigations all at one time during our flyby.”

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. JPL manages the mission for NASA’s Office of Space Science, Washington. For the latest images and more information about the mission on the Internet, visit http://www.nasa.gov and http://saturn.jpl.nasa.gov .

Original Source: NASA/JPL News Release

Space Simulator Models the Universe

For the past several years, a team of University of California astrophysicists working at Los Alamos National Laboratory has been using a cluster of roughly 300 computer processors to model some of the most intriguing aspects of the Universe. Called the Space Simulator, this de facto supercomputer has not only proven itself to be one of the fastest supercomputers in the world, but has also demonstrated that modeling and simulation of complex phenomena, from supernovae to cosmology, can be done on a fairly economical basis.

According to Michael Warren, one of the Space Simulator’s three principal developers, “Our goal was to acquire a computer which would deliver the highest performance possible on the astrophysics simulations we wanted to run, while remaining within the modest budget that we were allotted. Building the Space Simulator turned out to be an excellent choice.”

The Space Simulator is a 294-node Beowulf cluster with a theoretical peak performance just below 1.5 teraflops, or trillions of floating-point operations per second. Each Space Simulator processing node looks more like a computer you would find at home than one at a supercomputer center, consisting of a Pentium 4 processor, 1 gigabyte of 333 MHz SDRAM, an 80-gigabyte hard drive and a gigabit Ethernet card. Each individual node cost less than $1,000, and the entire system cost under $500,000. The cluster achieved a Linpack performance of 665.1 gigaflops on 288 processors in October 2002, making it the 85th fastest computer in the world, according to the 20th TOP500 list (see www.top500.org). A gigaflop is a billion floating-point operations per second.
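
Those figures hang together under a couple of assumptions not stated in the release (roughly 2.5 GHz Pentium 4 clocks and two floating-point operations per cycle): 294 such processors give a theoretical peak just under 1.5 teraflops, and the Linpack run works out to a bit under half of that peak.

```python
# Sanity check on the quoted figures. The 2.53 GHz clock and 2 flops per
# cycle are ASSUMPTIONS used only to show how the numbers fit together.
CLOCK_GHZ = 2.53
FLOPS_PER_CYCLE = 2

peak_tflops = 294 * CLOCK_GHZ * FLOPS_PER_CYCLE / 1000
linpack_gflops = 665.1
efficiency = linpack_gflops / (288 * CLOCK_GHZ * FLOPS_PER_CYCLE)

print(f"theoretical peak ~{peak_tflops:.2f} TF")       # ~1.49 TF
print(f"Linpack efficiency ~{efficiency:.0%}")         # ~46% under these assumptions
```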

Since 2002, the Space Simulator has slipped to #344 on the most recent TOP500 list as faster computers have been built, but Warren and his colleagues are not worried. They built the Space Simulator to do specific astrophysics research, not to compete with other computers. It was never designed to compete with the Laboratory’s massive supercomputers and, in fact, is not scalable enough to do so.

The Space Simulator has been used almost continuously for theoretical astrophysics simulations since it was built, and has spent much of the past year calculating the evolution of the Universe. The first results of that work were recently presented at a research conference in Italy by Los Alamos postdoctoral research associate Luis Teodoro. Further analysis of the simulations, in collaboration with Princeton University professor Uros Seljak, will soon be published in the prestigious journal Monthly Notices of the Royal Astronomical Society. In addition to simulating the structure and evolution of the Universe, the Space Simulator has been used to study the explosions of massive stars and to help understand the X-ray emission from the center of our galaxy.

The Space Simulator is actually the Laboratory’s third-generation Beowulf cluster. The first was Loki, which was constructed in 1996 from 16 200-MHz Pentium Pro processors. Loki was followed by the Avalon cluster, which consisted of 144 Alpha processors. The Space Simulator follows the same basic architecture as these previous Beowulf machines, but it is the first to use gigabit Ethernet as the network fabric, and it requires significantly less space than a cluster built from typical desktop computers. The Space Simulator runs parallel N-body algorithms, which were originally designed for astrophysical applications involving gravitational interactions but have since been used to model more complex particle systems.
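
For readers unfamiliar with the term, an N-body code advances a set of particles under their mutual gravitational attraction. The toy sketch below is a brute-force, direct-summation illustration of the idea only; the Laboratory’s production codes rely on much more sophisticated parallel tree algorithms.

```python
# Toy direct-summation gravitational N-body step (illustrative only;
# production astrophysics codes use parallel tree or multipole methods
# to avoid this O(N^2) force loop).
import math

G = 1.0            # gravitational constant in code units
SOFTENING = 1e-3   # keeps the force finite at tiny separations

def accelerations(pos, mass):
    n = len(pos)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [pos[j][k] - pos[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + SOFTENING ** 2
            inv_r3 = 1.0 / (math.sqrt(r2) * r2)
            for k in range(3):
                acc[i][k] += G * mass[j] * dx[k] * inv_r3
    return acc

def leapfrog_step(pos, vel, mass, dt):
    """Advance all particles by one kick-drift-kick leapfrog step."""
    acc = accelerations(pos, mass)
    for i in range(len(pos)):
        for k in range(3):
            vel[i][k] += 0.5 * dt * acc[i][k]   # half kick
            pos[i][k] += dt * vel[i][k]         # drift
    acc = accelerations(pos, mass)
    for i in range(len(pos)):
        for k in range(3):
            vel[i][k] += 0.5 * dt * acc[i][k]   # second half kick
```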

In addition to Warren, the developers of the Space Simulator include Los Alamos staff members Chris Fryer and Patrick Goda.

Los Alamos’ Laboratory-Directed Research and Development (LDRD) program provided funding for the Space Simulator research. LDRD funds basic and applied research and development focusing on employee-initiated creative proposals selected at the discretion of the Laboratory director.

Los Alamos National Laboratory is operated by the University of California for the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy and works in partnership with NNSA’s Sandia and Lawrence Livermore national laboratories to support NNSA in its mission.

Los Alamos enhances global security by ensuring the safety and reliability of the U.S. nuclear deterrent, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to defense, energy, environment, infrastructure, health and national security concerns.

Original Source: LANL News Release

General Accounting Office Blasts NASA

Science Committee Chairman Sherwood Boehlert (R-NY) and Ranking Minority Member Bart Gordon (D-TN) today released a General Accounting Office (GAO) study they had requested titled, “NASA: Lack of Disciplined Cost-Estimating Processes Hinders Effective Program Management.”

GAO concluded that “NASA lacks a clear understanding of how much programs will cost and how long they will take to achieve their objectives… NASA’s basic cost-estimating processes… lack the discipline needed to ensure that program estimates are reasonable.” As part of the study, GAO reviewed 27 programs, 10 of them in depth.

In a response included in the appendix to the GAO report, NASA “concur[red]” with the recommendations in the report and listed steps the agency has underway to implement them. The recommendations include having NASA develop “an integrated plan for improving cost estimating” and establish “a standard framework for developing life-cycle cost estimates.” The GAO report elaborates on those recommendations in some detail.

Boehlert said, “This is something that started out as a ‘bad news’ story that appears to be heading for a happy ending. The report lays out in detail the problems that have repeatedly plagued NASA’s cost estimating over many years. Congress needs to be aware of these problems when evaluating NASA’s proposals. But NASA does have concrete steps underway to improve the situation, for which Administrator O’Keefe should be congratulated. GAO has told us that those steps will go a long way toward solving the problem. And there’s some indication that those steps are beginning to bear fruit. The newest program that GAO examined, the cockpit avionics upgrade, also was the one with the best performance, although GAO still had some concerns with it. So there’s cause for optimism. Our Committee’s job will be to ensure that NASA continues to implement the steps it has outlined fully, carefully and as speedily as possible.”

“The GAO report’s findings, when coupled with NASA’s failure to pass an independent financial audit for the past three years running, suggest that NASA needs to get its financial house in order,” Gordon said.

Original Source: House Committee on Science News Release